
Laughing on Linux

November 8th, 2007 · stk

Since we moved Randsco to a dedicated Linux server, I've been able to use UNIX line commands again, via shell access. Sweet. Here's a growing reference of Linux ditties (currently just a one-line recursive, global search & replace).

It's been ages (1999) since I've worked on a Unix operating system (Sun Microsystems' Solaris). Since our recent move to a Linux web server, I've had fun dusting off my rusty memory of line commands.

Gosh, 'Nix is so much more robust an operating system than DOS or XP!

In any case, I had the need to do some fancier-than-normal stuff and thought it would be a good idea to jot the commands down, in case I needed them again. So, here's my (slowly growing) list of tested, Linux-based one-offs.

 

(1) Global Search and Replace

After moving web servers, I needed to replace a string - http://randsco.com/directory/file.html - with something else. It only took a few manual edits before I realized how many files I needed to change. What I needed was a way to perform a global search & replace, but do it across every file in the directory tree.

After hitting Google, I realized that there were a bunch of scripts out there that did the trick, but using one seemed like more work than necessary. Finally, I found what I was looking for: a one-line command that performs a global (and recursive) string search and replace.

find . -type f | xargs sed -i.bak "s/oldtext/newtext/g"

This command was short, sweet, and seemed to do what I wanted. It looks through every file under the directory in which it is run, replaces every occurrence of "oldtext" with "newtext", and creates a backup ("file.bak") of every file it touches.
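A quick way to see the backup behavior is a throwaway directory (the paths and filenames below are just for the demo; this is the GNU `sed -i.bak` syntax):

```shell
# Set up a scratch directory with one matching and one non-matching file
rm -rf /tmp/sr_demo && mkdir -p /tmp/sr_demo && cd /tmp/sr_demo
printf 'oldtext here\n' > a.txt
printf 'nothing to match\n' > b.txt

# Recursive search & replace, keeping .bak backups of every file piped in
find . -type f | xargs sed -i.bak "s/oldtext/newtext/g"

cat a.txt   # now reads: newtext here
ls          # a.txt  a.txt.bak  b.txt  b.txt.bak
```

Note that even `b.txt`, which contains no match, gets a `.bak` copy - every file handed to sed is rewritten.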

I made a few test files and directories and it seemed to work. Once it did, I bent it a tad to turn off the backup option (the cowboy that I'm learning to be).

find . -type f | xargs sed -i "s/oldtext/newtext/g"

It worked as advertised, but after running it, I realized a couple of things: (a) the command rewrites every file (updating its modification time), regardless of whether it contains "oldtext", and (b) using a forward-slash delimiter for sed made searching for strings containing forward slashes onerous.

Since I'd just transferred the whole of my old website to a new host, preserving date-stamps wasn't an issue, but it might be later on. So I searched for a solution and found this, which I have yet to test:

find . -type f | xargs grep -l 'foo' | xargs sed -i '' -e 's/foo/bar/g'
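A small demo makes the point of the `grep -l` filter visible: only files that actually contain the search string reach sed, so everything else keeps its timestamp. (Filenames below are made up for the demo, and GNU sed is shown, where `-i` takes no separate `''` argument - the `''` in the command above is the BSD variant.)

```shell
# Scratch directory: one file matches, one doesn't
rm -rf /tmp/grep_demo && mkdir -p /tmp/grep_demo && cd /tmp/grep_demo
printf 'foo\n' > match.txt
printf 'other\n' > skip.txt

# grep -l prints only the names of matching files,
# so skip.txt is never handed to sed and stays untouched
find . -type f | xargs grep -l 'foo' | xargs sed -i -e 's/foo/bar/g'

cat match.txt   # bar
cat skip.txt    # other
```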

I wasn't sure what the single quotes and the -e flag do for sed. It turns out the empty '' is BSD sed syntax: there, -i requires a backup-suffix argument, and an empty one means "no backup" (GNU sed would be written as plain -i). The -e flag just marks the next argument as the sed script. In any case, it's reported to work as advertised, where I found it.

Another afterthought: though the commands above worked fine for me, because my search string included forward slashes, the final construct looked a tad odd with all the backslash escapes (i.e., 's/http:\/\/randsco.com\/ ... /'). I remembered that sed accepts any delimiter (you don't have to stick with forward slashes). For example, using pound signs as delimiters, the expression becomes 's#http://randsco.com/ ... #' (a tad cleaner).
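The two delimiter styles produce identical results, as a one-line comparison shows (the URL is just the example from this post):

```shell
# Forward-slash delimiter: every / in the pattern must be escaped
echo 'visit http://randsco.com/directory/file.html today' |
  sed 's/http:\/\/randsco.com\/directory\/file.html/NEWURL/'

# Pound-sign delimiter: same substitution, no escaping needed
echo 'visit http://randsco.com/directory/file.html today' |
  sed 's#http://randsco.com/directory/file.html#NEWURL#'

# Both print: visit NEWURL today
```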

Source

 

Updated: 21-Nov-2008


1. Sieg · 11/17/07
Just checking in. Funny I was just looking at my Toshiba laptop and thinking about turning it into a dual boot rig again. The thinkpad is working like a champ with XP on it, (I hate Vista) so I'll leave this one alone for now.
2. Maathieu · 11/21/08
find . -type f -print0 | xargs -0 [...]

will also allow processing of files containing spaces and newline characters in their names... Just my $0.02 :)
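Maathieu's tip can be checked with a filename containing a space (the scratch path is illustrative). Without `-print0`/`-0`, xargs splits on whitespace and would hand sed two bogus names; NUL-delimited output keeps the name intact:

```shell
rm -rf /tmp/space_demo && mkdir -p /tmp/space_demo && cd /tmp/space_demo
printf 'oldtext\n' > 'my file.txt'

# -print0 emits NUL-terminated names; xargs -0 splits only on NUL,
# so "my file.txt" survives as a single argument
find . -type f -print0 | xargs -0 sed -i 's/oldtext/newtext/g'

cat 'my file.txt'   # newtext
```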
3. stk · 11/21/08
AFAIK, 'Nix hates files with either.

Tho this does remind me of a hugely funny mistake I once made. I wanted to delete all files having a certain prefix, so I meant to type in

rm prefix* [rtn]

but in my typing haste, I put in

rm prefix * [rtn]

That one extra spacebar hit cost me every file on the drive up to the middle of the "P's" ... as it was that long before I realized my mistake ("Why is the HDD churning so hard?") and hit [Ctrl-C] :oops:
4. Richard · 10/02/09
Some may want to add

-maxdepth 1

to restrict the range of the find command to the current directory -- otherwise it will recurse downwards. You may want this, or not: I keep my vi backups in a subdirectory that I don't want altered.
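Richard's `-maxdepth 1` suggestion is easy to verify with a subdirectory (demo paths are made up):

```shell
rm -rf /tmp/depth_demo && mkdir -p /tmp/depth_demo/sub && cd /tmp/depth_demo
printf 'foo\n' > top.txt
printf 'foo\n' > sub/deep.txt

# -maxdepth 1 stops find from descending into sub/,
# so only files in the current directory are edited
find . -maxdepth 1 -type f | xargs sed -i 's/foo/bar/g'

cat top.txt        # bar  (edited)
cat sub/deep.txt   # foo  (untouched)
```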
5. stk · 10/02/09
Good idea, Richard ... thanks!