
Vim: Search and replace in multiple files

As is the way with Vim, there are a ton of features, but stumbling on the exact combination of commands that does what you want can sometimes be difficult. In this case, the objective is to perform a search and replace across multiple files.

This is done in two steps: loading up the files to process, then issuing a command to run on each of the files.

Load up the files to search using the args command. It accepts multiple arguments and supports bash-style globs and tab completion for paths.

:args src/*.cpp src/*.hpp README.txt
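
If the files are spread across nested directories, Vim's ** wildcard can pull them in recursively. A minimal sketch, assuming the sources live under src/:

:args src/**/*.cpp src/**/*.hpp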

Perform the replace using the argdo command, which takes sed-style substitution syntax. This command iterates over all the files loaded by the args command and runs the given command in each one; in this case, the replace operation.

:argdo %s/FindMe/ReplaceWithMe/gec | update

The flags used in this case are:

  • g: global (replace every occurrence on a line, not just the first)
  • e: suppress "string not found" error messages
  • c: confirm each replace

Running update after the replace operation saves any changes to the file before moving to the next one.
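
The whole operation can also be run non-interactively from the shell by dropping the c flag, so no confirmation prompts appear. A minimal sketch, reusing the hypothetical pattern and files from above:

#Runs the search and replace across the files, then quits

vim -c 'argdo %s/FindMe/ReplaceWithMe/ge | update' -c 'qa' src/*.cpp src/*.hpp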


How to download an entire Google Site

When using Google Sites, there is currently no way to make a backup of your site, or download the site so you can host it on another server.

This command uses a tool called wget to spider through a website and download all the public files to the local computer. Unix users will most likely have the wget tool already installed (if not, you can install it via your preferred package manager), while Windows users can get it from here.

Once wget is installed, run it with the following parameters:

#Downloads all public pages on a Google Site

wget -e robots=off -m -k -K -E -rH -Dsites.google.com http://sites.google.com/a/domain/site/

This tells wget to spider through all the links on your site and download the html files and linked content (such as images). Note that pages that aren't linked from anywhere on the site won't be downloaded.
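
For reference, the flags break down as follows:

  • -e robots=off: don't honour the site's robots.txt exclusions
  • -m: mirror the site (recursive download with timestamping)
  • -k: convert links in the downloaded pages to point at the local copies
  • -K: back up each original page with a .orig suffix before its links are converted
  • -E: save pages with an .html extension where needed
  • -r, -H: recurse through links, allowing requests to span to other hosts
  • -D: limit that spanning to the sites.google.com domain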

This technique will also work for websites other than those hosted on Google Sites.
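
For an ordinary site served from a single host, the host-spanning flags can be dropped. A minimal sketch, using a hypothetical example.com:

#Downloads all public pages on a single-host site

wget -e robots=off -m -k -K -E http://example.com/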


Backing up data to an external server via SSH

I recently needed to back up the contents of a website, but found that a disk quota was preventing me from doing so. What I really needed to do was find a way to compress all the files and, instead of storing the archive locally, pipe the output to another server.

After much Googling and messing about, I ended up with the following command:

#Uses the tar utility to back up files to an external server

tar zcvf - /path/to/backup | ssh -p port user@server dd of="filename.tgz" obs=1024
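
Restoring is the same pipe in reverse. A minimal sketch, assuming the archive name used above (GNU tar strips the leading / from member names when archiving, so this unpacks under the current directory):

#Fetches the backup from the external server and unpacks it locally

ssh -p port user@server dd if="filename.tgz" | tar zxvf -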

Of course, this is only practical for a one-off data dump. If regular backups were needed, using rsync would be the best option, as it only transfers incremental changes. An excellent tutorial can be found here.
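
As a rough idea of what that would look like (a minimal sketch; the destination path on the server is hypothetical):

#Mirrors the directory to the server, transferring only changed files

rsync -avz -e "ssh -p port" /path/to/backup user@server:/path/to/dest/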
