I recently needed to back up the contents of a website, but found that a disk quota was preventing me from doing so. What I really needed to do was find a way to compress all the files and, instead of storing the archive locally, pipe the output to another server.
After much Googling and messing about, I ended up with the following command:
# Uses the tar utility to back up files to an external server.
# tar writes the compressed archive to stdout (-), which is piped
# over ssh to dd on the remote host, written in 1024-byte output blocks.
tar zcvf - /path/to/backup | ssh -p port user@server dd of="filename.tgz" obs=1024
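Note that ssh takes the port via the -p flag rather than the host:port form. Retrieving the archive later is just the same pipeline reversed; here is a minimal sketch, assuming the same placeholder user, server, port and filename as above:

# Stream the remote archive back over ssh and unpack it locally
ssh -p port user@server cat filename.tgz | tar zxvf - -C /path/to/restore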
Of course, this is only practical for a one-off data dump. If regular backups were needed, rsync would be the better option, as it transfers only incremental changes. An excellent tutorial can be found here.
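For reference, a minimal rsync invocation might look something like the following; the paths, user, server, and port are placeholders rather than values from my actual setup:

# Mirror the local directory to the remote server over ssh,
# copying only files that have changed since the last run
rsync -avz -e "ssh -p port" /path/to/backup/ user@server:/path/to/dest/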