backing up links locally

UPDATE: 2005.07.06: the following linked article no longer exists, but I use the curl statement below in a cronjob, and that still works great for backing up my bookmarks to a local machine.
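For the curious, the crontab entry is something like the following — the schedule, paths, and the `yourusername` placeholder in the URL are illustrative, so adjust all three for your own setup:

```
# min hr dom mon dow  command -- fetch the bookmarks page nightly at 3:30am
30 3 * * * /usr/bin/curl -s -o $HOME/Library/delicious-bak.html 'http://del.icio.us/html/yourusername/'
```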

Josh has a great post about Automatic backups with iCal and AppleScript (go read that first, so you’ll know what we’re talking about). One thing he asks for is a QuickSilver plug-in to search the resulting XML file, but there isn’t one (yet).

There’s a way around that with del.icio.us’s HTML feed. Here’s how you might go about modding Josh’s script to QuickSilver-ize it:

  • FIRST: you can use curl (which is already installed under 10.3.x) instead of wget, and skip his bits about installing Fink and wget (though both are nice to have). The shell bit to use in place of his wget statement is:
    curl -o ~/Library/delicious-bak.html 'http://del.icio.us/html/yourusername/'
    (substitute your own username in the URL; you can also adjust what HTML del.icio.us actually sends back — see the docs)
  • use QuickSilver to open the resulting delicious-bak.html (or whatever you name it) with Safari.
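Putting the curl step from the list above into a small script (the kind of thing Josh’s AppleScript could call via “do shell script”) might look like the sketch below. The feed URL and username are assumptions on my part, so the actual network call is left commented out; the script date-stamps the filename so older backups don’t get clobbered:

```shell
#!/bin/sh
# Sketch of the backup step. The del.icio.us feed URL and the
# username in it are placeholders -- substitute your own.

# Build a date-stamped destination path so older backups survive.
backup_path() {
    printf '%s/delicious-bak-%s.html' "$1" "$(date +%Y%m%d)"
}

DEST=$(backup_path "$HOME/Library")
# Uncomment and fill in your username to actually fetch:
# curl -s -o "$DEST" 'http://del.icio.us/html/yourusername/'
echo "would save to $DEST"
```

If you keep the date stamp, point QuickSilver at whichever day’s file you want; if you’d rather have a single file to open, drop the `$(date ...)` bit.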

The end result is a big HTML page of all your links that can be searched using the Find command in Safari.

The missing step in my additions to Josh’s great idea of using wget, AppleScript, and iCal is that Safari isn’t scriptable enough to add a bookmark. Otherwise, you could continue the AppleScript and have it cycle through the file, adding each link as a Safari bookmark.

I tried adding another “do shell script” bit to the AppleScript to copy the downloaded file to the Firefox bookmarks.html location, but Firefox didn’t treat it as a “real” bookmarks file; otherwise, that could have been a solution (unless you don’t like/use Firefox).
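For anyone who wants to poke at the same dead end, the copy step I attempted looks roughly like this — the Firefox profile directory name varies per machine, so the path in the usage example is an assumption:

```shell
#!/bin/sh
# Copy a downloaded HTML page over Firefox's bookmarks file.
# Firefox keeps bookmarks.html inside a per-machine profile
# folder, so the path in the usage example is only illustrative.
copy_to_firefox() {
    src="$1"
    profile="$2"
    cp "$src" "$profile/bookmarks.html"
}

# Usage (substitute your own profile directory):
# copy_to_firefox ~/Library/delicious-bak.html \
#   "$HOME/Library/Application Support/Firefox/Profiles/xxxxxxxx.default"
```

As noted above, even with the file in place Firefox didn’t treat it as a real bookmarks file, so consider this a record of the attempt rather than a fix.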

So, my solution is by no means perfect, but it does allow you to browse your links in any browser, and you can at least launch that giant page ’o links via QuickSilver. We’re this close to having it!