Mac OS X hack: download multiple files with curl

wget is an incredibly useful GNU tool on Linux. Unfortunately, it doesn’t ship with OS X (as of Mountain Lion). OS X does include curl, which is a very handy tool in its own right, but it lacks at least one important wget feature: the ability to use wildcards to fetch multiple files at once. For example, say you want to download a subset of files from an FTP server. With wget, you could type:

wget ftp://ftp.dos.state.fl.us/public/doc/cor/0102*.dat

Here is how to mimic that process with curl and a few UNIX command-line tricks. (Update: the site I used in this example five years ago is no longer working).


1. Download the directory listing and save it in a file.

curl -L http://www.wiz-worx.com/iges5x/wysiwyg/igs > index.html

2. Use grep with regular expressions to parse the .html file, extract the .igs file names and save them in a text file.

grep -o 'f[0-9]*x.igs' index.html > iges_file_list.txt

3. Use a bash loop to iterate over the text file and fetch each file with curl.

while read line; do
  curl -O "http://www.wiz-worx.com/iges5x/wysiwyg/igs/$line"
done < iges_file_list.txt
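The three steps above can be collapsed into a single pipeline. This is a dry-run sketch: since the original site is offline, it fabricates a small listing file (`demo_index.html` is made up for the demo) and prints each curl command with `echo` instead of actually fetching.

```shell
# Fabricated stand-in for the downloaded directory listing.
printf 'f123x.igs\nf45x.igs\n' > demo_index.html

# Extract the .igs file names and feed them straight into the loop;
# 'echo' stands in for running curl, so no network is needed.
grep -o 'f[0-9]*x.igs' demo_index.html |
while read f; do
  echo curl -O "http://www.wiz-worx.com/iges5x/wysiwyg/igs/$f"
done
```

Dropping the `echo` (and pointing `grep` at the real `index.html`) turns the dry run into the actual download.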

Another advantage of the curl approach is that it works over HTTP as well: wget only supports wildcard characters with FTP URLs.
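Worth noting: curl does have a limited form of URL "globbing" of its own. It expands bracket ranges like `[1-20]` and brace lists like `{a,b}` client-side, so when file names follow a predictable numeric pattern you can skip the grep-and-loop dance entirely. It can't match arbitrary names the way a shell `*` can, though. The sketch below demonstrates this against local `file://` URLs so it needs no network; the paths under `/tmp/igsdemo` are made up for the demo.

```shell
# Create three sample files to "download".
mkdir -p /tmp/igsdemo && cd /tmp/igsdemo
for i in 1 2 3; do echo "data$i" > "f${i}x.igs"; done

# curl expands the [1-3] range itself; one -O saves each expanded
# URL under its remote file name.
mkdir -p out && cd out
curl -s -O "file:///tmp/igsdemo/f[1-3]x.igs"
ls   # the three fetched files
```

The same pattern works over HTTP or FTP, e.g. `curl -O "http://example.com/igs/f[1-20]x.igs"` (placeholder host).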
