How to download a webpage as a PDF in Terminal


I am trying to find a way to download a URL and convert it to a PDF.

I have found lots of suggestions on how to turn PDFs into HTML or HTML into PDFs, but that's not the same as what I want to do.

Basically I am looking for the same result as if I had opened the page in {insert browser of choice} and selected "Print" and then PDF » Save as PDF…

But I want to be able to do it all from Terminal, automatically, so that I can feed it a handful of URLs at once and have it process them all for me.

I don't need or want recursion to additional pages.

[Note: I have tried wget --page-requisites --convert-links, but it does not always fetch images that are loaded from a remote server. Also, assume for this exercise that the URLs are not behind any sort of login requirement.]
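For reference, the wget invocation was roughly like the following (the URL here is just a placeholder):

wget --page-requisites --convert-links http://example.com/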

Best Answer

Check out wkhtmltopdf.

You can download the binary from that website. Note that you will need to make the file executable by adding the +x permission.
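For example, assuming the binary was saved under its default name in your current directory (adjust the path to wherever you downloaded it):

chmod +x wkhtmltopdf-0.9.9-OS-X.i368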

For usage, check out http://code.google.com/p/wkhtmltopdf/wiki/Usage or execute:

wkhtmltopdf-0.9.9-OS-X.i368 --help
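To handle a handful of URLs in one go, as the question asks, a minimal sketch is a shell loop over the URLs. The URLs and output names below are placeholders, and the binary is assumed to sit in the current working directory:

# Convert each URL to its own numbered PDF in the current directory.
i=1
for url in http://example.com/ http://example.org/; do
    ./wkhtmltopdf-0.9.9-OS-X.i368 "$url" "page-$i.pdf"
    i=$((i + 1))
done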