Adobe Acrobat Pro allows the user to convert an entire website (or subset) to a PDF. Is there anything else available for OS X to do the same thing? I don't want to spend several hundred dollars.
Ideally, it would allow the user to rip a URL and all other URLs on the same "prefix" to a single PDF. For example, given the URL "http://example.com/a_web_page", it would pull "http://example.com/a_web_page/index.html", "http://example.com/a_web_page/a", "http://example.com/a_web_page/b", etc., but not "http://example.com/index.html" to a single PDF.
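To make the intended rule concrete, here is a minimal sketch of that prefix test in Python. The helper name `under_prefix` is hypothetical, not part of any tool mentioned here; it just shows which URLs a ripper would keep.

```python
from urllib.parse import urlparse

def under_prefix(url: str, prefix: str) -> bool:
    """Return True if `url` lives at or below `prefix` on the same host."""
    u, p = urlparse(url), urlparse(prefix)
    if u.scheme != p.scheme or u.netloc != p.netloc:
        return False
    # Match on a path-segment boundary so "/a_web_page" does not
    # accidentally match "/a_web_pages".
    base = p.path.rstrip("/")
    return u.path == base or u.path.startswith(base + "/")

prefix = "http://example.com/a_web_page"
print(under_prefix("http://example.com/a_web_page/index.html", prefix))  # True
print(under_prefix("http://example.com/a_web_page/a", prefix))           # True
print(under_prefix("http://example.com/index.html", prefix))             # False
```

A crawler using this filter would fetch only pages under the given prefix, which is exactly the "same prefix to a single PDF" behavior described above.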
I have found whole books released under Creative Commons licenses as HTML pages (one per chapter, or section). I would like to capture the entire book to a single PDF to read on my iPad.
Thanks.
Best Answer
It doesn’t do “whole websites” because it would be hard for it to know how deep to go (though I agree it could be configured to descend a fixed number of levels). In any case, if each HTML page contains a full chapter and the books run to about 20–30 chapters, that’s not too bad.
With that in mind, there is an inexpensive application that does this job, called Web Snapper. It should fit your budget: the price tag is $15.