Download all source files for a webpage

I want to download the source files for a webpage that is a database search engine. Using curl I'm only able to download the main HTML page. I would also like to download all the JavaScript, CSS, and PHP files that are linked from the main HTML page. Is this possible using curl/wget or some other utility?

Tags: command-line, curl, web, wget
Best Answer
First of all, you should check with the website operator that this is an acceptable use of their service. After that, you can do something like this:
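A minimal sketch of the wget invocation described below (the URL is a placeholder for the page you want to mirror):

```shell
# -p (--page-requisites): also fetch the files needed to display the page
#    (linked CSS, JavaScript, images, etc.)
# -k (--convert-links): rewrite links in the downloaded files so they
#    point at the local copies, making the page viewable offline
wget -p -k https://example.com/somepage.html
```

Note that PHP files cannot be downloaded as source this way: PHP runs on the server, so wget (or curl) only ever receives the HTML output the script produces.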
-p gets the requisites to view the page (the JavaScript, CSS, etc.).
-k converts the links on the page to ones that can be used for local viewing.

From man wget: