curl //website//
will get me the source code, but from there, how would I filter out every unique path and obtain the number of them?
the question:
Use cURL from your machine to obtain the source code of the "https://www.inlanefreight.com" website and filter all unique paths of that domain. Submit the number of these paths as the answer.
From the question, I do not know the exact meaning of "unique paths", but I think it means something similar to what you get from executing
$ wget -p
I used this method, and it somehow worked:
wget --spider --recursive https://www.inlanefreight.com
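If you go the wget route, you don't have to add the counts up by hand: since `wget --spider` writes its report to stderr, you can pipe `2>&1` through grep/sort/wc to tally the unique URLs directly. A minimal sketch of that counting step, using a few made-up log lines in `spider.log` as a stand-in for the real wget output:

```shell
# Against the live site, the counting pipeline would look like:
#   wget --spider --recursive https://www.inlanefreight.com 2>&1 \
#     | grep -oE 'https://www\.inlanefreight\.com[^" ]*' | sort -u | wc -l
# The file name and sample lines below are illustrative stand-ins only.
cat > spider.log <<'EOF'
https://www.inlanefreight.com/index.php/about/
https://www.inlanefreight.com/index.php/about/
https://www.inlanefreight.com/wp-content/themes/ben_theme/css/grabbing.png
EOF

# Extract every inlanefreight.com URL, de-duplicate, count.
grep -oE 'https://www\.inlanefreight\.com[^" ]*' spider.log | sort -u | wc -l   # → 2
```

The duplicate `/index.php/about/` line collapses under `sort -u`, so the count reflects unique paths rather than raw hits.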
This will show
Found 10 broken links.
https://www.inlanefreight.com/wp-content/themes/ben_theme/fonts/glyphicons-halflings-regular.svg
https://www.inlanefreight.com/wp-content/themes/ben_theme/fonts/glyphicons-halflings-regular.eot
https://www.inlanefreight.com/wp-content/themes/ben_theme/images/testimonial-back.jpg
https://www.inlanefreight.com/wp-content/themes/ben_theme/css/grabbing.png
https://www.inlanefreight.com/wp-content/themes/ben_theme/fonts/glyphicons-halflings-regular.woff
https://www.inlanefreight.com/wp-content/themes/ben_theme/fonts/glyphicons-halflings-regular.woff2
https://www.inlanefreight.com/wp-content/themes/ben_theme/images/subscriber-back.jpg
https://www.inlanefreight.com/wp-content/themes/ben_theme/fonts/glyphicons-halflings-regular.eot?
https://www.inlanefreight.com/wp-content/themes/ben_theme/images/fun-back.jpg
https://www.inlanefreight.com/wp-content/themes/ben_theme/fonts/glyphicons-halflings-regular.ttf
FINISHED --2020-12-06 05:34:58--
Total wall clock time: 2.5s
Downloaded: 23 files, 794K in 0.1s (5.36 MB/s)
at the bottom. Assuming the 23 downloaded files and the 10 broken links together make up the set of unique paths, I got 33, and it was the correct answer.
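Since the question specifically says to use cURL, a more direct approach is to fetch the page source once and extract the domain's URLs from it. A hedged sketch of that pipeline — the regex and the small inline HTML sample below are my own stand-ins, not the site's actual source:

```shell
# Against the live site, the curl-only version would be:
#   curl -s https://www.inlanefreight.com \
#     | grep -oE 'https://www\.inlanefreight\.com[^" ]*' | sort -u | wc -l
# Here, page.html is a tiny made-up sample standing in for the real source.
cat > page.html <<'EOF'
<link href="https://www.inlanefreight.com/wp-content/themes/ben_theme/css/grabbing.png">
<a href="https://www.inlanefreight.com/index.php/about/">About</a>
<a href="https://www.inlanefreight.com/index.php/about/">About again</a>
<img src="https://www.inlanefreight.com/images/fun-back.jpg">
EOF

# -o prints only the matched URLs; sort -u drops duplicates; wc -l counts.
grep -oE 'https://www\.inlanefreight\.com[^" ]*' page.html | sort -u | wc -l   # → 3
```

`curl -s` only grabs the single page rather than crawling recursively like `wget --spider -r`, so the two methods can legitimately disagree on the count; the curl pipeline reflects what one page's source references.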