Curl get all links of a web-page

curllinks

I used the following command to get all links of a web page and then grep for what I wanted:

curl $URL 2>&1 | grep -o -E 'href="([^"#]+)"' | cut -d'"' -f2 | egrep $CMP-[0-9].[0-9].[0-9]$ | cut -d'-' -f3

It worked fine until yesterday. Then I ran curl by itself and saw that it only returns:

% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                               Dload  Upload   Total   Spent    Left  Speed
0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0

Was there any update that could cause the command to stop working, or what is going on?

EDIT 1:

I switched my approach to wget, following this answer:

wget -q $URL -O - | grep -o -E 'href="([^"#]+)"' | cut -d'"' -f2 | egrep $CMP-[0-9].[0-9].[0-9]$ | cut -d'-' -f3

But I still don't know why the curl approach suddenly stopped working.
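As an aside, the unquoted egrep pattern $CMP-[0-9].[0-9].[0-9]$ is looser than it looks: each unescaped . matches any character. Quoting the pattern and escaping the dots makes the version filter strict. A small offline sketch (CMP here is just a placeholder component name):

```shell
# Escaped dots match literal dots only; CMP is a stand-in
# for the real component name.
CMP=CMP
matched=$(printf '%s\n' "$CMP-1.2.3" "$CMP-1x2y3" \
  | grep -E "$CMP-[0-9]\.[0-9]\.[0-9]$")
echo "$matched"   # only the dotted version survives
```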

Best Answer

You can use the -s argument with curl; it enables silent mode, so curl shows neither the progress meter nor error messages. Note that curl writes the progress meter to stderr, and the 2>&1 in your original command redirected it into the pipeline, which is why it showed up mixed into the output.
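A minimal offline sketch of the fix (the file:// URL and the sample HTML below are stand-ins for the real $URL):

```shell
# -s keeps curl silent: no progress meter or error messages are
# printed, so only the page body reaches the pipeline.
tmp=$(mktemp)
cat > "$tmp" <<'EOF'
<a href="pkg-1.2.3">one</a>
<a href="pkg-4.5.6">two</a>
EOF
links=$(curl -s "file://$tmp" | grep -o -E 'href="([^"#]+)"' | cut -d'"' -f2)
rm -f "$tmp"
echo "$links"
```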
