Network traffic capturing is possible but messy. There are many applications running on my computer that communicate over HTTP, and they would fill the log with their automated API traffic, which wouldn't reflect what I was actually visiting.
And as you correctly say, it won't show you HTTPS traffic: the URL is part of the encrypted request.
I would target the browsers directly. They keep a decent enough history in a SQLite3 database, which makes querying them pretty simple once you have the sqlite3 package installed (sudo apt-get install sqlite3). You can simply run:
sqlite3 ~/.mozilla/firefox/*.default/places.sqlite "select url from moz_places order by last_visit_date desc limit 10;"
And that will output the last 10 URLs you visited.
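If you want readable timestamps alongside the URLs, Firefox stores last_visit_date as microseconds since the Unix epoch, so sqlite's datetime() can convert it. A sketch below runs against a throwaway database with a minimal stand-in for the moz_places table (the real one has many more columns); against the real file, point sqlite3 at ~/.mozilla/firefox/*.default/places.sqlite instead.

```shell
# Throwaway stand-in for places.sqlite; the real moz_places has more columns.
db=$(mktemp)
sqlite3 "$db" "create table moz_places (url text, last_visit_date integer);"
# last_visit_date is microseconds since the Unix epoch.
sqlite3 "$db" "insert into moz_places values ('https://example.com/', 0);"
# datetime(..., 'unixepoch') turns seconds into a human-readable timestamp.
sqlite3 "$db" "select url, datetime(last_visit_date/1000000, 'unixepoch')
               from moz_places order by last_visit_date desc limit 10;"
rm "$db"
```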
Chrome has a similar setup and can be queried equally simply:
sqlite3 ~/.config/google-chrome/Default/History "select url from urls order by last_visit_time desc limit 10;"
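One gotcha if you want dates out of Chrome: last_visit_time is not Unix time but microseconds since 1601-01-01 (the WebKit epoch), so you subtract the 11644473600-second offset before converting. Another throwaway-database sketch with a minimal stand-in for the urls table:

```shell
# Throwaway stand-in for Chrome's History DB; the real urls table has more columns.
db=$(mktemp)
sqlite3 "$db" "create table urls (url text, last_visit_time integer);"
# Chrome counts microseconds from 1601-01-01; 11644473600 s is the gap
# between that and the Unix epoch, so this value maps to 1970-01-01.
sqlite3 "$db" "insert into urls values ('https://example.com/', 11644473600000000);"
sqlite3 "$db" "select url, datetime(last_visit_time/1000000 - 11644473600, 'unixepoch')
               from urls order by last_visit_time desc limit 10;"
rm "$db"
```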
This works, but I hit database-locking issues with Chrome; it seems much more reliable with Firefox. The only way around the lock I found was to make a copy of the database. That works even while the main DB is locked and shouldn't cause issues:
cp ~/.config/google-chrome/Default/History history.tmp
sqlite3 history.tmp "select url from urls order by last_visit_time desc limit 10;"
rm history.tmp
This approach might be advisable for Firefox too. Firefox doesn't appear to lock (or takes shorter locks), but I'm not sure what would happen if you caught it mid-write.
To turn this into a live display, you would either need to poll (it's not an expensive SQL query, so that might be fine) or use something like inotifywait to monitor the database file for changes.