Bash – How to Get Milliseconds Since Unix Epoch

bash, timestamps

I want to write a bash script that measures the launch time of a browser. For that I am using an HTML page that records an onload timestamp in milliseconds using JavaScript.

In the shell script, just before I call the browser, I get the timestamp with:

date +%s

The problem is that this gives the timestamp in seconds, and I need it in milliseconds: on a second run the browser sometimes starts in under a second, and I need millisecond precision to measure that accurately.

How can I get a timestamp in milliseconds from a bash script?

Best Answer

date +%s.%N will give you, e.g., 1364391019.877418748. The %N is the number of nanoseconds elapsed in the current second. Notice that it is 9 digits, and by default date pads it with leading zeros if the value is less than 100000000. This is a problem if we want to do math with the number, because bash treats numbers with a leading zero as octal. The padding can be disabled by using a hyphen in the field spec, so:

echo $(( $(date +%s) * 1000 + $(date +%-N) / 1000000 ))

would naively give you milliseconds since the epoch.
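To see why the hyphen matters, here is a quick illustration (the literal values below are made up for the example): a zero-padded %N is parsed as octal inside bash arithmetic, which either gives the wrong result or fails outright when the digits include an 8 or 9.

# A padded nanosecond field such as 047213000 is read as octal inside $(( ))
echo $(( 047213000 / 1000000 ))   # prints 10 (octal arithmetic), not 47
# A padded field containing an 8 or 9 is not even valid octal
echo $(( 098000000 / 1000000 ))   # error: value too great for base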

However, as Stephane Chazelas points out in a comment below, those are two separate date calls, which yield two slightly different times. If the second has rolled over between them, the calculation will be an entire second off. So:

echo $(($(date +'%s * 1000 + %-N / 1000000')))

Or optimized (thanks to comments below, though this should have been obvious):

echo $(( $(date '+%s%N') / 1000000));
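For the original use case, a minimal sketch of timing something in milliseconds with the single-call form might look like this (the sleep is just a stand-in for the browser launch being measured):

start=$(( $(date '+%s%N') / 1000000 ))
sleep 0.3   # stand-in for launching the browser
end=$(( $(date '+%s%N') / 1000000 ))
echo "elapsed: $(( end - start )) ms"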