I get wildly different real times when I run the following command:

dd if=/dev/random bs=1k count=1

It doesn't happen for if=/dev/null, nor for if=/dev/urandom.
I've run it 500 times. Here are the general stats (per call). The times are in seconds.
Minimum   Maximum   Average   Median
00.002    89.999    4.50402   2.275
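For reference, per-call stats like these can be computed from a plain column of seconds with a short sort/awk pipeline. This is a sketch, not the actual analysis used above; it assumes one value per line on stdin, and the sample values here are illustrative:

```shell
# Compute min/max/mean/median of a numeric column.
# sort -n orders the values so a[1] is the min, a[NR] the max,
# and the median can be read from the middle of the array.
printf '%s\n' 0.002 89.999 2.275 1.5 | sort -n | awk '
  { a[NR] = $1; sum += $1 }
  END {
    printf "min=%s max=%s avg=%.3f ", a[1], a[NR], sum/NR
    if (NR % 2) printf "median=%s\n", a[(NR+1)/2]
    else        printf "median=%.4f\n", (a[NR/2] + a[NR/2+1]) / 2
  }'
# → min=0.002 max=89.999 avg=23.444 median=1.8875
```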
Does anyone have any suggestions about why this may be happening?
The system is Ubuntu 10.04 desktop; the Bash version is 4.1.5(1).
The same wild fluctuations also show up in a VirtualBox VM running the same version of Ubuntu.
Here is the actual test code:
cp /dev/null "$HOME/dd-random.secs"
for ((i=1;i<=500;i++)); do
    # Zero-pad the iteration number to three digits
    if   ((i<10))  ;then zi="00$i"
    elif ((i<100)) ;then zi="0$i"
    else                 zi="$i"
    fi
    echo -ne "$zi\t" >>"$HOME/dd-random.secs"
    # Discard dd's own stdout/stderr so only the timing is logged
    exec 3>/dev/null 4>/dev/null
    { time { dd if=/dev/random bs=1k count=1; } 1>&3 2>&4; } 2>&1 |tail -n 3 |tr 'm\n' '\t' |sed -re "s/([0-9])s/\1/g" >>"$HOME/dd-random.secs"
    exec 3>&- 4>&-
    echo >>"$HOME/dd-random.secs"
done
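Incidentally, a quick way to check whether the entropy pool is running dry while the test runs is to read the kernel's current entropy estimate. This is a Linux-specific path and the value (in bits) varies by kernel version; a minimal sketch:

```shell
# Print the kernel's current entropy estimate, in bits.
# On older kernels this can drop near zero, which is when
# reads from /dev/random block.
cat /proc/sys/kernel/random/entropy_avail
```

Watching this value alongside the dd loop (e.g. with `watch`) would show whether the long real times line up with a depleted pool.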
Best Answer
That's exactly the difference between /dev/random and /dev/urandom. random uses the entropy pool, which gathers noise from a bunch of sources and keeps track of "how much" noise is currently in the pool, so random knows how much high-quality randomness it can generate. Since the entropy pool has a finite amount of noise, reading from random might need to block if there isn't enough entropy available. urandom never blocks, but you might get "less random" data from it.

From the random(4) man page: