Thursday, February 19, 2009

Still bitching at /dev/random's slowness

Pondering,

So I believe I have the solution to the /dev/random-being-so-goddamn-slow issue. It's all still totally theoretical of course, but I reckon just thinking about it intensely for a few more days will firm it up.

The Problem (re-iterated maybe)
/dev/random = dog slow. In fact it's so slow that it's not usable at all for this, and I need a truly good source of random data. No mathematical algorithm can create that on its own; it has to be gathered.

The Proposed Solution
Quite simply put, I'm going to run a sniffer on my wireless NIC and dump all the data as-is to a file, maybe even use a few wireless cards and some physical NICs and mux the data all together. Then I'll write a very simple C++ program which grabs an integer block-size value from /dev/urandom (not true randomness, but enough for this purpose), reads that many bytes from the sniffed wireless data, and hashes the block into a SHA-1 fingerprint. The fingerprint should be effectively unique, and after converting the SHA-1 digest to a binary string you should be pretty good to go.
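To make the idea concrete, here's a minimal C++ sketch of that program. It assumes OpenSSL for the SHA-1 (link with -lcrypto) and a capture file called wlan_dump.bin; the file name and the 16-bit block size are just placeholders for illustration, not a final design.

// Minimal sketch of the idea: pick a block size from /dev/urandom,
// read that many bytes from the sniffed-traffic dump, and emit the
// SHA-1 of the block as 20 bytes of "random" output.
// Assumes OpenSSL (-lcrypto) and a capture file named wlan_dump.bin.

#include <fstream>
#include <iostream>
#include <vector>
#include <cstdint>
#include <openssl/sha.h>

int main()
{
    // Grab a 16-bit value from /dev/urandom to use as the block size.
    std::ifstream urandom("/dev/urandom", std::ios::binary);
    std::uint16_t blockSize = 0;
    urandom.read(reinterpret_cast<char*>(&blockSize), sizeof(blockSize));
    if (blockSize == 0)
        blockSize = 512;                    // avoid a zero-length block

    // Read that many bytes from the sniffed data dump.
    std::ifstream dump("wlan_dump.bin", std::ios::binary);
    if (!dump) {
        std::cerr << "cannot open capture file\n";
        return 1;
    }
    std::vector<unsigned char> block(blockSize);
    dump.read(reinterpret_cast<char*>(block.data()),
              static_cast<std::streamsize>(block.size()));
    std::streamsize got = dump.gcount();

    // Fingerprint the block; the 20-byte digest is the random output.
    unsigned char digest[SHA_DIGEST_LENGTH];
    SHA1(block.data(), static_cast<size_t>(got), digest);

    std::cout.write(reinterpret_cast<const char*>(digest), SHA_DIGEST_LENGTH);
    return 0;
}

Each run spits out 20 bytes; chaining runs over successive chunks of the capture (and maybe feeding the previous digest back into the next block) would be the obvious next step for producing a longer stream.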

This would be a fast way to get good-quality random data. A few things spring to mind that need to be taken into account, though.

Firstly, the link layer and other common elements must be stripped out of the data feed, so essentially only the packet payloads should be dumped. There would still be common elements in this data, but I think the SHA-1 would take care of any repeated patterns.
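tcpdump itself writes whole frames to its capture file, so the stripping would probably be a small post-processing step. Here's a rough libpcap sketch of that idea; it assumes an Ethernet-framed capture named capture.pcap and a fixed 14-byte header, whereas a real 802.11/radiotap capture has variable-length headers that would need proper parsing.

// Rough sketch of stripping link-layer headers from a tcpdump capture
// so only the bytes past the Ethernet header get fed into the hashing
// step above. Assumes libpcap (-lpcap) and a file named capture.pcap.

#include <cstdio>
#include <pcap/pcap.h>

int main()
{
    char errbuf[PCAP_ERRBUF_SIZE];
    pcap_t *pc = pcap_open_offline("capture.pcap", errbuf);
    if (!pc) {
        std::fprintf(stderr, "pcap: %s\n", errbuf);
        return 1;
    }

    // Only handle plain Ethernet here; bail out on anything else.
    if (pcap_datalink(pc) != DLT_EN10MB) {
        std::fprintf(stderr, "unexpected link type, skipping\n");
        return 1;
    }

    std::FILE *out = std::fopen("wlan_dump.bin", "wb");
    if (!out) {
        std::fprintf(stderr, "cannot open output file\n");
        return 1;
    }

    const unsigned int kEthHeader = 14;
    struct pcap_pkthdr *hdr;
    const u_char *pkt;

    // Walk every captured frame and dump everything past the
    // Ethernet header into the raw data file.
    while (pcap_next_ex(pc, &hdr, &pkt) == 1) {
        if (hdr->caplen > kEthHeader)
            std::fwrite(pkt + kEthHeader, 1, hdr->caplen - kEthHeader, out);
    }

    std::fclose(out);
    pcap_close(pc);
    return 0;
}

Stripping the IP and TCP/UDP headers as well would take a bit of protocol parsing on top of this, but would leave even less repeated structure in the feed.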

Secondly, you need a noisy wireless network to run this against, which is okay for me since I have at least 17 in listening range, but most home networks are not going to be that busy. I guess you could use just about any data source provided it's not something common: maybe that family video you had put on DVD, or your iPhoto library!

I'll spend some time tinkering in C++ and see if I can come up with a working model, and post it here.
