projects:wifi_scanner [2019/11/30 13:15] neil
projects:wifi_scanner [2020/01/06 23:37] neil
?>
</code>

===== Importing the data =====
The raw tcpdump logs are large and full of redundant information - around a month of wifi scanning produces roughly 128 million lines of data (18.6 GB).

I run the following command to reduce the logs to pairs of the datetime (in YYYY-MM-DD HH:MM format - I strip off the seconds) and the MAC address (see below for the PHP code):

<code bash>
php trim.php tcpdump.log > trimmed_tcpdump.log
</code>
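
Since the trim.php source isn't listed yet, here is a rough sketch of an equivalent pass in awk. It assumes the capture was made with tcpdump's ''-tttt'' option (so every line starts with a full "YYYY-MM-DD HH:MM:SS.ffffff" timestamp) and that the source MAC appears as an ''SA:'' field - both are assumptions about the log format, not details confirmed above:

<code bash>
# Sketch only - assumes lines like:
#   2019-11-30 13:15:02.123456 ... SA:aa:bb:cc:dd:ee:ff ...
# Emits "YYYY-MM-DD HH:MM<TAB>mac", deduplicated.
awk '{
  pos = index($0, "SA:")           # locate the source-address field
  if (pos > 0) {
    mac = substr($0, pos + 3, 17)  # 17 chars: aa:bb:cc:dd:ee:ff
    print $1 " " substr($2, 1, 5) "\t" mac
  }
}' tcpdump.log | sort -u > trimmed_tcpdump.log
</code>

The tab between the two fields matches the default field separator that ''load data infile'' expects, and ''sort -u'' discards the duplicate rows that the unique key below would otherwise reject.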

On my laptop, this processes the log files at around 300k lines/second - so the full set takes around 8 minutes. The resulting import file is reduced to approximately x million lines.

I created a simple MySQL table to store the timestamp and MAC address, with a unique key so the same MAC seen in the same minute is only stored once:
<code sql>
create table wifi_data (seen_time datetime, mac varchar(17), unique (seen_time,mac));
</code>

Then I import the file directly into the MySQL database using the mysql client:
<code sql>
load data infile 'trimmed_tcpdump.log' into table wifi_data;
</code>

If you have any trouble with this command, you might want to split the file into more manageable parts using ''split -l 1000000 trimmed_tcpdump.log'' and load each part separately.
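
The split-and-load loop might look like the following sketch - the database name ''wifi'' and the ''chunk_'' prefix are placeholders of mine, and ''local'' is used so the client sends each file rather than requiring it to sit on the server (this needs ''local_infile'' enabled on both ends):

<code bash>
# Split into 1M-line chunks named chunk_aa, chunk_ab, ...
split -l 1000000 trimmed_tcpdump.log chunk_
# Load each chunk in turn; "wifi" is a placeholder database name.
for f in chunk_*; do
  mysql wifi -e "load data local infile '$f' into table wifi_data"
done
</code>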

==== trim.php ====
TBC
===== Analysing the data =====