===== Importing the data =====
The raw tcpdump logs are pretty large and full of redundant information - for around a month of wifi scanning that comes to around 128 million lines of data (18.6GB).

I run the following code to simplify the logs to just pairs of the datetime (in YYYY-MM-DD HH:MM - I strip off the seconds) and the mac address (see below for the php code):
<code bash>
</code>
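As a rough sketch of what this trimming step does (an assumption for illustration, not the exact command from this page: it supposes ''tcpdump'' was run with ''-tttt'' so each log line begins with ''YYYY-MM-DD HH:MM:SS.ffffff'' followed by the mac address), the pair extraction and deduplication could look like:

<code bash>
# Sketch only: assumes each input line starts with
# "YYYY-MM-DD HH:MM:SS.ffffff <mac>" - real tcpdump output may differ.
trim_wifi_log() {
    # substr(..., 1, 16) keeps "YYYY-MM-DD HH:MM", stripping the seconds;
    # sort -u then drops duplicate (minute, mac) pairs, which is what
    # shrinks the file so much.
    awk '{ print substr($1 " " $2, 1, 16) "\t" $3 }' | sort -u
}
</code>

Usage would be something like ''trim_wifi_log < tcpdump.log > trimmed_tcpdump.log''. The tab separator is deliberate: ''load data infile'' expects tab-separated columns by default, so the trimmed file imports without any extra options.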
On my laptop, this processes the log files at around 300k lines/second - so the whole month takes around 8 minutes. The resulting import file is reduced to approximately 20 million lines.
+ | |||
+ | I created a simple mysql table to store the timestamp and mac address: | ||
+ | <code sql> | ||
+ | create table wifi_data (seen_time datetime, mac varchar(17), unique (seen_time,mac)); | ||
+ | </code> | ||
+ | |||
+ | Then I import this directly to the mysql database using the mysql client: | ||
<code sql>
load data infile 'trimmed_tcpdump.log' into table wifi_data;
</code>
If you have any trouble with this command, you might want to split the file into more manageable parts using ''split -l 1000000 trimmed_tcpdump.log''
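As a sketch of that chunked approach (the chunk prefix and helper name are made up for illustration), you could split the file and generate one ''load data'' statement per chunk, then pipe the statements into the mysql client:

<code bash>
# Sketch only: split the trimmed log into 1,000,000-line chunks named
# part_aa, part_ab, ... and print one "load data infile" statement per
# chunk. Pipe the output into the mysql client to run the loads.
make_load_script() {
    split -l 1000000 trimmed_tcpdump.log part_
    for f in part_*; do
        printf "load data infile '%s/%s' into table wifi_data;\n" "$PWD" "$f"
    done
}
</code>

For example ''make_load_script | mysql your_database'' (database name assumed). Note that depending on the server's ''secure_file_priv'' setting you may need ''load data local infile'' instead, so the client reads the files rather than the server.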
==== trim.php ====