projects:load_logger, revised 2020/04/19 21:13 (neil), 2020/11/18 06:17 (current, admin)
**Retired project**.  This had its uses on some of my more esoteric systems (some of which only had busybox, for example), but I've since moved to an infrastructure that supports [[https://collectd.org/|collectd]] across the board.

====== Load logger ======
A simple bash script to log the datetime and system load to a file every minute.  I wanted something that didn't rely on having any other programs installed (though it does rely on the /proc/ filesystem and some basic utilities such as free, ps, awk and bc).

It saves data in the following format: Server ID, Date/Time, CPU usage (%), CPU max, Memory Total (MB), Memory Used (MB), Memory Free (MB), Memory Shared (MB), Memory Buffered/Cached (MB), Memory Available (MB), 1 minute load average, 5 minute load average, 15 minute load average, process count, network usage (one column per interface in the format [interface name, received bytes, transmitted bytes])
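The original save_load.sh is not reproduced on this page, but a minimal sketch of a logger along these lines might look like the following.  This is my own illustration, not the original script: it assumes a Linux /proc filesystem, emits only a subset of the columns above (no CPU usage or memory used/shared/buffered figures), and the server ID is a placeholder.

<code bash>
#!/bin/bash
# Hypothetical sketch only, not the original save_load.sh.
# Assumes a Linux /proc filesystem; emits a subset of the columns described above.

SERVER_ID="SERVER1"                 # placeholder server ID
NOW=$(date '+%Y-%m-%d %H:%M:%S')

# /proc/loadavg looks like "0.28 0.24 0.26 2/798 12345":
# three load averages, running/total processes, last PID
read -r LOAD1 LOAD5 LOAD15 PROCS _ < /proc/loadavg
PROC_COUNT=${PROCS#*/}              # total process count is the part after the slash

# Memory totals in MB; /proc/meminfo reports kB
MEM=$(awk '/^MemTotal:/ {t=$2} /^MemFree:/ {f=$2} /^MemAvailable:/ {a=$2}
           END {printf "%d,%d,%d", t/1024, f/1024, a/1024}' /proc/meminfo)

# One column per interface as "name: rx_bytes tx_bytes", from /proc/net/dev
# (after the two header lines, tx bytes is the 10th field once the name is split off)
NET=$(awk -F'[: ]+' 'NR > 2 {sub(/^ +/, ""); printf ",%s: %s %s", $1, $2, $10}' /proc/net/dev)

OUTPUT="$SERVER_ID,$NOW,$MEM,$LOAD1,$LOAD5,$LOAD15,$PROC_COUNT$NET"
echo "$OUTPUT"
</code>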
Sample output:
<code>
SERVER1,2020-04-19 17:42:13,17.1,400,15929,3401,6050,545,6477,11664,0.28,0.24,0.26,798,eno1: 0 0,lo: 43147702 0,pan1: 0 0,wlp2s0: 29433068499 0
SERVER1,2020-04-19 17:42:41,17.1,400,15929,3404,6042,551,6483,11656,0.17,0.21,0.25,795,eno1: 0 0,lo: 43148718 0,pan1: 0 0,wlp2s0: 29433079099 0
</code>
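Because the log is plain CSV, standard tools can summarize it.  For example, a quick awk pass (column numbers follow the format listed earlier; the sample rows are the two shown above):

<code bash>
# Build a small sample log from the two rows shown above
cat > load.log <<'EOF'
SERVER1,2020-04-19 17:42:13,17.1,400,15929,3401,6050,545,6477,11664,0.28,0.24,0.26,798,eno1: 0 0
SERVER1,2020-04-19 17:42:41,17.1,400,15929,3404,6042,551,6483,11656,0.17,0.21,0.25,795,eno1: 0 0
EOF

# Column 11 is the 1 minute load average, column 14 the process count
awk -F, '{ load += $11; if ($14 > procs) procs = $14; n++ }
         END { if (n) printf "samples=%d  avg 1-min load=%.3f  max processes=%d\n", n, load/n, procs }' load.log
</code>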

You can log this to a file using cron:
<code bash>
* * * * * /home/seven/bin/save_load.sh >> /home/seven/data/load.log
</code>
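One row a minute grows the file indefinitely, so it is worth rotating it.  A logrotate snippet could look like this (the path matches the cron line above; the monthly/keep-12 policy is just a suggestion):

<code>
/home/seven/data/load.log {
    monthly
    rotate 12
    compress
    missingok
    notifempty
}
</code>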
===== Central logging =====
Alternatively, you can send this information to a central location.  I wrote a quick script to save it to a database, then used curl to POST it to the server by swapping the echo line in the script above for something like this:
<code bash>
curl --request POST "https://myserver/" --data-urlencode "data=$OUTPUT"
</code>
| - | |||
| - | Then use cron to have it run/POST every minute. | ||