Log analysis involves investigating activity stored in data logs by parsing, sorting, searching, correlating, and grouping data into meaningful results. Cybersecurity analysts mostly use automated tools to process massive system logs that can contain structured and unstructured data (e.g., server logs, network logs, access control logs, error logs, application logs, etc.).

Recommended Tools:

1) Linux Commands – Quickly analyze, scan, and search logs using combinations of Linux commands linked together with the pipe '|' operator (e.g., find, grep, cmp, uniq, wc, more, less, touch, awk, gawk, etc.). See 'Related Linux Commands' below and the pipeline sketch after this list.

2) Excel (spreadsheet option for smaller files) – Import the log into Excel using the Text Import Wizard, which splits rows into columns on a delimiter such as spaces. Then use Excel filters and pivot tables.
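
With option 1, several small commands pipe together into a single analysis step. As a minimal sketch, the pipeline below counts the most frequent client IP addresses in a hypothetical access log (the filename access.log and the assumption that the IP is the 1st space-separated field are illustrative only; adjust to your log's layout):

  $ # Pull the 1st field, count duplicates, list the 5 most frequent IPs
  $ awk '{print $1}' access.log | sort | uniq -c | sort -nr | head -n 5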

Example Steps (using Linux commands):

Analyze a file (example1.log), remove an unneeded column, search for rows containing the word 'key_word', and create an output file (example2.out). A consolidated sketch follows the steps below.

  1. Identify log type (e.g., error log, access log, server log, network log, etc.)
  2. Peek at a few rows in the file to see the data layout ( $ head example1.log and $ tail example1.log )
  3. Identify log size ( $ wc -l example1.log )
  4. Reduce the noise – remove the unneeded 3rd column from a working copy (e.g., $ cut -d',' --complement -f3 example1.log > example2.log ) ; -d',' = comma delimiter; --complement -f3 = drop the 3rd column (GNU cut)
  5. Search for desired key words and store the output in a separate file ( $ grep 'key_word' example2.log > example2.out )
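
Put together, steps 2–5 look like the sketch below (assuming example1.log is comma-delimited and GNU cut, which provides the --complement flag):

  $ head example1.log ; tail example1.log                    # step 2: peek at the layout
  $ wc -l example1.log                                       # step 3: count the lines
  $ cut -d',' --complement -f3 example1.log > example2.log   # step 4: drop the 3rd column
  $ grep 'key_word' example2.log > example2.out              # step 5: keep matching rows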

Other Options:

  • Programming Option – See Python references to write custom code.
  • Tools – Splunk, Hadoop, etc.
  • EPOCH Time Converter – translate Unix EPOCH timestamps into human-readable dates (see the date sketch after this list).
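
As an alternative to a web-based converter, GNU date (standard on Linux) translates EPOCH timestamps directly; the timestamp below is an arbitrary example:

  $ date -d @1500000000     # epoch seconds to local time (GNU date; BSD/macOS uses date -r)
  $ date -u -d @1500000000  # the same instant in UTC
  $ date +%s                # current time as epoch seconds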

Related Linux Commands

The following Linux commands will be useful during a log analysis challenge:

  • head or tail – show a specified number of lines from the start or end of a file (default = 10) (e.g., $ tail -n 3 filename )
  • less – page through a file one screen at a time
  • wc – word count (bytes, characters, words, or lines) (e.g., $ wc -l filename ) ; -l = count lines
  • grep – search for a pattern (e.g., $ grep -i "security" filename ) ; -i = case-insensitive
  • pipe – combine commands with "|" (e.g., $ grep -i "security" filename | wc -l )
  • tr – translate characters (e.g., $ grep "20 Jan 2017" filename | tr ',' '\t' ) ; replaces commas with tabs (denoted '\t')
  • sort – sort on a column (e.g., $ sort -nr -t$'\t' -k8 filename ) ; -nr = numeric sort in reverse order; -t$'\t' = tab delimiter; -k8 = 8th column
  • cmp – compare two files byte by byte (e.g., $ cmp -l cattos.jpg kitters.jpg | gawk '{printf "%c", strtonum(0$2)}' && echo ) ; cmp -l lists each differing byte in octal, and the gawk step prints the first file's byte as a character (the leading 0 makes strtonum read octal)
  • sed – delete or select specific lines (e.g., $ sed '1 d' filename > output.txt ) ; '1 d' = delete the first line
  • cut – select or remove columns (e.g., $ cut -d',' -f3 filename > output.txt keeps only the 3rd column; add --complement to remove it instead) ; -d',' = comma delimiter; -f3 = 3rd column
  • uniq – report or count unique lines in sorted input (e.g., $ sort filename | uniq -c > authors-sorted.txt ) ; -c = prefix each line with its occurrence count
  • awk – pattern-scanning and text-processing language (e.g., $ awk -F "\t" '{print $3 "  " $NF}' jan20only.tsv ) ; -F "\t" = tab-separated input; the braces print the 3rd column and the last column ($NF is the last field), separated by two spaces
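
These commands are most useful when chained. The sketch below strings the examples from this list into one pass (filename, the date string, and the column numbers are carried over from the examples above; adjust them to your log's layout):

  $ sed '1 d' filename > body.txt                              # drop a header line
  $ grep "20 Jan 2017" body.txt | tr ',' '\t' > jan20only.tsv  # keep one day, commas to tabs
  $ sort -nr -t$'\t' -k8 jan20only.tsv | head -n 3             # top 3 rows by the 8th column
  $ awk -F "\t" '{print $3 "  " $NF}' jan20only.tsv            # print the 3rd and last columns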