For a final project in a fall-semester course during my second year of college, I had to take logs from three different sources (Windows Event Viewer, Linux auditd, and NetFlow version 9 from a router) and import them all into a single database.
Two of the log sources would be easy enough: the NetFlow data could be captured using the SiLK tools (rwflowcap, rwflowpack, rwfilter, and rwcut) and then parsed with a custom Go script, since rwcut output can be treated much like CSV-formatted data. The Windows Event Viewer logs were even easier, as there is a direct option to export them as CSV; the only problem was a missing CSV header, which was easily fixed by adding it back in.
The biggest challenge would prove to be getting the auditd logs into the database correctly, so I'll walk through the steps I took to get this application built.
Step 1: Figure out the database setup first.
While working on this, I kept going back and forth trying to decide between SQLite and MongoDB. MongoDB would have been easier to work with, but the project required a database with relationships, so in the end I went with SQLite, using the following ERD I created:
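As a purely illustrative sketch of the kind of relational layout an ERD like this describes (every table and column name below is my own guess for illustration, not the actual design):

```sql
-- Illustrative only: the real ERD is not reproduced here.
CREATE TABLE hosts (
    id         INTEGER PRIMARY KEY,
    hostname   TEXT,
    ip_address TEXT
);

CREATE TABLE linux_audit_logs (
    id       INTEGER PRIMARY KEY,
    host_id  INTEGER REFERENCES hosts(id),
    log_type TEXT,   -- e.g. SYSCALL, CWD, PATH
    raw_json TEXT
);

CREATE TABLE windows_event_logs (
    id       INTEGER PRIMARY KEY,
    host_id  INTEGER REFERENCES hosts(id),
    event_id INTEGER,
    message  TEXT
);

CREATE TABLE netflow_records (
    id       INTEGER PRIMARY KEY,
    src_host INTEGER REFERENCES hosts(id),
    dst_host INTEGER REFERENCES hosts(id),
    packets  INTEGER,
    bytes    INTEGER
);
```

The point of the foreign keys is what MongoDB made awkward: all three log types can reference the same host rows.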
Once the design was decided on, it was time to move on to gathering the data.
This, as stated before, was the most difficult of the three log types to handle: because each audit log category has different fields, there was no easy, universal way of parsing it. I ended up writing a script to handle basically the following steps:
- get the current audit log
- parse it and get a unique list of each audit category type
- parse the log again, but split each type into its own text file
- using a tool called aushape (https://github.com/Scribery/aushape), convert each text file into a JSON file.
(The entire project minus this bash script can be found at https://github.com/FM1337/audit_logs_app)
From there, once I had the JSON, I used another tool called gojson (https://github.com/ChimeraCoder/gojson) to generate Go structs for the Linux logs based on the different JSON files.
Once I had that out of the way, the rest of the project was easy.
After I had the Linux logs figured out, I was able to move on to the other log types, and the next ones I decided to tackle were the router logs.
This was easy to do: I simply set up a pfSense box and began collecting NetFlow with softflowd, forwarding it to a virtual machine running the SiLK toolset (https://tools.netsa.cert.org/silk/). From there I ran some queries with rwfilter and output the results with rwcut.
I then took the output from rwcut and wrote a Go application called silk2sqlite (https://github.com/FM1337/silk2sqlite), which simply takes that output and inserts it into an SQLite3 database.
The Windows logs were handled much like the router logs: I took the silk2sqlite code I had built and modified it to handle Windows logs (you can find the updated code in the audit_logs_app GitHub repo).
The format this time was CSV, which was still easy to work with, but it had one odd quirk: the exported data was missing a CSV header column. This was easily fixed by just adding it back in.
The Final Result
All in all, I'd say the final result, given the time I had, turned out pretty well. In the end I had imported over 2.6 million log entries (most of them from Linux).
Since I had the data, I figured I might as well do some visualization of it:
For the Linux logs, I could easily display the different types in a table.
The router logs were a lot of fun to work with
And like before, displaying the information in the table was quite easy.
I decided to go the extra step and make it do an IP information lookup when you click either the source IP or destination IP
Finally, we have the Windows logs.
In terms of visualization it's the weakest of the three (due to the time constraints I had), but it still shows interesting data in the table.
And when you click the "View Description" button:
I'm extremely pleased with how this project turned out and I plan to expand on it in the future, maybe even turn it into a full-blown SIEM.