Initial Security Onion Problems

I have been working on implementing Security Onion in a production environment. The two major problems that have given me headaches so far are storage on the sensor nodes and internal networking.

Networking Problem

Security Onion uses Docker to create and manage the different aspects of itself. Below is a good image of the different containers that get spun up.

This is fine and dandy, and a great way to granularly manage and update each portion of the NIDS. However, if you are not familiar with Docker, and Docker networking in particular, you might hit a stumbling block: the Docker bridge is configured, by default, to use the 172.17.0.0/16 network.
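If you want to confirm what subnet the bridge is using on your own install, you can inspect the default bridge network directly. The output below is trimmed, and the subnet shown is just the Docker default; yours may differ if it has already been changed:

$ docker network inspect bridge | grep Subnet
            "Subnet": "172.17.0.0/16"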

I found this out because, for some reason, I could ping, SSH, and HTTPS to the master node from everywhere EXCEPT my local machine. Wireshark on the master node showed my connections arriving, but there was no return traffic. Checking the routing table on the master node, I found the entry for 172.17.0.0/16 being routed to the Docker interface instead of outbound like I needed it to be.
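For reference, the offending entry looked something like the output below (illustrative only; the default route and interface names are placeholders for my environment, but the 172.17.0.0/16 line is the one Docker adds for its bridge):

$ ip route
default via 10.0.0.1 dev eth0
172.17.0.0/16 dev docker0 proto kernel scope link src 172.17.0.1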

Networking Solution

Solving this problem isn’t rocket science, and the Security Onion documentation lays it out fairly well. Edit the file /etc/docker/daemon.json and add the following line:

{
    "bip": "your_docker_bridge_ip/netmask"
}
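
As a concrete example, if 192.168.200.0/24 happened to be unused in your environment (that value is just a placeholder; pick anything that does not collide with your real networks), the finished file would look like the below. Keep in mind that restarting the Docker daemon afterward will bounce the Security Onion containers, so do it during a maintenance window.

{
    "bip": "192.168.200.1/24"
}

$ sudo systemctl restart docker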

That will at least get networking working for your 172 network. However, if you need to do more detailed configuration, you should reference Docker’s documentation.

Storage Problem

One of the best things about implementing Security Onion is that we now have full PCAP data! That is absolutely amazing for an SMB that did not have it before. Being able to pull past days of PCAPs for a particular investigation and run them through analysis is awesome. However, I have quickly found that a constant stream of PCAPs fills up a storage solution extremely quickly. As first built, I was only able to store around four hours of PCAPs, which isn’t long at all.
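To put some rough numbers on it (these figures are assumptions for illustration, not measurements from my network): at a sustained 100 Mb/s of monitored traffic, full PCAP works out to roughly

100 Mb/s ≈ 12.5 MB/s ≈ 45 GB/hour ≈ 1.08 TB/day

so even a couple of terabytes only buys a few days of retention at that rate.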

Storage Solution

There isn’t really one, unfortunately, other than adding more storage. The problem is even referenced in the Security Onion documentation, which recommends a life-cycle approach using TrimPCAP. In my case, I have had to add around 2 TB of storage to get retention that would really be considered helpful. My plan is to develop that life-cycle policy with TrimPCAP, and then on a daily basis shovel the trimmed files up to S3 for long-term storage; a rough sketch of that job is below. This might not assist analysts on a daily basis, but in the case of a breach or a large incident, we would still have the data as reference material.
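The sketch below is roughly what I have in mind for that daily job. The paths, globs, and the trimpcap invocation are assumptions that need to be checked against the TrimPCAP documentation and your own sensor layout; aws s3 sync is the standard AWS CLI command for pushing a directory up to a bucket, and it only uploads new or changed files.

#!/bin/bash
# Rough daily PCAP life-cycle sketch, run from cron on the sensor.
# PCAP_DIR and the glob below are assumptions for my layout; adjust
# them to match your install.
PCAP_DIR="/nsm/sensor_data"

# 1. Trim older capture files down. The arguments here are a
#    placeholder; check the TrimPCAP docs for the exact invocation.
python trimpcap.py "$PCAP_DIR"/*/dailylogs/*/*

# 2. Ship the trimmed files to S3 for cheap long-term retention.
aws s3 sync "$PCAP_DIR" "s3://my-pcap-archive/$(hostname)/" --storage-class STANDARD_IA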
