© "Security is key and Devops is life" ~ Antonio Cheltenham

Okta -> Kafka -> Splunk... Options Galore!

September 4, 2017

Objective: Build a Proof of Concept to pull events from Okta and feed them into Splunk.

 

We are going to pull events from the Okta API, send them through a Kafka cluster, and finally pull them into Splunk for long-term storage and analysis.

 

The starting point of this journey is understanding the Okta Events and System Log APIs. The System Log API provides more functionality than the Events API, so it is advisable to understand what type of information you need to obtain and then leverage the relevant API.

 

For this I used Postman, which allowed me to retrieve the event data quickly so I could confirm it was the information I needed and get a better understanding of the structure of the data and the fields used.
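If you prefer to skip Postman, here is a minimal sketch of the same pull against the System Log API using httparty. The `OKTA_ORG_URL` and `OKTA_API_TOKEN` environment variables and the five-minute `since` window are my own placeholders; adjust them for your org.

```ruby
# Minimal sketch: pull recent events from the Okta System Log API.
# OKTA_ORG_URL (e.g. https://your-org.okta.com) and OKTA_API_TOKEN are
# placeholders you would set for your own org.
require "httparty"
require "json"
require "time"

org_url   = ENV["OKTA_ORG_URL"]
api_token = ENV["OKTA_API_TOKEN"]

response = HTTParty.get(
  "#{org_url}/api/v1/logs",
  headers: {
    "Authorization" => "SSWS #{api_token}",   # Okta API token auth scheme
    "Accept"        => "application/json"
  },
  # Only fetch events from the last 5 minutes, capped at 100 per call.
  query: { "since" => (Time.now - 300).utc.iso8601, "limit" => 100 }
)

events = JSON.parse(response.body)
puts "Pulled #{events.size} events"
```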

 

Now that I have that in order, let's spin up some systems and try some shit. Here is the flow we are trying to achieve.

Events will be pulled from Okta using a Kafka producer, which feeds them into a specific topic that we will define. From there we will configure the Kafka add-on for Splunk to subscribe to the topic, and if we configure it correctly we should see the events showing up in Splunk.

 

Please note that for larger implementations I would suggest using Kafka Connect instead of the Kafka add-on. (Your homework is to find out why this is the case.) Hint: it is a key decision based on your final design and requirements.

 

As you would know from my previous post, this will all be done using containers. Unfortunately my Kubernetes skillz are still weak, so bear with me; I will be using Docker exclusively for now. A kube update will come shortly.

 

Here is what I did:

  • Created a Docker network; I called mine `splunk`

  • Spun up a Splunk Docker image

  • Spun up a Kafka Docker image

  • Wrote a Kafka producer - Documentation (a rough outline follows this list)

  • Installed Ruby on the Kafka container (remember this is a POC; in production you would separate this. I am writing the producer in Ruby)

  • Added the Splunk Kafka add-on to the Splunk instance

  • Configured the Kafka add-on and also configured the instance as a heavy forwarder

  • Installed cron and set up a job to run the Ruby producer script every 5 minutes
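For reference, here is a rough outline of what my producer does with ruby-kafka: pull the latest events (the same call as the httparty sketch above) and publish each one to the OktaLogs topic. The broker address, script path, and five-minute window are placeholders for this POC; the full version is on my GitHub.

```ruby
# Rough outline of the producer: fetch recent Okta events and publish each
# one to the OktaLogs topic. Broker address and paths are placeholders.
require "kafka"
require "httparty"
require "json"
require "time"

# Pull the latest events (same call as the httparty sketch earlier).
events = HTTParty.get(
  "#{ENV['OKTA_ORG_URL']}/api/v1/logs",
  headers: { "Authorization" => "SSWS #{ENV['OKTA_API_TOKEN']}" },
  query:   { "since" => (Time.now - 300).utc.iso8601 }
).parsed_response

# Publish each event to the OktaLogs topic on the local broker.
kafka    = Kafka.new(["localhost:9092"], client_id: "okta-producer")
producer = kafka.producer
events.each { |event| producer.produce(event.to_json, topic: "OktaLogs") }
producer.deliver_messages   # flush buffered messages to the broker
producer.shutdown

# Scheduled via cron on the Kafka container, e.g. (path is illustrative):
# */5 * * * * /usr/bin/ruby /opt/okta_producer.rb
```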

 

Now you might be wondering why even use Kafka, and why not just push the events directly to Splunk? The reason I would want to do this is flexibility. You never want to design an implementation that makes it difficult to integrate with other solutions, especially when multiple solutions or departments could be utilizing the same data for different purposes. For example, with this configuration I can potentially leverage the OktaLogs topic for the following requirements (see the consumer sketch after the list):

 

  • Long-term storage - Cassandra / S3, etc. - satisfy any regulatory obligations

  • Real-time dashboards - using the ELK stack - SOC or NOC solutions

  • Reporting and investigations - Splunk (please note you won't need to use both Splunk and ELK, but most entities have Splunk already; with this model, especially if you are using Splunk Cloud, you won't have to worry about using Splunk to retain data for long periods of time)

  • User behavior analysis - Elastic and Splunk have add-ons for this, but you now have the option of using whatever solution your heart desires

  • Make it easy to migrate from one solution to another
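To make that flexibility concrete: each of the options above is just another subscriber on the same topic. A minimal sketch of a second consumer reading OktaLogs with ruby-kafka might look like this (the broker address, client id, and consumer group name are placeholders):

```ruby
# Minimal sketch of another consumer subscribing to the same OktaLogs topic,
# e.g. to ship events to long-term storage. Broker address is a placeholder.
require "kafka"
require "json"

kafka    = Kafka.new(["localhost:9092"], client_id: "okta-archiver")
consumer = kafka.consumer(group_id: "okta-archive")   # its own consumer group
consumer.subscribe("OktaLogs")

consumer.each_message do |message|
  event = JSON.parse(message.value)
  # ... write to S3 / Cassandra / ELK, etc.
  puts event["eventType"]
end
```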

All the information I used to pull this together can be found in the links provided throughout the article. But feel free to reach out to me directly if you have questions on any part of this process. 

 

Notes:

  • I used the httparty and ruby-kafka gems to write a simple producer. See my GitHub link to get started, but take note: you can and should expand on this for production by using the ruby-kafka documentation. P.S. I only picked up Ruby 3 weeks ago, so take it easy on me.

 

  • For the scenarios I mentioned above, you are not limited to these solutions; I mentioned these technologies based on my personal experience.

 

  • I omitted screenshots of my POC for security reasons.

 

In the end I had the logs in Splunk, but now I have to deal with some field extractions so I can utilize the timestamps within the logs. This is still a work in progress, and I will share as soon as I find what I consider the best way to accomplish this.

 
