Kippo2ElasticSearch

Kippo2ElasticSearch is a Python script that transfers data from a Kippo SSH honeypot MySQL database to an ElasticSearch instance (server or cluster). This makes the dataset easy to index and search, and makes it easy to visualize important stats using Kibana.

The project also provides an exported Kibana dashboard file that you can import into your own instance to get immediate visualizations of your honeypot data. The two sample screenshots below show a portion of that dashboard, with data pulled from the following honeypot test articles:

DOWNLOAD Kippo2ElasticSearch:

Kippo2ElasticSearch depends on the following Python modules: GeoIP, pony, pyes. Installing these is trivial via pip.
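If they are not already present, the following command should pull them in (assuming the PyPI package names match the module names; the GeoIP module also needs MaxMind's GeoIP C library installed on the system):

    pip install GeoIP pony pyes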

SCREENSHOTS (Kibana):



  • iKONs

    this is interesting, kibana, sounds like from Japan 😀

  • harry

    where do I put the kippo2elasticsearch.py file? I’ve got Kippo, Kibana and ElasticSearch running, but if I import the JSON file in Kibana’s web interface, it imports an empty dashboard.

    I suspect the .py file needs to be loaded into ElasticSearch but I couldn’t find out how or where… Any help??

    • Ion

      Hi Harry.

      You don’t have to put the Python file “somewhere”. Just edit the file, enter the correct values for your MySQL database server and ElasticSearch service, and run it.

      It’s the JSON file, not the .py file, that should be uploaded to Kibana.
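      To give you an idea, the settings near the top of the script look roughly like this (the variable names below are only illustrative; check your copy of the script for the exact ones):

        # MySQL (Kippo) database settings -- illustrative names, adjust to your setup
        MYSQL_HOST = 'localhost'
        MYSQL_PORT = 3306
        MYSQL_USER = 'kippo'
        MYSQL_PASSWORD = 'secret'
        MYSQL_DATABASE = 'kippo'

        # ElasticSearch settings -- host:port of your instance or cluster
        ES_SERVER = '127.0.0.1:9200'
        ES_INDEX = 'kippo'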

      Regards,
      Ion.

      • harry

        thanks for your answer, I figured it out the next day (python kippo2elasticsearch.py), I feel rather stupid 🙁
        I noticed the .py script loads every hack attempt into ElasticSearch every time you run it. I made cron run the script to keep Kibana up to date, but after 2 days the script already takes more than 1 minute to complete (1000+ attempts in 2 days, kind of fun to watch them try haha)

        Is there a way to make the script load only the new attempts?

      • Ion

        Hi Harry.

        Well, unless you find a way to keep state, then no. One solution could be to query ElasticSearch first and get the last doc’s ID and then use that in the SQL query and get only rows with an ID greater than that.
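        A rough sketch of that idea, in case it helps (untested; it uses MySQLdb and the official elasticsearch client rather than the modules the script itself uses, and the index/field/table names are only placeholders, so check them against your setup):

          import MySQLdb
          from elasticsearch import Elasticsearch

          es = Elasticsearch(['http://127.0.0.1:9200'])

          # Ask ES for the highest MySQL row id indexed so far ('kippo' index and 'id' field are placeholders).
          res = es.search(index='kippo', body={
              'size': 1,
              'sort': [{'id': {'order': 'desc'}}],
              'query': {'match_all': {}},
          })
          hits = res['hits']['hits']
          last_id = hits[0]['_source']['id'] if hits else 0

          # Fetch only the rows Kippo inserted after the last indexed one.
          db = MySQLdb.connect(host='localhost', user='kippo', passwd='secret', db='kippo')
          cur = db.cursor()
          cur.execute("SELECT id, session, success, username, password, timestamp "
                      "FROM auth WHERE id > %s ORDER BY id", (last_id,))

          # Index each new attempt, reusing the MySQL row id as the document id.
          for row_id, session, success, username, password, timestamp in cur.fetchall():
              es.index(index='kippo', id=row_id, body={
                  'id': row_id,
                  'session': session,
                  'success': bool(success),
                  'username': username,
                  'password': password,
                  'timestamp': str(timestamp),
              })

        Cron can then run something like that instead, and each run only touches the new rows.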

        But I have some news as well. I’ve created a fork of Kippo itself and added ElasticSearch support to it, so you can now save authorization attempts directly to ES. Please see my new blog post (going up in a few minutes) for more info.

        Regards,
        Ion

      • harry

        I get what you’re saying, but (as you probably noticed) I’m not proficient enough to implement such a construction 🙂 I’ll just set cron to a larger interval for now.

        Awesome news about your Kippo fork; if I understand correctly, directly saving authorization attempts to ES should render my previous question moot, right?

        Right now there are already 2000+ attempts (from ~30 IPs), nearly all of them from China. Are these actual humans trying to get in, or just bots scanning the internet for vulnerabilities?

      • Ion

        Hi Harry,
        the blog post is going up soon, but yes. You will be able to add your ElasticSearch instance/cluster details in Kippo’s configuration file, and then every connection attempt will be automatically logged in ES, much like MySQL logging works (btw, you can use them together).
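        To give you an idea, the relevant part of the configuration will look something along these lines (section and option names here are only indicative; the blog post will have the exact ones):

          [database_elasticsearch]
          host = 127.0.0.1
          port = 9200
          index = kippo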

        Regarding the activity you’re seeing, I can’t be sure whether it’s a human or a bot, although it’s mostly humans (using automation, of course) that attack the SSH service. Bots exploit other kinds of services, for example SMB/CIFS. You can try setting up Dionaea (using Dionaea-Vagrant, for example) to catch the latter.

        Regards,
        Ion
