
Creating, testing and optimizing Splunk search time extractions

Updated: Oct 16, 2018

Hello everyone, welcome back to Old Logs New Tricks.

I am constantly looking for ways to make Splunk searching easier, not only for myself but for everyone else searching in Splunk as well. One thing I am always working on is creating search time field extractions for use in searches.

There are several ways you can do this, but the easiest is to start with a search, create an in-line extraction, and then save it in the UI (User Interface). I am a fan of using an online regex tester, but feel free to use any regex tester you'd like.

For this exercise I'm going to extract a field called role from some fake data I created and test it in-line. So here's the scenario: I have a bunch of logs that come in, and I need to create a search time extraction that pulls the role out of the host name (eventually we'll make this an index time extraction). Here are my example logs:

2018-10-15T23:06:59.932649+00:00 sshd[0000]:

2018-10-15T23:06:59.899742+00:00 in.tftpd[0000]:

2018-10-15T23:06:59.585795+00:00 puppet[0000]: (Class[Info]) Would have triggered 'refresh' from 1 events

2018-10-15T23:06:59.459006+00:00 puppet[0000]: Applied catalog in 27.55 seconds

2018-10-15T23:06:59.376350+00:00 clamd[0000]: SelfCheck: Database status OK.

If I paste these logs into the regex tester, I can then create a regex to extract the field:
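If you'd rather sanity-check the pattern outside of a web-based tester, a minimal Python sketch does the same job. Note that the hostname below is a made-up example of the shape the regex targets, and Python spells named groups as (?P&lt;name&gt;...) rather than Splunk's (?&lt;name&gt;...):

```python
import re

# Same pattern as the Splunk rex search: three non-digits, one more non-digit,
# 1-4 digits, a 3-character role, 1-5 digits, then a literal dot.
pattern = re.compile(r"\D{3}\D\d{1,4}(?P<example_fieldname>...)\d{1,5}\.")

# Hypothetical hostname shaped like the ones this regex expects.
host = "app-0012web00001.example.com"

match = pattern.search(host)
if match:
    print(match.group("example_fieldname"))  # -> web
```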

Then you can take that regex back to the Splunk search UI and paste it into a search to test it, like:

host=* | rex field=host "\D{3}\D\d{1,4}(?<example_fieldname>...)\d{1,5}\." | dedup example_fieldname | table example_fieldname
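To see what that whole pipeline does conceptually, here's a rough Python sketch of the rex / dedup / table steps applied to a few invented hostnames (these hostnames are illustrative, not from my actual logs):

```python
import re

# Pattern from the search above, in Python's (?P<name>...) group syntax.
pattern = re.compile(r"\D{3}\D\d{1,4}(?P<example_fieldname>...)\d{1,5}\.")

# Hypothetical host values, shaped the way the regex expects.
hosts = [
    "app-0001web00001.example.com",
    "app-0002web00002.example.com",
    "app-0003dbs00003.example.com",
]

# rex: extract the field; dedup: keep the first occurrence of each value.
seen = []
for host in hosts:
    match = pattern.search(host)
    if match and match.group("example_fieldname") not in seen:
        seen.append(match.group("example_fieldname"))

# table: display the deduplicated field values.
print(seen)  # -> ['web', 'dbs']
```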

and you should get results like:

Now that you've verified the extraction is working, you just need to save it as a field extraction and set its permissions, and you're done.
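For reference, saving the extraction through the UI writes it into a props.conf stanza behind the scenes. A hand-written equivalent would look roughly like this (the sourcetype name my_sourcetype is a placeholder for whatever sourcetype your logs use):

```ini
[my_sourcetype]
EXTRACT-role = \D{3}\D\d{1,4}(?<role>...)\d{1,5}\. in host
```

The "in host" suffix tells Splunk to run the extraction against the host field rather than the raw event.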
