Hello everyone, welcome back to Old Logs New Tricks.
I am constantly looking for ways to make Splunk searching easier, not only for myself but for others using Splunk as well. One thing I am always working on is creating search-time field extractions for use in searches.
There are several ways you can do this, but the easiest is to start with a search, create an in-line extraction, and then save it in the UI (User Interface). I am a fan of regex101.com, but feel free to use any regex tester you like.
For this exercise I'm going to extract a field called role from some fake data I created and test it in-line. So here's the scenario: I have a bunch of logs coming in, and I need to create a search-time extraction to pull the role out of the host name (eventually we'll make this an index-time extraction). Here are my example logs:
2018-10-15T23:06:59.932649+00:00 secp20dtx001.domain.com sshd[0000]:
2018-10-15T23:06:59.899742+00:00 secp01cob001.domain.com in.tftpd[0000]:
2018-10-15T23:06:59.585795+00:00 secp01usp004.domain.com puppet[0000]: (Class[Info]) Would have triggered 'refresh' from 1 events
2018-10-15T23:06:59.459006+00:00 secp02api004.domain.com puppet[0000]: Applied catalog in 27.55 seconds
2018-10-15T23:06:59.376350+00:00 secp31itl006.domain.com clamd[0000]: SelfCheck: Database status OK.
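Before heading to regex101.com or Splunk, you can also sanity-check a candidate regex against the sample logs in plain Python. This is just a sketch using the same pattern as the rex search below; note that Python spells named capture groups `(?P<name>...)` rather than Splunk's `(?<name>...)`:

```python
import re

# Sample logs from above; the role is the three letters embedded
# in the host name (e.g. "dtx" in secp20dtx001.domain.com).
logs = [
    "2018-10-15T23:06:59.932649+00:00 secp20dtx001.domain.com sshd[0000]:",
    "2018-10-15T23:06:59.899742+00:00 secp01cob001.domain.com in.tftpd[0000]:",
    "2018-10-15T23:06:59.585795+00:00 secp01usp004.domain.com puppet[0000]: (Class[Info]) Would have triggered 'refresh' from 1 events",
    "2018-10-15T23:06:59.459006+00:00 secp02api004.domain.com puppet[0000]: Applied catalog in 27.55 seconds",
    "2018-10-15T23:06:59.376350+00:00 secp31itl006.domain.com clamd[0000]: SelfCheck: Database status OK.",
]

# Four non-digits (secp), 1-4 digits, the 3-character role,
# 1-5 digits, then the dot before the domain.
pattern = re.compile(r"\D{3}\D\d{1,4}(?P<role>...)\d{1,5}\.")

for line in logs:
    match = pattern.search(line)
    print(match.group("role") if match else "no match")
# Prints: dtx, cob, usp, api, itl
```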
If I go to regex101.com and paste these logs, I can then create a regex to extract the role:
Then you can take that back to the Splunk search UI and paste it into a search to test it, like:
host=* | rex field=host "\D{3}\D\d{1,4}(?<example_fieldname>...)\d{1,5}\."| dedup example_fieldname | table example_fieldname
(Here I've named the capture group example_fieldname; in practice you'd name it role.)
and you should get results like:
Now that you've verified it is working, you just need to save it as an extraction and set permissions, and you're done.
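If you prefer managing extractions in configuration files rather than the UI, saving a search-time extraction corresponds to an EXTRACT stanza in props.conf. A minimal sketch, assuming your events arrive under a sourcetype called syslog (swap in your own sourcetype), using the `in <field>` syntax to run the regex against the host field instead of _raw:

```ini
# props.conf -- hypothetical sourcetype name; adjust to your data
[syslog]
EXTRACT-role = \D{3}\D\d{1,4}(?<role>...)\d{1,5}\. in host
```

The UI's "save as extraction" step writes an equivalent stanza for you behind the scenes, scoped to the app and permissions you choose.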