Splunk Search

How to split props/transforms from a standalone to an indexer clustering environment?

coltwanger
Contributor

I've got a log file that uses a multi-character delimiter; it looks something like this:

"27-MAY-16 04.25.26.746000 AM"|;|""|;|"Session"|;|"0"|;|""|;|"lkjsdf;lkjbxsadf;lkjwta4"|;|"0"|;|""|;|""|;|""|;|"server_type"|;|"Server"|;|"1234"|;|"-"|;|"255.255.255.255"|;|""|;|"HTTP_PolicyName"|;|""|;|""|;|"HTTP_Gateway"|;|""|;|""|;|""|;|""|;|""|;|""|;|"HTTP_PolicyName:1"|;|""|;|"Policy Description"|;|""|;|"Web Gateway"|;|"8612712380412232330"|;|""|;|""|;|"Scheme"|;|""|;|""|;|""|;|""|;|""|;|""|;|""|;|""|;|""|;|""|;|""|;|""|;|""|;|""|;|""||?--END---?||"

I have a standalone installation of Splunk Enterprise for dev purposes and I created my props/transforms for this log file and got it working just fine. However, when I deploy it across my cluster and attempt to index these files in production, I am not getting any field extractions.

props.conf

[oracle]
DATETIME_CONFIG = 
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = true
PREAMBLE_REGEX = 1
SHOULD_LINEMERGE = false
SEDCMD-01_change_delims_in_oracle_logs = s/\|;\|/,/g
REPORT-set_delimiters_oracle_logs = 01_delims_oracle_logs

transforms.conf

[01_delims_oracle_logs]
DELIMS = ","
FIELDS= Field1,Field2,Field3...

I have these deployed to the forwarder, to the indexer cluster (through a cluster bundle), and even on the search head. The SEDCMD runs just fine and replaces the given delimiters with commas:

 "27-MAY-16 04.25.26.746000 AM","","Session","0","","lkjsdf;lkjbxsadf;lkjwta4","0","","","","server_type","Server","1234","-","255.255.255.255","","HTTP_PolicyName","","","HTTP_Gateway","","","","","","","HTTP_PolicyName:1","","Policy Description","","Web Gateway","8612712380412232330","","","Scheme","","","","","","","","","","","","","","",""||?--END---?||"

But absolutely none of the fields I specify in transforms.conf are searchable. How should I split these props/transforms across my environment? It works great on a standalone instance, just not when applied to a forwarder/indexer cluster/standalone search head combo. I've tried disabling all of the props on the forwarder side and leaving them only on the indexers, but that didn't change anything.

0 Karma

somesoni2
Revered Legend

All event processing and index-time field configuration should be deployed to the indexers. From your config, the following goes to the indexers.
props.conf on Indexer

[oracle]
 DATETIME_CONFIG = 
 INDEXED_EXTRACTIONS = csv
 KV_MODE = none
 NO_BINARY_CHECK = true
 PREAMBLE_REGEX = 1
 SHOULD_LINEMERGE = false
 SEDCMD-01_change_delims_in_oracle_logs = s/\|;\|/,/g
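
If you're pushing this through the cluster master, put the files in an app under $SPLUNK_HOME/etc/master-apps/ on the master and apply the bundle from there. A rough sketch (the app name is just an example):

 # on the cluster master: place props.conf in e.g. etc/master-apps/oracle_props/local/
 # then validate and push the bundle to all peers
 $SPLUNK_HOME/bin/splunk apply cluster-bundle
 # and check the rollout
 $SPLUNK_HOME/bin/splunk show cluster-bundle-status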

All search-time field extraction configuration should be deployed to the search head. From your config, the following goes to the search head.
props.conf on Search Head

[oracle]
 REPORT-set_delimiters_oracle_logs = 01_delims_oracle_logs

transforms.conf on Search Head

 [01_delims_oracle_logs]
 DELIMS = ","
 FIELDS= Field1,Field2,Field3...
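
Once those are on the search head (e.g. in an app's local directory), you can confirm they're actually being picked up with btool; the stanza names here are the ones from your config:

 # run on the search head; --debug shows which file each setting comes from
 $SPLUNK_HOME/bin/splunk btool props list oracle --debug
 $SPLUNK_HOME/bin/splunk btool transforms list 01_delims_oracle_logs --debug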
0 Karma

newbie2tech
Communicator

Hi Somesoni2,

In a search head cluster, is there a way to get these props.conf and transforms.conf changes propagated from the GUI? I know the change can go through the search head deployer, which pushes it down to all search head cluster members, but I wanted to know if there is any way to do it through the GUI, as I am not an admin and would otherwise have to go through a code deploy.

0 Karma

somesoni2
Revered Legend

What version of Splunk are you using?
If you're on version 6.3 or later, you can set up the field extraction using the IFX, which gives you options for extracting delimited fields. See this:
http://docs.splunk.com/Documentation/Splunk/6.3.0/Knowledge/ExtractfieldsinteractivelywithIFX

0 Karma

newbie2tech
Communicator

Hi Somesoni2,
Yes, I am using field extractions (my version is 6.6.3). However, the extracted fields are only available to the user who created the extraction, despite being made global, so my dashboards are not working for other users.
This happens even on the standalone instance where we use single sign-on. I could see the extractions in my etc/users/myname/search/local transforms.conf and props.conf; I moved them from the user-specific location to etc/apps/search/local transforms.conf and props.conf and it works.
I wanted to know how to achieve the same thing in a clustered search head environment. This problem happens only when we use the delimited option to extract the fields; regex works fine.

0 Karma

somesoni2
Revered Legend

Check a few things:
1) Make sure the field extractions are global (they are, as you said).
2) Make sure 'Everyone' has read permission on them (on the sharing permissions page, the read checkbox for the Everyone role is selected). This option may not be available if the user who created the field extractions is not at least a power user.
3) Make sure the app in which those field extractions were created is readable by Everyone (you may need your Splunk admin's help to confirm); the effective sharing settings are sketched in the local.meta example below.
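
For reference, when a field extraction is shared globally, the sharing ends up as entries in the app's metadata (e.g. $SPLUNK_HOME/etc/apps/search/metadata/local.meta) roughly like the sketch below; the object paths are placeholders, so compare them against what's already in your user-level local.meta:

 [props/<your_sourcetype>/EXTRACT-<your_extraction_name>]
 access = read : [ * ], write : [ admin ]
 export = system

 [transforms/<your_transform_name>]
 access = read : [ * ], write : [ admin ]
 export = system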

0 Karma

newbie2tech
Communicator

Yes to all 3. Does that leave me with deployment via the search head deployer as the only option?

0 Karma

somesoni2
Revered Legend

I would think so. But it's strange that field extractions created from the Web UI don't work when shared globally. Is the person who created them a power user? You could also ask your Splunk admin to look at the field extraction issue (since you have limited access); it should work in the scenario you described.
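
If you do end up going through the deployer, the configs would sit in an app under $SPLUNK_HOME/etc/shcluster/apps/ on the deployer and get pushed with something along these lines (the target URI and credentials are placeholders; -target can be any one cluster member):

 $SPLUNK_HOME/bin/splunk apply shcluster-bundle -target https://sh1.example.com:8089 -auth admin:changeme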

0 Karma

newbie2tech
Communicator

Hi Somesoni2,

I had my admin create the extraction and make it global. That seems to be working, and his entry lands in /etc/apps/search/local instead of /etc/users/myuser/search/local.

0 Karma

somesoni2
Revered Legend

Yes, all shareable knowledge objects should be in etc/apps. Artifacts in etc/users are private to that user.

0 Karma

newbie2tech
Communicator

Yes Soni, thank you for the guidance!!

0 Karma

coltwanger
Contributor

I should also mention that I have looked at both of these links and tried splitting them accordingly, but I'm obviously doing something wrong here 😉

http://docs.splunk.com/Documentation/Splunk/6.4.1/Admin/Configurationparametersandthedatapipeline
http://wiki.splunk.com/Where_do_I_configure_my_Splunk_settings%3F

0 Karma