I have a Python script that runs Splunk queries.
Another team at my company changed their field names to include many, many periods.
I updated my Python script with their new field names, but it no longer recognizes the ones with periods as being there.
I have tried putting quotes around the field names, and I also tried replacing the periods with underscores, but it still did not recognize them.
Any ideas why this might be happening?
I figured it out.
It's a combination of using periods in the main part of the search and underscores in the fields pipe section.
For example:
Before I was trying:
sourcetype=abc xxx.yyy.zzz.aaa | fields xxx.yyy.zzz.aaa
or
sourcetype=abc xxx_yyy_zzz_aaa | fields xxx_yyy_zzz_aaa
But the correct one that works is:
sourcetype=abc xxx.yyy.zzz.aaa | fields xxx_yyy_zzz_aaa
The only reason I can think of is that Splunk translates the periods into underscores when it runs the query, and maybe the fields pipe is applied after the search runs while the base search part is applied before? Not sure.
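For reference, here's a minimal sketch of how the SPL string could be assembled in Python so the dotted name lands in the base search and the underscored name lands in the fields pipe (the build_query helper and the field names are just illustrative placeholders, not from my actual script):

```python
# Build the SPL string: dotted name in the base search,
# underscored name in the `fields` pipe (the combination that worked).
def build_query(sourcetype, dotted_field):
    underscored = dotted_field.replace(".", "_")
    return f"search sourcetype={sourcetype} {dotted_field} | fields {underscored}"

query = build_query("abc", "xxx.yyy.zzz.aaa")
print(query)
# search sourcetype=abc xxx.yyy.zzz.aaa | fields xxx_yyy_zzz_aaa
```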
When a field name in Splunk contains special characters, you have to surround it with single quotes in SPL.
Also, in some places you may need to escape the periods, especially if you are using anything with regex syntax.
What you need to do is write the simplest possible Python script stub that refers to a single one of these new monsters, and tweak it until it works for each of the two use cases above, as relevant.
Then you will know what needs to happen to all of the scripts.
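As a sketch of what that stub's lookup logic might look like (get_field and the plain-dict row are hypothetical stand-ins for whatever your Splunk reader actually yields), try each name variant against a single result row:

```python
def get_field(row, dotted_name):
    """Try the dotted name, the underscored name, and the last node,
    returning (matched_name, value) for the first variant present."""
    candidates = [
        dotted_name,                       # xxx.yyy.zzz.aaa
        dotted_name.replace(".", "_"),     # xxx_yyy_zzz_aaa
        dotted_name.rsplit(".", 1)[-1],    # aaa
    ]
    for name in candidates:
        if name in row:
            return name, row[name]
    return None, None

# Fake a result row to exercise the stub without a Splunk connection.
row = {"xxx_yyy_zzz_aaa": "42"}
print(get_field(row, "xxx.yyy.zzz.aaa"))  # ('xxx_yyy_zzz_aaa', '42')
```

Once you see which variant actually matches, you know what to change in all the scripts.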
As an update, I have discovered something.
Here is an example of the KPI I am trying to read in Python, with the dots:
xxx.yyy.zzz.aaa
I've found that if I don't include it in the search and just search by sourcetype, and then in the fields list of the query I put something like xxx_yyy*, I will get some results.
(Note: using underscores seems to be the only way to get fields with dots in them using Python.)
...so you'd think I could do: | fields xxx_yyy_zzz_aaa, right?
Nope.
As soon as another underscore is included, it doesn't return the field.
This isn't the best solution, but it's something I may be able to work with. Anyone know why, once the second underscore is included, it stops returning the kv pair?
Also, it won't work if I try:
| fields yyy_zzz
No results. So it's not the second underscore... I'm not sure what it is; kinda baffled.
Here is more bizarre behavior which may point to a solution too.
When I put single quotes around the KPIs with periods, I get no results back at all.
When I don't have the single quotes, I get results back. So it seems Splunk is seeing these KPIs, yet when I try to print them out to see them myself, they are not there.
And when I run the query with a | stats count, I get the expected number of results, and when I let it print the _raw, I see the expected KPIs with periods and their values.
So the break in processing happens between what the reader function in Python retrieved and printing/saving it so it can be used for calculations and reporting.
I have tried the single quotes, to no avail. I guess I could try escaping the periods, but since I am not using regex I am unsure that will work. I have also taken your advice and written a tiny query script to see if I can figure this out. Nothing works so far: I get all of the standard Splunk fields and none of the ones with periods.
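For what it's worth, here's roughly what my tiny debugging script does, minus the Splunk connection: it just dumps every field name the reader hands back so I can compare them against what I asked for (list_field_names and the fake row below are stand-ins; with the real SDK the results would come from the reader object instead):

```python
# Dump every field name the reader actually hands back, so the
# on-the-wire names can be compared against the names in the query.
def list_field_names(results):
    names = set()
    for row in results:          # each result row is a dict-like object
        if isinstance(row, dict):
            names.update(row.keys())
    return sorted(names)

# Stand-in for reader output; in the real script this list would be
# whatever the Splunk results reader yields.
fake_results = [{"_raw": "...", "host": "h1", "xxx_yyy_zzz_aaa": "7"}]
print(list_field_names(fake_results))
# ['_raw', 'host', 'xxx_yyy_zzz_aaa']
```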
It also seems like they may have broken the ingestion. Normally, Splunk will replace invalid characters with underscores. If you are not seeing the fields as either the entire field with periods, the entire field with underscores, or just the very last node, then ingestion has probably been broken and you should start there.
So, to be clear, when I run the query in Splunk Web, I do see the fields with the periods in their entirety with no problem, so I don't think ingestion broke anything. And when I view the fields in the left-hand sidebar in Splunk Web, they are listed with underscores instead of periods. I tried those underscore names in Python, and it didn't recognize them.
Do you still think it's possible they broke ingestion, even though I can see the entire fields in Splunk Web but not when querying with Python?
As a side note, if I let the query retrieve the _raw field, I see the raw event with the period fields and the correct values, but the fields (with periods) I explicitly asked for in my query are not listed in the reader input.
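If nothing else works, one workaround sketch I'm considering is pulling the period fields straight out of _raw, since they do show up there with the correct values (this assumes the raw events use a simple key=value layout; extract_dotted_kv and the sample event are made up, and the pattern would need adjusting to the real event format):

```python
import re

# Workaround sketch: since the dotted fields and their values are
# visible in _raw, pull them out with a regex. Only keys containing
# at least one period are captured; values stop at whitespace.
def extract_dotted_kv(raw):
    return dict(re.findall(r"([\w.]+\.[\w.]*\w)=(\S+)", raw))

raw = "ts=1 xxx.yyy.zzz.aaa=42 host=web1"
print(extract_dotted_kv(raw))  # {'xxx.yyy.zzz.aaa': '42'}
```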