All Posts



There is a join command, but it is NOT the first, second, or third choice for a Splunk solution - you can, and almost always should, use stats to join data instead. join has limitations, and it requires a second search just to fetch the data being joined. The stats command here effectively collects all the matching triplets by time; where a triplet appears at two times (e.g. 2 days apart), you have a non-change event, so discard it.
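As a minimal sketch of the stats-style join (the index names and the fields order_id, status, and ship_date are hypothetical, not from the original post):

```
(index=orders) OR (index=shipments)
| stats values(status) AS status values(ship_date) AS ship_date BY order_id
```

Events from both searches that share an order_id land in the same result row, which is what join would have produced, without the subsearch limits that join carries.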
The rex command looks at every result.  If you need to exclude certain results then you'll have to craft a regular expression that does so.  Another option is to filter the start and end blocks out of the results - assuming they're not needed for other parts of the query.
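As a hedged sketch of the second option (the sourcetype and the START/END marker text are hypothetical), filtering the block markers out before rex might look like:

```
index=app sourcetype=mylog
| regex _raw!="^(START|END) BLOCK"
| rex "user=(?<user>\S+) action=(?<action>\w+)"
```

The regex command drops the marker events entirely, so rex only sees the lines you care about.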
Finding something that is not there is not Splunk's strong suit.  See this blog entry for a good write-up on it: https://www.duanewaddle.com/proving-a-negative/ For the record, if an <env> value is missing, it will be absent from the stats command results rather than reported as zero.
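One common workaround, sketched here with hypothetical environment names, is to append a placeholder row for every expected value and let stats fold them together, so missing environments show up as zero instead of disappearing:

```
index=service_metrics
| stats count BY env
| append
    [| makeresults
     | eval env=split("dev,beta,prod", ",")
     | mvexpand env
     | fields env]
| fillnull value=0 count
| stats max(count) AS count BY env
```

Environments present in the data keep their real counts; the appended placeholder rows only survive (as 0) when the environment produced no events at all.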
Dashboard XML: I am using this dashboard to schedule a PDF report, and all panels show data for 7 days. I need to show the time period at the top of the report, like "Time Period: 01-17-2023 to 01-23-2023". How can I do this?

<dashboard>
  <label>Dashboard title</label>
  <row>
    <panel>
      <title>first panel</title>
      <single>
        <search>
          <query>| tstats count as internal_logs where index=_internal</query>
          <earliest>-7d@d</earliest>
          <latest>@d</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </single>
    </panel>
  </row>
  <row>
    <panel>
      <title>second panel</title>
      <single>
        <search>
          <query>| tstats count as audit_logs where index=_audit</query>
          <earliest>-7d@d</earliest>
          <latest>@d</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </single>
    </panel>
  </row>
  <row>
    <panel>
      <title>Third panel</title>
      <single>
        <search>
          <query>| tstats count as main_logs where index=main</query>
          <earliest>-7d@d</earliest>
          <latest>@d</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </single>
    </panel>
  </row>
</dashboard>
Do not treat structured data as text; regex is not an appropriate tool.  I suspect that the text you posted was copied from Splunk's structured viewer, not in "Raw Text" format.  Is this correct? If so, Splunk will already have given you a field named Properties.Activity, whose value is itself an escaped, but fully compliant, JSON string. (This is not a preferred way to log data.  Developers usually resort to escaped JSON when the field combines JSON and non-JSON content.)  All you should need to do is spath:

| spath input=Properties.Activity

Your sample data should give you these fields:

ActivityStatus: COMPLETE
ActivityType: CreateCashTransactionType
ClientId: 1126
Data.parentSpanId: 88558259300b25e5
Data.pcm.name: Transaction_Type_2892023143936842
Data.pcm.user_id: 2
Data.traceId: 9b57deb074fd41df69f90226cb03f499
OriginCreationTimestamp: 2023-09-28T11:39:48.4840749+00:00
Properties.Activity: {"ClientId":"1126","TenantCode":"BL.Activities","ActivityType":"CreateCashTransactionType","Source":"Web Entry Form","SourcePath":null,"TenantContextId":"00-9b57deb074fd41df69f90226cb03f499-353e17ffab1a6d25-01","ActivityStatus":"COMPLETE","OriginCreationTimestamp":"2023-09-28T11:39:48.4840749+00:00","Data":{"traceId":"9b57deb074fd41df69f90226cb03f499","parentSpanId":"88558259300b25e5","pcm.user_id":2,"pcm.name":"Transaction_Type_2892023143936842"}}

If Splunk doesn't give you Properties.Activity, please click "Raw Text" in the Splunk search window and post the event as text. The following is a partial emulation based on your sample data and my assumption.  You can play with it and compare with real data.
| makeresults | eval _raw = "{\"Properties\": { \"ActionId\": \"533b531b-3078-448f-a054-7f54240962af\", \"ActionName\": \"Pcm.ActivityLog.ActivityReceiver.Controllers.v1.ActivitiesController.Post (Pcm.ActivityLog.ActivityReceiver)\", \"Activity\": \"{\\\"ClientId\\\":\\\"1126\\\",\\\"TenantCode\\\":\\\"BL.Activities\\\",\\\"ActivityType\\\":\\\"CreateCashTransactionType\\\",\\\"Source\\\":\\\"Web Entry Form\\\",\\\"SourcePath\\\":null,\\\"TenantContextId\\\":\\\"00-9b57deb074fd41df69f90226cb03f499-353e17ffab1a6d25-01\\\",\\\"ActivityStatus\\\":\\\"COMPLETE\\\",\\\"OriginCreationTimestamp\\\":\\\"2023-09-28T11:39:48.4840749+00:00\\\",\\\"Data\\\":{\\\"traceId\\\":\\\"9b57deb074fd41df69f90226cb03f499\\\",\\\"parentSpanId\\\":\\\"88558259300b25e5\\\",\\\"pcm.user_id\\\":2,\\\"pcm.name\\\":\\\"Transaction_Type_2892023143936842\\\"}}\" }}" | spath ``` data emulation above ```    
OK, I managed to figure out a couple of my issues.  The error message="error:0906D06C:PEM routines:PEM_read_bio:no start line" was as I suspected: I combined the key and the cert into a new file and it worked.  LDAP is still an issue. I was able to disable it to fix our local admin password. Any time I enable TLS in LDAP, though, I get errors: ERROR ScopedLDAPConnection [1750367 TcpChannelThread] - strategy="ldapserver.com" Error binding to LDAP. reason="Can't contact LDAP server" ERROR UiAuth [1750367 TcpChannelThread] - user=<username> action=login status=failure session= reason=user-initiated user I tried both the LDAP cert and the combined cert I created. Not sure what I'm missing.
I've got the following query to detect that a worker instance of mine is actually doing what it's supposed to on a regular basis. If it doesn't in a particular environment, the query won't return a row for that environment. I thought perhaps I could join the results with a literal dataset of environments, to ensure there is a row for each environment, but despite looking over the documentation, I can't find a way to make the join work. Admittedly, I'm new to Splunk querying, so might be missing something obvious, or there might be some other way of doing this without `join`.   | mstats sum(worker.my_metric) AS my_metric WHERE index="service_metrics" AND host=my-worker-* earliest=-2h BY host | eval env = replace(host, "^my-worker-(?<env>[^-]+)$", "\1") | stats sum(my_metric) AS my_metric BY env | eval active = IF(my_metric > 0, "yes", "no") | join type=right left=M right=E WHERE M.env = E.env from [{ env: "dev" }, { env: "beta" }, { env: "prod" }]      
What worked for you? Please explain and help.
Hello, everyone. I just ran into an issue where a stanza within apps\SplunkUniversalForwarder\local\inputs.conf on a forwarder is overriding the apps\AppName\local\inputs.conf settings from other apps in the apps folder. I would like to either disable this app, delete the \SplunkUniversalForwarder\local folder, or delete the stanza. The problem is that this has happened on multiple hosts, and I need an automated method of doing it. Does anyone have an idea, so that this default app that I don't even want to touch stops overriding the apps I actually use? Thanks
Thanks, ITWhisperer. Perhaps I didn't ask my question clearly enough. I was looking for something like this: [\S\s\n\t\r\n\f.]+ It may be redundant, but it seems to work.
Is this still the case with 9.1.2? I'm getting the same error though I don't have privKeyPath listed in the server.conf file. My pem does have a password/key when I created it.
What's the difference between :: and = in a Splunk search? What are the benefits and drawbacks of each?
Edit the search and select "For each result" in the Trigger field.  
I want to get information about writing debug logs to Splunk from Salesforce Apex code. Can you provide the steps, or tell us which managed package or connector we can use for this?   Thanks, regards  Kr Saket
The tstats command will be faster, but processing a year of data for all hosts will still take a long time. | tstats prestats=true count where index=foo by _time,host span=1mon | timechart span=1mon count by host  
To find out which values are present in the lookup but absent from the index, use a subsearch, like this: | inputlookup test_data.csv where NOT [search index=test sourcetype=splunk_test_data | fields field1 field2 | format]
Nope, I am using inputs.conf.
Hello, Was a solution ever found?  I am experiencing this: a Note in an investigation is easier to read in Edit mode than after it's published.  When published, it looks like one run-on sentence, with no spacing and no formatting. Thanks in advance! Kai
Start with (?s) | makeresults | eval _raw="ABC DEF GHI" | rex "(?s)ABC(?<middle>.*)GHI"
Just came here to say this worked for me, thanks for posting it.