All Posts


We have recently switched over from one proxy to another in our organisation. When we try to put the new proxy details into the relevant add-ons (ServiceNow, Cisco Umbrella, etc.), the data feeds stop. The Network team inform me that we need to use the CA file that they supply. Does anyone know where this needs to be installed in Splunk? I thought in /etc/auth/, but I am not sure how we point the config to it.
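Not an authoritative answer, but a minimal sketch of how a custom CA bundle is commonly wired up, assuming the file is copied into $SPLUNK_HOME/etc/auth/; the filename mycacert.pem is hypothetical, and whether a given add-on honours this stanza (rather than its own proxy/SSL settings) should be confirmed against that add-on's documentation.

```ini
# $SPLUNK_HOME/etc/system/local/server.conf
# Hedged sketch — "mycacert.pem" is a hypothetical name for the
# CA bundle supplied by the network team.
[sslConfig]
sslRootCAPath = $SPLUNK_HOME/etc/auth/mycacert.pem
```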
Assuming there is only one + and the user id is in quotes:

| rex "(?<userid>[^\"]+\+[^\"]+@[^\"]+)"
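The pattern can be sanity-checked outside Splunk; for example in Python (which uses `(?P<name>...)` for named groups, where Splunk's rex uses `(?<name>...)`), with sample values taken from the question below:

```python
import re

# Same character classes as the rex above: a quoted value containing a +
# somewhere before the @.
pattern = re.compile(r'"(?P<userid>[^"]+\+[^"]+@[^"]+)"')

events = [
    'uid="difficult+1@gmail.com"',   # contains a + — should match
    'uid="roboticts@gmail.com"',     # no + — should not match
]

matches = [m.group("userid") for e in events if (m := pattern.search(e))]
print(matches)  # ['difficult+1@gmail.com']
```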
Subsearches execute before main searches (although there are exceptions), so trid from the main search is not available in the subsearch. However, you could try something like this:

stage=it sourcetype=some_type NOT trid="<null>" reqest="POST /as/*/auth *"
    [ search stage=it sourcetype=another_type
      | rex field=message "ID=PASSLOG_(?<trid>\d+)"
      | stats count by trid
      | fields trid ]

Here I have assumed trid is numeric - if not, you should define a pattern that allows rex to extract the trid from the message field.
Try changing

| stats count BY location group_name

to

| chart count BY location group_name

then use a stacked column chart.
Hello, I would like to ask if there is a way to restore the splunk user password. During the deployment of the UF on the client, a splunk user was created to deploy the UF. Unfortunately this password is not working anymore. How can I restore the password for this user? What would happen if a new version of the UF (9.1.2) is deployed? Would it help to create a new user? Thanks in advance.
Hi there! I would like to find the values of host that were in macro 1 but not in macro 2.

search 1: `macro 1` | fields host
search 2: `macro 2` | fields host

macro 1 hosts: a b c d
macro 2 hosts: a b e f

Result: count = 2, because hosts c and d were not in macro 2. Thanks in advance!
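As an illustration of the expected result (not a full SPL answer), the example data above reduces to a set difference, sketched here outside Splunk:

```python
# Hosts returned by each macro in the example above.
hosts_macro1 = {"a", "b", "c", "d"}
hosts_macro2 = {"a", "b", "e", "f"}

# Hosts present in macro 1 but missing from macro 2.
missing = hosts_macro1 - hosts_macro2
print(sorted(missing), len(missing))  # ['c', 'd'] 2
```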
Yes, I have already created an outputs.conf file and added the required info. It is placed under the etc/system/local/ folder.

[tcpout]
defaultGroup = default-autolb-group
indexAndForward = 0
negotiateProtocolLevel = 0
sslCommonNameToCheck = *.<<stack>>.splunkcloud.com
sslVerifyServerCert = true
useClientSSLCompression = true

[tcpout-server://inputs1.<<stack>>.splunkcloud.com:9997]
[tcpout-server://inputs2.<<stack>>.splunkcloud.com:9997]
[tcpout-server://inputs14.align.splunkcloud.com:9997]

[tcpout:default-autolb-group]
disabled = false
server = 54.85.90.105:9997, inputs2.<<stack>>.splunkcloud.com:9997, inputs3.<<stack>>.splunkcloud.com:9997, ..... inputs15.<<stack>>.splunkcloud.com:9997

[tcpout-server://inputs15.<<stack>>.splunkcloud.com:9997]
sslCommonNameToCheck = *.<<stack>>.splunkcloud.com
sslVerifyServerCert = false
sslVerifyServerName = false
useClientSSLCompression = true
autoLBFrequency = 120

[tcpout:scs]
disabled = 1
server = stack.forwarders.scs.splunk.com:9997
compressed = true
Hello Splunkers!!

index=messagebus "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"="ASR/Hb/*/Entry*" OR "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"="ASR/Hb/*/Exit*"
| stats count by "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"
| fields - _raw
| fields AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName
| rex field=AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName "(?<location>Aisle\d+)"
| fields - AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName
| strcat "raw" "," location group_name
| stats count BY location group_name

This is the current visualisation I am getting from the above search in a column chart. I want to obtain a different visualization. Please guide me on what changes I need to make in my current SPL to obtain it.
Hi, thank you very much for your help. Below is the final query and it is giving me the required output; however, I am not able to open the events in a separate tab.

index="app_cleo_db" origname="GEAC_Payroll*"
| rex "\sorigname=\"GEAC_Payroll\((?<digits>\d+)\)\d{8}_\d{6}\.xml\""
| search origname="*.xml"
| eval Date = strftime(_time, "%Y-%m-%d %H:00:00")
| eval DateOnly = strftime(_time, "%Y-%m-%d")
| transaction DateOnly, origname
| timechart span=1h count
| where count>0
| timewrap series=exact time_format="%d-%m-%Y" 1day
| eval _time=strftime(_time, "%H:%M:%S")
| sort _time
And here is the solution:

| eval row=mvrange(0,6)
| mvexpand row
| addinfo
| eval _time=case(row=0,info_min_time,row=1,strptime(StartTime,"%Y-%m-%d %H:%M:%S"),row=2,strptime(StartTime,"%Y-%m-%d %H:%M:%S"),row=3,strptime(EndTime,"%Y-%m-%d %H:%M:%S"),row=4,strptime(EndTime,"%Y-%m-%d %H:%M:%S"),row=5,info_max_time)
| eval value=case(row=0,0,row=1,0,row=2,1,row=3,1,row=4,0,row=5,0)
| table _time, value
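A quick sketch of what the mvrange/case() combination above produces, with hypothetical timestamps standing in for info_min_time, StartTime, EndTime and info_max_time:

```python
# Hypothetical stand-ins for info_min_time, StartTime, EndTime, info_max_time.
info_min_time, start_time, end_time, info_max_time = 0, 10, 20, 30

# Same row -> (_time, value) mapping as the two case() expressions above.
times = [info_min_time, start_time, start_time, end_time, end_time, info_max_time]
values = [0, 0, 1, 1, 0, 0]

# The pairs trace a square wave: 0 until StartTime, 1 between StartTime and
# EndTime, then 0 again until info_max_time.
print(list(zip(times, values)))
# [(0, 0), (10, 0), (10, 1), (20, 1), (20, 0), (30, 0)]
```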
I am using Splunk 9.0.4 and I need to make a query where I extract data from a main search. So I am interested in results from the main search:

stage=it sourcetype=some_type NOT trid="<null>" reqest="POST /as/*/auth *"

But then I need to filter the results of the main search using a subsearch that operates on a different data set, using the value of a field from the main search, let's call it trid; trid is a string that might be part of a value called message in the subsearch. There might be more results in the subsearch, but if there is at least one matching result in the subsearch then the result from the main search stays; if not, it should not be included. So I am interested only in the results from the main search, and the subsearch is only used to filter out those that do not match.

stage=it sourcetype=some_type NOT trid="<null>" reqest="POST /as/*/auth *"
| fields trid
    [ search stage=it sourcetype=another_type
      | eval matches_found=if(match(message, "ID=PASSLOG_" + trid), 1, 0)
      | stats max(matches_found) as matches_found ]
| where matches_found>0

After a few hours I cannot figure out how to make it work. What is wrong with it? Please advise.
Yes, I want it to color the entire row if importer_in_csv = 0.
Given your search, you have a multi-value field - if you coloured this it would be the whole field, not just the importer that was missing. Is this what you really want?
Hi all, I have been trying to extract userids which have special characters in them, but with no luck. For example, let's say a field named uid contains two userids: one is "roboticts@gmail.com" and the other is "difficult+1@gmail.com". Now I want to write a query which extracts only the uid with a + sign in it. Please help with this.
Hi, I've built an add-on using Add-on Builder which gathers some data from the user, including an API key (the type of the field is password, so it replaces the API key with asterisks on the input creation page). During the creation of an input I can see that the API key is not encrypted and is passed to the new_input request as plain text in the payload body. It only happens if the API key is valid. Is there any way to remove or hide the API key there?
OK. The question is where you are getting this token from, because apparently it's a formatted number, which indeed might cause the error.
OK. So you have your UF pointed at the Cloud inputs, not at your HF. You should set your output to your HF.
Yes, I am trying to send the data to Splunk Cloud. The log file I am trying to receive from the UF.

[root@HFNode bin]# telnet inputs2.align.<<stack>>.com 9997
Trying 54.159.30.2...
Connected to inputs2.<<stack>>.splunkcloud.com.
Escape character is '^]'.
^C^C^CConnection closed by foreign host.

Connected successfully.
When I use my code, I see this error:

Error in 'where' command: The operator at ',127.542 - 0.001' is invalid.

The problem code is this:

| where time >= $max_value$ - 0.001

When I print max_value in the title, I can see that its value is "315,127.542". I think the reason this problem occurred is the ',' in max_value. How can I remove the ',' from max_value? And if that is not the problem, how can I solve this?
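One way to see the fix, sketched outside Splunk (inside an SPL eval, the replace() and tonumber() functions do the same job, e.g. tonumber(replace("$max_value$", ",", ""))):

```python
# The where clause fails because max_value is rendered with a thousands
# separator; stripping the commas before the numeric comparison fixes the
# parse error.
raw = "315,127.542"                     # the value shown in the title
max_value = float(raw.replace(",", ""))
print(max_value)  # 315127.542
```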