All Posts

Hi @jokertothequinn, in order to query custom indexed fields you should add them to fields.conf on the search heads: [location] INDEXED=true
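As a minimal sketch, assuming the field really is named location, the stanza from the post would look like this when laid out in fields.conf on each search head:

# fields.conf on the search heads (system/local or an app's local directory)
[location]
INDEXED = true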
This does create the field. However, it doesn't seem to be treated as an indexed field (metatag), as the field is not working with tstats, for example: |tstats count where index=main location=* by sourcetype   The following error appears: When used for 'tstats' searches, the 'WHERE' clause can contain only indexed fields. Ensure all fields in the 'WHERE' clause are indexed. Properly indexed fields should appear in fields.conf.
WORKS! thank you  
Hi Raj, I can provide you with a Python script which does the extract for you and emails it per License Rule. We use it to get a weekly summary. Please note this only applies to the usage you see on the License Rule page; it does not include EUM, Analytics, etc., but it should be easy to amend the script to add those in. You can DM me if you want the script.
Why are you formatting the two times before performing your calculation? Subtracting one string from another doesn't give you a number!
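As a sketch of the point above (the field names start_time and end_time and the time format are assumptions, not taken from the thread), convert the strings to epoch seconds with strptime before subtracting, and only format the result afterwards:

| eval start_epoch = strptime(start_time, "%Y-%m-%d %H:%M:%S")
| eval end_epoch = strptime(end_time, "%Y-%m-%d %H:%M:%S")
| eval time_difference = end_epoch - start_epoch

If time_difference still comes back empty, the format string usually does not match the raw values, so strptime returns null.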
Hi. I have to upgrade a Splunk environment from Splunk 7.2.4.2 to 9.1. I don't have the option to migrate to a new cluster, and the upgrade readiness app is not available for our current version. I know I need to go from 7 to 8 and then to 9, in the order Cluster Master / Indexer Peers... / Search Peers / Deployer / Deployment / UFs and HFs. Can anyone offer any input on what may catch me out in the process? Thanks
Hi, Can you help with this one? time_difference remains empty after the calculation  
Thanks @yuanliu, this is what I was looking for, great!
I couldn't quite understand what you're unable to do with the timestamp. There are many different ways to extract the timestamp from the log. If you want to capture this field additionally, you can use \S+\s\S+. Here is an example regex: ^\S+\s+\S+\s\w+\s(\S+(?:\S+\s+){1}\S+)
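As a sketch, the same pattern can be dropped into a rex command with a named capture group (the field name extracted_time and the use of _raw are assumptions on my part):

| rex field=_raw "^\S+\s+\S+\s\w+\s(?<extracted_time>\S+(?:\S+\s+){1}\S+)"

This keeps the capture from the regex above but stores it in a field that later eval or strptime calls can work with.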
Looking good. But this regex is not handling special characters and digits, for example the date 05/0/:2024 10:11:56.000 EST
Yes, you can receive logs directly in Splunk by opening a TCP/UDP port, but this is not a method recommended by Splunk. In the correct scenario, it would be more appropriate to write the logs to a file using software like rsyslog or syslog-ng and then monitor that file.
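As a minimal sketch of the monitoring half of that setup (the path /var/log/remote/ is a placeholder, not from the post), the file written by rsyslog or syslog-ng would then be picked up with a monitor input, for example in inputs.conf on the forwarder:

# inputs.conf on the syslog host's forwarder
# /var/log/remote/ is a placeholder; match it to your rsyslog/syslog-ng output path
[monitor:///var/log/remote/]
sourcetype = syslog
index = syslog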
I think there is no need for the where filter; you can filter for the external IPs directly in the base search. Can you try this?  source="udp:514" index="syslog" sourcetype="syslog" NOT src IN (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16)
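If the IN clause does not do CIDR matching at search time in your environment, a hedged alternative (not the approach from the post) is to keep the exclusion in a where clause with cidrmatch:

source="udp:514" index="syslog" sourcetype="syslog"
| where NOT (cidrmatch("10.0.0.0/8", src) OR cidrmatch("172.16.0.0/12", src) OR cidrmatch("192.168.0.0/16", src))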
Hello Giuseppe, The app was upgraded successfully. However, I am now getting the following error while trying to access the configuration page: "Configuration page failed to load, the server reported internal errors which may indicate you do not have access to this page. Error: Request failed with status code 500."  Kindly assist.
Hi, According to the documentation, you can do this, but it is strongly discouraged. In the correct scenario, it would be more sensible to use a signed certificate or perform SSL forwarding on a load balancer. I recommend against testing this in a production environment.

In $SPLUNK_HOME/etc/system/local/server.conf:

[sslConfig]
enableSplunkdSSL = false

From the server.conf specification:

[sslConfig]
* Set SSL for communications on the Splunk back end under this stanza name.
* NOTE: To set SSL (for example HTTPS) for Splunk Web and the browser, use the web.conf file.
* Follow this stanza name with any number of the following setting/value pairs.
* If you do not specify an entry for each setting, the default value is used.

enableSplunkdSSL = <boolean>
* Enables/disables SSL on the splunkd management port (8089) and KV store port (8191).
* NOTE: Running splunkd without SSL is not recommended.
* Distributed search often performs better with SSL enabled.
* Default: true
Okay then, we can use regex. Can you try this? (The sample string is padded to 25 words so the {20} repetition actually matches.)

| makeresults
| eval description = "somestring1 somestring2 somestring3 somestring4 somestring5 somestring6 somestring7 somestring8 somestring9 somestring10 somestring11 somestring12 somestring13 somestring14 somestring15 somestring16 somestring17 somestring18 somestring19 somestring20 somestring21 somestring22 somestring23 somestring24 somestring25"
| rex field=description "(?<new_field>^\S+(?:\S+\s+){20}\S+)"
| eval new_field_replaced = replace(description,new_field,"")
| eval description = replace(description,new_field,"")
I need to truncate the string based on word count, not based on character count. It should save the truncated string (the first 25 words) into a new field. What you have provided currently works on a character-count basis.
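As a sketch of one word-count approach (not from the thread; the field names words and first_25_words are placeholders), split the description on spaces, keep the first 25 tokens, and join them back into a new field:

| eval words = split(description, " ")
| eval first_25_words = mvjoin(mvindex(words, 0, 24), " ")

mvindex(words, 0, 24) returns the first 25 multivalue entries and mvjoin reassembles them into a single string.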
Hi @shakti, check the read permissions that the user running Splunk has on the files to be read, as well as the execute permissions, and compare them with the old ones. Ciao. Giuseppe
Hello, You can try this. | makeresults | eval test = "somestring1somestring2somestring3" | eval new_field = if(len(test)>20,substr(test,20,len(test)),null())
How can I truncate the log description after 20 words in Splunk and store it in a new field?
Hi Splunk experts, We have Splunk Enterprise running on Linux. Is there any option to disable or skip secure inter-Splunk communication for the REST APIs? Please advise. Thank you in advance. Regards, Eshwar