All Posts


Hi. I have to upgrade a Splunk environment from 7.2.4.2 to 9.1. I don't have the option to migrate to a new cluster, and the Upgrade Readiness App is not available for our current version. I know I need to go from 7 to 8 and then to 9, in the order Cluster Master, Indexer Peers, Search Peers, Deployer, Deployment Server, then UFs and HFs. Can anyone offer input on what might catch me out in the process? Thanks
Hi, can you help with this one? time_difference remains empty after the calculation.
Thanks @yuanliu, this is what I was looking for. Great!
I couldn't quite understand what you're trying to do with the timestamp. There are many different ways to extract the timestamp from a log. If you want to capture this field additionally, you can use \S+\s\S+. Here is an example regex: ^\S+\s+\S+\s\w+\s(\S+(?:\S+\s+){1}\S+)
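As an illustrative sketch (the sample log line and the field name log_time are assumptions, not from the original thread), a two-token timestamp at the start of a raw event could be captured in SPL like this:

| makeresults
| eval _raw="2024-01-05 10:11:56.000 INFO something happened"
| rex "^(?<log_time>\S+\s\S+)"

Here rex runs against _raw by default, and the \S+\s\S+ pattern grabs the date and time tokens together into log_time.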
Looking good, but this regex doesn't handle special characters and digits. For example, the date 05/0/:2024 10:11:56.000 EST
Yes, you can receive logs directly in Splunk by opening a TCP/UDP port, but this method is not recommended by Splunk. In the correct scenario, it would be more appropriate to write the logs to a file using software like rsyslog or syslog-ng and then monitor that file.
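A minimal monitor stanza for that setup (the path, index, and sourcetype here are assumptions; adjust them to wherever your rsyslog/syslog-ng instance actually writes) might look like:

# inputs.conf on the forwarder -- assumes rsyslog writes per-host files under /var/log/remote/
[monitor:///var/log/remote/*.log]
sourcetype = syslog
index = syslog
disabled = false

This way the forwarder handles restarts and file rotation for you, which is the main reason file monitoring is preferred over a raw TCP/UDP input.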
I think there is no need for the where filter; you can exclude the internal IPs in the base search. Can you try this?

source="udp:514" index="syslog" sourcetype="syslog" NOT src IN (10.0.0.0/8, 172.16.0.0/12, 192.168.0.0/16)
Hello Giuseppe, the app upgraded successfully. However, I am now getting the following error when trying to access the configuration page: "Configuration page failed to load, the server reported internal errors which may indicate you do not have access to this page. Error: Request failed with status code 500." Kindly assist.
Hi, according to the documentation you can do this, but it is strongly discouraged. In the correct scenario, it would be more sensible to use a signed certificate or to perform SSL offloading on a load balancer. I recommend against testing this in a production environment.

In $SPLUNK_HOME/etc/system/local/server.conf:

[sslConfig]
enableSplunkdSSL = false

From server.conf.spec:

[sslConfig]
* Set SSL for communications on the Splunk back end under this stanza name.
* NOTE: To set SSL (for example HTTPS) for Splunk Web and the browser, use the web.conf file.
* Follow this stanza name with any number of the following setting/value pairs.
* If you do not specify an entry for each setting, the default value is used.

enableSplunkdSSL = <boolean>
* Enables/disables SSL on the splunkd management port (8089) and KV store port (8191).
* NOTE: Running splunkd without SSL is not recommended.
* Distributed search often performs better with SSL enabled.
* Default: true
Okay then, we can use regex. Can you try this?

| makeresults
| eval description = "somestring1 somestring2 somestring3 somestring4 somestring5"
| rex field=description "(?<new_field>^\S+(?:\S+\s+){20}\S+)"
| eval new_field_replaced = replace(description,new_field,"")
| eval description = replace(description,new_field,"")
I need to truncate the string based on word count, not character count. It should save the truncated string (the first 25 words) into a new field. What you provided works on a character-count basis.
Hi @shakti, check the read permissions that the user running Splunk has on the files to read, and the execute permissions, and compare them with the old ones. Ciao. Giuseppe
Hello, you can try this:

| makeresults
| eval test = "somestring1somestring2somestring3"
| eval new_field = if(len(test)>20,substr(test,20,len(test)),null())
How can I truncate the log description after 20 words in Splunk and store it in a new field?
Hi Splunk experts, we have Splunk Enterprise running on Linux. Is there any option to disable or skip secure inter-Splunk communication for REST APIs? Please advise. Thank you in advance. Regards, Eshwar
Hello Giuseppe, thank you for your reply. I tried upgrading it, but I am getting the following error: "Error connecting to /services/apps/local: The read operation timed out..." Could you please suggest a solution?
Check this post for more technical details. https://community.splunk.com/t5/Dashboards-Visualizations/How-to-use-dynamic-checkbox-in-table/m-p/635091    
Yes @willtseng0217, you can store the data in the KV Store. You just have to take care of a couple of things:
- Create a lookup with a unique ID that can be used for updating records.
- Call the KV Store API on the click of a button.
- Display the status and the action button value as per the KV Store.

I hope this helps you move further.
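As a rough sketch of the read/update half of this (the lookup name button_status_lookup, the field names, and the key value are all assumptions for illustration), the status could be persisted from a search like:

| inputlookup button_status_lookup
| eval status=if(id="row42", "clicked", status)
| outputlookup button_status_lookup

When the lookup is backed by a KV Store collection, outputlookup writes through to the collection, so the status survives a dashboard refresh; the button's click handler just needs to trigger this search (or the equivalent storage/collections/data REST call) with the clicked row's id.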
@kamlesh_vaghela Sir, thanks for responding so quickly. May I know, is it possible to keep the status field content when the dashboard refreshes? For example, when I click that button, write _time, a, b, and status to the lookup table or something, so that once the dashboard refreshes I can look that up and keep the status content alive.
@Ryan.Paredez Thanks for your reply. I have gone through all the posts. They mention that an incorrect configuration could be a possible reason for the error. The configuration looks good from our side, i.e. the same config is working on other PODs. I have opened a support case and we are working on it. Regards, Amit Singh Bisht