All Topics

Hi there, I have a requirement where a large number of events were uploaded on 4 November, but their timestamps need to be changed to 1 November after the data has been indexed. Is that possible?
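Indexed timestamps generally cannot be rewritten in place; the usual options are to delete and re-ingest the data with corrected timestamp settings, or to shift `_time` at search time. A minimal search-time sketch, where the index/sourcetype names and the three-day offset implied by the dates above are assumptions:

```
index=my_index sourcetype=my_sourcetype
| eval _time = relative_time(_time, "-3d")
```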
Good afternoon! I'm noticing that the time format in the messages I send to /services/collector/raw isn't being parsed; in some cases the field isn't displayed in Splunk at all. My field is: "eventTime": "2022-10-13T18:08:30". Please tell me the correct format.
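The raw HEC endpoint applies the sourcetype's timestamp extraction rules, so parsing `eventTime` usually requires a props.conf stanza on the indexing tier. A sketch, where the sourcetype name is an assumption:

```
[my_hec_sourcetype]
TIME_PREFIX = "eventTime"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 30
```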
Hi, I have events that are received when an action finishes on my system. Each event contains the start and stop times for the action and a unique action_id, so my event data looks something like this: I would like to get a count of ongoing actions, e.g. with one-minute resolution, over a selected time frame. How can I do that?
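If the start and stop values are epoch timestamps on each event, the built-in `concurrency` command can count overlapping actions. A hedged sketch, where the index and field names are assumptions:

```
index=my_index action_id=*
| eval duration = stop - start
| concurrency duration=duration start=start
| timechart span=1m max(concurrency) AS ongoing_actions
```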
Hello everyone, I have a multivalue field, fetched from a JSON array, in this format:

Label
apple 1
apple 2
apple 3
banana 1
banana 2
banana 3

How can I split this into two columns, like so?

Apples     Bananas
apple 1    banana 1
apple 2    banana 2
apple 3    banana 3

I'm not able to identify what character to use in the split function. I have read various solutions on this forum, but none of them match this situation. Thanks in advance for any help you provide.
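Since the values differ by word rather than by a delimiter, `mvfilter` with a regex match may be a better fit than `split`. A sketch, assuming the multivalue field is named Label:

```
| eval Apples  = mvfilter(match(Label, "^apple"))
| eval Bananas = mvfilter(match(Label, "^banana"))
| table Apples Bananas
```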
Hi, we have recently switched from Phantom to SOAR and I'm trying to send our triggered alerts to SOAR. The TA we are using is Splunk for SOAR Export. I have tested the connection from Splunk Enterprise to SOAR and it works, but I keep getting the following error for one alert:

11-04-2022 05:31:21.724 +1100 WARN sendmodalert [17285 AlertNotifierWorker-0] - action=sendtophantom - Alert action script returned error code=1
11-04-2022 05:31:21.724 +1100 INFO sendmodalert [17285 AlertNotifierWorker-0] - action=sendtophantom - Alert action script completed in duration=1394 ms with exit code=1

This question was also asked in the alerting forum (https://community.splunk.com/t5/Splunk-SOAR-f-k-a-Phantom/Unable-to-add-auth-token-or-add-phantom-instance/m-p/344908), but I feel like that could be the wrong channel.
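`exit code=1` only says the alert action script failed; the script's own log lines in `_internal` usually carry the reason. A hedged diagnostic search:

```
index=_internal sourcetype=splunkd sendmodalert action=sendtophantom log_level=ERROR OR log_level=WARN
| sort - _time
```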
Hello everyone, I am trying to find out what search string I could use to see what file was created after a malicious file was run. The malicious file is called template.pdf, but I can't figure out what search string would show which file was created after the user opened it.
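If Sysmon data is available, FileCreate events (EventCode 11) record which process created which file. A sketch, where the index, source, and the PDF reader process name are all assumptions:

```
index=windows source="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational" EventCode=11
| search Image="*AcroRd32.exe*"
| table _time host Image TargetFilename
```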
I use some strings in data to represent months, e.g. "2022-1". Run this in Search:

| makeresults format=csv data="Month
2022-1
2022-1
2022-7
2022-7
2022-7
2022-8
2022-9
2022-9
2022-9
2022-10
2022-10
2022-10
2022-10
2022-9"
| stats count by Month
| sort Month

You get the expected count per month. Now use the same query in Dashboard Studio for a bar chart: why is October ("2022-10") suddenly being treated as a date? Is this a bug or a feature? I tried changing the strings to have a leading zero for the month, e.g. "2022-01", but then they are all treated as dates. How do I avoid this? Thanks for your help.
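Dashboard Studio infers types from the field values, so date-like strings can be coerced into dates. One workaround sometimes used (a sketch, not guaranteed for every chart type) is to make the labels unambiguously non-date strings before charting:

```
| stats count by Month
| eval Month = "M" . Month
| sort Month
```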
Hey all! Hoping you can help. I am currently building a dashboard that will allow users to select an option from a dropdown menu and then type in a username to see all events for that input and that user. I am in a bind, however, as the dropdown has several hundred options (unfortunately there is no way to slim that down), and I was wondering if there is a way to quickly and painlessly add the labels and values from a spreadsheet I have into the dropdown, or if I have to copy each of them individually. Any help would be greatly appreciated!
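Rather than hard-coding hundreds of choices, the spreadsheet can be uploaded as a CSV lookup and used to populate the dropdown dynamically. A Simple XML sketch, where the lookup file and column names are assumptions:

```
<input type="dropdown" token="selected_option">
  <label>Option</label>
  <fieldForLabel>label</fieldForLabel>
  <fieldForValue>value</fieldForValue>
  <search>
    <query>| inputlookup dropdown_options.csv | fields label value</query>
  </search>
</input>
```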
I am trying to get a JSON-formatted file into Splunk. The file is being forwarded from a UF with a monitor input; it contains data from aircraft (ADS-B data). This is a sample:

{ "now" : 1667769466.071, "messages" : 58728034, "aircraft" : [ {"hex":"8963e3","type":"adsb_icao","flight":"UAE3KE ","r":"A6-EPT","t":"B77W","alt_baro":35000,"alt_geom":34475,"gs":526.3,"ias":281,"tas":486,"mach":0.828,"wd":241,"ws":45,"oat":-46,"tat":-15,"track":91.85,"roll":-0.35,"mag_heading":93.69,"true_heading":94.78,"baro_rate":0,"geom_rate":0,"squawk":"7313","emergency":"none","category":"A5","nav_qnh":1013.0,"nav_altitude_mcp":35008,"nav_heading":94.92,"lat":52.301067,"lon":1.596706,"nic":8,"rc":186,"seen_pos":0.864,"r_dst":186.487,"r_dir":295.3,"version":2,"nic_baro":1,"nac_p":9,"nac_v":1,"sil":3,"sil_type":"perhour","gva":2,"sda":2,"alert":0,"spi":0,"mlat":[],"tisb":[],"messages":24165,"seen":0.9,"rssi":-25.1}, {"hex":"47a531","type":"adsb_icao","flight":"NOZ7YW ","r":"LN-NGS","t":"B738","alt_baro":33000,"alt_geom":32400,"gs":468.2,"ias":258,"tas":430,"mach":0.732,"wd":221,"ws":41,"oat":-46,"tat":-22,"track":37.19,"track_rate":-0.22,"roll":-5.45,"mag_heading":35.86,"true_heading":36.99,"baro_rate":0,"geom_rate":0,"squawk":"1410","category":"A3","nav_qnh":1013.6,"nav_altitude_mcp":32992,"nav_altitude_fms":33008,"nav_heading":35.16,"lat":52.396033,"lon":1.734820,"nic":8,"rc":186,"seen_pos":10.505,"r_dst":184.026,"r_dir":297.5,"version":2,"nic_baro":1,"nac_p":9,"nac_v":1,"sil":3,"sil_type":"perhour","gva":2,"sda":2,"alert":0,"spi":0,"mlat":[],"tisb":[],"messages":8664,"seen":6.8,"rssi":-30.0}, {"hex":"484b91","type":"adsb_icao","flight":"KLM1293 ","r":"PH-BGK","t":"B737","alt_baro":40000,"alt_geom":39400,"gs":457.5,"ias":241,"tas":466,"mach":0.796,"wd":229,"ws":32,"oat":-47,"tat":-19,"track":304.94,"track_rate":-0.03,"roll":-0.18,"mag_heading":300.06,"true_heading":301.14,"baro_rate":32,"geom_rate":-64,"squawk":"6260","category":"A0","nav_qnh":1013.2,"nav_altitude_mcp":40000,"lat":53.694841,"lon":1.827527,"nic":8,"rc":186,"seen_pos":4.051,"r_dst":224.752,"r_dir":316.3,"version":0,"nac_p":8,"nac_v":0,"sil":2,"sil_type":"unknown","alert":0,"spi":0,"mlat":[],"tisb":[],"messages":4336025,"seen":2.2,"rssi":-30.0}, {"hex":"406754","type":"adsb_icao","flight":"EZY36HD ","r":"G-EZWC","t":"A320","alt_baro":38000,"alt_geom":37900,"gs":420.8,"track":320.79,"baro_rate":-256,"squawk":"5730","category":"A3","lat":49.963852,"lon":1.830091,"nic":8,"rc":186,"seen_pos":49.821,"r_dst":179.161,"r_dir":250.1,"version":2,"nac_v":1,"sil_type":"perhour","alert":0,"spi":0,"mlat":[],"tisb":[],"messages":122526,"seen":31.2,"rssi":-27.9}, {"hex":"400f99","type":"mode_s","r":"G-DBCJ","t":"A319","alt_baro":23000,"alt_geom":22625,"gs":475.2,"track":69.81,"baro_rate":0,"nac_v":1,"alert":0,"spi":0,"mlat":[],"tisb":[],"messages":11,"seen":2.3,"rssi":-31.1} ] }

How can I get every object (starting with "hex") into a separate event, with all fields extracted? Ideally the timestamp of every event would be the one from the header field named "now".
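One hedged approach is to break the file into one event per aircraft object via props.conf. Applying the header's "now" value to every event is not straightforward at index time (a scripted input or preprocessing step would be needed), so this sketch falls back to current time; the sourcetype name is an assumption:

```
[adsb_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n,]+\s*)(?=\{"hex")
DATETIME_CONFIG = CURRENT
KV_MODE = json
TRUNCATE = 0
```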
Hello, can anyone help me with a problem where a client is connected to the deployment server but unable to send logs of any kind? No internal logs and no monitored logs are being received at the indexer, even though phone home is happening and apps are being deployed to the client. Thank you.
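Deployment-server phone home and log forwarding are configured independently, so a missing or broken outputs.conf on the client is a common cause. A minimal sketch, where the indexer address is an assumption:

```
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997
```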
Hello, I am very new to Splunk. I am wondering how to split these two values into separate rows. The "API_Name" values are grouped but I need them separated by date. Any assistance is appreciated! SPL:

index=...
| fields source, timestamp, a_timestamp, transaction_id, a_session_id, a_api_name, api_name, API_ID
| convert timeformat="%Y-%m-%d" ctime(_time) AS date
| eval sessionID=coalesce(a_session_id, transaction_id)
| stats values(date) as date dc(source) as cnt values(timestamp) as start_time values(a_timestamp) as end_time values(api_name) as API_Name by sessionID
| where cnt>1
| eval start=strptime(start_time, "%F %T.%Q")
| eval end=strptime(end_time, "%FT%T.%Q")
| eval duration_ms=abs((end-start)*1000)
| stats count perc95(duration_ms) as perc95_duration_ms values(API_Name) as API_Name by date
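To turn the grouped multivalue result into one row per API name per date, `mvexpand` can follow the final stats. A hedged sketch of the final stage:

```
| stats count values(API_Name) AS API_Name by date
| mvexpand API_Name
```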
We are upgrading the OS on our Splunk server to RHEL 8.6. What checklist should we follow with respect to installed Splunk apps, and how do we check whether the apps are compatible?
Hi, on our Splunk Cloud ES search head there is a data durability error with unhealthy instances; the status shows that the search factor is not met. Thanks.
NONPROD:abcd123456_DBSERVER — I need to extract abcd123456 from the string...
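A `rex` extraction keyed on the colon and underscore delimiters would cover this. A self-contained sketch you can paste into Search (the field name is an assumption):

```
| makeresults
| eval my_field = "NONPROD:abcd123456_DBSERVER"
| rex field=my_field ":(?<extracted_id>[^_]+)_"
| table extracted_id
```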
Hi, I am forwarding logs UF -> HF -> Indexer -> Search Head. I am forwarding Windows event logs with index=os_windows from the UF to the heavy forwarder and then to the indexer. Do I need to create index=os_windows on the heavy forwarder? If the answer is no, I want to verify on the HF that the logs are arriving from the UF. How should I search for that? Please let me know. Thanks.
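Indexes only need to exist where the data is stored, so the HF typically does not need the index. To confirm the UF is connecting, the HF's own metrics can be searched; a sketch (run wherever the HF's _internal data is searchable, hostname is a placeholder):

```
index=_internal host=<hf_hostname> source=*metrics.log* group=tcpin_connections
| stats latest(_time) AS last_seen by sourceIp hostname
```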
Hello team, my workplace bought Splunk a year ago, and I am teaching myself with limited access to our office Splunk. The tutorial I am following provides instructions for installing free Splunk on a home computer, so I installed it today. Within half an hour I got the following error. How could I exceed the license within 30 minutes? I thought the free license lasts 60 days. I see a lot of posts on the same topic (sorry for the duplicate post); can someone tell me how to troubleshoot? Do I have to uninstall and reinstall?

Error in 'litsearch' command: Your Splunk license expired or you have exceeded your license limit too many times. Renew your Splunk license by visiting www.splunk.com/store or calling 866.GET.SPLUNK.

The search job has failed due to an error. You may be able to view the job in the Job Inspector.
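The license usage log shows how much data was actually indexed each day, which usually explains violation messages like this. A hedged sketch:

```
index=_internal source=*license_usage.log* type=Usage
| timechart span=1d sum(b) AS bytes_indexed
```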
Observed a peculiar case where a UF on a syslog server is not reading the complete log file. For example, there is a PAN log for 4 Nov with entries for every hour in that file, but the UF seems to read only the first 4 hours and then stops ingesting to the cloud. The next day, when the new log file (5 Nov) is created, it again reads that file for a couple of hours and then stops. Points to note:
- There is only one log file (2022-11-05.log), which keeps updating as logs get pushed to the syslog server from the network host.
- The size of the log for one day is around 500 GB plus.
- No CRC settings are used in the input stanza.
Can you let me know what is causing the UF to stop reading the complete log file?
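At 500 GB+ per day, one likely suspect is the UF's default throughput cap (256 KBps), which can silently fall behind on a fast-growing file. A sketch raising it in limits.conf on the UF (whether this is the actual cause here is an assumption to verify):

```
[thruput]
# 0 removes the cap; set a concrete KBps value to throttle instead
maxKBps = 0
```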
Hi All, I am trying to install Splunk on RedHat Linux on my personal VM. I am facing issues. Kindly help. Regards Suman P.
I am trying to configure a Universal Forwarder to output to an HTTP Event Collector endpoint in Cribl. This Cribl endpoint has been configured for me, and the admin has disabled the use of an authentication token. If I leave the httpEventCollectorToken field out or empty, I get messages like this in splunkd.log:

S2S - Authtoken is empty/size invalid, token:
TcpOutputProc - _isHttpOutConfigured=NOT_CONFIGURED
TcpOutputProc - LightWeightForwarder/UniversalForwarder not configured. Please configure outputs.conf.

This seems to imply the Universal Forwarder will not allow me to omit the token. If I send data to the endpoint using curl to manually generate an HTTP POST request, it all works. Can anyone shed light on this? Thanks, Rob
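The splunkd messages suggest the UF refuses an empty token, so a workaround sometimes tried is supplying a syntactically valid dummy token; whether Cribl ignores the value is an assumption to verify with the admin. A sketch of the outputs.conf stanza, with a placeholder URI:

```
[httpout]
httpEventCollectorToken = 00000000-0000-0000-0000-000000000000
uri = https://cribl.example.com:8088
```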
I have a query that returns successful logins and a profile ID. From those results I want to create another search, for each result, that shows the email address of the profile ID. The first query is:

index=commerce loginSuccessful=true
| stats count by profile

Then, for each "profile", I would want to do the following:

index=commerce profile=<profile> email!="<null>" email!=null
| table profile email