All Posts



Table format gets changed. Please see the picture instead.
Hi, We set up Security Command Center to send alerts to Splunk for detecting mining activity. However, I've observed that we're not receiving SCC logs in Splunk at the moment. What steps can we take to resolve this issue? Thanks
Thank you for the update. Looks like I am missing something; the eval statements do not produce any results. My SPL statement:

Index=xyz
| eval evTime=strptime(agent.status.policy_refresh_at,"%Y-%m-%dT%H:%M:%S.%6NZ")
| eval UpdateDate=strftime(evTime,"%Y-%m-%d")
| eval UpdateTime=strftime(evTime,"%H:%M:%S.%1N")
| table agent.status.policy_refresh_at, evTime, UpdateDate, UpdateTime, hostname

Results (evTime, UpdateDate, and UpdateTime are empty in every row):

agent.status.policy_refresh_at   evTime   UpdateDate   UpdateTime   hostname
2024-01-04T10:31:35.529752Z                                         CN*******
2024-01-04T10:31:51.654448Z                                         CN*******
2023-11-26T05:57:47.775675Z                                         gb********
2024-01-04T10:32:14.416359Z                                         cn********
2024-01-04T10:30:32.998086Z                                         cn*******
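A likely cause worth checking (a sketch, not verified against this data): in eval, field names that contain dots must be wrapped in single quotes, otherwise the expression resolves to null and strptime returns nothing. For example:

Index=xyz
| eval evTime=strptime('agent.status.policy_refresh_at',"%Y-%m-%dT%H:%M:%S.%6NZ")
| eval UpdateDate=strftime(evTime,"%Y-%m-%d")
| eval UpdateTime=strftime(evTime,"%H:%M:%S.%1N")
| table agent.status.policy_refresh_at, evTime, UpdateDate, UpdateTime, hostname

The table command and the search itself accept dotted field names as-is; it is only inside eval expressions that the single quotes are required.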
Hi, I want to migrate the Splunk instance from a Mac to a Windows Server 2019 machine, and I want to make sure the license is moved to the new machine. Is there a step-by-step process to perform this activity? Thanks.
Does this give you the intended behaviour?

index=proxy c_ip=$cip$ cs_host=$cshost$ action=$action$ (dest_ip=$destip$ OR NOT dest_ip=*)

I think including a (dest_ip=$destip$ OR NOT dest_ip=*) clause will search any token input but also include results for events that don't have a dest_ip field in them. The only issue I see with this is that if a dashboard user is looking for a specific dest_ip, they will also get results that match all the other field criteria but have a null dest_ip. If you want to filter out the events with a null dest_ip whenever a specific dest_ip is being searched (anything other than "*"), you could add some additional filter criteria:

index=proxy c_ip=$cip$ cs_host=$cshost$ action=$action$ (dest_ip=$destip$ OR NOT dest_ip=*)
| eval filter_off=if(NOT "$destip$"=="*" AND isnull(dest_ip), 1, 0)
| where 'filter_off'==0
Click on "index" in the Interesting Fields area to see the name of the index containing the data.  Use that value along with "index=" in the search query. I'm not an Apache expert, so I can't teach you about that.  I can help with Splunk-specific questions, though.
Hello there, we use search filters in our role management concept. It works fine, but we got stuck on the following problem: since some of our hosts have a physical hostname (srv1, srv2, srv3, ...) and a virtual hostname (server1-db, server2-db, server3-db, server1-web, server2-web, server3-app), we had to use a lookup table (on the search heads) to map the virtual names to the physical hostnames (which are the names identified by the Splunk forwarder). Our lookup table looks like this:

sys_name,srv_name
srv1,server-db1
srv2,server-db2
srv3,server-web1
srv4,server-web2
srv5,server-app1
srv6,server-app2

My role settings look like this:

[role_metrics_db]
srchFilter = index=metrics AND (host=server-db* OR srv_name=server-db*)
[role_metrics_web]
srchFilter = index=metrics AND (host=server-web* OR srv_name=server-web*)
[role_metrics_app]
srchFilter = index=metrics AND (host=server-app* OR srv_name=server-app*)

Unfortunately my search filters do not recognize either of the fields "sys_name" or "srv_name". Should the search filters be written differently? Has someone had the same challenge? Any help will be appreciated. Cheers!
I'm creating a dashboard to easily search through our web proxy logs and table out the results when troubleshooting. The issue is that sometimes the logs contain a destination IP and sometimes they don't. One of the dashboard input fields I sometimes want to specify is the destination IP (field: dest_ip); however, the field doesn't always exist, so if I use the following search (I'm excluding the tabling):

index=proxy c_ip=$cip$ cs_host=$cshost$ action=$action$ dest_ip=$destip$

Dashboard values: c_ip=1.2.3.4, cs_host=* (default), action=* (default), dest_ip=* (default)

It will exclude some of the logs, since they don't all have the field "dest_ip"; the other 3 fields exist in all logs. In the dashboard you can input values for each of the fields. I'm trying to allow that for dest_ip, but it doesn't always exist - that's the issue I'm trying to overcome.
Please teach me, because it's my first time using Splunk.
Hi, I have SNOW data for change requests in Splunk. I want to create a dashboard which gives the average duration of a change request (from actual start date to actual end date) for each type of change. The type of change can be derived from the short_description field. On the y-axis: average duration; on the x-axis: type of change request (short_description). I have written this query, but it is not giving the average duration of change. The result I am getting is too high; maybe it's calculating across all the events for the same change number, not sure.

index=servicenow short_description IN ("abc", "xyz", "123")
| eval start_date_epoch = strptime(dv_opened_at, "%Y-%m-%d %H:%M:%S"), end_date_epoch = strptime(dv_closed_at, "%Y-%m-%d %H:%M:%S")
| eval duration_hours = (end_date_epoch - start_date_epoch) / 3600
| eval avg_duration = round(avg_duration_hours, 0)
| stats avg(duration_hours) as avg_duration by change_number, short_description
| eventstats avg(avg_duration) as overall_avg_duration by short_description
| eval ocb = round(overall_avg_duration, 0)
| table short_description, ocb
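One possible fix, as a sketch assuming each change_number should contribute a single duration: drop the eval that references avg_duration_hours (a field that does not exist at that point in the pipeline), collapse to one duration per change_number first, and only then average by short_description:

index=servicenow short_description IN ("abc", "xyz", "123")
| eval start_date_epoch = strptime(dv_opened_at, "%Y-%m-%d %H:%M:%S"), end_date_epoch = strptime(dv_closed_at, "%Y-%m-%d %H:%M:%S")
| eval duration_hours = (end_date_epoch - start_date_epoch) / 3600
| stats max(duration_hours) as duration_hours by change_number, short_description
| stats avg(duration_hours) as overall_avg_duration by short_description
| eval ocb = round(overall_avg_duration, 0)
| table short_description, ocb

Here max(duration_hours) picks one duration per change; if every event for a change carries the same dv_opened_at/dv_closed_at, max, min, or latest would all give the same answer.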
If you want to search for "Apache" then add "Apache" (with or without quotes) to the query.
One thing missing from the red box is an index specifier, but that's a Best Practice that doesn't address the problem. Otherwise, it appears as though the query is as complete as it can be without knowing more about the data.  If only Apache writes to log.txt then all is good, but if other applications write to the same file name then you'll need to figure out what is unique to Apache data.  Another option is to change the input so Apache logs are in a bespoke index or source.
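If you go the bespoke-index route, a minimal inputs.conf sketch (the monitored path and index name here are assumptions; adjust them to your environment):

[monitor:///var/log/apache2/access.log]
index = apache
sourcetype = access_combined

The index must also be created on the indexers before data will land in it; after that, index=apache in a search returns only the Apache events.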
I need a command that will help me identify this file as Apache. The result will be the word Apache circled in red (image).
Try the summary index and look for either source=splunk-storage-detail or source=splunk-storage-summary, depending on what you are looking for.
Hi @steve32507, good for you, see you next time! Let me know if I can help you more, or, please, accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
@richgalloway thank you for the quick response.  I'm new to Splunk and need to set up an email notification.  I've been working through documentation for several days now, and I'm still not getting this done. Would you please tell me how to accomplish this? 1) Verify the user add/update/delete/activate events are indexed in Splunk.
Please explain what you mean by "not working".  What is the input, what are the expected results, and what are the actual results?
Thank you, have a good day!
Hi @selvam_sekar, you can correlate the two searches using the stats command, something like this:

index=abc ((sourcetype=1.1 source=1.2 "downstream") OR (sourcetype=2.1 source=2.2 "do not write to downstream")) "executioneid=*"
| stats values(sourcetype) AS sourcetype values(source) AS source BY executioneid

You can also add conditions, e.g. requiring presence in both of the sourcetypes or in only one of them. Ciao. Giuseppe
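To make that last suggestion concrete, one hedged sketch (building on the search above) that keeps only the executioneid values seen in both sourcetypes:

index=abc ((sourcetype=1.1 source=1.2 "downstream") OR (sourcetype=2.1 source=2.2 "do not write to downstream")) "executioneid=*"
| stats dc(sourcetype) AS sourcetype_count values(sourcetype) AS sourcetype values(source) AS source BY executioneid
| where sourcetype_count=2

dc() counts the distinct sourcetypes per executioneid, so sourcetype_count=2 means the id appeared in both; change the where clause to sourcetype_count=1 to find ids present in only one of them.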