All Posts

Hi @shakti, there are many Python scripts in the bin folder. Anyway, upgrade the app and check. Ciao. Giuseppe
Hello @elizabethl_splu, any news? Is it available in V9.2? Thanks.
More words please. "after migration" can mean anything - an upgrade from version to version, an attempt to move the app between different environments, upgrade of the underlying Splunk version... We have no idea what happened in the first place. Then there's the issue of "page not loading" - what does it mean? Does it show any errors? Do other pages of the same app load properly and only config page doesn't load? Did you check your _internal index for errors?
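For reference, a minimal sketch of such an _internal check (the sourcetype and log_level filters are common conventions, not from the original post; adjust to your environment):
index=_internal sourcetype=splunkd log_level=ERROR
| stats count by component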
Hello Everyone, I have created an alert which uses the sendresults command to format the email notification. The problem I have with this is that it does not have a "View Splunk Results" link to view the Splunk results. So I have added addinfo to the search to grab the search id and appended it to the Splunk URL.
https://<hostname>:8000/en-US/app/search/search?sid=scheduler__user__search__RMD5fa2e7e4e362d_at_".$info_sid$."
| eval application_name = "<a href=https://<hostname>:8000/en-US/app/search/security_events_dashboard?form.field2=&form.application_name=" . application_name . ">" . application_name . "</a>"
| eval email_subj="Security Events Alert", email_body="<p>Hello Everyone,</p><p>You are receiving this notification because the application has one or more security events reported in the last 24 hours..<br></p><p> Please click on the link available in the table to fetch events for specific application.</p> </p><p>To view splunk results <a href=https://<hostname>:8000/en-US/app/search/search?sid=scheduler__user__search__RMD5fa2e7e4e362d_at_".$info_sid$.">Click here</a></p>
I am able to receive the link, but this link is not loading. Could someone please assist me with this? I want to receive a link similar to the one I receive when an alert is triggered. Regards, Sai
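For reference, a minimal sketch of building such a link from the fields addinfo provides (the <hostname> placeholder and the results_link field name are illustrative, not part of the original search):
| addinfo
| eval results_link="https://<hostname>:8000/en-US/app/search/search?sid=" . info_sid
| fields - info_min_time info_max_time info_search_time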
map is slow and limited - try something like this
| timechart span=10m aligntime=latest count by host
| addcoltotals label="Total" labelfield=_time
| tail 2
| eval _time=if(_time=="Total", _time, "last_count_of_events")
| fields - _span
| transpose 0 column_name=host header_field=_time
| eval avg_count_of_events=round(Total/6)
| eval percent_of_increase = round((last_count_of_events/avg_count_of_events)*100)-100
| table host avg_count_of_events last_count_of_events percent_of_increase
Thanks for the post, it helped me solve a very similar issue that I've encountered.
Hi Splunk team, My question: Can we create two modules to add the devices which are not listed in entity management? If we create two modules, does it affect other things in ITSI? Please provide the steps and guidance on how to overcome this issue. Thanks
I need to calculate the average number of events in the last hour and compare it with the number of events in the last 10 minutes for each host.
index="cloudflare"
| spath path=ClientRequestHost output=host
| stats count as event_count by host
| eval avg_count_of_events = round(event_count/6)
| map search="search index=cloudflare ClientRequestHost=$host$ earliest=-10min | stats count as last_count_of_events | eval host=$host$ | eval avg_count_of_events=$avg_count_of_events$ | eval event_count=$event_count$ "
| eval percent_of_increase = round((last_count_of_events/avg_count_of_events)*100)-100
| table host avg_count_of_events last_count_of_events percent_of_increase
Is there a more effective way to do that?
Thank you for your reply, Giuseppe. The app version is 3.2.4 and the Splunk version is 9.0.4. However, I don't see any Python scripts for this app in the backend. Should I upgrade the app to 4.1.1 and then see what happens? Please let me know what you think.
Hi, can I directly push ASA firewall logs i
Again I am sorry but that is not clear. You can use Splunk SOAR to run SPL in Splunk/ES using the Splunk App's run_query action. You will have to either know the full SPL and run it manually, or, if they are standard, regularly used searches, you can build playbooks to dynamically populate the SPL before running it in Splunk. If you want to find all the events that came from ES, that would usually mean checking how many containers/events have that specific label, which can be done via REST:
https://<soar_url>/rest/container?_filter_label=<your_label>
This should return a count of all containers/events with that label.
"simple" is a subjective term! Assuming you can evaluate state based on the events as being either 0 or 1 (I have used a random number to simulate different events), then you could try something lik... See more...
"simple" is a subjective term! Assuming you can evaluate state based on the events as being either 0 or 1 (I have used a random number to simulate different events), then you could try something like this | eval state=if(random()%5 == 0, 0, 1) | streamstats range(state) as changed count as host_event by host global=f window=2 | eval changed = if(host_event == 1, 1, changed) | where changed == 1 | streamstats range(_time) as interval last(state) as state by host global=f window=2 | appendpipe [| stats last(state) as state last(_time) as last_event by host | addinfo | eval _time=info_max_time | eval interval=_time-last_event | fields - info_* last_event]
@phanTom Hi, typically we address security incidents using SOAR directly. My inquiry pertains to identifying searches within SOAR through Splunk ES. Is there a specific REST call command for this purpose?
| makeresults count=1
| eval json_data="{\"data\": {\"a\": {\"x\": {\"mock_x_field\": \"value_x\"}, \"y\": {\"mock_y_field\": \"value_y\"}}}}"
| append
    [ makeresults count=1
    | eval json_data="{\"data\": {\"b\": {\"x\": {\"mock_x_field\": \"value_x\"}, \"y\": {\"mock_y_field\": \"value_y\"}}}}" ]
| append
    [ makeresults count=1
    | eval json_data="{\"data\": {\"c\": {\"x\": {\"mock_x_field\": \"value_x\"}, \"y\": {\"mock_y_field\": \"value_y\"}}}}" ]
| append
    [ makeresults count=1
    | eval json_data="{\"data\": {\"d\": {\"x\": {\"mock_x_field\": \"value_x\"}, \"y\": {\"mock_y_field\": \"value_y\"}}}}" ]
| spath input=json_data
| fields - json_data
| transpose 0 column_name=field header_field=_time
| eval field=mvindex(split(field,"."),0).".".mvindex(split(field,"."),2).".".mvindex(split(field,"."),3)
| transpose 0 header_field=field
| fields - column
Instead of
| eval Target="5"
try this
| streamstats count as row
| eventstats max(row) as max
| eval Target=if(row=1 OR row=max,5,null())
| fields - row max
Then set the graph to connect when values are null
It is producing the below result. I want to read the x and y fields.
| makeresults count=1
| eval json_data="{\"data\": {\"a\": {\"x\": {\"mock_x_field\": \"value_x\"}, \"y\": {\"mock_y_field\": \"value_y\"}}}}"
| append
    [ makeresults count=1
    | eval json_data="{\"data\": {\"b\": {\"x\": {\"mock_x_field\": \"value_x\"}, \"y\": {\"mock_y_field\": \"value_y\"}}}}" ]
| append
    [ makeresults count=1
    | eval json_data="{\"data\": {\"c\": {\"x\": {\"mock_x_field\": \"value_x\"}, \"y\": {\"mock_y_field\": \"value_y\"}}}}" ]
| append
    [ makeresults count=1
    | eval json_data="{\"data\": {\"d\": {\"x\": {\"mock_x_field\": \"value_x\"}, \"y\": {\"mock_y_field\": \"value_y\"}}}}" ]
| spath
| fields - _raw
| transpose 0 column_name=field
| eval field=mvindex(split(field,"."),0).".".mvindex(split(field,"."),2).".".mvindex(split(field,"."),3)
| transpose 0 header_field=field
| fields - column
Please clarify your question - you say you want to add a column but appendpipe will add row(s) - you have 5 locations, not 4. Do you wish to exclude a particular location, or sum the logins for all locations by desk?
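For illustration, a minimal sketch of what appendpipe does here - it appends rows, not columns (the logins, desk, and location field names and the "All" label are assumptions for the example):
| appendpipe
    [ stats sum(logins) as logins by desk
    | eval location="All" ]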
@AL3Z Correlation Searches reside in Enterprise Security. To get them into SOAR you need to use the SOAR Adaptive responses (send_to_phantom/run_playbook) on the Notable Event which will create a container/event in SOAR for you to run automation against.  If I misunderstood the ask please expand and I will try to help. 
Hi @gabrieltrust, if you have a lookup (called e.g. perimeter.csv) with at least one field (host), you can run something like this:
| tstats count WHERE index=* NOT [ | inputlookup perimeter.csv | fields host ] BY host
Ciao. Giuseppe
It depends what you want to do next - if you just want to remove the a, b, c, and d from the field names, you could just do this
| spath
| fields - _raw
| transpose 0 column_name=field
| eval field=mvindex(split(field,"."),0).".".mvindex(split(field,"."),2).".".mvindex(split(field,"."),3)
| transpose 0 header_field=field
| fields - column