All Posts


The first version, depends="$t1_token$,$t2_token$", should work (it does for me). Which version of Splunk are you using?
Hi, your case() should end with ",1=1,100)" and not "1==1,100".
Thank You
^([A-Za-z0-9]\.|[A-Za-z0-9][A-Za-z0-9-]{0,61}[A-Za-z0-9]\.){1,3}[A-Za-z]{2,6}$
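Outside Splunk, the pattern above can be exercised with a quick Python check. This is just an illustration of what the regex accepts (1-3 dot-terminated labels plus a 2-6 letter TLD); the sample strings are made up:

```python
import re

# The domain-name pattern from the post, verbatim.
DOMAIN_RE = re.compile(
    r"^([A-Za-z0-9]\.|[A-Za-z0-9][A-Za-z0-9-]{0,61}[A-Za-z0-9]\.){1,3}[A-Za-z]{2,6}$"
)

def is_domain(s: str) -> bool:
    """Return True if the whole string matches the domain pattern."""
    return DOMAIN_RE.fullmatch(s) is not None

print(is_domain("www.example.com"))  # True
print(is_domain("-bad.com"))         # False: labels cannot start with a hyphen
```

Note the `{1,3}` quantifier caps the name at three labels before the TLD, and `{2,6}` rejects longer TLDs.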
Hi Everyone, has anyone managed to successfully use the "Akamai Prolexic DNS GTM and SIEM API (Unofficial)" app? I keep getting this error when testing the Prolexic API data input:

Traceback (most recent call last):
  File "C:\Program Files\Splunk\etc\apps\akamai-api-integration\bin\akamai_api_integration\aob_py3\urllib3\connection.py", line 175, in _new_conn
    (self._dns_host, self.port), self.timeout, **extra_kw
  File "C:\Program Files\Splunk\etc\apps\akamai-api-integration\bin\akamai_api_integration\aob_py3\urllib3\util\connection.py", line 72, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "C:\Program Files\Splunk\Python-3.7\lib\socket.py", line 752, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 11001] getaddrinfo failed

The official Akamai SIEM app was not designed to ingest the Prolexic API, so unfortunately it is of no use to me. Many thanks.
Thanks so much! This is exactly what I was trying to achieve. Apologies for the wrongly formatted data; your dummy data is correct. By "wildcard" I meant a field name that appears multiple times but can have any number of different subfields (i.e. `wildcard_field.*`). I wasn't sure if that was the correct terminology, but your answer works exactly for this field. Thanks again for your answer. It solves my problem, and I have also learnt a bit more about searching in Splunk, which I really appreciate.
Depending on what you are trying to do with the StartTime, you may need one or both of the evals:

eventtype=windows_index_windows eventtype=hostmon_windows host="///" (Name="///*") OR (Name="///*") OR (Name="///*") StartTime="*"
| table Name, StartTime
``` Parse (strptime) the StartTime into an epoch time value ```
| eval epochStartTime=strptime(StartTime,"%Y%m%d%H%M%S.%6N%z")
``` Format (strftime) the epoch time into a string ```
| eval stringStartTime=strftime(epochStartTime,"%F %T.%6N")
Hello! I am new to Splunk and attempting the BOTS workshop, Hunting an APT with Splunk - Reconnaissance, and have encountered an issue. Following the video, I tried to access the identity centre, asset centre and the Frothly environment network diagram. However, none of these are working for me. The Frothly environment shows a blank screen. The Identity and Asset centres show an error in the 'inputlookup' command: External command based lookup 'identity_lookup_expanded' is not available because KV store initialisation has failed. Does anyone have any idea how to get around this, or has anyone else encountered this error?
@marnall Thanks for your reply. You are right, it is the same problem! It was said to be fixed in 9.1, and I don't have any problem on 9.1.3. However, the same bug has re-appeared on 9.2.0.1.
It works perfectly for the specified date, but how do I include it in my command if I want your answer to be the format for every StartTime:

eventtype=windows_index_windows eventtype=hostmon_windows host="///" (Name="///*") OR (Name="///*") OR (Name="///*") StartTime="*"
| table Name, StartTime

Sorry if I'm not clear; English is not my first language.
Does this help?

| makeresults
| eval time="20240324090045.560961-240"
| eval _time=strptime(time,"%Y%m%d%H%M%S.%6N%z")
| eval stringtime=strftime(_time,"%F %T.%6N")
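For anyone reproducing the same conversion outside Splunk, here is a rough Python sketch. It assumes (as the timestamp's shape suggests) that the trailing signed number is a UTC offset in minutes, in the style of WMI/CIM timestamps; Python's %z directive does not accept that form, so the offset is split off manually:

```python
from datetime import datetime, timedelta, timezone

def parse_wmi_datetime(s: str) -> datetime:
    """Parse a timestamp like 20240324090045.560961-240.

    Assumption: the trailing signed integer is a UTC offset in
    minutes, so it is stripped and applied as a tzinfo by hand.
    """
    # Find the sign that starts the offset (it follows the
    # fractional seconds; a leading sign is excluded by i > 0).
    for i, ch in enumerate(s):
        if ch in "+-" and i > 0:
            body, offset_minutes = s[:i], int(s[i:])
            break
    else:
        body, offset_minutes = s, 0
    dt = datetime.strptime(body, "%Y%m%d%H%M%S.%f")
    return dt.replace(tzinfo=timezone(timedelta(minutes=offset_minutes)))

ts = parse_wmi_datetime("20240324090045.560961-240")
print(ts.strftime("%Y-%m-%d %H:%M:%S.%f"))  # 2024-03-24 09:00:45.560961
```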
Your dummy data looks like it might be JSON, but it isn't correctly formatted. I am not sure what you mean by "wildcard fields". Here is a run-anywhere example with assumptions about your dummy data:

| makeresults format=json data="[{ \"event\": \"type_A\", \"entity_id\": 123 } , { \"event\": \"type_B\", \"entity_id\": 123, \"wildcard_field.field1\": \"val1\", \"wildcard_field.field2\": \"val2\" } , { \"event\": \"type_B\", \"entity_id\": 345, \"wildcard_field.field1\": \"val1\", \"wildcard_field.field2\": \"val2\" }]"
``` The lines above set up what I assume represents your dummy data ```
| eventstats values(event) as events by entity_id
| where event=="type_B" AND mvcount(events) == 2
| fields - events
| table wildcard_field.*
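The matching logic in that search (keep type_B events whose entity_id also appears in a type_A event, then project the wildcard columns) can be sketched in plain Python. The event dicts below are hypothetical, mirroring the dummy data above:

```python
# Hypothetical events mirroring the dummy data in the SPL above.
events = [
    {"event": "type_A", "entity_id": 123},
    {"event": "type_B", "entity_id": 123,
     "wildcard_field.field1": "val1", "wildcard_field.field2": "val2"},
    {"event": "type_B", "entity_id": 345,
     "wildcard_field.field1": "val1", "wildcard_field.field2": "val2"},
]

# entity_ids that have at least one type_A event (the eventstats step).
type_a_ids = {e["entity_id"] for e in events if e["event"] == "type_A"}

# Keep type_B events with a matching type_A (the where step), then
# project only the wildcard_field.* columns (the table step).
matched = [
    {k: v for k, v in e.items() if k.startswith("wildcard_field.")}
    for e in events
    if e["event"] == "type_B" and e["entity_id"] in type_a_ids
]
print(matched)  # one row, for entity_id 123
```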
Hello! I have a date showing in this format (string): 20240324090045.560961-240. I've been trying to convert it into a readable date format for my dashboards, with no success so far. Does someone have a solution? I'm getting nowhere with the Splunk docs or other solved problems. Thanks!
First of all, thanks for creating the dummy data, it was very useful, although slightly wrong. I corrected the data to remove Y1_3, which doesn't exist in your tables nor your desired results. Try something like this:

| makeresults
| eval _raw="ID_A;ID_B;X1;X2
A1;B1;X1_1;X2_1
A2;B2;X1_2A;X2_2
A2;B2;X1_2B;X2_2
A3;B3;X1_3;X2_3
"
| multikv forceheader=1
| table ID_A, ID_B, X1, X2
| append
    [| makeresults
    | eval _raw="ID_A;ID_B;Y1;Y2
A2;B2;Y1_2;
A2;B2;;Y2_2
A3;B3;;Y2_3A
A3;B3;;Y2_3B
A4;B4;Y1_4;Y2_4
"
    | multikv forceheader=1
    | table ID_A, ID_B, Y1, Y2]
| append
    [| makeresults
    | eval _raw="ID_B;ID_C;Z1
B1;C1;Z1_1
B3;C3;Z1_3
B5;C5;Z1_5
"
    | multikv forceheader=1
    | table ID_B, ID_C, Z1]
| table ID_A, ID_B, ID_C, X1, X2, Y1, Y2, Z1
| eventstats values(ID_C) as ID_C values(Z1) as Z1 by ID_B
| eventstats values(X1) as X1 values(X2) as X2 values(Y1) as Y1 values(Y2) as Y2 by ID_A ID_B
| eventstats values(X1) as X1 values(X2) as X2 values(Y1) as Y1 values(Y2) as Y2 values(ID_A) as ID_A by ID_C ID_B
| mvexpand Y2
| mvexpand X1
| fillnull value="N/A"
| streamstats count by ID_A ID_B ID_C X1 X2 Y1 Y2 Z1
| where count==1
| foreach * [| eval <<FIELD>>=if(<<FIELD>>=="N/A",null(),<<FIELD>>)]
| fields - count

Note that mvexpand can cause memory issues, so you need to check your job status to ensure the search works correctly with real data.
I'm trying to achieve the following search and hoped others might have some helpful suggestions. I have two events from a summary index: `type_A` and `type_B`. They share a common field `entity_id` that may or may not match. I want to get all events of `type_B` where there is an event of `type_A` with a matching `entity_id`. From this result, in `type_B` I have some wildcard fields (a common `wildcard_field` name with different sub-fields, such as `wildcard_field.field1`, `wildcard_field.field2`) and I want to extract the data for those fields into a table for visualisation. Example of event structure:

{ event: type_A; entity_id: 123; }
{ event: type_B; entity_id: 123; // Matches a type_A event
  wildcard_field.field1: val1; wildcard_field.field2: val2; }
{ event: type_B; entity_id: 345; // This one won't have a matching type_A event
  wildcard_field.field1: val1; wildcard_field.field2: val2; }

Thank you for any suggestions.
I have two dropdown lists. The second dropdown list should show/hide based on the first dropdown list's value (based on two values). With one value it works fine:

<input type="dropdown" token="sourceToken" depends="$t1_token$" searchWhenChanged="false">

I have tried the following for two values, but neither is working:

<input type="dropdown" token="sourceToken" depends="$t1_token$,$t2_token$" searchWhenChanged="false">
<input type="dropdown" token="sourceToken" depends="t1_token,t2_token" searchWhenChanged="false">

Please advise.
Field names with special characters such as hyphens often need to be quoted with single quotes.

index=* uri=validate
| eval SLA=1000
| stats count as total_calls count(eval('execution-time' < SLA)) as sla_compliant_count
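The counting logic in that stats call can be sketched in Python to show what the conditional count does. The sample response times here are made up for illustration:

```python
# Hypothetical response-time samples standing in for the indexed events.
execution_times = [250, 1800, 990, 1000, 430]

SLA = 1000  # the threshold set by the eval above

total_calls = len(execution_times)
# count(eval('execution-time' < SLA)) counts only events where the
# comparison is true; note it is strictly less than the SLA.
sla_compliant_count = sum(1 for t in execution_times if t < SLA)

print(total_calls, sla_compliant_count)  # 5 3
```

The value exactly equal to the SLA is not counted as compliant, which is why the comparison operator matters.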
Hello @ezmo1982, yes, the exact feature was released in ES 7.2.0 - https://docs.splunk.com/Documentation/ES/7.2.0/RN/Enhancements - as part of https://ideas.splunk.com/ideas/ESSID-I-189. Please accept the solution and hit Karma if this helps!
I am trying to get the count of hits to a particular API, and based on a field called execution-time I am calculating SLA. I am able to see the number of requests coming to the API, but not able to get the SLA count using the query below. Can someone help me with where I am going wrong?

index=* uri=validate
| eval SLA=1000
| stats count as total_calls count(eval(execution-time < SLA)) as sla_compliant_count
Small correction: In the result table, rows 4 and 5, ID_C should be "C3" (from table Z).