All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Glad you managed to resolve the issue, as I did. I also searched high and low for a solution, and luckily managed to find the link I sent you through Google, which resolved the issue. @dwthomas16 Happy Splunking.
Hi, I want to know what happens after a Splunk universal forwarder reaches its throughput limit, because I found that my universal forwarder stops ingesting data at a certain moment every day, and I don't know what happened. I set the thruput limit in limits.conf and restarted the UF, and the remaining data was then collected, although I'm not sure that will still work next time. So, once the throughput limit is reached, will the Splunk UF stop collecting data until the next restart?
I need to run curl commands for various tasks such as creating searches, accessing searches, etc. The command below works perfectly:

curl -k -u admin:test12345 https://127.0.0.1:8089/services/saved/searches/ \
  -d name=test_durable \
  -d cron_schedule="*/15 * * * *" \
  -d description="This test job is a durable saved search" \
  -d dispatch.earliest_time="-15h@h" \
  -d dispatch.latest_time=now \
  --data-urlencode search="search index=_audit sourcetype=audittrail | stats count by host"

But given that I may have to craft various curl commands with different -d flags, I want to be able to pass the values through a file, so I used the command below:

curl -k -u admin:test12345 https://127.0.0.1:8089/services/saved/searches/ --data-binary data.json

where data.json looks like this:

{
  "name": "test_durable",
  "cron_schedule": "*/15 * * * *",
  "description": "This test job is a durable saved search",
  "dispatch.earliest_time": "-15h@h",
  "dispatch.latest_time": "now",
  "search": "search index=_audit sourcetype=audittrail | stats count by host"
}

but in doing so I get the following error:

<?xml version="1.0" encoding="UTF-8"?>
<response>
  <messages>
    <msg type="ERROR">Cannot perform action "POST" without a target name to act on.</msg>
  </messages>
</response>

After going through a lot of different posts on this topic, I realised that Splunk seems to have a problem with the JSON format, or mainly with extracting the 'name' attribute from JSON. Can someone please assist with how I can craft a curl command that takes its data from a file, like the one above, and gets a correct response from Splunk?
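One likely cause worth noting (an assumption about the failure, not something the error message states): Splunk's REST endpoints expect an application/x-www-form-urlencoded body rather than JSON, and --data-binary data.json without a leading @ makes curl send the literal string "data.json" as the body, so no name parameter ever reaches Splunk. A minimal Python sketch of re-encoding JSON parameters into the form body curl's -d flags would produce (the keys below are a subset of the data.json above):

```python
import json
from urllib.parse import urlencode

# JSON parameters as they would appear in data.json.
raw = """{
  "name": "test_durable",
  "cron_schedule": "*/15 * * * *",
  "search": "search index=_audit sourcetype=audittrail | stats count by host"
}"""

# Re-encode as application/x-www-form-urlencoded, the encoding
# that curl's -d and --data-urlencode flags produce and that
# Splunk's REST endpoints accept.
body = urlencode(json.loads(raw))
print(body)
```

You could write the result to a file and post it with `curl -k -u admin:test12345 https://127.0.0.1:8089/services/saved/searches/ -d @body.txt` (note the @, which tells curl to read the file rather than send the literal filename).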
Hi @gcusello, I set up two VMs, one a Splunk Enterprise instance and the other a universal forwarder. I was able to figure it out; I had something wrong in inputs.conf on the forwarder. Thank you for clarifying the trial license question, as I had that doubt. Cheers.
Hi, I am facing the same issue. Did you end up fixing it?
Hello, I am aware that Splunk is a very versatile application which allows users to manipulate data in many ways. I have extracted the fields event_name, task_id and event_id. I am trying to create an alert for when the event_id increments for the same task_id and event_name as the latest event arrives in Splunk.

For example, the event at 3:36:40.395 PM has task_id 3 and event_id 1223680, and the latest event, which arrived at 3:52:40.395 PM, has task_id 3 and event_id 1223681. I want to alert because, for the same task_id (3) and event_name (server_state), the event_id has incremented.

I believe this is only possible if we store the previous event_id in a variable for each event_name and task_id, so we can compare it with the new event_id. However, we have four different task_ids, and I am not sure how to save the event_id for all of them. Any help would be appreciated.

Log format explanation:

8/01/2023 3:52:40.395 PM server_state|3 1223681 5
Date Timestamp event_name|task_id event_id random_number

Sample log file:

8/01/2023 3:52:40.395 PM server_state|3 1223681 5
8/01/2023 3:50:40.395 PM server_state|2 1201257 3
8/01/2023 3:45:40.395 PM server_state|1 1135465 2
8/01/2023 3:41:40.395 PM server_state|0 1545468 5
8/01/2023 3:36:40.395 PM server_state|3 1223680 0
8/01/2023 3:25:40.395 PM server_state|2 1201256 2
8/01/2023 3:15:40.395 PM server_state|1 1135464 3
8/01/2023 3:10:40.395 PM server_state|0 1545467 8

Thank you
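In SPL, this kind of previous-value comparison is typically what streamstats handles (e.g. streamstats last(event_id) by task_id), so no manual variable is needed. As a language-neutral illustration, here is a Python sketch of the per-task comparison described above, using values from the sample log (the field names are assumptions based on the post):

```python
# Walk events oldest-first and flag any task whose event_id
# increased between consecutive events for the same task_id --
# the comparison a streamstats-based SPL alert would perform.
events = [  # (time, task_id, event_id), oldest first
    ("3:36:40", 3, 1223680),
    ("3:52:40", 3, 1223681),
    ("3:25:40", 2, 1201256),
    ("3:50:40", 2, 1201257),
]

last_seen = {}  # task_id -> previous event_id
alerts = []
for ts, task, eid in events:
    prev = last_seen.get(task)
    if prev is not None and eid > prev:
        alerts.append((ts, task, prev, eid))  # increment detected
    last_seen[task] = eid

print(alerts)
```

Each alert tuple carries the time, the task_id, and the old and new event_id, which is essentially what the triggered Splunk alert would report.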
Are all these numbers in a single field, or part of a larger raw event? Assuming they are in a single field in the event, then simply

| eval numbers=split(your_big_long_numbers_field, ",")

will make a new field called numbers, a multivalue field containing all your split numbers. If you then want to make a new row for each of those numbers, use

| mvexpand numbers
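If it helps to see the mechanics outside SPL, here is a Python sketch of what split and mvexpand do, using a hypothetical comma-separated field value:

```python
# `eval numbers=split(field, ",")` -> one multivalue field,
# `mvexpand numbers`              -> one row per value.
raw = "101,205,333"  # hypothetical comma-separated field

numbers = raw.split(",")  # the multivalue field after split()

# mvexpand: duplicate the row once per value in the multivalue field
rows = [{"numbers": n} for n in numbers]

print(numbers)
print(rows)
```

The key point is that split keeps everything in one event, while mvexpand fans that event out into one result row per value.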
Have the dashboards in different tabs in the browser and use a browser tab cycler to cycle between the tabs?  
Is the dashboard search using tokens in the search?
You will have to use the colorPalette expression syntax, as in the example below. You can simply copy this XML row into an existing dashboard to see how it works; it's a dummy search that just creates a random time, and when that time falls in the out-of-hours range the cell goes red.

<row>
  <panel>
    <table>
      <title>Turning the Time column red if outside hours 18:00 to 06:00</title>
      <search>
        <query>| makeresults
| eval _time=now() - (random() % 86400)
| eval Date=strftime(_time, "%F"), Time=strftime(_time, "%T")
| eval EventCode=4624, Account_Name="user ".(random() % 10)
| table Date Time EventCode Account_Name</query>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
        <sampleRatio>1</sampleRatio>
      </search>
      <option name="count">100</option>
      <option name="dataOverlayMode">none</option>
      <option name="drilldown">none</option>
      <option name="percentagesRow">false</option>
      <option name="refresh.display">progressbar</option>
      <option name="rowNumbers">false</option>
      <option name="totalsRow">false</option>
      <option name="wrap">true</option>
      <format type="color" field="Time">
        <colorPalette type="expression">if(tonumber(substr(value,1,2))&gt;=18 OR tonumber(substr(value,1,2))&lt;6, "#FF0000", "#FFFFFF")</colorPalette>
      </format>
    </table>
  </panel>
</row>
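The expression inside colorPalette reduces to a simple hour test on the first two characters of the Time value. A Python sketch of that same logic, handy for checking the boundary cases before pasting the XML into a dashboard:

```python
def cell_color(time_str):
    # time_str is "HH:MM:SS"; red from 18:00 through 05:59,
    # mirroring the colorPalette expression above.
    hour = int(time_str[:2])
    return "#FF0000" if hour >= 18 or hour < 6 else "#FFFFFF"

print(cell_color("23:15:00"))  # after hours -> red
print(cell_color("16:09:30"))  # business hours -> white
```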
In your outer search

index=firstindex Email_Address

remove the word "Email_Address". I assume you want to look for a field called Email_Address in the firstindex data using the values coming from the subsearch, but with this search you are also looking for the literal WORD Email_Address, not just the value of the Email_Address field coming from the subsearch. You can see what a subsearch returns by running it on its own with the | format specifier, e.g.

index=secondindex user="dreamer" | fields Email_Address | head 1 | format
OK, so if ALL your hosts are in the logs, you just need this:

index="index" source="C:\\Windows\\System32\\LogFiles\\Log.log" earliest=-45m latest=now
| eval Detection=if(match(_raw, "Detection!"), 1, 0)
| stats sum(Detection) as Detections by host

This finds all events from Log.log, then line 2 sets a new field to 1 if the word "Detection!" is found in the event, and the stats adds together all the Detection flags for each host.

This is a key technique in Splunk for getting different sets of information from the same data: first select ALL the data you want to consider, then use eval (Splunk's Swiss Army knife) to set some indicator (in this case, whether a particular event is the one you are really interested in counting), and then let stats add up all the detections. Hosts that do NOT contain the word "Detection!" will always have Detection=0, so they end up with a Detections value of 0.

Hope this helps.
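The flag-and-sum pattern itself is just conditional counting. Here is a Python sketch of the same logic with made-up events, to make the mechanics explicit:

```python
import re

# Flag each event (1 if it matches, 0 otherwise), then sum the
# flags per host -- the eval + stats pattern from the SPL above.
# The events are hypothetical.
events = [
    {"host": "web01", "_raw": "ERROR Detection! quarantine started"},
    {"host": "web01", "_raw": "INFO scan completed"},
    {"host": "web02", "_raw": "INFO heartbeat"},
]

detections = {}
for e in events:
    flag = 1 if re.search(r"Detection!", e["_raw"]) else 0
    detections[e["host"]] = detections.get(e["host"], 0) + flag

print(detections)  # hosts with no matches still appear, with 0
```

Because every selected event contributes a flag (even a 0), hosts without any detections still show up in the result, which is exactly why the SPL version lists them with Detections=0.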
So, try the suggestion. You only need the single search I posted earlier, but with your updated search it should be like this:

index=dl* ("Error_MongoDB") OR ("Record_Inserted")
| eval Status=if(match(_raw, "Error_MongoDB"), "Failure", "Success")
| rename msg.attribute.ticketId as ticketId
| timechart span=1d dc(ticketId) by Status
| eval FailurePercentage = (Failure/Success)*100
| fillnull FailurePercentage

You don't need all the fields/table commands; the timechart will remove the unnecessary fields anyway.
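As a sanity check on the arithmetic, here is a Python sketch of the FailurePercentage step with fillnull-style defaults (the daily counts are made up):

```python
# Per-day Success/Failure counts as timechart would produce them;
# a day with no failures simply lacks the Failure column.
days = [
    {"day": "2023-08-01", "Success": 40, "Failure": 10},
    {"day": "2023-08-02", "Success": 25},  # no failures that day
]

for d in days:
    s, f = d.get("Success", 0), d.get("Failure", 0)
    # fillnull-style default: missing or zero denominator -> 0
    d["FailurePercentage"] = (f / s) * 100 if s else 0

print([d["FailurePercentage"] for d in days])
```

Note this computes failures as a percentage of successes, matching the SPL above; if you wanted failures as a share of all tickets, the denominator would be Success+Failure instead.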
Sorry, still not sure I get it. You say partial matches of both A and B, so for your second example, what are the rules there?

field a = AAAAA\ABCDE-SS410009$
field b = A=AAAAA\ABCDE-SS410009,B=Domain,C=AB,D=XXX,E=NET

Now I want to match:

field a = AAAAA\ABCDE-SS410009
field b = AAAAA\ABCDE-SS410009

In the above, you show that all characters up to and excluding the final $ sign are found in B, so you appear to be showing the longest match of A found in B. So, if A had AAAAA\ABCDE-PP921234$, would you expect to see AAAAA\ABCDE as a match result, and if A had BBBBB\ABCDE-SS410009$, would you expect to see ABCDE-SS410009 as a match?

Also, is the A= part in B related to field 'a'?
Hi @splunk_learn, you cannot use two Splunk instances because they have the same IP address and hostname; use two virtual machines instead. A trial license is a full-feature license, so the issue isn't the license. Ciao. Giuseppe
Hi @rsannala, yes, it's possible, as described at https://docs.splunk.com/Documentation/Splunk/latest/Data/Advancedsourcetypeoverrides. Remember that you have to perform this transformation on the first full Splunk instance the data reaches: a Heavy Forwarder (if present) or an Indexer. Ciao. Giuseppe
Hi @leykmekoo, are you sure that the field used as the search key is named exactly "Email_Address" in both searches, and that the values are compatible? If you manually extract a value from the subsearch, do you get results when using that value in the main search? Ciao. Giuseppe
Hi @jamaluddin-k, forwarding data from the GUI is a feature for sending logs to another Splunk instance, not to a syslog server. If you want to send logs to a syslog server, you have to follow the instructions at https://docs.splunk.com/Documentation/Splunk/9.1.1/Forwarding/Forwarddatatothird-partysystemsd#Syslog_data. Ciao. Giuseppe
Hi @Ammar, let me understand: is your issue that the search doesn't find any result, or that the search finds results but no action fires?

In the first case, you have to debug your search. I see that you didn't use an index definition; if the index to use isn't in the default search path, you won't find anything:

index=your_index host=192.168.1.1 "DST=192.168.1.174"
| stats count AS Requests BY SRC
| sort -Requests
| where Requests>50

Also, are you sure that your logs contain exactly the string "DST=192.168.1.174"? As written, this isn't a field definition used by the search; if you have the field DST (which is usually lowercase!) you can use it without quotes.

In the second case, you have to check the response action configuration: which one did you configure? To be listed in the triggered alerts, or to receive an email, you have to configure these actions in the response actions; it isn't automatic by default.

Ciao. Giuseppe
I audit Windows computers. My search returns the date, time, EventCode and Account_Name:

Date          Time      EventCode  Account_Name
2023/08/29    16:09:30  4624       jsmith

I would like the Time field to turn red when a user signs in after hours (1800 - 0559). I have tried clicking on the pen icon in the Time column and selecting Color, then Ranges, but I always get error messages about not putting the numbers in the correct order. What do I need to do?