All Posts


The inputlookup command reads from a single lookup.  There is no provision for reading multiple files at once (via wildcards, for instance).  Go to https://ideas.splunk.com to make a case for this enhancement to inputlookup.
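Until such an enhancement exists, one common workaround is to read each lookup explicitly and append the results. A minimal sketch (the lookup names here are hypothetical):

```spl
| inputlookup products_eu.csv
| append
    [| inputlookup products_us.csv]
```

This gets verbose with many files, but it combines the rows of each lookup into a single result set within one search.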
Hello,
I am trying to get the filename (name.exe) from a full path (dir + filename) in Windows folders, e.g. C:\dir1\dir2\filename.ext, using the search below:

index=os_sysmon NOT Image="*Sysmon*" EventCode=1
| rex field=Image "Executable=(?P<Executable>[^\\\]+)$"
| table Image Executable

Problem: Executable is always empty.
Can you please advise?
Best regards,
Altin
Thank you.
Hello,
I am trying to test sending an email triggered by an alert. For that, I first tried to send an email using the sendemail command. I used the free subscription of Brevo to get an accessible SMTP server, then configured the email settings in my Splunk Enterprise instance. Below are the screenshots of my email settings. For the password, I am using the master key that Brevo provides for its SMTP service. For the rest of the settings, I kept the defaults.

I am trying to send the data to a dummy email address in Mailinator. Below is my SPL search with the error. The error refers to the sender address, which I left at the default (Send Email as user: Splunk). I tried using my personal Gmail address as well but got the same error for that address. Can anyone please help me debug or resolve this issue?
https://docs.splunk.com/Documentation/Splunk/9.1.1/Indexer/Setupmultipleindexes

You don't have to add your app to the indexers, but you must define your index on the indexers. On a standalone instance you can define it via the GUI; however, if you have an indexer cluster you must edit an indexes.conf file, which is pushed to the IDX tier in the CM bundle.
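As a sketch, a minimal indexes.conf stanza for the cluster bundle might look like this (the index name is a placeholder; retention, volume, and path settings depend on your environment):

```ini
[my_custom_index]
homePath   = $SPLUNK_DB/my_custom_index/db
coldPath   = $SPLUNK_DB/my_custom_index/colddb
thawedPath = $SPLUNK_DB/my_custom_index/thawedb
repFactor  = auto
```

In a cluster this would live in an app under the cluster manager's manager-apps directory and be distributed to the peers with `splunk apply cluster-bundle`.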
Fill in the empty values using the mvmap function.

| makeresults
| eval _raw="{\"name\": \"my name\", \"values\": [{\"rank\": 1, \"value\": \"\"}, {\"rank\": 2, \"value\": \"a\"}, {\"rank\": 3, \"value\": \"b\"}, {\"rank\": 4, \"value\": \"c\"}]}"
| spath
| rename values{}.rank as rank
| rename values{}.value as value
| eval value=mvmap(value, if(value="", "[empty]", value))
| table name, rank, value
@dural_yyz Thanks for the insight. I've declared the index in my app's indexes.conf, which is installed on the HF; the index is essentially being populated by a scripted input. But is there a way to avoid installing my app on the indexers? Also, can you please point me to the reference that says I have to install my app on the indexers?
It appears that the field action has text values and you are trying to apply a volume limit in a where statement.  You could create a new field, e.g. | eval tmp=if(action IN ("value1","value2"), "1", "0").  At that point you can stats count or sum the new field and apply your where statement based upon your own needs.   Just a thought.
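Put together, that suggestion might look like the sketch below (the action values and threshold are illustrative, taken from elsewhere in this thread):

```spl
| eval tmp=if(action IN ("Not Found","Forbidden"), 1, 0)
| stats sum(tmp) as failures by src
| where failures > 100
```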
That exam is in its beta period.  Exam results will not be returned until Splunk has enough results to assess the exam itself.  It could be months until we learn how we did.
Hi @yasit, that isn't correct: if you try to send logs to a nonexistent index, you get a message (something like this: "unconfigured/disabled/deleted index=wineventlog with source="source::WinEventLog:System""), but the index isn't automatically created. Ciao. Giuseppe
Hello! I am trying to get the streamfwd app to capture traffic on an interface located on my virtual machine. Does this app not recognize link-layer virtualization? This is the error I am receiving, and I currently can't find a workaround:

"(SnifferReactor/PcapNetworkCapture.cpp:238)  stream.NetworkCapture - SnifferReactor unrecognized link layer for device <lo0>: 253"

I was also receiving the same error when I changed my streamfwd.conf to capture on a different network interface, and I even tried putting the interface into promiscuous mode. Any help/troubleshooting on this would be appreciated! FYSA, I am using 64-bit CentOS 8.
Agreed - you need to have the index defined on the indexers.  Since the HF cooks the data before forwarding it, you need a matching configuration on the receiving side.  Failure to do this will mean your data routes to the last-chance index. On the indexer, check the btool output for indexes.conf:

[default]
lastChanceIndex = <index name>
* An index that receives events that are otherwise not associated with a valid index.
* If you do not specify a valid index with this setting, such events are dropped entirely.
* Routes the following kinds of events to the specified index:
  * events with a non-existent index specified at an input layer, like an invalid "index" setting in inputs.conf
  * events with a non-existent index computed at index-time, like an invalid _MetaData:Index value set from a "FORMAT" setting in transforms.conf
* You must set 'lastChanceIndex' to an existing, enabled index. Splunk software cannot start otherwise.
* If set to "default", then the default index specified by the 'defaultDatabase' setting is used as a last chance index.
* Default: empty string
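To see what the indexer actually has configured, you could run btool on the indexer itself, for example (assuming a default installation path):

```
$SPLUNK_HOME/bin/splunk btool indexes list default --debug
```

The --debug flag shows which .conf file each setting comes from, which helps trace where a lastChanceIndex value was set.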
Thanks @gcusello. What seems to be the issue, then? My understanding was that, by default, if Splunk receives data for an index that doesn't exist, it will attempt to create the index dynamically.
Hi @yasit,
you have two choices:
- install the app also on the Indexers (which I don't recommend), or
- manually create the index on the Indexers.
Usually this is described in the app's instructions. Which app is it?
Ciao.
Giuseppe
My app contains the indexes.conf which declares the index. The app is installed on the heavy forwarder and is not installed on the indexer. The problem is that the data does not land on the indexer.
Thanks for looking into it; however, it did not go through. It still gives an error: The argument '(eval(action IN (Not Found,Forbidden)))' is invalid
Hello @richgalloway,  Thanks for the information, I will try to do that ! Regards, GaetanVP
Hi, I have this query:

| makeresults
| eval _raw="{\"name\": \"my name\", \"values\": [{\"rank\": 1, \"value\": \"\"}, {\"rank\": 2, \"value\": \"a\"}, {\"rank\": 3, \"value\": \"b\"}, {\"rank\": 4, \"value\": \"c\"}]}"
| spath
| rename values{}.rank as rank
| rename values{}.value as value
| table name, rank, value

Because value is empty in the first item of values, the values in the table are shifted up one row and are no longer aligned with rank.  How could I conditionally update value to, say, [empty] when it is an empty string in the data?
You are almost there.

| stats count(eval(action IN ("Not Found","Forbidden"))) as failures by src
| where failures > 100
| table src
The feature you are looking for is trellis.  But Splunk doesn't currently do trellis for the table visualization. (I'm almost sure that Grafana does.)  You can sort of hack something yourself if you are willing to get into nitty-gritty Simple XML programming (or Dashboard Studio source).  Oh, you also need to know all possible values of SectionName in advance.  The basic idea is:

1. Run a <query /> to populate an aggregate token with the values of Attribute in the same search window, e.g.,

index=websphere_cct (Object="HJn5server1" Env="Prod") OR (Object="HJn7server3" Env="UAT") SectionName="Process Definition"
| spath path=Attributes
| eval Attributes = mvappend("SectionName", json_array_to_mv(json_keys(Attributes)))

(Note this only runs in Splunk 8.0 and later.)

2. Use <condition><progress /></condition> to set or unset a dedicated token for every possible SectionName value.  If the value exists in the aggregate token, set the token; otherwise unset it.

3. Use these dedicated tokens to hide or show tables, one for each possible SectionName value.

You can read about hide-and-show in Access tokens to show or hide user interface components, and about setting dynamic tokens in Search tokens for dynamic display example.

Here is a mock dashboard you can play with. (I included comments about the actual search that you can substitute.)  Alas! The code is too long.  You can download/copy from here: Mock table trellis in Splunk Simple XML.

Here is a screenshot: As you can see, from your illustrated attribute list of 9, my mock search pretends to have found 4.  So only those 4 corresponding trellis tables show on the left-hand side.  If you edit the attribute selection (in source), different tables will show.  In edit mode, all 9 tables are visible, with hidden ones in grey. (The right-hand side is the big table you illustrated, with all 9 attributes.)
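A skeletal Simple XML fragment of the set/unset pattern might look like the following. This is a sketch, not the actual dashboard: the token name show_heap, the attribute value "Heap", and the elided search are all made up, and a real dashboard would need one condition pair and one panel per SectionName value:

```xml
<search>
  <query>... search that builds the aggregate Attributes field ...</query>
  <progress>
    <!-- set the token when "Heap" appears among the returned attributes -->
    <condition match="match($result.Attributes$, &quot;Heap&quot;)">
      <set token="show_heap">true</set>
    </condition>
    <!-- otherwise unset it so the panel hides -->
    <condition>
      <unset token="show_heap"></unset>
    </condition>
  </progress>
</search>

<panel depends="$show_heap$">
  <table>...</table>
</panel>
```

The depends attribute on the panel is what makes it appear only while its token is set.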
Several notes:
- You could have saved volunteers lots of time (and done yourself a favor) by illustrating sample data that matches your desired output.  The JSON in the description has too little in common with the table you show.
- As your search restricts SectionName to "Process Definition", it doesn't seem to make sense to list SectionName in the table. (SectionName is not an Attribute, anyway.)  But I still included it in my emulation.
- Maintenance is painful and not very scalable compared with a true trellis feature.
Hope this helps.