All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi community, good day! I need your help to create a dashboard showing all the notables that are in the In Progress and Pending statuses, along with the assignee names. I was using the query below but couldn't get the correct details; please help me correct it. | `es_notable_events` | search timeDiff_type=current | stats sum(count) as count by urgency | `stats2chart("urgency")` Thanks in advance, Kishore.
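For reference, a sketch of a query that groups notables by status and assignee, assuming a Splunk ES environment where the `notable` macro and its enriched fields (status_label, owner) are available — field and status names may differ per installation:

```
`notable`
| search status_label IN ("In Progress", "Pending")
| stats count by rule_name, status_label, owner
```

The `notable` macro enriches raw notable events with the review metadata (status, owner) that `es_notable_events` alone may not carry.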
I am attempting to use a Generic S3 bucket, with CDR files in multiple folders inside, to visualize the data. I am getting the following error and am not sure why the account isn't found. In building the source, Splunk autofills the values, and I can log into S3 with the account. What logs should I check, or what remediation should I try?

  File "C:\Program Files\Splunk\etc\apps\Splunk_TA_aws\bin\splunk_ta_aws\modinputs\generic_s3\aws_s3_data_loader.py", line 86, in index_data
    self._do_index_data()
  File "C:\Program Files\Splunk\etc\apps\Splunk_TA_aws\bin\splunk_ta_aws\modinputs\generic_s3\aws_s3_data_loader.py", line 107, in _do_index_data
    self.collect_data()
  File "C:\Program Files\Splunk\etc\apps\Splunk_TA_aws\bin\splunk_ta_aws\modinputs\generic_s3\aws_s3_data_loader.py", line 153, in collect_data
    self._discover_keys(index_store)
  File "C:\Program Files\Splunk\etc\apps\Splunk_TA_aws\bin\splunk_ta_aws\modinputs\generic_s3\aws_s3_data_loader.py", line 223, in _discover_keys
    credentials = self._generate_credentials()
  File "C:\Program Files\Splunk\etc\apps\Splunk_TA_aws\bin\splunk_ta_aws\modinputs\generic_s3\aws_s3_data_loader.py", line 384, in _generate_credentials
    self._config.get(tac.aws_iam_role),
  File "C:\Program Files\Splunk\etc\apps\Splunk_TA_aws\bin\splunk_ta_aws\common\aws_credentials.py", line 158, in load
    credentials = self._load(aws_account_name, aws_iam_role_name)
  File "C:\Program Files\Splunk\etc\apps\Splunk_TA_aws\bin\splunk_ta_aws\common\aws_credentials.py", line 169, in _load
    raise AWSAccountError('account not found', aws_account_name)
splunk_ta_aws.common.aws_credentials.AWSAccountError: account not found
Is anyone having issues with nanosecond formatting from JSON logs? Currently it seems like the subseconds get truncated. What's sent: 2020-09-11T10:23:44.30373164-05:00 What shows: 9/11/20 10:23:44.000 AM If you have experienced this issue, how are you coping with it, or how did you solve it?
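If the subseconds are being dropped at index time, one hedged fix is to give the sourcetype an explicit TIME_FORMAT that includes a subsecond specifier. A props.conf sketch — the sourcetype name is hypothetical, and %9N/%:z support should be checked against the strptime variables documented for your Splunk version:

```
[my_json_sourcetype]
SHOULD_LINEMERGE = false
# %9N = subseconds up to nanosecond width, %:z = UTC offset written with a colon (e.g. -05:00)
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%9N%:z
MAX_TIMESTAMP_LOOKAHEAD = 40
```

Note also that the event viewer rounds the displayed time to milliseconds even when more precision was parsed, so verify against the stored _time value rather than the rendered one.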
Hi Community, I am stuck preparing a dashboard where I have a text input with the token name "uname". I want to use that token value in the search string I am writing under the dropdown input, to get all the JSessionIDs for that user ID.

<input type="text" token="uname">
  <label>Username</label>
</input>
<input type="dropdown" token="session">
  <label>Jsessionid</label>
  <fieldForLabel>Jsessionid</fieldForLabel>
  <fieldForValue>Jsessionid</fieldForValue>
  <search>
    <query>index=middleware_logs userName=$uname$|stats count by JSessionID</query>
  </search>
</input>

The token value doesn't get picked up in the dropdown search query. Is there a way to make this work? Please help.
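One thing worth checking in the snippet above: the dropdown's fieldForLabel/fieldForValue ("Jsessionid") don't match the case of the field the populating search actually returns ("JSessionID"), and Splunk field names are case-sensitive, so the dropdown can come up empty even when the $uname$ token is being substituted correctly. A sketch of the aligned version, assuming the field in your data really is JSessionID:

```xml
<input type="dropdown" token="session">
  <label>Jsessionid</label>
  <fieldForLabel>JSessionID</fieldForLabel>
  <fieldForValue>JSessionID</fieldForValue>
  <search>
    <query>index=middleware_logs userName=$uname$ | stats count by JSessionID</query>
  </search>
</input>
```

Also make sure the text input has actually set the token (press Enter or give it an initialValue); the dropdown's populating search only runs once $uname$ has a value.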
We are planning to move our hot data to faster storage. This new drive is already present on our Windows indexer, so what is the best way to approach this? We use NetApp for storage. Thank you.
When calculating a minimum value across events, how can you retain an event field related to the minimum value, and then display the minimum value together with that field?
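A common pattern for this is eventstats, which attaches the aggregate to every event so you can filter back to the event(s) that carry the minimum. A sketch with hypothetical field names:

```
... | eventstats min(response_time) as min_rt
| where response_time = min_rt
| table host response_time
```

If several events tie for the minimum this keeps all of them; append | head 1 if you only want a single row.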
Hello, we haven't received audit logs since upgrading from R77 to R80.1; does anyone have an idea? FW logs are OK. Should we switch to the Checkpoint app for Splunk (https://splunkbase.splunk.com/app/4293/) using Log Exporter? Thanks for your help. Splunk Enterprise 7.3.4 / OPSEC LEA app 5.0
Fresh install of Splunk 7.3.3 on Windows Server 2016 index server. I kept getting the "Splunk installation ended prematurely" with no clear evidence as to why. I tried most of the possible solutions in older posts, with no improvement. Turns out my domain Splunk service account had been set up with membership in a slew of groups, including the domain admin group. This apparently caused a permissions disconnect in the index server local security policy, where both Deny logon as a batch job and Deny logon as a service included Domain Admin. So, even though my Splunk service account had permission to Logon as a batch job and Logon as a service, it was blocked because of the Domain Admin membership. I removed the service account from the Domain Admin group and the installation completed successfully.
Hello, I have a playbook that is currently in production, and I don't want to test it blindly without asking the question first. We have a condition that has to be met for our playbook to continue via an if/else decision filter. The filter is based on whether a user is an Employee or Non-Employee. However, we have other employee and non-employee types; an example would be "Employee Executive". Currently the operators are: == Employee OR == Non-Employee. I'm wondering if the "in" option is more of a "contains"? Could I switch the operator values to just "in Employee", since the word Employee is in all the string options we would want to evaluate to true? Anything else would be false and follow the else path.
I want to check how Splunk was deployed in our environment in the past: was it done using the tgz file or the rpm/dpkg method? Our environment is Linux. Can someone help me out with this?
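A hedged way to check is to ask the package managers first and fall back to assuming a tarball install. Sketch of the commands, assuming the package is named splunk (which is what Splunk's official .rpm/.deb packages use):

```
rpm -q splunk      # RHEL family: prints the package version if installed from .rpm
dpkg -s splunk     # Debian family: prints package status if installed from .deb
# If both report the package as not installed but $SPLUNK_HOME exists,
# the deployment was most likely extracted from the .tgz.
```

Run these on each Splunk host, since different tiers may have been installed differently.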
Hello, what is the maximum number of events that can be pushed from Splunk to SNow Event Management in one call? Currently we are facing an issue: when we tried to forward thousands of events to SNow during a major outage, the request took a long time to complete, and subsequent calls were skipped because the previous call was still running in the background. Regards, Naresh
Hello Splunkers, I have created my own app and would like to know how to add a dashboard to the app bar. For example, I would like to create a new tab in the app bar called Main. It will be my main interactive panel. Thank you, Marco
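App-bar tabs come from the app's navigation file, default/data/ui/nav/default.xml (or the equivalent under local/). A sketch, assuming the dashboard was saved with the filename main.xml — adjust the name to your dashboard's actual ID:

```xml
<nav search_view="search">
  <view name="main" default="true"/>
  <view name="search"/>
</nav>
```

The name attribute is the view's filename without the .xml extension, and default="true" makes it the app's landing page. A restart or a visit to the debug/refresh endpoint is typically needed for the change to appear.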
Hi everyone. I am still learning Splunk, so I will need your assistance on this, please. I am currently working on a PoC where our firewalls are sending traffic logs to Splunk. In order to shrink the volume of data being ingested into Splunk, I would like to know if there is an option to remove "variable" names from the traffic being transmitted. In summary, this is a typical traffic log that we receive from our firewalls: Instead, we would like to ingest only the "values" from each "key/value" pair. In this case, instead of time=17:28:50 devname="my3700d", only 17:28:50 "my3700d" would be ingested. Could you please assist me with this? Thank you very much. Fmandelli
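One hedged option is an index-time SEDCMD in props.conf on the parsing tier (indexer or heavy forwarder), which rewrites the raw event before it is stored. A sketch — the sourcetype name is hypothetical, and be aware of the trade-off: once the key names are gone, automatic key=value field extraction at search time no longer works, so you would trade license volume for searchability:

```
[fortigate_traffic]
# Strip "key=" prefixes from each pair, keeping only the values
SEDCMD-strip_keys = s/\w+=//g
```

Note that SEDCMD changes what is indexed, not what the firewall transmits over the network; the data still reaches the parsing tier in full.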
Hi Splunkers,   Can anyone please help with search time line break for the following log.     {"audits":[{"id":"000","version":1,"modified":"2020-09-11T12:28:44.351897585Z","sortValues":null,"action":{"JobID":"97979779797","Name":"TA"},"user":"x","object":"Jobs","type":"w","identifier":"0h0hh0h0hh"},{"id":"879789","version":1,"modified":"2020-09-11T12:27:46.568076802Z","sortValues":null,"action":{"JobID":"0000000"},"user":"KKK","object":"Jobs","type":"delete","identifier":""},{"id":"90808","version":1,"modified":"2020-09-11T12:25:04.808661137Z","sortValues":null,"action":{"JobID":"9889808088","Name":"KA"},"user":"uy","object":"Jobs","type":"add","identifier":"9878979797"},     I want to break the logs using {"id. Thank you!  
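Line breaking is configured at index time in props.conf rather than at search time, and the first capture group in LINE_BREAKER is discarded as the event boundary. A sketch that starts a new event at each {"id, consuming the comma between records (the sourcetype name is hypothetical):

```
[my_audit_sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = (,)\{"id
```

This needs to be applied on the indexer or heavy forwarder that first parses the data; already-indexed events are not re-broken.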
Hi all, I'm new to Splunk and I had a hard time extracting fields using regex for the following example: Class (six, seven) Can someone help me with the above example? I want "class" as the field name and "six" and "seven" as two values for that field. Thanks for the help in advance.
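A sketch of one way to do it in SPL: capture everything inside the parentheses with rex, then split it into a multivalue field (this assumes the separator is always a comma followed by a space):

```
... | rex "Class \((?<class>[^\)]+)\)"
| eval class=split(class, ", ")
```

After the split, class is a multivalue field holding six and seven, which stats and mvexpand can then work with.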
I have one host with multiple sourcetypes. I want to extract a field, but the field differs slightly across events, so I currently have to write a different rex command for each. Is there a way to write one rex command that covers all events? For example: Win_7_cuckoo.vmx packer-centos6.vmx test-vm-auto2.vmx win-10-test1.vmx — I want everything except the .vmx suffix. Can anyone help with this?
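If the common shape is a value ending in .vmx, a single rex can capture everything before that suffix regardless of sourcetype. A sketch — it assumes the value is in the host field, so adjust field= to wherever the name actually lives:

```
... | rex field=host "^(?<vm_name>.+)\.vmx$"
```

The anchors keep the match strict, so events whose value doesn't end in .vmx simply leave vm_name unset rather than extracting garbage.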
In the documentation at https://docs.splunk.com/Documentation/AddOns/released/AWS/CloudWatchLogs we see the limitation below. Can you please clarify? Can't we use the Splunk Add-on for AWS for streaming CloudWatch Logs?   Due to rate limitations, don't use the Splunk Add-on for AWS to collect CloudWatch Log data which has the source type aws:cloudwatchlogs:*. Instead, use the Splunk Add-on for Amazon Kinesis Firehose to collect CloudWatch Log and VPC Flow Logs. The Splunk Add-on for Amazon Kinesis Firehose includes index-time logic to perform the correct knowledge extraction for these events through the Kinesis input as well.
We are using a clustered environment with indexers, search heads, a deployer, and a heavy forwarder (all running the same Splunk Enterprise version). We are setting up a TA on the heavy forwarder which generates a CSV file (lookup table). We would like to automatically copy/replicate this CSV file from the heavy forwarder to the search heads/indexers. How can we achieve this? Thanks in advance.
Hi, here is the XML code of my two panels:

<panel>
  <single>
    <search base="abc">
      <query>| eval Duration=round(abs((relative_time(now(), "@d")-relative_time(strptime(Timestamp, "%Y-%m-%dT%H:%M:%S.%N"), "@d"))/86400),0) | table Duration | rename Duration | head 1</query>
    </search>
    <option name="colorBy">value</option>
    <option name="colorMode">none</option>
    <option name="drilldown">all</option>
    <option name="numberPrecision">0</option>
    <option name="rangeColors">["0x53a051","0xf1813f","0xdc4e41"]</option>
    <option name="rangeValues">[0,1]</option>
    <option name="refresh.display">progressbar</option>
    <option name="showSparkline">1</option>
    <option name="showTrendIndicator">1</option>
    <option name="trellis.enabled">0</option>
    <option name="trellis.scales.shared">1</option>
    <option name="trellis.size">medium</option>
    <option name="trendColorInterpretation">standard</option>
    <option name="trendDisplayMode">absolute</option>
    <option name="useColors">1</option>
    <option name="useThousandSeparators">1</option>
    <drilldown>
      <link target="_blank">/app/myapp/myapp__details</link>
    </drilldown>
  </single>
</panel>
<panel>
  <single>
    <search base="abc">
      <query>| eval dummy="true" | eval epochnow = now() | eval epochHorodate=strptime(Horodate, "%Y-%m-%dT%H:%M:%S") | eval Horodate=strftime(epochHorodate, "%Y-%m-%dT%H:%M:%S") | where epochnow&gt;=epochHorodate | eval _time=Horodate | stats count(eval(Statut=="KO")) as KO by _time | sort _time ASC | addcoltotals</query>
    </search>
    <option name="colorBy">value</option>
    <option name="colorMode">none</option>
    <option name="drilldown">all</option>
    <option name="rangeColors">["0x53a051","0xf8be34","0xdc4e41"]</option>
    <option name="rangeValues">[0,50]</option>
    <option name="refresh.display">progressbar</option>
    <option name="trellis.enabled">0</option>
    <option name="unit">KO</option>
    <option name="useColors">1</option>
    <option name="useThousandSeparators">1</option>
    <drilldown>
      <link target="_blank">/app/myapp/myapp__details</link>
    </drilldown>
  </single>
</panel>

The first drilldown works perfectly, but the second, which is exactly the same, doesn't work at all. Can you please help me? Thanks.
Hey, is there any way to use the REST API to get the HTML of an existing dashboard? I know I can get the XML of a dashboard, but I need the HTML in order to render it in our application (no iframe).