All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I want to get the installer below: splunk-add-on-for-unix-and-linux_523.tgz. It seems to be the same add-on as the URL below, because the downloaded file name for version 8.3.0/8.3.1 is similar to the name of the add-on installer I want: https://splunkbase.splunk.com/app/833/ I also checked the release history, but I was not able to find a way to get past add-on files. Maybe the install file for 5.2.3 is too old and Splunk no longer supports it. If there is a way to get a previously released install file, please share it with me. Thank you!
Hi All, Splunk Cloud is not receiving the logs from the Windows Universal Forwarder. I see the logs below from splunkd. Can you please help me resolve the issue?

Splunkd logs:
AutoLoadBalancedConnectionStrategy [3780 TcpOutEloop] - Closing stream for idx=10.236.162.24:9997
10-08-2021 09:49:11.748 +0100 INFO AutoLoadBalancedConnectionStrategy [3780 TcpOutEloop] - Connected to idx=10.236.82.132:9997, pset=0, reuse=0.
10-08-2021 09:49:17.976 +0100 INFO TailReader [2576 tailreader0] - Batch input finished reading file='C:\Program Files\SplunkUniversalForwarder\var\spool\splunk\tracker.log'
10-08-2021 09:49:47.803 +0100 INFO TailReader [2576 tailreader0] - Batch input finished reading file='C:\Program Files\SplunkUniversalForwarder\var\spool\splunk\tracker.log'
10-08-2021 09:49:51.617 +0100 INFO AutoLoadBalancedConnectionStrategy [3780 TcpOutEloop] - Closing stream for idx=10.236.82.132:9997
10-08-2021 09:49:51.618 +0100 INFO AutoLoadBalancedConnectionStrategy [3780 TcpOutEloop] - Connected to idx=10.236.162.97:9997, pset=0, reuse=0.
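A couple of checks that are often suggested for this kind of symptom, run on the forwarder host (paths assume a default Windows UF install; adjust if yours differs):

```
REM Verify which receivers the UF thinks it is forwarding to, and their status
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" list forward-server

REM Show the effective outputs.conf, including which file each setting comes from
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" btool outputs list --debug
```

If the forward-servers show as active but nothing arrives, the usual next suspects are time range of the search, index permissions, or parsing-tier filtering.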
I thought I was following good practice, as these were customisations to collections.conf, transforms.conf, and savedsearches.conf in the local directory. But it appears they were just got rid of when I upgraded the app from 5.0.0 to 5.1.0. I'm working to recover the situation and have pinged the developer; the data should be recreated. Was it my fault for adding stanzas to a commercial app, or should I have been protected if I stuck to local copies?
Hi, I'm trying to build a search to find the count, min, max, and avg within the 99th percentile. All work apart from the count; I'm not sure if I am missing something:

index="main" source="C:\\inetpub\\logs\\LogFiles\\*"
| bin span=1d _time
| eval responseTime=time_taken/1000000
| timechart span=1mon p99(responseTime) as 99thPercentile
| stats min(99thPercentile) as p99responseTimemin max(99thPercentile) as p99responseTimemax avg(99thPercentile) as p99responseTimeavg count(99thPercentile) by _time

Thanks,
Joe
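For what it's worth, the count issue is likely because `timechart` leaves only one p99 value per month, so `count(99thPercentile)` can never see more than one value per bucket. A sketch of an alternative that keeps the events around long enough to count them (same responseTime conversion; this assumes "count within the 99th percentile" means events at or below the monthly p99 threshold):

```spl
index="main" source="C:\\inetpub\\logs\\LogFiles\\*"
| eval responseTime=time_taken/1000000
| bin span=1mon _time
| eventstats p99(responseTime) as p99 by _time
| where responseTime<=p99
| stats count min(responseTime) as min max(responseTime) as max avg(responseTime) as avg by _time
```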
Can anyone help me with step-by-step SSL configuration from UF to HF and from HF to indexers? Any help would be appreciated.
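At a high level this is the same pattern at each hop: outputs.conf with a certificate on the sending side, and an SSL splunktcp input on the receiving side. A minimal sketch (cert paths, hostnames, and the password are placeholders; the real setup also needs a CA configured in server.conf and certs issued for each host):

```ini
# outputs.conf on the sender (UF -> HF, and again HF -> indexers)
[tcpout:ssl_group]
server = receiver.example.com:9997
clientCert = $SPLUNK_HOME/etc/auth/mycerts/client.pem
sslPassword = changeme
sslVerifyServerCert = true

# inputs.conf on the receiver (HF, and again on each indexer)
[splunktcp-ssl:9997]
disabled = 0

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/mycerts/server.pem
sslPassword = changeme
requireClientCert = false
```

Restart each instance after the change, then confirm the connection with `splunk list forward-server` on the sender.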
I need to create a table that includes the filename, the domain name the file came from, the source IP, the destination IP, and the date/time stamp. What should the search query be? Also with this: find the executable they uploaded. Once found, detail the following in a single table: What was the filename? When was it uploaded? Was the upload successful? Where did it come from?
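Without knowing the index and sourcetype it is hard to be exact, but the shape of such a search is usually a filter followed by `table`. Every field name below (file_name, url_domain, src_ip, dest_ip, status) is an assumption — substitute whatever your extractions are actually called:

```spl
index=<your_index> sourcetype=<your_sourcetype> *.exe
| table _time file_name url_domain src_ip dest_ip status
| sort _time
```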
Hello, I am looking for a simple SPL search to detect activity from users without MFA in AWS. I have the search below, which suggests I am getting somewhere, but I just need to confirm whether there is more to it:

sourcetype=cloudtrail userIdentity.sessionContext.attributes.mfaAuthenticated=false
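If that base search returns the right events, a summary by user is a common next step. A sketch (CloudTrail field names as emitted by the AWS add-on; note that assumed-role sessions frequently report mfaAuthenticated=false, so you may want to filter userIdentity.type as well):

```spl
sourcetype=cloudtrail userIdentity.sessionContext.attributes.mfaAuthenticated=false
| stats count values(eventName) as actions min(_time) as first_seen max(_time) as last_seen by userIdentity.arn
| convert ctime(first_seen) ctime(last_seen)
```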
Hello, I have 4 Python scripts that parse data we receive once a day on the Linux machine where the HF is installed. Currently, I am running the scripts manually every day on that machine. Is there any way I can write a cron expression to automate them, so the scripts run automatically once a day on that machine? Thank you so much; any help will be highly appreciated.
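A sketch of what the crontab entries could look like — edit with `crontab -e` as the user that should run the scripts. The 06:30 schedule, script paths, and log paths are examples only:

```
# field order: minute hour day-of-month month day-of-week command
30 6 * * * /usr/bin/python3 /opt/parsers/parse_feed1.py >> /var/log/parsers/feed1.log 2>&1
30 6 * * * /usr/bin/python3 /opt/parsers/parse_feed2.py >> /var/log/parsers/feed2.log 2>&1
```

Redirecting stdout/stderr to a log file as above makes it much easier to see why a nightly run failed.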
I want to delete this field (VID) from one of my search query, this is not available under  Field extractions. and what is the difference between (a and #) ?  
Hi, I have recently integrated and migrated AWS Simple Queue Service (SQS) logs to Splunk. I am trying to search for the SQS logs in Splunk and it's not returning anything. Below is the query I am using:

index="application-index" aws_account_id="123456" sourcetype="aws:sqs"

Please correct me if I missed anything here. Thanks.
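Two checks that usually narrow this down: first confirm that anything at all is landing in the index, then confirm the sourcetype and field values match what the query expects (run both over All time first, in case of timestamp issues):

```spl
| tstats count where index="application-index" by sourcetype

index="application-index" | stats count by sourcetype, aws_account_id
```

If the first search shows events under a different sourcetype (for example one set by the input rather than "aws:sqs"), adjust the query to match.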
Hello Splunk community,

Let's say my input to Splunk is three CSV files that use the following schema. Each CSV populates an index: Faults, Incidents, and Status.

For each Faults entry there is one (and just one) Status entry. That Status entry will have parent_id = id of that fault. In the same way there is also a Status entry for each Incident.

When I am querying Splunk or making dashboards I have to retrieve information not only from the Faults or Incidents indexes but also from Status. That makes me use a lot of index-joining queries like this:

index="faults" | join type=outer status_id [search index="status" | rename id as status_id]

I liked this solution at first because the Faults and Incidents indexes look very clean, but I have read that these types of SPL queries are computationally expensive, and I am concerned that this may not scale well in the future. Should I perhaps modify the schema, remove the Status index, and put all that information in Faults and Incidents instead?

Thank you all a lot in advance for your answers.
Fran
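For reference, the usual join-free pattern is to search both indexes at once and let `stats` stitch the rows together on a computed key. A sketch against the schema described above (Status.parent_id pointing at the Faults id):

```spl
(index="faults") OR (index="status")
| eval join_key=if(index="status", parent_id, id)
| stats values(*) as * by join_key
```

This avoids the subsearch limits and cost of `join`, since the correlation happens in a single streaming pass.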
We have multiple environments and we have to check whether Splunk logs are indexed or not. Based on the environment, we have to run a query. With the dashboard below, the query currently runs as (Dev1 is missing from env=trafficui):

index=*_dev sourcetype=WebSphere* OR sourcetype=http* env=trafficui OR env=trafficbatchDev1 NOT source=*/nodeagent/* NOT source=*/dmgr/* NOT source=*native_stdout* NOT source=*native_stderr* | stats min(_time) as FirstTime, max(_time) as lastTime by host index sourcetype source | convert ctime(FirstTime) | convert ctime(lastTime)

But I wanted:

index=*_dev sourcetype=WebSphere* OR sourcetype=http* env=trafficuiDev1 OR env=trafficbatchDev1 NOT source=*/nodeagent/* NOT source=*/dmgr/* NOT source=*native_stdout* NOT source=*native_stderr* | stats min(_time) as FirstTime, max(_time) as lastTime by host index sourcetype source | convert ctime(FirstTime) | convert ctime(lastTime)

Original query:

<form>
  <label>DEV1 - Logs Indexed</label>
  <fieldset submitButton="false" autoRun="true">
    <input type="dropdown" token="env_tok" searchWhenChanged="true">
      <label>Select Environment</label>
      <choice value="Dev1">DEV1</choice>
      <choice value="Dev2">DEV2</choice>
      <default>Dev1</default>
      <initialValue>Dev1</initialValue>
    </input>
    <input type="dropdown" token="app_tok" searchWhenChanged="true">
      <label>Select Application</label>
      <fieldForLabel>env</fieldForLabel>
      <fieldForValue>env</fieldForValue>
      <choice value="trafficui OR env=trafficbatch">Traffic</choice>
      <choice value="roadsui OR env=roadsbatch">Roads</choice>
      <change>
        <condition value="Roads">
          <set token="new_search">index=*_dev sourcetype=WebSphere* OR sourcetype=http* env=$app_tok$$env_tok$ NOT source=*/nodeagent/* NOT source=*/dmgr/* NOT source=*native_stdout* NOT source=*native_stderr* | stats min(_time) as FirstTime, max(_time) as lastTime by host index sourcetype source | convert ctime(FirstTime) | convert ctime(lastTime)</set>
        </condition>
      </change>
      <default>Roads</default>
      <initialValue>Roads</initialValue>
    </input>
    <input type="time" searchWhenChanged="true">
      <label>Select Date</label>
      <default>
        <earliest>-1d@d</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>$new_search$</query>
        </search>
        <option name="showPager">true</option>
        <option name="count">50</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="refresh.display">progressbar</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
      </table>
    </panel>
  </row>
</form>
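One pattern that sidesteps building the whole query in a token is to set only the env clause, with one <condition> per application choice. A sketch (caveat: tokens inside <set> are resolved when this dropdown changes, so the environment should be selected before, or re-selected after, changing the application):

```xml
<input type="dropdown" token="app_tok" searchWhenChanged="true">
  <label>Select Application</label>
  <choice value="traffic">Traffic</choice>
  <choice value="roads">Roads</choice>
  <change>
    <condition value="traffic">
      <set token="env_clause">env=trafficui$env_tok$ OR env=trafficbatch$env_tok$</set>
    </condition>
    <condition value="roads">
      <set token="env_clause">env=roadsui$env_tok$ OR env=roadsbatch$env_tok$</set>
    </condition>
  </change>
</input>
```

The panel search then references `$env_clause$` in place of the hand-built env filter, and the rest of the query can live in the <query> element instead of a token.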
Hi Splunkers,

We have a plan to upgrade the Splunk version to 8.1 in the PROD environment. Before that, we upgraded the version in the test environment. In our organisation we use one customised app for sending alerts from Splunk to CA UIM, and this app was built in Splunk Add-on Builder. In PROD we have the source code of the old application, and we copied that same file into test after upgrading the version. Now we have to recompile this application so that it is compatible with Python 3 and the new Splunk version. We want to know the flow of recompilation. We tried to export the file but couldn't. Does anybody know the location where that file is stored? Kindly let us know.

Thanks & Regards,
Abhijeet Bandre.
Hi, I have a log file like this. I need to extract "id" from lines where A=20, match those lines to lines where B=10, and finally show them in a single table.

1. Where A=20, extract the id(s) from these lines:
07:59:42.213 app module: Z[200]id[12]A[20]
07:59:42.213 app module: Y[300]id[88]A[20]

2. If an id extracted in the previous step also appears on a line with B=10, join them and make a table. The shared field between these lines is "id":
07:58:21.533 app module: Q[230]id[12]B[10]
07:58:21.533 app module: V[230]id[88]B[10]

Expected result:
id   A    B
12   20   10
88   20   10

Any ideas? Thanks,
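One join-free way to sketch this: extract the three fields with `rex`, keep only the relevant lines, and let `stats` bring the A and B rows together on id (index/sourcetype are placeholders):

```spl
index=<your_index> "app module"
| rex "id\[(?<id>\d+)\]"
| rex "A\[(?<A>\d+)\]"
| rex "B\[(?<B>\d+)\]"
| search A=20 OR B=10
| stats values(A) as A, values(B) as B by id
| search A=20 B=10
```

The final `search` keeps only ids that appeared in both kinds of lines, which matches the expected table above.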
I'm a Splunk add-on/app developer, and the Splunk app I developed has passed AppInspect and runs well on Splunk Enterprise, but how can I get the Splunk Cloud certification flag? The flag is like the one in this picture: Can only customers send a Splunk Cloud app request? Really? I think it's too inconvenient; as a developer, I can't submit my own review request...
Hello, I have a Universal Forwarder and a Heavy Forwarder on a Linux machine; how would I stop and restart them? Any help will be highly appreciated. Thank you so much, I appreciate your support in these efforts.
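The standard splunk CLI works for both; the only difference is the install path. The paths below are the defaults (adjust if yours differ), and the commands should be run as the user that owns each installation:

```shell
# Universal Forwarder
/opt/splunkforwarder/bin/splunk stop
/opt/splunkforwarder/bin/splunk start
/opt/splunkforwarder/bin/splunk restart   # stop + start in one step

# Heavy Forwarder (a full Splunk Enterprise install)
/opt/splunk/bin/splunk restart
```

If boot-start was enabled via systemd, `systemctl restart` on the corresponding service unit is the alternative.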
Is there an SPL search to list all my hosts (Windows & Linux), the version of their UF, and the date, time & TZ, please? Thanks a million.
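One commonly used approach reads the forwarder connection metrics in _internal (run it somewhere the forwarders' _internal data is searchable). A sketch — note these metrics report host, UF version, OS, and architecture, but not the forwarder's time zone:

```spl
index=_internal sourcetype=splunkd group=tcpin_connections fwdType=uf
| stats latest(version) as uf_version latest(os) as os latest(arch) as arch max(_time) as last_seen by hostname
| convert ctime(last_seen)
```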
Has anyone run into an issue where the Microsoft 365 App for Splunk is causing a search head to crash? I'm wondering if part of the issue is the use of the Sankey visualizations on some of the dashboards, given the volume of data they are trying to display. The search heads that are crashing are moderately beefy. https://splunkbase.splunk.com/app/3786/ Does the new Answers backend support tagging the app? I forget how to do that.
We have Splunk Enterprise + ES. I have a dashboard that I'd like to install in Security Essentials. What level of permission does a user need to install this in Security Essentials: ES admin or Enterprise admin? Thanks a million for your reply.
Coming from an older version of Splunk. We have, basically, HTML links that when selected open a new tab with a pre-saved search. We have about 30 searches on a single page, all unique, and all open a new tab to display. How is this done with the newest version of Splunk? The only thing I can find is a panel that launches a saved search in a new tab, but it also shows on the dashboard, which is what I don't want, because I need to have 29 other items that can be selected.
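In Simple XML, an <html> panel can still hold plain links that open in a new tab, and one panel can carry all 30 without running any searches itself. A sketch — the app name ("search") and saved-search names are placeholders, and names with spaces need URL-encoding:

```xml
<row>
  <panel>
    <html>
      <h3>Saved searches</h3>
      <a href="/app/search/search?s=Saved%20Search%20One" target="_blank">Saved Search One</a><br/>
      <a href="/app/search/search?s=Saved%20Search%20Two" target="_blank">Saved Search Two</a><br/>
    </html>
  </panel>
</row>
```

The `target="_blank"` attribute is what forces the new tab; nothing renders on the dashboard except the link list.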