All Posts



If those hosts don't contain any other data, then just stop the UF and remove the …/var/lib/splunk/fishbucket directory (check the exact path on your node). Start the UF service and it will start indexing everything from scratch. Then do the same for the other nodes. If those nodes also contain other events that you cannot reindex, then you must remove the fishbucket entries file by file and start reindexing only those files. Here is an old post about it: https://community.splunk.com/t5/Deployment-Architecture/Use-btprobe-reset-to-re-index-multiple-files/td-p/313186
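For the file-by-file case, a rough sketch of the btprobe approach from that post (the monitored file path here is just an illustration, substitute your own; always stop the UF before touching the fishbucket):

```shell
# Stop the UF first; btprobe must not run against a live fishbucket.
$SPLUNK_HOME/bin/splunk stop

# Reset the fishbucket entry for one monitored file so only that file is reindexed.
$SPLUNK_HOME/bin/splunk cmd btprobe \
    -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db \
    --file /var/log/app/example.log --reset

$SPLUNK_HOME/bin/splunk start
```

Repeat the btprobe line for each file you want reindexed before starting the UF again.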
Hi @Uma.Boppana, Thanks for letting me know. I think the next best step is to contact AppD Support. Please follow the link above to an article that will walk you through it if you're not familiar with it.
This article has instructions for embedding Dashboard Studio JSON in an XML dashboard file.
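For reference, the general shape of that embedding is a version="2" XML shell whose definition element carries the Studio JSON in a CDATA block; a minimal sketch (the label and JSON body are placeholder values):

```xml
<dashboard version="2" theme="light">
  <label>My Studio Dashboard</label>
  <definition><![CDATA[
{
  "visualizations": {},
  "dataSources": {},
  "layout": {
    "type": "grid",
    "structure": []
  }
}
  ]]></definition>
</dashboard>
```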
Hello, I have a Palo Alto Firewall in my environment and would like to set it up to forward logs to a Splunk indexer which is also the syslog server. The environment is small and we are not allowed to log in to anything to download software, so using the App or Add-on isn't possible. Is there a way to directly send my Palo logs to the Splunk indexer?
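One common approach, assuming a network input is acceptable in your environment, is to open a syslog port directly on the indexer in inputs.conf and point the firewall's syslog target at it. A hedged sketch (the port, index, and sourcetype names are assumptions; without the Palo Alto add-on you won't get its field extractions, so a generic sourcetype may be more honest):

```
# inputs.conf on the indexer, e.g. $SPLUNK_HOME/etc/system/local/inputs.conf
[udp://514]
sourcetype = syslog
index = pan_logs
no_appending_timestamp = true
connection_host = ip
```

Note that receiving syslog directly on an indexer is generally discouraged for larger environments (restarts drop UDP traffic), but for a small setup like yours it can work.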
Ok, got it. After doing all this configuration, when I search my index with its source, I will get my results in the logs themselves, right?
Not sure of the best way to go about this. We had an index that originally had a 30-day retention, which they wanted extended to 1 year after it had been running for a while. It was also originally set up to collect new data going forward, but they now also want all the historical data pulled into Splunk, as this is replacing a different tool. How do I restore the data that has already aged out of retention, and collect the old data that was originally outside the window of what we wanted pulled in? I've already adjusted the retention period and removed the ignoreolder=7d from the config. Am I just better off rebuilding the whole thing from scratch?

[monitor://<Path>]
index=<Appname>
sourcetype=chr
crcSalt = <SOURCE>

[<Appname>]
homePath = volume:hot/$_index_name
coldPath = volume:cold/$_index_name
summaryHomePath = volume:summaries/$_index_name
thawedPath = /opt/splunk/data/thawed/$_index_name
enableTsidxReduction = false
maxDataSize = auto_high_volume
frozenTimePeriodInSecs = 31536000
Great, this also works and is actually simpler than the spath solution, thanks!
To keep this knowledge up to date: to register, you can use http://splk.it/slack
We had an on-premise Splunk instance that was storing our colddb data on Isilon storage. We have moved to the cloud and want to move some of our indexed data in the colddb on the Isilon to an AWS instance we have set up in the cloud. Does anyone have any suggestions on how to go about moving the data to AWS? We would just like to have the rawdata and not all the metadata from the apps.
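Since you only want the raw events and not the full buckets, one hedged option is Splunk's exporttool, which can dump a bucket's rawdata to a flat file that you can copy and reingest on the AWS side; a sketch (the bucket name and all paths are illustrative):

```shell
# On the on-prem instance: export one cold bucket's raw events to CSV.
$SPLUNK_HOME/bin/splunk cmd exporttool \
    /path/to/colddb/db_1700000000_1690000000_42 \
    /tmp/bucket_42.csv -csv

# Copy the export to the AWS instance, then reingest it there,
# e.g. with "splunk add oneshot" or a monitored directory.
scp /tmp/bucket_42.csv user@aws-host:/data/imports/
```

The trade-off is that the events get reindexed (counted against license, new index-time processing) rather than simply re-attached as buckets.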
Is there a PowerShell command to find out whether Splunk is indeed forwarding logs to the Splunk console? I can check if the agent is installed and running, but how about forwarding? Which log should I check?
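There isn't a single built-in cmdlet for this, but a hedged approach is to ask the forwarder itself and to inspect its metrics log; the paths below assume a default Windows Universal Forwarder install:

```powershell
# Ask the UF which outputs it is actually connected to
# (prompts for Splunk admin credentials; shows active vs. inactive forwards).
& "C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" list forward-server

# Recent forwarding activity from metrics.log: tcpout group entries
# indicate data being sent to the configured receivers.
Get-Content "C:\Program Files\SplunkUniversalForwarder\var\log\splunk\metrics.log" -Tail 200 |
    Select-String "group=tcpout"
```

If the forward-server list shows your receiver under "Active forwards" and metrics.log shows non-zero tcpout throughput, the UF is forwarding.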
Hey all, I'm looking to trial Splunk Cloud, and more specifically the Data Manager feature. I've successfully logged in to my trial instance (Version: 9.3.2408.107, Experience: Classic), but "Data Manager" isn't present as an app in the list. Is this not something I can use with a trial version?
https://docs.splunk.com/Documentation/Splunk/9.4.0/Admin/Inputsconf

interval = [<decimal>|<cron schedule>]
* How often, in seconds, to run the specified command, or a valid "cron" schedule.
* If you specify the interval as a number, it may have a fractional component; for example, 3.14
* To specify a cron schedule, use the following format: "<minute> <hour> <day of month> <month> <day of week>"
* Cron special characters are acceptable. You can use combinations of "*", ",", "/", and "-" to specify wildcards, separate values, specify ranges of values, and step values.
* The cron implementation for data inputs does not currently support names of months or days.
* The special value "0" forces this scripted input to be run continuously. As soon as the script exits, the input restarts it.
* The special value "-1" causes the scripted input to run once on start-up.
* NOTE: when you specify a cron schedule, the input does not run the script on start-up.
* Default: 60.0
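For example, under those rules, both of the following stanzas would be valid (the script paths are illustrative):

```
# Numeric interval: run every 300 seconds.
[script://./bin/poll_status.sh]
interval = 300

# Cron schedule: 02:30 every day.
# Note: with a cron schedule, the script does NOT run at start-up.
[script://./bin/nightly_report.sh]
interval = 30 2 * * *
```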
$row.<column-field-name>.value$ Since the token references the column field name, that is as specific as it gets. I understand that you want to have multiple columns, each with a unique URL reference. All the documentation at this time states very specifically that you are limited to only one URL per row.
Please tell me why the answer wasn't visible. What is the difference between the Splunk SOAR licenses?
Can you tell me which of these formats you are ingesting? https://learn.microsoft.com/en-us/previous-versions/iis/6.0-sdk/ms525807(v=vs.90)

#Software: Internet Information Services 6.0
#Version: 1.0
#Date: 2001-05-02 17:42:15
#Fields: time c-ip cs-method cs-uri-stem sc-status cs-version
17:42:15 172.16.255.255 GET /default.htm 200 HTTP/1.0

OR

192.168.114.201, -, 03/20/01, 7:55:20, W3SVC2, SALES1, 172.21.13.45, 4502, 163, 3223, 200, 0, GET, /DeptLogo.gif, -,
172.16.255.255, anonymous, 03/20/01, 23:58:11, MSFTPSVC, SALES1, 172.16.255.255, 60, 275, 0, 0, 0, PASS, /Intro.htm, -,

Once you confirm which format it is, someone should be able to provide a recommended props.conf for the ingested sourcetype. Of course, you could opt for the app from Splunkbase, which looks to be very complete for IIS server logs: https://splunkbase.splunk.com/app/3185
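If it turns out to be the W3C extended format (the first sample, with the #Fields header), a hedged starting point for props.conf is Splunk's built-in W3C structured-data support, which reads the field names from the header lines; the sourcetype name here is just an assumption:

```
# props.conf on the instance doing the parsing (sourcetype name is illustrative)
[iis_w3c]
INDEXED_EXTRACTIONS = w3c
SHOULD_LINEMERGE = false
```

The comma-separated second sample (IIS native log format) would need a different treatment, e.g. delimiter-based extractions, so confirming the format first really is the key step.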
Hi @Nawab  Have you tried integrating using the add-on and its documentation? This documentation helps with setting up the IIS logs: https://docs.splunk.com/Documentation/AddOns/released/MSIIS/About  https://splunkbase.splunk.com/app/3185
@PickleRick  Error - INFO: Cannot use span, bins on a remote storage index unless cached=f is specified (index "_internal").
And how are we supposed to know what is the cause of this error when we don't know what error it is?
@PickleRick  I need to get the result like the one shown in the image, but when I run it with span=1d it shows an error.
Hi @Real_captain, ok, try something like this:

<your_search>
| stats values("Start Time") AS "Start Time" values("End Time") AS "End Time" values(Status) AS Status BY Job
| transpose column_name=Job header_field=Job

Please check the syntax of the transpose command because I cannot test it.

Ciao.
Giuseppe