All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Running on an SHC, and the results are transformed results.
If you know the stanza name, you should add it as well. There is also a Splunk app called Admin's Little Helper, which you can use to run btool from the MC or Splunk Cloud. I strongly recommend installing and using it in any distributed environment! https://splunkbase.splunk.com/app/6368
That is my 99% use case for btool: the aggregated list of xxxxx.conf by file with --debug, then filtered with grep.
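For reference, a typical invocation of that workflow might look like the following sketch. It requires a local Splunk installation; `props` and the grep pattern are placeholder examples, not values from this thread:

```shell
# Aggregated view of props.conf across all apps, annotated with the
# file each setting comes from (--debug), then narrowed with grep.
$SPLUNK_HOME/bin/splunk btool props list --debug | grep -i "my_sourcetype"
```

The `--debug` flag is what prints the originating file path in the first column, which is usually the whole point of the exercise.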
Are you running this on a standalone SH or an SHC? Are your results events or transformed results? https://docs.splunk.com/Documentation/Splunk/9.4.2/SearchReference/Loadjob
Yes, it sounds like I will need to proceed with a secondary index then. The key here is that I need to conform to this data model, not just build a search that outputs data in this format. Thanks everybody for all these useful answers!
Have you tried dispatching your query first, checking when it's ready, and only then exporting the results, instead of doing it all in one operation?
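That dispatch-then-poll pattern could be sketched roughly like this against the standard search REST endpoints. This is only a sketch against a live Splunk instance; the host, credentials, index name, and poll interval are all placeholders:

```shell
#!/bin/bash
# Sketch: dispatch a search job, poll until it finishes, then fetch results.
BASE="https://<splunk-uri>:8089"
AUTH="admin:password"

# 1. Dispatch the search asynchronously; capture the job's sid
SID=$(curl -sk -u "$AUTH" "$BASE/services/search/jobs" \
  --data-urlencode "search=search index=my_index" \
  -d output_mode=json | sed -n 's/.*"sid":"\([^"]*\)".*/\1/p')

# 2. Poll the job until it reports isDone=true
while true; do
  DONE=$(curl -sk -u "$AUTH" "$BASE/services/search/jobs/$SID?output_mode=json" \
    | grep -o '"isDone":[a-z]*')
  [ "$DONE" = '"isDone":true' ] && break
  sleep 5
done

# 3. Download the finished result set as CSV (count=0 = all rows)
curl -sk -u "$AUTH" \
  "$BASE/services/search/jobs/$SID/results?output_mode=csv&count=0" \
  -o output-file.csv
```

Because the long-running search completes server-side before any download starts, the HTTP transfer itself stays short, which sidesteps most mid-transfer disconnects.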
If I recall correctly, there isn't any native method to get an app from Enterprise onto your own workstation. You could install e.g. a git client, or an app like Splunk Version Control (https://splunkbase.splunk.com/app/4355), and use it to fetch the app via git.
Hi @schnatt
This isn't something that is available natively in Splunk Enterprise, and there aren't many apps on Splunkbase around this sort of thing. However, there is "Package Apps for Splunk", which allows you to export apps to S3 / Azure. Does this help at all?
Don't get confused by the app export capabilities in Splunk Cloud (https://www.splunk.com/en_us/blog/platform/introducing-splunk-cloud-app-export.html?locale=en_us), which unfortunately are not available in Splunk Enterprise.
Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing.
Hi @Navanitha
There are a number of things here which could be affecting this. Did you notice how long it took before crashing out, e.g. 1 minute, 5 minutes? There is a --max-time param you can pass to curl, so I'm wondering if this could help.
Are you able to find the job in Splunk to see what its status is, how long it took to execute, and how many results it returned? The easiest way to do this is probably via the Job Manager (top right of the Splunk screen, under "Activity").
Is there a proxy/firewall between your machine and Splunk? Sometimes firewalls have a tendency to kill downloads or long-running HTTP calls.
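As a sketch of the --max-time suggestion above: the flag bounds the total duration of the whole transfer, while --connect-timeout bounds only connection setup. The host, credentials, search, and timeout values here are illustrative, not taken from the original script:

```shell
# Allow up to 60s to connect and 30 minutes for the full transfer
# before curl gives up, instead of relying on defaults.
curl --http1.1 --connect-timeout 60 --max-time 1800 -k -u admin:password \
  "https://<splunk uri>:8089/services/search/jobs/export" \
  --data-urlencode "search=search index=my_index" \
  -d output_mode=csv \
  -o output-file.csv
```

Note that --max-time only changes when curl itself aborts; if an intermediate firewall is killing the connection, the fix has to happen there or by switching to the dispatch-and-poll approach.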
I am running a REST API call (basically curl) to query Splunk for results and export them to the server. Below is my API query. My Splunk query is very big and the results are also kind of huge. The query is running fine, but I don't see any results.

#!/bin/bash
search_query=$(cat <<'EOF'
search index=my long splunk query
EOF
)
echo "Running Splunk search..."
curl --http1.1 -k -u admin:password \
  "https://<splunk uri>:8089/services/search/jobs/export" \
  --data-urlencode "search=$search_query" \
  -d output_mode=csv \
  -d earliest_time='-24d@d' \
  -d latest_time='@d' \
  -o output-file.csv
echo "Done. Results in output-file.csv"

This call returns the following, with an empty output-file.csv:

curl: (18) transfer closed with outstanding read data remaining

It looks like it is not able to run such a huge query. I tried the curl command with a simple search query and it works. How can I make this work?
@bishida thanks.  The URL you linked was the one I used and lists smartagent.  Is there a different URL?
As said, don't use a DS as an HF. Those are separate roles and don't work well together in any larger environment. @livehybrid already posted Splunk's instructions on how to do it. When you use IHFs (intermediate heavy forwarders), you should always deploy at least two, to avoid an unneeded service break when you need to restart after config changes. You should also consider using async forwarding to spread events equally over your SCP. As @gcusello said, you shouldn't index anything locally when you have SCP; if you really need to, you must buy a separate Splunk Enterprise license for it. Also, when you have both Linux and Windows UFs, you must use Linux as the DS if you want to manage both platforms. It's technically impossible to manage Linux UFs correctly from a Windows DS.
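A minimal outputs.conf sketch on the UFs for the two-IHF setup described above might look like this. Host names are placeholders, and the load-balancing interval is an illustrative value to tune for your volume:

```
# outputs.conf on the UFs (sketch; hosts and values are examples)
[tcpout]
defaultGroup = intermediate_hfs

[tcpout:intermediate_hfs]
server = ihf1.example.com:9997, ihf2.example.com:9997
# Rotate between the IHFs frequently so events spread evenly
autoLBFrequency = 30
# Indexer acknowledgement to avoid data loss during IHF restarts
useACK = true
```

With two targets in `server`, a restart of one IHF is absorbed by the other, which is the "no unneeded service break" point made above.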
Hi, I want developers (with GUI access only) to be able to export the apps they have built. I can't find any information about that in the documentation. Is there a way to export apps without command-line access to the server? Best regards, M
I was able to successfully get this working with the guidance, thanks.
Hello hcpr, we did run into the same issue shortly after my previous post, and I forgot to give an update here. The app "missioncontrol" exposes a few endpoints to do with incidents and investigations, and tracing the behavior of Enterprise Security when fetching comments led us to its OpenAPI spec, which you can find at missioncontrol/mcopenapi.yaml. The incidents endpoint, when fed a finding/notable ID, will return a list of comments. I recommend everyone take a look at the requests in your browser's developer tools when interacting with finding notes in the Analyst Queue to see how the endpoint works. Ultimately, we went that way and implemented a custom command to perform the same requests at search time. This is working flawlessly for us so far, retrieving even those notes which have no incident_id or source in mc_notes. Hope this helps!
Yes, there isn’t really a single global “disable all dashboard refreshes” switch built into Splunk—whether you’re running on-premises (Splunk Enterprise) or in Splunk Cloud. By design, Splunk dashboards refresh automatically based on how panels are configured (e.g., scheduled searches or real-time searches) because this keeps them updated for users. However, there are ways to reduce or control refreshes.
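At the panel level, this is controlled per search in Simple XML. As a sketch (the query is a placeholder): removing the `<refresh>` element, or setting a long interval, is the usual way to slow or stop auto-refresh for that panel:

```
<search>
  <query>index=_internal | stats count</query>
  <!-- Remove this element to stop auto-refresh, or raise the
       interval to refresh less often -->
  <refresh>5m</refresh>
  <refreshType>delay</refreshType>
</search>
```

`refreshType` of `delay` restarts the countdown after the search completes, which avoids back-to-back dispatches for slow searches.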
The latest 9.4.x has its own issues with platformVersion etc., so please first check whether there is anything else that could affect your environment! Then select a suitable version to update to.
Is the | history command supposed to include details of scheduled searches as well? It's not clearly mentioned in the documentation, so I'm asking for clarification.
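Independently of `| history`, one way to see scheduled-search activity is the audit index. This is a hedged sketch; the field names shown (`search_type`, `savedsearch_name`) are commonly present in _audit events but should be verified in your environment, and you need permission to search `_audit`:

```
index=_audit action=search info=granted search_type=scheduled
| stats count by savedsearch_name, user
```

If `| history` turns out to cover only ad-hoc searches for your user, this gives you the scheduled side of the picture.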
In Slack there have been discussions that, starting with 9.3.x, some user preferences have moved into the KV store. Unfortunately that is not documented. I'm not sure whether it affects this case as well, but I suggest that you create a support case with Splunk.
Hi, following up on the above discussion: has anyone else discovered that there are quite a few instances where the "incident_id" field is blank in the mc_notes lookup? The other fields (author.username, create_time and content) contain the correct information, but there is nothing in incident_id. This makes it a bit difficult to match the note to the corresponding incident.