All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


  ITSI menus send users to a "suite_redirect" page, which also fails to load and shows "Oops" for non-admin users. This usually happens after an ITSI upgrade (observed on 4.9 and later) on a search head cluster.
Since upgrading to ITSI 4.9, the app reverted to the free version "IT Essentials Work" (ITE-W) and most premium features are gone. This is because the ITSI license is now mandatory, or because the license master was not properly upgraded.
I have a requirement where I need to make an API call and write the data to a lookup file that I can use locally. The API call returns data in CSV format. Previously, I used the Add-on Builder to create a Python script that would make the API request and index the data. However, I have a new requirement to skip the index entirely and write to a local lookup on the search head. The Add-on Builder won't help, as it only shows examples of how to write data to an index. Thank you!
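Since the API already returns CSV, a standalone Python script (run as a scripted input or scheduled alert action on the search head) can fetch the response and drop it into the app's lookups directory. A minimal sketch: the endpoint URL and the target path are placeholders for your environment, and the assumption is that the script has write access to `$SPLUNK_HOME/etc/apps/<your_app>/lookups/`.

```python
import csv
import io


def write_lookup(csv_text, lookup_path):
    """Parse CSV text and rewrite it as a Splunk lookup file.

    Round-tripping through the csv module normalizes quoting and
    line endings so the lookup loads cleanly in Splunk.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    with open(lookup_path, "w", newline="") as f:
        csv.writer(f).writerows(rows)
    return len(rows)


if __name__ == "__main__":
    # In the real script, fetch the CSV from your API first, e.g.:
    #   import urllib.request
    #   csv_text = urllib.request.urlopen("https://api.example.com/report.csv").read().decode()
    # and point lookup_path at your app's lookups directory, typically
    # $SPLUNK_HOME/etc/apps/<your_app>/lookups/my_lookup.csv
    sample = "host,status\nweb01,up\nweb02,down\n"
    print(write_lookup(sample, "my_lookup.csv"))  # number of rows written, including header
```

Once the file exists and the lookup is defined (Settings > Lookups), it is immediately usable with `| inputlookup my_lookup.csv`.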
I have a search I can compose using multiple appends and subsearches, but I assume there's an easier way I'm just not seeing, and I'm hoping someone can help (maybe using | chart?). Essentially, I have a set of user login data: username and login_event (successful, failed, account locked, etc.). I'd like to display a chart showing total events (by login_event) and a distinct count by username, which might look like this:

login_event                          count
successful                           1600
failed                               200
account locked                       10
successful (distinct usernames)      1200
failed (distinct usernames)          50
account locked (distinct usernames)  9
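The stacked layout above can usually be collapsed into one stats pass that computes both aggregations at once, with no appends or subsearches. A sketch (the base search and field names are assumed from the description):

```
index=auth_logs
| stats count, dc(username) AS distinct_usernames BY login_event
```

This yields one row per login_event with both the total and the distinct-user count as columns, which also charts more naturally than interleaved rows.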
Hello Team, I have deployed an Istio-based application on Kubernetes, and I want to monitor it in Splunk APM. The application has sidecars injected and is accessible from a browser. I am using the Bookinfo demo application available on Istio: https://istio.io/latest/docs/setup/getting-started/ Can you please guide me on how to configure the OTel agent so it reports Istio app traces to Splunk APM? Is the Istio Mixer adapter required (it shows as deprecated in the documentation)?
Hi, we are using the Splunk Website Monitoring app in Splunk Enterprise, and we want to know if there is any option available to schedule a maintenance window during changes on the websites, to avoid alerts being generated at that time.
I'm looking to measure heavy sources and track how much is getting indexed per day by source. The main problem is that our Splunk admin team cannot give us access to the _internal index, so I cannot run the standard _internal metrics searches such as:

index=_internal sourcetype=splunkd source=*metrics.log* group=per_source_thruput

I'm curious how accurate measuring actual log sizes with Splunk commands might be compared to _internal index stats. We don't need 100% accurate results, just a ballpark estimate, such as one source indexing 500-600 GB per day, or 1-1.5 TB a day. I'm thinking of trying something like:

index=aws-index sourcetype=someSource source="/some/source/file.log"
| eval raw_len = len(_raw)
| eval raw_len_mb = raw_len/1024/1024
| eval raw_len_gb = raw_len/1024/1024/1024
| eval raw_len_tb = raw_len/1024/1024/1024/1024
| stats sum(raw_len_mb) AS MB sum(raw_len_gb) AS GB sum(raw_len_tb) AS TB by source
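For a ballpark, this approach is reasonable: len(_raw) counts the same raw bytes that license metering measures, so daily sums should land in the right range. A per-day variant of the same idea (index and sourcetype names as in the example above):

```
index=aws-index sourcetype=someSource
| eval raw_gb = len(_raw) / 1024 / 1024 / 1024
| timechart span=1d sum(raw_gb) AS GB BY source
```

If your role allows it, `| dbinspect index=aws-index` also reports bucket sizes on disk without touching _internal, though it cannot break the numbers down by source.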
Hello! I have a dataset that I'd like to add a new field to, where I can arbitrarily define the values with manual input, without downloading and re-uploading the data. I've tried editing the table, but it seems I can only enter a calculated value, some concatenation of fields and values, or the same value for every record. Any help is appreciated, thanks!

Example, original dataset:

OG Field 1 | OG Field 2 | OG Field 3
UUID       | timestamp  | value
UUID       | timestamp  | value

New dataset:

OG Field 1 | OG Field 2 | OG Field 3 | New Field
UUID       | timestamp  | value      | I can input anything I want here, like a comment on the record
UUID       | timestamp  | value      | I can input something different here

I don't necessarily need to use tables, so if there's another method of adding new fields to datasets from within Splunk, I'm open to that as well.
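One workaround that avoids re-uploading anything: keep the manual annotations in a separate lookup keyed by UUID and join them in at search time. A sketch (all lookup and field names below are placeholders):

```
| makeresults
| eval UUID="1234-abcd", comment="a note on this record"
| table UUID, comment
| outputlookup append=true record_comments.csv

| inputlookup my_dataset.csv
| lookup record_comments.csv UUID OUTPUT comment AS "New Field"
```

The first search adds a comment row; the second enriches the original dataset with the New Field column without modifying the uploaded file.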
I would really love to use the Campus Compliance Toolkit for NIST 800-171, but I have Splunk Cloud. Splunkbase says version 1.0.2 works, but sadly Splunk support says it doesn't. Is there any chance others like this tool and have made it work? Or is there an alternative NIST reporting app out there (one that doesn't require an annual license fee)? Thank you for your feedback.
Hello, we have an app that passed Cloud Vetting today, but I can't find it in Splunk Cloud under "Browse more apps" in order to install it. This is the app: https://splunkbase.splunk.com/app/6336/ Do you know why? Thanks, Omer
Is there a way to add an index to the underlying Oracle table behind the Unified Audit Trail view? We have performance issues and we thought about making the rising column an index.
What is the location of Splunk commands like inputlookup, lookup, mvexpand, multikv, split, stats, eval, chart, and tstats in the Splunk directory?
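For what it's worth: as far as I know, most of these (stats, eval, chart, tstats, mvexpand, multikv) are implemented inside the splunkd binary and have no standalone files on disk; only scripted (mostly Python) search commands exist as files, under the search app. Assuming a default install:

```
# Python-backed search commands:
ls $SPLUNK_HOME/etc/apps/search/bin/

# commands.conf registers scripted commands; anything not listed
# here is built into the splunkd binary:
cat $SPLUNK_HOME/etc/apps/search/default/commands.conf
```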
How can I put the current date in the WHERE clause? For example, with the query below I want to fetch all IDocs that were created today. I have just hard-coded today's date. What should I use for the today's-date condition?

SELECT CREDAT, DOCNUM, STATUS, MESTYP, TIMESTAMP
FROM idocs_details
WHERE MESTYP = "ZPSWDMGMT"
  AND CREDAT = "20220324"
  AND STATUS = "51"
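If this runs against Oracle (directly or through Splunk DB Connect), one approach is to format SYSDATE into the same YYYYMMDD string that CREDAT stores; on SAP HANA the equivalent would be TO_VARCHAR(CURRENT_DATE, 'YYYYMMDD'), so check which database actually backs the view. Note also that standard SQL uses single quotes for string literals. A sketch:

```sql
SELECT CREDAT, DOCNUM, STATUS, MESTYP, TIMESTAMP
FROM idocs_details
WHERE MESTYP = 'ZPSWDMGMT'
  AND CREDAT = TO_CHAR(SYSDATE, 'YYYYMMDD')
  AND STATUS = '51'
```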
Hello, I was wondering if there is a timeline for when the Status Indicator app will be usable in Dashboard Studio. I want to convert my classic dashboards to the new version, and we heavily use the Status Indicator app. Regards, Testy
Hello, we are trying to add a filter on the Windows event log input. The input conf is:

[WinEventLog://Security]
disabled = 0
index = windows
blacklist1 = 5145,5156
blacklist2 = EventCode=4672 SubjectUserName="exchange\$"
renderXml = true
suppress_text = true
suppress_sourcename = true
suppress_keywords = true
suppress_task = true
suppress_opcode = true

blacklist1 is working fine, but blacklist2 is not. The goal is to filter out event ID 4672 where SubjectUserName equals "exchange$". Any ideas? Thank you
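One thing worth checking: SubjectUserName is not one of the key names the event log blacklist supports (the supported keys are things like EventCode, Message, User), so that clause may simply never match; suppress_text = true can also strip the very content a message match would need. With renderXml = true, inputs.conf.spec describes matching against the whole rendered XML via $XmlRegex. A sketch of that form, to be verified against inputs.conf.spec for your Splunk version:

```
[WinEventLog://Security]
renderXml = true
blacklist1 = 5145,5156
blacklist2 = $XmlRegex="<EventID>4672</EventID>[\s\S]*SubjectUserName'>exchange\$<"
```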
I have log events (each about 260 lines) related to our AWS EMR cluster performance metrics. It seems to be just a collection of output from certain Linux commands. If I want to parse, for example, the output of free -m to generate some table output or a timechart, how would I start parsing these (assuming it's possible)? Extracting new fields using a regular expression didn't seem to work.
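If the free -m section of each event has the usual "Mem: total used free ..." line, a rex extraction followed by timechart is a reasonable starting point. A sketch (index name and field names here are just illustrative, and the pattern assumes the standard free -m column layout):

```
index=emr_metrics
| rex "Mem:\s+(?<mem_total_mb>\d+)\s+(?<mem_used_mb>\d+)\s+(?<mem_free_mb>\d+)"
| timechart span=15m avg(mem_used_mb) AS avg_mem_used_mb
```

The same rex-per-command approach extends to the other Linux command outputs embedded in the event, one extraction per section.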
I installed a new Splunk Enterprise server and configured it as a deployment server, as documented here: https://docs.splunk.com/Documentation/Splunk/8.2.5/Updating/Aboutdeploymentserver Then I added an app in $SPLUNK_HOME/etc/deployment-apps/MyApp. On the 26 forwarders running Ubuntu 20.04, I ran splunk set deploy-poll MyDS:8089, but only 20 of them show up in Forwarder Management. When I remove one of them by clicking "Delete record", I can add a missing one by running splunk set deploy-poll MyDS:8089 again. A license is installed too, and it says 'DeployServer': 'ENABLED'. Any ideas? Thanks...
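One likely culprit when several Ubuntu forwarders were cloned from the same image: each client's GUID lives in $SPLUNK_HOME/etc/instance.cfg, and clones that share a GUID overwrite each other in Forwarder Management, which would make 26 clients appear as 20. As far as I know, deleting instance.cfg on the affected forwarders and restarting makes Splunk generate a fresh GUID. A few checks that may narrow it down:

```
# On a missing forwarder: effective deployment client settings
$SPLUNK_HOME/bin/splunk btool deploymentclient list --debug

# On a missing forwarder: phone-home / handshake errors
grep -i DeploymentClient $SPLUNK_HOME/var/log/splunk/splunkd.log

# On the deployment server: clients that have checked in
$SPLUNK_HOME/bin/splunk list deploy-clients
```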
I'm trying to set up Splunk to find suspicious traffic, both incoming and outgoing. Right now I'm trying to exclude traffic that comes from places that are not suspicious (a whitelist), like social media websites, news websites, internal traffic, etc., and to use blacklists to trigger alerts if something tries to connect to my IP. I'm not sure where to start; is this even possible in Splunk? There are existing blacklists online that use an API key to connect; can I use that API key in Splunk? Or do I download an entire database and upload it from my C: drive? I really hope you can help me here.
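This is a common pattern in Splunk: keep the allow/deny lists as CSV lookups and filter at search time, with a scripted or modular input (using your API key) periodically refreshing the blacklist lookup rather than searching the feed live. A sketch, where the index, field, and lookup names are all placeholders for your environment:

```
index=network_traffic
| search NOT [ | inputlookup whitelist.csv | fields dest_domain ]
| lookup blacklist.csv dest_ip OUTPUT threat_category
| where isnotnull(threat_category)
```

Saved as an alert, this fires only on traffic that is both off the whitelist and on the blacklist.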
Hi. How do I set up a Health Rule from Monday to Friday, from 06:00 to 23:59? I'm trying these expressions: Start: 0 6 * * 1-5, 0 6 * * MON-FRY, and 0 0 6 ? * MON-FRI; End: 59 23 * * 1-5 and 59 23 * * MON-FRY. But this message appears: "Error: 0 6 * * MON-FRI is not a parseable cron expression". Which is the right way to set this up? Thanks in advance for your help.
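Two things stand out in the attempts above: FRI is misspelled as FRY in two of them, and the two cron dialects differ in field count; classic cron takes five fields, while Quartz-style expressions (which some AppDynamics versions use) take six, starting with seconds and using ? for the unused day field. Sketches of both forms for 06:00-23:59, Monday to Friday:

```
# 5-field cron: minute hour day-of-month month day-of-week
0 6 * * 1-5      # start 06:00, Monday-Friday
59 23 * * 1-5    # end 23:59, Monday-Friday

# Quartz: second minute hour day-of-month month day-of-week
0 0 6 ? * MON-FRI
0 59 23 ? * MON-FRI
```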
Hi, I am trying to create a table of the top N categories per region for a number of indexes. However, when I run the query on some indexes, the necessary fields exist in the events (i.e., category, region, NodeName, host, ...), yet no table is produced in the Statistics tab. The statistics are as follows: [screenshot] And here are the respective events with the necessary fields: [screenshot] Why would that be? Thanks, Patrick
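Once the fields are extracting, the "top N categories per region" part itself is usually a one-liner (the limit value and field names here are assumed):

```
index=your_index
| top limit=5 category BY region
```

On the empty table: SPL field names are case-sensitive, so if the events show Region while the search says region (or vice versa), stats/top silently return nothing; that mismatch is worth ruling out first.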