All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I came across a bug in this app: https://splunkbase.splunk.com/app/6553/ and thought I'd share. The log types "logs" and "users" work fine, but when the app is configured to get "enrichment data" for apps and groups, it fails if you need to use a proxy. After a bit of troubleshooting I found that on line 243 of okta_utils.py there is no proxy in the request call. I updated it to the following and it works:

Before: r = requests.request("GET", url, headers=headers)
After: r = requests.request("GET", url, headers=headers, proxies=proxies, timeout=reqTimeout)

I also had to add these lines to grab those settings, just before the if statement:

# Get proxy settings
proxies = get_proxy_settings(self.session_key, self.logger)
# Set request timeout to 90 sec
reqTimeout = float(90)
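For anyone applying the same patch, the shape of the fix can be sketched like this (build_request_kwargs is a hypothetical helper; get_proxy_settings, self.session_key, and the exact code around line 243 belong to the app and are not reproduced here):

```python
# Sketch of how proxy settings and a timeout end up on the request call.
# build_request_kwargs is a hypothetical helper, not part of the app.

def build_request_kwargs(headers, proxies, req_timeout=90.0):
    """Bundle keyword arguments for requests.request so the proxy
    settings and timeout are always passed along with the headers."""
    kwargs = {"headers": headers, "timeout": float(req_timeout)}
    if proxies:  # only attach proxies when they are configured
        kwargs["proxies"] = proxies
    return kwargs

# The patched line then becomes roughly:
# r = requests.request("GET", url, **build_request_kwargs(headers, proxies, reqTimeout))
```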
Hello, I have data like below. I need to frame a query that calculates the number of desyncs for each rate-parity-group.

For example:

"rate-parity-group":{"CN":{"avail":11,"price":11}}}
"rate-parity-group":{"CK":{"avail":18,"price":0},"CL":{"avail":36,"price":0},"CM":{"avail":18,"price":0}}},
"rate-parity-group":{"CL":{"avail":18,"price":0},"CM":{"avail":36,"price":0}}}

Expected outcome:

rate-parity-group    total-desync
CL                   54 (36+18)
CM                   54
CK                   18

Since the rate-parity-group keys (CK, CM, CL) are dynamic, I am facing a problem. Could someone help me get the desync count per rate-parity-group? Sample data is attached in the screenshot.

Thanks in advance.
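In SPL this would normally be done with spath plus stats, but the grouping logic itself is easy to sketch. A minimal Python illustration (assuming, as in the expected outcome, that "avail" is the per-group desync count, and with the JSON fragments completed into valid objects):

```python
import json
from collections import defaultdict

def total_desync(events):
    """Sum 'avail' per rate-parity-group key across all events,
    regardless of which group keys each event happens to contain."""
    totals = defaultdict(int)
    for raw in events:
        for name, stats in json.loads(raw)["rate-parity-group"].items():
            totals[name] += stats["avail"]
    return dict(totals)

events = [
    '{"rate-parity-group":{"CN":{"avail":11,"price":11}}}',
    '{"rate-parity-group":{"CK":{"avail":18,"price":0},"CL":{"avail":36,"price":0},"CM":{"avail":18,"price":0}}}',
    '{"rate-parity-group":{"CL":{"avail":18,"price":0},"CM":{"avail":36,"price":0}}}',
]
```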
<input type="multiselect" token="product_token" searchWhenChanged="true">
  <label>Product types</label>
  <choice value="*">All</choice>
  <default>*</default>
  <prefix>(</prefix>
  <suffix>)</suffix>
  <initialValue>*</initialValue>
  <valuePrefix>DB_Product="*</valuePrefix>
  <valueSuffix>*"</valueSuffix>
  <delimiter> OR </delimiter>
  <fieldForLabel>DB_Product</fieldForLabel>
  <fieldForValue>DB_Product</fieldForValue>
  <search base="base_search_Products">
    <query>|dedup DB_Product | table DB_Product</query>
  </search>
</input>

This is my multiselect input, through which the user selects product types, for example All / A, B, C, D, etc. I need to count how many product types the user has selected; I need this information for further processing.
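Because the input joins its selections with the " OR " delimiter, one common approach is to split the rendered token on that delimiter and count the parts (in SPL, split plus mvcount). The counting logic, sketched in Python with the rendered token format assumed from the prefix/suffix settings above:

```python
def count_selected(token_value):
    """Count how many choices a rendered multiselect token holds,
    relying on the ' OR ' delimiter configured on the input.
    Example rendered value: '(DB_Product="*A*" OR DB_Product="*B*")'."""
    inner = token_value.strip().removeprefix("(").removesuffix(")")
    if not inner:
        return 0
    return len(inner.split(" OR "))
```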
Is there a way to retrieve what time range a search uses? I have tried this endpoint:

curl -k -u admin:pass https://localhost:8089/services/saved/searches/search_name/history

but I guess it does not return the time range. Thank you.
We have configured DB Connect to pull data from a MySQL database into an index at an hourly frequency. Data is being pulled; however, the count of Splunk events is much higher than the count of rows in the respective table. This is because the SQL table is real-time in nature and its entries are constantly updated, whereas Splunk keeps storing the entries as of each hourly execution. As a result, Splunk also retains historical events that are no longer present in the SQL table. We need to counter this situation because we plan to build analytics reports on this data, so it has to be accurate and up to date in Splunk as well.
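A common way to handle this at report time is to keep only the latest event per primary key, which is what "| dedup <key> sortby -_time" does in SPL. The idea, sketched in Python (field names are hypothetical):

```python
def latest_per_key(rows, key="id", ts="_time"):
    """Keep only the most recent row per primary key, so historical
    copies of an updated row are ignored at report time."""
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[ts] > latest[k][ts]:
            latest[k] = row
    return sorted(latest.values(), key=lambda r: r[key])

rows = [
    {"id": 1, "_time": 100, "status": "stale"},
    {"id": 1, "_time": 200, "status": "current"},
    {"id": 2, "_time": 150, "status": "current"},
]
```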
How can we configure a custom domain and an SSL certificate purchased from GoDaddy in Splunk? I need to securely access Splunk Enterprise from outside my network using my purchased domain. Please help!
Hi, I would like to know how to run searches using different time ranges from a dropdown. For example, an input in the dropdown would be labelled "Yesterday", and I would like to assign two different time ranges to the same label, so that I can run two different searches using the separate time ranges just by selecting one input from the dropdown. I have tried defining 4 tokens under the same label, but it doesn't work, i.e.

<choice value="yesterday">Yesterday</choice>
<condition label="Yesterday">
  <set token="custom_earliest">-8d@d+7h</set>
  <set token="custom_latest">@d+7h</set>
  <set token="breakdown_earliest">-1d@d+7h</set>
  <set token="breakdown_latest">@d+7h</set>
</condition>

Thanks
I am looking for details on whether it is possible to customize Splunk logs, for example to mask the data, redact a field, or display only the required fields in the logs.
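Splunk itself usually does this kind of masking with SEDCMD or TRANSFORMS settings in props.conf at index time; the kind of substitution involved can be sketched in Python (the patterns here are illustrative only):

```python
import re

def mask_event(event):
    """Redact sensitive values from a raw event string.
    Equivalent masking in Splunk is normally configured with
    SEDCMD-style substitutions rather than custom code."""
    event = re.sub(r"\b\d{16}\b", "################", event)   # card-like numbers
    event = re.sub(r"(password=)\S+", r"\1********", event)    # key=value secrets
    return event
```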
Hello, I have a chart with dynamic field names displayed as a table, and I would like to change the order of the columns:

Name     Season 1    Season 2    Season 3
Name1    10000       11111       22222
Name2    9999        9997        9998
Name3    7777        5555        6666

How can I change the order of the columns? The number of seasons is flexible, and the table should always start with the latest one: Name, Season 3, Season 2, Season 1.
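In SPL the reordering is typically done by computing the desired field order and then applying it with table or fields; the order itself (Name first, newest season first) can be computed as sketched below, with the header format assumed from the example table:

```python
import re

def order_columns(columns):
    """Return 'Name' first, then the Season columns newest-first,
    however many seasons the chart happens to produce."""
    seasons = [c for c in columns if c != "Name"]
    seasons.sort(key=lambda c: int(re.search(r"\d+", c).group()), reverse=True)
    return ["Name"] + seasons
```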
Hi. I work with ServiceNow, a ticketing platform. I want to get only the currently "new" incidents and display them in a dashboard, but when I add "| search status=New" I get results that have already turned "resolved". Is there a way I can display only the incidents that are currently new?
Hello all, I am trying to filter out those noisy 4662 logs that are eating our license, as recommended in Splunk blogs and forums. I tried the stanza below to blacklist every 4662 event except GPO-related ones, but it is not working as expected. Any help with the regex part would be appreciated.

blacklist1 = EventCode="4662" Message="Object Type:(?!\s*groupPolicyContainer)"

The raw message is below:

Message=An operation was performed on an object.
Subject:
  Security ID: $
  Account Name: $
  Account Domain:
  Logon ID: 0x7F897031
Object:
  Object Server: DS
  Object Type: groupPolicyContainer
  Object Name: CN={123456-D64E-4013-ACC5-F78A}CN=Policies,CN=System,DC=xyz,DC=xyyz,DC=com
  Handle ID: 0x0
Operation:
  Operation Type: Object Access
  Accesses: Read Property
  Access Mask: 0x10
  Properties: --- Public Information distinguishedName groupPolicyContainer
Additional Information:
  Parameter 1: -
  Parameter 2:

Can we filter directly on Object_Type instead of the Message field, like:

blacklist1 = EventCode="4662" Object_Type="(x|y)"

Any help would be great! Thanks.
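The lookahead itself can be sanity-checked outside Splunk before touching the inputs.conf stanza. A quick Python check of the regex from the post (note the real 4662 message is multi-line, which can change how the pattern behaves in the forwarder; the messages below are trimmed single-line stand-ins):

```python
import re

# Regex from the blacklist stanza: match (i.e. blacklist) any message
# whose Object Type is NOT groupPolicyContainer.
pattern = re.compile(r"Object Type:(?!\s*groupPolicyContainer)")

gpo_msg = "Object Server: DS Object Type: groupPolicyContainer Object Name: CN=..."
usr_msg = "Object Server: DS Object Type: user Object Name: CN=..."
```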
I'm trying to collapse a set of data into earliest/latest pairs by _time, where the time is contiguous. Such as:

2022-08-27 07:36:00
2022-08-27 07:37:00
2022-08-27 07:38:00
2022-08-27 07:39:00
2022-08-27 07:40:00
2022-08-27 07:44:00
2022-08-27 07:45:00
2022-08-27 07:46:00
2022-08-27 08:31:00
2022-08-27 08:32:00
2022-08-27 08:33:00
2022-08-27 08:34:00
2022-08-27 08:35:00

earliest:              latest:
2022-08-27 07:36:00    2022-08-27 07:40:00
2022-08-27 07:44:00    2022-08-27 07:46:00
2022-08-27 08:31:00    2022-08-27 08:35:00

Thoughts?
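In SPL this is usually done by flagging gaps with streamstats and then taking earliest/latest per group with stats. The grouping logic itself, sketched in Python under the assumption of one-minute granularity:

```python
from datetime import datetime, timedelta

def collapse_runs(timestamps, step=timedelta(minutes=1)):
    """Collapse sorted datetimes into (earliest, latest) pairs,
    starting a new pair whenever the gap exceeds one step."""
    runs = []
    for t in timestamps:
        if runs and t - runs[-1][1] <= step:
            runs[-1][1] = t        # still contiguous: extend current run
        else:
            runs.append([t, t])    # gap found: start a new run
    return [tuple(r) for r in runs]

ts = [datetime(2022, 8, 27, 7, m) for m in (36, 37, 38, 39, 40, 44, 45, 46)]
ts += [datetime(2022, 8, 27, 8, m) for m in (31, 32, 33, 34, 35)]
```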
Hi, how can I combine two fields (2.1 and 2.2) into one field (Main calculation)? I have a table and would like to convert it into something like this: START_TIME is the value of the 2.1 field, FINISH_TIME is the value of the 2.2 field, and Completion_time is the sum of both fields (0.35 + 60.53).

SPL query:

| eval finish_time_epoch = strftime(strptime(FINISH_TIME, "%Y-%m-%d %H:%M:%S"),"%Y-%m-%d %H:%M:%S")
| eval start_time_epoch = strftime(strptime(START_TIME, "%Y-%m-%d %H:%M:%S"),"%Y-%m-%d %H:%M:%S")
| eval duration_s = strptime(FINISH_TIME, "%Y-%m-%d %H:%M:%S") - strptime(START_TIME, "%Y-%m-%d %H:%M:%S")
| eval duration_min = round(duration_s / 60, 2)
| rename duration_min AS Completion_time
| eval Process=if(Process="013","2.1 Main calculation",Process)
| eval Process=if(Process="014","2.2 Main calculation",Process)
| table Process, 2.START_TIME, 3.FINISH_TIME, 4.Completion_time
| sort -START_TIME, -FINISH_TIME
| sort +Process
| transpose 0 header_field=Process column_name=Process
| dedup Process
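The duration arithmetic in the evals above boils down to the following, sketched in Python for clarity (same timestamp format, minutes rounded to two decimals):

```python
from datetime import datetime

def completion_minutes(start, finish, fmt="%Y-%m-%d %H:%M:%S"):
    """Minutes between two timestamp strings, rounded to 2 dp,
    mirroring the duration_s / duration_min evals above."""
    delta = datetime.strptime(finish, fmt) - datetime.strptime(start, fmt)
    return round(delta.total_seconds() / 60, 2)
```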
I am getting the error "could not load lookup = lookup-severity_for_fireeye" after running the query. I tried checking the lookup tables and definitions, and also the automatic lookup; the field name it uses in the search and the one in the lookup table are different. What can I do to rectify this error?
I have 60*24 = 1440 records in my saved search, which means every minute has one record:

| bin _time span=1m

The _time column contains values like:

8/26/2022 11:30:40am
8/26/2022 11:30:41am

I connected my Splunk data to Tableau, but when I use extract mode in Tableau, the datetime in all 1440 records changes to 8/26/2022 12:00:00am. Has anyone faced the same issue before?
New to Splunk Cloud and EC2 universal forwarder installs. I am reading that the Splunk Cloud universal forwarder on Linux needs a credential file. Where is the credential file located? Or is it just the same as on-premises Splunk universal forwarder installs? The more I research, the more confused I get.
Hello, we are trying to integrate Cortex XSOAR with Splunk Cloud following the vendor's document, but it states that when integrating with Splunk Cloud it is necessary to request API access from support, and we also need the IP, as shown in the images below. Is it possible to help us with this? The Configuration screen is attached. Support passed along this link to follow: https://docs.splunk.com/Documentation/SplunkCloud/8.2.2203/Config/ACSIntro , but it is not working. Could someone help, please?

Thank you.
Does Splunk ever plan on updating the Java API to mirror the Python splunk-sdk? The Java library is way behind the Python library when it comes to custom search. As far as I can tell, you cannot even create custom searches with Java for Splunk.

1. Splunk could more easily integrate with a variety of Apache tools.
2. You would get a performance boost for applications that really should not be built in Java.
3. Splunk would open itself up to a larger segment of the developer community.
We are trying to use our internal S3-compatible object store with SmartStore, but our access key and secret key expire and rotate every day. Does anyone know how to handle the key rotation? I know these keys are set in indexes.conf, and we should be able to write a script to update them, but does that require a Splunk restart? Any insights would be greatly appreciated.
Hi, I have a field with a timestamp value in the format "2017-09-21T20:00:00". I need to convert it to a date and time with a time zone, for example: Thu Jul 18 09:30:00 PDT 2022. Please help, thanks.
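In SPL this is a strptime/strftime round trip, e.g. strftime(strptime(field, "%Y-%m-%dT%H:%M:%S"), "%a %b %d %H:%M:%S %Z %Y"). The same conversion, sketched in Python (treating the zone-less value as US Pacific time is an assumption; adjust the zone to your data):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

def to_zone_string(ts, tz="America/Los_Angeles"):
    """Parse a zone-less ISO-style timestamp, attach a time zone, and
    render it in the 'Thu Jul 18 09:30:00 PDT 2022' style."""
    naive = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S")
    aware = naive.replace(tzinfo=ZoneInfo(tz))
    return aware.strftime("%a %b %d %H:%M:%S %Z %Y")
```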