All Topics

Is there a way to retrieve the time range a saved search uses? I have tried using this endpoint: curl -k -u admin:pass https://localhost:8089/services/saved/searches/search_name/history but I guess it is not returning the time range. Thank you
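The history endpoint lists past runs of the search rather than its configuration; the configured time range lives on the saved search entry itself, in the dispatch.earliest_time and dispatch.latest_time attributes. A sketch (search_name is the placeholder from the question; output_mode=json is optional):

curl -k -u admin:pass "https://localhost:8089/services/saved/searches/search_name?output_mode=json"

Look for dispatch.earliest_time and dispatch.latest_time in the returned content.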
We have configured DB Connect to pull data from a MySQL db into an index at an hourly frequency. Data is being pulled; however, we see that the count of Splunk events is much higher than the count of rows in the respective table. This is because the SQL table is real-time in nature and always has entries updating, whereas Splunk keeps storing the entries as per the hourly execution frequency. As a result, Splunk also holds historical events that are no longer present in the SQL table. We need to counter this situation, as we plan to build analytics reports on this data, so it has to be accurate and up to date in Splunk as well.
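If the input re-reads the full table on every run, one search-time workaround is to keep only the most recent snapshot of each row. A minimal sketch, assuming each row carries a unique key column (row_id, and the index/sourcetype names, are hypothetical):

index=your_db_index sourcetype=your_dbconnect_sourcetype
| stats latest(*) as * by row_id

Alternatively, a rising-column (checkpoint) input in DB Connect avoids re-ingesting unchanged rows in the first place, though it will not reflect updates to already-ingested rows.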
How can we configure a custom domain and an SSL certificate purchased from GoDaddy in Splunk? I need to securely access Splunk Enterprise from outside my network using my purchased domain. Please help!
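Splunk Web reads its certificate from web.conf, and the custom domain itself is handled by pointing DNS for the GoDaddy domain at the server. A sketch, assuming the private key and the certificate (with GoDaddy's intermediate chain appended) are saved under $SPLUNK_HOME/etc/auth/mycerts/ (paths and file names are placeholders):

[settings]
enableSplunkWebSSL = true
serverCert = /opt/splunk/etc/auth/mycerts/mydomain.pem
privKeyPath = /opt/splunk/etc/auth/mycerts/mydomain.key

After a restart, Splunk Web should answer on https://yourdomain:8000.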
Hi, I would like to know how to run searches using different time ranges in a dropdown. For example, an input in the dropdown would be labelled "Yesterday", and I would like to assign 2 different time ranges to the same label, such that I can run 2 different searches using the separate time ranges by just selecting one input from the dropdown. I have tried defining 4 tokens under the same label, but it doesn't work, i.e.

<choice value="yesterday">Yesterday</choice>
<condition label="Yesterday">
  <set token="custom_earliest">-8d@d+7h</set>
  <set token="custom_latest">@d+7h</set>
  <set token="breakdown_earliest">-1d@d+7h</set>
  <set token="breakdown_latest">@d+7h</set>
</condition>

Thanks
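Setting four tokens in one <condition> is allowed; the usual catches are that the <condition> must sit inside a <change> block of the input, and that each search must then reference its pair of tokens explicitly. A sketch of the complete input under those assumptions (the token name period is a placeholder):

<input type="dropdown" token="period">
  <label>Period</label>
  <choice value="yesterday">Yesterday</choice>
  <change>
    <condition value="yesterday">
      <set token="custom_earliest">-8d@d+7h</set>
      <set token="custom_latest">@d+7h</set>
      <set token="breakdown_earliest">-1d@d+7h</set>
      <set token="breakdown_latest">@d+7h</set>
    </condition>
  </change>
</input>

Each panel's search then picks its own pair, e.g. <earliest>$custom_earliest$</earliest><latest>$custom_latest$</latest> in one search and the breakdown_* pair in the other.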
I am looking for details on whether it is possible to customize the Splunk logs, e.g. mask the data, redact a field, or display only required fields in the logs.
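Masking at index time is commonly done with a SEDCMD in props.conf on the parsing tier (indexer or heavy forwarder). A sketch, where the sourcetype name and the card-number pattern are illustrative only:

[my_sourcetype]
SEDCMD-mask_card = s/\d{4}-\d{4}-\d{4}-(\d{4})/XXXX-XXXX-XXXX-\1/g

For displaying only required fields at search time, | fields or | table in the search does that without altering the indexed data.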
Hello, I have a chart with dynamic field names displayed as a table and would like to change the order of the columns:

Name    Season 1   Season 2   Season 3
Name1   10000      11111      22222
Name2   9999       9997       9998
Name3   7777       5555       6666

How can I change the order of the columns? The number of Seasons is flexible and it should always start with the latest one -> Name  Season 3  Season 2  Season 1
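One generic trick for dynamic column names is to transpose, sort the rows that used to be columns, and transpose back. A sketch, assuming the base search yields the table above (note the sort is lexical, so it only holds while season numbers stay single-digit):

... | transpose 0 header_field=Name column_name=Season
| sort - Season
| transpose 0 header_field=Season column_name=Name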
Hi. I work with ServiceNow, a ticketing platform. I wish to get only the currently "new" incidents and display them in a dashboard, but when I put "| search status=New" I get results which have already turned "resolved". Is there a way I can display only the incidents that are currently new?
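Since ServiceNow data usually lands as one event per state change, one approach is to collapse to each incident's latest state before filtering. A sketch, where the key field number and the index/sourcetype names are assumptions that depend on your add-on:

index=your_snow_index sourcetype=snow:incident
| stats latest(status) as status by number
| where status=="New"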
Hello all, I am trying to filter out those noisy 4662 logs that are eating our license, as recommended in Splunk blogs and forums. I tried the stanza below for 4662 to blacklist everything except GPO-related events, but it is not working as expected. Any help with fixing the regex part?

blacklist1 = EventCode="4662" Message="Object Type:(?!\s*groupPolicyContainer)"

The raw message is below:

Message=An operation was performed on an object.
Subject :
    Security ID: $
    Account Name: $
    Account Domain:
    Logon ID: 0x7F897031
Object:
    Object Server: DS
    Object Type: groupPolicyContainer
    Object Name: CN={123456-D64E-4013-ACC5-F78A}CN=Policies,CN=System,DC=xyz,DC=xyyz,DC=com
    Handle ID: 0x0
Operation:
    Operation Type: Object Access
    Accesses: Read Property
    Access Mask: 0x10
    Properties: ---
        Public Information
        distinguishedName
        groupPolicyContainer
Additional Information:
    Parameter 1: -
    Parameter 2:

Can we filter directly based on Object_Type instead of the Message field, like blacklist1 = EventCode="4662" Object_Type="(x|y)"? Any help would be great! Thanks.
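As far as I know, the WinEventLog whitelist/blacklist settings only match on a fixed set of keys (EventCode, Message, User, and so on), so a derived field like Object_Type cannot be used there; the negative lookahead on Message is the right idea. A sketch of the stanza with the regex loosened for multi-line messages and variable whitespace, which is what most often breaks these (the single literal space after the colon is an assumption from your sample):

[WinEventLog://Security]
blacklist1 = EventCode="4662" Message="(?s)Object\s+Type:(?!\s*groupPolicyContainer)"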
I'm trying to collapse a set of data into earliest/latest pairs by _time, where the times are contiguous. Such as:

2022-08-27 07:36:00
2022-08-27 07:37:00
2022-08-27 07:38:00
2022-08-27 07:39:00
2022-08-27 07:40:00
2022-08-27 07:44:00
2022-08-27 07:45:00
2022-08-27 07:46:00
2022-08-27 08:31:00
2022-08-27 08:32:00
2022-08-27 08:33:00
2022-08-27 08:34:00
2022-08-27 08:35:00

earliest:                latest:
2022-08-27 07:36:00      2022-08-27 07:40:00
2022-08-27 07:44:00      2022-08-27 07:46:00
2022-08-27 08:31:00      2022-08-27 08:35:00

Thoughts?
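A sketch using streamstats to start a new group whenever the gap to the previous event exceeds one minute (the 60-second threshold is an assumption based on the minutely data above):

| sort 0 _time
| streamstats current=f last(_time) as prev_time
| eval new_group = if(isnull(prev_time) OR _time - prev_time > 60, 1, 0)
| accum new_group as group_id
| stats min(_time) as earliest max(_time) as latest by group_id
| fieldformat earliest = strftime(earliest, "%Y-%m-%d %H:%M:%S")
| fieldformat latest = strftime(latest, "%Y-%m-%d %H:%M:%S")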
Hi, how can I combine two fields (2.1 and 2.2) into one field (Main calculation)? I have a table:

I would like to convert it into something like this:

where START_TIME is the value of 2.1 and FINISH_TIME is the value of the 2.2 field. Completion_time is the sum of both fields (0.35 + 60.53).

SPL query:

| eval finish_time_epoch = strftime(strptime(FINISH_TIME, "%Y-%m-%d %H:%M:%S"), "%Y-%m-%d %H:%M:%S")
| eval start_time_epoch = strftime(strptime(START_TIME, "%Y-%m-%d %H:%M:%S"), "%Y-%m-%d %H:%M:%S")
| eval duration_s = strptime(FINISH_TIME, "%Y-%m-%d %H:%M:%S") - strptime(START_TIME, "%Y-%m-%d %H:%M:%S")
| eval duration_min = round(duration_s / 60, 2)
| rename duration_min AS Completion_time
| eval Process=if(Process="013","2.1 Main calculation",Process)
| eval Process=if(Process="014","2.2 Main calculation",Process)
| table Process, 2.START_TIME, 3.FINISH_TIME, 4.Completion_time
| sort -START_TIME, -FINISH_TIME
| sort +Process
| transpose 0 header_field=Process column_name=Process
| dedup Process
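One way to merge the 2.1 and 2.2 rows before the transpose is to strip the step number and aggregate. A sketch, assuming the Process values renamed above ("2.1 Main calculation", "2.2 Main calculation") and that START_TIME/FINISH_TIME sort correctly as "%Y-%m-%d %H:%M:%S" strings; step and base_process are hypothetical field names:

| rex field=Process "^(?<step>\d+\.\d+)\s+(?<base_process>.+)$"
| stats min(START_TIME) as START_TIME max(FINISH_TIME) as FINISH_TIME sum(Completion_time) as Completion_time by base_process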
Getting the following error, "could not load lookup = lookup-severity_for_fireeye", after running the query. I tried checking the lookup tables and definitions, and also the automatic lookup; the field name it is taking in the search and the one in the lookup table are different. What can I do to rectify this error?
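That error usually means the lookup definition is missing, mis-named, or not shared with the app/role running the search. A quick sketch to verify both (the definition name is taken from the error message):

| inputlookup lookup-severity_for_fireeye

| rest /services/data/transforms/lookups splunk_server=local
| table title eai:acl.app eai:acl.sharing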
I have 60*24 = 1440 records in my saved search, which means every minute has one record:

|bin _time span=1m

The _time column has values like these:

8/26/2022 11:30:40am
8/26/2022 11:30:41am

I connected my Splunk data to Tableau, but all the datetimes in the 1440 records changed to 8/26/2022 12:00:00am when I use extract mode in Tableau. Has anyone faced the same issue before?
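If the extract is dropping the time portion, one workaround is to hand Tableau an explicit string (or epoch number) rather than relying on its conversion of _time. A sketch appended to the saved search (the field name time_str is a placeholder):

| eval time_str = strftime(_time, "%Y-%m-%d %H:%M:%S")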
In June we announced Splunk 9.0, which has a lot of new features and innovations. In this Tech Talk, we will walk you through the new Splunk 9.0 / Splunk Cloud Platform features. These new enhancements help you with end-to-end visibility, rapid investigation and action, and more extensibility. How can you take advantage of these new features? We will show you how to upgrade to Splunk 9.0 and how to upgrade to Splunk Cloud Platform to take advantage of all the new features. Tune in to learn about:

The new Splunk Platform features
How to upgrade to Splunk 9.0
Why & how to migrate to Splunk Cloud Platform
New to Splunk Cloud and EC2 universal forwarder installs. I am reading that the universal forwarder on Linux needs a credentials file for Splunk Cloud? Where is the credentials file located? Or is it just the same as on-premises Splunk universal forwarder installs? The more I research, the more I get confused.
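The Splunk Cloud forwarder credentials ship as an app package (typically named splunkclouduf.spl) that you download from your Splunk Cloud instance and install on the forwarder, so it is not something an on-prem install would have. A sketch, assuming the package was downloaded to /tmp:

$SPLUNK_HOME/bin/splunk install app /tmp/splunkclouduf.spl
$SPLUNK_HOME/bin/splunk restart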
Hello, we are trying to integrate Cortex XSOAR with Splunk Cloud following the manufacturer's document, but it says that when integrating with Splunk Cloud it is necessary to request API access from support, and we also need the IP, as shown in the images below. Is it possible to help us with this? The Configuration screen is attached. Support passed along this link to follow: https://docs.splunk.com/Documentation/SplunkCloud/8.2.2203/Config/ACSIntro but it is not working. Could someone help, please? Thank you.
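If the blocker is opening the management API to XSOAR's address, the Admin Config Service from that link lets you manage the IP allow list yourself. A sketch per the ACS docs, where {stack} is your Splunk Cloud stack name, $TOKEN is an authentication token, the subnet is a placeholder, and search-api as the feature name is an assumption about which port XSOAR needs:

curl -X POST "https://admin.splunk.com/{stack}/adminconfig/v2/access/search-api/ipallowlists" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"subnets": ["203.0.113.10/32"]}'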
Does Splunk ever plan on updating the Java API to mirror the Python splunk-sdk? The Java library is way behind the Python library when it comes to custom search. As far as I can tell, you cannot even create custom searches with Java for Splunk.

1. Splunk could more easily integrate with a variety of Apache tools.
2. You would get a performance boost for applications that really should not be built in Java.
3. Splunk would open itself up to a larger segment of the developer community.
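For context, plain search jobs do work through the current splunk-sdk-java; it is the custom search command side that has no Java counterpart. A minimal sketch of running a search job (host and credentials are placeholders):

import com.splunk.Job;
import com.splunk.Service;
import com.splunk.ServiceArgs;

public class SearchExample {
    public static void main(String[] args) throws InterruptedException {
        // Connection details are placeholders for a local test instance.
        ServiceArgs loginArgs = new ServiceArgs();
        loginArgs.setUsername("admin");
        loginArgs.setPassword("changeme");
        loginArgs.setHost("localhost");
        loginArgs.setPort(8089);

        Service service = Service.connect(loginArgs);   // authenticates and returns a session
        Job job = service.getJobs().create("search index=_internal | head 5");
        while (!job.isDone()) {                         // poll until the search job finishes
            Thread.sleep(500);
            job.refresh();
        }
        System.out.println("Events: " + job.getEventCount());
    }
}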
We are trying to use our internal S3-compliant object store with SmartStore, but our access key and secret key expire and rotate every day. Does anyone know how to handle the key rotation? I know these keys are set in the indexes.conf file, and we should be able to write a script to update it, but does it require a Splunk restart? Any insights will be greatly appreciated.
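For reference, these are the settings such a rotation script would rewrite; a sketch of a typical SmartStore volume stanza (bucket name, endpoint, and key values are placeholders). Whether a plain conf change is picked up without a restart varies by version, so treat the restart question as something to verify against the docs for your release:

[volume:remote_store]
storageType = remote
path = s3://smartstore-bucket
remote.s3.endpoint = https://objectstore.example.com
remote.s3.access_key = YOUR_ACCESS_KEY
remote.s3.secret_key = YOUR_SECRET_KEY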
Hi, I have a field with a timestamp value in the "2017-09-21T20:00:00" format. I need to convert it to date and time with a time zone, for example: Thu Jul 18 09:30:00 PDT 2022. Please do help. Thanks
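A sketch with strptime/strftime, assuming the field is named my_time (hypothetical); since the input string carries no zone, Splunk interprets it in your search-time zone, and %Z prints that zone's abbreviation:

| eval epoch = strptime(my_time, "%Y-%m-%dT%H:%M:%S")
| eval formatted = strftime(epoch, "%a %b %d %H:%M:%S %Z %Y")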
I have a simple .csv I ingest daily via a monitored file. My .csv has some fields in it that show dates/times, but they do NOT represent the time I want the event indexed at. I want _time to show the time the .csv file was ingested and for Splunk to ignore the other fields in the .csv which have dates/times present. I have created a new sourcetype by cloning .csv and set the timestamp to use "current time"; however, Splunk will still prefer to use random dates/times found in field values and only use "current time" when no fields contain any other time information. I can "fix" this by manually adding a time field in the .csv before ingesting, but I am trying to automate this process as much as possible. Is there a way I can force Splunk to ignore all date/time values found in a .csv and use ingest time for the _time value? Thank you in advance!
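A sketch of the sourcetype in props.conf, with one caveat: when INDEXED_EXTRACTIONS = csv is in play, the timestamp is resolved on the universal forwarder, so the stanza has to be deployed there as well, not just on the indexers (the sourcetype name is a placeholder):

[my_daily_csv]
INDEXED_EXTRACTIONS = csv
DATETIME_CONFIG = CURRENT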
Hello!

We have some logs coming across which are in JSON and thus 'just work'. The problem is that the events we need to extract are inside the log field. There are about 200 apps that will be logging this way, and each app will have different fields and values, so doing a standard field extraction won't work, or would mean thousands of potential KV pairs. The format, however, is always the same.

{
  "log": "[2022-08-25 18:54:40.031] INFO JsonLogger [[MuleRuntime].uber.143312: [prd-ops-mulesoft].encryptFlow.BLOCKING @4ac358d5] [event: 25670349-e6b5-4996-9cb6-c4c9657cd9ba]: {\n \"correlationId\" : \"25670349-e6b5-4996-9cb6-c4c9657cd9ba\",\n \"message\" : \"MESSAGE_HERE\",\n \"tracePoint\" : \"END\",\n \"priority\" : \"INFO\",\n \"elapsed\" : 0,\n \"locationInfo\" : {\n \"lineInFile\" : \"95\",\n \"component\" : \"json-logger:logger\",\n \"fileName\" : \"buildLoggingAndResponse.xml\",\n \"rootContainer\" : \"responseStatus_success\"\n },\n \"timestamp\" : \"2022-08-25T18:54:40.030Z\",\n \"content\" : {\n \"ResponseStatus\" : {\n \"type\" : \"SUCCESS\",\n \"title\" : \"Encryption successful.\",\n \"status\" : \"200\",\n \"detail\" : { },\n \"correlationId\" : \"25670349-e6b5-4996-9cb6-c4c9657cd9ba\",\n \"apiMethodName\" : null,\n \"apiURL\" : \"https://app.com/prd-ops-mulesoft/encrypt/v1.0\",\n \"apiVersion\" : \"v1\",\n \"x-ConsumerRequestSentTimeStamp\" : \"\",\n \"apiRequestReceivedTimeStamp\" : \"2022-08-25T18:54:39.856Z\",\n \"apiResponseSentTimeStamp\" : \"2022-08-25T18:54:40.031Z\",\n \"userId\" : \"GID01350\",\n \"orchestrations\" : [ ]\n }\n },\n \"applicationName\" : \"ops-mulesoft\",\n \"applicationVersion\" : \"v1\",\n \"environment\" : \"PRD\",\n \"threadName\" : \"[MuleRuntime].uber.143312: [prd-ops-mulesoft].encryptFlow.BLOCKING @4ac358d5\"\n}\n",
  "stream": "stdout",
  "time": "2022-08-25T18:54:40.086450071Z",
  "kubernetes": {
    "pod_name": "prd-ops-mulesoft-94c49bdff-pcb5n",
    "namespace_name": "4cfa0f08-92b0-467b-9ca4-9e49083fd922",
    "pod_id": "e9046b5e-0d70-11ed-9db5-0050569b19f6",
    "labels": {
      "am-org-id": "01a4664d-9e16-454b-a14c-59548ef896b5",
      "app": "prd-ops-mulesoft",
      "environment": "4cfa0f08-92b0-467b-9ca4-9e49083fd922",
      "master-org-id": "01a4664d-9e16-454b-a14c-59548ef896b5",
      "organization": "01a4664d-9e16-454b-a14c-59548ef896b5",
      "pod-template-hash": "94c49bdff",
      "rtf.mulesoft.com/generation": "aab6b8074cf73151b1515de0e468478e",
      "rtf.mulesoft.com/id": "18d3e5d6-ce59-4837-9f3b-8aad3ccffcef",
      "type": "MuleApplication"
    },
    "host": "1.1.1.1",
    "container_name": "app",
    "docker_id": "cf07f321aec551b200fb3f31f6f1c67b2678ff6f6a335d4ca41ec2565770513c",
    "container_hash": "rtf-runtime-registry.kprod.msap.io/mulesoft/poseidon-runtime-4.3.0@sha256:6cfeb965e0ff7671778bc53a54a05d8180d4522f0b1ef7bb25e674686b8c3b75",
    "container_image": "rtf-runtime-registry.kprod.msap.io/mulesoft/poseidon-runtime-4.3.0:20211222-2"
  }
}

The JSON works fine, but the events we ALSO want extracted are in here:

{\n \"correlationId\" : \"25670349-e6b5-4996-9cb6-c4c9657cd9ba\",\n \"message\" : \"MESSAGE_HERE\",\n \"tracePoint\" : \"END\",\n \"priority\" : \"INFO\",\n \"elapsed\" : 0,\n \"locationInfo\" : {\n \"lineInFile\" : \"95\",\n \"component\" : \"json-logger:logger\",\n \"fileName\" : \"buildLoggingAndResponse.xml\",\n \"rootContainer\" : \"responseStatus_success\"\n },\n \"timestamp\" : \"2022-08-25T18:54:40.030Z\",\n \"content\" : {\n \"ResponseStatus\" : {\n \"type\" : \"SUCCESS\",\n \"title\" : \"Encryption successful.\",\n \"status\" : \"200\",\n \"detail\" : { },\n \"correlationId\" : \"25670349-e6b5-4996-9cb6-c4c9657cd9ba\",\n \"apiMethodName\" : null,\n \"apiURL\" : \"https://app.com/prd-ops-mulesoft/encrypt/v1.0\",\n \"apiVersion\" : \"v1\",\n \"x-ConsumerRequestSentTimeStamp\" : \"\",\n \"apiRequestReceivedTimeStamp\" : \"2022-08-25T18:54:39.856Z\",\n \"apiResponseSentTimeStamp\" : \"2022-08-25T18:54:40.031Z\",\n \"userId\" : \"GID01350\",\n \"orchestrations\" : [ ]\n }\n },\n \"applicationName\" : \"ops-mulesoft\",\n \"applicationVersion\" : \"v1\",\n \"environment\" : \"PRD\",\n \"threadName\" : \"[MuleRuntime].uber.143312: [prd-ops-mulesoft].encryptFlow.BLOCKING @4ac358d5\"\n}

spath would work, but the JSON is fronted by this:

"[2022-08-25 18:54:40.031] INFO JsonLogger [[MuleRuntime].uber.143312: [prd-ops-mulesoft].encryptFlow.BLOCKING @4ac358d5] [event: 25670349-e6b5-4996-9cb6-c4c9657cd9ba]:

so it doesn't get treated as JSON. This is one example; other events don't have the correlationId, etc. We need a method that will take the raw data, parse it as JSON, AND then dynamically extract the events in the log field as their KV pairs (e.g. \"correlationId\" : \"25670349-e6b5-4996-9cb6-c4c9657cd9ba\" == $1::$2). Can this be done in transforms using regex? Is this even possible, or do we ultimately need to create extractions based on every possible field?

Appreciate the guidance here! Thanks!!