All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Where is the schema for DS / code? Thanks.
Hello, and sorry for the translation. Currently, with the help of the DB Connect app, I am receiving logs from an Aurora database without any problem. My client tells me he needs to upgrade from version 11.6 to 11.15. At the driver level, should I make any adjustments, or can I tell them they can update the database without any problem? Thanks.
Does anyone know of a way to get bytes ingested by host and source over a specified time range? I know I can use license_usage.log to get it by index and sourcetype like this:

index="_internal" source="/opt/splunk/var/log/splunk/license_usage.log" sourcetype="splunkd" type="Usage"
| stats sum(b) as bytes by idx
| rename idx as index
| sort - bytes

or this:

index="_internal" source="/opt/splunk/var/log/splunk/license_usage.log" sourcetype="splunkd" type="Usage"
| stats sum(b) as bytes by st
| rename st as sourcetype
| sort - bytes

However, you cannot use it reliably for host and source, because Splunk squashes that data to prevent too many events. I know that can be tuned in server.conf with squash_threshold, but that would be an arbitrary value that could need to keep changing, and honestly it is set that way to avoid overloading the system. So I'm left wondering if anyone knows of a way to get that data without using license_usage.log.
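A minimal sketch of one common workaround: per-host throughput is also recorded in metrics.log, and raw event length can approximate per-source volume. Note that metrics.log only keeps the top series per interval (tunable via maxseries in limits.conf), so treat both as estimates rather than exact license figures:

index=_internal source=*metrics.log* group=per_host_thruput
| stats sum(kb) as KB by series
| rename series as host
| sort - KB

For source, the len(_raw) approach works but is expensive, since it scans the raw events ("your_index" is a placeholder):

index=your_index
| eval bytes=len(_raw)
| stats sum(bytes) as bytes by host, source
| sort - bytes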
I have spent days working on this; can someone help? How do I populate the previous week's results? Also, there are different license keys for the same errors, which is why some errors show two entries. I have the following code:

index=test sourcetype=dhi:testdata ErrorCode!=0
| `DedupDHI`
| bucket _time span=1w
| lookup table1 LicenseKey OUTPUT CustomerName
| eval CustomerName=coalesce(CustomerName,LicenseKey)
| stats count as Result by CustomerName,ErrorCode,_time
| eventstats sum(Result) as Total by CustomerName
| eval PercentOfTotal = round((Result/Total)*100,3)
| streamstats current=f latest(Result) as Result_Prev by CustomerName,ErrorCode
| eval PercentDifference = round(((Result/Result_Prev)-1)*100,2)
| fillnull value="0"
| append
    [ search index=test sourcetype=dhi:testdata ErrorCode!=0
    | `DedupDHI`
    | lookup table1 LicenseKey OUTPUT CustomerName
    | eval CustomerName=coalesce(CustomerName,LicenseKey)
    | stats count as Result by CustomerName
    | eval ErrorCode="Total", PercentOfTotal=100]
| fillnull value="0"
| lookup table2 ErrorCode OUTPUT Description
| lookup table1 LicenseKey OUTPUT CustomerName
| eval CustomerName=coalesce(CustomerName,LicenseKey)
| eval Error=if(ErrorCode!="Total", ErrorCode+" ("+coalesce(Description,"Description Missing - Update table2")+")", ErrorCode)
| rename Result_Prev as "Previous Week Results", PercentDifference as " Percent Difference", PercentOfTotal as "Percent of Total"
| fields CustomerName, Error, Result,"Previous Week Results", " Percent Difference" , "Percent of Total"
| sort CustomerName, Error, PercentDifference

OUTPUT:

CustomerName | Error | Result | Previous Week Results | Percent Difference | Percent of Total | _time
customer_1 | 1002 (Invalid Address State Code. The two digit state code is invalid) | 4 | 0 | 0 | 3.361 | 2022-08-12T00:00:00.000-0500
customer_1 | 1003 (Invalid Birth Year) | 1 | 0 | 0 | 0.84 | 2022-08-12T00:00:00.000-0500
customer_1 | 1006 (Invalid UnderwritingState) | 1 | 0 | 0 | 0.84 | 2022-08-12T00:00:00.000-0500
customer_1 | 1013 (Invalid Drivers License Format) | 12 | 0 | 0 | 10.084 | 2022-08-12T00:00:00.000-0500
customer_1 | 1013 (Invalid Drivers License Format) | 1 | 12 | -91.67 | 0.84 | 2022-08-19T00:00:00.000-0500
customer_1 | 1023 (Invalid Name) | 3 | 0 | 0 | 2.521 | 2022-08-12T00:00:00.000-0500
customer_1 | 1027 (Invalid UnderwritingState) | 87 | 0 | 0 | 73.109 | 2022-08-12T00:00:00.000-0500
customer_1 | 1027 (Invalid UnderwritingState) | 1 | 87 | -98.85 | 0.84 | 2022-08-19T00:00:00.000-0500
customer_1 | 1305 (Unable to connect to data provider) | 9 | 0 | 0 | 7.563 | 2022-08-12T00:00:00.000-0500
customer_1 | Total | 119 | 0 | 0 | 100 | 1969-12-31T18:00:00.000-0500
customer_2 | 1023 (Invalid Name) | 16 | 0 | 0 | 55.172 | 2022-08-12T00:00:00.000-0500
customer_2 | 1201 (Lookback Date Not Set / Offset = 0) | 1 | 0 | 0 | 3.448 | 2022-08-12T00:00:00.000-0500
customer_2 | 1305 (Unable to connect to data provider) | 11 | 0 | 0 | 37.931 | 2022-08-12T00:00:00.000-0500
customer_2 | 1305 (Unable to connect to data provider) | 1 | 11 | -90.91 | 3.448 | 2022-08-19T00:00:00.000-0500
customer_2 | Total | 29 | 0 | 0 | 100 | 1969-12-31T18:00:00.000-0500
customer_3 | 1023 (Invalid Name) | 3 | 0 | 0 | 20 | 2022-08-12T00:00:00.000-0500
customer_3 | 1027 (Invalid UnderwritingState) | 11 | 0 | 0 | 73.333 | 2022-08-12T00:00:00.000-0500
customer_3 | 9999 (Timeout expired (9999)) | 1 | 0 | 0 | 6.667 | 2022-08-12T00:00:00.000-0500
customer_3 | Total | 15 | 0 | 0 | 100 | 1969-12-31T18:00:00.000-0500
customer_4 | 1003 (Invalid Birth Year) | 1 | 0 | 0 | 3.846 | 2022-08-12T00:00:00.000-0500
customer_4 | 1013 (Invalid Drivers License Format) | 5 | 0 | 0 | 19.231 | 2022-08-12T00:00:00.000-0500
customer_4 | 1013 (Invalid Drivers License Format) | 1 | 5 | -80 | 3.846 | 2022-08-19T00:00:00.000-0500
customer_4 | 1023 (Invalid Name) | 14 | 0 | 0 | 53.846 | 2022-08-12T00:00:00.000-0500
customer_4 | 1026 (Drivers License Number is a required field) | 3 | 0 | 0 | 11.538 | 2022-08-12T00:00:00.000-0500
customer_4 | 9999 (Timeout expired (9999)) | 1 | 0 | 0 | 3.846 | 2022-08-12T00:00:00.000-0500
customer_4 | 9999 (Timeout expired (9999)) | 1 | 1 | 0 | 3.846 | 2022-08-19T00:00:00.000-0500
customer_4 | Total | 26 | 0 | 0 | 100 | 1969-12-31T18:00:00.000-0500
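One alternative way to line each week's count up against the previous week's, sketched with the index, sourcetype, and macro from the question (the `DedupDHI` macro is assumed to behave as above), is timechart plus timewrap:

index=test sourcetype=dhi:testdata ErrorCode!=0
| `DedupDHI`
| timechart span=1w count as Result by ErrorCode
| timewrap 1week

timewrap turns each earlier week into its own column (for example 1002_latest_week next to 1002_1week_before), which avoids the streamstats gaps when an error code has no events in a given week.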
Hi everyone,

State | ID | APP | _time
INFO | ABC | Car | 19/08/22 19:51
INFO | ABC | Car | 19/08/22 19:52
INFO | DEF | Car | 20/08/22 19:53
INFO | ZZZ | Book | 30/08/22 19:51
INFO | ZZZ | Book | 19/08/22 19:55
WARN | ABC | Car | 19/08/22 19:56
WARN | XYZ | Car | 20/08/22 19:51
WARN | ZZZ | Book | 19/08/22 19:58
WARN | ZZZ | Book | 19/08/22 19:59
ERROR | ABC | Car | 19/08/22 20:00
ERROR | ABC | Car | 19/08/22 20:01
ERROR | XYZA | Car | 30/08/22 19:51

I have the data shown in the table above, and I have to create a statistical analysis for the following requirement: find the count of distinct ID by APP for any given State.

For example, for State=INFO my results should be:

APP | Count
Car | 2
Book | 1

For State=ERROR my results should be:

APP | Count
Car | 2

Currently I am trying this:

index=testdata | stats count(eval(searchmatch("*INFO*"))) BY APP

But I am not getting the count of records with distinct ID. My question is: how do I use the stats command with an eval function and a distinct count on two separate columns?
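A minimal sketch of the usual pattern for this, with the index name taken from the question: filter on the State first, then use dc() for the distinct count:

index=testdata State=INFO
| stats dc(ID) as Count by APP

Or, to get every State at once in one table:

index=testdata
| stats dc(ID) as Count by State, APP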
We are on Splunk Cloud with ES 7.0.0. As a user with the sc_admin or ess_admin role, when selecting an incident to edit, the drop-down for "Status" gives no matches. All other drop-downs give options as expected. We've tried enabling/disabling all statuses, creating new statuses, adding/removing transition roles for ALL statuses, granting edit_reviewstatus to additional roles, granting write permissions on the reviewstatuses_lookup KV store collection, and several other things. Is there a key thing we are missing to be able to change the status on incidents with the ess_admin user?
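One way to sanity-check what the Status drop-down should be reading, assuming your ES version exposes the reviewstatuses REST endpoint (the endpoint path and field names are assumptions and may differ by version):

| rest /services/alerts/reviewstatuses splunk_server=local
| table title label disabled

If this returns nothing when run as the ess_admin user, the problem is more likely read permissions on the status objects themselves than on the transitions.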
I created a Splunk Python script, set it up in Splunk Web under "Data inputs", and followed all the procedures, but my script is not running in Splunk Web. I installed the Splunk Python SDK on Windows with pip install splunk-sdk. I've run my script from this folder and verified that it works:

C:\Program Files\Splunk\etc\apps\search\bin> python sample.py

but it doesn't work in Splunk Web. How do I solve this problem on Windows? Do I need to change anything in the Splunk folder path (C:\Program Files\Splunk\etc\apps\search\bin\sample.py)?
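Worth noting: Splunk runs scripted inputs with its own bundled Python, which does not see packages pip-installed into the system Python, so a common fix is to copy the SDK's splunklib folder into the app's bin directory next to the script. The script also has to be registered as an input; a minimal inputs.conf sketch, where interval, sourcetype, and index are placeholders:

[script://$SPLUNK_HOME\etc\apps\search\bin\sample.py]
disabled = 0
interval = 60
sourcetype = my_sample_script
index = main

If the input still does not run, searching splunkd.log for ExecProcessor messages usually shows the script's error output.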
Hi, we are trying to integrate Gmail logs into our Splunk Cloud instance. We have tried the Splunk Add-on for Google Workspace (https://splunkbase.splunk.com/app/5556/). The integration was smooth, and we were able to see G Suite header logs in Splunk. The problem was that it eventually generated large bills from Google for the BigQuery usage, so we were forced to disable it temporarily. When we did an analysis, we found that the add-on's current approach is to query all partitions at once using the query below:

"SELECT * FROM `{gcp_project_id}.gmail_logs_dataset.daily_*` "
"WHERE event_info.timestamp_usec > {start_time_usec} "
"AND event_info.timestamp_usec < {end_time_usec} "
"ORDER BY event_info.timestamp_usec ASC"

Instead of querying the whole partition set, we would like to query each day's table individually, which would massively reduce the cost. I raised a support ticket with Splunk on this, and they confirmed it requires a code change and could not commit to any timeline. We tried manually editing this part of the code and uploading it as a custom app, but it did not pass the vetting process. It would be really helpful if someone could suggest an alternate solution for integrating Gmail logs, or a way to upload the modified add-on. Much appreciated, Archa
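For anyone attempting the same code change: BigQuery prunes wildcard tables when the query constrains _TABLE_SUFFIX, so a day-scoped variant of the add-on's query should only scan a single daily table. A sketch keeping the add-on's placeholders, where {yyyymmdd} is a placeholder introduced here for the day being collected:

SELECT * FROM `{gcp_project_id}.gmail_logs_dataset.daily_*`
WHERE _TABLE_SUFFIX = '{yyyymmdd}'
  AND event_info.timestamp_usec > {start_time_usec}
  AND event_info.timestamp_usec < {end_time_usec}
ORDER BY event_info.timestamp_usec ASC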
I would like to have six intermediate forwarders in front of the indexers, and I want to configure parsing on the intermediate forwarders only. Can someone help me with the configuration? I have done the basic configuration, but I am seeing parsing-queue and tail reader errors on the intermediate forwarders, and traffic is getting blocked. Can you please help me solve this problem?
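For reference, intermediate forwarders only parse if they are heavy forwarders (full Splunk Enterprise instances); universal forwarders do not run the parsing pipeline. A minimal outputs.conf sketch for each intermediate HF, with placeholder hostnames, load-balancing across the indexers and using ACKs so data is not lost when queues block:

[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997, idx3.example.com:9997
useACK = true

If the parsing queues themselves are the bottleneck, adding a pipeline set in server.conf on the heavy forwarders sometimes helps, at the cost of extra CPU:

[general]
parallelIngestionPipelines = 2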
Hi all, I have nested JSON in my log event, and I have to create a dynamic table from it:

{
  "status": "FINISHED",
  "data": [
    { "duration": 123, "status": "A" },
    { "duration": 456, "status": "B" },
    { "duration": 678, "status": "C" }
  ]
}

I need to create a table from this nested structure:

status | A | B | C
FINISHED | 123 | 456 | 678

I also have one more requirement: if in the future we get more values in the nested part of the JSON, can a column be added for those as well?
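A minimal SPL sketch of one way to build that table so new entries in data become new columns automatically (field names read from the sample event above):

| spath status output=overall_status
| spath path=data{} output=items
| mvexpand items
| spath input=items
| xyseries overall_status status duration

xyseries pivots the inner status values into column headers, so a future status D would appear as a new column without changing the search.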
Is there a way to create a daily report of the number of times a particular playbook is run?
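One possible starting point, assuming your SOAR version exposes playbook runs at /rest/playbook_run (the endpoint, filter syntax, and token header are assumptions to verify against your version's REST documentation):

curl -sk -H "ph-auth-token: $SOAR_TOKEN" \
  "https://<soar-host>/rest/playbook_run?_filter_playbook=<playbook_id>&page_size=1"

The count field in the response gives the total number of runs matching the filter, so scheduling this daily and diffing the counts would yield a per-day report.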
Hi Splunkers,

Query: I have a use case in which we need to display a Splunk dashboard on a screen for 8+ hours continuously.
Splunk authentication method: we are using SSO authentication for login.
Splunk architecture: 3 indexers, one search head, one deployment server, and one cluster master.
Solutions tried: used the refresh tag in the dashboard, and tried creating a new user that logs in without SSO.
Note: we can't use Splunk TV in this use case.

Any help would be appreciated.
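If the dashboard goes blank because the Splunk Web session expires rather than because of the SSO handshake itself (an assumption about the root cause, not a confirmed fix), one knob to try is the session timeout in web.conf on the search head, set in minutes:

[settings]
tools.sessions.timeout = 600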
how can i get date data in fields to use the splunk features. What is the best way for it. The raw data are coming from a kafka stream: e.g.: MAX_CURRENT BLOCK_POSITION_NUMBER OPERATING_DISTANCE_KM row data: {"body":"{\"namespace\":\"xxx.messages.sensor.sensortelemetrymessage\",\"payload\":{\"dataSource\":{\"endpointUrl\":\"opc.tcp://123.164.72.6:4840/\",\"id\":\"\",\"name\":\"\",\"route\":\"\"},\"data\":{\"key\":\"ns=3;s=\\\"TEST_CARRIER_DB\\\".\\\"CARRIER\\\"[105]\",\"value\":[{\"key\":\"TIMESTAMP\",\"value\":\"0001-01-01T00:00:00Z\",\"dataType\":\"DateTime\"},{\"key\":\"PLC_CARRIER\",\"value\":[{\"key\":\"OPERATION_MODE\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"TEMP_CABINET\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"F_PROG_SIG\",\"value\":[{\"key\":\"0\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"1\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"2\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"3\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"4\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"5\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"6\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"7\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"8\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"9\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"10\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"11\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"12\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"13\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"14\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"15\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"16\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"17\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"18\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"19\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"20\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"21\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"22\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"23\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"24\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"25\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"26\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"27\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"28\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"29\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"30\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"31\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"32\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"33\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"34\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"35\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"36\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"37\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"38\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"39\",\"value\":\"32\",\"dataType\":\"Byte\"}],\"dataType\":\"ByteCollection\"},{\"key\":\"PROG_DAT\",\"value\":[{\"key\":\"0\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"1\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"2\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"3\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"4\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"5\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"6\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"7\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"8\",\"value\":\"32\",\"dataType\"
:\"Byte\"},{\"key\":\"9\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"10\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"11\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"12\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"13\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"14\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"15\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"16\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"17\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"18\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"19\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"20\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"21\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"22\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"23\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"24\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"25\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"26\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"27\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"28\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"29\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"30\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"31\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"32\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"33\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"34\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"35\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"36\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"37\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"38\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"39\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"40\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"41\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"42\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"43\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"44\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"45\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"46\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"47\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"48\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"49\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"50\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"51\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"52\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"53\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"54\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"55\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"56\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"57\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"58\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"59\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"60\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"61\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"62\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"63\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"64\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"65\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"66\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"67\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"68\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"69\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"70\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"71\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"72\",\
"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"73\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"74\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"75\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"76\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"77\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"78\",\"value\":\"32\",\"dataType\":\"Byte\"},{\"key\":\"79\",\"value\":\"32\",\"dataType\":\"Byte\"}],\"dataType\":\"ByteCollection\"},{\"key\":\"CYCLETIME_AVERAGE\",\"value\":\"0\",\"dataType\":\"Int16\"}],\"dataType\":\"Struct<TEST_PLC_CARRIER_UDT>\"},{\"key\":\"CARRIER\",\"value\":[{\"key\":\"BLOCK_POSITION_NUMBER\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"OPERATING_HOURS\",\"value\":\"0\",\"dataType\":\"Int32\"}],\"dataType\":\"Struct<TEST_CARRIER_UDT>\"},{\"key\":\"DRIVE\",\"value\":[{\"key\":\"BMK\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"POSITION\",\"value\":\"0\",\"dataType\":\"Int32\"},{\"key\":\"SPEED\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"ACT_CURRENT\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"MAX_CURRENT\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"MAX_CURRENT_AVERAGE\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"ERRORCODE\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"OPERATING_DISTANCE_MM\",\"value\":\"0\",\"dataType\":\"Int32\"},{\"key\":\"OPERATING_DISTANCE_KM\",\"value\":\"0\",\"dataType\":\"Int32\"}],\"dataType\":\"Struct<TEST_FREQU_CONV_UDT>\"},{\"key\":\"USER\",\"value\":[{\"key\":\"BARCODE_DRIVE\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"BARCODE_DRIVE_MIN\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"WIFI_SIGNAL_STRENGTH\",\"value\":\"0\",\"dataType\":\"Int16\"},{\"key\":\"WIFI_CHANNEL\",\"value\":\"0\",\"dataType\":\"UInt16\"},{\"key\":\"TX_RATE\",\"value\":\"0\",\"dataType\":\"UInt16\"},{\"key\":\"SC_RESPONSE_TIME\",\"value\":\"0\",\"dataType\":\"UInt32\"}],\"dataType\":\"Struct<TEST_CARRIER_MAX_UDT_USER>\"}],\"status\":\"Good\",\"lastChangeTimestamp\":\"2022-08-14T06:04:02.9607079Z\",\"measurementTimestamp\":\"2022-08-14T06:04:02.9607079Z\",\"dataType\":\"Struct<TEST_CARRIER_MAX_UDT>\"}},\"id\":\"c9e9009b-f79b-44c8-9a44-ebd5a59b0814\",\"$schema\":\"https://xxxxx.blob.core.windows.net/schemas/message_schemas/2021-05-05/xxx.sensor.sensortelemetrymessage.schema.json\",\"metadata\":{\"timestamp\":\"2022-08-14T06:04:22.2257279Z\",\"correlationIds\":[],\"senderIdentifier\":{\"id\":\"aaaaa-1493-4f61-993b-e3fb046908aa\",\"name\":\"OPC UA Connector\",\"type\":\"gateway\",\"route\":\"\"},\"destinationIdentifiers\":[]}}","SCHEMA_MAPPER":"xxxx.TELEMETRY","enqueuedTime":"2022-08-14T06:04:22.225Z","YEAR":2022,"MONTH":8,"DAY":14,"HOUR":6}  
We currently have the use case "High Number of Login Failures from a single source" turned on. We would like to exclude from the search some IP ranges that we fail our staff over to. Our search at the moment is:

index=appext_o365 `o365_management_activity` Operation=UserLoginFailed record_type=AzureActiveDirectoryStsLogon app=AzureActiveDirectory
| stats count dc(user) as accounts_locked values(user) as user values(LogonError) as LogonError values(authentication_method) as authentication_method values(signature) as signature values(UserAgent) as UserAgent by src_ip record_type Operation app
| search accounts_locked >= 10
| `high_number_of_login_failures_from_a_single_source_filter`

I added | search src_ip!="###.##.##.17", which does remove that one IP from the search, but obviously I don't want to manually enter .1 through .128. Any assistance would be very much appreciated.
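A minimal sketch using cidrmatch, with documentation example ranges standing in for your real ones. Note that .1 through .128 is not exactly one CIDR block (a /25 such as x.x.x.0/25 covers .0 through .127), so you may need two entries:

index=appext_o365 `o365_management_activity` Operation=UserLoginFailed record_type=AzureActiveDirectoryStsLogon app=AzureActiveDirectory
| where NOT (cidrmatch("203.0.113.0/25", src_ip) OR cidrmatch("198.51.100.0/24", src_ip))
| stats count dc(user) as accounts_locked values(user) as user values(LogonError) as LogonError values(authentication_method) as authentication_method values(signature) as signature values(UserAgent) as UserAgent by src_ip record_type Operation app
| search accounts_locked >= 10
| `high_number_of_login_failures_from_a_single_source_filter`

For many ranges, a lookup table of CIDRs with match_type = CIDR(src_ip) in transforms.conf scales better than hard-coding them in the search.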
I am trying to get the KV store status of one of my heavy forwarders using the command:

/opt/splunk/bin/splunk show kvstore-status

and it returns:

This command [GET /services/kvstore/status] needs splunkd to be up, and splunkd is down.
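The error is literal: splunk show kvstore-status calls the local REST API, so splunkd has to be running first. A quick sequence to confirm and recover, using the path from the question:

/opt/splunk/bin/splunk status
/opt/splunk/bin/splunk start
/opt/splunk/bin/splunk show kvstore-status

If splunkd refuses to start, the tail of /opt/splunk/var/log/splunk/splunkd.log usually says why.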
I created Python scripts using the Splunk SDK. They worked fine on my laptop, but when I try to use them on the Splunk Windows machine I get this error:

DeprecationWarning: ResultsReader is a deprecated function. Use the JSONResultsReader function instead in conjunction with the 'output_mode' query param set to 'json'
reader = results.ResultsReader(query_results)
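The warning itself names the fix: request JSON output and switch to JSONResultsReader. A minimal sketch against the current splunk-sdk API, where the connection details are placeholders:

import splunklib.client as client
import splunklib.results as results

# placeholder connection details
service = client.connect(host="localhost", port=8089,
                         username="admin", password="changeme")

# ask for JSON so JSONResultsReader can parse the stream
query_results = service.jobs.oneshot("search index=_internal | head 5",
                                     output_mode="json")

for item in results.JSONResultsReader(query_results):
    if isinstance(item, results.Message):
        # diagnostic messages emitted by the search
        print(f"{item.type}: {item.message}")
    else:
        # a dict of field -> value for one result row
        print(item)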
I'm trying to download the "Cisco Secure eStreamer Client Add-On for Splunk" app, but when I do, instead of version 5.0.3 (suitable for Splunk 8.1 and 8.2) I get version 3.5.2 (suitable for Splunk 7.0 and 7.1). Is this a Splunk issue or an issue on the uploader's side?
Hi all, is there a way, once you've built a query that returns the hits you want, to also list the next x events that follow each hit? For example:

index=*_*_windows EventCode=4688 source=XmlWinEventLog:Security *[redacted]* host=[redacted] *schtasks.exe
| table _time, TargetUserName, host, CommandLine, status

This shows exactly what I need to see, but I also want to know the next 10 events that occurred after each of the results of this query. I hope this makes sense; if it's not clear, don't hesitate to message me for clarification. Many thanks in advance!
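One approach that can work, sketched with the fields from the query above (reverse puts events in time order, and searchmatch flags the hits; performance depends on how many events the widened base search returns):

index=*_*_windows source=XmlWinEventLog:Security host=[redacted]
| reverse
| eval marker=if(searchmatch("EventCode=4688 *schtasks.exe"), 1, 0)
| streamstats sum(marker) as group_id
| streamstats count as pos by group_id
| where group_id > 0 AND pos <= 11
| table _time, TargetUserName, host, CommandLine, status

Each hit starts a new group_id, so pos <= 11 keeps the hit plus the 10 events that follow it (up until the next hit).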
I have a search which results in these events:

user | last_event
user1 | 2021-12-30 08:57:36.77
user2 | 2022-03-12 22:29:52.333
user3 | 2022-03-13 08:02:48.253

I want to plot a chart where the X axis shows the dates and the Y axis shows the user.
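A sketch of one way, assuming last_event is a string in the format shown (the %Q sub-second specifier is a guess for the mixed-precision fractions and may need adjusting):

| eval _time=strptime(last_event, "%Y-%m-%d %H:%M:%S.%Q")
| timechart span=1d count by user

That gives dates on X with one series per user; a plain | table _time user rendered as a chart is the alternative if you want one mark per user rather than counts.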
Hello, I am trying to install the latest version of the Splunk universal forwarder using a Chef cookbook and getting an error. Earlier, with version 6.5.0, once the RPM was installed with rpm -ivh splunkuniversal-xx.rpm, I used to run:

/splunkuniversal/bin/splunk enable boot-start --accept-license --answer-yes

then change the password:

/splunkuniversal/bin/splunk edit user admin -password xxxxx -roles admin -auth admin:xxxxxx
service splunk start

But now with version 8.2.6, after the RPM install, when I try to run the above commands it asks me to create a user. Has the installation process changed, and how can I automate it again via the Chef cookbook so that it does not prompt for user creation? Regards, Dhimanv
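Splunk 7.1 and later require an admin password at first start, and for unattended installs the documented pattern is to seed it before the first launch. A minimal sketch, with the password as a placeholder:

# $SPLUNK_HOME/etc/system/local/user-seed.conf, created before the first start
[user_info]
USERNAME = admin
PASSWORD = <your-password>

Then:

/splunkuniversal/bin/splunk enable boot-start --accept-license --answer-yes --no-prompt

With user-seed.conf in place, the separate edit user step is no longer needed, so the cookbook can drop that command.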