All Topics

We have JSON logs. From the logs below, we need a rex that extracts the failure count mentioned in the log text, e.g. "3 failures". Sample events:

{"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 3 batches with 3 failures.3", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null}
{"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 4 batches with 4 failures.4", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null}
{"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 5 batches with 5 failures.5", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null}
{"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 7 batches with 7 failures.7", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null}
{"attributes": {"type" : "rar_Log__c", "url": "/data/log/v4.0/subject/rar"}, "Application_Id__c": "MOT-Branch", "Category__c": "MOT-Branch", "Comment__c": "Class Name: MOT_Date3DayPurgeBatch - LCT Declined or Not Funded applications deletion completed 10 batches with 10 failures.10", "Contact_Id__c": null, "CreatedById" : 657856MHQA, "CreatedDate": "2022-02-21T16:04:01.000+0000", "Description__c": null}
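A minimal rex sketch for this; the index and sourcetype values are placeholders, and it assumes the "with N failures" phrase is searchable in the raw event:

index=your_index sourcetype=your_json_sourcetype
| rex "with\s+(?<failure_count>\d+)\s+failures"
| stats sum(failure_count) AS total_failures

If the JSON fields are already extracted at search time, the same pattern can be anchored to the extracted field instead: | rex field=Comment__c "with\s+(?<failure_count>\d+)\s+failures"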
Is it possible to perform custom attribute mapping when syncing user attributes using SAML2 authentication? I know we can map external attributes to first_name, last_name, etc. But we have a need to set first_name to a nickname attribute if it exists or has a value and, if not, fall back to the firstname attribute. The configuration doesn't allow us to map two external attributes to the same SOAR user attribute. I wasn't sure if there was a way to script this somewhere or if we are stuck performing this mapping on the IdP?
I have created an add-on with a few input parameters. One of them is a dropdown list box. When I add a data input from within the app created by the add-on, the dropdown shows fine and I can select an item from it. However, when I create the same data input from the Settings -> Data Inputs menu item, the dropdown list box is shown as a textbox. Any ideas on what I might be doing wrong? Thanks in advance.
In the data, there is an array of 5 commit IDs. For some reason, the search is only returning 3 values, and I am not sure why 2 values are missing. I would like a fresh set of eyes to take a look, please.

Query:

index=XXXXX source="http:github-dev-token" eventtype="GitHub::Push" sourcetype="json_ae_git-webhook"
| spath output=commit_id path=commits.id

sourcetype definition [json_ae_git-webhook]:

AUTO_KV_JSON = false
CHARSET = UTF-8
KV_MODE = json
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
TRUNCATE = 100000
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = true

Raw JSON data:

{ "ref":"refs/heads/Dev", "before":"d53e9b3cb6cde4253e05019295a840d394a7bcb0", "after":"34c07bcbf557413cf42b601c1794c87db8c321d1", "commits":[
{ "id":"a5c816a817d06e592d2b70cd8a088d1519f2d720", "tree_id":"15e930e14d4c62aae47a3c02c47eb24c65d11807", "distinct":false, "message":"rrrrrrrrrrrrrrrrrrrrrr", "timestamp":"2024-08-12T12:00:04-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/aaaaaaaaaaaa", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "asdafasdad.json" ] },
{ "id":"a3b3b6f728ccc0eb9113e7db723fbfc4ad220882", "tree_id":"3586aeb0a33dc5e236cb266c948f83ff01320a9a", "distinct":false, "message":"xxxxxxxxxxxxxxxxxxx", "timestamp":"2024-08-12T12:05:40-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/a3b3b6f728ccc0eb9113e7db723fbfc4ad220882", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "sddddddf.json" ] },
{ "id":"bdcd242d6854365ddfeae6b4f86cf7bc1766e028", "tree_id":"8286c537f7dee57395f44875ddb8b2cdb7dd48b2", "distinct":false, "message":"Updating pipeline: pl_gwp_file_landing_check. Adding Sylvan Performance", "timestamp":"2024-08-12T12:06:10-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/bdcd242d6854365ddfeae6b4f86cf7bc1766e028", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "asadwefvdx.json" ] },
{ "id":"108ebd4ff8ae9dd70e669e2ca49e293684d5c37a", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":false, "message":"asdrwerwq", "timestamp":"2024-08-12T10:09:33-07:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/108ebd4ff8ae9dd70e669e2ca49e293684d5c37a", "author":{ "name":"dfsd", "email":"l.llllllllllll@aaaaaa.com", "username":"aaaaaa" }, "committer":{ "name":"lllllllllllll", "email":"l.llllllllllll@abc.com", "username":"aaaaaa" }, "added":[ ], "removed":[ ], "modified":[ "A.json", "A.json", "A.json" ] },
{ "id":"34c07bcbf557413cf42b601c1794c87db8c321d1", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":true, "message":"asadasd", "timestamp":"2024-08-12T13:32:45-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/34c07bcbf557413cf42b601c1794c87db8c321d1", "author":{ "name":"aaaaaa aaaaaa", "email":"101218171+aaaaaa@users.noreply.github.com", "username":"aaaaaa" }, "committer":{ "name":"GitasdjwqaikHubasdqw", "email":"noreply@gitskcaskadahuqwdqbqwdqaw.com", "username":"wdkcszjkcsebwdqwdfqwdawsldqodqw" }, "added":[ ], "removed":[ ], "modified":[ "a.json", "A1.json", "A1.json" ] }
], "head_commit":{ "id":"34c07bcbf557413cf42b601c1794c87db8c321d1", "tree_id":"5a6d71393611718b8576f8a63cdd34ce619f17dd", "distinct":true, "message":"sadwad from xxxxxxxxxxxxxxx/IH-5942-Pipeline-Change\n\nIh 5asdsazdapeline change", "timestamp":"2024-08-12T13:32:45-05:00", "url":"https://github.com/xxxxxxxxxxxxxxx/AzureWorkload_A00008/commit/3weweeeeeeeee, "author":{ "name":"askjas", "email":"101218171+asfsfgwsrsd@users.noreply.github.com", "username":"asdwasdcqwasfdc-qwgbhvcfawdqxaiwdaszxc" }, "committer":{ "name":"GsdzvcweditHuscwsab", "email":"noreply@gitasdcwedhub.com", "username":"wefczeb-fwefvdszlow" }, "added":[ ], "removed":[ ], "modified":[ "zzzzzzz.json", "Azzzzz.json", "zzzz.json" ] } }
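A sketch worth trying (the index and source filters are copied from the question): spath addresses JSON array elements with {} notation, and since TRUNCATE is in play a cut-off event would also drop trailing array entries, so checking the full event survived truncation is useful too.

index=XXXXX source="http:github-dev-token" eventtype="GitHub::Push" sourcetype="json_ae_git-webhook"
| spath output=commit_id path=commits{}.id
| mvexpand commit_id
| table commit_id

With commits{}.id, commit_id should come back as a multivalue field with one value per array element; mvexpand then produces one row per commit ID.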
Hi, we have a custom Python service being monitored by APM using the OpenTelemetry agent. We have been successful in tracing spans related to our unsupported database driver (clickhouse-driver), but are wondering if there is some tag we can use to get APM to recognize these calls as database calls for the purposes of the "Database Query Performance" screen. I had hoped we could just fill out a bunch of the `db.*` semantic conventions, but none have so far worked to get it to show as a database call (though the instrumented data do show up in the span details). Any tips?
Is there a way to get a list of valid keys for a stanza? For example, you get "Invalid key in stanza" for something like:

[file_integrity]
exclude = /file/path

It doesn't like "exclude", but is there an alternative key that accomplishes the same thing? Thanks in advance!
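A hedged pointer rather than a definitive answer: Splunk ships .conf.spec files that document the valid keys per stanza, and btool can validate what is actually loaded. For example (paths assume a default install; substitute the .conf file in question):

# List the documented stanzas and keys for a given .conf file
less $SPLUNK_HOME/etc/system/README/<conffile>.conf.spec

# Report typos and invalid keys across all loaded configuration
$SPLUNK_HOME/bin/splunk btool check --debug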
I'm trying to achieve the following output using the table command, but am hitting a snag.

Vision ID | Transactions | Good | % Good | Fair | % Fair | Unacceptable | % Unacceptable | Average Response Time | Report Date
ABC STORE (ABCD) | 159666494 | 159564563 | 99.9361601 | 101413 | 0.063515518 | 518 | 0.000324426 | 0.103864001 | Jul-24
Total | 159666494 | 159564563 | 99.9361601 | 101413 | 0.063515518 | 518 | 0.000324426 | 0.103864001 | Jul-24

Thresholds: Good = response <= 1s; Fair = 1s < response <= 3s; Unacceptable = 3s < response.

Here is my broken query:

index=etims_na sourcetype=etims_prod platformId=5 bank_fiid = ABCD
| eval response_time=round(if(strftime(_time,"%Z") == "EDT",((j_timestamp-entry_timestamp)-14400000000)/1000000,((j_timestamp-entry_timestamp)-14400000000)/1000000-3600),3)
| stats count AS Total count(eval(response_time<=1)) AS "Good" count(eval(response_time<=2)) AS "Fair" count(eval(response_time>2)) AS "Unacceptable" avg(response_time) AS "Average" BY Vision_ID
| eval %Good= round((Good/total)*100,2), %Fair = round((Fair/total)*100,2), %Unacceptable = round((Unacceptable/total)*100,2)
| addinfo
| eval "Report Date"=strftime(info_min_time, "%m/%Y")
| table "Vision_ID", "Transactions", "Good", "%Good" "Fair", "%Fair", "Unacceptable", "%Unacceptable", "Average", "Report Date"

The help is always appreciated. Thanks!
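For comparison, a corrected sketch assuming the thresholds in the table above are what is wanted. The original mixes Total and total (field names are case sensitive), never produces the Transactions field the final table asks for, uses <=2 / >2 cutoffs that disagree with the stated 1s/3s thresholds, lets Fair double-count Good events, and is missing a comma between "%Good" and "Fair":

index=etims_na sourcetype=etims_prod platformId=5 bank_fiid=ABCD
| eval response_time=round(if(strftime(_time,"%Z")=="EDT", ((j_timestamp-entry_timestamp)-14400000000)/1000000, ((j_timestamp-entry_timestamp)-14400000000)/1000000-3600),3)
| stats count AS Transactions, count(eval(response_time<=1)) AS Good, count(eval(response_time>1 AND response_time<=3)) AS Fair, count(eval(response_time>3)) AS Unacceptable, avg(response_time) AS Average BY Vision_ID
| eval "%Good"=round((Good/Transactions)*100,2), "%Fair"=round((Fair/Transactions)*100,2), "%Unacceptable"=round((Unacceptable/Transactions)*100,2)
| addinfo
| eval "Report Date"=strftime(info_min_time, "%m/%Y")
| table Vision_ID, Transactions, Good, "%Good", Fair, "%Fair", Unacceptable, "%Unacceptable", Average, "Report Date"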
Hello, I need to download Splunk Enterprise 7.2.* in order to upgrade from version 6.6. Where can I find the older versions? Thank you.
Looking to add a tooltip string of site names included in the same lookup file as the long/lat on a cluster map. Is this even possible?
Hello, as per https://docs.splunk.com/Documentation/Splunk/9.3.0/Forwarding/EnableforwardingonaSplunkEnterpriseinstance, where are files like outputs.conf, props.conf and transforms.conf stored? I am using Splunk Enterprise via Splunk Web. Also, where is my $SPLUNK_HOME? I am trying to set up heavy forwarding to send indexed data to a database on a schedule. Thanks.
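For orientation, a sketch of the usual locations, assuming a default install (the app name is a placeholder): $SPLUNK_HOME is the installation directory, typically /opt/splunk on Linux and C:\Program Files\Splunk on Windows, and the .conf files live under it:

$SPLUNK_HOME/etc/system/local/outputs.conf
$SPLUNK_HOME/etc/apps/<your_app>/local/props.conf
$SPLUNK_HOME/etc/apps/<your_app>/local/transforms.conf

A minimal outputs.conf stanza for forwarding to an indexer looks like this (host and port are placeholders):

[tcpout:my_indexers]
server = indexer.example.com:9997

Settings changed through Splunk Web are written into these local directories on disk.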
I'm using the Splunk TA for Linux (Splunk_TA_nix) to collect server logs. Some background: looking in the "_internal" log, I am seeing a lot of these errors:

08-23-2024 15:52:39.910 +0200 WARN DateParserVerbose [6460 merging_0] - A possible timestamp match (Wed Aug 19 15:39:00 2015) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source=lastlog|host=<hostname>|lastlog|13275
08-23-2024 15:52:39.646 +0200 WARN DateParserVerbose [6460 merging_0] - A possible timestamp match (Fri Aug 7 09:08:00 2009) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source=lastlog|host=<hostname>|lastlog|13418
08-23-2024 15:52:32.378 +0200 WARN DateParserVerbose [6506 merging_1] - A possible timestamp match (Fri Aug 7 09:09:00 2009) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source=lastlog|host=<hostname>|lastlog|13338

This is slightly confusing and somewhat problematic, as "lastlog" is collected not through a file monitor but from scripted output. The "lastlog" file itself is not collected/read, and a stat check on the file confirms accurate dates, so this is not the source of the problem. I cannot see anything in the output from the commands in the script (Splunk_TA_nix/bin/lastlog.sh) which would indicate the presence of a "year"/timestamp. The indexed log does not contain a "year", and the actual _time timestamp is correct. These "years" in "_internal" are also from a time when the server was not running/present, so they are not collected from any actual source "on the server".

And the questions:
- Why am I seeing these errors?
- Where are these problematic "timestamps" generated?
- How do I fix the issue?

All the best
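Not a definitive diagnosis, but one common mitigation sketch: if old date-like strings in the scripted lastlog output are being picked up by the timestamp processor, forcing the current time for that sourcetype sidesteps the parsing entirely. This assumes the sourcetype is lastlog and goes in props.conf on the first full Splunk instance that parses the data:

[lastlog]
DATETIME_CONFIG = CURRENT

Alternatively, widening MAX_DAYS_AGO in the same stanza keeps timestamp parsing but accepts older dates, as the warning itself suggests.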
Hello, I want to create a dataset for machine learning. I want the KPI name and Service Health Score as field names and their values as values for the last 14 days. How do I retrieve the kpi_value and health_score values? Are they stored somewhere in an ITSI index? I cannot find a kpi_value field in index=itsi_summary.

Also, if you have done machine learning / predictive analytics in your environment, please suggest an approach.

Tags: #predictive-analytics #machine-learning #splunk-itsi #splunk-machine-learning-toolkit
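A hedged starting point, assuming a default ITSI deployment: KPI values are typically written to the itsi_summary index in a field named alert_value (not kpi_value), with the KPI name in kpi, and the service health score is usually recorded as a KPI named ServiceHealthScore. Field names can vary across ITSI versions, so treat these as assumptions to verify against your own events:

index=itsi_summary earliest=-14d kpi="ServiceHealthScore"
| timechart span=1h avg(alert_value) AS health_score

Swapping the kpi filter for the KPI you care about gives the matching value series over the same window.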
What do we use for the Base URL when configuring the app's Add-on Settings? Should this be left as slack.com/api, the default?
I am facing an error while running the datamodel below: The search job has failed due to err='Error in 'SearchParser': The search specifies a macro 'isilon_index' that cannot be found.'
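A sketch of one common cause and fix, assuming the macro simply is not defined or not shared: the data model expects a search macro named isilon_index, which can be created in Settings > Advanced search > Search macros, or in macros.conf (the index name here is a placeholder):

[isilon_index]
definition = index=isilon

The macro also needs to be shared globally, or at least to the app running the data model, for the search to resolve it.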
Why can't I open the Support Portal page? I am having trouble referencing a case.
Hi Team, we can see latency in our logs. Log ingestion is via syslog: Network devices --> Syslog server --> Splunk. Using the query below, we see a minimum of 10 minutes to a maximum of 60 minutes of log latency:

index="ABC" sourcetype="syslog" source="/syslog*"
| eval indextime=strftime(_indextime,"%c")
| table _raw _time indextime

What should our next steps be to check where the latency is and how to fix it?
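One way to quantify the lag and narrow down where it occurs, as a sketch on the same search (the host/source split is just one useful grouping):

index="ABC" sourcetype="syslog" source="/syslog*"
| eval latency=_indextime-_time
| stats avg(latency) AS avg_lag_s, max(latency) AS max_lag_s BY host, source

If the lag clusters by host, the delay is likely upstream (device clocks or the syslog relay); if it is uniform across hosts, the forwarder or indexing tier is the more likely place to look.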
This is inputs.conf:

[monitor://D:\temp\zkstats*.json]
crcSalt = <SOURCE>
disabled = false
followTail = 0
index = abc
sourcetype = zk_stats

props.conf:

[zk_stats]
KV_MODE = json
INDEXED_EXTRACTIONS = json

However, my search (index=abc sourcetype=zk_stats) is not getting new events. That is, when a new file such as zkstats20240824_0700 comes in, it is not indexed.
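Not necessarily the cause of the missing files, but one known issue in this props.conf worth fixing as a sketch: INDEXED_EXTRACTIONS = json and KV_MODE = json together typically produce duplicate field extractions, and INDEXED_EXTRACTIONS must be deployed on the instance that first reads the file. A common pairing is:

[zk_stats]
INDEXED_EXTRACTIONS = json
KV_MODE = none

For the new files themselves, checking how the monitor input sees them can help; for example, on the instance doing the monitoring:

$SPLUNK_HOME/bin/splunk list inputstatus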
Hi, I am currently dealing with some logs being forwarded via syslog to a third-party system. The question is whether there is an option to prevent Splunk from adding an additional header to each message before it is forwarded. There should be a way to disable the additional syslog header when forwarding, so that the third-party system receives the original message with the header removed. Any ideas? Can you give me a practical example? I am trying to test by modifying outputs.conf. Thanks, Giulia
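One approach that is often used for this, sketched under the assumption that the third-party system can accept a raw TCP feed: route the events through a tcpout group with cooked data disabled, so Splunk sends the raw event text without its own wrapping (the server value is a placeholder):

[tcpout]
defaultGroup = thirdparty_raw

[tcpout:thirdparty_raw]
server = thirdparty.example.com:5140
sendCookedData = false

Whether this fully removes a syslog-style header depends on where the header was added; if the upstream syslog server already wrote it into the raw event, a props/transforms rewrite would be needed instead.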
When I upgrade the ITSI app to 4.18.1, the Services option in the Configuration dropdown is missing. Reference screenshot:
Hi Team, we are currently using Python 3.9.0 for Splunk app development. Is that OK, or can you suggest a better version of Python for developing a Splunk app? Thanks, Alankrit