
The structure of JSON in my log events is roughly as follows     { "Info": { "Apps": { "ReportingServices": { "ReportTags": [ "Tag1"... See more...
The structure of the JSON in my log events is roughly as follows:

{
  "Info": {
    "Apps": {
      "ReportingServices": {
        "ReportTags": [ "Tag1" ],
        "UserTags": [ "Tag2", "Tag3" ]
      },
      "MessageQueue": {
        "ReportTags": [ "Tag1", "Tag4" ],
        "UserTags": [ "Tag3", "Tag4", "Tag5" ]
      },
      "Frontend": {
        "ClientTags": [ "Tag12", "Tag47" ]
      }
    }
  }
}

The number of fields in "Apps" is unknown, as are their names. Given this structure, I need to check whether a given tag ("Tag1", "Tag2", ...) exists in a given array ("ReportTags", "UserTags", ...), regardless of parent. If it does, I need the distinct names of the parent fields that contain it.

Example 1: the input to the query is "ReportTags" and "Tag1". I'd expect it to output both "ReportingServices" and "MessageQueue", because both of them contain a "ReportTags" array that contains "Tag1".

Example 2: the input to the query is "UserTags" and "Tag5". I'd expect it to output only "MessageQueue", because only that one contains a "UserTags" array containing "Tag5".

I have looked at various questions on this forum and tried various combinations of mvexpand and such, but I have not been able to write a query that does exactly this. Any hints and/or help would be greatly appreciated.
Suddenly, a real-time alert has stopped working in Splunk. Can anyone help with how to troubleshoot this issue?
What would be the best approach for an IaC setup for Splunk Enterprise?

Currently we are using an Azure VM with a .deb installation of Splunk Enterprise; installation and updates are done manually. We would like to improve this process so we can install Splunk from scratch with a better setup:

1. install it in an AKS cluster
2. use a Docker-based approach
3. use Terraform for IaC

What would be the best-suggested approach here? We have a disk of around 1 TB of data for Splunk.

Thank you in advance.
Hi, I have a search which populates results with email addresses for 1000+ users. I need to send ONLY the results tagged to the appropriate user via email. I have tried a couple of solutions from the community, but they didn't help me. I want to combine all the results associated with an individual user and send them in one single email, since the data can be large and I don't want to spam their inboxes. For example: results 4, 5, and 6 should be sent in one email only to malik@gmail.com, and so on for the other users. Please suggest.
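One pattern that may help (hedged: `user_email` and `message` are placeholder field names standing in for whatever your search produces): collapse all of a user's results into a single row, then let the alert fire once per row.

```spl
index=my_index sourcetype=my_results
| stats list(message) AS messages BY user_email
| eval body=mvjoin(messages, "; ")
| table user_email body
```

In the alert's email action, set the trigger to "For each result", the To field to $result.user_email$, and include $result.body$ in the message; with one row per user, each user receives exactly one email containing all of their results.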
Hi, we use the Splunk Add-on for Microsoft Cloud Services version 5.3.1 on our heavy forwarder. We ingest data from an event hub which is split into a lot of event hub names for different Microsoft services (e.g. SharePoint, Exchange, etc.). The default sourcetype is "mscs:azure:eventhub", but the data isn't parsed with it. In some forums it was mentioned to use the sourcetype "ms:o365:management". Has anyone else had trouble finding the correct sourcetype? The app itself has a lot of configuration in props/transforms. Thanks.
my query is we have used timechart count by clause in the splunk query. we need to compare the dynamic field values. Query :- index=sample sample="value1" | timechart count by field1 It returns so... See more...
My query: we have used a timechart count by clause in the Splunk query, and we need to compare the dynamic field values.

Query: index=sample sample="value1" | timechart count by field1

It returns results like this:

_time                  output1  output2
2024-11-13 04:00:00    8        30
2024-11-13 04:01:00    8        30

My question: I need to compare output1 and output2, e.g. flag when output1 is more than 30% of output2 over a 10-minute interval.
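A possible sketch, assuming `output1`/`output2` are the column names the timechart actually produces (they come from the values of field1, so adjust to your data):

```spl
index=sample sample="value1"
| timechart span=10m count BY field1
| eval pct_of_output2=if(output2 > 0, round(output1 / output2 * 100, 1), null())
| eval breach=if(pct_of_output2 > 30, "output1 exceeds 30% of output2", "ok")
```

`span=10m` makes each row a 10-minute bucket, so the comparison is evaluated once per interval.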
How do I filter events in a dashboard with the help of a search box? In the search box I want to enter multiple strings, like "error,warning", and then see only the error and warning logs.

In the dashboard XML:

<input type="text" token="Text_Token" searchWhenChanged="true">
<label>Error Search (comma-separated)</label>
</input>

index=test Message="*"
| eval error_list=split("$Text_Token$", ",")
| table PST_Time Environment Host Component FileName Message
| search Message IN ("error_list") OR Environment=QDEV Component IN (AdminServer) FileName=*
| search NOT Message IN ("*null*")
| sort PST_Time
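Note that `search Message IN ("error_list")` treats the string "error_list" literally rather than expanding the multivalue field. One hedged alternative (requires `mvmap`, available in Splunk 8.0 and later) is to test each comma-separated term against `Message` explicitly:

```spl
index=test Message="*"
| eval error_list=split(lower("$Text_Token$"), ",")
| eval hit=mvmap(error_list, if(match(lower(Message), trim(error_list)), "y", null()))
| where mvcount(hit) > 0
| table PST_Time Environment Host Component FileName Message
| sort PST_Time
```

Each term is applied as a case-insensitive regex; rows with no matching term produce a null `hit` field and are dropped by the `where` clause.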
May I know where I can get the Splunk Enterprise REST API OpenAPI Specification (OAS) JSON file? Thanks.
I am trying to create a dashboard. It has two text input fields, and I want to run a search query based on these two inputs:

If input A is null AND input B is null, then no search results.
If input A is not null AND input B is null, then search using only A.
If input A is null AND input B is not null, then search using only B.
If input A is not null AND input B is not null, then search using both A and B.

The following is my query. It returns no results:

Properties.application="xyz.api"
| spath Level
| search Level!=Verbose AND Level!=Debug
| eval search_condition_fnum=if(len(trim("$text_fnum$"))=0 OR isnull("$text_fnum$"), "", "RenderedMessage=\"*$text_fnum$*\"")
| eval search_condition_fdate=if(len(trim("$text_fdate$"))=0 OR isnull("$text_fdate$"), "", "RenderedMessage=\"*$text_fdate$*\"")
| eval combined_search_condition=mvjoin(mvfilter(search_condition_fnum!="") + mvfilter(search_condition_fdate!=""), " OR ")
| table search_condition_fnum, search_condition_fdate, combined_search_condition
| search [| makeresults | eval search_condition=mvjoin(mvfilter(search_condition_fnum!="") + mvfilter(search_condition_fdate!=""), " OR ") | fields search_condition]
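Because dashboard tokens are substituted as literal text before the search runs, the four cases can often be expressed directly in a `where` clause rather than by building a search string. A sketch under that assumption (`my_index` is a placeholder; both terms are ANDed when both inputs are filled, per the fourth case):

```spl
index=my_index Properties.application="xyz.api"
| spath Level
| search Level!=Verbose Level!=Debug
| where (len(trim("$text_fnum$")) > 0 OR len(trim("$text_fdate$")) > 0)
    AND (len(trim("$text_fnum$")) = 0 OR like(RenderedMessage, "%$text_fnum$%"))
    AND (len(trim("$text_fdate$")) = 0 OR like(RenderedMessage, "%$text_fdate$%"))
```

When both tokens are empty the first condition is false and no rows are returned; when only one is filled, the other's clause short-circuits to true.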
As of Splunk Cloud Platform 9.3.2408 and Splunk Enterprise 9.4, classic dashboard export features are now deprecated. Use Dashboard Studio for dashboard exports going forward. Check out this Lantern article to learn more.
Hi Splunkers, any help with Rex has exceeded configured match_limit, consider raising the value in limits.conf. My search looks like this: | index=abc index=def process=jkl | rex field=_raw ";(?<h... See more...
Hi Splunkers, any help with "Rex has exceeded configured match_limit, consider raising the value in limits.conf"? My search looks like this:

index=abc OR index=def process=jkl
| rex field=_raw ";(?<h_db_host>\w+);(?<h_instance_name>\w+);\d+;\d+;(?<h_db_name>\w+);(?<user_computer_ip>\d{1,3}(?:\.\d{1,3}){3})?;(?<user_computer_name>[^;]*)?;[-\d]+;[-\d]+;(?<audit_policy_name>[^;]+);(?<audit_policy_severity>\w+);(?<user_activity>[^;]+);(SUCCESSFUL|UNSUCCESSFUL);(?<activity_details>[^;]+);(?<application_username>[^;]*)?;{5}(?<db_user_id>\w+)?;(?<user_application>[^;]+)?;(?<db_schema>\w+)?;"
| rex field=user_activity "(?<user_activity_event>.+?)\;"
| fillnull value="null"
| search h_db_name IN("srp1", "brp1") audit_policy_severity="CRITICAL" db_user_id=SYSTEM
| table _time, env, host, h_db_host, h_instance_name, h_db_name, user_computer_ip user_computer_name audit_policy_name audit_policy_severity user_activity_event

Any help will be appreciated.
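One way to sidestep the match_limit issue entirely (a sketch, not a drop-in replacement: the `mvindex` positions are my guess from the original regex and need checking against real events) is to split the semicolon-delimited raw text instead of running one large regex:

```spl
index=abc process=jkl
| eval parts=split(_raw, ";")
| eval h_db_host=mvindex(parts, 1), h_instance_name=mvindex(parts, 2), h_db_name=mvindex(parts, 5)
| search h_db_name IN ("srp1", "brp1")
```

`split` does no backtracking, so it cannot hit match_limit regardless of event length; the remaining fields can be pulled out with further `mvindex` calls once the positions are confirmed.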
At Splunk Education, we are committed to providing a robust learning experience for all users, regardless of skill level or learning preference. Whether you’re just starting your journey with Splunk or sharpening advanced skills, our broad range of educational resources ensures you’re prepared for every step.

Our Portfolio

We offer Free eLearning to kickstart your learning, eLearning with Labs for hands-on practice, Instructor-led courses for interactive, expert guidance, and Splunk Certifications to validate your expertise. For quick tips and insights, explore our Splunk YouTube How-Tos and Splunk Lantern, where you'll find up-to-date guidance and best practices that reflect the latest in Splunk's capabilities.

New Courses Available

Every month, we release new courses designed to empower learners with the tools and knowledge they need to stay ahead in the evolving tech landscape. Whether you prefer self-paced eLearning or the structure of live instruction, there’s a course to fit your style. This month, we are excited to announce a new instructor-led course, a new eLearning with Labs course, and three free eLearning courses to help you advance your Splunk skills:

- SOC Essentials: Investigating and Threat Hunting – Instructor-led course (Enroll)
- SOC Essentials: Investigating with Splunk – eLearning with labs (Enroll)
- Creating Classic Dashboards – Free eLearning (Enroll)
- SOC Essentials: Investigating with Splunk – Free eLearning (Enroll)
- Administering Splunk Observability Cloud – Free eLearning (Enroll)

These courses provide targeted insights into security operations and observability, essential for anyone looking to enhance their data-driven capabilities. Explore them today to stay ahead in your field! All courses are available through the Splunk Course Catalog, accessible via our banner or directly on our platform.
Expanding Global Learning Access  As part of our commitment to accessibility and inclusion, we continue to translate eLearning courses into multiple languages and add non-English captions. This effort ensures that learners worldwide can grow their Splunk expertise in their preferred language, supporting our vision of an inclusive educational ecosystem. Each month presents new opportunities to expand your knowledge, boost your career, and enhance your contributions to enterprise resilience. Stay updated with the latest courses and continue your journey toward Splunk mastery – your next big career move could be just a course away. See you next month!  - Callie Skokos on behalf of the Splunk Education Crew
Currently trying to get eval to give multiple returns     | eval mitre_category="persistence,Defense_Evasion" | eval apt="apt1,apt2,apt3"   I would like the values to be listed as OR. that way i... See more...
I'm currently trying to get eval to give multiple return values:

| eval mitre_category="persistence,Defense_Evasion"
| eval apt="apt1,apt2,apt3"

I would like the values to be treated as ORs, so that I match `apt2` OR `apt3` instead of searching for the literal string `apt1,apt2,apt3`. I would like to know if there is a way to do this in one query instead of several, if at all possible.
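A hedged sketch (`my_index` is a placeholder): `split` turns the comma-separated string into a multivalue field, and a comparison against a multivalue field matches if any single value matches, which gives the OR behavior in one query:

```spl
index=my_index
| eval apt=split("apt1,apt2,apt3", ",")
| eval mitre_category=split("persistence,Defense_Evasion", ",")
| search apt="apt2" OR apt="apt3"
```

Here `apt="apt2"` is true for any event whose apt field contains "apt2" among its values, without matching the joined string.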
Hi Splunkers, as per the thread title, I need to build one or more searches that show me, for a specific app, all alerts, reports, and dashboards owned by that app. I know very well that the community is full of topics on this problem with related answers. The issue is that none of them works properly in my case: when I run the search and specify the app, I get "mixed" results, i.e. output composed of alerts owned by the app I'm searching for, but also others.

Let me be more specific. For this kind of search, the base string is:

| rest splunk_server=local /servicesNS/-/-/saved/searches | table title

which means: return all saved searches for all apps on the local Splunk server (a search head, in my case). If I execute the search above, I get roughly 450 results. What if I need to filter? Very simple:

| rest splunk_server=local /servicesNS/-/<app name here>/saved/searches | table title

That should return all and only the saved searches for the requested app (a custom one in my case). Problem: the app I need info on has 119 saved searches (checked in the GUI on the related page), while the query above returns a total of 256; analyzing the output, it returns searches owned by other apps. Of course, I have already performed the obvious check: am I sure that the searches in the output belong to different apps and are not all for the one I'm searching for? Yes, I checked, and the output also contains Enterprise Security searches, so the search is definitely returning more data than I need.

So my question is: what can be the root cause of this behavior, if the searches' ownership is correct?
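One likely explanation: the `/servicesNS/-/<app>/` path scopes by what is *visible* from that app context, which includes objects shared globally by other apps (such as Enterprise Security), not just objects the app owns. Filtering on the `eai:acl.app` field returned by the REST endpoint isolates the actual owning app. A sketch, with `my_custom_app` as a placeholder:

```spl
| rest splunk_server=local /servicesNS/-/-/saved/searches count=0
| search eai:acl.app="my_custom_app"
| table title eai:acl.app eai:acl.owner
```

`count=0` removes the default result cap so all 450-odd entries are returned before filtering.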
Hello Splunkers, I have a requirement where I need to show values in the statistics even if they don't exist. For example, here's my search:

index=brandprotection name IN (ali, ahmad, elias, moayad) | stats count by name

However, sometimes the names elias and moayad aren't in the logs, but I still need them in the table, so I need the output to look like this:

user    count
ahmad   7
ali     4
elias   0
moayad  0

I need a search that would show the results like the table above. Thanks.
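A common pattern for this (a sketch using the field names from the question) is to append a zero-count row for every expected name and then sum, so names with no events still appear with count 0:

```spl
index=brandprotection name IN (ali, ahmad, elias, moayad)
| stats count BY name
| append
    [| makeresults
     | eval name=split("ali,ahmad,elias,moayad", ",")
     | mvexpand name
     | eval count=0
     | table name count]
| stats sum(count) AS count BY name
```

Names present in the data get their real count plus zero; absent names get only the appended zero row.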
I have an index with 7 sources, of which I use 4. The alert outputs data to a lookup file as its alert action, and is written something like this:

index=my_index source=source1 OR source=source2 OR source=source3 OR source=source4

followed by stats, eval, table commands, etc.

I want to configure the alert to trigger only when all four sources are present. I tried doing this, but the alert isn't triggering even when all 4 sources are present. Please help me with how to configure this.
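One hedged approach: count the distinct sources with `eventstats` and drop all events when fewer than four are present, so the rest of the pipeline produces no rows and the alert has nothing to trigger on:

```spl
index=my_index (source=source1 OR source=source2 OR source=source3 OR source=source4)
| eventstats dc(source) AS sources_present
| where sources_present = 4
```

Your existing stats/eval/table commands would follow the `where` clause unchanged; `eventstats` adds the distinct count to every event without collapsing the results the way `stats` would.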
Hi Team, I'm trying to set a customized event timestamp by extracting it from the raw data instead of using the current time as the event time. To achieve this I created a sourcetype with the following settings from the Splunk Web GUI after testing in a lower environment, but in production it is not functioning as expected. Raw data:

2024-11-18 09:20:10.187, STAGE_INV_TXNS_ID="xxxxxxxxx", LOC="xxxxxxx", STORE_NAME="xxxxxxx", STORE_PCODE="xxxxxxxxx", TRAN_CODE="xxxx", TRANS_TYPE="xxxxxxx", TRAN_DATE_TIME="2024-11-18 09:09:27", LAST_UPDATE_USER="xxxxxx"
2024-11-18 09:20:10.187, STAGE_INV_TXNS_ID="xxxxxxxxx", LOC="xxxxxxx", STORE_NAME="xxxxxxx", STORE_PCODE="xxxxxxxxx", TRAN_CODE="xxxx", TRANS_TYPE="xxxxxxx", TRAN_DATE_TIME="2024-11-18 09:09:27", LAST_UPDATE_USER="xxxxxx"
2024-11-18 09:20:10.187, STAGE_INV_TXNS_ID="xxxxxxxxx", LOC="xxxxxxx", STORE_NAME="xxxxxxx", STORE_PCODE="xxxxxxxxx", TRAN_CODE="xxxx", TRANS_TYPE="xxxxxxx", TRAN_DATE_TIME="2024-11-18 09:09:28", LAST_UPDATE_USER="xxxxxxx"
2024-11-18 09:20:10.187, STAGE_INV_TXNS_ID="xxxxxxxxx", LOC="xxxxxxx", STORE_NAME="xxxxxxx", STORE_PCODE="xxxxxxxxx", TRAN_CODE="xxxx", TRANS_TYPE="xxxxxxx", TRAN_DATE_TIME="2024-11-18 09:09:30", LAST_UPDATE_USER="xxxxx"

I want the timestamp in the TRAN_DATE_TIME field to be the event timestamp. This data is pulled from a database using DB Connect. Could you please help us understand what's going wrong and how it can be corrected?
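For reference, a props.conf sketch that points timestamp recognition past the leading timestamp and at the TRAN_DATE_TIME value (the stanza name is a placeholder for your sourcetype, and the settings must be in effect on the first instance that parses the data; DB Connect inputs can alternatively designate a timestamp column directly in the input definition):

```ini
[my_dbconnect_sourcetype]
TIME_PREFIX = TRAN_DATE_TIME="
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 19
```

`TIME_PREFIX` is a regex that must match immediately before the timestamp, so without it Splunk keeps grabbing the leading 2024-11-18 09:20:10.187 value instead.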
Dear Splunkers, while tuning Splunk Enterprise, we are required to change every connection between Splunk instances from IP address to domain name. Everything in server.conf is done except this. So, is it possible to change these peer URIs from IP address to domain name, and where can we find this configuration? Thanks & best regards, Benny On
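If "peers" here means distributed search peers, they are listed in distsearch.conf on the search head and can be edited there (or under Settings > Distributed search in the UI). A sketch with placeholder hostnames:

```ini
[distributedSearch]
servers = https://idx1.example.com:8089,https://idx2.example.com:8089
```

Each entry is the peer's management URI, so swapping the IP for a resolvable domain name here changes how the search head connects.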
I want to import AD audit logs into Splunk, but I don't know how. The important thing is that I want to start from the oldest logs, not only from now on.
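Assuming the AD audit events live in the Windows Security event log and a universal forwarder runs on the domain controller (both assumptions), an inputs.conf sketch that ingests the historical backlog rather than only new events (the index name is a placeholder):

```ini
[WinEventLog://Security]
disabled = 0
start_from = oldest
current_only = 0
index = my_windows_index
```

`start_from = oldest` together with `current_only = 0` tells the input to read the existing log from the beginning and then keep tailing new events.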
Background: the designed Windows log flow is Splunk universal forwarder agent -> Splunk heavy forwarder -> Splunk indexer. The paths are monitored with inputs.conf on the universal forwarder, like this:

[monitor://D:\test\*.csv]
disabled=0
index=asr_info
sourcetype=csv
source=asr:report
crcSalt=<SOURCE>

Example content for one of the CSV files:

cn,comment_id,asr_number,created_by,created_date
zhy,15,2024-10-12-1,cc,2024-10-28 18:10
bj,10,2024-09-12-1,cc,2024-09-12 13:55

For the 2 indexed rows, the field extractions are good except _time: for the first row _time is 10/12/24 6:10:00.000 PM, and for the second row _time is 9/12/24 1:55:00.000 PM.

Question: how can I make _time be the real ingestion time instead of a guess from the row content? (I tried DATETIME_CONFIG = CURRENT on both the HF and the indexer in props, like:

[source::asr:report]
DATATIME_CONFIG = CURRENT

but it does not work.)
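Two things worth checking, offered as a hedged sketch rather than a confirmed fix: the setting name in the stanza above is spelled DATATIME_CONFIG, while the documented name is DATETIME_CONFIG; and with a universal forwarder in front, the setting takes effect on the first full instance that parses the data (the heavy forwarder here), followed by a restart:

```ini
# props.conf on the heavy forwarder (the first parsing instance in this flow)
[source::asr:report]
DATETIME_CONFIG = CURRENT
```

A misspelled setting name is silently ignored, which would explain why the previous attempt appeared to have no effect.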