We have a server running Splunk Enterprise 7.3.4. However, I couldn't find a version of Splunk DB Connect compatible with that release on Splunkbase. Could I get a Splunk DB Connect installation file that is compatible with Splunk Enterprise 7.3.4?
Running the search below yields ut_domain as ".com" instead of "somethin.shop". It seems that if the subdomain contains a valid TLD string (e.g. .com), ut_domain is not parsed correctly. The domain "somethingbad.shop" is parsed correctly, since .shop is recognized as a TLD.

| makeresults
| eval domain_full = "something.com.somethin.shop"
| eval list="*"
| `ut_parse(domain_full, list)`

Is this a bug? If so, how can we report it? And is there any workaround you can think of while waiting for a fix?
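A possible interim workaround, assuming the registered domain is always the last two labels (note this assumption breaks for multi-part suffixes such as co.uk), is to fall back to a plain rex instead of the macro:

| makeresults
| eval domain_full = "something.com.somethin.shop"
| rex field=domain_full "(?<ut_domain>[^.]+\.[^.]+)$"

This extracts "somethin.shop" regardless of what appears in the subdomain portion.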
Dear team,

Good day! Hope you are doing well. I need some help understanding a correlation search. The search is as follows:

index=email sourcetype="ironport:summary" action=delivered
| fillnull value="" file_name senderdomain
| rex field=sender "\@(?<senderdomain>[^ ]*)"
| eval list="mozilla"
| `ut_parse_extended(senderdomain,list)`
| stats count first(subject) as subject earliest(_time) as earliest latest(_time) as latest values(file_name) as file_name by ut_domain
| inputlookup append=t previously_seen_domains.csv
| stats sum(count) as No_of_emails values(subject) as subject min(earliest) as earliest max(latest) as latest values(file_name) as file_name by ut_domain
| eval isNew=if(earliest >= relative_time(now(), "-1d@d"), 1, 0)
| where isNew=1 and No_of_emails>=1
| mvcombine file_name delim=" "
| eval temp_file=split(file_name," ")
| rex field="temp_file" "\.(?<ext>[^\.]*$)"
| eventstats values(ext) as extension by ut_domain
| table latest earliest ut_domain No_of_emails subject file_name temp_file extension
| eval _comment="exchange search here"
| join type=outer ut_domain
    [ search index=email sourcetype="MSExchange:2013:MessageTracking" directionality="Incoming" event_id="RECEIVE"
    | stats count by sender_domain
    | fields sender_domain
    | eval list="mozilla"
    | `ut_parse_extended(sender_domain,list)`
    | table ut_domain sender_domain ]
| eval isExchangeFound=if(isnull(sender_domain),"false","true")
| where isExchangeFound="true"
| eval qualifiers=if(No_of_emails>=5, mvappend(qualifiers, "- More Than 5 emails from a previously unseen domain (Possible Spam)."), qualifiers)
| cluster t=0.5 labelonly=1 showcount=0 field=file_name
| eventstats dc(file_name) as similar_attach_count dc(ut_domain) as no_of_domains by cluster_label
| eval qualifiers=if(similar_attach_count>=2 AND match(extension,"(?i)(bat|chm|cmd|cpl|exe|hlp|hta|jar|msi|pif|ps1|reg|scr|vbe|vbs|wsf|lnk|scr|xlsm|dotm|lnk|zip|rar|gz|html|iso|img|one)"), mvappend(qualifiers, "- Suspicious email attachments with similar names, sent from " . no_of_domains . " previously unseen domains. (Qbot Style)"), qualifiers)
| where mvcount(qualifiers)>0
| eval _comment="informational qualifier not counted"
| eval qualifiers=if(match(extension,"(?i)(bat|chm|cmd|cpl|exe|hlp|hta|jar|msi|pif|ps1|reg|scr|vbe|vbs|wsf|lnk|scr|xlsm|dotm|lnk|zip|rar|gz|html|iso|img|one)"), mvappend(qualifiers, "- Email attachment contains a suspicious file extension - " . extension), qualifiers)
| eval cluster_label=if(isnull(cluster_label), ut_domain, cluster_label)
| stats values(subject) as subject values(no_of_domains) as no_of_domains values(severity) as severity values(file_name) as file_name values(ut_domain) as ut_domain values(qualifiers) as qualifiers min(earliest) as start_time max(latest) as end_time sum(No_of_emails) as No_of_emails by cluster_label
| eval sev=if(no_of_domains>1, mvcount(qualifiers) + 1, mvcount(qualifiers))
| eval urgency=case(sev=1,"low", sev=2,"medium", sev>2,"high")
| eval reason=mvappend("Alert qualifiers:", qualifiers)
| eval dd=" index=email sourcetype=ironport:summary sender IN (\"*" . mvjoin(ut_domain, "\", \"*") . "\") | eventstats last(subject) as subject by sender | eventstats last(file_name) as file_name by sender | table _time action sender recipient subject file_name"
| table start_time end_time ut_domain subject No_of_emails file_name reason urgency dd
| `security_content_ctime(start_time)`
| `security_content_ctime(end_time)`
| rename No_of_emails as result
| eval network_segment="ABC"
| search ut_domain=* NOT [ inputlookup domain_whitelist.csv | fields ut_domain ]

The expansion of the macro `ut_parse_extended(senderdomain,list)`:

| lookup ut_parse_extended_lookup url as senderdomain list as list
| spath input=ut_subdomain_parts
| fields - ut_subdomain_parts

We have this search and it works, but it is giving a lot of false positives. Even though a domain is added to the lookup table, we are still getting an alert. I am a SOC analyst and I have tried to understand this query, but it appears to be very difficult. Can someone please help or support me in simplifying it? It would be really helpful. This is the first time I am posting on a community page, so if I missed adding any information, I apologize; do let me know if more info is required and I will be more than happy to furnish it. Appreciate your help and support.
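One simplification that often helps with this class of false positive is to apply the whitelist as early as possible, so whitelisted domains never enter the previously-seen logic at all (the search above only excludes them at the very end, after stats has already aggregated them). A minimal sketch, assuming domain_whitelist.csv has a ut_domain column as in the original search:

index=email sourcetype="ironport:summary" action=delivered
| rex field=sender "\@(?<senderdomain>[^ ]*)"
| eval list="mozilla"
| `ut_parse_extended(senderdomain,list)`
| search NOT [ | inputlookup domain_whitelist.csv | fields ut_domain ]
... rest of the original pipeline ...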
Lookup file `tenants.csv`:

tenant,
tenant1,
tenant2,
tenant3,
tenant4,

Desired query:

index=index1 (tenant1xxx OR tenant2xxx OR tenant3xxx OR tenant4xxx)

I'm having a tough time getting this to work. Using lookup is not working because I am not searching any existing fields. Subsearching with inputlookup is not working at all, and I'm not sure why. In a nutshell, I'm trying to inject each value from the lookup file, with `xxx` appended, as an OR list of raw strings. Any ideas?
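One approach worth trying, as a sketch (assuming tenants.csv has a single tenant column): if a subsearch returns a field literally named search, its values are injected into the outer search as raw, implicitly OR'd terms:

index=index1 [ | inputlookup tenants.csv | where tenant!="" | eval search=tenant . "xxx" | fields search ]

The where clause guards against the blank rows suggested by the trailing commas in the CSV.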
Hello, how do I add spaces to text in a single value visualization? Adding spaces did not have any effect. I was trying to align the text to the left:

| makeresults
| eval test="this is a test                    "
| table test

If I added periods, it worked:

| makeresults
| eval test="this is a test..........................."
| table test

Thank you for your help.
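A possible workaround, since the browser collapses runs of ordinary spaces when rendering: pad with non-breaking spaces (U+00A0) instead, which survive HTML rendering. A sketch using eval's urldecode function:

| makeresults
| eval pad=urldecode("%C2%A0%C2%A0%C2%A0%C2%A0%C2%A0%C2%A0%C2%A0%C2%A0")
| eval test="this is a test" . pad
| table test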
We have a dashboard created with the Classic (XML) dashboard editor which presents a table of alerts and allows the user to open a second dashboard to see the details. The parent dashboard uses the drilldown feature to link to the child dashboard, passing the value of a hidden column in the clicked row to the child dashboard in the URL string. We are porting this to Dashboard Studio and need to know how to link using the value from a hidden column. It seems that the column must be visible for the drilldown link to work.
Hi, I was wondering what features Splunk offers for auditing workload in DB2 for z/OS. We are looking to audit a number of users in DB2 for z/OS using Splunk, and I would like to know what is possible with Splunk and what is not. Thank you.
February 2024 Edition

Hayyy Splunk Education Enthusiasts and the Eternally Curious!

We're back with another edition of indexEducation, the newsletter that takes an untraditional twist on what's new with Splunk Education. We hope the updates about our courses, certification, and technical training will feed your obsession to learn, grow, and advance your careers. Let's get started with an index for maximum performance readability: Training You Gotta Take | Things You Needa Know | Places You'll Wanna Go

Training You Gotta Take

How-Tos on the 'Tube | 20K Splunky Subscribers
Who doesn't hit up YouTube when their dryer has stopped drying or their washer has stopped washing? So why not use YouTube to solve those Splunk use cases that you know will help your organization be more resilient? Our Splunk Education videos offer just-in-time answers and tutorials – whether you've just installed Splunk or are a seasoned user – covering search, security, visualizations, machine learning, dashboards, Splunk APIs, indexing, and much more. Ready. Set. Play. Fix.
Gotta Get Splunking | How-To on YouTube

New Releases | Training is Streaming
Much like that obscure documentary about Amelia Earhart you found on Netflix, there are hundreds of ways to learn something new on the Splunk Training and Enablement Platform (STEP). Digging deeper on things you already know is a valuable way to improve your outlook and your opportunities, so get started by exploring courses tailored to your interests in security, cloud, and observability. We stream new releases almost daily. Just call us STEPflix.
Gotta Get Current | Tune into New Releases

Things You Needa Know

How to Build Resilience | Splunk Education E-Book
Mental resilience. Physical resilience. Digital resilience. We've all had challenges that required us to tap into resilience, but sometimes we do it without the best resources. At Splunk Education, we want digital resilience to come easier by creating learning paths and growing our training curriculum. In fact, our new Splunk Education e-book is designed to guide your cybersecurity learning journey – with coursework and programs that hone in on what's needed to identify and remediate new, AI-driven attacks and exploitations.
Needa Know the Value | Download the E-Book

What the Future Holds | Executive Perspectives
It's hard to read a headline today without seeing the acronym AI. In fact, Predictions 2024, the annual report from Splunk senior leadership, is heavily focused on AI's omnipresence – underscoring its inevitable influence on cybersecurity dynamics. But don't worry, you're not in this alone! Read more about how Splunk Education is supporting the future of cybersecurity and how to use our new Splunk Education e-book to guide the cybersecurity learning journey in this AI-dominated era. In the words of famous management consultant Peter Drucker, "The best way to predict the future is to create it."
Needa Know Some Predictions | Read the Blog

Places You'll Wanna Go

Global Classrooms | Authorized Learning Partners Make it Possible
Have you ever wondered if there's a way to learn Splunk besides using our self-paced courses or attending our virtual courses delivered by Splunk instructors? Welp, there is! Thanks to the Splunk Authorized Learning Partner (ALP) program, learners like you can get training in your own region, timezone, and language. Discover more and explore how you can increase educational opportunities and deepen your understanding of Splunk by engaging with our ALPs. Splunk training is coming to a classroom near you.
Wanna Learn on Your Terms | Read More About the Program

Go Learn on Campus | New Video Shares the Experience
At Splunk, we're growing talent today for the benefit of society tomorrow. Whether you're a seasoned tech professional, a leader in an organization, or a student working towards a career in IT and data, Splunk Education has you covered in so many ways. One way is through our Splunk Academic Alliance program, which offers nonprofit colleges and universities access to data analytics and cybersecurity training for free or at a discount. Free Splunk and free training – they go together like cookies and cream. Sweet!
Wanna Watch | 2-Minutes to Learn It

Find Your Way | Learning Bits and Breadcrumbs
Go Get Started | Onboarding Resources Available
Go to STEP | Get Upskilled
Go Discuss Stuff | Join the Community
Go Social | LinkedIn for News
Go Index It | Subscribe to our Newsletter

Thanks for sharing a few minutes of your day with us – whether you're looking to grow your mind, career, or spirit, you can bet your sweet SaaS, we got you. If you think of anything else we may have missed, please reach out to us at indexEducation@splunk.com.

Answer to Index This: Inside and outside.
As the title suggests, our system needs a proxy to hit our SAML2 authentication service, but I don't see an option to provide a proxy, or to provide an attribute to describe a proxy like we can do in the apps. Can someone please suggest documentation? Thank you!
Hi Team, how can I sum one field based on the values of another field? ROW1 values begin with a character in 0-9 or a-z. A sample is given below:

ROW1  ROWcount
11    22
12    54
13    34
a1    56
a2    78
d3    67
c4    78
c5    79

The final output should look like:

ROW1  ROWcount
1     110
a     134
d     67
c     157

Thanks in advance!
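A sketch of one way to do this: extract the leading character of ROW1 into its own field, then sum ROWcount by it:

| rex field=ROW1 "^(?<prefix>.)"
| stats sum(ROWcount) as ROWcount by prefix
| rename prefix as ROW1

With the sample data above, this yields 1=110, a=134, d=67, c=157.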
I need to mask data before it is indexed. My sample log structure:

2023-11-02 06:53:00 xx.xxx.xxx.xx GET /Security/Security/Logon 123 - xx.xxx.x.xxx Mozilla/5.0+(Windows+NT+10.0;+Win64;+x64)+AppleWebKit/537.36+(KHTML,+like+Gecko)+Chrome/86.0.4240.198+Safari/537.36=RDPword=jsndksjs834u935=PDUserId=jsndksjs834u935=PDPword=jsndksjs834u935=RFuser=&securityToken=xxxxxxxx 200 0 0 14

I need to match the values of "RDPword=jsndksjs834u935" and "PDPword=jsndksjs834u935". I am using the regex below, but it also matches "PDUserId=jsndksjs834u935=", which I don't want:

RDPWord=([^=]+)=PDUUserId=([^=]+)=PDPWord=([^=]+)

Can someone help me? Thanks.
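One common way to mask at index time is a SEDCMD in props.conf on the first Splunk instance that parses the data (heavy forwarder or indexer). A sketch; the stanza name is a placeholder for your actual sourcetype, and the patterns anchor on the literal key names so PDUserId is left untouched:

[your_sourcetype]
SEDCMD-mask_rdpword = s/(RDPword=)[^=]+/\1########/g
SEDCMD-mask_pdpword = s/(PDPword=)[^=]+/\1########/g

The [^=]+ stops at the next "=" delimiter, so only the value directly after each key is replaced.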
The Bloodhound Enterprise TA runs on the HF and generates an updated KV store collection every 4 hours. I wrote a script that runs and turns the KV store entries into alerts. Due to some weirdness in the data, the question has come up: can the KV store on the HF be copied to a SH? I haven't found a suggestion that I think will work. We are running Splunk Enterprise 9.1.1 on-prem on servers running RHEL 8.8. TIA, Joe
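If a lookup definition is mapped to the collection (the name bloodhound_lookup below is hypothetical), one low-tech path is to export the collection to CSV on the HF, copy the file across, and re-import it on the SH into a collection defined there. A sketch, on the HF:

| inputlookup bloodhound_lookup | outputcsv bloodhound_export.csv

Then copy $SPLUNK_HOME/var/run/splunk/csv/bloodhound_export.csv to the same path on the SH and run:

| inputcsv bloodhound_export.csv | outputlookup bloodhound_lookup append=false

This assumes the SH has its own collection and lookup definition of the same name.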
How can we check whether there is any throttling in Splunk when ingesting events via the AWS Kinesis add-on? What metrics are available for this add-on?
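The add-on's own logs land in _internal, so a starting point is to look there for throttling errors. A sketch; the source pattern is an assumption and may differ by add-on version, but ProvisionedThroughputExceededException is the error Kinesis itself returns when reads are throttled:

index=_internal source=*aws_kinesis* (throttl* OR "ProvisionedThroughputExceededException")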
Splunk Studio: show a message or icon for a pie chart which returns no data. I am looking to display an icon or message in place of the grey pie image on a dashboard if no results are found for a pie chart:

index=test (Eventcode=4010 OR Eventcode=4011)
| stats latest(Eventcode) as latest_event_code by Site
| eval Site=upper(Site)
| where latest_event_code=4010

I have been trying appends like the following:

| stats count
| eval NoResult="0"
| where count=0
| appendpipe [ stats count | eval NoResult="0" | eval test="test Message" ]
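A sketch of an appendpipe tail that only kicks in when the pipeline is empty, producing a single placeholder wedge (field names follow the search above; the stats count by Site gives the pie a numeric measure):

... base search from above ...
| stats count by Site
| appendpipe [ stats count | where count=0 | eval Site="No results found", count=1 ]

When the base search returns rows, the subsearch's count is nonzero, the where drops it, and nothing extra is appended.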
Hi Splunkers, I have a question about license consumption. I'm not here to ask how to calculate daily ingestion and/or license consumption in a Splunk environment; the community is full of topics about this, and I have a search I use when no Monitoring Console is configured. The point is the following: on one LM, I have 3 different environments, each with its own set of SHs, indexers, and so on. The only "point of contact" is the LM itself. So, schematically:

Env A (SHs, IDX cluster, other hosts) ---> LM "X"
Env B (SHs, IDX cluster, other hosts) ---> LM "X"
Env C (SHs, IDX cluster, other hosts) ---> LM "X"

The question is: what if I have to search daily license consumption for only one of the above envs? For example, I want to calculate license consumption only for Env A. I thought I had two options:

1. Use the MC
2. Use my search on _internal logs, based on license consumption data, specifying as the idx parameter only the subset of indexes for the desired env.

PROBLEM: the envs do not have totally distinct indexes. For example, the index "linux_audit" exists in all 3 envs. So I am not able to differentiate the clusters based on their indexes alone.
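If the envs' hosts are distinguishable by name, one sketch is to split license_usage.log on the h (originating host) field instead of idx; the hostname patterns below are hypothetical, and the i field (GUID of the reporting indexer) is an alternative split key if hostnames overlap:

index=_internal source=*license_usage.log type=Usage
| eval env=case(match(h, "^envA-"), "Env A", match(h, "^envB-"), "Env B", match(h, "^envC-"), "Env C", true(), "unknown")
| stats sum(b) as bytes by env
| eval GB=round(bytes/1024/1024/1024, 2)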
Hi team, I am using AppDynamics SaaS version 23.11 and monitoring my on-premise servers and applications. Some volumetric data on the agents used:

Agent            Prod   DTA
Machine Agent    4612   3211
App Agent        884    414
DB Agent         13     10
Analytics Agent  12     14

I would like to know the amount of traffic sent from my on-premise AppD agents (machine, app, DB, and analytics) to the controller. If there is a way to get those numbers (not expecting exact figures; an approximation should be fine), please let me know. Please note, we are not using any proxy between the agents and the controller.
Can I retrieve a list of alerts shared at the app level? Is it possible?

| rest /services/saved/searches
| search eai:acl.app=my_app eai:acl.sharing=app
| fields eai:acl.owner eai:acl.app eai:acl.sharing search title cron_schedule description
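Note the search above returns every saved search in the app, reports included, not just alerts. A hedged refinement (verify the attribute values in your environment): saved searches configured as alerts typically have alert_type set to something other than "always":

| rest /services/saved/searches
| search eai:acl.app=my_app eai:acl.sharing=app
| where alert_type!="always"
| fields eai:acl.owner eai:acl.app eai:acl.sharing title cron_schedule description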
How do I get my certification?
Hello! We noticed different results in two dashboard panels.

1. In the first, we used the fields command to specify the fields we needed to work with, then applied a count.
2. In the second, the same query was used with the table command instead of fields, and then a count was applied.

We noticed different counts; query number 2 gave a correct and complete result. Can someone please explain the difference between the table and fields commands, and why fields seems to give missing results? Thank you.
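As a starting point for comparing the two, a minimal sketch on synthetic data: fields is a distributable streaming command that only restricts which fields flow down the pipeline and keeps internal fields such as _time, while table is a transforming command intended for final display that drops everything not listed:

| makeresults count=3
| eval value=10
| fields value
| eval t=_time

Here t is still populated, because fields retained _time. Replace fields value with table value and t comes back null, because table removed _time. If a later command groups or filters on a field that one variant dropped and the other kept, the two pipelines can produce different counts.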
Hi Folks,

I am trying to get a Splunk response from Java using the method below:

public String executeSearch(String searchQuery) throws IOException {
    //String apiUrl = hostName + "/__raw/services/search/jobs/export?search=" + URLEncoder.encode(searchQuery, "UTF-8").replace("+", "%20");
    String apiUrl = hostName + "/__raw/services/search/jobs/export?search="
            + URLEncoder.encode(searchQuery, "UTF-8")
                .replace("+", "%2B")
                .replace("%3D", "=")
                .replace("%20", "+")
                .replace("%2A", "*")
                .replace("%3F", "?")
                .replace("%40", "@")
                .replace("%2C", ",");
    URL url = new URL(apiUrl);
    System.out.println("Value of Splunk URL is " + url);
    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    connection.setRequestMethod("GET");
    String credentials = userName + ":" + password;
    String encodedCredentials = Base64.getEncoder().encodeToString(credentials.getBytes());
    connection.setRequestProperty("Authorization", "Basic " + encodedCredentials);
    StringBuilder response = new StringBuilder();
    try (BufferedReader in = new BufferedReader(new InputStreamReader(connection.getInputStream()))) {
        String inputLine;
        while ((inputLine = in.readLine()) != null) {
            System.out.println("Response Line: " + inputLine); // Print each line of the response
            response.append(inputLine);
        }
    }
    return response.toString();
}

public static void main(String[] args) {
    if (args.length < 10) {
        System.out.println("Insufficient arguments provided. Please provide all required arguments.");
        System.exit(1); // Exit with error code 1
    }
    String hostName = args[0];
    String userName = args[1];
    String password = args[2];
    String query = args[3];
    String logFileLocation = args[4];
    String fileName = args[5];
    String fileType = args[6];
    String startDate = args[7];
    String endDate = args[8];
    String time = args[9];
    try {
        SplunkRestClient client = new SplunkRestClient(hostName, userName, password);
        String searchResult = client.executeSearch(query);
        System.out.println(searchResult);

        // Write search result to file
        String filePath = logFileLocation + File.separator + fileName + "." + fileType;
        Files.write(Paths.get(filePath), searchResult.getBytes());

        // Check if file is empty
        File file = new File(filePath);
        if (file.length() == 0) {
            System.out.println("File is empty. Deleting...");
            if (file.delete()) {
                System.out.println("File deleted successfully.");
            } else {
                System.out.println("Failed to delete file.");
            }
        } else {
            // Validate file contents (assuming JSON data)
            try {
                new JSONObject(new String(Files.readAllBytes(Paths.get(filePath))));
                System.out.println("File contents are valid JSON.");
            } catch (Exception e) {
                System.out.println("File is corrupt. Deleting...");
                /*if (file.delete()) {
                    System.out.println("Corrupt file deleted successfully.");
                } else {
                    System.out.println("Failed to delete corrupt file.");
                }*/
            }
        }
    } catch (IOException e) {
        System.out.println("Error occurred while executing search: " + e.getMessage());
        System.exit(2); // Exit with error code 2
    }
}

I am calling this Java class from a bat file:

:: All Splunk host names
set host_nam=https://log01.oss.mykronos.com/en-US/app/search/search?earliest=@d&latest=now
set host_cfn=https://cfn-log01.oss.mykronos.com/en-US/app/search/search?earliest=@d&latest=now
set host_dcust=https://koss01-log01.oss.mykronos.com/en-US/app/search/search?earliest=@d&latest=now
:: Splunk user name
set username=********
:: Splunk user password
set password=********
:: Splunk search query for CAN, AUS, EUR
set query_kpi=index=*kpi* level=ERROR logger=KPI*
set query_wfm=index=*wfm* level=ERROR logger=KPI*
set file_type="JSON"
set start_date=""
set end_Date=""
set time="3600"

%JAVA_PATH% com.kronos.hca.daily.monitoring.processor.SplunkRestClient %host_nam% %username% %password% "%query_nam_kpi%" "%logFileLocation%" "%file_name_nam_kpi%" %file_type% %start_date% %end_Date% %time%