All Posts

Hi Splunkers, I'm looking for a way to append a column with an ID based on the value of another field. My base search is:

index=transactionlog sourcetype=transaction earliest=-1h@h latest=@h
| table _time Item Count

The raw events look like this:

datetime=2025-03-10T08:59:59 Item=Apple Count=8
datetime=2025-03-10T08:59:45 Item=Banana Count=2
datetime=2025-03-10T08:58:39 Item=Apple Count=5
datetime=2025-03-10T08:58:25 Item=Coconut Count=1
datetime=2025-03-10T08:57:36 Item=Banana Count=2

which the table command renders as:

_time  Item     Count
...    Apple    8
...    Banana   2
...    Apple    5
...    Coconut  1
...    Banana   2

I'd like something that gives this:

_time  Item     Count  ID
...    Apple    8      1
...    Banana   2      2
...    Apple    5      1
...    Coconut  1      3
...    Banana   2      2

The ID is local, based only on the results: each unique Item is numbered. I've tried streamstats count, but that doesn't give the desired results.

Thanks all
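One approach that may do what you describe (a minimal sketch, untested against your data): let streamstats keep a running distinct count of Item, then use eventstats to pin every occurrence of an Item to the ID it was given on its first appearance:

index=transactionlog sourcetype=transaction earliest=-1h@h latest=@h
| streamstats dc(Item) AS ID ``` running distinct count: 1, 2, 2, 3, 3 for the sample rows ```
| eventstats min(ID) AS ID BY Item ``` each Item keeps the ID from its first appearance ```
| table _time Item Count ID

On the sample data this yields Apple=1, Banana=2, Coconut=3, numbering items in the order they first appear in the result set.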
Hi @gcusello, thanks for your advice. I tried running the search below, but it takes quite a long time to show results. Furthermore, the query does not display Object_Name as needed.
@gcusello I tested again. Yes, I could see all of them under interesting fields (all the fields in the raw data). Only lvl=<value> is not working, whether I add it in the first line of the search together with sourcetype and index or use it with the search command. The rest of the fields are working fine without spath. Regards, PNV
Hi @charlottelimcl, a subsearch is used only to filter the results of the main search using the results of the subsearch; what you need instead is a join. But please avoid the join command, because it's very slow and resource-consuming. You could use a solution like the following:

index=wineventlog source=wineventlog:security (EventCode=4688 OR (EventCode=4663 Object_Name="*hello.exe" Process_Name="*welcome.exe"))
| stats earliest(_time) AS _time values(ComputerName) AS ComputerName values(eval(if(EventCode=4663,Process_Name,null()))) AS New_Process_Name values(eval(if(EventCode=4688,Process_Name,null()))) AS Initiating_Process_Name BY Account_Name

You should adapt this approach to your requirements.

Ciao.
Giuseppe
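Since the original question also asks for Object_Name in the final table, one possible extension of this approach (a sketch along the same lines, to be adapted) is to add a values clause for it; Object_Name only exists on the 4663 events, so values() simply ignores the 4688 rows:

| stats earliest(_time) AS _time values(ComputerName) AS ComputerName values(Object_Name) AS Object_Name values(eval(if(EventCode=4663,Process_Name,null()))) AS New_Process_Name values(eval(if(EventCode=4688,Process_Name,null()))) AS Initiating_Process_Name BY Account_Name
| table _time ComputerName Account_Name Object_Name New_Process_Name Initiating_Process_Name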
Hi @woodcock, thank you. Can you elaborate, please?
Thanks @yuanliu, an interesting option, but the question isn't actually about that :-). We usually use an easier option for this: a choice called "All" with "*" as its value.
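In Simple XML, that pattern might look something like this (a minimal sketch; the token name and the populating search are placeholders):

<input type="dropdown" token="item_tok">
  <label>Item</label>
  <choice value="*">All</choice>
  <default>*</default>
  <fieldForLabel>Item</fieldForLabel>
  <fieldForValue>Item</fieldForValue>
  <search>
    <!-- hypothetical search to populate the dynamic choices -->
    <query>index=transactionlog | stats count by Item</query>
  </search>
</input>

The static "All" choice passes "*" into the token, so a search filtering on Item=$item_tok$ matches everything when "All" is selected.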
Hi all, I have the following query:

index=wineventlog source=wineventlog:security EventCode=4688
    [search index=wineventlog source=wineventlog:security EventCode=4663 Object_Name="*hello.exe" Process_Name="*welcome.exe"
    | fields Account_Name Process_Name
    | rename Process_Name as New_Process_Name]
| table _time ComputerName Account_Name New_Process_Name Initiating_Process_Name

EventCode=4663 has a field called Object_Name, while EventCode=4688 does not. My end goal is a table that shows the Object_Name column alongside New_Process_Name and Initiating_Process_Name. The above query identifies the Account_Name and New_Process_Name in the subsearch and feeds them into the main search to identify the Initiating_Process_Name. I want to include the Object_Name from EventCode=4663 in this table as well. How can I do it?
Hi @Poojitha, I don't really know! Which fields are listed under interesting fields if you run the search without filters? Do you see all the fields? Ciao. Giuseppe
@gcusello: Thanks for the response. Agreed on the format. But why are the lvl and dpnt fields behaving differently? | search lvl="Warn" works only with spath, whereas | search dpnt="test.dpmt" works even though I do not use spath on that field.
Hi, you can actually get the URL pieces via the API. When you share a dashboard, you create a securitytoken, which you can get using the standard Dashboard API:

API: https://<Controller FQDN>/controller/restui/dashboards/getDashboard/<Dashboard ID>

The response will contain a securitytoken if the dashboard has been shared. All you then do is build the shared URL with that token:

https://<Controller FQDN>/controller/dashboards.jsp?desktopView=false#/location=DASHBOARD_VIEWER&token=<securitytoken>&timeRange=last_1_hour.BEFORE_NOW.-1.-1.60

Adjust the time range to whatever you need for your dashboard. Let me know if this doesn't work for you.
Hi @Poojitha, you have a JSON-format file. You can extract fields in three ways:
- using spath (as you did),
- adding INDEXED_EXTRACTIONS=JSON to your props.conf (the best solution),
- using regex (to be used only if you have no other option).
So, try the second option. Ciao. Giuseppe
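For the second option, a minimal props.conf sketch (the sourcetype name is a placeholder; INDEXED_EXTRACTIONS must be set on the component that first parses the data, e.g. a universal or heavy forwarder):

[your_json_sourcetype]
# parse each event as JSON at index time
INDEXED_EXTRACTIONS = JSON
# avoid extracting the same fields a second time at search time
KV_MODE = none

Note that KV_MODE = none belongs in the search-time props.conf on the search head, while INDEXED_EXTRACTIONS belongs where the data is ingested.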
Hi, you're welcome to DM me. I have a Python script which I use to output the data into CSV format. There is no option through the UI to do this.
Dear @livehybrid, yes, I confirm that my Splunk instances are running on VMs. [screenshots: Query 1, Query 2] Regards.
Hi All, I need help with the below. There is a field named lvl, which is of type string. Raw data:

{
  "time": "2025-03-10T06:20:29",
  "corr": "3hgewhrger2346324632434gjhf",
  "dpnt": "test.dpmt",
  "appn": "test - appn",
  "lvl": "Warn",
  "mod": "test.mod",
  "tid": "171",
  "oper": "SetTestContext",
  "rslt": "Succeeded",
  "msg": "test msg",
  "inst": "test inst",
  "x-trace-id": "Root=1-65325bhg-test3;Sampled=1"
}

Though lvl is of type string, if I try | search lvl="Warn" or lvl=Warn, it returns no results. Instead, if I do | spath lvl and then | search lvl="Warn" or lvl=Warn, it shows results. Whereas for other fields like dpnt, which is again of type string, it works fine with | search dpnt="test.dpmt". I understand spath works on structured data formats like JSON and XML, but I don't get what is happening in this case. Why is the lvl string field not working as expected? Please can anyone shed some light on this?

Thanks, PNV
Hi @splunklearner

With your 3 years of Splunk experience and your new learning in AWS, you have several promising career paths to consider. Here's my take on your options, although please remember that the job market in your area may have a stronger requirement for certain skills than others - have you had a look at the kind of job you want to be doing in a year to see what skills are required?

Cybersecurity and SIEM
This is perhaps the most natural extension of your current Splunk skills. Since you already understand SIEM concepts through Splunk, deepening your security knowledge would leverage your existing expertise.
Recommended path:
- Get a security certification like Security+ or SSCP as a foundation
- Follow with a Splunk security certification (Splunk Enterprise Security Certified Admin)
- Learn about threat hunting and incident response workflows
- Study security frameworks like MITRE ATT&CK
Time investment: 6-9 months for meaningful progress

DevOps with Splunk Focus
Since you've started learning AWS, building DevOps skills makes sense. Splunk is often a critical monitoring component in DevOps pipelines.
Recommended path:
- Complete the AWS Solutions Architect Associate certification
- Learn Infrastructure as Code (Terraform is more accessible for non-coders)
- Study CI/CD concepts (Jenkins, GitLab CI)
- Learn Docker fundamentals and basic Kubernetes concepts
- Focus on Splunk's role in DevOps monitoring and observability
Time investment: 9-12 months

SRE (Site Reliability Engineering)
SRE combines aspects of systems engineering and operations, with Splunk being a valuable tool for monitoring and alerting.
Recommended path:
- Strengthen your Linux skills
- Learn basic scripting with Python (start small - this is learnable!)
- Focus on monitoring architectures and alert design
- Study incident management and postmortem processes
- Learn about SLIs, SLOs, and SLAs
Time investment: 12+ months (involves more coding skills)

What would I do? Given your background and the 1-year timeframe, I recommend focusing on Cybersecurity and SIEM while gradually adding some DevOps skills. Why? Your Splunk experience gives you a head start in security, and the demand for security professionals with SIEM expertise remains high. The AWS knowledge you're building naturally complements this, as cloud security is a critical concern. For someone without coding experience, security offers more entry points that don't immediately require programming skills, though I'd suggest learning basic Python automation as you progress.

Good luck with your future learning! Please let me know how you get on and consider adding karma to this or any other answer if it has helped.

Regards
Will
Hi @SatyaMGS

If you need some high-level walkthroughs on how to use Splunk, including how to search, install apps etc., then I would recommend checking out some of the videos at https://www.youtube.com/SplunkHowTo

In terms of setting up Carbon Black, here is what I found from other docs online that might help.

Step 1: Set Up the Carbon Black App for Splunk
- In your Splunk web interface, go to "Apps" → "Find More Apps"
- Search for "Carbon Black" or "VMware Carbon Black"
- Install the appropriate app for your Carbon Black version
- Restart Splunk after installation

Step 2: Configure Carbon Black API Credentials
- Log into your Carbon Black EDR console
- Navigate to "Settings" → "API Access"
- Create a new API key with read permissions
- Note down the API key and API URL - you'll need these for Splunk

Step 3: Configure the Carbon Black App in Splunk
- In Splunk, navigate to the installed Carbon Black app
- Go to "Configuration" or "Setup" within the app
- Enter the Carbon Black server URL (e.g., https://your-cb-server.domain) and the API token/key you created earlier
- Select which data types you want to collect (events, alerts, etc.)
- Set collection intervals

Step 4: Set Up a Data Input for Carbon Black in Splunk
- Navigate to "Settings" → "Data inputs" in Splunk
- Select "Carbon Black" or the corresponding input type
- Configure a new input with: a descriptive name like "CarbonBlack-EDR-Logs", the server URL (your Carbon Black server address), API credentials, log types to collect, and a collection interval (e.g., every 5 minutes)

Step 5: Verify Data Collection
- After configuration, wait for the collection interval to pass
- In Splunk, run a search like: sourcetype="carbonblack:*" or index=carbonblack
- If properly configured, you should see Carbon Black EDR logs appearing

Please let me know how you get on and consider adding karma to this or any other answer if it has helped.

Regards
Will
Hi Team, actually I'm very interested in learning Splunk SIEM. I have downloaded the trial version in my home lab environment. I'm also using Carbon Black EDR. I tried to integrate Carbon Black EDR with Splunk and check the EDR logs in Splunk, but I don't know how to integrate them. Please provide a step-by-step process for this integration. Kindly do the needful.

Thanks
Satya 7013634534
I know it's already a party. But I have to agree with @PickleRick that throwing out a random SPL snippet is not a good way to use volunteers' time. Here are four golden rules of asking an answerable question that I call the four commandments:

1. Illustrate data input (in raw text, anonymized as needed), whether it is raw events or output from a search (SPL that volunteers here do not have to look at).
2. Illustrate the desired output from the illustrated data.
3. Explain the logic between illustrated data and desired output without SPL.
4. If you also illustrate attempted SPL, illustrate the actual output, compare it with the desired output, and explain why they look different to you if that is not painfully obvious.

In this spirit, if I have to read your mind, I will start by reverse engineering what your data look like: some of your raw data contains text strings like "Duplicate Id's that needs to be displayed ::::::[6523409, 6529865]". Given such an event, the desired output is a multivalue field containing the values 6523409 and 6529865. Let us call this field "duplicate_ids". Something to this effect:

_raw: blah, blah, blah Duplicate Id's that needs to be displayed ::::::[6523409, 6529865] - and more blahs
duplicate_ids: 6523409, 6529865

Is this the use case? If yes, here is what I do:

| rex "Duplicate Id's that needs to be displayed :*(?<duplicate_ids>\[[^\]]+\])"
| eval duplicate_ids = json_array_to_mv(duplicate_ids)

(The above requires Splunk 8.1 or later, but it is not the only way to do this.) Here is an emulation for you to play with and compare with real data:

| makeresults
| fields - _time
| eval _raw = "blah, blah, blah Duplicate Id's that needs to be displayed ::::::[6523409, 6529865] - and more blahs"
``` data emulation above ```
I have been working with Splunk for 3 years now. I am well versed in development and recently started working on the admin part. I am still learning. I don't have any knowledge of other tools or languages apart from Splunk. This project has a requirement for AWS (our Splunk instances are hosted on the AWS cloud), so I have started learning AWS. My doubt is: along with Splunk, what tool or software can I upskill myself in to get more opportunities in the future? I have these thoughts... not sure if I am right:
1. Is it good to learn DevOps, since I have already started AWS?
2. Cybersecurity and SIEM
3. SRE
I have zero knowledge of coding to date. Please suggest a good path where I can upskill myself, maybe in the next 1 year.
Hello All, my company is using Outlook (M365 Business Standard). I want to use this Outlook as the SMTP server for Splunk. Here is the information for Outlook, from "POP, IMAP, and SMTP settings for Outlook.com - Microsoft Support":

SMTP server name: smtp-mail.outlook.com
SMTP port: 587
SMTP encryption: STARTTLS
Authentication method: OAuth2/Modern Auth

According to this link from Microsoft, "How to set up a multifunction device or application to send email using Microsoft 365 or Office 365 | Microsoft Learn", client SMTP submission using Basic authentication in Exchange Online is scheduled for deprecation in September 2025. As a replacement, High Volume Email for Microsoft 365 is a suitable option, but it relies on an app password and token, which are not supported in Server settings > Email settings in Splunk.

Does Splunk support an SMTP server using outlook.com? Could anyone please provide a guide for using outlook.com as the SMTP server?

FYI: Splunk Enterprise, Version 9.1.0.2. And if we use Splunk Cloud, is it easier to use outlook.com as the SMTP server?