Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Topics

Hello,   Where can I view notable alert suppression entries in ES? I'm looking for a way to not only audit these entries but also remove them.
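In case it helps later readers: ES typically stores notable suppressions as eventtypes whose names start with notable_suppression-, so a REST search along these lines should list them for auditing (a sketch, assuming that naming convention holds in your version):

| rest /services/saved/eventtypes splunk_server=local
| search title="notable_suppression-*"
| table title eai:acl.app eai:acl.owner search disabled

The same entries can then be removed from Configure > Content Management in ES.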
Hi, So, we have a large number of domain controllers which have the Splunk Universal Forwarder installed AND Microsoft Defender for Identity. Defender has switched to using Npcap (MS agreed some kind of OEM license with them), but I am told we need to keep Splunk and WinPcap for DNS traffic capture. The issue is, on startup and occasionally every few days, I get spammed by Defender complaining that it's using WinPcap instead of the Npcap drivers. I.e., it seems to be dumb: when it sees both, it uses WinPcap first, not Npcap. If I go to Defender for Identity I don't see any issues with the sensor. The entire AD team gets over 100 messages every few days about this. A ticket open with MS has so far yielded nothing. Surely we can't be the only people with this problem? Is there a way to rename the WinPcap driver and tell Splunk to go look for the renamed driver, for instance? I don't know. There must be a fix. It's driving us nuts. Thanks!
Hi, Has anyone done anything with Azure scale sets? I guess I will need to correlate across a number of logs to deal with the lack of persistence of the devices if I want to use the UF, or should I just use the built-in logging provided by Microsoft?
Hello Team, I need to set the SVC baseline in our environment. Please help me understand how to start setting up the SVC baseline, i.e., what parameters I need to check (e.g., skipped searches, expensive searches, orphaned searches), so I can compare where we are now against where we will be after setting up the SVC baseline.
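As one concrete baseline input, skipped searches can be counted from the scheduler logs (a sketch, assuming the standard _internal scheduler sourcetype):

index=_internal sourcetype=scheduler status=skipped
| stats count AS skipped_count BY app, savedsearch_name
| sort - skipped_count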
Hi, I am using Splunk version 8.2.8. When I try to open the setup page of the Splunk Add-on: ServiceNow Security Operations Integration, I get the error below in the console. Can anyone please help with this? The error does not occur in Splunk version 9.0.2; the add-on works fine with that version.
I want to implement this correlation search:

`sysmon` EventCode=10 TargetImage=*lsass.exe CallTrace=*dbgcore.dll* OR CallTrace=*dbghelp.dll*
| stats count min(_time) as firstTime max(_time) as lastTime by Computer, TargetImage, TargetProcessId, SourceImage, SourceProcessId
| rename Computer as dest
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`
| `access_lsass_memory_for_dump_creation_filter`

I do not have the required fields in my Sysmon log data. I have fields like Image, ParentImage, and ProcessId, but not TargetImage, TargetProcessId, SourceImage, or SourceProcessId. How do I build the above query using the fields I have?
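Note that TargetImage, SourceImage, and CallTrace are populated only by Sysmon Event ID 10 (ProcessAccess), so the proper fix is to enable ProcessAccess logging in the Sysmon configuration. As a rough interim approximation using only the Event ID 1 fields mentioned above (a sketch; the field names and tool list are assumptions, and it detects dump-tool launches rather than actual LSASS memory access):

`sysmon` EventCode=1 CommandLine="*lsass*" (Image="*procdump*" OR Image="*rundll32*")
| stats count min(_time) as firstTime max(_time) as lastTime by Computer, Image, ParentImage, ProcessId
| rename Computer as dest
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`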
Hey, I have a big query and I need a command in the query that would filter out all Asset_State!="Development" OR Asset_State!="Pre-Production", but only for Asset_Environment!="PKI  AND Offline" Status="2". I tried the following command:

| if( Asset_Environment!="PKI  AND Offline" Status="2", search NOT (Asset_State!="Development" OR Asset_State!="Pre-Production"))

I know the syntax is wrong; can you help? Many thanks
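For what it's worth, SPL has no standalone | if command; conditional filtering is normally expressed as boolean logic in a where (or search) clause. A sketch of what the post seems to describe (the exact field logic is a guess, so adjust the inner conditions):

| where NOT (Asset_Environment!="PKI  AND Offline" AND Status="2" AND (Asset_State="Development" OR Asset_State="Pre-Production"))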
Hi Team, I need to send logs from Linux client machines (SUSE Linux) to the Splunk server hosted in a remote datacenter. I would like to ask how to configure the Linux machine to send the logs to the server, and what steps should be followed. Please share the method so that I can embed the steps for automation. Also, as the client machines are very high in number and all need to send logs to the Splunk server, which is the recommended method (push-based or pull-based)? Thanks for your help with the answer.
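A minimal sketch of the usual push-based setup, assuming the Universal Forwarder is installed under /opt/splunkforwarder and the remote indexer listens on 9997 (the hostname is a placeholder):

/opt/splunkforwarder/bin/splunk start --accept-license
/opt/splunkforwarder/bin/splunk add forward-server splunk-indexer.example.com:9997
/opt/splunkforwarder/bin/splunk add monitor /var/log/messages
/opt/splunkforwarder/bin/splunk restart

Splunk forwarding is push-based by design; at this scale, a deployment server (or your own automation) normally distributes the equivalent inputs.conf/outputs.conf rather than running these commands by hand.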
Hi All, we have a requirement to configure the Cisco ACI app with our Splunk environment (on-prem, not cloud). We want to see the dashboard data that is available in the app, but when I look at the add-on configuration (default --> eventtypes.conf), I don't see any events for any of the eventtypes that I search for. I do find configuration for cisco:apic:cloud, but we have the Cisco app on-prem.
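One quick diagnostic (the sourcetype names here are assumptions; adjust to whatever your ACI input actually writes) is to check which sourcetypes the indexed data carries and compare them against the search strings in the app's eventtypes.conf:

index=* (sourcetype=cisco:apic* OR sourcetype=cisco:aci*)
| stats count BY index, sourcetype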
Windows Event ID (Code) logs are delayed for days. Latency varies; the delay at times reaches weeks or months. It is confirmed that none of the pipeline queues are blocked. Note that the Event IDs arrive in real time in the Event Viewer.
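To quantify the lag, comparing index time to event time is the usual first step. A sketch, assuming the events land in a wineventlog index (adjust index/sourcetype to yours):

index=wineventlog
| eval lag_secs = _indextime - _time
| stats avg(lag_secs) AS avg_lag max(lag_secs) AS max_lag BY host
| sort - max_lag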
I have the result below. How can I write a regex to extract the fields, the first being DateTime, then Username, Action, and Entity?

2022-11-21 15:44:13,ea186520,CREATED,USERSESSIONLOG
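Assuming every event is four comma-separated values in that order, a rex like this should do it:

| rex field=_raw "^(?<DateTime>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}),(?<Username>[^,]+),(?<Action>[^,]+),(?<Entity>[^,]+)$"
| table DateTime Username Action Entity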
I have created a table with some rows, but when I use a visualization from the Number Display Viz app, it shows a separate chart for each row. Is there a way to merge all the charts, for example into a single pie chart, to display the results?
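The Number Display Viz draws one chart per result row, so collapsing the table into label/value pairs first is usually the way to a single pie. A sketch with assumed field names category and value:

... | stats sum(value) AS total BY category

The built-in pie chart will then render one slice per category row.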
Hi, I need to generate the current date like this, "20201123", and use it as a search filter on metadata. AFAIK there is no _time in metadata, so I need to generate the current date for the search filter. Here is my query:

|metadata type=sources index="app"
|table source

Any ideas? Thanks,
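strftime(now(), ...) can generate that date string at search time. A sketch that keeps only sources containing today's date (this assumes the date appears verbatim in the source path):

|metadata type=sources index="app"
| eval today=strftime(now(), "%Y%m%d")
| where like(source, "%" . today . "%")
| table source

Alternatively, metadata already returns epoch fields (firstTime, lastTime, recentTime) that can be compared directly, e.g. | where recentTime >= relative_time(now(), "@d").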
I want to restrict a user's access based on a field present in the data model; however, under Roles -> Restrictions, even after entering the data model field and its value, it is not filtering the data. This is required because, in the dashboard, the user currently sees all the data in the data model. How can this be achieved? Thanks in advance!
E.g.: User: Test01; data model field: testfield. The Test01 user should see the dashboard with the data model data filtered by the above testfield.
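For reference, the Restrictions tab writes a srchFilter into authorize.conf, roughly like the sketch below (the role name and value are placeholders). As far as I know, srchFilter applies to raw event searches but is not applied to accelerated data model (tstats/summariesonly) results, which may be why the dashboard still shows everything; if so, the filter has to be applied inside the dashboard searches or via index-level access instead.

[role_test01]
srchFilter = testfield="allowed_value"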
So I have some data like the below in my _raw:

Name: BES Client, Running as: LocalSystem, Path: "C:\Program Files (x86)\BigFix Enterprise\BES Client\BESClient.exe", SHA1: 5bf0d29324081f2f830f7e66bba7aa5cb1c46047
Name: BESClientHelper, Running as: LocalSystem, Path: "C:\Program Files (x86)\BigFix Enterprise\BES Client\BESClientHelper.exe", SHA1: c989ae2278a9f8d6d5c5ca90fca6a57d19b168b8
Name: svchost.exe, PID: 424, PPID: 432, ( Started up: Mon, 19 Sep 2022 03:41:57 -0700 ), Running as: NT AUTHORITY\LOCAL SERVICE, Path: C:\Windows\System32\svchost.exe, SHA1: 3196f45b269a614a3926efc032fc9d75017f27e8
Name: scsrvc.exe, PID: 1384, PPID: 432, ( Started up: Mon, 19 Sep 2022 03:42:34 -0700 ), Running as: NT AUTHORITY\SYSTEM, Path: C:\Program Files\McAfee\Solidcore\scsrvc.exe, SHA1: ef1cc70f3e052a6c480ac2fe8cdfe21a502669cc

I am trying to parse out just the "running" process name, like "BES Client" or "BESClientHelper"; however, it has to have the text "Running" after it so I know it's a running process, not the two "exe" files crossed out above. Make sense? Thanks!!
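If "Running" on the same line is a reliable marker, a multivalue rex along these lines should pull only those names (running_process is an arbitrary field name):

| rex max_match=0 field=_raw "Name: (?<running_process>[^,]+),[^\r\n]*Running"
| table running_process

If every line turns out to contain "Running as", anchoring on "PID:" instead may better separate the started-up processes: "Name: (?<running_process>[^,]+), PID:".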
Hey Splunkers, it's a distributed environment. We created indexes on the Cluster Master, and we can access the indexes from the SH cluster members. Now I need to create user roles and add specific indexes to each role, but while doing that, the indexes created on the CM are not listed. I then checked on the SH under Settings -> Indexes; those indexes are not listed there either. What could be the reason? Note: on the SH I can see the data if I search index=<index name>. Where do I need to start my analysis?
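In case it helps: the role editor on a search head only lists indexes defined in that search head's own configuration; indexes that exist only on the cluster peers are still searchable but won't show up in the list. A common workaround (a sketch; the stanza name is a placeholder) is to deploy stub index definitions to the search head members:

# indexes.conf on the search heads; no data is stored there, it just makes the name visible in the UI
[my_index]
homePath   = $SPLUNK_DB/my_index/db
coldPath   = $SPLUNK_DB/my_index/colddb
thawedPath = $SPLUNK_DB/my_index/thaweddb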
Hello, I'm having some confusing problems with Splunk permissions that I am trying to understand. A little background: we upgraded our indexer/deployment server from Debian to Ubuntu. Here is the problem I am seeing after this upgrade. I was monitoring a file at /var/log/test-combo.log and everything worked beforehand on Debian 11. Now I am not getting any of the data from this file ingested into my index, but I can see fresh logs. The file is owned by syslog and the group is adm. My splunk user: uid=1001(splunk) gid=1001(splunk) groups=1001(splunk),4(adm). I wanted to do a test, so I went under Data Inputs > Files & Directories > New Local File & Directory > Browse > var > log. The strange thing was that I could only see half of the logs and half of the directories under there. All the directories and files that I can see are root:root and have other: r-- set in their permissions; the file in question (test-combo.log) doesn't have other: r-- set. So why is Splunk able to see files with these permissions:

# file: vpn.log
# owner: root
# group: root
user::rw-
group::rw-
other::r--

and not able to see files with this permission:

# file: test-combo.log
# owner: syslog
# group: adm
user::rw-
group::r--
other::---

Is it because other is not set to read perms? What would be the significance of setting other to read?
I'm analysing VPN connection logs to produce a report of the count of staff working from home for longer than 6 hours a day. Unfortunately the VPN session isn't started and ended by the staff member; the VPN just writes a log when data is sent. This means there is nothing that can be used as a start or end flag in the data. I have tried using the TRANSACTION command, with Username as the unique element, and set the 'maxpause' between sessions to 65 minutes. Example query:

index=VPN sourcetype=VPNlog
| transaction Username maxevents=-1 maxpause=65m
| stats sum(duration) as Duration BY Username

This worked for small sample sets of data. I could then extract the count of staff whose total duration of sessions was over 6 hours a day. However, when I attempted to run this same query over the complete set of data it produced an incomplete set of results along with the message: "Some transactions have been discarded. To include them, add keepevicted=true to your transaction command." Enabling keepevicted produces more results but the figures are incorrect; I assume there are still too many events for the transaction command to analyse? After reading about the limitations of the transaction command I tried using STATS in its place. This works far quicker and for all the data, except there doesn't seem to be an equivalent of 'maxpause' to end a session. Instead, the duration of a staff member's session always ends up being the duration from the start of their first connection to the end of their last, which leads to people appearing to work 12-hour days because they log on remotely in the morning for a brief while, then briefly again in the evening. Is there another way to use the transaction command that will allow it to handle more data? The results don't have to return overly quickly, as it will be run overnight to produce reporting.
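One stats-friendly way to mimic maxpause without transaction's memory limits is to sessionize with streamstats (a sketch; 65 minutes = 3900 seconds):

index=VPN sourcetype=VPNlog
| sort 0 Username _time
| streamstats current=f last(_time) AS prev_time BY Username
| eval new_session=if(isnull(prev_time) OR _time - prev_time > 3900, 1, 0)
| streamstats sum(new_session) AS session_id BY Username
| stats min(_time) AS session_start max(_time) AS session_end BY Username, session_id
| eval duration=session_end - session_start
| stats sum(duration) AS Duration BY Username
| where Duration > 6*3600

The two streamstats calls number each gap-delimited session per user, so the final sum behaves like transaction maxpause=65m but scales to the full data set.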
Hi All, I have events like the one below, and I want to extract the fields TotalRecords, SuccessRecords, FailedRecords, Batch, SuccessRecords, FailedRecords, BatchSize, Success, and Failed. If the data is not there for the event, it should show as blank or null.

Item InsertStatus= 'TotalRecords': 1 'SuccessRecords': 1 'FailedRecords': 0 Entity: DevOpsItemAttribute records Batch 1 SuccessRecords=1 FailedRecords=0 EntityData Entity Delete Status BatchSize=50000 Success=26 Failed=0

My output should be like the below:

TotalRecords, SuccessRecords, FailedRecords, Batch, SuccessRecords, FailedRecords, BatchSize, Success, Failed
1,1,0,null,null,null,null,null,null
null,null,null,1,1,0,null,null,null
null,null,null,null,null,null,50000,26,0
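Assuming the quoting and key=value patterns are consistent across events, per-pattern rex calls plus fillnull get close to that output. Note that SPL cannot hold two different fields with the same name in one row, so the batch-level SuccessRecords/FailedRecords need distinct names (the Batch* prefixes below are my renames):

| rex "'TotalRecords':\s*(?<TotalRecords>\d+)\s*'SuccessRecords':\s*(?<SuccessRecords>\d+)\s*'FailedRecords':\s*(?<FailedRecords>\d+)"
| rex "Batch\s+(?<Batch>\d+)\s+SuccessRecords=(?<BatchSuccessRecords>\d+)\s+FailedRecords=(?<BatchFailedRecords>\d+)"
| rex "BatchSize=(?<BatchSize>\d+)\s+Success=(?<Success>\d+)\s+Failed=(?<Failed>\d+)"
| fillnull value="null" TotalRecords SuccessRecords FailedRecords Batch BatchSuccessRecords BatchFailedRecords BatchSize Success Failed
| table TotalRecords SuccessRecords FailedRecords Batch BatchSuccessRecords BatchFailedRecords BatchSize Success Failed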
Is it possible to create a Pie Chart from three fields? If so, how?   Thanks a million in advance!
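Yes, if the three fields are columns on one row: transposing them into label/value pairs gives the pie chart the one-label-one-value shape it expects. A sketch with placeholder field names fieldA/fieldB/fieldC:

... | stats sum(fieldA) AS fieldA sum(fieldB) AS fieldB sum(fieldC) AS fieldC
| transpose
| rename column AS field, "row 1" AS value

Then choose the pie visualization, which will draw one slice per row.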