All Topics

Hi guys, looking for some support on this. We are trying to set up alerts on CPU metric data, to raise an incident when average CPU usage stays over 90% for the last 2 hours. We created the following base search:

| mstats avg(cpu_metric.pctIdle) as cpu_idle where index=lxmetrics earliest=-4h latest=now() span=2h by host
| eval cpu_used=round(100-cpu_idle,2)

The problem: incidents are created as soon as CPU goes over 90%, on each run of the KPI search schedule (every 15 minutes). It is not waiting the full 2 hours before taking the average. Could someone shed some light on this? Thanks.
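One possible cause, sketched here as an assumption rather than a confirmed diagnosis: with span=2h over a -4h window, mstats emits time buckets, and the most recent bucket is partial, so its average can cross 90% long before 2 hours of data exist. Dropping span and averaging a single trailing 2-hour window makes every 15-minute run evaluate the true 2-hour average:

```spl
| mstats avg(cpu_metric.pctIdle) AS cpu_idle WHERE index=lxmetrics earliest=-2h latest=now BY host
| eval cpu_used=round(100-cpu_idle, 2)
| where cpu_used > 90
```

With this shape, the alert condition "number of results > 0" fires only when a host's average over the whole preceding 2 hours exceeds 90%.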
Hello Splunkers, as per the details below I need to create an alert based on a threshold value. But in this case every offeringID has a different rate, so how can I calculate the threshold for each offeringID, and how can I map this into an alert as a generic threshold value? Please suggest some ideas on this. Also, if anyone is aware of the "in dat -incache-memory" in Splunk, please share.
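One common pattern is to keep the per-offeringID thresholds in a lookup and join them at search time. A minimal sketch — the lookup file name `offering_thresholds.csv`, its columns, and the base search fields are all assumptions, not from the original post:

```spl
index=my_index sourcetype=my_sourcetype
| stats avg(rate) AS avg_rate BY offeringID
| lookup offering_thresholds.csv offeringID OUTPUT threshold
| where avg_rate > threshold
```

The alert then uses a single generic trigger condition (number of results > 0), while each offeringID is compared against its own threshold row in the CSV.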
Hi all, I am investigating the possibility of consolidating our separate standalone ES search heads into a single clustered ES instance. Due to network segmentation rules, my indexer clusters will have to remain separate. My question is: can Splunk ES utilise separate indexer clusters from the same ES search head cluster, in the same way that non-ES search head clusters can? If it is possible, are there any gotchas to be aware of? Thanks in advance. Waja1n0z1
I am preparing a SNOW incident trend which should show the percentage by which tickets decreased/increased in the current month compared to the previous month, along with the count of currently open tickets. But when I compare them using the timechart command with a span, it gives the current value as 0. Ideally it should show the total number of open tickets; since it is only taking the current day's data, it shows 0. How can I make sure it shows the data for all open incidents?
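For the month-over-month percentage part, one sketch (the index, sourcetype, and the idea that one event = one opened incident are all assumptions about your SNOW data) is to count per month and pull the previous month's count alongside with streamstats:

```spl
index=snow sourcetype=incident
| eval month=strftime(_time, "%Y-%m")
| stats count AS opened BY month
| streamstats current=f window=1 last(opened) AS prev_opened
| eval pct_change=round((opened - prev_opened) / prev_opened * 100, 2)
```

Because this counts events by when they were opened rather than bucketing by the current day, the current month shows its full total instead of 0.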
I have created a pie chart which has 3 values in it. I want to create a drilldown for each value with the help of the click-value token mechanism. But since I am using dbxquery, it is not allowing me to add the token. Please help me with a solution.
Hello, Would it be possible for UFs to forward/send logs/events to other HFs/UFs? Thank you!  
My log appears as:

1;1;laptop-rdvt90t4;http://update-software.xxx.com/WeatherFix03_SP03120.exe;C:\Windows\SysWOW64\DynamicWeather.exe;NT AUTHORITY\SYSTEM;2022-05-02 09:23:25;192.168.1.8;;;
1;1;laptop-rdv7446p;http://update-software.xxx.com/qatherFixP00190.exe;C:\Windows\SysWOW64\Der.exe;ScWhJ\lizonghao;2022-05-02 09:25:27;192.168.1.8;;;

I use strptime() with %H:%M:%S, and the regex "202\d+-\d+\-\d+\s" to get the time, but the result looks wrong (as in the attached pic). How should I write the regex to get the time?
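A sketch, assuming the timestamp format shown in the sample (YYYY-MM-DD HH:MM:SS): capture the date and time together, then parse the whole string. Parsing only %H:%M:%S leaves the date portion unset, which is one likely reason the result looked wrong.

```spl
| rex field=_raw "(?<event_time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})"
| eval parsed_time=strptime(event_time, "%Y-%m-%d %H:%M:%S")
| eval _time=parsed_time
```

The format string passed to strptime must match every part of the captured text, including the space between date and time.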
So i have this:

(index=* OR index=_*) (index="GA2014" EventCode=4625)
| dedup RecordNumber
| rename Account_Name AS EventObject.Account_Name EventCode AS EventObject.EventCode
| stats dedup_splitvals=t count AS "Count of Event Object" by "EventObject.Account_Name"
| sort limit=100000 "EventObject.Account_Name"
| fields - _span
| rename "EventObject.Account_Name" AS Account_Name
| fillnull "Count of Event Object"
| fields Account_Name, "Count of Event Object"
| search NOT Account_Name="-"

Resulting into this:

+--------------+-----------------------+
| Account_Name | Count of Event Object |
+--------------+-----------------------+
| SQLSERVICE   | 1                     |
| STAFF        | 1                     |
| STUDENT      | 1                     |
| SUPORTE      | 1                     |
| SUPPORT      | 2                     |
| SYMANTEC     | 1                     |
+--------------+-----------------------+

!!!!WITH!!!! these 3 over here:

+---------------+-----------------------+
| Account_Name  | Count of Event Object |
+---------------+-----------------------+
| АДМИН         | 8                     |
| АДМИНИСТРАТОР | 8                     |
| ПОЛЬЗОВАТЕЛЬ  | 8                     |
+---------------+-----------------------+

!!BUT!!
When i do a search like this:

(index=* OR index=_*) (index="GA2014" EventCode=4625)
| dedup RecordNumber
| rename Account_Name AS EventObject.Account_Name EventCode AS EventObject.EventCode Workstation_Name AS EventObject.Workstation_Name
| bucket _time span=1s
| stats dedup_splitvals=t values("EventObject.EventCode") AS "Distinct Values of EventCode" by _time, "EventObject.Account_Name", "EventObject.Workstation_Name", "EventObject.EventCode"
| sort limit=10000000 _time
| rename "EventObject.Account_Name" AS Account_Name "EventObject.EventCode" AS EventCode "EventObject.Workstation_Name" AS Workstation_Name
| fields _time, Account_Name, Workstation_Name, "Distinct Values of EventCode"
| search NOT Account_Name="-"

I get this:

+---------------------+--------------+------------------+------------------------------+
| _time               | Account_Name | Workstation_Name | Distinct Values of EventCode |
+---------------------+--------------+------------------+------------------------------+
| 2020-02-21 01:03:48 | Demo         | workstation      | 4625                         |
| 2020-02-21 01:05:57 | Reception    | workstation      | 4625                         |
| 2020-02-21 01:09:06 | User11       | workstation      | 4625                         |
| 2020-02-21 01:10:34 | Ieuser       | workstation      | 4625                         |
+---------------------+--------------+------------------+------------------------------+

!!Without!!

АДМИН
АДМИНИСТРАТОР
ПОЛЬЗОВАТЕЛЬ

They are nowhere to be seen. I don't know right now if this applies only to these 3 or not, but I searched with Ctrl+F in the browser and found nothing. Honestly, I don't know what name to give this thread/question; maybe I can get some advice on that too, if I'm able to rename it.

P.S.: It's 2 in the mornin' over here, so if i have any typos, it must be the late hour ...
Is there a way to do a search like this?

If Eventid=1111: only do these statements
elseif Eventid=2222: only do these statements
elseif Eventid=3333: only do these statements
Then do these extra statements ...
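SPL doesn't have an if/elseif control structure, but per-event branching is usually expressed with eval's case() function, followed by shared commands for all branches. A minimal sketch — the index name, field values, and branch labels are placeholders, not from the original post:

```spl
index=my_index Eventid IN (1111, 2222, 3333)
| eval branch=case(Eventid==1111, "branch_one",
                   Eventid==2222, "branch_two",
                   Eventid==3333, "branch_three")
| stats count BY Eventid, branch
```

Everything after the eval runs for all events (the "extra statements"); any per-branch logic goes into further case()/if() expressions keyed on the branch field.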
Hello Splunkers, I have a client that already has an IBM QRadar SIEM and wants to integrate it with Splunk SOAR (formerly named Splunk Phantom). Can you tell me if this is possible, and how to implement it? Regards, Marcos Pereira
I want to get the QID list from yesterday's published data. For that I'm using the PUBLISHED_DATETIME field with yesterday's date. The value of that field is in GMT format (2005-11-11T08:00:00Z). For example, if I'm running this search on May 4th, I need to get the QID fields with a published date of 05/03/2022 (May 3rd).

| table QID PUBLISHED_DATETIME
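A sketch, assuming PUBLISHED_DATETIME always uses the Z-suffixed format shown: parse the string to epoch time with strptime(), then keep only values falling within yesterday using relative_time() snapped to day boundaries.

```spl
| eval published_epoch=strptime(PUBLISHED_DATETIME, "%Y-%m-%dT%H:%M:%SZ")
| where published_epoch >= relative_time(now(), "-1d@d")
    AND published_epoch < relative_time(now(), "@d")
| table QID PUBLISHED_DATETIME
```

One caveat: strptime() interprets the string in the search head's local time zone, so if the values are GMT and your search head is not, "yesterday" may be shifted by the offset.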
Hi everyone, I am new to Splunk and I have been trying to build a complex report that I haven't been able to solve, so any help would be appreciated a lot. I need to create a table like this:

ID | Name  | Function   | Device | Number | Unit
1  | AAA23 | Allocate   | A1     | 12     | U1
   |       |            | A2     | 15     | U2
   |       |            | A3     | 13     | U1
   |       |            | A4     | 12     | U4
2  | AAA23 | Allocate   | A1     | 12     | U1
3  | AAA23 | Deallocate | A1     | 12     | U1
   |       |            | A2     | 15     | U2

Here are the three events in JSON format:

{"ID":"1","NAME":"AAA23","FUNCTION":"1","DEVICE_001":"A1","NUMBER_001":12,"UNIT_001":"U1","DEVICE_002":"A2","NUMBER_002":15,"UNIT_002":"U2","DEVICE_003":"A3","NUMBER_003":13,"UNIT_003":"U1","DEVICE_004":"A4","NUMBER_004":12,"UNIT_004":"U4"}

{"ID":"2","NAME":"AAA23","FUNCTION":"1","DEVICE_001":"A1","NUMBER_001":12,"UNIT_001":"U1"}

{"ID":"3","NAME":"AAA23","FUNCTION":"2","DEVICE_001":"A1","NUMBER_001":12,"UNIT_001":"U1","DEVICE_002":"A2","NUMBER_002":15,"UNIT_002":"U2"}

As you can see, the names of the DEVICE, NUMBER and UNIT fields depend on the number of entries for the NAME & ID pair, so sometimes for the same NAME & ID values I have 50 differently named fields with a consecutive number, e.g. DEVICE_001, DEVICE_002, ..., DEVICE_050; NUMBER_001, NUMBER_002, ..., NUMBER_050; UNIT_001, UNIT_002, ..., UNIT_050. And sometimes there is only 1 entry; this is variable and doesn't depend on a specific field name.

With this in mind, my question is how I can set this up as a Splunk table. I have been trying:

index=dataexample
| spath
| rex "\"DEVICE_\d+\":\"(?P<DEVICE_1>[a-zA-Z0-9]+)\"" max_match=0
| rex "\"NUMBER_\d+\":(?P<NUMBER_1>\d+)" max_match=0
| rex "\"UNIT_\d+\":\"(?P<UNIT_1>[a-zA-Z0-9]+)\"" max_match=0
| eval TIPO=case(FUNCTION==01,"ALLOCATE", FUNCTION==02,"DEALLOCATE", FUNCTION==03,"OTHER")
| stats values(NAME), values(TIPO), values(DEVICE_1), values(NUMBER_1), values(UNIT_1) by ID

But I don't know how to put all the variable (1 or 50 or 60) field values into just one column per DEVICE, NUMBER and UNIT for each event.
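One sketch for turning the numbered fields into one row per device: extract all matches as multivalue fields, zip them together so positions stay aligned, then mvexpand into separate rows. This assumes the JSON uses straight quotes and that every event has equal counts of DEVICE/NUMBER/UNIT fields.

```spl
index=dataexample
| rex max_match=0 "\"DEVICE_\d+\":\"(?<device>[^\"]+)\""
| rex max_match=0 "\"NUMBER_\d+\":(?<number>\d+)"
| rex max_match=0 "\"UNIT_\d+\":\"(?<unit>[^\"]+)\""
| eval zipped=mvzip(mvzip(device, number, "|"), unit, "|")
| mvexpand zipped
| eval device=mvindex(split(zipped, "|"), 0),
       number=mvindex(split(zipped, "|"), 1),
       unit=mvindex(split(zipped, "|"), 2)
| eval TIPO=case(FUNCTION=="1","ALLOCATE", FUNCTION=="2","DEALLOCATE", FUNCTION=="3","OTHER")
| table ID NAME TIPO device number unit
```

mvzip preserves the ordering from the raw event, so DEVICE_002 lines up with NUMBER_002 and UNIT_002 regardless of whether an event has 1 or 50 entries.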
I'm trying to query Observability Cloud for a list of traces within a custom time frame. From the API reference, I do not see an endpoint for this query. Any help would be appreciated.
Hello, I have been working on this for a few days, looking at numerous Splunk answers, but have yet to find something that works for my situation. I have a large inventory of servers that I search through, and I currently use a general IN clause in my searches, but some queries have over 20 or so servers to search through and I want to simplify them. I am currently using something like this, which works but can be exceedingly large depending on which servers I need to look up:

index=myindex hosts IN (server1,server2,server3) <mysearchquery>

So I had the bright idea of creating a lookup table to group the servers together. The lookup table:

group,server
group1,server1
group1,server2
group1,server3
group2,server4
group2,server5

I can get the desired list of servers by doing the following:

| inputlookup lookuptable.csv
| search group=group1
| fields server

This would return:

server1
server2

but applying it to my search has proved a lot more difficult. I think I was close with this one but have not quite figured it out yet:

index=myindex <Search> [ | inputlookup lookuptable.csv | search group=group1 | fields server ]

Any suggestions would be greatly appreciated, or a link to similar posts for me to review.
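A subsearch returns its results as field=value pairs OR'd together, so the trick is usually to rename the lookup column to match the field the events actually carry. A sketch, assuming that field is the default host (adjust if your events really use hosts):

```spl
index=myindex <mysearchquery>
    [ | inputlookup lookuptable.csv
      | search group=group1
      | fields server
      | rename server AS host ]
```

The subsearch then expands to (host=server1 OR host=server2 OR host=server3), which is the same filter as the long IN clause.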
Has anyone integrated with Splunk from the controller and used extra parameters?  I have tried adding index=* as a parameter, but when Splunk query page opens the extra parameter is not part of the search string.
Hi, I have a dashboard with multiple table views from different indexes and just wondered if it is possible to combine them all in one stats table?   Thanks,   Joe
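One common pattern, sketched with placeholder index and field names, is to search the indexes together in a single query and let stats split by index, instead of keeping one panel per index:

```spl
(index=indexA) OR (index=indexB) OR (index=indexC)
| stats count AS events BY index, host
```

When the panels use base searches too different to merge, appending them with append [ search ... ] before a final stats is another option, at the cost of subsearch limits.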
I extracted the _raw field and received values looking like \xB9k?\x93\xE8\xC6\. How can I convert this to a readable format?
I'd appreciate any assistance with my rex error. When running this rex command:

| rex "New Logon:\s+Security ID:\s+(?<account>.*)"

I receive the following error in the dashboard: "missing terminator". Thanks in advance!
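In Simple XML dashboards, the literal < in a named capture group like (?<account>...) is parsed as markup, which commonly produces exactly this error. Wrapping the query in a CDATA section avoids it; a sketch of the panel's search element, with the index and EventCode as assumptions:

```xml
<search>
  <query><![CDATA[
    index=wineventlog EventCode=4624
    | rex "New Logon:\s+Security ID:\s+(?<account>.*)"
  ]]></query>
</search>
```

Alternatively, escape the brackets directly in the XML as (?&lt;account&gt;.*).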
Hello dear community. For our Splunk Enterprise deployment, we were thinking about using Splunk DB Connect to ingest structured data (coming from the ERP) into Splunk. What do you use as a strategy to handle:

Data updates
Data deletion

Does Splunk DB Connect offer something like this out of the box, or should we think about another setup for that?
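Indexed events are immutable, so updates are usually handled by re-ingesting changed rows (e.g. a DB Connect rising column on a last-modified timestamp) and keeping only the newest version of each record at search time. A sketch, where the index, sourcetype, and record_id key are assumptions about your ERP data:

```spl
index=erp_data sourcetype=erp:records
| dedup record_id sortby -_time
```

Deletions are harder, since DB Connect only sees rows that still exist; typical workarounds are a separate "tombstone" feed of deleted keys, or periodic full snapshots compared against at search time.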
This is the inputs file. As you can see they all go to the same directory structure, but the last stanza is supposed to catch all the logs not matching the defined *_xxxxx_*.log patterns, so that general logs will be stored in Splunk as well. How can I do this?

[monitor:///var/log/containers/*_ctisp1_*.log]
index = ctisp1
sourcetype = dks-ctisp1
followSymlink = true

[monitor:///var/log/containers/*_ocpprd_*.log]
index = ocpprd
sourcetype = dks-ocpprd
followSymlink = true

[monitor:///var/log/containers/*_custconnectp1_*.log]
index = custcontp1
sourcetype = custcontp1
followSymlink = true

[monitor:///var/log/containers/*_ocpnotifp3_*.log]
index = ocpnotifp3
sourcetype = dks-ocpnotifp3

[monitor:///var/log/containers/*_ocpcorep3_*.log]
index = ocpcorep3
sourcetype = ocpcorep3

[monitor:///var/log/containers/*_custcon2p3_*.log]
index = custcon2p3
sourcetype = custcon2p3

[monitor:///var/log/containers/*_custcon1p3_*.log]
index = custcon1p3
sourcetype = custcont1p3

[monitor:///var/log/containers/*_ctisap3_*.log]
index = ctisap3
sourcetype = dks-ctisap3

[monitor:///var/log/containers/*_ctisp1_*.log]
index = ctisp1
sourcetype = dks-ctisp1

[monitor:///var/log/containers/*_ivrp1_*.log]
index = ivrp1
sourcetype = dks-ivrp1

#[monitor:///host/containers/*/[a-f0-9]+-json.log$]
#index=dcp
#sourcetype=dner-logsiamanti-container-logs

#[monitor:///var/lib/docker/containers/*/[a-f0-9]+-json.log$]
#index=dcp
#sourcetype=diamanti-container-logs

[monitor:///var/log/containers/*_ocpnotifp3_*.log]
index = ocpnotifp3
sourcetype = ocpnotifp3

[monitor:///var/log/containers/*_ocpcorep3_*.log]
index = ocpcorep3
sourcetype = ocpcorep3

[monitor:///var/log/containers/*_custcon2p3_*.log]
index = custcon2p3
sourcetype = custcont2p3

[monitor:///var/log/containers/*_igridp2_*.log]
index = igridp2
sourcetype = dks-igridp2

## END of PROD

## Monitor all Diamanti logs
[monitor:///var/log/diamanti/.../*.log]
index = dcp
sourcetype = diamanti-system-logs

# Monitor Container logs
[monitor:///var/log/containers/*.log]
index = dcp
sourcetype = diamanti-container-logs
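A common approach (a sketch, untested against your environment) is to give the catch-all stanza a blacklist regex covering the patterns already claimed by the specific stanzas, so those files are not pulled into the dcp index as well:

```ini
[monitor:///var/log/containers/*.log]
index = dcp
sourcetype = diamanti-container-logs
# Exclude files already routed by the more specific stanzas above
blacklist = _(ctisp1|ocpprd|custconnectp1|ocpnotifp3|ocpcorep3|custcon2p3|custcon1p3|ctisap3|ivrp1|igridp2)_
```

blacklist takes a regular expression matched against the full file path; any file whose name contains one of those markers is skipped by this stanza but still picked up by its dedicated stanza.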