All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

I'm trying to run a test with data that I onboarded with the collect command. What I see is that when I insert an event that is exactly the same as existing events, the inserted data is not searchable by its fields, while the normally onboarded data is. My guess is that the transforms.conf configuration is not running on collect events, but I cannot figure out how to make sure it does. How can I force transforms.conf to also run on data onboarded with collect?
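One thing worth knowing here (a sketch of a commonly suggested approach; the index and sourcetype names below are placeholders): events written by collect are ingested with the stash sourcetype by default, which bypasses normal index-time processing, so index-time transforms do not run on them. Setting an explicit sourcetype on collect sends the events through the regular parsing pipeline again (note this also makes them count against your license):

```
... your search ...
| collect index=my_test_index sourcetype=my_real_sourcetype
```

Search-time extractions should still apply if the sourcetype on the collected events matches your props/transforms stanzas, so it is worth checking what sourcetype the collected events actually carry.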
I created a workflow action off of some netflow logs. I want to take the source IP from the netflow event and pass it to another search that looks at authentication logs from a different log source, to find the user who most recently authenticated PRIOR to the event I am triggering the workflow from. I can pass _time to the new search as latest=$_time$, but I cannot seem to set earliest to what I want (in this case, 4 hours before the passed $_time$ value). How can I properly set earliest to 4 hours before $_time$ so the workflow search looks back 4 hours from the event I am pivoting off of?
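Workflow action URIs do not evaluate arithmetic, so a commonly suggested workaround (a sketch; the field name earliest_4h is made up here) is to compute the lower time bound in the search that feeds the workflow action, then reference that field token instead:

```
... your netflow search ...
| eval earliest_4h = _time - 14400
```

Then set the workflow action's search to use earliest=$earliest_4h$ and latest=$_time$ (14400 seconds = 4 hours).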
While setting an alert action to webhook and giving the URL details, I am getting error logs like these. URL format: http://<IP>:<PORT>/alert
ERROR sendmodalert [46453 AlertNotifierWorker-0] - action=webhook STDERR - Error sending webhook request: HTTP Error 401:
action=webhook - Alert action script completed in duration=84 ms with exit code=2
sendmodalert [19216 AlertNotifierWorker-0] - action=webhook - Alert action script returned error code=2
Has anyone else faced a similar issue while setting up webhooks? A response would be appreciated. Thanks!
Is it possible to use different index names for each server? I would like to send the same logs from a Heavy Forwarder to two servers (Splunk Enterprise, Splunk Cloud). The logs to Splunk Cloud will be sent using the Credentials App. Will the configuration below accomplish this? Are there any corrections, or other ways to do this?
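For reference, a minimal outputs.conf sketch for sending the same data to both destinations (group names are placeholders; the Cloud group's settings typically come from the Credentials App). Note that the index name is assigned once at input time and travels with the event, so giving the two destinations different index names would require cloning the data (e.g. with CLONE_SOURCETYPE in transforms.conf) and rewriting the index on the clone:

```
# outputs.conf on the Heavy Forwarder (sketch, placeholder names)
[tcpout]
defaultGroup = onprem_group, cloud_group

[tcpout:onprem_group]
server = enterprise-idx.example.com:9997

[tcpout:cloud_group]
# settings provided by the Splunk Cloud Credentials App
```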
Hi, I have a search index=main sourcetype=data2 type=policy that gives me the following in JSON:
customerId: man0000
dns: false
ioc: true
type: policy
I have a CSV whose purpose is to show what the default settings should be across all customers:
Config Item, Config setting
DNS, Enabled
IOC, Disabled
We also have a list of customers in a database with the customerIds. So my search logic was as follows:
1. Search the index to bring back all the different results as a table.
2. Rename the results so that instead of dns I have DNS, instead of ioc I have IOC, etc.
3. | join customerId [| dbxquery query=.....] - to get customer IDs.
4. | inputlookup the CSV file (here is where I get stuck).
I don't know how to link them together so that for every customerId from the DB that matches the customerId in the search, I can compare the search results against the CSV (i.e. where ioc is true but the CSV says Disabled) and output those results. Any help would be appreciated. Thanks in advance.
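One possible shape for the comparison step (a sketch; the lookup file name defaults.csv and the boolean-to-setting mapping are assumptions): reshape each event so every config item becomes its own row, then look up the expected setting and keep only mismatches:

```
index=main sourcetype=data2 type=policy
| eval DNS=if(dns="true","Enabled","Disabled"), IOC=if(ioc="true","Enabled","Disabled")
| table customerId DNS IOC
| untable customerId "Config Item" actual_setting
| lookup defaults.csv "Config Item" OUTPUT "Config setting" as expected_setting
| where actual_setting != expected_setting
```

untable turns the per-customer columns into one row per (customerId, Config Item) pair, which makes the row-oriented defaults CSV directly joinable with lookup.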
Can someone tell me what I am doing wrong in this XML? <dashboard> <label>test Veracode</label> <row> <panel> <title>Severity by flaw</title> <chart> <search> <query>index="veracode_test" sourcetype="Veracode_scan" | lookup Veracode.csv findings{}.severity | stats count by Severity | append [| inputlookup Veracode.csv | fields Severity | stats count by Severity | eval count = 0] | stats max(count) as Total by Severity | eval sorter = case(Severity="Very High", 5, Severity="High", 4, Severity="medium", 3, Severity="Low",2, Severity="Very Low",1,1==1,99) | sort + sorter | fields - sorter</query> <earliest>0</earliest> <sampleRatio>1</sampleRatio> </search> <option name="charting.axisLabelsX.majorLabelStyle.overflowMode">ellipsisNone</option> <option name="charting.axisLabelsX.majorLabelStyle.rotation">0</option> <option name="charting.axisTitleX.visibility">visible</option> <option name="charting.axisTitleY.visibility">visible</option> <option name="charting.axisTitleY2.visibility">visible</option> <option name="charting.axisX.abbreviation">none</option> <option name="charting.axisX.scale">linear</option> <option name="charting.axisY.abbreviation">none</option> <option name="charting.axisY.scale">linear</option> <option name="charting.axisY2.abbreviation">none</option> <option name="charting.axisY2.enabled">0</option> <option name="charting.axisY2.scale">inherit</option> <option name="charting.chart">column</option> <option name="charting.chart.bubbleMaximumSize">50</option> <option name="charting.chart.bubbleMinimumSize">10</option> <option name="charting.chart.bubbleSizeBy">area</option> <option name="charting.chart.nullValueMode">gaps</option> <option name="charting.chart.showDataLabels">none</option> <option name="charting.chart.sliceCollapsingThreshold">0.01</option> <option name="charting.chart.stackMode">default</option> <option name="charting.chart.style">shiny</option> <option name="charting.drilldown">none</option> <option name="charting.fieldColors">{"Very High":#e60000,"High":ff0000,"meidum":#ff8000, "Low":#ffbf00,"Very Low":#ffff00 }</option> <option name="charting.layout.splitSeries">0</option> <option name="charting.layout.splitSeries.allowIndependentYRanges">0</option> <option name="charting.legend.labelStyle.overflowMode">ellipsisMiddle</option> <option name="charting.legend.mode">standard</option> <option name="charting.legend.placement">right</option> <option name="charting.lineWidth">2</option> <option name="trellis.enabled">0</option> <option name="trellis.scales.shared">1</option> <option name="trellis.size">medium</option> </chart> </panel>
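A few things stand out in this snippet (observations, not a tested fix): the pasted XML is missing its closing </row> and </dashboard> tags; the case() uses "medium" while fieldColors uses "meidum", and series names are case-sensitive; and the fieldColors JSON mixes #-prefixed and bare values. Simple XML fieldColors values are usually written as hex numbers, for example:

```
<option name="charting.fieldColors">{"Very High": 0xE60000, "High": 0xFF0000, "Medium": 0xFF8000, "Low": 0xFFBF00, "Very Low": 0xFFFF00}</option>
```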
I need to add multiple values from a CSV to a main search I have. I used the lookup command, but I think that only compares one field between the main search and the CSV, and I need to bring in more fields from the CSV to do some evals. Please help!
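For reference, lookup can return several CSV columns at once after matching on a single key (a sketch; the file and field names are made up):

```
... main search ...
| lookup mylookup.csv key_field OUTPUT extra_field1 extra_field2 extra_field3
| eval flagged = if(extra_field1="x" AND extra_field2="y", "yes", "no")
```

Every field listed after OUTPUT is added to each matching event, so all of them are available for later evals.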
Hi, I have been tasked to design an alert that triggers whenever the search query of an alert is modified. To achieve this, I decided on the following approach:
1. Compute the hash value of the search.
2. Create a lookup table (say, search_hash.csv).
3. Recompute the hash of the search (say, every 24 hours).
4. Compare the computed hash against the existing hash in the lookup table.
5. If there is a difference, REPLACE the value in the original lookup file search_hash.csv with the dynamically computed value.
I have been able to reach step 4, but I am stuck at step 5. Can someone please help me with how to achieve the last step of dynamically replacing values of a lookup with search results?
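For step 5, outputlookup overwrites the lookup file by default (append=false), so ending the scheduled comparison search with it replaces the stored hashes wholesale (a sketch; the field names are placeholders):

```
... search that computes the current hash per saved search ...
| table savedsearch_name search_hash
| outputlookup search_hash.csv
```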
I am looking to convert this regular search: index=foo action=blocked `macro` src_zone=foo | timechart count span=1d over to a search that leverages tstats and the Network Traffic data model, showing the count of blocked traffic per day for the past 7 days, due to the large volume of network events: | tstats count AS "Count of Blocked Traffic" from datamodel=Network_Traffic where (nodename = All_Traffic.Traffic_By_Action.Blocked_Traffic) All_Traffic.src_zone=foo groupby _time, All_Traffic.src_zone prestats=true How can I get this search to use timechart? Thanks.
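A commonly used pattern for this (a sketch) is to bucket _time inside tstats with a span, then hand the prestats output to timechart:

```
| tstats prestats=true count from datamodel=Network_Traffic
    where nodename=All_Traffic.Traffic_By_Action.Blocked_Traffic All_Traffic.src_zone=foo earliest=-7d@d
    by _time span=1d
| timechart span=1d count
```

With prestats=true the tstats output is in the internal format timechart expects; a rename at the end can relabel the series if needed.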
Hi all, I am not able to see the logs in Splunk from one source and one host. Use case: I have two hosts, host a and host b, with source=/app/opt/source/logs/sample.log. I can see the data from host a, but I cannot see the data from host b. Below is the inputs.conf stanza used:
[monitor:///app/opt/source/logs/sample.log]
sourcetype=app:sample:log
disabled=0
index=xxxx
blacklist= \.(.?:tar |gz)$
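One thing worth double-checking (an observation, not a confirmed fix): the blacklist regex as written contains a stray dot and space (`.?:tar |gz`), so it may not match what was intended. A cleaner equivalent for excluding .tar/.gz files would be:

```
[monitor:///app/opt/source/logs/sample.log]
sourcetype = app:sample:log
disabled = 0
index = xxxx
blacklist = \.(?:tar|gz)$
```

On host b it may also help to check splunkd.log and run `splunk list monitor` to confirm the file is actually being picked up there.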
Hello all, I am using the Splunk Add-on for Microsoft Cloud Services to create a new Event Hub input, and I would like to ask how to make sure I have set it up properly. We have an old one which is up and running. The idea is to create a new namespace and run both in parallel to make sure everything works before we shut down the old one. The new input was created successfully; however, I still cannot see any new source. Where am I going wrong? Do I need to manually create an entry in a config file? Thank you all!
My logs have a JSON field, like this:
{
  "foo": 5,
  "bar": {}
}
I'd like to filter out logs that have an empty JSON object for the "bar" field, as in the example above. How do I do that? I tried something like where len('bar{}') > 0 but it didn't work. Thank you so much.
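One approach that may work on Splunk 8.1 and later (a sketch; it assumes the whole event is the JSON payload): extract the bar object as raw JSON text with json_extract and compare it against the empty-object literal:

```
| eval bar_json = json_extract(_raw, "bar")
| where isnotnull(bar_json) AND bar_json != "{}"
```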
Hello all, I have created an ldapsearch query to load identity data. I want all existing lookup table entries to be deleted and the same table repopulated with the new entries obtained by a scheduled search (on a weekly basis). Let's say the lookup name is identities.csv. Can someone help me with the query to delete all existing entries in the lookup table (identities.csv)?
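Note that outputlookup overwrites the lookup file by default (append is false), so a weekly scheduled search ending in it effectively deletes the old entries and writes the new ones in a single step (a sketch):

```
... your scheduled ldapsearch query ...
| outputlookup identities.csv
```

There is usually no need for a separate delete step unless the schedule can produce zero results, in which case the file would be emptied.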
Hi, I use the search below, which has to be used only in real time. The goal of the search is to calculate a percentage. It works fine except for the performance, because the subsearch returns a lot of events:
inde=toto (sourcetype= titi OR sourcetype=tutu) web-status=405 | fields web-status | stats count as total by web-status | appendcols [ search inde=toto (sourcetype= titi OR sourcetype=tutu) web-status=* | fields web-status | stats count as total2 by web-status] | eval perc=(toto / toto2) * 100
What can I do, please?
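One way to avoid the appendcols subsearch entirely (a sketch; note also that the final eval should reference total/total2 rather than toto/toto2): count both the 405s and the overall total in a single pass with an eval-based count:

```
index=toto (sourcetype=titi OR sourcetype=tutu) web-status=*
| stats count(eval('web-status'="405")) as total, count as total2
| eval perc = round(total / total2 * 100, 2)
```

The single quotes around 'web-status' are needed inside eval because of the hyphen in the field name.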
Hi, I have a Splunk panel that takes ~20 seconds to load, but when I click on Inspect it tells me it took 0.7 seconds to run. When I pull the query out to plain SPL it does run fast; it is just the final visualization that seems to take extra time in the Splunk panel. Is there anything I can do here, or a way to monitor this? How do I fix the lag? For information, I am running Splunk 8.1 on one SH with 3 indexers. To add: once it is loaded it works fine; it's just the initial load that is very slow.
I am investigating higher CPU usage on my indexers, and am finding that this is a hard topic to really pinpoint. I run this search on my search head to identify searches and their resource consumption, but the results are confusing me:
index=_introspection host=* source=*/resource_usage.log* component=PerProcess data.process_type="search" | stats latest(data.pct_cpu) AS resource_usage_cpu latest(data.mem_used) AS resource_usage_mem by _time, data.search_props.type, data.search_props.mode, data.search_props.user, data.search_props.app, host, data.search_props.label, data.elapsed, data.search_props.search_head | sort - resource_usage_cpu
_time | type | mode | host | label | elapsed | search_head | resource_usage_cpu
2022-11-01 10:23:54.338 | scheduled | historical batch | idx04-k | Process-Creation-Events-DomainController | 1431.6000 | sh02-g | 95.40
2022-11-01 10:23:52.815 | scheduled | historical batch | idx03-k | Process-Creation-Events-DomainController | 1430.0200 | sh02-g | 115.50
2022-11-01 10:23:50.738 | scheduled | historical batch | idx05-k | Process-Creation-Events-DomainController | 1427.9800 | sh02-g | 105.70
2022-11-01 10:23:46.748 | scheduled | historical batch | idx03-g | Process-Creation-Events-DomainController | 1424.0400 | sh02-g | 101.90
2022-11-01 10:23:45.081 | scheduled | historical batch | idx02-k | Process-Creation-Events-DomainController | 1422.3200 | sh02-g | 97.90
From this, I can see that the search: 1) was triggered from sh02; 2) was executed across several of my indexers; 3) took ~1500 seconds to run; 4) consumed ~1 core on each instance. BUT: the search is scheduled for once a day, and that time is not 10:23; it is scheduled for 11:00 (no window). There are dozens of "instances" of this search being executed on all 10 of my indexers, triggered by sh02, in the ~10:22 timeframe. One row per indexer in the table above might make sense, but this is so many. What is happening here?
How do I read these results to make a sane performance judgement about this situation?
Hello, I have Splunk Enterprise installed on my system, and I want to use this Splunk instance from another system connected to the same LAN. Is that possible? If it is, can you help us with how to achieve it? I saw a related post suggesting it can be done using the IP address, but that did not work for me. Please help us.
Hi all, I'm trying to create a category based on host type (Lab, Personal, Staff) and get workstations counted for each category. I tried the search below and it gives the desired results; however, it doesn't work when I apply a boolean expression (OR) to match more patterns within a certain category.
<base search>| eval category = case(match(host,"ABC-*"),"Staff",match(host,"DESKTOP*" OR host,"PC-*"),"Lab",true(),"Personal")|stats count by category,host|sort -count|stats sum(count) as Total list(host) as Workstation_Name list(count) as count by category|where Total>1|sort Total
Expected result:
category | Total | Workstation_Name | count
Staff | 5 | ABC123 | 2
      |   | ABC345 | 3
Lab | 2 | DESKTOP123 | 1
    |   | PC123 | 1
Personal | 1 | Etc... | 1
Any help would be much appreciated!
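match() takes a single regular expression (not wildcards), so the OR belongs inside the pattern as an alternation rather than between two field references (a sketch of the corrected eval):

```
<base search>
| eval category = case(
    match(host, "^ABC-"), "Staff",
    match(host, "^(DESKTOP|PC-)"), "Lab",
    true(), "Personal")
| stats count by category, host
| stats sum(count) as Total, list(host) as Workstation_Name, list(count) as count by category
| where Total > 1
| sort Total
```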
Hi Splunk Community, I need help checking whether my directory field matches a regex. The regex I used is ^\w+:\\root_folder\\((?:(?!excluded_folder).)*?)\\ to check that the file path does not belong to excluded_folder. Example:
c:\root_folder\excluded_folder\...\...\...\file is False
d:\root_folder\subfolder\...\...\...\file is True
Could anyone please help? Much appreciated!
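A somewhat simpler pattern for this check (a sketch; note that inside an eval string each literal backslash in the regex is written as \\\\) is a negative lookahead that rejects paths whose first folder under root_folder is the excluded one:

```
| eval outside_excluded = if(match(directory, "^\w+:\\\\root_folder\\\\(?!excluded_folder\\\\)"), "True", "False")
```

This only excludes excluded_folder when it sits directly under root_folder; if it must be excluded at any depth, the lookahead would need to scan the whole path instead.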
As mentioned in the title, the collect command is not able to add an event to a particular source of an index. The collect command is able to add an event to sources like XmlWinEventLog:Security or XmlWinEventLog:Application, but it is unable to add that same event to XmlWinEventLog:Microsoft-Windows-Sysmon/Operational. No error is shown, but the index won't have that event. Sample code is shown below.
| makeresults | eval _raw="<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Sysmon' Guid='{5770385F-C22A-43E0-BF4C-06F5698FFBD9}'/><EventID>1</EventID><Version>5</Version><Level>4</Level><Task>1</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime='2021-03-12T04:12:31.706558800Z'/><EventRecordID>1352199</EventRecordID><Correlation/><Execution ProcessID='2296' ThreadID='4076'/><Channel>Microsoft-Windows-Sysmon/Operational</Channel><Computer>win-dc-293.attackrange.local</Computer><Security UserID='S-1-5-18'/></System><EventData><Data Name='RuleName'>-</Data><Data Name='UtcTime'>2021-03-12 04:12:31.704</Data><Data Name='ProcessGuid'>{110B94A8-EA2F-604A-4C05-00000000B001}</Data><Data Name='ProcessId'>2288</Data><Data Name='Image'>C:\Windows\System32\cmd.exe</Data><Data Name='FileVersion'>10.0.14393.0 (rs1_release.160715-1616)</Data><Data Name='Description'>Windows Command Processor</Data><Data Name='Product'>Microsoft® Windows® Operating System</Data><Data Name='Company'>Microsoft Corporation</Data><Data Name='OriginalFileName'>Cmd.Exe</Data><Data Name='CommandLine'>C:\Windows\system32\cmd.exe /C quser</Data><Data Name='CurrentDirectory'>c:\windows\system32\inetsrv\</Data><Data Name='User'>NT AUTHORITY\SYSTEM</Data><Data Name='LogonGuid'>{110B94A8-E38E-604A-E703-000000000000}</Data><Data Name='LogonId'>0x3e7</Data><Data Name='TerminalSessionId'>0</Data><Data Name='IntegrityLevel'>System</Data><Data Name='Hashes'>MD5=F4F684066175B77E0C3A000549D2922C,SHA256=935C1861DF1F4018D698E8B65ABFA02D7E9037D8F68CA3C2065B6CA165D44AD2,IMPHASH=3062ED732D4B25D1C64F084DAC97D37A</Data><Data Name='ParentProcessGuid'>{110B94A8-E45C-604A-3701-00000000B001}</Data><Data Name='ParentProcessId'>10332</Data><Data Name='ParentImage'>C:\Windows\System32\inetsrv\w3wp.exe</Data><Data Name='ParentCommandLine'>c:\windows\system32\inetsrv\w3wp.exe -ap 'MSExchangeOWAAppPool' -v 'v4.0' -c 'C:\Program Files\Microsoft\Exchange Server\V15\bin\GenericAppPoolConfigWithGCServerEnabledFalse.config' -a \\.\pipe\iisipm47dec653-b876-4ff7-964d-67331a8bd96f -h 'C:\inetpub\temp\apppools\MSExchangeOWAAppPool\MSExchangeOWAAppPool.config' -w '' -m 0</Data></EventData></Event>" | collect index="some_index" host="some_host" sourcetype="xmlwineventlog" source="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational"
Could it be due to a minor breaker? Please let me know the possible causes of this issue. Thanks!
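Two things worth checking (suggestions, not confirmed causes): collect supports testmode=true, which shows what would be written without actually indexing it, so you can verify the event survives intact; and since collect re-parses the timestamp from the raw text, this 2021 event will be indexed with a March 2021 _time, so it only appears when searching over All Time:

```
... the makeresults/eval from above ...
| collect index="some_index" host="some_host" sourcetype="xmlwineventlog" source="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational" testmode=true
```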