All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello all, I have created an LDAP search query to load identity data. I want all existing lookup table entries to be deleted and the same table repopulated with the new entries obtained by a scheduled search (run weekly). Let's say the lookup name is identities.csv. Can someone help me with the query to delete all existing entries from the lookup table (identities.csv)?
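One approach worth sketching: outputlookup overwrites the lookup file by default (append=false), so a scheduled search that ends in outputlookup replaces every existing row with the new results; no separate delete step is needed. The index, sourcetype, and field names below are placeholders for the actual LDAP search.

```
index=ldap_identities sourcetype=ldap:query
| table identity, first, last, email
| outputlookup identities.csv
```

Scheduled weekly, this rewrites the whole file each run, which has the same effect as deleting the old entries first.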
Hi, I use the search below, which has to be run only in real time. The goal of the search is to calculate a percentage. It works fine except for the performance, because the subsearch returns a lot of events.

    index=toto (sourcetype=titi OR sourcetype=tutu) web-status=405
    | fields web-status
    | stats count as total by web-status
    | appendcols
        [ search index=toto (sourcetype=titi OR sourcetype=tutu) web-status=*
        | fields web-status
        | stats count as total2 by web-status ]
    | eval perc=(total / total2) * 100

What can I do, please?
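Since the outer search and the subsearch scan the same events, one possible rewrite is a single pass that counts everything once and counts the 405s with an eval inside stats, avoiding the subsearch entirely. A sketch, keeping the original index and sourcetype names (adjust the 405 comparison to a quoted "405" if the field is extracted as a string):

```
index=toto (sourcetype=titi OR sourcetype=tutu) web-status=*
| stats count as total2, count(eval('web-status'==405)) as total
| eval perc=round((total / total2) * 100, 2)
```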
Hi, I have a Splunk panel that takes ~20 seconds to load, but when I click on Inspect it tells me it took 0.7 seconds to run. When I pull it out to plain SPL it does run fast; it is just the final visualization that seems to take extra time in the Splunk panel. Is there anything I can do here, or a way to monitor this? How do I fix the lag? For information, I am running Splunk 8.1 on one SH with 3 indexers. To add: once it is loaded it works fine; it's just the initial load that is very slow.
I am investigating higher CPU usage on my indexers, and am finding that this is a hard topic to really pinpoint. I run this search on my search head to identify the different searches and their resource consumption, but the results are confusing me.

    index=_introspection host=* source=*/resource_usage.log* component=PerProcess data.process_type="search"
    | stats latest(data.pct_cpu) AS resource_usage_cpu latest(data.mem_used) AS resource_usage_mem
        by _time, data.search_props.type, data.search_props.mode, data.search_props.user,
           data.search_props.app, host, data.search_props.label, data.elapsed, data.search_props.search_head
    | sort - resource_usage_cpu

    _time                    type       mode              host     label                                      elapsed    search_head  resource_usage_cpu
    2022-11-01 10:23:54.338  scheduled  historical batch  idx04-k  Process-Creation-Events-DomainController  1431.6000  sh02-g       95.40
    2022-11-01 10:23:52.815  scheduled  historical batch  idx03-k  Process-Creation-Events-DomainController  1430.0200  sh02-g       115.50
    2022-11-01 10:23:50.738  scheduled  historical batch  idx05-k  Process-Creation-Events-DomainController  1427.9800  sh02-g       105.70
    2022-11-01 10:23:46.748  scheduled  historical batch  idx03-g  Process-Creation-Events-DomainController  1424.0400  sh02-g       101.90
    2022-11-01 10:23:45.081  scheduled  historical batch  idx02-k  Process-Creation-Events-DomainController  1422.3200  sh02-g       97.90

From this, I can see that the search:

1) was triggered from sh02,
2) was executed across several of my indexers,
3) took ~1500 seconds to run,
4) consumed ~1 core on each instance.

BUT: the search is scheduled for once a day, and that time is not 10:23; it is scheduled for 11:00 (no window). There are dozens of "instances" of this search being executed on all 10 of my indexers, triggered by sh02, in the ~10:22 timeframe. Maybe one row in the table above per indexer might make sense, but this is so many. What is happening here?
How do I read these results to make a sane performance judgement about this situation?
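On the data itself: resource_usage.log is sampled periodically (on the order of every 10 seconds) for every running process, so one long-running search produces one PerProcess row per sample per indexer; dozens of rows for a ~1500 second search is expected, and each _time is a sampling time, not a start time. To judge a search as a whole it may help to group by its search ID rather than by _time. A sketch, assuming the events carry a data.search_props.sid field:

```
index=_introspection host=* source=*resource_usage.log* component=PerProcess data.process_type="search"
| stats max(data.pct_cpu) as peak_cpu, avg(data.pct_cpu) as avg_cpu, max(data.elapsed) as runtime
    by data.search_props.sid, data.search_props.label, data.search_props.search_head, host
| sort - peak_cpu
```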
Hello, I have Splunk Enterprise installed on my system, and I want to use this Splunk from another system connected to the same LAN network. Is it possible? If it is, can you help us with how to achieve this? I saw a random post related to this which said it can be done using the IP address, but it didn't work out for me. Please help us.
Hi all, I'm trying to create a category based on host category (Lab, Personal, Staff) and get workstations counted for each category. I tried the search below and it gives the desired results; however, it doesn't work when I apply a boolean expression (OR) to match more hosts in a certain category.

    <base search>
    | eval category = case(match(host,"ABC-*"),"Staff", match(host,"DESKTOP*" OR host,"PC-*"),"Lab", true(),"Personal")
    | stats count by category, host
    | sort -count
    | stats sum(count) as Total list(host) as Workstation_Name list(count) as count by category
    | where Total>1
    | sort Total

Expected result:

    category  | Total | Workstation_Name | count
    Staff     | 5     | ABC123           | 2
              |       | ABC345           | 3
    Lab       | 2     | DESKTOP123       | 1
              |       | PC123            | 1
    Personal  | 1     | Etc...           | 1

Any help would be much appreciated!
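The likely cause of the failure above: match() takes a single regular expression as its second argument, so an OR has to join two complete match() calls (and the patterns are regexes, not wildcards, so `^ABC-` rather than `ABC-*`). A sketch of the corrected eval:

```
<base search>
| eval category = case(
    match(host, "^ABC-"), "Staff",
    match(host, "^DESKTOP") OR match(host, "^PC-"), "Lab",
    true(), "Personal")
| stats count by category, host
| stats sum(count) as Total, list(host) as Workstation_Name, list(count) as count by category
| where Total > 1
| sort Total
```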
Hi Splunk Community, I need help checking whether my directory field matches a regex. The regex I used is

    ^\w+:\\root_folder\\((?:(?!excluded_folder).)*?)\\

to check that the file path does not belong to excluded_folder. Example:

    c:\root_folder\excluded_folder\...\...\...\file  is False
    d:\root_folder\subfolder\...\...\...\file        is True

Could anyone please help? Much appreciated!
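If the goal is simply "under root_folder but not under excluded_folder", a plainer check than the tempered-greedy pattern may be two match() calls. One caveat when moving the regex into an eval string: each literal backslash in the path has to be written as four backslashes (one escaping level for the string literal, one for the regex engine). A sketch, assuming the field is named directory:

```
...
| eval is_allowed = if(match(directory, "^\w+:\\\\root_folder\\\\")
                       AND NOT match(directory, "\\\\excluded_folder\\\\"),
                       "True", "False")
```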
As mentioned in the title above, collect command is not able to add an event to a source of an index. The collect command is able to add an event to sources like XmlWinEventLog:Security or XmlWinEventLog:Application but it is unable to add that same event to XmlWinEventLog:Microsoft-Windows-Sysmon/Operational. No error will be shown but the index won't have that event.  Sample code is shown below.     | makeresults | eval _raw="<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Sysmon' Guid='{5770385F-C22A-43E0-BF4C-06F5698FFBD9}'/><EventID>1</EventID><Version>5</Version><Level>4</Level><Task>1</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime='2021-03-12T04:12:31.706558800Z'/><EventRecordID>1352199</EventRecordID><Correlation/><Execution ProcessID='2296' ThreadID='4076'/><Channel>Microsoft-Windows-Sysmon/Operational</Channel><Computer>win-dc-293.attackrange.local</Computer><Security UserID='S-1-5-18'/></System><EventData><Data Name='RuleName'>-</Data><Data Name='UtcTime'>2021-03-12 04:12:31.704</Data><Data Name='ProcessGuid'>{110B94A8-EA2F-604A-4C05-00000000B001}</Data><Data Name='ProcessId'>2288</Data><Data Name='Image'>C:\Windows\System32\cmd.exe</Data><Data Name='FileVersion'>10.0.14393.0 (rs1_release.160715-1616)</Data><Data Name='Description'>Windows Command Processor</Data><Data Name='Product'>Microsoft® Windows® Operating System</Data><Data Name='Company'>Microsoft Corporation</Data><Data Name='OriginalFileName'>Cmd.Exe</Data><Data Name='CommandLine'>C:\Windows\system32\cmd.exe /C quser</Data><Data Name='CurrentDirectory'>c:\windows\system32\inetsrv\</Data><Data Name='User'>NT AUTHORITY\SYSTEM</Data><Data Name='LogonGuid'>{110B94A8-E38E-604A-E703-000000000000}</Data><Data Name='LogonId'>0x3e7</Data><Data Name='TerminalSessionId'>0</Data><Data Name='IntegrityLevel'>System</Data><Data 
Name='Hashes'>MD5=F4F684066175B77E0C3A000549D2922C,SHA256=935C1861DF1F4018D698E8B65ABFA02D7E9037D8F68CA3C2065B6CA165D44AD2,IMPHASH=3062ED732D4B25D1C64F084DAC97D37A</Data><Data Name='ParentProcessGuid'>{110B94A8-E45C-604A-3701-00000000B001}</Data><Data Name='ParentProcessId'>10332</Data><Data Name='ParentImage'>C:\Windows\System32\inetsrv\w3wp.exe</Data><Data Name='ParentCommandLine'>c:\windows\system32\inetsrv\w3wp.exe -ap 'MSExchangeOWAAppPool' -v 'v4.0' -c 'C:\Program Files\Microsoft\Exchange Server\V15\bin\GenericAppPoolConfigWithGCServerEnabledFalse.config' -a \\.\pipe\iisipm47dec653-b876-4ff7-964d-67331a8bd96f -h 'C:\inetpub\temp\apppools\MSExchangeOWAAppPool\MSExchangeOWAAppPool.config' -w '' -m 0</Data></EventData></Event>" | collect index="some_index" host="some_host" sourcetype="xmlwineventlog" source="XmlWinEventLog:Microsoft-Windows-Sysmon/Operational"     Could it be due to minor breaker? Please do let me know the possible causes for this issue. Thanks!      
Good afternoon! The infrastructure team gave me permissions so that I can add a dashboard tab to my application, but I can't find where this is done. Please advise.
Hi, I have used eval with multiple if conditions and it's failing. Kindly help.

    source = "2access_30DAY.log"
    | eval new_field = if(status==200, "I love you Suman", "I love you Cloeh", if(status==403, "Suman Cloeh", "Cloeh Suman"))
    | table status, new_field

Regards, Suman P.
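The likely error in the eval above: if() takes exactly three arguments (condition, true-result, false-result), so the fourth argument is a syntax error, and the nested if() belongs in the third slot. A sketch assuming the intent was 200 → first message, 403 → second, anything else → third (the original's extra "I love you Cloeh" branch cannot be expressed in a single three-argument if; case() is the cleaner form when there are several outcomes):

```
source="2access_30DAY.log"
| eval new_field = if(status==200, "I love you Suman",
                      if(status==403, "Suman Cloeh", "Cloeh Suman"))
| table status, new_field
```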
Hello,

    index=_audit user=admin action=search info=granted search=*
    | table search_id search
    | replace "'search *" WITH "*" IN search
    | replace "*'" WITH "*" IN search

I extracted the following result with this search:

    search_id  search
    [ID1]      [SPL1]
    [ID2]      [SPL2]
    [ID3]      [SPL3]

I want to extract a count for each value of the search field by searching again:

    search_id  search  count
    [ID1]      [SPL1]  [SPL1-count]
    [ID2]      [SPL2]  [SPL2-count]
    [ID3]      [SPL3]  [SPL3-count]

I'd appreciate it if you could help me.
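If the goal is one row per distinct search string with how many times it ran, stats can group on the cleaned search field directly, replacing the table command. A sketch; values() keeps the associated search IDs, which becomes multivalue when the same SPL ran more than once:

```
index=_audit user=admin action=search info=granted search=*
| replace "'search *" WITH "*" IN search
| replace "*'" WITH "*" IN search
| stats values(search_id) as search_id, count by search
| sort - count
```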
Hi, I wrote an eval command and it's not working. Kindly help.

    source = "2access_30DAY.log"
    | eval "new_field" = case('status'=200, 'Suman and Cloeh are best couple')
    | table "status" "new_field"

Regards, Suman P.
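Two details of eval syntax are the likely problem: in eval, single quotes denote field names (so 'Suman and Cloeh are best couple' is read as a field reference, not a string literal), string literals need double quotes, and comparison uses ==. A sketch; note that without a true() default, case() returns null for non-matching rows, so the "other status" branch here is an added assumption:

```
source="2access_30DAY.log"
| eval new_field = case(status==200, "Suman and Cloeh are best couple",
                        true(), "other status")
| table status, new_field
```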
What is the best way to validate that all data from an indexer has been uploaded to SmartStore?
Hi experts, has anyone by any chance integrated NiFi with Splunk using the InvokeHTTP processor? For this testing I'm generating a self-signed certificate using OpenSSL to test HTTPS from Splunk. Where do I generate my self-signed certs and CA? Should I follow this link? https://docs.splunk.com/Documentation/Splunk/9.0.1/Security/Howtoself-signcertificates I'd appreciate any help or resource I can reference.
Hi, we have a custom TA to collect some logs from a Windows server. This morning I noticed that Splunk is actually swapping the day and month. (Note: the time difference is from a different time zone and shouldn't be a problem.) For example:

    1/11/22 9:59:30.447 AM  Src1 [01/11/2022 08:59:30.447]
    1/11/22 9:59:30.447 AM  Src1 [01/11/2022 08:59:30.447]

It was working before the event time turned to 01/11/2022 00:00:00. Last logging:

    11/1/22 12:59:30.548 AM  Src1 [31/10/2022 23:59:30.548]

Our props.conf looks like this:

    DATETIME_CONFIG =
    LINE_BREAKER = ([\r\n]+)
    TIME_FORMAT = %d/%m/%Y %H:%M:%S
    MAX_TIMESTAMP_LOOKAHEAD = 125
    SHOULD_LINEMERGE = false
    TIME_PREFIX = Src1\s+\[

Any suggestion will be appreciated. Thanks.
I'm trying to get data in from a server via a PowerShell script. I have another app already doing something similar on the same server, but for some reason (that is driving me up the wall) I constantly get the error:

    ERROR Executing script=. "$SplunkHome\etc\apps\vmwareinventory\appserver\static\vmguests.ps1" for stanza=VMWare-Guests failed with exception=The system cannot find the path specified.

I have checked and double-checked the spellings and the system variables (for Splunk_Home), renamed the files and the directories, and still it comes up with this error. I have followed the instructions from the Splunk documentation to the letter and still the error persists. This is my input stanza:

    [powershell://VMWare-Guests]
    script = . "$SplunkHome\etc\apps\vmwareinventory\appserver\static\vmguests.ps1"
    schedule = 30 0 * * *
    sourcetype = vm:inventory
    index = vmware
    disabled = false

Can someone please tell me why this would be happening with this script but not others on the same server? TIA
Hello everyone, I am trying to configure Splunk Security Essentials, but it is completely blank. When I click on Data Inventory, the whole page is blank with only the title "Data Inventory" showing. Has anyone come across this before, or might know how to troubleshoot it? I have tried updating the app, but that didn't do anything.
I need to be able to split multiple fields that have a delimiter of "|#|". The field names will differ depending on the log. Is there a way to do a mass split using props.conf or transforms.conf? Is there a way to do this without having to write an eval statement for every single field that may come? Example log:

    time=XXXX,src_ip="123.123.123.123|#|234.234.234.234|#|",user="foo1|#|foo2|#|foo3"...

I want to split src_ip and user.
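For a known (or wildcarded) set of fields, foreach can apply the same split() to each field without a separate eval per field, which may cover the "mass split" requirement at search time. A sketch; note the trailing |#| in src_ip will produce an empty last value that may need filtering afterwards:

```
...
| foreach src_ip user
    [ eval <<FIELD>> = split('<<FIELD>>', "|#|") ]
```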
Please help. The first search query is where I get a value from the result (the value can be in any one of three fields):

    index=index1 | table SQ1-user SQ1-field1 SQ1-field2 SQ1-field3

    SQ1-user  SQ1-field1  SQ1-field2  SQ1-field3
    john      null        null        apple
    jane      null        orange      null
    doe       banana      null        null

From that value, I want to check whether it exists in another search query (the value can be in any field):

    index=index2 | where ANY_FIELD=SQ1-field1 OR ANY_FIELD=SQ1-field2 OR ANY_FIELD=SQ1-field3

    SQ2-ID  SQ2-field1  SQ2-field2  SQ2-field3
    001     null        apple       null
    002     banana      null        null

If it exists in the second query, I want a new field on my first query that holds the ID of where it was found, or "NOT FOUND":

    SQ1-user  SQ1-field1  SQ1-field2  SQ1-field3  (NEW FIELD) SQ2-ID
    john      null        null        apple       001
    jane      null        orange      null        NOT FOUND
    doe       banana      null        null        002
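One hedged approach: since each row appears to hold exactly one non-null value, coalesce() can collapse the three fields to a single value on both sides, after which a left join carries the ID across and a final coalesce supplies "NOT FOUND". Field names follow the question; join has subsearch-size limits, so a lookup-based variant may be needed at scale:

```
index=index1
| eval match_value = coalesce('SQ1-field1', 'SQ1-field2', 'SQ1-field3')
| join type=left match_value
    [ search index=index2
      | eval match_value = coalesce('SQ2-field1', 'SQ2-field2', 'SQ2-field3')
      | table match_value, SQ2-ID ]
| eval 'SQ2-ID' = coalesce('SQ2-ID', "NOT FOUND")
| table SQ1-user, SQ1-field1, SQ1-field2, SQ1-field3, SQ2-ID
```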
In my dashboard I need to add multiple custom URLs, but Drilldown only allows me to add one custom URL. Is there any way I can use XML to add more custom URLs? Below is my XML code:

    <row>
      <panel>
        <table>
          <search>
            <query>index="main" sourcetype="cisco.json" findings{}.issue_type=* findings{}.cwe_id=* findings{}.severity=*
    | table findings{}.severity findings{}.cwe_id findings{}.issue_type findings{}.flaw_details_link
    | rename findings{}.severity as Severity1, findings{}.cwe_id as CWE_ID1, findings{}.issue_type AS Name1 findings{}.flaw_details_link AS "More_Info"
    | eval Severity = mvdedup(Severity1)
    | eval CWE_ID = mvdedup(CWE_ID1)
    | eval Name = mvdedup(Name1)
    | eval More Info = mvdedup(More_Info)
    | table Severity CWE_ID Name "More Info"</query>
            <earliest>$field1.earliest$</earliest>
            <latest>$field1.latest$</latest>
            <sampleRatio>1</sampleRatio>
          </search>
          <option name="count">50</option>
          <option name="dataOverlayMode">none</option>
          <option name="drilldown">cell</option>
          <option name="percentagesRow">false</option>
          <option name="refresh.display">progressbar</option>
          <option name="rowNumbers">false</option>
          <option name="totalsRow">false</option>
          <option name="wrap">true</option>
          <drilldown>
            <link target="_blank">https://downloads.cisco.com/securityscan/cwe/v4/xmla/78.html</link>
          </drilldown>
        </table>
      </panel>
    </row>
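Simple XML does support multiple drilldown destinations via condition elements keyed on the clicked column, with the $click.value2$ token holding the clicked cell's value. A sketch based on the existing link (substituting the token for the hard-coded 78; the second URL is purely illustrative):

```xml
<drilldown>
  <condition field="CWE_ID">
    <link target="_blank">https://downloads.cisco.com/securityscan/cwe/v4/xmla/$click.value2$.html</link>
  </condition>
  <condition field="Severity">
    <link target="_blank">https://example.com/severity/$click.value2$</link>
  </condition>
</drilldown>
```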