All Posts



I'd like to know what use cases are applied on Splunk Enterprise.
How large is your CSV?
Hello. Thank you for all your help and support.

In a registered lookup table file (CSV), if I want to match the value of a specific field against two columns, how should I set the input fields on the automatic lookup setup screen?

For example, I have the following columns in my table file:

PC_Name,MacAddress1,MacAddress2

The MAC address in the Splunk index log resides in either MacAddress1 or MacAddress2 in the table file. Therefore, I want to search both columns and return the PC_Name of the matching record.

As a test, I tried setting the following two input fields on the lookup settings screen of the GUI, but PC_Name did not appear in the search result fields (see attached image). If only one of these input fields is set, PC_Name is output:

MACAddr1 = Mac address
MACAddr2 = Mac address

So, as a workaround, I split the lookup configuration into two lookup definitions, setting MACAddr1 = MacAddress in one and MACAddr2 = MacAddress in the other, and the search results are displayed. However, this is not elegant. Note that the lookup is configured from the Splunk Web UI. What is the best way to configure this?
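A search-time sketch of the two-lookup workaround described above, combined into one result field: call the lookup once per MAC column and coalesce the outputs. The lookup name pc_lookup, the index name, and the event field name MacAddress are assumptions; substitute your own.

```
index=your_index sourcetype=your_sourcetype
| lookup pc_lookup MacAddress1 AS MacAddress OUTPUTNEW PC_Name AS PC_Name1
| lookup pc_lookup MacAddress2 AS MacAddress OUTPUTNEW PC_Name AS PC_Name2
| eval PC_Name = coalesce(PC_Name1, PC_Name2)
| fields - PC_Name1 PC_Name2
```

This keeps a single search instead of two automatic lookup definitions, at the cost of doing the matching in the search rather than automatically.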
I have a dashboard which gives the error below at the user's end, but when I open the dashboard I don't see any error at my end and it runs perfectly fine with the proper results:

Error in 'lookup' command: Could not construct lookup 'EventCodes, EventCode, LogName, OUTPUTNEW, desc'. See search.log for more details.
Eventtype 'msad-rep-errors' does not exist or is disabled.

Please help me fix this issue.
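Since the dashboard works for the owner but not for other users, this is typically a permissions issue: the eventtype (or the lookup it references) is private to the owner's user/app context. As a sketch, the affected user can check whether the eventtype is visible to them via the REST endpoint for saved eventtypes (the endpoint is standard; verify the sharing and read permissions shown for your environment):

```
| rest /services/saved/eventtypes splunk_server=local
| search title="msad-rep-errors"
| table title eai:acl.app eai:acl.sharing eai:acl.perms.read
```

If no row comes back for that user, share the eventtype (and the EventCodes lookup) at the app or global level from Settings > Event types / Lookups.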
Hi Manall,

I want to use the NAS as read storage, i.e. as cold storage, not hot. BTW, it has worked fine for me so far.
Thank you for your response and the suggestion about a custom modular input. We will look into that further.
Hi @Player01,

Are all your searches slow, or only this one? How many logs do you have in your index and in your lookup? What storage do you have? Is it compliant with the Splunk requirement of at least 800 IOPS?

Ciao.
Giuseppe
Hi @atr,

As @deepakc said, check and disable the local firewall. Then look at /opt/splunk/var/log/splunk/first_install.log to see if anything is reported there. If you continue to have issues, open a case with Splunk Support, but I'm confident that the issue is the local firewall.

Ciao.
Giuseppe
@ITWhisperer I tried to roll the buckets and it's fine, but the error returns after a short time. This is what I meant.
Yes, these are values for id in my events.
Start by checking that port 8000 is available (check your firewall ports). The SSH issue is again most likely related to port 22 access (but this is not a Splunk issue). Speak to a Linux OS admin, as your issues seem to be OS configuration related.
The problem is actually deeper, because appendcols works only if the lookup and the index search have the same number of rows (and sort order). In this use case, that's the opposite of the premise. I will have to look deeper, but there should be something; it could be even more cumbersome.
If the lookup is of any size and the number of events from the index search is large, yes, this will take very long, because it is effectively performing

| search url = "foo.com" OR url = "bar.com" OR url = "barz" OR ...

Moving the subsearch into the index search can save some time because you won't be streaming as many events:

index=myindex sourcetype=mysource [|inputlookup LCL_url.csv | fields url]
| stats count by url
| fields - count
| sort url

However, using a lookup file as a subsearch may not be the best option to begin with.
This has nothing to do with now(). Your strptime receives the literal string "First Discovered" and tries to calculate a time from it. The result is null, of course. Change the double quotes to single quotes:

index=stuff source=file Severity="Critical"
| lookup detail.csv "IP Address" OUTPUTNEW Manager
| eval First_DiscoveredTS = strptime('First Discovered', "%b %d, %Y %H:%M:%S %Z"), Last_ObservedTS = strptime('Last Observed', "%b %d, %Y %H:%M:%S %Z"), firstNowDiff = (now() - First_DiscoveredTS)/86400, Days = floor(firstNowDiff)
| stats count by Manager Days
| where Days > 30

Play with this emulation and compare with real data:

| makeresults format=csv data="First Discovered, Last Observed
\"Jul 26, 2023 16:50:26 UTC\", \"Jul 19, 2024 09:06:32 UTC\""
| appendcols [makeresults format=csv data="Manager
foo"]
``` the above emulates index=stuff source=file Severity="Critical" | lookup detail.csv "IP Address" OUTPUTNEW Manager ```

Output from this is:

Manager  Days  count
foo      362   1
Like this?

| rex "message: \((?<in_parentheses>[^\)]+)"

You can test with:

| makeresults format=csv data="_raw
message: (c4328dd3-d16e-4df8-a8e6-b2ebcab9d8bc)" ``` data emulation above ```
| rex "message: \((?<in_parentheses>[^\)]+)"

_raw                                             in_parentheses
message: (c4328dd3-d16e-4df8-a8e6-b2ebcab9d8bc)  c4328dd3-d16e-4df8-a8e6-b2ebcab9d8bc
Hi, I can see Splunk is vulnerable to OpenSSL 1.0.2zk. I've applied the latest 9.2.2 on Splunk Enterprise and the Universal Forwarder, but they are still running the older 1.0.2zj version. Any ideas when this will be remediated?

OpenSSL Bulletin on 26 June [ Vulnerabilities ] - /news/vulnerabilities-1.0.2.html (openssl.org)

From the Splunk advisory, the latest OpenSSL-related update was in March, for the zj version.
@richgalloway It is still not working. Here is the XML code of my input:

<input type="multiselect" token="field2">
  <label>field2</label>
  <choice value="*">All</choice>
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter> OR </delimiter>
  <fieldForLabel>Sub_Competency</fieldForLabel>
  <fieldForValue>Sub_Competency</fieldForValue>
  <search>
    <query>| inputlookup cyber_q1_available_hours.csv | rename "Sub- Competency" as Sub_Competency | dedup Sub_Competency | table Sub_Competency</query>
    <earliest>-24h@h</earliest>
    <latest>now</latest>
  </search>
</input>

XML code of the main search:

<query>| inputlookup cyber_q1_available_hours.csv
| rename "Sub- Competency" as Sub_Competency
| search Sub_Competency IN ("$sub_competency$")
| eval split_name=split('Resource Name', ",")
| eval first_name=mvindex(split_name,1)
| eval last_name=mvindex(split_name,0)
| eval Resource_Name=trim(first_name) . " " . trim(last_name)
| stats count, values(Sub_Competency) as Sub_Competency values(Competency) as Competency values("FWD Looking Util") as FWD_Util values("YTD Util") as YTD_Util by Resource_Name
| search Competency="$selected_competency$"
| table Resource_Name, Competency, Sub_Competency, FWD_Util, YTD_Util
| sort FWD_Util</query>
My raw log says "message: (c4328dd3-d16e-4df8-a8e6-b2ebcab9d8bc)". I want to extract everything inside the parentheses ( ). Thanks in advance.
I have a CSV that gets loaded weekly... the timestamps for events are set on load. However, this file has multiple time fields (first discovered, last seen, etc.). I am attempting to find those events (based on those fields) that are older than 30 days, for example. I had this working fine until I introduced a lookup. I am attempting to show results grouped by owner (stats), but only those events that are more than 30 days from first discovered until now(). If I add | where Days > 30, results show every event from the file. But I know they are there... anonymized query below. What am I doing wrong?

Sample fields being eval'ed:

First Discovered: Jul 26, 2023 16:50:26 UTC
Last Observed: Jul 19, 2024 09:06:32 UTC

index=stuff source=file Severity="Critical"
| lookup detail.csv "IP Address" OUTPUTNEW Manager
| eval First_DiscoveredTS = strptime("First Discovered", "%b %d, %Y %H:%M:%S %Z"), Last_ObservedTS = strptime("Last Observed", "%b %d, %Y %H:%M:%S %Z"), firstNowDiff = (now() - First_DiscoveredTS)/86400, Days = floor(firstNowDiff)
| stats by Manager
| where Days > 30
I have installed Splunk Enterprise on an RHEL9 VM in AWS. I have tried installing via TAR and via RPM. I also tried starting it as the "root" and "splunk" users, but it just won't start. It always hangs at the same point, and when that happens I can't even SSH to my VM. I have to reboot the VM to regain access. It stays there for about 30 minutes (maybe longer). Then I see the following. Any idea what might be going on?