All Posts


This part suggests that the dropdown selections are values in the COMPONENT ID field: "COMPONENT ID"="$DropdownValue$". If that's the case, you could filter based on "COMPONENT ID" IN ($DropdownValue$) and join the subsearches on COMPONENT ID rather than appending columns. Joins are not particularly efficient, though, so instead of that I would look at pulling all the data back from that index in a single search and conditionally evaluating the stats. Paul
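For illustration, here is a minimal sketch of that single-search approach. It assumes COMPONENT ID is an extracted field, that the dropdown token expands to a comma-separated list of quoted values (for example via a multiselect with prefix/suffix settings), and that the Input/Output/Error keywords appear in the raw events; the index, source, and sourcetype come from the question, everything else is an assumption:

index="xxx" source="yyyyzzz" sourcetype="mule:rtf:per:logs" "COMPONENT ID" IN ($DropdownValue$)
| eval status=case(searchmatch("Input"), "Last Triggered Time", searchmatch("Output"), "Last Processed Time", searchmatch("Error"), "Last Errored Time")
| where isnotnull(status)
| stats max(_time) as lastTime by "COMPONENT ID", status
| eval lastTime=strftime(lastTime, "%d/%m/%Y %H:%M:%S %Z")
| xyseries "COMPONENT ID" status lastTime
| fillnull value="NOT IN LAST 12 HOURS"

This reads the index once, buckets each event as triggered/processed/errored, takes the latest timestamp per component and bucket, then pivots so each dropdown value becomes one row.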
The .* does not match newlines etc., so here is a trick I found: change .* to [\s\S]*. Example: \<ReportItem\s(?<pluginout>[\s\S]*?)\<\/ReportItem\>
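For instance, a quick sketch of that pattern in a rex command; index=YOUR_INDEX is a placeholder and pluginout is the capture group from the example above. rex runs against _raw by default, and max_match=0 makes it capture every ReportItem block in an event rather than only the first:

index=YOUR_INDEX
| rex max_match=0 "\<ReportItem\s(?<pluginout>[\s\S]*?)\<\/ReportItem\>"
| table pluginout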
Make sure all your paths are correct. Has this worked before? You should also cap the hot volume so it cannot take up all the disk space. For example, if you have 2TB of hot/warm disk, use something like this:
[volume:hot]
path = /data/splunk/index/hot/vol
maxVolumeDataSizeMB = 1900000
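To actually constrain an index to that volume, a minimal indexes.conf sketch alongside the [volume:hot] stanza above might look like this (the index name my_index and the cold/thawed paths are just examples):

[my_index]
homePath = volume:hot/my_index/db
coldPath = /data/splunk/index/cold/my_index/colddb
thawedPath = /data/splunk/index/thawed/my_index/thaweddb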
This one has always helped with Azure onboarding questions; want to give it a try? - https://jasonconger.com/splunk-azure-gdi/ Reference - https://www.splunk.com/en_us/blog/tips-and-tricks/getting-microsoft-azure-data-into-splunk.html?locale=en_us
Not sure if https://splunkbase.splunk.com/app/1761 helps?
Hello @yuvaraj_m91 Can you please elaborate on the question a bit?
Hello @heskez Yes, it's possible. You can create a custom lookup and integrate it as local threat intel - https://docs.splunk.com/Documentation/ES/7.3.2/Admin/Addlocalthreatintel Please hit Karma if this helps!
The first error could appear if you're trying to install a RHEL 8 version of SOAR on CentOS 7. Regarding CentOS 8, it's not officially supported, but you can download a CentOS/RHEL 8 version... https://docs.splunk.com/Documentation/SOARonprem/6.2.2/Install/Requirements
I want to get the below search executed and display the results in a table for all comma-separated values that get passed from the dropdown.

index="xxx" source = "yyyyzzz" AND $DropdownValue$ AND Input
| eventstats max(_time) as maxTimestamp by desc
| head 1
| dedup _time
| eval lastTriggered = strftime(_time, "%d/%m/%Y %H:%M:%S %Z")
| stats values(lastTriggered) as lastTriggeredTime
| appendcols [search index="xxx" source = "yyyyzzz" sourcetype = "mule:rtf:per:logs" AND $DropdownValue$ AND Output
  | eventstats max(_time) as maxTimestamp by desc
  | head 1
  | dedup _time
  | eval lastProcessed = strftime(_time, "%d/%m/%Y %H:%M:%S %Z")
  | stats values(lastProcessed) as lastProcessedTime]
| appendcols [search index="xxx" source = "yyyyzzz" sourcetype = "mule:rtf:per:logs" AND $DropdownValue$ AND Error
  | eventstats max(_time) as maxTimestamp by desc
  | head 1
  | dedup _time
  | eval lastErrored = strftime(_time, "%d/%m/%Y %H:%M:%S %Z")]
| eval "COMPONENT ID"="$DropdownValue$"
| eval "Last Triggered Time"=lastTriggeredTime
| eval "Last Processed Time"=lastProcessedTime
| eval "Last Errored Time"=lastErrored
| table "COMPONENT ID", "Last Triggered Time", "Last Processed Time", "Last Errored Time"
| fillnull value="NOT IN LAST 12 HOURS" "COMPONENT ID", "Last Triggered Time", "Last Processed Time", "Last Errored Time"

For example, if $DropdownValue$ contains ABC,DEV, then the entire search above should be executed twice and 2 rows of data should be displayed in the table. Can someone guide me on how this can be achieved?
I am getting the error "Login failed due to client tls version being less than minimal tls version allowed by the server" when editing the connection. From the Splunk community, I have already applied some suggested solutions to my configuration: using the DB Connect setup page to set the TLS version with the parameter -Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2, and also adding sslVersions = tls1.2 under [sslConfig]. None of the above worked! Kindly suggest anything else I need to check from my end, or any solution for this error. Splunk DB Connect #tlsversion
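For reference, a rough sketch of where those two changes typically end up, assuming the JVM option is applied through the DB Connect settings page and the sslVersions line goes into a local server.conf (the values below simply echo what was already tried; exact paths may differ on your deployment):

JVM option entered on the DB Connect settings page:
-Dhttps.protocols=TLSv1,TLSv1.1,TLSv1.2

$SPLUNK_HOME/etc/system/local/server.conf:
[sslConfig]
sslVersions = tls1.2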
Try looking in inputs.conf
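Since the sourcetype question below comes down to which input created each event, a minimal sketch of the kind of inputs.conf stanzas to look for on the UF may help (stanza names follow the Splunk Add-on for Windows convention; the exact app, stanzas, and sourcetype lines on your forwarders may differ):

[WinEventLog://Security]
disabled = 0
renderXml = true
sourcetype = XmlWinEventLog

[WinEventLog://Microsoft-Windows-Sysmon/Operational]
disabled = 0
renderXml = true

If one stanza sets the sourcetype explicitly and another does not, the two inputs can easily end up with differently cased sourcetypes.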
Hi, I wanted to know the exact path where I can see the mentioned sourcetypes, so that I can check where the 2 different sourcetypes are defined and, if possible, merge them into a single sourcetype since both have the same name.
Perhaps your example is not large enough, but going from the subject, you could try this:
| stats count as count_B by "Field A" "Field B"
| eventstats count as count_A by "Field A"
| where count_A = 1
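As a small worked illustration (assuming a Splunk version recent enough to support makeresults format=csv; the host/foo values are made up), only host2 survives because it has exactly one distinct Field B value:

| makeresults format=csv data="Field A,Field B
host1,foo
host1,bar
host2,baz"
| stats count as count_B by "Field A" "Field B"
| eventstats count as count_A by "Field A"
| where count_A = 1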
It is not clear to me what exactly you want to check. Please can you clarify?
Hi, here are the props.conf settings for the CSV file:
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = csv
KV_MODE = none
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = false
category = Structured
description = This sourcetype stores all the DB connect information
disabled = false
pulldown_type = true
I have nothing displayed under the label... can anybody help? Here is what I see:
I have inserted the raw log in the XML code editor. The ones without new lines in them are extracting fine, but the ones with new lines or tabs are not, even though I am using (?s).
   
Hi @Mario.Morelli, Are you able to jump back in here and help out @Maximiliano.Salibe?
Hi All, it would be a great help if anyone could help me figure this out. An app is deployed on the UFs to receive these logs in Splunk under the index wineventlog. I can see 2 different sourcetypes (xmlwineventlog, XmlWinEventLog) under the wineventlog index:
sourcetype: XmlWinEventLog (source: "XmlWinEventLog:Application", "XmlWinEventLog:Security", "XmlWinEventLog:System")
sourcetype: xmlwineventlog (source: "WinEventLog:Microsoft-Windows-Sysmon/Operational", "WinEventLog:Microsoft-Windows-Windows Defender/Operational")
Please help me with where I should check to find the exact difference between these two case-sensitive sourcetypes. Thanks
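One way to narrow this down, assuming you have search access to the index and CLI access to a forwarder: first see which hosts and sources produce each sourcetype, then check on one of those forwarders where the input (and any explicit sourcetype setting) is defined.

| tstats count where index=wineventlog by sourcetype, source, host

Then, on one of the reporting forwarders (run from the Splunk bin directory; on a Linux forwarder pipe to grep -i instead of findstr /i):

splunk btool inputs list --debug | findstr /i wineventlog

The --debug flag shows which app and file each stanza comes from, which is usually enough to see why the two sourcetypes are cased differently.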