All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Not yet. I'm still discussing with support whether this is a bug or something else. Currently we're waiting for a (final?) answer from the developers/PM to hear what their plans for it are.
Using the below sample search, I'm trying to get every possible combination of results between two different sets of data, and I'm interested in any good techniques for doing so that are relatively efficient. At least with the production data set I'm working with, it should translate to about 40,000 results. Below is just an example to make the data set easier to understand. Thank you in advance for any assistance.

Sample search:

| makeresults
| eval new_set="A,B,C"
| makemv delim="," new_set
| append [| makeresults | eval baseline="X,Y,Z" ]
| makemv delim="," baseline

Output should be roughly in the format below, and I'm stuck on getting the data manipulated in a way that aligns with it.

new_set - baseline
--
A-X
A-Y
A-Z
B-X
B-Y
B-Z
C-X
C-Y
C-Z
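For reference, one common way to build a full cross-product in SPL is to hold each set as a multivalue field and mvexpand both sides; a minimal sketch against the sample data above (the field and value names follow the example, the combo field name is mine):

```spl
| makeresults
| eval new_set=split("A,B,C", ",")
| mvexpand new_set
| eval baseline=split("X,Y,Z", ",")
| mvexpand baseline
| eval combo=new_set."-".baseline
| table new_set baseline combo
```

With 3 values on each side this yields 9 rows (A-X through C-Z); the same pattern scales to larger sets, though mvexpand output is subject to memory limits in limits.conf.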
Hey. Any updates regarding the bug? I found the same issue using the latest Splunk (9.3.2).
When you run the command "netsh wlan show wlanreport", it generates not only an HTML report but also an XML report. This is good, because the HTML report is intended for human consumption, so Splunk will not be happy with it. You can instead index the XML file, which is at:

C:\ProgramData\Microsoft\Windows\WlanReport\wlan-report-latest.xml

To set up Splunk to generate and index this file once per hour, you need 3 configuration files:

1) A props.conf file on your indexer machine(s)

# Put this in /opt/splunk/etc/apps/<yourappname>/local/props.conf
[WlanReport]
maxDist = 170
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = <\?xml version
TIME_PREFIX = ReportDate>

2) An inputs.conf file on your forwarder machine(s)

# Put this in /opt/splunkforwarder/etc/apps/<yourdeploymentappname>/local/inputs.conf
[monitor://C:\ProgramData\Microsoft\Windows\WlanReport\wlan-report-latest.xml]
index = main
sourcetype = WlanReport
disabled = 0
initCrcLength = 256

# A scripted input runs the command once per X seconds, as specified by the interval
# (I had trouble getting it to work with a relative path to the script)
[script://C:\Program Files\SplunkUniversalForwarder\etc\apps\<yourdeploymentappname>\bin\scripts\wlanreport.bat]
interval = 3600
disabled = 0

3) The script file on your forwarder machine(s):

# Put this in C:\Program Files\SplunkUniversalForwarder\etc\apps\<yourdeploymentappname>\bin\scripts\wlanreport.bat
@echo off
netsh wlan show wlanreport

You will then have events coming in containing the XML file contents, every hour.
Sorry for the delayed response, holidays got in the way. I ran "splunk btool server list sslConfig" and it returned no data. I tried it without sslConfig and searched for that cert name, and found nothing.

When I run

openssl.exe x509 -enddate -noout -text -in "c:\program files\splunk\etc\auth\server_pkcs1.pem"

it shows the issuer as Splunk.
In Dashboard Studio it's $row.<<fieldname>>.value$, for example: $row.host.value$
Hi @Vinodh.Angalaguthi, It's been a few days with no reply from the community. Did you happen to find a solution or more information you can share? If you still need help, you can contact AppDynamics Support: How to contact AppDynamics Support and manage existing cases with Cisco Support Case Manager (SCM) 
Hello Splunk Community,

I was wondering if anyone has been successful in setting up the Microsoft Teams Add-on for Splunk app on their Enterprise/Heavy Forwarder. This application requires configuring a Teams webhook. Reading the documentation, it appears that the app is supposed to create or include the Microsoft Teams-specific webhook. However, when I attempt to search for the webhook in the search app using:

sourcetype="m365:webhook"

I don't get anything back, and I'm not sure what the webhook address is, since the documentation doesn't specify the format or go over the steps to create a webhook address. I followed these steps: https://lantern.splunk.com/Data_Descriptors/Microsoft/Getting_started_with_the_Microsoft_Teams_Add-on_for_Splunk

If anyone has an idea on how to create the webhook, or an idea of what I am doing wrong, I would greatly appreciate it. Thanks!
Remove Blue Dot

In Dashboard Studio, my panels use a parent search which uses a multisearch. Because of this, all of the panels have this annoying informational blue dot that appears until the search completely finishes. How can I get rid of this so it never appears?
Sorry about that, I didn't think it would matter.  Looks like it does.  I've created a Support ticket for this as well.  Hopefully, they'll get back to me.  If they do, I'll let you know the solution with Studio. Thanks again, Tom
@gcusello I'm not entirely sure what you're referring to, to be honest. Our subsearch is well under 50k results, so that shouldn't be the issue. But I appreciate you trying to assist. I'll see if I can puzzle it out.
| eval "Last Logon"=strftime(strptime(LastLogon, "%Y-%m-%dT%H:%M:%S.%QZ"),"%Y%m%d %H:%M:%S")
| eval lastLogon=strptime(LastLogon, "%Y-%m-%dT%H:%M:%S.%QZ")

Sorry about not having a better explanation. "Last Logon" and "lastLogon" are being generated from a field "LastLogon", which I hope or assume is in the original data set.

"Last Logon" is a nested strptime inside a strftime. The strptime takes a human-readable format and converts it to epoch time, while the strftime takes epoch time and converts it back to human-readable. The nested function here essentially converts from one human-readable format to another. There are easier methods, but if it was working, maybe don't change it until your skill level jumps.

"lastLogon" just takes the human-readable format and converts it to epoch (Unix) time, which makes duration calculations much easier.

Check that the "LastLogon" field is still there and that its format still matches the "xxxx-xx-xxTxx:xx:xx.xxxZ" pattern that the strptime function is configured to expect. Also check whether the time shift you are experiencing can be explained by the delta in your local time zone (either your personal setting, or that of the Search Head). It expects the raw data from the field to be in Zulu time.
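To sanity-check both evals in isolation, you can run them against a hard-coded sample value (the timestamp below is made up for illustration):

```spl
| makeresults
| eval LastLogon="2024-03-01T14:25:30.123Z"
| eval "Last Logon"=strftime(strptime(LastLogon, "%Y-%m-%dT%H:%M:%S.%QZ"), "%Y%m%d %H:%M:%S")
| eval lastLogon=strptime(LastLogon, "%Y-%m-%dT%H:%M:%S.%QZ")
| table LastLogon "Last Logon" lastLogon
```

If "Last Logon" comes back empty here, the strptime format string itself is the problem; if it works here but not in your real search, the LastLogon field in your data no longer matches that format.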
That should be doable. Does the other product have documentation describing the format in which it expects to receive the lookup? You should then be able to use SPL to convert the lookup into that format, in one or more fields, and send it using the HTTP POST alert action.
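As a rough sketch of the shape this could take (the lookup file, field name, and JSON payload here are all hypothetical; the other product's documentation dictates the real format):

```spl
| inputlookup my_blocklist.csv
| stats values(ip) as ip
| eval payload="{\"ips\":[\"" . mvjoin(ip, "\",\"") . "\"]}"
| table payload
```

You would then save a search like this as an alert and attach the webhook/POST alert action pointed at the other product's endpoint, with the payload field mapped into the request body.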
I had assumed you were doing Classic XML to start, Dashboard Studio is slightly different I can try testing later.
Can you give more information about this? Is this function part of a library, or app, or base Splunk functionality?
Yes, this is a security recommendation added recently. As the alert suggests, you can add email domains to your allowedDomainList by going to Server Settings > Email Settings > Email Domains. For example, if you want email alerts to only go to your company email addresses, then you can add your company domain there. This will restrict your email alerts so that users cannot accidentally or maliciously send data to unauthorized email domains.
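If you manage settings via configuration files rather than the UI, the same control lives, as far as I know, in alert_actions.conf under the [email] stanza; a minimal example (example.com stands in for your real domain):

```
# $SPLUNK_HOME/etc/system/local/alert_actions.conf
[email]
# Comma-separated list of domains that alert emails may be sent to
allowedDomainList = example.com
```

After editing the file, restart Splunk (or reload the configuration) for the change to take effect.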
I checked the Field Extraction section and did NOT find any reference to "Last Logon". Being new to Splunk, I'm unsure where fields come from and how they work, which is fine for now; it's something for me to research.

I switched the SPL to the following and it still doesn't return the 'lastLogon' attribute from AD. Would this be expected, or should it in fact return the 'lastLogon' attribute?

| ldapsearch domain=mine search="(objectClass=user)" attrs=sAMAccountName,lastLogon
| table sAMAccountName,lastLogon
Thanks, I tried the steps, but the same thing occurred. I then quickly set up a Classic Dashboard instead of a Dashboard Studio one, and it works. Looks like it's either an issue with Studio, or maybe it's just done differently there. Thanks again, Tom
Ok, so we know row and result work in other environments, and something should be there based on what we have seen from your SPL and table results. I would recommend saving the updated drilldown, then logging out of Splunk, closing the browser and clearing cache/cookies, logging back into Splunk, and reloading the dashboards.
Hey guys, Thanks for the quick help, still stuck for some reason. So I've tried $row.host$ and $result.host$, but they both result in just passing $xxx.host$ for some reason.

Here's the config: [screenshot not captured]

Here's the resulting search: [screenshot not captured]

Here's the table query:

index="netscaler" host=*
| rex field="servicegroupname" "\?(?<Name>[^\?]+)"
| rex field="servicegroupname" "(?<ServiceGroup>[^\?]+)"
| rename "state" AS LastStatus
| eval Component = host."|".servicegroupname
| search Name=*
| eval c_time=strftime(Time,"%m/%d/%Y %H:%M:%S")
| streamstats window=1 current=f global=f values(LastStatus) as Status by Component
| where LastStatus!=Status
| rename _time as "Date"
| eval Date=strftime(Date, "%m/%d/%Y %H:%M:%S")
| table Date, host, ServiceGroup, Name, Status, LastStatus

And here's a screenshot of the table, if helpful: [screenshot not captured]

Thanks again for the help on this one, very much appreciated. Tom