All Posts


Hi. Can you tell me the SPL to fetch the password expiry date and username from search results? 
Thanks for the answer, but can I do this without appendpipe?
Hello comrades! I'm just wondering, does Splunk detect log similarity by pattern? Many thanks.
I've tried to enable boot-start on *nix and Windows, but after the machine reboots, the Splunk Forwarder still doesn't start automatically. Does anyone have a solution for this?
Hello, how do I add a dropdown or a text input at an arbitrary location in Dashboard Studio? I tried to put one inside the rectangle in the middle of my dashboard, but it stayed at the top of the dashboard below the title. I tried moving the "inputs" section in the JSON source code, but that didn't seem to work. Also, whenever I made changes in the source code, I wasn't able to revert them easily the way I could in a classic dashboard. Please advise. Thank you.
Hi @ITWhisperer, is there any way to match the IP without expanding it?
Hi @azulueta, try the following query:

index=tenable sourcetype=tenable:io:assets
| stats count values(hostnames) BY agent_uuid
| where count > 1

Hope that helps.
Hi @harishsplunk7, I’m a Community Moderator in the Splunk Community. This question was posted 3 years ago, so it might not get the attention you need for your question to be answered. We recommend that you post a new question so that your issue can get the visibility it deserves. To increase your chances of getting help from the community, follow these guidelines in the Splunk Answers User Manual when creating your post. Thank you!
Hi, I am new to Splunk and am looking for a search that can identify duplicate field values. We have an issue in Tenable where assets have duplicate asset IDs. My initial search is:

index=tenable sourcetype=tenable:io:assets
| stats count by hostnames, agent_uuid

This lists hostnames with their unique ID in a table. I need to show only hostnames with the same agent_uuid. I don't know if I need to export this to a lookup table and then compare the agent_uuid values from there to show just the duplicates, but I was hoping for a more straightforward search. Thank you.
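One possible sketch of such a search, without a lookup, is to count distinct hostnames per agent_uuid with eventstats and keep only the IDs shared by more than one host (field names are taken from the question above; adjust to your data):

index=tenable sourcetype=tenable:io:assets
| eventstats dc(hostnames) AS host_count BY agent_uuid
| where host_count > 1
| stats values(hostnames) AS hostnames BY agent_uuid

Using dc() rather than a plain event count avoids flagging an agent_uuid that simply logged several events for the same host.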
Hi @richgalloway @gcusello, how can we use btool on the host to troubleshoot whether the Universal Forwarder (UF) is using an inputs.conf file other than the one intended for the Windows_TA? Despite applying the correct regex filters, we are still encountering issues with events not being properly blacklisted. Thank you.
The first set of props will not ingest a CSV properly. The second should work much better. In which Splunk instance did you make the change? It should be done on the indexers and heavy forwarders (if you have them). Use btool on an indexer to make sure the settings are as expected:

splunk btool --debug props list aws:s3:csv

The change will apply to new data only.
The EventCode key and $XmlRegex key use two different regular expressions.  The former is simple and certain to work correctly, whereas the latter is not.  That is why I showed a corrected $XmlRegex expression. The regex101.com expression is working fine.  Include sample data that matches the expression and you'll see. https://regex101.com/r/ZTE3Z4/1
Hi all, I am trying to get Azure AD B2C to work as a SAML provider for Splunk. Has anyone managed to get this to work? Please advise; I followed all the available online resources but nothing is working.
The Splunk DLTK 5.1.0 documentation says the following:

"No indexer distribution: Data is processed on the search head and sent to the container environment. Data cannot be processed in a distributed manner, such as streaming data in parallel from indexers to one or many containers. However, all advantages of search in a distributed Splunk platform deployment still exist."

Does the above imply that data from Splunk is not distributed (such as via data parallelism) among multiple containers in the Kubernetes execution environment during the training or inference phase?

Further, is the scaling only vertical in nature (multi-CPU or multi-GPU in a single container), or can the jobs also scale horizontally (multiple containers), with each container working on a partition of the data?

Further, to execute TensorFlow, PyTorch, Spark, or Dask jobs, do we need to have the required operators/services pre-installed (the Spark K8s operator, for example) before submitting the jobs from the Splunk Jupyter notebook? Or are these services set up during DLTK app installation and configuration in Splunk?

Appreciate any input on the above. Thanks in advance!
Hi @richgalloway, why is there no EventCode 4688 in the regex? This is not working: https://regex101.com/r/45I3Kt/1. Please check it once. Thanks
@cbiraris there are a number of ways of doing this, but it depends on what you want to end up with. I am assuming that the event _time field denotes your time; if not, then you first need to parse your time field using strptime(). A couple of examples below show stats and streamstats usage.

Using stats, you can collect your events together like this, assuming you have some kind of correlation ID that can group the events:

| makeresults count=4
| streamstats c
| eval _time=now() - (c * 60) - (random() % 30)
| eval EventID="ID:".round(c / 2)
| fields - c
``` Calculate the gap ```
| stats range(_time) as r by EventID

If you have a number of events, a simple example of streamstats will just calculate the difference between two events like this, which generates 4 randomly timed events and calculates the difference between each pair:

| makeresults count=4
| streamstats c
| eval _time=now() - (c * 60) - (random() % 30)
| eval Event=mvindex(split("Start,End",","),(c - 1) % 2)
| fields - c
``` Calculate the gap ```
| streamstats reset_after="Event=\"End\"" range(_time) as gap
You can't do it via the legend, as that does not support drilldown; you'd probably have to play around with JS to get that to work. I assume you want to be able to unhide it again, so you can't do it directly on that chart, but you could do it by having another set of buttons in another panel that provides a filter to show/hide those series. I've often done this either through a multiselect input above the chart or a link input where the inputs are tabbed horizontally. You can see how that is done in the Itsy Bitsy app for Splunk: https://splunkbase.splunk.com/app/5256
If that is your _raw event, just do | spath correlation_id and it will give you the correlation_id field.
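For anyone who wants to try this without real data, here is a self-contained sketch; the sample JSON event is made up for illustration:

| makeresults
| eval _raw="{\"correlation_id\": \"abc-123\", \"status\": \"ok\"}"
| spath correlation_id
| table correlation_id

spath with a bare path argument extracts that key from the JSON in _raw into a field of the same name.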
I should also mention that changing the sourcetype to anything other than aws:s3 or aws:s3:csv results in no data being indexed at all.
Here is the props.conf stanza from the TA's default directory that applies to the source type specified in the documentation:

###########################
### CSV ###
###########################
[aws:s3:csv]
DATETIME_CONFIG = CURRENT
TIME_FORMAT = %Y-%m-%dT%H:%M:%S%Z
SHOULD_LINEMERGE = false
LINE_BREAKER = [\r\n]+
TRUNCATE = 8388608
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = [\r\n]+
KV_MODE = json

I tried adding a props.conf to the TA's local directory, but it seems to be ignored, because the data ends up indexed exactly the same after adding the new file and then restarting Splunk. This is the contents of the local props.conf that I tried:

[aws:s3:csv]
TIME_FORMAT = %s
HEADER_FIELD_LINE_NUMBER = 1
INDEXED_EXTRACTIONS = TSV
TIMESTAMP_FIELDS = timestamp