All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi. I have a single field for the date and time of an event - 2024-02-19T11:16:58.930104Z. I would like to have two fields, Date and Time, as well as one more calculated field I can use to find records not changed in the last 2 days or 48 hours, whichever is better for the search. I tried

| eval Date = strftime(policy_refresh_at, "%b-%d-%Y") | eval Time = strftime(policy_refresh_at, "%H:%M")

or

| eval Date=substr(policy_refresh,10,1)

The results come back empty in both cases, so there is nothing to calculate on. Please advise. Thank you.
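A minimal sketch of one way to approach this (field names are taken from the post; the exact strptime format string is an assumption and may need adjusting to the data): strftime expects an epoch timestamp, so if policy_refresh_at holds the raw string 2024-02-19T11:16:58.930104Z, it likely needs to be parsed with strptime first.

| eval refresh_epoch = strptime(policy_refresh_at, "%Y-%m-%dT%H:%M:%S.%6N%Z")
| eval Date = strftime(refresh_epoch, "%b-%d-%Y")
| eval Time = strftime(refresh_epoch, "%H:%M")
| eval stale = if(refresh_epoch < relative_time(now(), "-48h"), 1, 0)

The stale flag (a name made up here for illustration) could then drive a filter such as | where stale=1 to find records not refreshed in the last 48 hours.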
I am using | fields _raw to show the entire content of the source file as a single event. It works for most of my log files under 100K. For occasionally larger files, the search breaks the results into multiple events and misses details. How can I fix this? Or is there another way to return the file contents? I know users can click Show Source in the event actions, but my search queries are part of a dashboard drilldown on file names.
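One hedged possibility, assuming the splitting happens at index time rather than at search time: event size is capped by per-sourcetype limits in props.conf, so raising them for this sourcetype might keep large files in one event. A sketch (the stanza name is a placeholder; the values are illustrative and should be sized to the data):

[my_whole_file_sourcetype]
TRUNCATE = 0
MAX_EVENTS = 100000

TRUNCATE and MAX_EVENTS are standard props.conf settings: TRUNCATE = 0 removes the per-line byte cap, and MAX_EVENTS caps how many lines can be merged into a single event when line merging is on.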
@bowesmana Thanks. I tried what you mentioned but it didn't work as I expected, so I've changed my mind. Is it possible to create a table like this?

PF        Host1    Host2    Host3
red       50       20       89
purple    30       80       1
green     80       12       -
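One way to sketch that layout, assuming the data has a PF field, a host field, and a numeric value field (the field names here are assumptions): chart pivots the by-clause values into columns.

| chart sum(value) over PF by host

xyseries achieves the same pivot when the aggregation is already done, e.g. | stats sum(value) as value by PF host | xyseries PF host value.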
I am not sure why different eventtypes can't be combined into the same search - assuming they can, try something like this

index="indexName" eventType="group.user_membership.add" OR eventType="user.authentication.sso"
| spath "target{}.displayName"
| rename target{}.displayName as grpID
| eval groupName=mvindex(grpID, 1)
| rename "target{}.alternateId" AS "targetId"
| rename "target{}.type" AS "targetType"
``` Assuming target_user is already extracted for sso events (otherwise extract it here) ```
| eval target_user=if(eventType=="user.authentication.sso",target_user,mvindex(targetId, mvfind(targetType, "User")))
| table target_user groupName date
| eventstats values(groupName) as groupName by target_user
| where eventType="user.authentication.sso"
| stats count by date
I need help writing a search query where the result from one query is passed into a second query.

1. We import users from the Active Directory group into the Okta group, and the event eventType="group.user_membership.add" captures this JSON event. The following query gets me the name of the group and the user name.

index="indexName" eventType="group.user_membership.add"
| spath "target{}.displayName"
| rename target{}.displayName as grpID
| eval groupName=mvindex(grpID, 1)
| rename "target{}.alternateId" AS "targetId"
| rename "target{}.type" AS "targetType"
| eval target_user=mvindex(targetId, mvfind(targetType, "User"))
| table target_user groupName

2. After the user is added to the Okta group, I want to find the occurrences of that user's authentications during a time range. I can separately find user authentications using eventType="user.authentication.sso", but this event doesn't have a group name.

index="indexName" eventType="user.authentication.sso" target_user
| stats count by date

How do I pass the user from the first query to the second query? I cannot use a subsearch since the main search eventtype is not the same as the second subsearch. Basically, I want to create a report of authentications by groupname/username for the selected time range. Any help is appreciated.
Thanks @ITWhisperer  This worked.
The first command of the map search needs to be a generating command, such as rest. Try adding the eval afterwards. <Base Search> | stats count min(_time) as firstTime max(_time) as lastTime values(user) as user by user, src_ip, activity, riskLevel |map maxsearches=100 search="| rest splunk_server=local /services/App/.../ ioc=\"$src_ip$\" | eval activity=\"$activity$\""
I will try both approaches today and see what happens.  Thanks for the suggestions!
It works Many thanks for your help !
We have a search where one of the fields from the base search is passed to a REST API using the map command.

<Base Search>
| stats count min(_time) as firstTime max(_time) as lastTime values(user) as user by user, src_ip, activity, riskLevel
| map maxsearches=100 search="| rest splunk_server=local /services/App/.../ ioc=\"$src_ip$\""

But after this search, only the results returned by the REST API are shown. How can I include some of the fields from the original search, e.g. user and activity, so that they can later be used in a table? I tried adding the field using eval right before the REST call, but that doesn't seem to be working.

eval activity=\"$activity$\" | rest

I also tried using multireport, but only the first search is considered.

| multireport
[ table user, src_ip, activity, riskLevel]
[| map maxsearches=100 search="| rest splunk_server=local /services/App/.../ ioc=\"$src_ip$\""]

Is there a way to achieve this? The API call itself returns a set of fields which I am extracting using spath, but I also want to keep some of the original ones for added context. Thanks, ~Abhi
Thank you this should work quite well for my needs.  
Hello everyone! I have a Windows server with a Splunk UF installed that consumes MS Exchange logs. These logs are stored in CSV format. The Splunk UF settings look like this:

props.conf

[exch_file_httpproxy-mapi]
ANNOTATE_PUNCT = false
BREAK_ONLY_BEFORE_DATE = true
INDEXED_EXTRACTIONS = csv
initCrcLength = 2735
HEADER_FIELD_LINE_NUMBER = 1
MAX_TIMESTAMP_LOOKAHEAD = 24
SHOULD_LINEMERGE = false
TIMESTAMP_FIELDS = DateTime
TRANSFORMS-no_column_headers = no_column_headers

transforms.conf

[no_column_headers]
REGEX = ^#.*
DEST_KEY = queue
FORMAT = nullQueue

Thanks to the data quality report at the indexer layer, I found out that this sourcetype has some timestamp issues. I investigated the problem by running a search at the search layer and found surprising event breaking. You can see an example in the attachment. The _raw data is OK and does not contain unexpected newline characters. What is wrong with my settings?
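One hedged observation, based on how structured data parsing generally works: with INDEXED_EXTRACTIONS = csv the forwarder parses the file itself, so indexer-side TRANSFORMS routing to nullQueue may never see these events, and the header/comment lines could be what is breaking timestamps. It may also help to pin the timestamp format explicitly; a sketch (the TIME_FORMAT string is an assumption - check it against the actual DateTime values):

[exch_file_httpproxy-mapi]
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = DateTime
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N
SHOULD_LINEMERGE = false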
| rex "\w+\.(?<domaine_test>[\.\w-]+)"

If the - is at the end of the character class [], it doesn't need to be escaped.
Create a second dropdown with the dynamic search
map can be slow and limited - try something like this [| inputlookup testlookup | table index sourcetype] earliest=-2d@d latest=@d | eval day=if(_time < relative_time(now(), "-1d@d"), "Yesterday", "Today") | stats count by day index sourcetype | eval {day}=count | stats values(Today) as Today values(Yesterday) as Yesterday by index sourcetype | fillnull value=0 Yesterday Today | eval difference=abs(Yesterday - Today)
Hello everyone, Unfortunately, I cannot see anything in the dashboards on the license usage page of the license master server. I have also tried the query below:

index=_internal sourcetype=splunkd source=*license_usage.log type=Usage idx=*

But nothing - no results found. Could you please help me? Thanks in advance.
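A hedged first step to narrow this down (run on the license master itself): drop the filters and check whether any license_usage.log events are arriving at all.

index=_internal source=*license_usage.log*
| stats count by host, sourcetype, type

If this also returns nothing, the problem is likely that _internal data is not being indexed or searched on this instance, rather than anything specific to the dashboard query.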
Hi, I am trying to deploy a new index to my indexer cluster via the Cluster Master and have followed the usual documentation on how to deploy via the master-apps folder. I have done this before and it worked with no problem, but this time I have no idea why it is not working. When I make the change to indexes.conf and run the command "splunk validate cluster-bundle", it gives me no errors and brings me back to my CLI, so I presume it has validated. Then I run the command "splunk show cluster-bundle-status" to check the bundle IDs - the active bundle and the latest bundle still have the same IDs. It's as if Splunk is not recognising that a change has been made to the bundle and therefore cannot deploy it down to the indexers. I ran the command "splunk apply cluster-bundle" and it gave me the below error. However, when I checked splunkd.log on the CM and the indexers, there was no indication of a validation error, or any error for that matter. Is there anything that I am missing here? I just can't work out why it is not recognising that a change has been made and updating the bundle IDs to be pushed down. Thanks
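For what it's worth, a sketch of a CLI sequence sometimes used to work through this on the Cluster Master (adjust authentication to your environment):

splunk validate cluster-bundle --check-restart
splunk show cluster-bundle-status
splunk apply cluster-bundle --answer-yes

If the bundle checksum still does not change, it can be worth confirming that the edited indexes.conf actually lives under $SPLUNK_HOME/etc/master-apps/<app>/local/ on the Cluster Master, since changes made anywhere else are not included in the bundle.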
Hi, Thank you for your response. I have some domains with the "-" character, for example black-ice.com. The result is "black". Is it possible to get the whole domain?
But Dropdown 1 has static values (host names) added, so if I add dynamic search results to the same dropdown, the values are duplicated.
Does your lookup identify which products are associated with each host? If so, you can dynamically populate the dropdown based on the results of a search which filters the products based on the hostname chosen.
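A sketch of what that could look like in Simple XML (the lookup name, field names, and the $host$ token are assumptions for illustration):

<input type="dropdown" token="product">
  <label>Product</label>
  <fieldForLabel>product</fieldForLabel>
  <fieldForValue>product</fieldForValue>
  <search>
    <query>| inputlookup host_products.csv | search host=$host|s$ | fields product</query>
  </search>
</input>

The populating search re-runs whenever $host$ changes, so the product dropdown only ever shows products valid for the selected host.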