All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I am not sure whether this requires a restart or not; just give it some more time. Also, these internal errors are not a big issue and can wait. They won't impact Splunk indexing, searching, or performance, so there is no need to worry much. If you have a service restart scheduled within the next two or three weeks, or within a month, you can wait until that restart and then verify this one. Hope that helps. Thanks.
Yes, I updated the dashboards this afternoon, but I am still seeing the errors, and they show up at exactly 30-minute intervals. Does this change need a rolling restart on the SHC? And yes, this is a production search head.
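If a rolling restart does turn out to be necessary, it is normally triggered from the SHC captain with the Splunk CLI. A sketch, assuming a default install path (whether this particular change actually needs one is a separate question):

```shell
# Run on the search head cluster captain; restarts members one at a time
/opt/splunk/bin/splunk rolling-restart shcluster-members
```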
OK, thank you. I have a problem with another lookup: | inputlookup test. The lookup table file and the lookup definition are both available, and both have their permissions set to read (everyone) at the app level. But when I try to run inputlookup, I see the error: The lookup table 'test' requires a .csv or KV store lookup definition. The lookup table 'test' is invalid.
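For context, that error usually means the name given to inputlookup does not resolve to a usable lookup definition. A minimal CSV lookup definition in transforms.conf looks like the sketch below (the stanza and file names are assumptions based on your lookup name; check that your definition named test points at the right .csv and is shared at the same app level as the file):

```ini
# transforms.conf -- hypothetical sketch of a CSV lookup definition named "test"
[test]
filename = test.csv
```

With that in place, | inputlookup test resolves through the definition, while | inputlookup test.csv would read the file directly.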
Thank you! Search is returning some results, but hangs indefinitely. 
Hello, I was trying to use the REGEX setting in my props/transforms .conf files to extract fields, but the field extraction is not working. Two sample events and my props.conf/transforms.conf are given below. Any recommendations would be highly appreciated. Thank you so much.

props.conf

[mysourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 24
TRUNCATE = 9999
REPORT-fieldEtraction = fieldEtraction

transforms.conf

[fieldEtraction]
REGEX = \{\"UserID\":\"(?P<UserID>\w+)\","UserType":\"(?P<UserType>\w+)\","System":\"(?P<System>\w+)\","UAT":\"(?P<UAT>.*)\","EventType":\"(?P<EventType>.*)\","EventID":"(?P<EventID>.*)\","Subject":"(?P<Subject>.*)\","EventStatus":"(?P<EventStatus>.*)\","TimeStamp":\"(?P<TimeStamp>.*)\","Device":"(?P<Device>.*)\","MsG":"(?P<Message>.*)\"}

Sample events

2023-10-03T18:56:31.099Z OTESTN097MA4513020 TEST[20248] {"UserID":"8901A","UserType":"EMP","System":"TEST","UAT":"UTA-True","EventType":"TEST","EventID":"Lookup","Subject":"A516617222","EventStatus":"00","TimeStamp":"2023-10-03T18:56:31.099Z","Device":" OTESTN097MA4513020","Msg":"lookup ok"}
2023-10-03T18:56:32.086Z OTESTN097MA4513020 TEST[20248] {"UserID":"8901A","UserType":"EMP","System":"TEST","UAT":"UTA-True","EventType":"TEST","EventID":"Lookup","Subject":"A516617222","EventStatus":"00","TimeStamp":"2023-10-03T18:56:32.086Z","Device":" OTESTN097MA4513020","Msg":"lookup ok"}
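One detail worth double-checking in the pattern above: it ends with the key "MsG", while both sample events contain "Msg". PCRE matching is case-sensitive by default, so that single key would prevent the whole regex from matching. A sketch of the corrected stanza, assuming this mismatch is the culprit (the rest of the REGEX line is unchanged):

```ini
# transforms.conf -- same stanza, last JSON key's case corrected to match the events
[fieldEtraction]
REGEX = \{\"UserID\":\"(?P<UserID>\w+)\","UserType":\"(?P<UserType>\w+)\","System":\"(?P<System>\w+)\","UAT":\"(?P<UAT>.*)\","EventType":\"(?P<EventType>.*)\","EventID":"(?P<EventID>.*)\","Subject":"(?P<Subject>.*)\","EventStatus":"(?P<EventStatus>.*)\","TimeStamp":\"(?P<TimeStamp>.*)\","Device":"(?P<Device>.*)\","Msg":"(?P<Message>.*)\"}
```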
Are you looking for something like this?

index=... Host=HostName "User ID"=* earliest=$time_tok.earliest$ latest=$time_tok.latest$
| stats count by "User ID"
| stats avg(eval(count*86400/($time_tok.latest$ - $time_tok.earliest$)))

This is really just translating what you described into a math formula. If you need to truncate the number of decimals, round is your friend.

Note: "User ID"=* is not needed given the stats ... by "User ID"; any null value will disappear from the group-by. Meanwhile, if you have events that are missing "User ID", it is beneficial to remove them beforehand.

Hope this helps.
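As a sketch of the rounding suggestion (the field name avg_per_day is hypothetical, just for illustration):

```
index=... Host=HostName earliest=$time_tok.earliest$ latest=$time_tok.latest$
| stats count by "User ID"
| stats avg(eval(count*86400/($time_tok.latest$ - $time_tok.earliest$))) as avg_per_day
| eval avg_per_day = round(avg_per_day, 2)
```

Rounding after the avg keeps the per-user rates at full precision and only truncates the final average.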
First, some housekeeping: you posted two nearly identical topics. This one appears to be more specific in subject. Could you delete https://community.splunk.com/t5/Splunk-Search/searching-for-specific-result/m-p/659465#M227694, then?

Second, you need to give enough context for a person with no knowledge of your environment, dataset, etc., to understand what difficulty you face and what attempts you have made, with what results. Do not assume that volunteers are mind-readers. For example, you said:

"Notice all the system_id starts with common 'AA-1' and * afterward. However, when use it as a token, as you've already feel the problem, AA-10* would return ALL the following id's start"

Never mind the problem itself; I fail to see any problem with putting system_id in a token as discrete values. For one, system_id starts with AA-1, but there is no asterisk ('*') in any of the examples. If I use <your initial search> | stats count by system_id to populate $mytoken$, none of the values will have a wildcard. Your problem statement implies that you populate $mytoken$ either with fixed strings including AA-1*, AA-10*, etc., or with a search like my example while manipulating the results in a way that adds wildcards at certain positions. Another person would have no way of knowing why you populate $mytoken$ with AA-1* instead of AA-1-*, for example.

Then there is the question of how said token is used. Do you use it in a search command? A where command? A match function? A different part of an eval expression? Each of these can treat a string differently.

Can you explain how that wildcard character gets into your token values and how your token is used?
Hello @Rob2520, if you go to the Content -> Manage Bookmarks section, there is a button for "Backup and Restore". Have you tried that? See the attached screenshot for reference. Please accept the solution and hit Karma if this helps!
Wow! It works like a charm! Thank you so much for the help! Best,
Hello, just checking in: was the issue resolved, or do you have any further questions?
I'm new here and still learning how to make the change. I'm currently on Splunk Cloud, and this Field transformations page is where I can add the transform, but I'm not sure how to specify the log field and the Format option there. Should I update the Source Key there? Thanks for the help!
Source Key is the field name; change _raw to log. You don't need the Format option, as you have already specified the field names in the extraction string. Note that the existing JSON needs to be auto-extracted, which means it has to have been set up to do so. It's easy to check: just run index=x and look down the left-hand side of the display in verbose mode to see whether the 'log' field is shown as a field.
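As a minimal sketch of what that looks like in the .conf files behind the UI (the stanza name extract_from_log and the placeholder extraction string are hypothetical; the SOURCE_KEY line is the point):

```ini
# transforms.conf -- run the extraction against the "log" field instead of the default _raw
[extract_from_log]
SOURCE_KEY = log
REGEX = <your extraction string with named capture groups>

# props.conf -- attach the transform to the relevant sourcetype
[your_sourcetype]
REPORT-extract_from_log = extract_from_log
```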
Why do you need them in one token? You will not be able to search for AA-1* without also picking up AA-10. So have one token, base_token, which contains AA-1 and which you search with, i.e. system_id=$base_token$*, and then a second token containing AA-1($|-) and apply a regex, e.g. | regex system_id="$regex_token$"
Thank you so much for the quick response! I found this Field transformations page to add it in our Splunk Cloud. Where can I specify the source field log, and what should be configured in the Format option there? Best,
Are you not receiving SNOW incident tickets only for a particular set of alerts, or are you not receiving SNOW incident tickets for Splunk altogether?