All Topics

How do I compare differences in a JSON file? If there is no difference, we are good. In my case I need to compare N_aaa and A_aaa and find the differences between them.

{
  "AAA": {
    "modified_files": [
      "a/D:\\\\splunk\\\\Repos\\\\Wed\\\\N_aaa/aaa/pack-672b2efd6aada12ecfc8d1745f805706f43902f4.idx",
      "a/D:\\\\splunk\\\\Repos\\\\Wed\\\\N_aaa/aaa/pack-672b2efd6aada12ecfc8d1745f805706f43902f4.pack",
      "a/D:\\\\splunk\\\\Repos\\\\Wed\\\\A_aaa/aaa/objects/pack/pack-8a069e643d668a0715f82a237b44f1554535719f.idx",
      "a/D:\\\\splunk\\\\Repos\\\\Wed\\\\A_aaa/aaa/objects/pack/pack-8a069e643d668a0715f82a237b44f1554535719f.pack"
    ]
  }
}
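A minimal SPL sketch of one way to compare the two sides, assuming the JSON is indexed as a single event and that the variant (N_aaa or A_aaa) plus the pack file name can be pulled out of each path with rex; the index, sourcetype, and field names are hypothetical:

index=my_index sourcetype=my_json
| spath path=AAA.modified_files{} output=modified_file
| mvexpand modified_file
| rex field=modified_file "(?<variant>[NA]_aaa).*(?<pack_file>pack-[0-9a-f]+\.(?:idx|pack))$"
| stats values(variant) as variants by pack_file
| where mvcount(variants) < 2

The final where keeps only pack files that exist under one variant but not the other, which is one way of expressing "the difference" between N_aaa and A_aaa.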
I have a table like this: [table screenshot] I would like to propagate the "start" value and the "end" value where "_time>=start AND _time<end". It's like a "transaction" with startswith and endswith, but I need to use "streamstats" because I can't lose the event details. So I would like to obtain: [table screenshot] Thanks
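A minimal streamstats sketch, assuming events are sorted in ascending time order, that start and end are epoch values present only on the boundary events, and that the field names match the question:

| sort 0 _time
| streamstats last(start) as start_filled last(end) as end_filled
| eval start=coalesce(start, start_filled), end=coalesce(end, end_filled)
| where _time>=start AND _time<end
| fields - start_filled end_filled

streamstats ignores null values, so last(start) carries the most recent non-null start forward onto every following event; the filldown command is another way to express the same carry-forward.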
Hi, I have a table like the one below, where each field is a parameter of a search query, and I want to know which of them are used the most. SPL: | table a b c d e f  FYI: some of these fields are empty, and some of them are partially similar to each other. I need to find the most-used pattern in this table. Any idea? Thanks
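A minimal sketch of one way to count how often each combination of parameter values occurs, assuming the field names a through f from the question; the foreach block replaces empty values with a marker so empty and missing fields group together:

| table a b c d e f
| foreach a b c d e f [ eval <<FIELD>>=if(isnull('<<FIELD>>') OR '<<FIELD>>'="", "EMPTY", '<<FIELD>>') ]
| stats count by a b c d e f
| sort - count

The top rows of the output are the most frequently used parameter patterns.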
I am in Splunk Enterprise trying to create a dashboard in the source code. When I input the code below, the UI says "Unable to create search" in regard to the "User: All" section. Is this a user role restriction preventing me from searching all users, or something else? It does not show any errors on the edit source page. Code below:

<form theme="dark">
  <label>Splunk Search Activity</label>
  <fieldset submitButton="true" autoRun="false">
    <input type="time" token="time1">
      <label></label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="radio" token="exclude1" searchWhenChanged="true">
      <label>Splunk System User</label>
      <choice value="user!=splunk-system-user">exclude</choice>
      <choice value="*">include</choice>
      <default>user!=splunk-system-user</default>
      <initialValue>user!=splunk-system-user</initialValue>
    </input>
    <input type="multiselect" token="user1">
      <label>User:</label>
      <fieldForLabel>user1</fieldForLabel>
      <fieldForValue>user</fieldForValue>
      <search>
        <query>index=_audit action=search search!="'typeahead*" $exclude1$ | stats count by user</query>
        <earliest>$time1.earliest$</earliest>
        <latest>$time1.latest$</latest>
      </search>
      <choice value="*">all</choice>
      <default>*</default>
      <initialValue>*</initialValue>
      <delimiter> </delimiter>
    </input>
    <input type="text" token="filter1">
      <label>Search Filter:</label>
      <default>*</default>
      <initialValue>*</initialValue>
      <prefix>"*</prefix>
      <suffix>*"</suffix>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>index=_audit action=search search!="'typeahead*" user="$user1$" search=$filter1$ $exclude1$ | stats count by _time user search total_run_time search_id app event_count | sort -_time</query>
          <earliest>$time1.earliest$</earliest>
          <latest>$time1.latest$</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="count">20</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="refresh.display">progressbar</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
      </table>
    </panel>
  </row>
</form>
Hi All, I have a multi-value field as shown below:

_time                                  field_test
2022-05-13 04:36:00    test_data_1   test_data_2   test_data_3   test_data_4
2022-05-13 03:30:00    test_data_9   test_data_10   test_data_3   test_data_4

For the above two events, I am trying to write a query that gives me the values common to both, so that the result is:
test_data_3
test_data_4

Please help me with how I can accomplish this.
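A minimal sketch of one approach, assuming field_test is a multivalue field, that _time uniquely identifies each event (as in the sample), and that a value counts as common when it appears in every event in the result set:

| eventstats dc(_time) as total_events
| mvexpand field_test
| stats dc(_time) as events_with_value, max(total_events) as total_events by field_test
| where events_with_value = total_events
| fields field_test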
One problem that I have with alerting from Splunk is that when I alert by email, the total width of the table can exceed what the recipient can reasonably look at. I'd like to start transposing my result table to address this. That is, I'd like to go from sending alerted results like this:

time        field1    field2    field3
5/31/2022   value1    value2    really long value 3, so long that it creates a formatting problem. Oh noes! What will I do?

To something more like this:

Time: 5/31/2022
field1: value1
field2: value2
field3: really long value 3, so long that it creates a formatting problem. Oh noes! What will I do?

I know that I could create a field called "alert fields" and manually build it up, but is there a simple way to do this in Splunk?
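A minimal sketch using the transpose command, shown on a makeresults toy row so it is self-contained; the field names are assumptions and the rename is only cosmetic:

| makeresults
| eval field1="value1", field2="value2", field3="a really long value"
| transpose 0 column_name=field
| rename "row 1" as value

For a single-row alert result this produces a narrow two-column field/value table, which renders much better in email than one very wide row.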
Hi, we have a tier related to a Java process. The point is that this tier corresponds to a batch process. However, it is recognized as a normal process and is not unregistered after a while, even though it finished its work long before. So, is there any way to control the time after which a node is unregistered automatically if it doesn't receive any traffic for a while? Thanks, Carlos
Hello, I'm facing a problem with role restrictions in searches. I applied the restriction to the role and everything was working perfectly, even with searches against the data model. However, when I accelerated my data model, the role restriction filters stopped working. I imagine this is due to the tsidx files generated by the acceleration. How can I apply such a restriction even on accelerated data models? Thanks!
Hi, thanks in advance. I am trying to onboard JSON file data to Splunk, but not all of the data from the JSON file is being forwarded.

My JSON file format:

{
  "aaa": {
    "modified_files": [
      "a/D:\\\\splunk\\\\Repos\\\\/.git/HEAD",
      "a/D:\\\\splunk\\\\Repos\\\\/.git/config",
      "a/D:\\\\splunk\\\\Repos\\\\/.git/index",
      "a/D:\\\\splunk\\\\Repos\\\\/.git/logs/HEAD"]
  },
  "bbb": {
    "modified_files": [
      "b/D:\\\\splunk\\\\Repos\\\\/.git/HEAD",
      "b/D:\\\\splunk\\\\Repos\\\\/.git/config",
      "b/D:\\\\splunk\\\\Repos\\\\/.git/index",
      "b/D:\\\\splunk\\\\Repos\\\\/.git/logs/HEAD" ]
  }
}

I am getting a result like this (cut off partway through):

{
  "aaa": {
    "modified_files": [
      "a/D:\\\\splunk\\\\Repos\\\\/.git/HEAD",
      "a/D:\\\\splunk\\\\Repos\\\\/.git/config",
      "a/D:\\\\splunk\\\\Repos\\\\/.git/index",
      "a/D:\\\\splunk\\\\Repos\\\\/.git/logs/HEAD"
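A minimal props.conf sketch of index-time settings whose defaults commonly cause this kind of cut-off (TRUNCATE, and MAX_EVENTS when line-merging); the stanza name and values are assumptions and would need to match the actual sourcetype and file layout:

[my_json_sourcetype]
# keep the whole multi-line JSON object together as one event (assumed layout)
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE = ^\{
MAX_EVENTS = 10000
TRUNCATE = 0
KV_MODE = json

Looking at exactly where the indexed event stops usually shows whether the event is being broken too early or truncated at a byte limit.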
A search job won't finish and is causing resource drain on the shared indexers and ES. I suspect I might not be using tstats efficiently; perhaps the two tstats calls are the culprit. Any pointers on how to use append with the output of two tstats searches?
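A minimal sketch of the prestats/append pattern, which lets a second tstats feed its results into the first one's output without a subsearch-style | append [ ... ]; the data model names and span are placeholders:

| tstats prestats=t count from datamodel=Model_A by _time span=1h
| tstats prestats=t append=t count from datamodel=Model_B by _time span=1h
| timechart span=1h count

Because both tstats calls stream into the same prestats output, this usually scales better than appending a fully materialized subsearch result.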
When Splunk loads a dashboard after dashboard creation is complete, the layout goes out of range or is visually uncomfortable. Is there a way to automatically resize it to fit the screen, like Dashboard Classic does?
I tried to create a dashboard within the search function ("Splunk dashboard that displays user searches"). This is on Splunk Enterprise. Currently I am getting a "Server Error". Below is the entered source:

<form theme="dark">
  <label>Splunk Search Activity</label>
  <fieldset submitButton="true" autoRun="false">
    <input type="time" token="time1">
      <label></label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="radio" token="exclude1" searchWhenChanged="true">
      <label>Splunk System User</label>
      <choice value="user!=splunk-system-user">exclude</choice>
      <choice value="*">include</choice>
      <default>user!=splunk-system-user</default>
      <initialValue>user!=splunk-system-user</initialValue>
    </input>
    <input type="multiselect" token="user1">
      <label>User:</label>
      <fieldForLabel>user1</fieldForLabel>
      <fieldForValue>user</fieldForValue>
      <search>
        <query>index=_audit action=search search!="'typeahead*" $exclude1$ | stats count by user</query>
        <earliest>$time1.earliest$</earliest>
        <latest>$time1.latest$</latest>
      </search>
      <choice value="*">all</choice>
      <default>*</default>
      <initialValue>*</initialValue>
      <delimiter> </delimiter>
    </input>
    <input type="text" token="filter1">
      <label>Search Filter:</label>
      <default>*</default>
      <initialValue>*</initialValue>
      <prefix>"*</prefix>
      <suffix>*"</suffix>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>index=_audit action=search search!="'typeahead*" user="$user1$" search=$filter1$ $exclude1$ | stats count by _time user search total_run_time search_id app event_count | sort -_time</query>
          <earliest>$time1.earliest$</earliest>
          <latest>$time1.latest$</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="count">20</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="refresh.display">progressbar</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
      </table>
    </panel>
  </row>
</form>
Any advice on how to fix this search? I pulled it from GoSplunk ("Show all successful Splunk configurations by user"). This is on Splunk Enterprise. Below is my entered search, and I am getting the error: Comparator '=' has an invalid term on the left hand side: host=object

index=_audit action=edit* info=granted operation!=list host= object=*
| transaction action user operation host maxspan=30s
| stats values(action) as action values(object) as modified_object by _time,operation,user,host
| rename user as modified_by
| table _time action modified_object modified_by
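A sketch of the same search with the dangling host= removed, on the assumption that the parser is reading "host= object" as the single term host=object; if a host filter is actually wanted, it needs a value (for example host=*):

index=_audit action=edit* info=granted operation!=list object=*
| transaction action user operation host maxspan=30s
| stats values(action) as action values(object) as modified_object by _time, operation, user, host
| rename user as modified_by
| table _time action modified_object modified_by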
Hi, I have SPL like the below:

index="myindex" user
| rex field=source "\/data\/(?<product>\w+)\/(?<date>\d+)\/(?<server>\w+)"
| search server=server1

As we know, the first search term ("user") is applied quickly, but the second filter ("server=server1") takes a long time, especially on large data. Is there any way to make "search server=server1" as efficient as "user"? Thanks
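A minimal sketch of one common approach: push the constraint into the initial search so the indexers can filter on the source path before rex ever runs; the wildcarded source pattern is an assumption about how the paths are laid out:

index="myindex" user source="/data/*/*/server1*"
| rex field=source "\/data\/(?<product>\w+)\/(?<date>\d+)\/(?<server>\w+)"
| search server=server1

The trailing | search server=server1 is kept only as a safety net; the heavy filtering moves onto the indexed source field.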
Hi, I'm looking for users that log in to an application and reset the password at the same time. The logs involved look like this:

Login: 1.1.1.1 - - [31/May/2022:11:15:03 +0200] "POST /servlet/Login HTTP/1.1" 200.....

Pwd Change: 1.1.1.1 - - [31/May/2022:11:15:03 +0200] "GET /PasswordChange/ HTTP/1.1" 200 .......

IP: 1.1.1.1
action: /servlet/Login, /PasswordChange

IP and action are already extracted, so I need something like: if IP1=IP2 and time1=time2 and action1=login and action2=pwdchange. Thanks in advance!
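A minimal stats-based correlation sketch, assuming IP and action are the extracted field names and that "the same time" can be approximated by bucketing events to one second; the index name, action values, and span are placeholders taken from the question:

index=web_logs (action="/servlet/Login" OR action="/PasswordChange*")
| bin _time span=1s
| stats values(action) as actions by _time, IP
| where mvcount(actions) = 2

Rows that survive the final where are IP/second pairs that performed both a login and a password change.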
Hi, I have two SPL searches over exactly the same date range, one with the "transaction" command and one without it. As you can see in the picture, without transaction the timechart displays correctly, but with transaction the last part is missing! FYI: 1) I've checked that the log file is correctly indexed and available. 2) The pair of events for each transaction is available in the log for the missing part. What is happening here? Any idea? Thanks
Problem: the timestamp format setting is ignored when sending a request. I have created a sourcetype "test" with the settings Timestamp format: %s,%3N and Timestamp fields: time, created an HTTP Event Collector with Source Type: test, and restarted Splunk. When making a request to http://banana:8088/services/collector/event/1.0 with the body:

{
    "time":"1653643363,529",
    "sourcetype": "test",
    "event":{
        "id":"1",
        "severity":"Information",
        "message":"Test",
    }
}

a response with status 400 is returned:

"text": "Error in handling indexed fields", "code": 15, "invalid-event-number": 0

Why is the timestamp format ignored (it works with "." but not with ",")?
Hi, I have an event display problem when no events matching the conditions are found. I want to filter only those events that have the "DATA_LAVORAZIONE" (STC) field greater than "OGGI", up to 7 days ahead. In the AMPLIAMENTI sourcetype there are some events, for which it returns the sums, while in the DIRETTA sourcetype there are no events, and it does not show me anything. I would like the row with all 0s to be displayed anyway. I tried with fillnull value=0 field, field, field ... but it doesn't work. I also tried filldown, but nothing. Do you have any suggestions? Thank you

CODE:

index=DATI sourcetype=AMPLIAMENTI
| fields - _*
| eval OGGI=strftime(relative_time(now(),"-0d@d"), "%Y-%m-%d")
| eval OGGI_1=strftime(relative_time(now(),"+1d@d"), "%Y-%m-%d")
| eval OGGI_2=strftime(relative_time(now(),"+2d@d"), "%Y-%m-%d")
| eval OGGI_3=strftime(relative_time(now(),"+3d@d"), "%Y-%m-%d")
| eval OGGI_4=strftime(relative_time(now(),"+4d@d"), "%Y-%m-%d")
| eval OGGI_5=strftime(relative_time(now(),"+5d@d"), "%Y-%m-%d")
| eval OGGI_6=strftime(relative_time(now(),"+6d@d"), "%Y-%m-%d")
| eval OGGI_7=strftime(relative_time(now(),"+7d@d"), "%Y-%m-%d")
| eval STC=strftime(strptime(DATA_LAVORAZIONE, "%Y-%m-%d"), "%Y-%m-%d")
| where STC > OGGI
| eval X = if(STC=OGGI,1,0)
| eval X+1 = if(STC=OGGI_1,1,0)
| eval X+2 = if(STC=OGGI_2,1,0)
| eval X+3 = if(STC=OGGI_3,1,0)
| eval X+4 = if(STC=OGGI_4,1,0)
| eval X+5 = if(STC=OGGI_5,1,0)
| eval X+6 = if(STC=OGGI_6,1,0)
| eval X+7 = if(STC=OGGI_7,1,0)
| eval TOTALE=if(STC > OGGI AND STC <= OGGI_7,1,0)
| eval TUTTI=if(STC > OGGI ,1,0)
| sort - DATE_UPD, LINK
| dedup LINK
| where STATO IN("LAVORAZIONE", "CONFERMA DATA")
| stats sum(X) as X, sum(X+1) as X+1, sum(X+2) as X+2, sum(X+3) as X+3, sum(X+4) as X+4, sum(X+5) as X+5, sum(X+6) as X+6, sum(X+7) as X+7, sum(TOTALE) as TOTALE, sum(TUTTI) as OVER
| eval TIPOL="AMPLIAMENTI"
| table TIPOL X X+1 X+2 X+3 X+4 X+5 X+6 X+7 TOTALE OVER
| append [ search index=DATI sourcetype=diretta
    | fields - _*
    | where TIPOLOGIA IN("SUBNET","VOCE")
    | eval OGGI=strftime(relative_time(now(),"-0d@d"), "%Y-%m-%d")
    | eval OGGI_1=strftime(relative_time(now(),"+1d@d"), "%Y-%m-%d")
    | eval OGGI_2=strftime(relative_time(now(),"+2d@d"), "%Y-%m-%d")
    | eval OGGI_3=strftime(relative_time(now(),"+3d@d"), "%Y-%m-%d")
    | eval OGGI_4=strftime(relative_time(now(),"+4d@d"), "%Y-%m-%d")
    | eval OGGI_5=strftime(relative_time(now(),"+5d@d"), "%Y-%m-%d")
    | eval OGGI_6=strftime(relative_time(now(),"+6d@d"), "%Y-%m-%d")
    | eval OGGI_7=strftime(relative_time(now(),"+7d@d"), "%Y-%m-%d")
    | eval STC=strftime(strptime(DATA_LAVORAZIONE, "%Y-%m-%d"), "%Y-%m-%d")
    | where STC > OGGI
    | eval X = if(STC=OGGI,1,0)
    | eval X+1 = if(STC=OGGI_1,1,0)
    | eval X+2 = if(STC=OGGI_2,1,0)
    | eval X+3 = if(STC=OGGI_3,1,0)
    | eval X+4 = if(STC=OGGI_4,1,0)
    | eval X+5 = if(STC=OGGI_5,1,0)
    | eval X+6 = if(STC=OGGI_6,1,0)
    | eval X+7 = if(STC=OGGI_7,1,0)
    | eval TOTALE=if(STC > OGGI AND STC <= OGGI_7,1,0)
    | eval TUTTI=if(STC > OGGI ,1,0)
    | sort - DATE_UPD, LINK
    | dedup LINK
    | where STATO IN("CONFERMA DATA")
    | stats sum(X) as X, sum(X+1) as X+1, sum(X+2) as X+2, sum(X+3) as X+3, sum(X+4) as X+4, sum(X+5) as X+5, sum(X+6) as X+6, sum(X+7) as X+7, sum(TOTALE) as TOTALE, sum(TUTTI) as OVER
    | eval TIPOL="SUBNET - VOCE"
    | fillnull value=0 TIPOL X X+1 X+2 X+3 X+4 X+5 X+6 X+7 TOTALE OVER
    | table TIPOL X X+1 X+2 X+3 X+4 X+5 X+6 X+7 TOTALE OVER ]
... (other APPENDs) ...
| table TIPOL X X+1 X+2 X+3 X+4 X+5 X+6 X+7 TOTALE OVER

RESULT:

TIPOL            X    X+1   X+2   X+3  ........  TOTAL   OVER
AMPLIAMENTI      0    2     1     0    ........  3       3

DESIRED:

TIPOL            X    X+1   X+2   X+3  ........  TOTAL   OVER
AMPLIAMENTI      0    2     1     0    ........  3       3
SUBNET - VOCE    0    0     0     0    ........  0       0

Thanks
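A minimal sketch of one way to force a zero row when the appended search returns no events: an appendpipe placed right after the stats only emits a row when stats produced nothing, and fillnull then zero-fills it. Only a few of the columns are shown to keep the sketch short; the TIPOL value comes from the question and everything else is an assumption:

| stats sum(X) as X, sum(TOTALE) as TOTALE, sum(TUTTI) as OVER
| appendpipe [ stats count as rows | where rows=0 | eval X=0 | fields - rows ]
| eval TIPOL="SUBNET - VOCE"
| fillnull value=0 X TOTALE OVER
| table TIPOL X TOTALE OVER

fillnull alone doesn't help here because when the base search matches nothing, the stats emits no row at all, so there is nothing to fill; the appendpipe manufactures that missing row.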
Hello, good day! I have the following values in the field Data, as shown below:

2022-05-31 10:18:09   emea   2022-05-31
2022-05-31 10:18:14   apac   2022-05-31
2022-05-31 10:18:20   us

I want to show the time zone as well: if emea comes after the time, it should show CST, and so on. The output should be as follows:

2022-05-31 10:18:09 CST   emea   2022-05-31
2022-05-31 10:18:14 HKT   apac   2022-05-31
2022-05-31 10:18:20 EDT   us

Please help me with this. Thank you in advance, Veeru
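A minimal sketch assuming Data is a single string field holding the timestamp followed by the region, and that the region-to-label mapping is exactly the one given in the question; the rex pattern and field names are assumptions:

| rex field=Data "^(?<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})\s+(?<region>\w+)(?<rest>.*)$"
| eval tz_label=case(region="emea","CST", region="apac","HKT", region="us","EDT")
| eval Data=ts." ".tz_label."   ".region.rest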
Hi Splunkers, I have a request from a customer that I have never faced before. For one particular data model, the Email one, it is required that certain fields are always populated, even if the logs have these fields empty and/or not present. So, for example, it is required that the field subject is always filled; of course, if subject is not present in the events, we have to fill it with a token, like the fillnull command does. The particular part is that the customer requires this filling to be performed not at search time, with a fillnull command in the search, but by the data model itself; so, for example, if a log from a mail server arrives and it does not contain the subject field, and/or the field is not populated, the DM must fill it with a token value, so that when a search is executed, subject is already filled with this token. My question is: is this possible?
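A minimal sketch of one way this is sometimes approached: override the field with an eval expression that the data model then consumes, either as a calculated field in props.conf on the search head or as an Eval Expression attribute defined inside the data model itself. The sourcetype name and the token string below are assumptions:

# props.conf on the search head, applied to the sourcetypes feeding the Email data model
[mail_server_sourcetype]
EVAL-subject = if(isnull(subject) OR subject="", "unknown_subject", subject)

If the data model is accelerated, the acceleration is built on top of these search-time evaluations, so the token value should end up in the accelerated summaries as well.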