All Topics



The case function seems to find the first true statement and displays that value. Is there another function, or a different way to use case, to get the results I want below? There are different events with similar features; I want something that works for all the different scenarios of when something is true or false.

Event fields I am filtering on:

vpn=true
proxy=false
tor=true

What I'm using:

| eval anon=case(vpn="true", "vpn", proxy="true", "proxy", tor="true", "tor")

Results I'm getting:

anon=vpn

Results I want:

anon=vpn
     tor

Thanks for any help!
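To illustrate the two behaviors outside of Splunk, here is a minimal Python sketch (field names taken from the example above) contrasting first-match semantics, which is what case() gives, with collecting every flag that is true, which is the multivalue result being asked for:

```python
# First-match semantics, like SPL's case(): stop at the first true condition.
def first_match(event):
    for name in ("vpn", "proxy", "tor"):
        if event.get(name) == "true":
            return name
    return None

# Collect-all semantics: gather every flag that is true (a multivalue result).
def all_matches(event):
    return [name for name in ("vpn", "proxy", "tor") if event.get(name) == "true"]

event = {"vpn": "true", "proxy": "false", "tor": "true"}
print(first_match(event))   # vpn
print(all_matches(event))   # ['vpn', 'tor']
```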
I have a simple CSV file input on a Windows UF, with a header of field names in the top row. The file is overwritten daily with the same name. When I delete the file and restart Splunk, the header row is ignored as expected. But if the UF (v8.0.5) is restarted, the header row will start being indexed. This continues until I delete the file and restart the Splunk process, when it begins ignoring the header row again (until the next restart). My goal is to always ignore the first line of the file, regardless of whether the Splunk process is restarted.

Here is the current iteration of our props.conf. I'm not locked into this config, but I've tried many different combinations and can't seem to find the right one. Any suggestions on what to tweak?

[crowdstrike:metrics:cicoverage]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
CHARSET = UTF-8
INDEXED_EXTRACTIONS = csv
KV_MODE = none
disabled = false
HEADER_FIELD_LINE_NUMBER = 1
TZ = UTC
CHECK_FOR_HEADER = true
I just installed Splunk and imported my license. I have a series of Windows Event Viewer files that have been exported, and I want to import them.

I have tried the following:

1. Settings --> Add Data
2. Upload Files From My Computer
3. Select the file. It reads the file.
4. Next
5. Select Preprocess-winevt
6. Next
7. Review
8. Submit
9. Start Searching

No events are shown. What am I doing wrong?
Hello, I have some difficulties ingesting logs properly from a rotated file, where the rotation is fully handled by an application, without any settings for its log files. In some cases, it rotates the log file in the middle of an event, before the timestamp. Like this:

file.log.1
[category][event][reason][host][timestamp]...
[category][event][reason][host][timestamp]...
[category][event][reason]

Note: there is no new line or carriage return at the end of the file.

file.log
[host][timestamp]...
[category][event][reason][host][timestamp]...
[category][event][reason][host][timestamp]...

Some events have an additional line in plain text (no brackets), so at first I let the line merger do its job; but for the line split between the two log files it does not work, since the first line does not contain the timestamp, and it is appended to the previous event. So, I disabled line merging and started to build a line breaker, resulting in this:

(?:\[[^]]*\][\r\n]*){24}\[[^]]*\](?:[^\[\]]*)([\r\n]+)

This is because each log line is made of 25 blocks between brackets, and optionally a second line. It works very well on a test file when the split line is in the same file, but once the two parts of the line are in different files it does not work anymore. Do you have any advice on how I could handle that?
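For what it's worth, the posted breaker does match complete 25-block lines; the failure is only at the file boundary. Here is a quick Python sketch verifying the pattern against synthetic lines (the block contents are made up, and the pattern is copied verbatim from above):

```python
import re

# The line breaker posted above: 24 bracketed blocks, a 25th block,
# optional trailing plain text, then the newline(s) used as the break point.
breaker = re.compile(r"(?:\[[^]]*\][\r\n]*){24}\[[^]]*\](?:[^\[\]]*)([\r\n]+)")

# Build a synthetic log line of 25 bracketed blocks (contents are invented).
line = "".join(f"[f{i}]" for i in range(25))
data = line + "\n" + line + " extra plain text\n"

# Both complete lines match, with or without the trailing plain-text part.
matches = breaker.findall(data)
print(len(matches))  # 2
```

The sketch confirms the regex itself is sound; the boundary problem exists because the monitor reads each file independently, so no single buffer ever contains both halves of the split event.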
I have a set of data with X categories, and each category is being measured (measurements are positive or negative doubles) at some period between 1s and 5m. I want to look at a time series chart of the MEASUREMENTS by each CATEGORY, and then go one step farther and use control limits (for simplicity's sake, +/- 1 standard deviation of the 1-hour moving average).
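As a minimal sketch of the control-limit idea (outside Splunk, where streamstats would typically play this role), here is the per-category moving average with +/- 1 standard deviation bands; the window size, category names, and values below are all invented:

```python
from statistics import mean, stdev

# For each point, compute the moving average over the trailing window
# plus upper/lower control limits at +/- 1 standard deviation.
def control_limits(values, window=3):
    out = []
    for i in range(len(values)):
        win = values[max(0, i - window + 1): i + 1]
        avg = mean(win)
        sd = stdev(win) if len(win) > 1 else 0.0
        out.append((avg, avg - sd, avg + sd))
    return out

series = {"cat_a": [1.0, 2.0, 3.0, 4.0], "cat_b": [-1.0, -2.0, -1.5, -0.5]}
for cat, vals in series.items():
    print(cat, control_limits(vals))
```

In a chart, each category would get three lines: the moving average and the two limit bands, with the raw measurements overlaid.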
We are in the process of planning the Windows 20H2 update. Is the current version of Splunk UF compatible with this latest version of Windows? 
I have a folder with a file generated once a day. I would like to index all the files even if the files have the same content. For example:

1_x.csv
2_x.csv
.
.
.

I would like to index both files even if they are identical. Below is the input:

[monitor://\\ntnet\filestore1\information_security$\OS_Security_Splunk\*\...\*]
disabled = false
index = os_security
sourcetype = csv_current_time
crcSalt = <SOURCE>
initCrcLength = 1024
recursive = true
whitelist = \.csv$
Hello, I am trying to assign a value from one field to all earlier instances of a field until a non-null value is met. For example, my events may appear like this (for a particular item id):

2021-05-11 09:18:13 ItemId:R231-1993 Moved to Location:M45-1
2021-05-11 09:16:48 ItemId:R231-1993 Retrieved from Location:T97-1
2021-05-11 09:16:17 ItemId:R231-1993 is active
2021-05-11 09:15:02 ItemId:R231-1993 is active
2021-05-11 09:14:13 ItemId:R231-1993 is active
2021-05-11 07:56:12 ItemId:R231-1993 Moved to Location:T97-1
2021-05-11 07:54:23 ItemId:R231-1993 is active
2021-05-11 07:53:41 ItemId:R231-1993 Retrieved from Location:D14-2
2021-05-11 07:52:13 ItemId:R231-1993 is active
2021-05-11 07:51:39 ItemId:R231-1993 is active

Using rex, I am able to extract the Location from the events with "Retrieved" to signify the initial starting point. I want to be able to assign the value of that Location to all other events before it, until another "Retrieved" event is met. (Essentially, what I'm trying to do is figure out where the initial location is for an item when it first becomes 'active'.)

In the case above, I would want all of the events from 07:51:39 to 07:53:41 to have a Location of "D14-2", and all events from 07:54:23 to 09:16:48 to have a Location of "T97-1". Is there a way to do this?
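To pin down the desired logic, here is a small Python sketch of the "fill earlier events backwards" idea, using the timestamps from the example above: walk the events oldest to newest, buffer events with no location, and when a "Retrieved from" location appears, assign it to everything buffered since the previous one. (In SPL this shape is often achieved with a reversed sort plus streamstats or filldown; the function name here is made up.)

```python
# events: list of (timestamp, location_or_None), oldest first.
# None means an "is active" event with no Location of its own.
def fill_back(events):
    filled, pending = [], []
    for ts, loc in events:
        if loc is None:
            pending.append(ts)
        else:
            # A "Retrieved from" event: its location applies to all
            # earlier events buffered since the previous Retrieved.
            filled.extend((p, loc) for p in pending)
            pending = []
            filled.append((ts, loc))
    filled.extend((p, None) for p in pending)  # no later Retrieved event
    return filled

events = [
    ("07:51:39", None), ("07:52:13", None),
    ("07:53:41", "D14-2"),   # Retrieved from D14-2
    ("07:54:23", None), ("09:15:02", None),
    ("09:16:48", "T97-1"),   # Retrieved from T97-1
]
print(fill_back(events))
```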
Hi, we are having a severe issue with deadlocks on our SQL DB. I mean deadlocks in the database itself, not the application. What is the way to investigate this with AppDynamics? Any suggestion or reference to any source of information on this theme would help. Thank you!
Hi,

Basically, I need to find out when some old service accounts were last used, or whether they have ever been used. We have thousands, and I would like a report that covers all accounts found in a particular OU. I have one for searching specific accounts, but copying and pasting all the account names is very tedious.

This is what I have for searching accounts:

index=wineventlog source="WinEventLog:Security" Account_Name=redbox.service host!=DOMAIN, host!=DOMAIN, host!=DOMAIN, host!=DOMAIN*
| stats count by Account_Name, host

I am very new to Splunk, so any suggestions would be much appreciated. If you know of a better way to do this, feel free to let me know!
Hi, I have the field Queue in my dataset, with a pattern as follows:

adcams01
adcams02
adcems05

I would like to create a new column in my table which contains the 3 letters after "adc". For example:

Queue    | Site
adcams01 | ams
adcams02 | ams
adcems05 | ems

Is it possible using regex? I know it should include eval, but it didn't work for me. Thank you
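The extraction being described can be sketched in plain Python regex terms (the function name is invented; in SPL the analogous pattern would live in a rex such as | rex field=Queue "adc(?<Site>\w{3})"):

```python
import re

# Extract the three characters immediately after the "adc" prefix.
def site_of(queue):
    m = re.match(r"adc(\w{3})", queue)
    return m.group(1) if m else None

for q in ("adcams01", "adcams02", "adcems05"):
    print(q, site_of(q))  # ams, ams, ems
```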
Hi, we are using Splunk 7.3.3. I tried to find that information in the monitoring console, without success. Then I tried the following queries, but I don't see big numbers, for some unknown reason:

index=* | eval size=len(_raw) | eval GB=(size/1024/1024/1024) | stats sum(GB) by index

index=* | eval size=len(_raw) | eval GB=(size/1024/1024/1024) | timechart span=1d sum(GB) by index

I set the time frame to the last 3 days in Fast mode. Is there a better way to identify the "guilty" input that is consuming the license? Thanks
Hi Splunkers, I'm here again asking for help with the Alert Manager app. I'm trying the "auto-resolve" feature combined with "append incident with the same title". I would like all incidents with newly appended events to be automatically closed at time "last_event + ttl". What I'm seeing now is an automatic closure at time "open time + ttl", even if there are new events for the same incident. Below is a simple example:

Auto-close = enabled
Append new incidents = enabled
Search = my search
TTL = 11m
Incident creation time = 13:00:00
Appended events time = 13:05:00, 13:10:00
Auto close time = 13:00:00 + 11m = 13:11:00
Desired auto close time = 13:10:00 + 11m = 13:21:00

Thanks in advance for your support.
Hi all, I am a newbie in the Splunk world and am looking for some help structuring my query. I have an index with data like this:

index=sec_sso sourcetype=sso_insa earliest=-1d@d latest=now
| eval Day=if(_time<relative_time(now(), "@d"), "Yesterday", "Today")
| eventstats count by EMINO DPRTM_NAME
| search count=1
| table EMINO DPRTM_NAME Day

If there is only today's value (a new registration), I want to change this query to exclude that value.
I have added 2 JS files in my dashboard XML. I want to override an on-click (button) function defined in JS-1 from JS-2, and I want the dashboard to call the overridden function. The first JS is a common one used by many apps, so I cannot change it. I am basically setting the time picker token in both functions, with different token values. On clicking, it calls both functions, as it prints both "In JS1" and "In JS2", but the time picker token is set to the first JS's value. I have tried the following to override it.

In JS1:

$(".a-btn").click(function () {
    console.log("In def JS 1");
    // Code for setting the time picker token to -30m
});

In JS2:

var tmp_fun = $.fn.a-btn; // Have tried without this as well
$(".a-btn").click(function () {
    console.log("In def JS 2");
    // Code for setting the time picker token to -60m
});

Please suggest.
1. I write Python code in the DLTK container Jupyter notebook.
2. I run the Splunk query | fit MLTKContainer algo= ~~ and then get the error: /fit: ERROR: unable to load algo code from module.
3. I check the ipynb-to-py conversion: it is not working; the .py file is empty.

Why is the ipynb-to-py conversion not working? The notebook itself runs very well, with no errors, and it has the stage, init, fit, apply, save, load, and summary methods.
Hi, I am getting the error below while uploading a CSV as a lookup table file:

Your entry was not saved. The following error was reported: SyntaxError: Unexpected token < in JSON at position 0.
Good morning,

We are trying to use a KV store to save data when performing a query, to later query it in a dashboard. The KV store has the following data:

Subcontrols | Value1 | Value2
1.1         | 100    | 99
1.2         | 200    | 80
1.3         | 99     | 98

Reviewing the documentation and following the examples, we can enter a number manually in the query and change the value using an eval:

| inputlookup ciskvstore
| eval key=_key
| where SubControls="1.1"
| eval Value2=526
| outputlookup ciskvstore append=True

And the result would be the following:

Subcontrols | Value1 | Value2
1.1         | 100    | 526
1.2         | 200    | 80
1.3         | 99     | 98

The problem appears when we try to update the Value2 field of a Subcontrol from another query, e.g.:

| inputlookup ciskvstore append=true
| where SubControls="1.1"
| append [| search index=paloalto sourcetype="pan:threat" | stats count as Value2 ]
| outputlookup ciskvstore append=true

The result in the KV store would be the following:

Subcontrols | Value1 | Value2
1.1         | 100    | 526
1.2         | 200    | 80
1.3         | 99     | 98
            |        | 396

Could someone help me and tell me how to correctly perform the query, so that from another query I can write the Value2 field of a specific Subcontrol, please?

Thank you very much in advance,
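The extra row suggests the appended record is being inserted rather than updating the existing one. Here is a minimal Python sketch of the two semantics, with the KV store simulated as a list of dicts (the keys and helper names are made up; table values are from the example above):

```python
# Simulated KV store rows, keyed by _key as a real KV store would be.
store = [
    {"_key": "k1", "SubControls": "1.1", "Value1": 100, "Value2": 99},
    {"_key": "k2", "SubControls": "1.2", "Value1": 200, "Value2": 80},
]

# What the failing query effectively does: the new record carries no _key,
# so it lands as a brand-new row instead of replacing the old one.
def append_row(store, row):
    store.append(row)

# The desired behavior: locate the existing row by its Subcontrol
# and overwrite the field in place.
def update_by_subcontrol(store, subcontrol, value2):
    for row in store:
        if row["SubControls"] == subcontrol:
            row["Value2"] = value2
            return
    store.append({"SubControls": subcontrol, "Value2": value2})

update_by_subcontrol(store, "1.1", 396)
print(store[0]["Value2"])  # 396
print(len(store))          # 2 -> no extra row
```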
So, what I have now from my search so far:

Product | Status | Time
A       | Start  | 8.00 AM
A       | A1     | 8.05 AM
A       | A2     | 8.15 AM
A       | End    | 8.20 AM

Is there any way I can get the duration (End - Start) = 20 minutes, and then display the Duration for Product A as 20 minutes?
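The End minus Start arithmetic being asked for can be sketched in plain Python, using the rows from the table above (the time format string and function names are assumptions made to match the sample "8.00 AM" values):

```python
from datetime import datetime

rows = [
    ("A", "Start", "8.00 AM"),
    ("A", "A1", "8.05 AM"),
    ("A", "A2", "8.15 AM"),
    ("A", "End", "8.20 AM"),
]

# Collect each product's Start and End times, then subtract to get minutes.
def durations(rows):
    marks = {}
    for product, status, t in rows:
        if status in ("Start", "End"):
            marks.setdefault(product, {})[status] = datetime.strptime(t, "%I.%M %p")
    return {p: (m["End"] - m["Start"]).total_seconds() / 60
            for p, m in marks.items() if "Start" in m and "End" in m}

print(durations(rows))  # {'A': 20.0}
```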
Hi, I have a search that has the field Username, with 3 sample values:

Username
Bob Marley Peter,
Sammy Dolphin Green,
Larry Macy Jr,

I need help with the rex syntax that keeps the first name separate but joins the middle and last names, while dropping the trailing comma. In essence, I want the end result to be like this:

Username2
Bob MarleyPeter
Sammy DolphinGreen
Larry MacyJr

Please help correct this syntax:

| rex field=blah "(?i)username=(?<username2>[^,]+)"
| table username2
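The pattern shown only strips up to the comma; joining the middle and last tokens also needs a substitution step. Here is a hedged Python sketch of the intended transformation on the sample values above (the function name is invented; the SPL analogue would be a rex with mode=sed or an eval-based replace):

```python
import re

# Keep the first token, join the second and third tokens,
# and drop the trailing comma from each value.
def normalize(name):
    m = re.match(r"(\S+)\s+(\S+)\s+([^,\s]+),?$", name.strip())
    if not m:
        return name.strip().rstrip(",")
    first, middle, last = m.groups()
    return f"{first} {middle}{last}"

for n in ("Bob Marley Peter,", "Sammy Dolphin Green,", "Larry Macy Jr,"):
    print(normalize(n))  # Bob MarleyPeter / Sammy DolphinGreen / Larry MacyJr
```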