All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Are you running Splunk as root or as some other user? Using root is against security best practices! If you are running it as the splunk user, you should also check btool with that user. Otherwise there is a small possibility that those files are owned by root and the splunk user has no read access to them. Another option is that some options can be set only in …/system/local. Unfortunately you cannot use the DS to deploy those configurations into it. Maybe it's best to raise a Splunk support case for it!
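For example, you can run btool as the splunk user and check the file ownership like this (a quick sketch, assuming Splunk lives under /opt/splunk and runs as the user splunk):

  sudo -u splunk /opt/splunk/bin/splunk btool alert_actions list --debug
  ls -l /opt/splunk/etc/apps/*/default/alert_actions.conf

If btool works for root but fails for the splunk user, the ownership shown by ls is the first thing to fix.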
I have a custom command that calls a script for nslookup and returns the data to Splunk. All of it is working, but I want to use this custom command in Splunk to return the data to an eval and output that into a table. For example, the search string would look something like the following:

  index="*" | iplocation src_ip | eval testdata = | nslookupsearch dest_ip | table testdata _time | sort - _time

NOTE: This is not the exact search string, this is just a mock string. When I run:

  | nslookupsearch Record_Here

I get the correct output and data that I want to see. But when I run the command to attach the returned value to an eval, it fails. I keep getting errors when doing this, but I can't find something that will work like this. The testdata eval keeps failing.
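For context, what I am effectively after is something like this (a sketch only; it assumes nslookupsearch is a streaming command that adds a field to each event, and the field name nslookup_result is a placeholder):

  index="*"
  | iplocation src_ip
  | nslookupsearch dest_ip
  | eval testdata = nslookup_result
  | table testdata _time
  | sort - _time

My understanding is that eval can only call eval functions, not search commands, so the custom command has to run as its own pipeline stage and eval can only reference the field it produces.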
Hi All, I am new to using Splunk. I am uploading a CSV to Splunk that has a column called 'Transaction Date' with the entries in DD/MM/YYYY format, as shown below. At the Set Source Type step I have updated the timestamp format to avoid getting the default modtime. I have updated it with %d/%m/%Y, as shown below. This partly works, as my '_time' field no longer shows the default modtime. However, it shows the date in the incorrect format of MM/DD/YYYY instead of DD/MM/YYYY (also shown below). Everything else I have left as default. These are my advanced settings: Any ideas how I can fix this so it displays in the correct format? Thank you!
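For what it's worth, I can force the format in a search table with strftime (a workaround sketch; I would still prefer _time itself to render as DD/MM/YYYY):

  | eval display_time = strftime(_time, "%d/%m/%Y")
  | table display_time, "Transaction Date"

but I am hoping there is a proper setting for the display format.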
Searching for "W" or "E" will return a lot of noise.  That's why my suggested query included spaces around each letter - the goal being to find the isolated severity codes.
Have you tried math?

  index=net Model=ERT-SCM EM_ID=Redacted
  | stats count by Consumption
  | eval Consumption = exact(Consumption/100)
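If you also want the unit in the display, fieldformat keeps the field numeric (so it still sorts correctly) while rendering a suffix (a small sketch building on the search above):

  index=net Model=ERT-SCM EM_ID=Redacted
  | stats count by Consumption
  | eval Consumption = exact(Consumption/100)
  | fieldformat Consumption = tostring(Consumption) . " Kwh"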
Pretty green with SOAR and haven't been able to find a good answer to this. All of our events in SOAR are generated by pulling them in from Splunk ES. This creates one artifact for each event. I'm looking for a way to extract data from that artifact so we can start using and labeling that data. Am I missing something here? I haven't found much in the way of training on the data extraction part of this, so any tips for that would be great too.
Hello, I have 4 servers: A, B, C, and D. These servers point to two different DSs: A and B point to the US DS server, and C and D point to the UK DS server. I'm selecting these 4 servers in a multiselect input, and it has to show two different panels (hidden initially). But if I select only A and B, it should show only the US DS panel. (I don't want to show the DS values in the input values.) A rough sketch of what I am imagining is below.
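Something along these lines is what I am picturing (a sketch only, assuming a Classic/SimpleXML dashboard; the token names show_us and show_uk are placeholders):

  <input type="multiselect" token="servers">
    <label>Servers</label>
    <choice value="A">A</choice>
    <choice value="B">B</choice>
    <choice value="C">C</choice>
    <choice value="D">D</choice>
    <change>
      <eval token="show_us">if(match("$servers$", "A|B"), "true", null())</eval>
      <eval token="show_uk">if(match("$servers$", "C|D"), "true", null())</eval>
    </change>
  </input>

  <row>
    <panel depends="$show_us$">US DS panel</panel>
    <panel depends="$show_uk$">UK DS panel</panel>
  </row>

The depends attribute keeps each panel hidden until its token is set, and the eval unsets the token (via null()) when no matching server is selected. I am not sure this is the right approach, though.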
Hello, thank you for your answer. I tried your command and I got:

  root@MSVMSLMCLM01:/opt/splunk/bin# ./splunk btool alert_actions list --debug | grep allowed
  /opt/splunk/etc/apps/setSplunkCommonConfig/default/alert_actions.conf  allowedDomainList = domain.sk
  root@MSVMSLMCLM01:/opt/splunk/bin# ./splunk btool alert_actions list --debug | grep from
  /opt/splunk/etc/apps/setSplunkCommonConfig/default/alert_actions.conf  from = splunk@domain.sk

So it looks like the settings are coming from the correct file, the one from the pushed application. But when I check the web UI on this machine, those values are empty. Any idea?
You know there is a field alias feature in Splunk, too. That is a more appropriate solution if you really do want to search by a different name. An extra lookup is clunky and also a compute cost. Go to Settings -> Fields -> Field aliases.
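If you prefer configuration files, the same alias can be defined in props.conf (a minimal sketch; the stanza and field names are placeholders):

  [your_sourcetype]
  FIELDALIAS-shortname = original_field AS new_field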
The problem I am having is the raw data looks like this:

  [8/8/24 13:37:46:622 EDT] 00007e14 HOSTEDWIRES** I ************

What I am trying to do is search the raw data to find the "W" and the "E", i.e. lines like:

  [8/8/24 13:37:46:622 EDT] 00007e14 HOSTEDWIRES** W ************

or

  [8/8/24 13:37:46:622 EDT] 00007e14 HOSTEDWIRES** E ************

A basic search I am using (sorry, I had to obfuscate some of the SPL):

  index="index" host IN ("Server 1","Server 2","Backup Server 1","Backup Server 2") source=* sourcetype=###_was_systemout_log | ("W" or "E")

In WebSphere SystemOut logs, the warning or error indicator comes after the timestamp and application type. So, when I search for just ("W" or "E"), it pulls everything that has "W" or "E" anywhere in the text. How do I isolate the search to that flag after the application type and before the transaction raw data? I don't get to play with Splunk that much, so this is beyond my skill level. I am still learning. Thanks again for the help.
Hi. If this is your own custom app, then just update splunklib from git, as you originally installed it. There are instructions on dev.splunk.com for how to use splunklib in your own apps. If it's made by someone else, then ask the owner to update it, or just create your own version and update it as described in the previous item. r. Ismo
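For example (a rough sketch, assuming your app lives under $SPLUNK_HOME/etc/apps/your_app; the app name is a placeholder):

  git clone https://github.com/splunk/splunk-sdk-python.git
  cp -r splunk-sdk-python/splunklib $SPLUNK_HOME/etc/apps/your_app/bin/

Then restart Splunk so the app picks up the updated library.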
Hello @yuanliu, I tested and it worked fine for the sample. I accepted your suggestion as the solution. Thank you for your help.

1) Max 50k rows
When I tested with the real data, I found out that the subsearch CSV file is limited to 50k rows. I need the CSV file as my baseline for a left join, so if the file has 100k rows, then the expected result after the left join is 100k rows (with an additional column from the index).
a) What do you suggest to fix this issue? (modifying limits.conf is not allowed)
b) Will splitting the CSV work?

2) Join command
Do you think the join command can work in my case? I tested your solution using join on the real data, but it always gave me the result of an inner join instead of a left join, although I already specified join type=left. In the solution you provided, the index will be treated as the left data because it's specified first. How do I make the CSV the left data? One join-free pattern I am considering is sketched below.

I appreciate your help. Thanks
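This is the sketch (assumptions: the lookup file is baseline.csv, the common key field is key, extra_field comes from the index, and csv_col is any column that exists only in the CSV; all of those names are placeholders). Since inputlookup append=true adds the CSV rows to the main results without going through a subsearch, the 50k cap should not apply to the CSV side:

  index=your_index
  | fields key, extra_field
  | inputlookup append=true baseline.csv
  | stats values(extra_field) as extra_field, values(csv_col) as csv_col by key
  | where isnotnull(csv_col)

The final where keeps exactly the keys present in the CSV, which is what makes the CSV the left side of the join.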
I am trying to create a dashboard that uses a search that returns a 6-digit number, but I need a decimal before the last 2 digits. This is the result I get:

  index=net Model=ERT-SCM EM_ID=Redacted | stats count by Consumption

  199486

I would like it shown like this:

  1994.86 Kwh

I have tried this, but it only gives me the last 2 numbers with a decimal:

  | rex mode=sed field=Consumption "s/(\\d{4})/./g"
Login to any of those servers and use:

  splunk btool alert_actions list --debug

In this way you see from which file each setting is coming. I'm not sure, but there could be some settings in this config which work only from …/system/local, or at least that was the case on older versions (6.x and 7.x)? r. Ismo
Hi @., You need Controller version 24.6 or higher. You are on an older Controller version. Can you try upgrading to 24.6?
The most important thing is to determine which index (not indexer) holds the WebSphere logs. That will narrow the scope of your search. Once you have that information, you can begin your search. Start with " W " and " E ". Those aren't great strings for searching, but they're a start. As you receive results, use what you find to add to the search string until you have what you want.

  index=websphere (" W " OR " E ")
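Once you confirm the line layout holds, a rex extraction can isolate the severity flag so stray Ws and Es elsewhere in the text no longer match (a sketch based on the sample line in the question; adjust the pattern if the real format differs):

  index=websphere
  | rex "^\[[^\]]+\]\s+\S+\s+\S+\s+(?<severity>[WE])\s"
  | search severity=*
  | stats count by severity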
I tried to use the "customized in source" option in Splunk Cloud (9.1.2312.203) Dashboard Studio to create a Single Value whose background color is controlled by the search result. However, the code does not work. The same color value, set through the static option, works fine. Below is the dashboard JSON:

{
  "visualizations": {
    "viz_74mllhEE": {
      "type": "splunk.singlevalue",
      "options": {
        "majorValue": "> sparklineValues | lastPoint()",
        "trendValue": "> sparklineValues | delta(-2)",
        "sparklineValues": "> primary | seriesByName('background_color')",
        "sparklineDisplay": "off",
        "trendDisplay": "off",
        "majorColor": "#0877a6",
        "backgroundColor": "> primary | seriesByName('background_color')"
      },
      "dataSources": {
        "primary": "ds_00saKHxb"
      }
    }
  },
  "dataSources": {
    "ds_00saKHxb": {
      "type": "ds.search",
      "options": {
        "query": "| makeresults \n| eval background_color=\"#53a051\"\n"
      },
      "name": "Search_1"
    }
  },
  "defaults": {
    "dataSources": {
      "ds.search": {
        "options": {
          "queryParameters": {
            "latest": "$global_time.latest$",
            "earliest": "$global_time.earliest$"
          }
        }
      }
    }
  },
  "inputs": {
    "input_global_trp": {
      "type": "input.timerange",
      "options": {
        "token": "global_time",
        "defaultValue": "-24h@h,now"
      },
      "title": "Global Time Range"
    }
  },
  "layout": {
    "type": "absolute",
    "options": {
      "width": 1440,
      "height": 960,
      "display": "auto"
    },
    "structure": [
      {
        "item": "viz_74mllhEE",
        "type": "block",
        "position": {
          "x": 0,
          "y": 0,
          "w": 250,
          "h": 250
        }
      }
    ],
    "globalInputs": [
      "input_global_trp"
    ]
  },
  "description": "",
  "title": "ztli_test"
}
There are two separate challenges: one is transforming the presentation, the other is getting the headers into the desired order. Here is my crack at it. To begin, you need to extract TransID and the marker "Start Time" or "End Time". How you do that is up to you, because the data illustrated doesn't seem to be the raw format, at least not the timestamp. I will take the illustrated format literally.

  | rex "(?<time>\S+) (?<TransID>\S+) \"(?<marker>[^\"]+)"
  | streamstats count by marker
  | eval marker = marker . count
  | xyseries TransID marker time
  | transpose 0 header_field=TransID
  | eval order = if(column LIKE "Start%", 1, 2)
  | eval sequence = replace(column, ".+Time", "")
  | sort sequence order
  | fields - sequence order
  | transpose 0 header_field=column column_name=TransID

So, the bigger challenge is to get the desired order of headers. I have to expend two transposes. If you do not need that strict order, things are much simpler. Output from your mock data is:

  TransID  Start Time1  End Time1  Start Time2  End Time2  Start Time3  End Time3
  0123     8:00         8:01       8:30         8:31       9:00         9:01

Here is an emulation for you to play with and compare with real data:

  | makeresults format=csv data="_raw
  8:00 0123 \"Start Time\"
  8:01 0123 \"End Time\"
  8:30 0123 \"Start Time\"
  8:31 0123 \"End Time\"
  9:00 0123 \"Start Time\"
  9:01 0123 \"End Time\""
  ``` the above emulates index=<app> "Start Time" OR "End Time" ```
Thank you @isoutamo for your reply. I will look into the tool.  
Hi, have you tried this: https://splunkbase.splunk.com/app/3757 ? Of course, if the issue is that Azure has these internal delays, there is nothing that can be fixed by integrations. If this is the issue, then you should contact Azure support and ask them if there are any workarounds for it. r. Ismo