
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi @gcusello , We've been asked to upgrade our existing Splunk version (7.1.3) to 8.1, so we are now upgrading the apps that are not compatible with 8.1.x. I've started with the Symantec App. Its current version is 1.0.3, which is no longer supported by Splunk, and I'd appreciate your help upgrading this app (I've downloaded a newer version), but I'm stuck because I don't know how to configure it. Regards, Rahul Gupta
Hi Splunkers! I currently use 3 indexers to ingest my data and respond to search jobs. We use ES in our deployment. My indexers' hardware is 3 DL38 G7 servers with 12 physical cores and 128GB of RAM each. The daily ingested volume is 500GB/day, although occasionally it has exceeded 1.5TB/day; that has happened very few times. I have problems with my ES because the correlation searches are always delayed by the lack of CPU on my indexers. Now my company has decided to upgrade the indexers. They are suggesting 2 DL560 G10 servers with 192 physical cores and 1.5TB of RAM. That's great, isn't it? My only concern is that in my current deployment, if one of my indexers goes down, I still have 2 indexers ingesting data and responding to search jobs, but if I replace the old servers with the new ones and one indexer goes down, I'll have just one indexer left. So what's your professional recommendation in my circumstances?
Hi, I am trying to search for two strings in the message field, "Stopped successfully" and "connected", from 6 host names. Please help; I am writing it like this: Source="WinEventlog:applicaiton" | rex "message\s(?<message>.*)" | search host="host1" OR host="host2" | search message="stopped successfully" OR message="Connected" | table _time, host, message
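A hedged rewrite of the search above, assuming standard Windows Application event logs and six known hosts (the host names are placeholders):

```
source="WinEventLog:Application" host IN (host1, host2, host3, host4, host5, host6)
("Stopped successfully" OR "connected")
| table _time, host, _raw
```

Quoted phrases in the base search are matched case-insensitively against the raw event, so a separate rex is only needed if the message text is wanted as its own field.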
I have a single algorithm with 2 methods. Each method produces the same type of data but with different field names to keep them separated. The dashboard charts depend on which method the user selects in a menu. Essentially I create interim results for both methods but want to rename them to the field names used in the subsequent code. [Q] What is a more efficient way of performing the "Big Switch" in the run-anywhere code below?

| makeresults 5
| rename comment AS "-----------------------------------------------------------------"
| rename comment AS "User Menu Selection"
| eval switch="A"
| rename comment AS "-----------------------------------------------------------------"
| rename comment AS "Algorithm element1"
| eval calcMethod1_field1="1"
| eval calcMethod1_field2=2
| eval calcMethod1_field3=3
| eval calcMethod1_field4=4
| eval calcMethod1_field5=5
| rename comment AS "-----------------------------------------------------------------"
| rename comment AS "Algorithm element2"
| eval calcMethod2_field1="1sub"
| eval calcMethod2_field2="2sub"
| eval calcMethod2_field3="3sub"
| eval calcMethod2_field4="4sub"
| eval calcMethod2_field5="5sub"
| rename comment AS "-----------------------------------------------------------------"
| rename comment AS " Big Switch "
| rename comment AS "-----------------------------------------------------------------"
| rename comment AS "This is the big switch before entering a stats command"
| rename comment AS "Intent is to rename several fields depending on switch value"
| eval fieldnameforstats_field1=case(switch=="A",calcMethod1_field1,switch=="B",calcMethod2_field1)
| eval fieldnameforstats_field2=case(switch=="A",calcMethod1_field2,switch=="B",calcMethod2_field2)
| eval fieldnameforstats_field3=case(switch=="A",calcMethod1_field3,switch=="B",calcMethod2_field3)
| eval fieldnameforstats_field4=case(switch=="A",calcMethod1_field4,switch=="B",calcMethod2_field4)
| eval fieldnameforstats_field5=case(switch=="A",calcMethod1_field5,switch=="B",calcMethod2_field5)
| fields - _time
| table fieldnameforstats_field*
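For the Big Switch itself, a hedged alternative is foreach, which handles all five fields (or however many) with one template; <<MATCHSTR>> expands to whatever the * matched:

```
| foreach calcMethod1_* [
    eval fieldnameforstats_<<MATCHSTR>>=if(switch=="A", 'calcMethod1_<<MATCHSTR>>', 'calcMethod2_<<MATCHSTR>>')
]
| fields - _time
| table fieldnameforstats_field*
```

This assumes, as in the run-anywhere example, that the two methods produce the same field suffixes and only the calcMethodN_ prefix differs.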
I have created a multiselect dashboard which has a City and an Address. If I select ABCadd from City, I can see multiple addresses in Address, but when I add BBCadd from City, so that the first multiselect has two selections, I can only see the first selected city's addresses; when I select ALL and search, I can see both addresses. I want both selections reflected in the second multiselect so I can select only the addresses I want. I am using a csv file and inputlookup for the multiselects. Here is my XML:

<input type="multiselect" token="City " searchWhenChanged="true">
  <label>City </label>
  <fieldForLabel>City </fieldForLabel>
  <fieldForValue>City </fieldForValue>
  <search>
    <query>|inputlookup aaa.csv | dedup City | table City </query>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </search>
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter> OR </delimiter>
  <choice value="*">All</choice>
</input>
<input type="multiselect" token="ShopName" searchWhenChanged="true">
  <label>Shop Name</label>
  <choice value="*">ALL</choice>
  <fieldForLabel>ShopName</fieldForLabel>
  <fieldForValue>ShopName</fieldForValue>
  <search>
    <query>|inputlookup aaa.csv | search City =$City$| dedup ShopName| table ShopName</query>
    <earliest>-15m</earliest>
    <latest>now</latest>
  </search>
  <valuePrefix>"</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter> OR </delimiter>
</input>
<search>
  <query>index=*  City =$City $ ShopName=$ShopName$ |stats values(ShopName) as ShopName, count by Address</query>
  <earliest>$Timer.earliest$</earliest>
  <latest>$Timer.latest$</latest>
</search>
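For what it's worth, one likely culprit in the XML above is the quoting: with <valuePrefix>"</valuePrefix> and <delimiter> OR </delimiter>, two selections expand $City$ to "ABCadd" OR "BBCadd", so City =$City$ becomes City ="ABCadd" OR "BBCadd" and only the first value is actually compared against City. A hedged sketch that folds the field name into the prefix instead, so each selection becomes its own comparison:

```xml
<valuePrefix>City="</valuePrefix>
<valueSuffix>"</valueSuffix>
<delimiter> OR </delimiter>
```

with the second multiselect's query changed to |inputlookup aaa.csv | search ($City$) | dedup ShopName | table ShopName. Also worth checking: the token is declared as token="City " (trailing space) but referenced both as $City$ and as $City $ in the searches, and those references must match the declared name exactly.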
In search, with the rex command I can specify which field I want to apply the regex to, as in the following example: | rex field=event "My Custom regex...." But if I want to register the same regex via the Field Extraction option (to have it as a reusable object with my team), I don't see any option to specify the field. I assume it registers it against the entire _raw by default. Any idea if I can specify the field when I create a field with "Field Extraction"?
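For reference, the props.conf form of an inline extraction does accept a source field via the `in` keyword, so one hedged option is to register it there instead of through the UI (the stanza, extraction class name, and field here are placeholders):

```
[your_sourcetype]
EXTRACT-my_extraction = My Custom regex.... in event
```

The UI's Field Extractor, by contrast, is built around _raw, which is likely why no field selector is offered there.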
Hi All, I have a use case to align two stacked graphs side by side. There are 4 columns with values for any particular date. The X axis will have the date values. There should be 2 bars for each date, and each bar has 2 columns' data stacked. I hope I have made the question clear; can someone please help? Thanks in advance!
Hi, how can I parse ISO 8583 messages in Splunk? Here is a sample ISO 8583 message from my log: 10:10:00 Message [0200323A40010841801038000000000000000004200508050113921208050420042251320720000010000001156040800411 01251146333156336000299] I want to parse the message and extract the fields in it, like this website that parses a sample message: https://licklider.cl/services/financial/iso8583parser/ More info: https://en.wikipedia.org/wiki/ISO_8583 http://www.lytsing.org/downloads/iso8583.pdf Any idea? Thanks
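ISO 8583 is positional, so plain SPL rex is a poor fit; a small external script (for example behind a scripted lookup or scripted input) is the usual route. As a hedged starting point, the MTI and primary bitmap sit at fixed offsets, so stdlib Python can at least report which data elements are present:

```python
def parse_iso8583_header(msg: str):
    """Return the MTI and the data elements flagged in the primary bitmap.

    Assumes an ASCII-hex primary bitmap right after a 4-character MTI, as in
    the sample above; real feeds vary (binary bitmaps, and a secondary
    bitmap follows when bit 1 is set).
    """
    mti = msg[:4]
    bitmap = int(msg[4:20], 16)  # 16 hex chars = 64 bits
    # Bit i (1-based, most significant first) set => data element i present
    present = [i for i in range(1, 65) if bitmap & (1 << (64 - i))]
    return mti, present
```

Decoding the elements themselves then needs the per-field length table from the spec (fixed-width vs. LLVAR/LLLVAR), which is where a dedicated library or the external parser linked above earns its keep.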
sourcetype=cp_log action!=Drop OR action!=Reject OR action!=dropped  I am shocked: when I search with the above query for my Checkpoint logs, it still shows me Drop traffic, although I have clearly stated in the query that I don't want Drop traffic (action!=Drop). Kindly help me with this!
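For context: action!=Drop OR action!=Reject is true for every event, because any given action value fails at most one of the inequalities. The conditions need AND semantics, e.g. either of these hedged variants:

```
sourcetype=cp_log action!=Drop action!=Reject action!=dropped

sourcetype=cp_log NOT action IN (Drop, Reject, dropped)
```

Note the two are not quite identical: action!=Drop only matches events that have an action field, while the NOT form also matches events with no action field at all.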
Hi, I have data that looks like this (as you can see, user_id 9 fills numerous rows). This is just a csv ingested and searched via lookup:

user_id | meta_key | meta_value
9 | nickname | 341
9 | first_name | Gilda
9 | last_name | Lilia
9 | description |
9 | rich_editing | TRUE
9 | syntax_highlighting | TRUE
9 | comment_shortcuts | FALSE
9 | bz_last_active | 202024300
9 | _sd_last_login | 2251532
9 | _jackqueline_persistent_cart_1 | a:1:{s:4:"cart";a:0:{}}
9 | _order_count | 0
9 | new_users_id | XM00360
9 | antonetta | a:0:{}
9 | rank_on_departure | TAD 0fr Class
9 | phone_number | 12003601
9 | add_love | 1/120 CARSON ROAD
9 | last_name_01 | Lashawnda
9 | christina_code | 1100
9 | last_name_01_05 | Wendolyn
9 | birth_date | 05/00/0451
9 | country | Stephania
9 | join_date | 13/02/2003
9 | Date_left | 1/05/2010
9 | full_name | gilda lilia
9 | email | gilda_lilia@outlook.com

I really want the output to look like this, where the selected items come on a single row linked to the unique user_id. At this stage I have over 3600 unique users and, on average, about 40 rows per user.

user_id | first_name | last_name | new_users_id | rank_on_departure | add_love | last_name_01 | christina_code | last_name_01_05 | country | email
9 | Gilda | Lilia | XM00360 | TAD 0fr Class | 1/120 CARSON ROAD | Lashawnda | 1100 | Wendolyn | Stephania | gilda_lilia@outlook.com
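One hedged way to pivot the meta_key/meta_value pairs onto one row per user (the lookup name is a placeholder; the key list comes from the desired output above):

```
| inputlookup user_meta.csv
| where meta_key IN ("first_name", "last_name", "new_users_id", "rank_on_departure",
    "add_love", "last_name_01", "christina_code", "last_name_01_05", "country", "email")
| xyseries user_id meta_key meta_value
```

chart values(meta_value) over user_id by meta_key does much the same and is more forgiving if a user somehow has two values for the same key.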
Hi, I want to show a Splunk report on a Confluence page. This macro on the Atlassian Marketplace lets us make REST API requests from pages: https://marketplace.atlassian.com/apps/1211199/pocketquery-for-confluence-sql-rest?hosting=cloud&tab=overview Now, I have a report that shows the top errors of a web application as a pie chart in Splunk. The name of the report is "top-errors-week". How can I show this on a Confluence page? FYI: this is a dynamic chart, and every time I change the report date I expect the change to apply automatically to the Confluence chart too. Any idea? Thanks
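If PocketQuery can hit arbitrary REST endpoints, one hedged option is Splunk's export endpoint, which runs a saved report on demand and streams back its rows (the host and port are placeholders; the management port is typically 8089, and the search parameter must be URL-encoded):

```
https://your-splunk-host:8089/services/search/jobs/export?output_mode=json&search=| savedsearch "top-errors-week"
```

Because the report runs at request time, each page load reflects the report's current definition, though Confluence would still have to render the chart itself from the returned rows.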
There are various event codes, like eventID="123", eventID="456", eventID="789". There are some appID values that occur in both eventID="123" AND eventID="456" (not all appID values occur in both of these eventIDs). I want to display a list of all the appID values that occur in both eventID="123" AND eventID="456". Please let me know how I can achieve this. I also have a large data set here. Thank you.
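A hedged sketch (the index name is a placeholder): count distinct eventIDs per appID and keep only the appIDs seen under both.

```
index=your_index eventID IN (123, 456)
| stats dc(eventID) as distinct_events by appID
| where distinct_events=2
| fields appID
```

A single stats pass scales well on large data sets, which makes this preferable to join-style approaches here.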
Hi. First, I've been using this forum for a few months now, as I'm new to Splunk. Thanks to all the contributors on here!! So here's what I'm trying to figure out. I have a dashboard with 3 line charts. Each line chart is in its own unique panel (each panel has its own unique id). I'd like to change the panel background when a returned query value is above a certain number, but to green when it's below that number. This is the sample code I'm using to hardcode the 3 panels to either red or green:

<row>
  <panel>
    <html>
      <style>
        #A .dashboard-panel { background: #5AAF71 !important; }
        #B .dashboard-panel { background: #E35033 !important; }
        #C .dashboard-panel { background: #5AAF71 !important; }
      </style>
    </html>
  </panel>
</row>

But I'm looking for a way to dynamically change the panel background colors based on the value the query returns. Also, I am not an admin and we don't have permissions to load JavaScript/CSS files, so all of my coding will have to live in the dashboard XML. I have used ranges before to change colors in singles; I'm just not sure if it's possible for the panel background that the line chart sits in. Thanks in advance!
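Tokens are substituted inside <html> panels, so a hedged simple-XML sketch of the usual approach sets a color token from each panel's search in a <done> handler and lets the <style> block consume it (the query, threshold, and token name here are made up for illustration; #A matches the panel id above):

```xml
<search>
  <query>index=main sourcetype=web | stats max(response_time) as maxval</query>
  <done>
    <condition match="$result.maxval$ &gt; 100">
      <set token="colorA">#E35033</set>
    </condition>
    <condition>
      <set token="colorA">#5AAF71</set>
    </condition>
  </done>
</search>
<html>
  <style>
    #A .dashboard-panel { background: $colorA$ !important; }
  </style>
</html>
```

Repeating the pattern with colorB and colorC tokens would cover the other two panels.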
Issue: Source log events not forwarded after log rotation.

Splunk UF version:
/opt/splunk# /opt/splunk/bin/splunk version
Splunk Universal Forwarder 7.0.0 (build c8a78efdd40f)

inputs.conf:
[monitor:///var/lib/origin/openshift.local.volumes/pods/*/volumes/kubernetes.io~empty-dir/applog/pipe-co.log]
sourcetype = pipe-co
ignoreOlderThan = 12h
crcSalt = <SOURCE>
index = pipe
disabled = false

The Splunk UF PID:
ps -ef | grep -i splunk | grep -v grep
root 75265 75137 1 19:19 ? 00:00:56 splunkd -p 8089 start

ls -al /var/lib/origin/openshift.local.volumes/pods/e43d5812-ebe7-11eb-bf87-48df374d0d30/volumes/kubernetes.io~empty-dir/applog/
total 4876
drwxrwsrwx. 2 root 1000100000 187 Jul 23 20:13 .
drwxr-xr-x. 3 root root 20 Jul 23 18:57 ..
-rw-r--r--. 1 1000100000 1000100000 960597 Jul 23 20:06 pipe-co-2021-07-23T20-06-16.710.log.gz
-rw-r--r--. 1 1000100000 1000100000 964929 Jul 23 20:09 pipe-co-2021-07-23T20-09-57.963.log.gz
-rw-r--r--. 1 1000100000 1000100000 963195 Jul 23 20:13 pipe-co-2021-07-23T20-13-26.509.log.gz
-rw-r--r--. 1 1000100000 1000100000 2021943 Jul 23 20:14 pipe-co.log

Any idea? Thanks
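One hedged suspect in the stanza above is ignoreOlderThan: it behaves as an implicit block list, so a file that ever matched it (e.g. during a quiet period) is not monitored again even after new writes. A sketch of the stanza without it:

```
[monitor:///var/lib/origin/openshift.local.volumes/pods/*/volumes/kubernetes.io~empty-dir/applog/pipe-co.log]
sourcetype = pipe-co
crcSalt = <SOURCE>
index = pipe
disabled = false
# ignoreOlderThan removed: files it excludes are permanently skipped,
# which can silently drop a freshly rotated pipe-co.log
```

UF 7.0.0 is also an old release, so upgrading the forwarder is worth considering as well.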
I have a query that finds what I need for the current time, and I saved it as a scheduled report. However, I also need the same statistics from my historical data and can't seem to figure out a good way to run it. The query: index=red_cont | dedup id sortby - _time | where status=="blue" | stats count by level The query runs at the beginning of every hour, which is great for current and future data, but how would I go about getting a snapshot count for every hour from a certain date, such as "1/1/21", until now? I understand I can do this manually one hour at a time using the time picker and changing the latest hour, but that would take a really long time. Thanks
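A hedged way to replay those hourly snapshots in one search, assuming the dedup is meant per hour and at least one event per id lands in each hour (field names as in the query above):

```
index=red_cont earliest=01/01/2021:00:00:00
| bin _time span=1h as hour
| stats latest(status) as status, latest(level) as level by hour, id
| where status=="blue"
| stats count by hour, level
```

The caveat: this takes the latest event per id within each hour, whereas the scheduled report sees everything up to the moment it runs, so ids that go quiet for an hour drop out of that hour's snapshot here.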
Hello, I'm new to Splunk and I've been given the task of adding new types of devices to our Splunk deployment. This includes creating dashboards so we can find the information we want to know more quickly. Currently we use many different devices: Cisco, Juniper and Calix, to name a few. We capture all of the information using the same source. What I want to do is create different dashboards for the different types of devices on the network, so you can look at all the different errors or other troubles coming in on certain devices. I tried tagging a few devices based on hostname, but this seems impractical and a very long process. I also tried extracting fields on the various logs that come in, but since the devices use different message formats, it causes conflicts when I try to extract fields. Would it be easier to split up the devices by sending them to different sources, i.e. udp xxx1 for Cisco, xxx2 for Juniper and so forth? Or is there an easier way? I have the Cisco IOS app installed and I notice the sourcetype from Cisco devices is set to Cisco IOS. Would it be easy to set something like that up for my other devices?
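The per-port split described above is a common pattern; a hedged inputs.conf sketch (the ports, index, and sourcetype names here are illustrative):

```
[udp://5141]
sourcetype = cisco:ios
index = network

[udp://5142]
sourcetype = juniper:junos
index = network

[udp://5143]
sourcetype = calix
index = network
```

One sourcetype per vendor lets field extractions and dashboards key off sourcetype the way the Cisco IOS app already does, instead of fighting over one mixed feed. A syslog relay (e.g. syslog-ng) in front of the forwarder, routing by sending host, is a common alternative when devices can't change their destination port.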
Hi Splunk Experts, I wonder if you could help me put the below logic into a search query? Here is the link to my original question: https://community.splunk.com/t5/Splunk-Search/kv-store-search-send-alert-and-also-store-the-the-alert-sent/m-p/560289#M159234 Thanks

"The logic of your requirement seems to be that there are two situations when a user appears in the audit (satisfying the conditions). Either they are in the list of alerts from yesterday, or they are not. If they were not in the list from yesterday, send an alert and add them to the list (noting when they were added). If they were in the list, don't send an alert but note they were there. Now, process the list and remove anyone who didn't appear today (so that an alert will be generated next time they appear on the list). Also, remove anyone who has been on the list for 7 days including today (so that an alert will be generated next time they appear on the list, even if it is tomorrow - day 8)."

Day | Audit name | Alert name at start | Alert sent date at start | Alert name at end | Alert sent date at end | Send alert
1 | James | | | James | 1 | Y
1 | Michael | | | Michael | 1 | Y
2 | James | James | 1 | James | 1 | N
2 | | Michael | 1 | | |
3 | James | James | 1 | James | 1 | N
3 | Michael | | | Michael | 3 | Y
4 | James | James | 1 | James | 1 | N
4 | Michael | Michael | 3 | Michael | 3 | N
5 | James | James | 1 | James | 1 | N
5 | Michael | Michael | 3 | Michael | 3 | N
6 | James | James | 1 | James | 1 | N
6 | | Michael | 3 | | |
7 | James | James | 1 | James | 1 | N
7 | Michael | | | Michael | 7 | Y
8 | James | | | James | 8 | Y
8 | | Michael | 7 | | |
To provide predictive maintenance: does the app Splunk Essentials for Predictive Maintenance (#4375) have to be installed on each server individually? Thank you very much in advance for your reply.
I have a custom generating command that returns events to Splunk; however, those events are not parsed, so the kv data in them is not available later in the search. I'm using the Python SDK. I've tried both type='streaming' and type='events' in the @Configuration() decorator. How do I get Splunk to parse the events I'm giving it?
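Search-time field extraction is generally not applied to records a custom command emits, so the command has to supply the fields itself: yield each result as a dict whose keys are the field names, alongside _raw and _time. A hedged stdlib sketch of the kv-parsing half (the regex is illustrative, and the GeneratingCommand shown in the comment, including fetch_events, is hypothetical):

```python
import re

def extract_kv(raw: str) -> dict:
    """Parse key=value pairs (values optionally double-quoted) from a raw event."""
    return {k: v.strip('"') for k, v in re.findall(r'(\w+)=("[^"]*"|\S+)', raw)}

# Inside a splunklib.searchcommands GeneratingCommand, generate() would then
# yield dicts -- Splunk turns each key into a search-time field:
#
#     def generate(self):
#         for raw in self.fetch_events():        # fetch_events is hypothetical
#             record = {'_raw': raw, '_time': time.time()}
#             record.update(extract_kv(raw))
#             yield record
```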
Hi All, I am stuck with a 'Waiting for Input' error on one of the panels I created in a Splunk dashboard; however, the search runs fine in the Search app. Reading through other similar questions, it seems related to tokens. I tried rectifying it, but with no luck. Here are the search and the XML:

Search:
| inputlookup xxxxxxx.csv
| stats dc(title) as number_of_rule, values(title) as rules by category
| map [| inputlookup yyyyyyyy.csv
    | eval Date=strftime(_time, \"%m/%d/%Y\")
    | eval month=strftime(_time, \"%m\")
    | eval current_month=strftime(now(),\"%m\")
    | where month=current_month-1
    | search index=$$category$$
    | stats sum(GB) as GB by index
    | eval GB=round(GB,3)
    | eval index=\"$$category$$\", number_of_rule=\"$$number_of_rule$$\"
    | table index, number_of_rule, GB ]

XML:
{
    "type": "ds.search",
    "options": {
        "query": "| inputlookup xxxxxxx.csv\r\n| stats dc(title) as number_of_rule, values(title) as rules by category\r\n| map [| inputlookup yyyyyyyyy.csv\r\n| eval Date=strftime(_time, \\\"%m/%d/%Y\\\")\r\n| eval month=strftime(_time, \\\"%m\\\")\r\n| eval current_month=strftime(now(),\\\"%m\\\")\r\n| where month=current_month-1\r\n| search index=$$category$$\r\n| stats sum(GB) as GB by index\r\n| eval GB=round(GB,3)\r\n| eval index=\\\"$$category$$\\\", number_of_rule=\\\"$$number_of_rule$$\\\" | table index, number_of_rule, GB\r\n]"
    },
    "name": "Search_8"
}

Thanks in advance!