All Topics


Hello, I'm trying to compare the latest data with data from seven days back. I want to create column charts in a dashboard: one chart with today's data, the other with the data from 7 days ago. For example, if I select "Last 15 minutes" from the dropdown, both panels should display data for the same time interval (one panel for today's date and the other for the date 7 days back). Is this possible?
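One way to sketch the idea (untested; the index name is a placeholder) is to give the comparison panel the same time window shifted back 7 days using relative time modifiers:

```
Panel 1 (today):      index=my_index earliest=-15m    latest=now
Panel 2 (7 days ago): index=my_index earliest=-7d-15m latest=-7d
```

To drive both panels from one dropdown, the same offset pattern can be applied to the dropdown's time tokens instead of the hardcoded -15m.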
Hi, I'm trying to disable an alert, but while doing so I'm getting an error. Can you please help with this? Please note that I am not the owner of this alert. Is it possible that this is why I'm unable to disable it? The error is:

Could not find object id=WMS WK: Auto Wave Status Has Changed

Regards,
Rahul
Consider a field value which contains a list of comma-separated field names, such as 'fieldList' in this example:

| makeresults
| eval host="server42"
| eval location="dallas"
| eval temp="50"
| eval color="blue"
| eval fieldList="temp,host,color"

I want to create a new field containing the concatenated values of the fields in 'fieldList', like this:

| eval concatenatedValue = temp . host . color

... which, in this example, would result in 'concatenatedValue' containing a value of "50server42blue". The next event might have fieldList="location,temp,host", which would need to be evaluated in a similar fashion. Any suggestions?
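One possible approach (an untested sketch; `foreach mode=multivalue` requires Splunk 8.2 or later) is to split the list and let foreach substitute each field name back into an eval:

```
| makeresults
| eval host="server42", location="dallas", temp="50", color="blue"
| eval fieldList="temp,host,color"
| eval fieldNames=split(fieldList, ",")
| eval concatenatedValue=""
| foreach mode=multivalue fieldNames
    [ eval concatenatedValue=concatenatedValue . '<<ITEM>>' ]
```

The trick is that `<<ITEM>>` is a textual substitution: for the item "temp" the template becomes `'temp'`, which dereferences the temp field's value rather than the literal string.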
I have the following query:

splunk_server=indexer* index=wsi sourcetype=fdpwsiperf (channel_type=ofx2 OR agent_service=OfxAgent) domain=tax api_version=v1 capability=* tax_year=2019 NOT *test* NOT *jmeter-automation* ofx_codes!="[15500,2000]"
| lookup Provider_Alert.csv Provider_ID AS partnerId OUTPUT Tier Form_Type
| search Tier=Tier1
| eval time_bucket=case(_time>=relative_time(now(),"-1h"), "last_hour", 1==1, "prior_hour")
| eval error_type=case(error_code_host="2000", "OFX_2000", error_code_service IN ("5000","5001"), "provider_unavailable", like(http_status_code_host,"5%"), "HTTP_500", 1==1, "null")
| eval combo=partnerId."::".provider_id."::".Form_Type."::".host_base_url."::".error_type
| chart dc(intuit_tid) as total_requests by combo time_bucket
| eval partnerId=mvindex(split(combo,"::"),0)
| eval provider_id=mvindex(split(combo,"::"),1)
| eval Form_Type=mvindex(split(combo,"::"),2)
| eval host_base_url=mvindex(split(combo,"::"),3)
| eval error_type=mvindex(split(combo,"::"),4)
| fields partnerId provider_id Form_Type host_base_url error_type last_hour prior_hour

This produces a table where the following result is possible:

partnerId  provider_id  Form_Type  host_base_url  error_type  last_hour  prior_hour
partner1   XYZ          FormA      urlB           null        50         30
partner1   XYZ          FormA      urlB           HTTP_500    12         20
partner2   ABC          FormB      urlZ           null        20         30

I would like to add a column that sums the values in last_hour grouped by partnerId, so that in the above example I would have another column (e.g. extra_column) containing 62 (i.e. 50 + 12 = 62) in both rows for partner1. Extra note: I need the volume breakdown by error_type, but not in a chart format. How can I achieve this?
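A minimal sketch of the summing step, reusing the field names from the query above (untested): append an eventstats after the final fields command.

```
| eventstats sum(last_hour) as extra_column by partnerId
```

Unlike stats, eventstats attaches the aggregate to every matching row without collapsing them, so both partner1 rows would carry extra_column=62.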
All, I thought I posted this before, but I can't find it in my history. I am seeing alerts in my Splunk logs stating that I am getting data from the future on my sourcetype script:installedapps. It's default and unmodified from the Splunk_TA_Windows standard. From there I did notice that _indextime and _time were off a bit. When I look at the props.conf provided by Splunk_TA_Windows, it has no timestamp recognition for this sourcetype. Is there a reason for this? Should I go ahead and add it, or is there a trick for this I am missing?

Thanks,
Daniel
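If a local override turns out to be the right fix, a hedged sketch of what it might look like (the stanza name is taken from the sourcetype above; the setting choice is an assumption, not confirmed against the TA):

```
# local/props.conf
[script:installedapps]
# Scripted-input output may not carry a reliable per-event timestamp,
# so use the time the event is seen instead of trying to parse one.
DATETIME_CONFIG = CURRENT
```

This avoids "data from the future" warnings that come from misparsed timestamps, at the cost of _time reflecting collection time rather than event time.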
Hello, query one returns a result with one field containing a list of values. I want to pass those values into the search source path of a second query. The details are below; please suggest how to achieve this.

Query 1:

index="os" (source="/var/log/steps/*/controller") sourcetype="too_small" (host="ip-101-108-*-*" OR host="ip-101-109-*-*")
| transaction source startswith=("/code/ttt_env.sh" OR "/code/ttt_gen.sh") endswith="startRun() called"
| rex field=_raw "(?<step_function>\bs-[a-zA-Z0-9_]+)"

Query 1 output: step_function values listed in a field, e.g. s-BBBUL8NJBYE45, s-AAAUL8NJBYEI3.

Now I want to build the further query using the step_function values. Hardcoded by hand, this worked:

append [search index="os" source=("/var/log/steps/s-BBBUL8NJBYE45/stdout" OR "/var/log/steps/s-AAAUL8NJBYEI3/stdout") sourcetype="too_small" (host="ip-101-108-*-*" OR host="ip-101-108-*-*")]

How can I do this dynamically, without hardcoding? I tried the following, but it didn't work:

index="os" (source="/var/log/steps/*/controller") sourcetype="too_small" (host="ip-101-108-*-*" OR host="ip-101-108-*-*")
| transaction source startswith=("/code/ttt_env.sh" OR "/code/ttt_gen.sh") endswith="startRun() called"
| rex field=_raw "(?<rec_prod_step_function>\bs-[a-zA-Z0-9_]+)"
| search rec_prod_step_function="*"
| append [search index="os" source="/var/log/steps/$rec_prod_step_function$/stdout" sourcetype="too_small" (host="ip-101-108-*-*" OR host="ip-101-108-*-*")]

Note: "/var/log/steps/$rec_prod_step_function$/stdout". Thanks in advance.
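A hedged sketch of one common pattern (untested; it reuses the field names from the queries above): invert the logic so the first query runs as a subsearch that builds the source filter for the outer search, with `format` turning its rows into an OR clause:

```
index="os" sourcetype="too_small" (host="ip-101-108-*-*" OR host="ip-101-109-*-*")
    [ search index="os" source="/var/log/steps/*/controller" sourcetype="too_small"
        (host="ip-101-108-*-*" OR host="ip-101-109-*-*")
      | transaction source startswith=("/code/ttt_env.sh" OR "/code/ttt_gen.sh") endswith="startRun() called"
      | rex field=_raw "(?<step_function>\bs-[a-zA-Z0-9_]+)"
      | eval source="/var/log/steps/" . step_function . "/stdout"
      | fields source
      | format ]
```

The subsearch returns something like `( ( source="/var/log/steps/s-.../stdout" ) OR ( source="..." ) )`, which the outer search applies as a filter, with no hardcoding.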
I am trying to compare 2 fields in this XML. I have a field named avg that I want to compare with the other columns. Below is what I have. I am trying to compare values in "00" with avg, and it is not working. Any suggestions?

<format type="color" field="00">
  <colorPalette type="expression">if(value &lt;= "avg", "#088D4C", if(value &gt;= 3501 AND value &lt; 6001, "#F5F405", "#DE0303"))</colorPalette>
</format>
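For context, a common workaround (a sketch, not verified against this dashboard): in a colorPalette expression, `value` refers only to the cell's own value and `"avg"` is compared as a literal string, so cross-field comparisons are usually precomputed in the search, e.g.:

```
| eval status_00=case('00' <= avg, "good",
                      '00' >= 3501 AND '00' < 6001, "warn",
                      1==1, "bad")
```

The color format can then be driven off the precomputed status field (here `status_00` is an illustrative name) using map-type colors instead of the expression.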
I did a search over the last 3 months on fields A="xxx" and B="yyy", and it returns 2 other fields, C and D, which contain dates. I would like to make a total count (total_count) on field C, divided by month. Then I apply another filter where the difference (diff_day) between the 2 dates, C and D, is less than 45 days, and count how many events there are (count_event), again divided by month, and finally find the percentage for each month. The table must give me: total_count per month, count_event per month, and count_event / total_count, i.e. the percentage for each month.

My query:

index="04_analisi"
| where CANALE="DIRECT" AND TIPO="ATTIVE"
| dedup LINK,ID
| eventstats count(DATA_OUT) AS total_count
| sort DATA_OUT
| eval dataout=strptime(DATA_OUT,"%Y-%m-%d")
| eval datakpi=strptime(DATA_KPI,"%Y-%m-%d")
| eval diff_day=round((datakpi - dataout)/86400,0)
| where diff_day <= 45
| eventstats count(DATA_OUT) AS count_event
| eval perc=round((count_event/total_count)*100,2)
| eval Month=strftime(strptime(DATA_OUT,"%Y/%m/%d %H:%M"),"%b")
| timechart span=1mon count(DATA_OUT)

I have tried with timechart, but it is not what I want. I'm trying everything, reading the various questions and answers here on the community, and I would like to better understand where I am going wrong. Any suggestions? Thanks, bye.
Antonio
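A sketch of one way to restructure this (untested; it reuses the field names above and assumes DATA_OUT is formatted %Y-%m-%d): instead of a `where diff_day <= 45` that discards the rows needed for total_count, count the filtered events with an eval inside a single stats:

```
index="04_analisi" CANALE="DIRECT" TIPO="ATTIVE"
| dedup LINK, ID
| eval dataout=strptime(DATA_OUT, "%Y-%m-%d")
| eval datakpi=strptime(DATA_KPI, "%Y-%m-%d")
| eval diff_day=round((datakpi - dataout)/86400, 0)
| eval Month=strftime(dataout, "%Y-%m")
| stats count AS total_count, count(eval(diff_day <= 45)) AS count_event BY Month
| eval perc=round((count_event/total_count)*100, 2)
```

This keeps every event in the monthly total while count_event only increments when the 45-day condition holds, giving all three requested columns per month.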
Hi all, newbie question here: I'm trying to set up some of the 'InfoSec App for Splunk' dashboards and running into difficulties. This is the default Splunk query for showing failed logons:

| tstats summariesonly=true allow_old_summaries=true count from datamodel=Authentication.Authentication where Authentication.app=wineventlog Authentication.action=failure

I think I understand the Authentication.action field, and the data I'm attempting to point Splunk to are Windows logs with an action field equal to 'failure', but I don't understand the Authentication.app field. Is that the sourcetype I'm pointing it to, or the index where the logs are? (Our index is named 'wineventlog' and contains security logs that have 'failure' and 'success' for logons.) Any help is appreciated!
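One way to see what values Authentication.app actually takes in a given environment (a hedged sketch against the same data model as above, not specific to this app):

```
| tstats count from datamodel=Authentication.Authentication by Authentication.app
```

This lists the distinct app values present in the accelerated data, which can then be compared against the `Authentication.app=wineventlog` filter in the dashboard query.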
I've successfully installed and configured the TA-meraki app and have all the CIM-compliant data coming into Splunk. My question is: why am I not getting all the logs in Splunk that appear in the Meraki Dashboard? I'm currently receiving events for the following apps/roles in Splunk (below on the right), but when comparing them to the roles configured on the Meraki side (on the left), we're not receiving anything from the Security Events role in Splunk, although it's logging fine into the Meraki dashboard. Any help would be GREATLY appreciated!
Hi there, wondering if there is a way to have multi-select selections that themselves map to multiple possible values. My universe of possible values is large but can be grouped. I'd like the selections in my multi-select to represent the resulting groups, and for any constituent of a group to match the selection of that group.

Example:
Group 1: A, B, C, D
Group 2: 1, 2, 3, 4
Group 3: A1, B1, C1, D1

The field can only contain one value (either A or B or C, etc.), but I do not want my multi-select to have an option for each of the group constituents (i.e. A, B, C, D, 1, 2, 3, 4, A1, B1, C1, D1). Instead, I would like the user to be able to select Group 1 and for that to result in a match if the field has either A or B or C or D. If the user selects Group 2, that should be matched by either 1 or 2 or 3 or 4. If the user selects Group 3, that should be matched by either A1 or B1 or C1 or D1, and so on. So essentially I need a choice whose value is itself a search. Thanks in advance.
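A hedged Simple XML sketch of that idea (untested; the token and field names are illustrative): each choice's value is a ready-made search clause, and the delimiter ORs multiple selections together:

```
<input type="multiselect" token="group_filter">
  <label>Group</label>
  <choice value="(myfield=&quot;A&quot; OR myfield=&quot;B&quot; OR myfield=&quot;C&quot; OR myfield=&quot;D&quot;)">Group 1</choice>
  <choice value="(myfield=&quot;1&quot; OR myfield=&quot;2&quot; OR myfield=&quot;3&quot; OR myfield=&quot;4&quot;)">Group 2</choice>
  <choice value="(myfield=&quot;A1&quot; OR myfield=&quot;B1&quot; OR myfield=&quot;C1&quot; OR myfield=&quot;D1&quot;)">Group 3</choice>
  <delimiter> OR </delimiter>
</input>
```

The panel search then includes `$group_filter$` where the filter should apply, e.g. `index=my_index $group_filter$`.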
Hi everyone. I would like to use Splunk within a virtual network with VMware, but I can't find enough documentation for this. I need help, please.
Good afternoon. During an activity, the index stanza in the indexes.conf file was commented out to perform an event cleanup, since we had validated that the index had been declared twice, in different filesystems. When reviewing the history, we confirmed there was no data. I logged in to the indexer server to check which of the 2 filesystems had the created directories, and found that the one we had commented out was the one with the buckets, so I modified indexes.conf to comment out the duplicate instead. When validating the history of the data, the index now only has the events coming in from the various sources, but the previously stored data is not being read. How can I restore that data back to the index? Currently it only has a history in HOT/WARM and a little data in COLD. Greetings.
I downloaded Splunk Enterprise with my sign-up info. After downloading and installing the program, I am not able to log in with the credentials I used to download it. Can anyone advise me on how to go about this? Thank you.
I am trying to create an alert to track failed login events on Windows machines, e.g.:

index=fa_servers EventCode=4625 OR 533 OR 529

but I have a problem where the account name in the event has a hyphen, and Splunk is treating the hyphen as another account name:

Values            Count  %
-                 87     100%
OMGHCLPP-ADS002$  46     52.874%
ALPHCLPP-ADS002$  41     47.126%

You can see the 87 count is 46+41, so it's treating the hyphen as its own value. I have been trying to use eval and mvindex to extract just the 2 usernames, but I am not getting anywhere. Can someone explain how I can properly parse these values so it only sees 2 account names?
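A hedged sketch along the mvindex line (untested; it assumes 4625 events carry a multivalue Account_Name where the first value is the subject account, often "-", and the second is the account that failed to log on):

```
index=fa_servers EventCode=4625
| eval target_account=mvindex(Account_Name, 1)
| stats count by target_account
```

If some events have only a single Account_Name value, a `coalesce(mvindex(Account_Name, 1), Account_Name)` guard may be needed.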
For instance, when trying to access https://mypc:8089/services/apps/local/myapp?refresh=true, my credentials aren't being accepted. I'm doing this as the admin account created at setup, and I can sign in to the web console fine. Splunk is running on my local machine. I have no issues with two other servers Splunk is installed on, one of which uses a self-signed certificate (like my machine).
I want to extract a number from logs where the line of interest looks like:

INFO 2020-11-16 12:11:47,161 [ThreadName-1] com.mypackage.myclass TransId: a12345b6-7cde-8901-2f34-g5hi6jk789l0 Req ID - 123456 EvNum-1234567 - Received 12 create/cancel request.

I want to extract all occurrences of the number (in this example, 12) between "Received " and " create/cancel request." for a time range and get the max; basically, to find the largest request the app received. Thank you for your help with this.
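A sketch along those lines (untested; the base search is a placeholder), anchoring the rex on the surrounding text from the sample line above:

```
index=my_index "create/cancel request"
| rex "Received\s+(?<request_count>\d+)\s+create/cancel request"
| stats max(request_count) AS max_request_count
```

Run over the chosen time range, stats max collapses all extracted occurrences to the single largest value.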
Hello, I'm trying to add an email alert as an Adaptive Response Action to a built-in correlation search in Enterprise Security. When I add it, it gives me an error. Any help would be appreciated. Thank you.
I am trying to get data in from an Azure function one of our teams has built. We are not able to get the data in through our HEC endpoints, as all of those are internal to our network and the Azure function sits outside of it. So we have gone the route of having the function send into an Event Hub in Azure and then pulling the data from there using the Microsoft Azure Add-on for Splunk: https://splunkbase.splunk.com/app/3757/

But it only seems to like the standard Azure diagnostic-type logged data (admin/metrics), not the custom error logging we are sending into that Event Hub. I am not seeing any errors from the add-on saying it can't connect, but no luck getting the custom error logs. What is the best way to get an application's custom error logs out of an Azure Event Hub?
I have the following result set, and I want to get the most recent events.

Resultset A:

Custom_ID  Eligibility     Start_date       End_Date         Updated_date      Code
1331931    Not Eligible    1/1/2011 0:00                     11/12/2020 13:32  C
1331931    Eligible        1/1/2011 0:00                     11/12/2020 13:07  C
1331931    Eligible        1/1/2011 0:00                     11/12/2020 12:44  B
1331931    Eligible        1/1/2011 0:00                     11/12/2020 12:33  B
1331931    Eligible        1/1/2011 0:00                     11/12/2020 11:32  B
1331931    Not Applicable  1/1/2011 0:00                     11/12/2020 10:52  B
1318156    Eligible        12/1/2017 0:00   12/31/2017 0:00  2/26/2020 13:36   B
1318156    Eligible        1/1/2018 0:00                     2/26/2020 13:36   C
1319106    Eligible        9/9/2018 0:00                     2/26/2020 13:36   D
1319106    Eligible        8/1/2016 0:00    9/8/2018 0:00    2/26/2020 13:36   B
1314052    Eligible        11/17/2016 0:00                   2/26/2020 13:36   E
1314052    Eligible        1/1/2011 0:00    11/16/2016 0:00  2/26/2020 13:36   A

I am looking for the most recent eligible events. Expected output:

Custom_ID  Eligibility  Start_date       End_Date  Updated_date     Code
1318156    Eligible     1/1/2018 0:00              2/26/2020 13:36  C
1319106    Eligible     9/9/2018 0:00              2/26/2020 13:36  D
1314052    Eligible     11/17/2016 0:00            2/26/2020 13:36  E

I tried the following query:

index=my_index | where isnull(END_DT) | table Custom_ID, Eligibility, Start_date, End_Date, Updated_date, Code

which returns:

Custom_ID  Eligibility     Start_date       End_Date  Updated_date      Code
1331931    Not Eligible    1/1/2011 0:00              11/12/2020 13:32  C
1331931    Eligible        1/1/2011 0:00              11/12/2020 13:07  C
1331931    Eligible        1/1/2011 0:00              11/12/2020 12:44  B
1331931    Eligible        1/1/2011 0:00              11/12/2020 12:33  B
1331931    Eligible        1/1/2011 0:00              11/12/2020 11:32  B
1331931    Not Applicable  1/1/2011 0:00              11/12/2020 10:52  B
1318156    Eligible        1/1/2018 0:00              2/26/2020 13:36   C
1319106    Eligible        9/9/2018 0:00              2/26/2020 13:36   D
1314052    Eligible        11/17/2016 0:00            2/26/2020 13:36   E

How do I eliminate the events with Custom_ID 1331931, since the most recent activity there is "Not Eligible"?
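A hedged sketch of one approach (untested; it assumes Updated_date parses with the %m/%d/%Y %H:%M format shown above and that the null-End_Date field is the one in the table): keep only the single most recent open row per Custom_ID, then drop the IDs whose latest state isn't Eligible:

```
index=my_index
| where isnull(End_Date)
| eval updated_epoch=strptime(Updated_date, "%m/%d/%Y %H:%M")
| sort 0 Custom_ID -updated_epoch
| dedup Custom_ID
| where Eligibility="Eligible"
| table Custom_ID, Eligibility, Start_date, End_Date, Updated_date, Code
```

After the sort, dedup keeps the first (most recent) row per Custom_ID, so 1331931 survives only as its "Not Eligible" row, which the final where then eliminates entirely.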