All Topics


Can someone help me extract correlation_id from the sample data below? The requirement is to extract correlation_id into its own field.

ys_class_name="Incident",closed_by="",dv_closed_by="",follow_up="",dv_follow_up="",parent_incident="",dv_parent_incident="",reopened_by="",dv_reopened_by="",reassignment_count="1",dv_reassignment_count="1",assigned_to="c8c62ea2db51f090439694d3f39619dc",dv_assigned_to="pusapati dixitulu",u_reopening_reason="",dv_u_reopening_reason="None",sla_due="",dv_sla_due="UNKNOWN",comments_and_work_notes="",u_transfer_between_users="",dv_u_transfer_between_users="",agile_story="",dv_agile_story="",escalation="0",dv_escalation="Normal",upon_approval="proceed",dv_upon_approval="Proceed to Next Task",correlation_id="f725d663-7c62-4f50-82b1-1483df23562e",dv_correlation_id="f725d663-7c62-4f50-82b1-1483df23562e",u_business_area="",dv_u_business_area="None",u_plb="",dv_u_plb="None",u_division="",dv_u_division="",u_bu_code="",dv_u_bu_code="",u_is_escalated="false",dv_u_is_escalated="false",child_incidents="0",dv_child_incidents="0",task_effective_number="INC4750863",dv_task_effective_number="INC4750863",u_last_assignment="2021-11-24 05:49:28",dv_u_last_assignment="2021-11-24 06:49:28",resolved_by="",dv_resolved_by

Thanks
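Editorial aside: the extraction being asked for is a quoted-value capture. As a sketch only, here is the pattern one might prototype before handing it to Splunk's rex command, tested with Python's re module (the sample string is abridged from the post):

```python
import re

# Abridged sample event from the post.
sample = ('upon_approval="proceed",'
          'correlation_id="f725d663-7c62-4f50-82b1-1483df23562e",'
          'dv_correlation_id="f725d663-7c62-4f50-82b1-1483df23562e"')

# Capture the value between correlation_id=" and the closing quote.
# The negative lookbehind (?<!dv_) avoids matching dv_correlation_id.
pattern = r'(?<!dv_)correlation_id="(?P<correlation_id>[^"]+)"'
match = re.search(pattern, sample)
print(match.group("correlation_id"))  # f725d663-7c62-4f50-82b1-1483df23562e
```

The same pattern body should drop into a rex command largely unchanged, though that half is untested here.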
Hi,

The following is my search:

index=pace ERROR OR FATAL OUI=* Number=* | stats count by OUI Number | sort -count

After executing the above search I get the following results:

OUI      Number         count
9C3DCF   4W12757WA51F6  18
80CC9C   4W15177LA0AD1  10
0836C9   4W150B70A3837  4
100C6B   4W15077PA0682  3
80CC9C   4W151778A0A39  3
80CC9C   4W15177GA0A5D  3

Note: the Number column holds the results I am interested in. I have a separate table named subsdeviceextract.csv, laid out as follows:

MAC                Model  OUI     Post Code  Serial Number
08:36:C9:9A:F4:6C  V6510  0836C9  2775       4W150B70A012A
08:36:C9:9B:5C:FE  V6510  0836C9  6437       4W150B70A07A8
08:36:C9:9C:A8:20  V6510  0836C9  2641       4W150B70A110A

I would like to look up the serial number to get the model. Please help me, thank you.
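As a side note, the lookup being described is a simple key-to-value join on the serial number. A rough Python sketch of the logic, with made-up rows standing in for subsdeviceextract.csv:

```python
import csv
import io

# Stand-in for subsdeviceextract.csv (hypothetical rows, not real data).
lookup_csv = """Serial Number,Model,OUI
4W150B70A012A,V6510,0836C9
4W150B70A07A8,V6510,0836C9
"""

# Build a Serial Number -> Model map, then enrich the search results.
with io.StringIO(lookup_csv) as f:
    serial_to_model = {row["Serial Number"]: row["Model"]
                       for row in csv.DictReader(f)}

results = [{"OUI": "0836C9", "Number": "4W150B70A012A", "count": 4}]
for row in results:
    row["Model"] = serial_to_model.get(row["Number"], "unknown")

print(results[0]["Model"])  # V6510
```

In Splunk this role is normally played by the lookup command against the CSV, but the map-and-enrich shape of the operation is the same.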
Hi, when I try to start the Splunk forwarder I get an error. Please help me.

Linux machine:
root@client1:/opt/splunkforwarder/bin# uname -r
4.19.0-18-cloud-amd64

Package: splunkforwarder-8.2.3-cd0848707637-Linux-armv8.tgz

Error:
root@client1:/opt# cd splunkforwarder
root@client1:/opt/splunkforwarder# cd bin/
root@client1:/opt/splunkforwarder/bin# ./splunk start --accept-license
-bash: ./splunk: cannot execute binary file: Exec format error
root@client1:/opt/splunkforwarder/bin# ./splunk start
-bash: ./splunk: cannot execute binary file: Exec format error
root@client1:/opt/splunkforwarder/bin# ./splunkd start
-bash: ./splunkd: cannot execute binary file: Exec format error
Hi, I am trying to filter data by week using two dropdowns. Please find the relevant info below. The code throws the error "Error in 'where' command: The operator at '>=2101 and week<=2152' is invalid." Please suggest a fix.

Dropdown menu for the starting week:
token: "from_week_token"
label: "From week"
fieldForLabel: "week"
fieldForValue: "week"
default: "2101"
query: source="pdfthroughput_pdf_patches.json" host="LT433534" index="pdf_patches" sourcetype="_json" | eval weekNday=split(planned_stopped_on,".") | eval week=mvindex(weekNday,0) | table week | dedup week | where week>=2101 and week<=2152

Dropdown menu for the end week:
token: "to_week_token"
label: "To week"
default: "2152"
fieldForLabel: "week"
fieldForValue: "week"
query: source="pdfthroughput_pdf_patches.json" host="LT433534" index="pdf_patches" sourcetype="_json" | eval weekNday=split(planned_stopped_on,".") | eval week=mvindex(weekNday,0) | table week | dedup week | where week>=2101 and week<=2152
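Editorial note: one common cause of comparison trouble in this shape of query (an assumption, not a confirmed diagnosis of the error above) is that a field produced by split() is a string, while 2101 is a number, so an explicit conversion is needed before a numeric comparison. The distinction can be illustrated in Python:

```python
# A week value produced by splitting "2101.3" on "." is a string.
week = "2101"

try:
    ok = week >= 2101          # comparing str to int raises in Python 3
except TypeError:
    ok = int(week) >= 2101     # convert first (in SPL: tonumber(week))

print(ok)  # True
```

The Splunk-side equivalent of the conversion would be something like tonumber(week) inside the where clause, though that exact fix is untested here.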
Hi, I am trying to speed up a query. When I run:

index=foo | stats values(host) as F_host

it takes less than a minute to return the results. I want to take those results, create an outputlookup, and match the host values against another lookup. However, I need to split the stats results into individual values. Something like:

... | makemv delim=" " F_host | outputlookup ...

or maybe:

... | eval D_host = split(F_host, " ")

If I run the original query as:

index=foo | lookup bar-host.csv barHost AS host OUTPUTNEW barHost as match-host | stats values(match-host) by host

it takes forever. In this case, bar-host.csv is the lookup filename and barHost is the field name. Maybe this approach is just plain wrong; any advice appreciated. Thank you.
Hello all, I am trying to set up a search that logs ufw commands while ignoring any ufw status commands. I have tried a number of methods so far but cannot get the COMMAND field to filter appropriately. Here is a version of the search:
```
index="*" host="*dev*" source="/var/log/auth.log" process="sudo" COMMAND="/usr/sbin/ufw" | table _time host user _raw | where COMMAND!="*/usr/sbin/ufw status*"
```
I've tried a number of things, including NOT instead of !=, searching for various strings (status, *status*, etc.), filtering on the _raw field instead of COMMAND, using search instead of where, and putting the table after the where. I cannot get the events to filter out. It seems like I either get all the events or none of the events depending on the filter I choose. Any help here? Thank you!
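Editorial note: a plausible explanation (not confirmed from the post) is that the intended exclusion is a substring test, while a literal inequality with * characters compares the strings verbatim, so nothing is ever excluded. The intended logic, sketched in Python with sample command strings:

```python
commands = [
    "/usr/sbin/ufw status",
    "/usr/sbin/ufw allow 22/tcp",
    "/usr/sbin/ufw deny 80/tcp",
]

# Keep ufw commands, drop any containing "status".
# (In SPL's where clause this would be a match()/like() test
# rather than != with wildcard characters.)
kept = [c for c in commands if "status" not in c]
print(kept)  # ['/usr/sbin/ufw allow 22/tcp', '/usr/sbin/ufw deny 80/tcp']
```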
Hello, I would like to ask how to assign a value to another variable and set an alert. I have the data output below from Splunk. I would like to assign the count to another variable and send an alert when the value becomes greater than or equal to a threshold like 10 or 20. For example, when the TX_UPS count >= 10, send an alert. How should I approach this in a Splunk alert job?

shipper    count
TX_UPS     10
TX_USPS    15
TX_FedEx   5
CO_UPS     5
CO_USPS    9
CO_FedEx   2
MO_UPS     5
MO_USPS    20
MO_FedEx   3
GA_UPS     15
GA_USPS    10
GA_FedEx   5
PA_UPS     9
PA_USPS    21
PA_FedEx   8
NY_UPS     30
NY_USPS    99
NY_FedEx   20

index=main AND "*TRACKING*" | stats count by shipper
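As an illustration, the alerting condition being described is a plain threshold filter over the per-shipper counts. A Python sketch using a subset of the sample data:

```python
# Subset of the per-shipper counts from the post.
counts = {
    "TX_UPS": 10, "TX_USPS": 15, "TX_FedEx": 5,
    "NY_UPS": 30, "NY_USPS": 99, "NY_FedEx": 20,
}
threshold = 10

# Shippers that would fire the alert (count >= threshold).
breaches = sorted(s for s, c in counts.items() if c >= threshold)
print(breaches)  # ['NY_FedEx', 'NY_UPS', 'NY_USPS', 'TX_UPS', 'TX_USPS']
```

In Splunk, the usual approach is a where clause after the stats (e.g. filtering on count >= threshold) with the alert triggering when results exist, though the exact alert wiring is outside this sketch.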
I have a base search:

index=oswin EventCode=19 SourceName="Microsoft-Windows-WindowsUpdateClient" earliest=-10d ComputerName=*.somedomain.com | rex "\WKB(?<KB>.\d+)\W"

The result populates field 'KB' with a list of values similar to:

5007192
5008601
890830

I need to test whether 'KB' contains one of the following: "5008601", "5008602", "5008603", "5008604", "5008605", "5008606". If a match is found, populate a new field HotFixID with the matched value. If no match is found, populate HotFixID with "NotInstalled".

Using search KB IN (5008601,5008602,5008603,5008604,5008605,5008606) returns matched values only. The case function works only if the matched value is the last one evaluated; otherwise it returns "NotInstalled" even though a match is present.
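Editorial sketch: the desired HotFixID logic is a set-membership test over the multivalue KB field, with a default when nothing matches. In Python terms (illustrative only; values taken from the post):

```python
kb_values = ["5007192", "5008601", "890830"]  # multivalue KB field
targets = {"5008601", "5008602", "5008603", "5008604", "5008605", "5008606"}

# First KB value found in the target set, else "NotInstalled".
hotfix_id = next((kb for kb in kb_values if kb in targets), "NotInstalled")
print(hotfix_id)  # 5008601
```

A Splunk-side equivalent would likely involve mvfilter or mvmap over KB plus a coalesce-style default, but that SPL is untested here.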
Greetings,

Has anyone successfully connected to Azure SQL with DB Connect using Azure Active Directory? I have installed the MSSQL JDBC driver version 9.4, which Azure supports. I am using the following connection string:

jdbc:sqlserver://<instance_url>:<instance_port>;database=<db>;encrypt=true;trustServerCertificate=false;hostNameInCertificate=<instance_domain>;loginTimeout=30;authentication=ActiveDirectoryPassword

I am getting the error: Failed to load MSAL4J Java library for performing ActiveDirectoryPassword authentication.

Thanks in advance
Hello,

I have a dashboard with a text input that has id "text_input". With JavaScript I am listening for changes to that input, and when a change happens I just want to read the new value into a variable. Instead of reading the newly inserted value, though, it reads the previous value. The following code gives an idea of what I am doing:

var def_tok = mvc.Components.get("default");
var sub_tok = mvc.Components.get("submitted");
...
$("#text_input").on('change', function () {
    console.log("Change Detected");
    var sub_tok_input = sub_tok.get("text_input");
    var sub_tok_input_form = sub_tok.get("form.text_input");
    var def_tok_input = def_tok.get("text_input");
    var def_tok_input_form = def_tok.get("form.text_input");
    console.log("sub_tok_input: " + sub_tok_input);
    console.log("sub_tok_input_form: " + sub_tok_input_form);
    console.log("def_tok_input: " + def_tok_input);
    console.log("def_tok_input_form: " + def_tok_input_form);
});

When I fill in the first value from blank, the first print reads blank instead of the value. Then when I update the value it reads the first value, and so on. What am I doing wrong, and how can I ensure that I get the newly inserted value instead of the previously inserted one?

Thanks!
Andrew
I am looking to see if anyone knows how to do this, or whether it is even possible. We keep Active Directory groups that hold the subnets for our different locations, and we use them to restrict which people can see specific subnets. It is a pain to update the subnets both in Splunk and in the AD groups, since two different parties maintain them. Is there a way to drop the hard-coded subnets from my Splunk searches and have Splunk read directly from the AD groups we have set up? That would eliminate the need to update both the Splunk dashboards and the AD groups.
All,

I have 2 separate queries working from AWS Description data that we collect on a regular basis. One of our portfolio leads has asked for a weekly report (Mondays) that includes the following information for all AWS EC2 instances in a stopped state:

account_name - a lookup field matching our account numbers to a human-readable name
the EC2 instance ID
who owns the instance (a tag applied to the instance)
the date the instance was stopped - this is in the "reason" field of the AWS Description data
how much storage is attached to the stopped instance (a total amount)

Right now I have 2 separate queries that together return all of the data I need, but I need to find a way to merge the two sets of data into one report.

Search #1 gets the stopped instance IDs, who owns each (an applied tag), the account, the region, the instance Name (an applied tag), and the reason it was stopped:

index=awsdescription source=*ec2_instances state=stopped | dedup id | rename tags.Name as Name | rename tags.Owner as Owner | rename id as instance_id | table account_name, region, Name, instance_id, Owner, reason

Search #2 uses the above search as a sub-search, pulling out just the instance IDs, then bounces that list off a different source (*ec2_volumes) to grab the list of volumes associated with the stopped instances. The results are then aggregated (stats sum) to get the total amount of storage attached to each stopped instance:

index=awsdescription* source=*ec2_volumes [search index=awsdes* source=*ec2_instances state=stopped | dedup id | rename id as attach_data.instance_id | fields attach_data.instance_id] | rename attach_data.instance_id as instance_id | dedup id | stats sum(size) by instance_id

The two searches combined give me all of the data that I need, but in 2 separate reports. From here, I have to download the results of each, throw them into a spreadsheet, and merge the two sets of data (using VLOOKUP on instance_id) into a single report before I send it off to the customer. Is there a way to combine these two searches? If so, I would love some guidance. Thanks in advance.
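As an aside, the spreadsheet VLOOKUP step described here is a key-based merge on instance_id. A minimal Python sketch with hypothetical instance IDs, owners, and sizes (none of these values come from real data):

```python
# Results of search #1: stopped-instance details (hypothetical rows).
instances = [
    {"instance_id": "i-0abc", "account_name": "prod",
     "Owner": "alice", "reason": "User initiated shutdown"},
    {"instance_id": "i-0def", "account_name": "dev",
     "Owner": "bob", "reason": "User initiated shutdown"},
]

# Results of search #2: total attached storage per instance (hypothetical).
storage = {"i-0abc": 200, "i-0def": 80}

# VLOOKUP-style merge on instance_id, defaulting to 0 when absent.
for row in instances:
    row["total_storage_gb"] = storage.get(row["instance_id"], 0)

print(instances[0]["total_storage_gb"])  # 200
```

On the Splunk side this kind of merge is typically done by combining both sources in one search and using stats by instance_id (or a join), but the merge logic itself is as above.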
Hey all, I have the Splunk Add-on for Unix and Linux deployed to about ~70 servers. All was working fine (and has been for years!) up until yesterday. I'm receiving data into my os index (which is where those logs are stored), but searching on anything beyond index, host, or sourcetype does not work. For example, over the last 7 days I can run something like:

index=os sourcetype=df host="server1" OR host="server2" | stats max(PercentUsedSpace) as PercentUsed by host,filesystem | sort - PercentUsed | where PercentUsed >=75

It pulls data from 7 days ago up until yesterday. Searching from yesterday to now gives me no data. If I search index=os host="server1" OR host="server2", I receive logs as normal; the other sources and sourcetypes are there.

So I guess my question is: what happened to my "PercentUsedSpace"? It doesn't show in the interesting fields list. I can't search on it; it returns blank. My search for index=os source=df host="server1" OR host="server2" shows my logs, but I can't refine it down further.

Edit: What is interesting is that every now and then I see I am receiving a log along the lines of "CPU pctUser pctNice pctSystem pctIowait pctIdle", "Name rxPackets_PS txPackets_PS rxKB_PS txKB_PS", "memTotalMB memFreeMB memUsedMB memFreePct memUsedPct pgPageOut swapUsedPct pgSwapOut cSwitches interrupts forks processes threads loadAvg1mi waitThreads interrupts_PS pgPageIn_PS pgPageOut_PS". So it seems that instead of parsing each field as a field, Splunk is ingesting the header row as a log event. Please assist!
I would like to build a report, from my data, of employees who have completed four different certification courses. For example: Employee 1 completed 3 courses, Employee 2 completed 2 courses, Employee 3 completed 1 course, etc., along with each employee's name and completion date. Kindly suggest how to write the query for this situation. Is it OK to create a CSV file with one set of results and compare against it, or is there another way?
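Editorial sketch: the report boils down to counting distinct courses per employee and filtering on a required number. In Python, with made-up records (employee, course, completion date):

```python
from collections import defaultdict

# Hypothetical (employee, course, completion_date) records.
records = [
    ("Employee 1", "Course A", "2021-10-01"),
    ("Employee 1", "Course B", "2021-10-15"),
    ("Employee 1", "Course C", "2021-11-01"),
    ("Employee 2", "Course A", "2021-09-20"),
    ("Employee 2", "Course D", "2021-11-05"),
]

# Distinct courses completed per employee.
courses = defaultdict(set)
for emp, course, _date in records:
    courses[emp].add(course)

# Employees who completed all four required courses.
required = 4
completed_all = sorted(e for e, c in courses.items() if len(c) >= required)
print(completed_all)  # [] -- nobody has 4 distinct courses in this sample
```

The SPL analogue would be a stats dc(course) by employee followed by a where clause, though that exact query is untested here.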
Hi All,

We have recently upgraded our Splunk environment from 7.x to 8.x and we want to compare Splunk performance before and after the upgrade. One of the parameters we want to track is the time taken by the cluster to complete fix-up tasks. Can you please advise whether there is any way to monitor the time taken by fix-up tasks and rolling restarts to complete?

Thanks & Regards,
Rahul Bhatia
We've got the Splunk App for Infrastructure inputs for Windows metrics deployed to our universal forwarders. Metrics are all working fine except in one situation: if any performance counter value is 0, Splunk isn't recording the value. For example, if a disk free space counter reaches 0, it drops off all charts. Likewise, application pool request queues don't record a 0 value when there's nothing in them. I've got plenty of custom metrics of my own (faking the statsd protocol) that can write 0 just fine; it seems to be only the perfmon metrics from SAI that have the issue. Has anyone else encountered this and knows what the fix is?
Hello,

I'm using Splunk App for Web Analytics version 2.3.0, and it is working well. I used it to monitor one site and I have the data I need, but when I added another log to monitor, the site name it used is the folder name. I ran index=iislogs | dedup site | table site and I see the 2 sites, but I want to change the name of one of them. How can I change the site name? I tried to change it under "Configure website" but it does not update.

Thank you,
Dov
Hi all, I have two indexes with the following fields:

index=software

sw          version  author
software_1  1.0      Mark
software_2  1.1      Holly
software_3  1.2      Tom
software_4  1.3      Gorge

index=downloads

timestamp            sw
2021-11-23 00:00:00  software_1
2021-11-22 00:00:00  software_1
2021-11-21 00:00:00  software_4
2021-11-20 00:00:00  software_1
2021-11-19 00:00:00  software_3
2021-11-18 00:00:00  software_1

I need to create a report with the number of downloads for each software, something like this:

sw          version  author  #downloads
software_1  1.0      Mark    4
software_2  1.1      Holly   0
software_3  1.2      Tom     1
software_4  1.3      Gorge   1

I tried using a left join but couldn't find a good solution. Thanks for helping.
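As an illustration, the desired report is a left join with zero-fill: count downloads per sw, then attach the count (defaulting to 0) to every software row. A Python sketch using the sample values from the post:

```python
from collections import Counter

# index=software rows: sw -> (version, author).
software = {
    "software_1": ("1.0", "Mark"),
    "software_2": ("1.1", "Holly"),
    "software_3": ("1.2", "Tom"),
    "software_4": ("1.3", "Gorge"),
}

# index=downloads rows: one sw entry per download event.
downloads = ["software_1", "software_1", "software_4",
             "software_1", "software_3", "software_1"]
counts = Counter(downloads)

# Zero-fill: every software appears in the report even with no downloads.
report = [(sw, ver, author, counts.get(sw, 0))
          for sw, (ver, author) in software.items()]
print(report[1])  # ('software_2', '1.1', 'Holly', 0)
```

In SPL this is commonly done without join at all, e.g. searching both indexes together and using stats plus fillnull, though that query is untested here.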
Hello Community. I am trying to solve a problem and can't see a solution. Hope you can help me!

I am working with a metrics index. My final goal is to get the average of two metrics, but under two different filters based on a dimension of that metrics index, and then compute a final calculation from those fields, something like this:

| mstats avg(metric1) as result1 avg(metric2) as result2 where index=my_metric_index AND filter_field=filter_list_1
| mstats avg(metric1) as result3 avg(metric2) as result4 where index=my_metric_index AND filter_field=filter_list_2
| eval Final_Result_1=result3-result1, Final_Result_2=result4-result2

I also created a search (which I intend to use as a subsearch in the middle of the previous search) to get both lists, filter_list_1 and filter_list_2, something like this:

| mcatalog values(values1) as values1 values(values2) as values2 where index=my_metric_index AND filter1 AND filter2 AND filter3 BY values1, values2 {...some modification stuff here...} | table filter_list_1, filter_list_2

Both filter_list_1 and filter_list_2 can be returned as a column list or as a multivalue field (created with the join command from the column list). The challenge is how to pass these filter lists from the subsearch into the main (preceding) search, to use as filters in the mstats commands.

The best I've managed was to have the subsearch send back one of the filter lists, named as the field I need to filter on in the main search; the subsearch automatically formatted the field list as a set of OR statements covering all its values, which works with the mstats command. But I could only do this with one mstats command, not both. Something like this:

| mstats avg(metric1) as result1 avg(metric2) as result2 where index=my_metric_index AND filter_field=filter_list_1
| mstats avg(metric1) as result3 avg(metric2) as result4 where index=my_metric_index AND filter_field=filter_list_2 [|mcatalog values(values1) as values1 values(values2) as values2 where index=my_metric_index AND filter1 AND filter2 AND filter3 BY values1, values2 {...some modification stuff here...} | table filter_list_1, filter_list_2]
| eval Final_Result_1=result3-result1, Final_Result_2=result4-result2

I don't know if I have explained myself well. How can I achieve my "final and complicated" goal? Any help will be very appreciated. Thanks in advance.

Regards,
Carlos M
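Editorial sketch: stripped of the mstats/subsearch mechanics, the underlying computation is two filtered averages followed by a difference. The logic looks like this in Python (the filter values "A" and "B" and all datapoints are hypothetical):

```python
# Metric datapoints with a filter dimension (made-up sample).
points = [
    {"filter_field": "A", "metric1": 10.0, "metric2": 1.0},
    {"filter_field": "A", "metric1": 20.0, "metric2": 3.0},
    {"filter_field": "B", "metric1": 40.0, "metric2": 5.0},
    {"filter_field": "B", "metric1": 60.0, "metric2": 7.0},
]

def avg(rows, key):
    """Average of one metric over a set of datapoints."""
    vals = [r[key] for r in rows]
    return sum(vals) / len(vals)

# The two filtered populations (filter_list_1 / filter_list_2 analogues).
list_1 = [p for p in points if p["filter_field"] == "A"]
list_2 = [p for p in points if p["filter_field"] == "B"]

result1, result2 = avg(list_1, "metric1"), avg(list_1, "metric2")
result3, result4 = avg(list_2, "metric1"), avg(list_2, "metric2")

final_result_1 = result3 - result1   # 50.0 - 15.0 = 35.0
final_result_2 = result4 - result2   # 6.0 - 2.0 = 4.0
```

One common SPL workaround for this shape is a single mstats split BY the filter dimension, followed by eval/stats to separate and difference the two groups, which avoids passing two lists from a subsearch, though that rewrite is untested here.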
What license is required for a single-instance Splunk Enterprise deployment that involves zero data indexing? Scenario: a customer has some static data to be displayed in dashboards, where the data values may or may not change (and the dashboard performs some logical operations on the available static data to show value-added information/visualizations). Assume the data will be stored in a lookup file or in a database and read using Splunk DB Connect, with no other data indexing planned.