All Topics

I am trying to expand a couple of fields (locationId, matchRank) using mvexpand, but it only works for shorter durations of up to 7 days; I want to run it for up to 30 days. My matched_locations JSON array can have at most 10 items, each containing both a locationId and an associated matchRank. The number of items in matched_locations is given by the msg.logMessage.numReturnedMatches field. I have thousands of such events. Below is the query, which works for shorter time ranges but times out for longer ones.

#############################################################
index=app_pcf AND cf_app_name="credit-analytics-api" AND message_type=OUT AND msg.logger=c.m.c.d.MatchesApiDelegateImpl
| search "msg.logMessage.numReturnedMatches">0
| eval all_fields = mvzip('msg.logMessage.matched_locations{}.locationId','msg.logMessage.matched_locations{}.matchRank')
| mvexpand all_fields
| makemv delim="," all_fields
| eval LocationId=mvindex(all_fields, 0)
| eval rank=mvindex(all_fields,1)
| fields LocationId, rank
| table LocationId, rank
#############################################################

Below is sample Splunk data for one such event.
###########################################################
cf_app_name: myApp
cf_org_name: myOrg
cf_space_name: mySpace
job: diego_cell
message_type: OUT
msg: {
  application: myApp
  correlationid: 0.af277368.1669261134.5eb2322
  httpmethod: GET
  level: INFO
  logMessage: {
    apiName: Matches
    apiStatus: Success
    clientId: oh_HSuoA6jKe0b75gjOIL32gtt1NsygFiutBdALv5b45fe4b
    error: NA
    matched_locations: [
      { city: PHOENIX countryCode: USA locationId: bef26c03-dc5d-4f16-a3ff-957beea80482 matchRank: 1 merchantName: BIG D FLOORCOVERING SUPPLIES postalCode: 85009-1716 state: AZ streetAddress: 2802 W VIRGINIA AVE }
      { city: PHOENIX countryCode: USA locationId: ec9b385d-6283-46f4-8c9e-dbbe41e48fcc matchRank: 2 merchantName: BIG D FLOOR COVERING 4 postalCode: 85009 state: AZ streetAddress: 4110 W WASHINGTON ST STE 100 }
      { [+] } { [+] } { [+] } { [+] } { [+] } { [+] } { [+] } { [+] }
    ]
    numReturnedMatches: 10
  }
  logger: c.m.c.d.MatchesApiDelegateImpl
}
origin: rep
source_instance: 1
source_type: APP/PROC/WEB
timestamp: 1669261139716063000
###########################################################
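One common way to keep mvexpand inside its memory limit over a 30-day range is to discard every field except the one being expanded before the mvexpand, so each expanded copy of the event is as small as possible. A sketch (untested, reusing the field names from the query above):

```
index=app_pcf cf_app_name="credit-analytics-api" message_type=OUT msg.logger=c.m.c.d.MatchesApiDelegateImpl "msg.logMessage.numReturnedMatches">0
| eval all_fields = mvzip('msg.logMessage.matched_locations{}.locationId', 'msg.logMessage.matched_locations{}.matchRank')
| fields all_fields
| fields - _raw _time
| mvexpand all_fields
| rex field=all_fields "^(?<LocationId>[^,]+),(?<rank>.+)$"
| table LocationId rank
```

If it still times out, running the same search as a scheduled job over smaller chunks (one day at a time, for example) and collecting the results into a summary index is another option.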
For example, a real-time search is running over a window of the past 10 minutes. Data with a timestamp of 15 minutes ago is then indexed. A search on _time finds no hits, because the timestamp does not fall within the past 10 minutes, but a search on index time does find hits. I would like to know which time a real-time search uses as its basis.
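For comparison, a historical (non-real-time) search selects events by _time, and arrival time can be windowed explicitly with the index-time modifiers; a sketch (index name is a placeholder):

```
index=main _index_earliest=-10m _index_latest=now
| eval indexed_at=strftime(_indextime, "%F %T")
| table _time indexed_at
```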
Got these errors while installing acs-cli in WSL/Ubuntu:

==> Installing acs from splunk/tap
==> Installing dependencies for splunk/tap/acs: linux-headers@5.15, glibc, gmp, isl, mpfr, libmpc, lz4, xz, zlib, zstd, binutils and gcc
==> Installing splunk/tap/acs dependency: linux-headers@5.15
==> Pouring linux-headers@5.15--5.15.57.x86_64_linux.bottle.1.tar.gz
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_connmark.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_dscp.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_mark.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_rateest.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_tcpmss.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter_ipv4/ipt_ecn.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter_ipv4/ipt_ttl.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter_ipv6/ip6t_hl.h': File exists
Error: Failure while executing; `cp -pR /tmp/d20221130-6079-1ef476r/linux-headers@5.15/. /home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15` exited with 1.
Here's the output:

cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_connmark.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_dscp.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_mark.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_rateest.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter/xt_tcpmss.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter_ipv4/ipt_ecn.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter_ipv4/ipt_ttl.h': File exists
cp: cannot create regular file '/home/linuxbrew/.linuxbrew/Cellar/linux-headers@5.15/5.15.57/include/linux/netfilter_ipv6/ip6t_hl.h': File exists

This happened after the "brew install acs" command had already gone through 13 downloads of libraries etc. Any help would be much appreciated.
Hi All, we upgraded our Splunk Enterprise to the latest 9.0.2 and noticed something odd on the About page. Why does the product say "Retention"? What does it mean? I have never come across this before.
We upgraded the Splunk Universal Forwarders on our web servers from 8.0.5 to 9.0.1 back in late October, and since then we've seen a dramatic increase in CPU utilization by the splunkd.exe process on each server. Each instance is tracking a fairly large number of files, typically 3k or so per day, in a folder that can contain up to 10k files. I've found that reducing the number of 'old' files in the folder helps, but the CPU load is still dramatically above what it was with version 8.0.5.
Hi all, I would like to use the bin command to split the demo data set into 10 bins according to Exe_time, and list Substage_time alongside each bin. Does anyone have ideas on how to use the bin command correctly? I use these commands, but the output isn't what I expect.

|bin Exe_time as time_bin bins=10
|stats values(Substage_time) by time_bin

The demo data set is listed below (Substage_time can be multivalued):

Exe_time | Substage_time | Count
10       | 8 11          | 2
21       | 9 12          | 2
32       | 8             | 1
43       | 9 19          | 4
54       | 9 12          | 3
65       | 8 11          | 6
66       | 9 19          | 7
67       | 8 11          | 6
70       | 9 12          | 6
71       | 8 11          | 5
80       | 7             | 4
81       | 9 12          | 11
95       | 7             | 8
108      | 11            | 3
220      | 8 11          | 5

Thank you.
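For what it's worth, bins=10 is only an upper bound on the number of bins Splunk creates; it may choose fewer, rounder spans. A sketch (untested) that forces roughly ten equal-width bins on this demo data uses an explicit span instead, where span=22 comes from (220-10)/10 rounded up:

```
| bin span=22 Exe_time AS time_bin
| stats values(Substage_time) AS Substage_time sum(Count) AS total_count by time_bin
| sort time_bin
```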
I am currently attempting to create a table that displays the count of one event from the previous month in comparison to the current month. I'm not quite sure what the best way to do this is, but I've created a search with an appended search and I'm attempting to display the two results, last month versus this month, in a table. Essentially, what I'm trying to achieve is the following:

Event   | Last Month (count) | Current Month (count)
Event 1 | 4323               | 435
Event 2 | 564                | 23

Here is my base search so far:

<search 1> earliest=-1mon@mon latest=@mon
| multikv
| stats count by event
| eval input_type="Last_Month"
| append [<search 2> earliest=@mon latest=now | multikv | stats count by event | eval input_type="Current_Month"]

Thank you!
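An alternative to append that avoids the subsearch entirely is to search both months at once and pivot with chart; a sketch (untested, assuming the same base search and event field as above):

```
<search 1> earliest=-1mon@mon latest=now
| multikv
| eval input_type=if(_time < relative_time(now(), "@mon"), "Last_Month", "Current_Month")
| chart count over event by input_type
```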
I need to configure the Splunk Add-on for Linux on both the forwarder and the Splunk server machines. I have Splunk Enterprise running on Windows, collecting logs from Linux via a Splunk forwarder. To get the dashboards from the Splunk Add-on for Linux, do I need to install the add-on on both machines? If so, what steps do I need to follow on the Linux host machine? Many thanks for considering my request.
Hello, I have a use case to find the delta between two sets of events. We receive events once a day; our objective is to find the delta between the current events (received today) and the events we received yesterday, and to create a report based on that delta. Any recommendation would be highly appreciated. Thank you so much.
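One sketch of this, assuming each event carries a stable key field (called event_id here, a hypothetical name): classify each event by day, then keep the keys that appear on only one of the two days:

```
index=main earliest=-1d@d latest=now
| eval day=if(_time >= relative_time(now(), "@d"), "today", "yesterday")
| stats values(day) AS days dc(day) AS day_count by event_id
| where day_count=1
| table event_id days
```

Rows with days="today" are new since yesterday; rows with days="yesterday" have disappeared.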
I have two Splunk queries, 1 and 2 below, and both have one common field: the email. I want to match on the email values that appear in both result sets, and then send an alert notification based on that combined result.

Query 1:
index="aws-cloudtrail" eventName="AssumeRoleWithSAML"
| fields *
| spath "requestParameters.roleArn"
| search "*super*admin*"
| rex field=responseElements.subject "(?<Email>[a-zA-Z0-9]{1,8}@digitlogs.com$)"
| search Email=*
| table Email, "recipientAccountId"
| dedup Email, "recipientAccountId"

Query 2:
index="okta" displayMessage="Authenticate user with AD agent"
| rename target{}.alternateId as email
| eval my_ponies=mvindex(email, -3, -2)
| eval Email=mvindex(email, 0)
| eval email=mvindex(email, 1)
| table Email email

Here are the two of them; any input will help.
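One way to keep only the emails that appear in both result sets is an inner join on the common Email field; a sketch built from the two queries above (untested, and subject to the usual subsearch limits):

```
index="aws-cloudtrail" eventName="AssumeRoleWithSAML"
| spath "requestParameters.roleArn"
| search "*super*admin*"
| rex field=responseElements.subject "(?<Email>[a-zA-Z0-9]{1,8}@digitlogs.com$)"
| search Email=*
| dedup Email, recipientAccountId
| join type=inner Email
    [ search index="okta" displayMessage="Authenticate user with AD agent"
      | rename target{}.alternateId AS email
      | eval Email=mvindex(email, 0)
      | table Email ]
| table Email recipientAccountId
```

Saving this as an alert that triggers when the result count is greater than zero would then notify on the matched emails.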
The environment was on 8.2.7 and has a 3-node Search Head Cluster. The nodes were upgraded from 8.2.7 to 9.0.2. Post-upgrade, the KV store status on one SHC member was DOWN.
Hi, I am not having much luck. I want to find all scheduled reports and alerts that either use a specific index (e.g. index=foo) or whose name contains a keyword (e.g. fooBar). I tried the _audit index:

index=_audit search = *"index=foo"* OR savedsearch_name=*fooBAR* provenance=scheduler
| stats values(savedsearch_name)

I get some of the alerts (hopefully "provenance=scheduler" means it is scheduled), but I was looking for a better way, maybe with | rest:

| rest /servicesNS/-/search/saved/searches
| table title, triggered_alert_count, search, cron_schedule, alert_type, alert_condition
| rex field=search "index=(?<indexName>.[^\s]+)"
| search indexName=foo

However, I am not having much luck getting alerts that contain "index=foo" in the search field. Any advice appreciated. Thank you.
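A sketch of the REST approach (untested) that filters on the whole search text instead of a single extracted index name, and restricts to scheduled searches via the is_scheduled field:

```
| rest /servicesNS/-/search/saved/searches
| search is_scheduled=1
| where like('search', "%index=foo%") OR like(title, "%fooBar%")
| table title, search, cron_schedule, alert_type, triggered_alert_count
```

The single quotes around 'search' in the where clause tell Splunk to treat it as a field name rather than the search keyword.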
Hi, I am not sure if this is possible in a Splunk classic dashboard, but if it is, it would make the user experience much better for the dashboard that I am trying to design. I have 2 panels within my dashboard. Once the user enters text in the search box and presses submit, the results are displayed as rows in a table. The results have several columns: _time, operation_id, location, account_number. What I would like is for a specific field in each row of data, the operation_id, to be selectable, like a check box beside the field value. Typically, I expect the user to select 3-5 rows at a time. Once the user selects the rows of interest, I would like the selected rows' operation_id values to be used in the second panel within the dashboard to drive a different search and display its results. Each operation_id in the first panel is unique, so there will be 3-5 operation_ids selected. I am aware you can pass a token from the first panel to the second, but I am struggling with how to create a check box beside each result row so the user can select it to feed the operation_id into the second panel (rather than copying and pasting each operation_id). Any input on how to achieve such an interactive dashboard would be appreciated.
I have a current time query:

| makeresults
| eval clock = strftime(now(), "%H:%M:%S")
| eval timestamp = strftime(now(), "%+")
| table clock, timestamp

for a single value item on my dashboard. I would like it to refresh every second, so that it acts as a "clock" on my dashboard displaying the current time. How can I do this? Thanks, eholz1
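In classic Simple XML, a panel's search can be re-run on an interval via the refresh element; a sketch of the panel (query as above; whether a 1-second refresh is sustainable depends on the deployment):

```
<panel>
  <single>
    <search>
      <query>| makeresults | eval clock = strftime(now(), "%H:%M:%S") | table clock</query>
      <refresh>1s</refresh>
      <refreshType>delay</refreshType>
    </search>
  </single>
</panel>
```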
Upgraded Splunk Enterprise to v9.0.2 in a single-instance deployment. The CIM app is currently running version 5.0.2 and I am receiving the error below. I cannot seem to pinpoint what the issue is or where to start. Any help would be appreciated!

Unable to initialize modular input "relaymodaction" defined in the app "Splunk_SA_CIM": Unable to locate suitable script for introspection..
Hey gents, I am very new to Splunk, but does anyone have an idea why my search against datamodel=Authentication is not returning older events (say, from the last month or two)? Below is my search string:

| tstats prestats=true summariesonly=true allow_old_summaries=true count from datamodel=Authentication.Authentication where Authentication.app=win* Authentication.action=* by _time, Authentication.action span=10m
| timechart minspan=10m useother=true count by Authentication.action

Any suggestion would be much appreciated! Cheers
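If the Authentication data model's acceleration summary range is shorter than the search window, summariesonly=true silently drops everything outside the summary. One quick check (a sketch) is to rerun the same search with summariesonly=false and compare the timechart:

```
| tstats prestats=true summariesonly=false allow_old_summaries=true count from datamodel=Authentication.Authentication where Authentication.app=win* Authentication.action=* by _time, Authentication.action span=10m
| timechart minspan=10m useother=true count by Authentication.action
```

If the older events appear with summariesonly=false, extending the data model's summary range (or accepting the slower unsummarized search) is the fix.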
I have this dataset in Splunk. I am trying to see only the events where "firstSeen" is within the last 7 days. I tried | where firstSeen<7d but that didn't work either.

state | Age    | dnsName   | firstSeen | ip          | lastSeen  | severity      | pluginID
open  | 32.49  | 28-Nov-22 | 28-Nov-22 | 10.102.10.1 | 29-Nov-22 | informational | 10180
open  | 1      | Cat       | 28-Nov-22 | 10.102.1.23 | 29-Nov-22 | informational | 11219
open  | 34.06  |           | 22-Nov-22 |             | 29-Nov-22 | informational | 19506
open  | 5.6    | Dog       | 23-Nov-22 |             | 28-Nov-22 | informational | 168007
open  | 22.65  | Lion      | 6-Nov-22  |             | 28-Nov-22 | informational | 166958
open  | 31.64  | tiger     | 28-Oct-22 |             | 28-Nov-22 | informational | 166602
open  | 120.63 | giraf     | 25-Nov-22 |             | 28-Nov-22 | informational | 163588
open  | 68.47  | leap      | 21-Sep-22 |             | 28-Nov-22 | informational | 163489
open  | 68.47  | big dog   | 21-Sep-22 |             | 28-Nov-22 | informational | 163488
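Since firstSeen is a string, it has to be parsed to epoch time before it can be compared; a sketch assuming the %d-%b-%y format shown above:

```
| eval firstSeen_epoch=strptime(firstSeen, "%d-%b-%y")
| where firstSeen_epoch >= relative_time(now(), "-7d@d")
```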
I use the mvzip command:

index=main sourcetype="ms.356"
| eval nested_payload=mvzip(mvzip(flaw, solution),answer)
| eval nested_payload=split(nested_payload,"--")
| eval flaw=mvindex(nested_payload,0)
| eval solution=mvindex(nested_payload,1)
| eval answer=mvindex(nested_payload,2)
| table flaw solution answer

When I use the above command, I get all three field values in the flaw field, separated by commas, instead of in their own fields. What am I doing wrong?
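mvzip joins values with a comma by default, so splitting on "--" leaves each zipped string intact. A sketch of a fix (field names as above, untested) passes "--" as mvzip's delimiter argument so it matches the later split, and adds mvexpand so each flaw/solution/answer triple lands on its own row:

```
index=main sourcetype="ms.356"
| eval nested_payload=mvzip(mvzip(flaw, solution, "--"), answer, "--")
| mvexpand nested_payload
| eval nested_payload=split(nested_payload, "--")
| eval flaw=mvindex(nested_payload, 0)
| eval solution=mvindex(nested_payload, 1)
| eval answer=mvindex(nested_payload, 2)
| table flaw solution answer
```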
I have 2 sourcetypes, sourcetype="source1" and sourcetype="source2". This is how the sample data looks:

source1: CID,Cname,CData
source2: CID,key,FName,LName

Here, the values of CID in source 1 and key in source 2 will be the same. Although CID is also present in source 2, it holds a different value there. I need to write a query that, when CID (source 1) = key (source 2), fetches all the other fields from source 1 and source 2 and displays them in a table. Any suggestions would be appreciated.
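One stats-based sketch (field names as above, untested; this avoids join's subsearch limits): normalize the key on each side, then group on it:

```
sourcetype="source1" OR sourcetype="source2"
| eval join_key=if(sourcetype="source1", CID, key)
| stats values(Cname) AS Cname values(CData) AS CData values(FName) AS FName values(LName) AS LName by join_key
| where isnotnull(Cname) AND isnotnull(FName)
```

The final where keeps only keys that matched on both sides, mimicking an inner join.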
Hi, how do I extract the value of the field "alert" into a field named action? Help with the regex, please. Thanks.
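Without a sample event this is only a guess at the log format, but if the raw text contains something like alert=blocked (a hypothetical example), a rex along these lines captures that value into a field named action:

```
| rex field=_raw "alert[\s=:\"]+(?<action>\w+)"
| table action
```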