All Topics

How do I perform stats on a large number of fields matching a certain pattern without doing stats on each one individually? In the sample event below, there are 10+ fields with names beginning with "er_". My task is to fire an alert if any of the values in these fields increases from the previous event.

Sample event:

    er_bad_eof: 0 er_bad_os: 0 er_crc: 0 er_crc_good_eof: 0 er_enc_in: 0 er_enc_out: 0 er_inv_arb: 0 er_lun_zone_miss: 0 er_multi_credit_loss: 0 er_other_discard: 11 er_pcs_blk: 0 er_rx_c3_timeout: 0 er_single_credit_loss: 0 er_toolong: 0 er_trunc: 0 er_tx_c3_timeout: 0 er_type1_miss: 0 er_type2_miss: 0 er_type6_miss: 0 er_unreachable: 0 er_unroutable: 11 er_zone_miss: 0 lgc_stats_clear_ts: Never phy_stats_clear_ts: Never port_description: slot12 port46 port_name: 382

SPL where I run stats on just two of those fields, and where the "er_..._delta" values will be used to fire an alert if they're > 0:

    index="sandbox" source="HEC"
    | stats count AS events,
        min(er_enc_out) AS er_enc_out_min, max(er_enc_out) AS er_enc_out_max,
        min(er_other_discard) AS er_other_discard_min, max(er_other_discard) AS er_other_discard_max
        by host, port_name, port_description
    | eval er_enc_out_delta = er_enc_out_max - er_enc_out_min, er_other_discard_delta = er_other_discard_max - er_other_discard_min
    | sort -er_enc_out_delta -er_other_discard_delta -er_enc_out_max -er_other_discard_max port_name

How do I run similar stats on all fields with names beginning with "er_"? Thanks!
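A hedged sketch of one possible approach, not tested against this data: `stats` aggregation functions accept wildcards, and `foreach` can then iterate over the resulting fields to compute a delta per er_* field and raise a flag if any delta is positive. Index, source, and by-fields are taken from the question above.

```
index="sandbox" source="HEC"
| stats count AS events, min(er_*) AS min_er_*, max(er_*) AS max_er_*
    by host, port_name, port_description
| foreach max_er_*
    [ eval er_<<MATCHSTR>>_delta = '<<FIELD>>' - 'min_er_<<MATCHSTR>>' ]
| eval any_increase = 0
| foreach er_*_delta
    [ eval any_increase = if('<<FIELD>>' > 0, 1, any_increase) ]
| where any_increase = 1
```

The alert condition can then simply be "number of results > 0".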
Hi guys, I'm new to Splunk. I'm trying to write a query that compares two search results and shows the differences and the matches; both search results come from the same index.

I would like to have something like this, where {path-values} holds the path values and {countpath} holds the count:

    Build-type | paths-count | matches-values | diff-values   | matches-count | diff-count
    gradle     | 20K         | {path-values}  | {path-values} | {countpath}   | {countpath}
    bazel      | 10K         | {path-values}  | {path-values} | {countpath}   | {countpath}

My index is based on this JSON, where the total event count is around 30k (the number of JSON documents posted to Splunk):

    {"source":"build","sourcetype":"json","event":{"type":"bazel","paths":["test3"]}}

My current query looks like:

    index="build" type="bazel"
    | stats values(paths{}) as paths
    | stats count(eval(paths)) AS totalbazelpaths
    | mvexpand totalbazelpaths
    | eval eventFound = 0
    | join type=left run_id paths
        [ index="build" type="gradle"
        | stats values(paths{}) as paths
        | stats count(eval(paths)) AS totalgradlepaths
        | mvexpand totalgradlepaths
        | eval eventFound=1 ]
    | eval percentage = round(totalbazelpaths/totalgradlepaths, 10)
    | table totalgradlepaths totalbazelpaths percentage

Any help on how to achieve this? @yuanliu Thanks
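A hedged sketch of an alternative that avoids `join` (which is subject to subsearch result limits): bring both build types into one search, group by path, and classify each path by which build types it appears in. Field names (`type`, `paths{}`) are taken from the question.

```
index="build" (type="bazel" OR type="gradle")
| rename paths{} as path
| stats values(type) as build_types by path
| eval status = if(mvcount(build_types) = 2, "match", mvindex(build_types, 0) . " only")
| stats count as countpath, values(path) as path_values by status
```

This yields one row per status ("match", "bazel only", "gradle only") with the path count and the path values, which can then be rearranged into the table layout above.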
Hello everyone, I have a lookup file which has 5 entries, with field names and values as below:

    "New_field"="yes", New_field1="yes", "New_field3"="yes", New_field4="Yes"

I need to append a new row to the lookup file with all the field values set to "no". I am using the command below to do this:

    | inputlookup sample_demo.csv
    | append
        [| inputlookup sample_demo.csv
         | eval "New_field"="no", New_field1="no", "New_field3"="no", New_field4="no"]

This query is adding the new row, but it's adding 5 new rows. I just need one row appended with the new field values set to "no". Can anyone please guide me on what I am missing in the query?
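The `append` subsearch re-reads the whole lookup, so every existing row comes back with the new values, giving 5 extra rows. A hedged sketch of a fix: generate the single new row with `makeresults` instead (field names taken from the question; the trailing `outputlookup` to persist the change is an assumption about the intent):

```
| inputlookup sample_demo.csv
| append
    [| makeresults
     | eval New_field="no", New_field1="no", New_field3="no", New_field4="no"
     | fields - _time]
| outputlookup sample_demo.csv
```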
I want to perform a search query which can give me results relative to a specific time. For example, I have a particular time: 2022-07-29 18:33:20.

My query:

    index="*" sourcetype="pan:threat" 10.196.246.104 url=* earliest=relative_time("2022-07-29 18:33:20","-1h") AND latest=relative_time("2022-07-29 18:33:20","+1h")
    | stats values(url) as url by _time,dest_ip,dest_port,app,category,rule,action,user

I am not getting appropriate results with this. Can anyone suggest how I can filter on the basis of a particular time?
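`earliest` and `latest` in the search command take time modifiers or literal timestamps, not eval functions such as `relative_time()`, which is likely why the query above misbehaves. A hedged sketch: either hard-code the one-hour window in `%m/%d/%Y:%H:%M:%S` format,

```
index="*" sourcetype="pan:threat" 10.196.246.104 url=*
    earliest="07/29/2022:17:33:20" latest="07/29/2022:19:33:20"
| stats values(url) as url by _time, dest_ip, dest_port, app, category, rule, action, user
```

or, if the anchor time needs to be computed, filter after the fact with `where`:

```
index="*" sourcetype="pan:threat" 10.196.246.104 url=*
| eval anchor = strptime("2022-07-29 18:33:20", "%Y-%m-%d %H:%M:%S")
| where _time >= relative_time(anchor, "-1h") AND _time <= relative_time(anchor, "+1h")
| stats values(url) as url by _time, dest_ip, dest_port, app, category, rule, action, user
```

The first form is faster, since the time window is applied at the index level rather than after events are retrieved.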
Hi, just wondering: for the Microsoft Cloud Services add-on, the documentation says it is only required on the search head cluster, but it says it is optional on the heavy forwarder. My question is: usually I set up the inputs on the deployment server, but as this is cloud data, should the inputs.conf be on the heavy forwarder or the search head?

Thanks,

Joe
Hi Splunkers,

Requirement: I have to create a table of COUNT OF ERRORS based on text searches in the _raw data. I have created the query below:

    eventtype=XXX_AC_db ("Transaction (Process ID *) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.*" OR "Rest Api POST error. Database has timed out. (TT-000346)")
    | rex field=Exception "System(?<m>.*):\s(?<message>.*)\s+at"
    | eval message=if(like(message,"%Transaction (Process ID %) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.%"),"Transaction (Process ID XX) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.",message)
    | stats count by message
    | append [ stats count | where count=0 | eval message="Transaction (Process ID XX) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction."]
    | append [| search eventtype=XXX_AC_db "Rest Api POST error. Database has timed out. (TT-000346)" | stats count by Message | rename Message as message]
    | append [ stats count | where count=0 | eval message="Rest Api POST error. Database has timed out. (MG-000346)"]
    | append [| search eventtype=XXX_AC_db "*Database has timed out. (TT-000346)*" | eval Message=if(like(Message,"%Database has timed out. (TT-000346)%"),"Database has timed out. (TT-000346)",Message) | stats count by Message | rename Message as message]
    ...................

This query is taking too much time to execute. Is there another way to combine the different searches and get the result?

Thank you in advance.
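A hedged sketch of a single-pass rewrite: each `append [| search eventtype=XXX_AC_db ...]` re-runs a search over the raw data, which multiplies the runtime. Classifying every event once with `case()`/`match()`, counting, and then unioning in zero-count placeholder rows via `makeresults` scans the data only once (the match patterns and message strings below are adapted from the question and may need adjusting):

```
eventtype=XXX_AC_db ("*was deadlocked on lock resources*" OR "*Database has timed out.*")
| eval message = case(
    match(_raw, "was deadlocked on lock resources"), "Transaction (Process ID XX) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.",
    match(_raw, "Database has timed out\. \(TT-000346\)"), "Database has timed out. (TT-000346)")
| stats count by message
| append
    [| makeresults
     | eval message = split("Transaction (Process ID XX) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.;Database has timed out. (TT-000346)", ";")
     | mvexpand message
     | eval count = 0
     | fields message, count]
| stats sum(count) as count by message
```

The final `stats sum(count)` merges the placeholder rows with the real counts, so every expected message appears in the table even when its count is 0.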
Hello everybody,

My query:

    index=logarithm SrcAddr="192.168.148.1"
    | eval flag=case(
        DestAddr="192.168.148.7" OR DestAddr="192.168.148.8" OR DestAddr="192.168.148.24", "LAN 1",
        DestAddr="192.168.148.21" OR DestAddr="192.168.148.36" OR DestAddr="192.168.148.37", "LAN 4",
        DestAddr="192.168.148.33" OR DestAddr="192.168.148.34" OR DestAddr="192.168.148.35", "LAN 5")
    | chart count over flag by DestAddr useother=f usenull=f

In trellis mode, every chart shows all DestAddrs for each flag (as we can see in the picture), but I don't want to show DestAddrs with 0 values in every chart:

for "LAN 1", just show "192.168.148.7", "192.168.148.8", or "192.168.148.24"
for "LAN 4", just show "192.168.148.21", "192.168.148.36", or "192.168.148.37"
for "LAN 5", just show "192.168.148.33", "192.168.148.34", or "192.168.148.35"
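`chart ... by DestAddr` creates a column for every DestAddr value, which is why each trellis panel carries all of the addresses with zeros. A hedged sketch of an alternative: use `stats` by both fields, which only emits rows for flag/DestAddr combinations that actually occur, and then split the trellis by flag:

```
index=logarithm SrcAddr="192.168.148.1"
| eval flag=case(
    DestAddr="192.168.148.7" OR DestAddr="192.168.148.8" OR DestAddr="192.168.148.24", "LAN 1",
    DestAddr="192.168.148.21" OR DestAddr="192.168.148.36" OR DestAddr="192.168.148.37", "LAN 4",
    DestAddr="192.168.148.33" OR DestAddr="192.168.148.34" OR DestAddr="192.168.148.35", "LAN 5")
| where isnotnull(flag)
| stats count by flag, DestAddr
```

Whether each panel then shows only its own addresses depends on the visualization's trellis settings, so this is a starting point rather than a guaranteed fix.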
We have built a dashboard using Splunk Dashboard Studio with an absolute layout. We have added some rectangle shapes to the dashboard and added color to them as well. Now we want the rectangle shapes to change color on hover. Is there any way to achieve this in Dashboard Studio?
We have built a dashboard using Splunk Dashboard Studio with an absolute layout. Now we want to remove or hide the Splunk Enterprise bar from the dashboard, since the client doesn't want it there. Since Dashboard Studio uses JSON, I am not finding any workaround to hide the Splunk Enterprise logo bar. Can someone help me with this?
Hi, I have 4 sources for one sourcetype. I am getting data from 3 of the sources but not from the other one. The logs are present but not showing up in Splunk. I checked inputs.conf and the configuration is the same for all 4 sources; crcSalt = <SOURCE> is also there in inputs.conf. I restarted the servers, but I am still not able to see the data. Can you please tell me if I am missing anything?
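One hedged way to see what splunkd thinks about the missing file is to query the forwarder's internal logs for tailing activity on that path (the file path below is a placeholder for the real source):

```
index=_internal sourcetype=splunkd
    (component=TailReader OR component=TailingProcessor OR component=WatchedFile)
    "*/path/to/missing/source.log*"
| table _time, host, component, log_level, _raw
```

Messages there about checksums, ignored files, or "file too small" would indicate whether the file is being skipped despite the crcSalt setting.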
After following the well-verified steps noted in https://community.splunk.com/t5/Deployment-Architecture/How-to-move-the-SHC-deployer-to-another-host-Part-2/m-p/604671#M25839 I was not able to successfully connect and test a push from the new deployer to the shcluster members. I received this error:

    Error while deploying apps to first member, aborting apps deployment to all members: Error while fetching apps baseline on target=https://host:8089: Non-200/201 status_code=401; {"messages":[{"type":"ERROR","text":"Unauthorized"}]}

Here are my steps:

1. Copied the contents of /opt/splunk/etc/shcluster from the old deployer to /opt/splunk/etc/shcluster on the new deployer.
2. Configured the new deployer's [shclustering] stanza in /opt/splunk/etc/system/local/server.conf with the info from the old deployer's [shclustering] stanza.
3. Updated conf_deploy_fetch_url in server.conf on each of the SHC members.
4. Restarted the new deployer and did a rolling restart on the SHC members.
5. Did a test apply bundle and then received the unauthorized error.

I believe the issue could be that the pass4SymmKey on the new deployer is not the same as the pass4SymmKey on the SHC members. On the old deployer I ran ./splunk show-decrypted --value <key> against the [shclustering] stanza:

    [shclustering]
    pass4SymmKey = <key>
    shcluster_label = Company_shcluster1

I used the decrypted key as the pass4SymmKey for the new deployer, but ultimately I am not able to run a successful push. Is there a way to recover these keys? The previous admin did not save the original secrets used to set up the deployer. Any advice greatly appreciated. Thank you
Hi there, I am using RHEL 8.6 x86_64 (Ootpa) / kernel 4.18.0 and trying to update the Splunk Add-on for Unix and Linux. I am getting this error:

    An error occurred while downloading the app: [HTTP 404] https://127.0.0.1:8089/services/apps/local/Splunk_TA_nix/update; [{'type': 'ERROR', 'code': None, 'text': 'Error downloading update from https://splunkbase.splunk.com/app/833/release/8.6.0/download/?origin=cfu: Not Found'}]

When I manually tried to download from that link - https://splunkbase.splunk.com/app/833/release/8.6.0/download/?origin=cfu - I got "Oops! 404 Error: Page not found." Please share your thoughts on how to update the Linux/Unix add-on from the Splunk console.
I have two indexes which include the same data in different fields, as seen below:

    index1 -- user, fileName, ...etc
    index2 -- event.file, actor

    user = actor and fileName = event.file

The following search gives me events where a user and their file from index2 are available in index1:

    index="index1"
        [ search index="index2" "event"=event2 event.file="something_*"
        | table event.file, actor
        | rename event.file as fileName, actor as user ]
    | table actor

But I don't need this, since I know they should be included in index1. What I am trying to find is: if a user and their file in index2 are NOT available in index1, I want to list them out. Thanks for the help.
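A hedged sketch that inverts the logic without a subsearch: read both indexes, normalize the field names, and keep only user/file pairs that appear in index2 but never in index1 (the "event"/event.file filters are copied from the question):

```
(index="index1") OR (index="index2" "event"=event2 event.file="something_*")
| eval user = coalesce(user, 'actor'), fileName = coalesce(fileName, 'event.file')
| stats values(index) as indexes by user, fileName
| where mvcount(indexes) = 1 AND indexes = "index2"
| table user, fileName
```

The single quotes around 'actor' and 'event.file' are eval's syntax for referencing field names, needed here because event.file contains a dot.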
Hi All, I tried running the two SPLs below over the same index and time range, but got two very different sets of results:

    SPL 1: | tstats values(host) where index=xxx
    SPL 2: index=xxx | stats values(host)

In SPL 1, I get one value. In SPL 2, I get six values.

I also tried running just index=xxx and checking the fields panel on the left-hand side; the host field had the same values as SPL 2.

Please help explain why this happens and how it can be resolved. Thank you
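A hedged explanation: `tstats` reads the host value that was written at index time, while `stats` (and the fields panel) see the host after any search-time overrides from props/transforms or lookups, so the two can legitimately differ. One way to narrow down which data is affected is to compare the two per sourcetype:

```
| tstats count where index=xxx by host, sourcetype
```

versus

```
index=xxx | stats count by host, sourcetype
```

Sourcetypes whose host values differ between the two outputs point at a search-time host override; resolving it usually means either fixing the override or fixing the host assignment at the input so the indexed value is correct.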
I am attempting to convert most of my XML to JavaScript in my dashboards. I have several single values that I can click on to show that specific data in the table. For example, one particular single value is Blacklisted. When I click on the numeric value, it shows details of files, md5, sha256, dates, etc. that have been tagged as blacklisted.

In XML my token is set as follows:

    <set token="tkblacklist">blacklist IN (t)</set>

I filter on "true." The token is used in my table, and I get a list of blacklisted entities:

    ... | search $tkblacklist$

Screenshot below is a sample of the current dashboard functionality.

When I try to do this in JavaScript, I am confused about how to apply the token using the SingleView and pass the token to the TableView. I have done a lot of reading, watching videos, and trial and error, but I can't seem to get this right. Most of the examples are for text inputs, dropdowns, and multi-select features.

My test.js file:

    require([
        'underscore',
        'backbone',
        'splunkjs/mvc',
        'splunkjs/mvc/searchmanager',
        'splunkjs/mvc/postprocessmanager',
        'splunkjs/mvc/singleview',
        'splunkjs/mvc/tableview',
        'splunkjs/mvc/simplexml/ready!'
    ], function(_, Backbone, mvc, SearchManager, PostProcessManager, SingleView, TableView) {

        var baseSearch = new SearchManager({
            id: "baseSearch",
            preview: true,
            cache: false,
            search: "| tstats count values(modproc.process) AS process from datamodel=dmname.modproc where nodename=modproc by modproc.blacklist modproc.process modproc.md5"
        });

        // Blacklisted
        var blacklistProcesses = new PostProcessManager({
            id: "blacklistProcesses",
            managerid: "baseSearch",
            search: "| rename modproc.* AS * | search blacklist IN (\"t\") | stats count"
        });

        new SingleView({
            id: "blacklistProcesses_Dashboard",
            managerid: "blacklistProcesses",
            height: "50",
            el: $("#blacklistProc")
        }).render();

        // Get your div
        var my_div = $("#blacklistProc");

        // Respond to clicks
        my_div.on("click", function(e) {
            var tokens = mvc.Components.get("submitted");
            tokens.set("mytoken", "| search blacklist IN (\"t\")");
        });

        // Process Table View
        var tableProcesses = new PostProcessManager({
            id: "tableProcesses",
            managerid: "baseSearch",
            search: "| rename modproc.* AS * | $mytoken$"
        }, {tokens: true});

        new TableView({
            id: "tblProcess",
            managerid: "tableProcesses",
            pageSize: "50",
            el: $("#tableProc")
        }).render();
    });

My XML:

    <dashboard script="test.js" stylesheet="test.css" theme="dark">
      <label>Test Javascript Dashboard</label>
      <row>
        <panel>
          <html>
            <h3 class="MainHeading"> Blacklisted </h3>
            <div id="blacklistProc"/>
          </html>
        </panel>
      </row>
      <row>
        <panel>
          <title>My Table</title>
          <html>
            <div id="tableProc"/>
          </html>
        </panel>
      </row>
    </dashboard>

I have also tried to use drilldown in the SingleView, but that just opens the Search window. Thanks
In Splunk Enterprise, is there a way to find all the dashboards (etc.) that consume data from a given database input that was set up in DB Connect?
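There is no built-in lineage view for this, but a hedged sketch using the REST API can find dashboards whose XML literally mentions the input's index, sourcetype, or connection name (the wildcard string below is a placeholder you would replace):

```
| rest /servicesNS/-/-/data/ui/views splunk_server=local
| search "eai:data"="*your_dbx_input_or_sourcetype*"
| table title, eai:acl.app, eai:acl.owner
```

This only catches dashboards that reference the input's names directly in their search strings; dashboards built on saved searches or macros that wrap the input would need the same check against the savedsearches and macros REST endpoints.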
Hi Everyone,

Here is some context: one of our customers is using the Splunk app we created with the Add-on Builder. All it does is forward alerts into our platform, and we use splunk.Intersplunk to get the search results. The customer is getting the following error, but we do not know why:

    07-11-2022 15:31:37.179 +0000 ERROR sendmodalert - action=bigpanda_alert STDERR - backports.configparser.MissingSectionHeaderError: File contains no section headers.

My initial suspicion is that it is related to enableheader being overridden to false in etc/system/default/commands.conf, but I tried that on my instance and got no errors. Any insights into this would be greatly appreciated!
Hi everyone, I was looking at how I can ingest data from Bitbucket Cloud into Splunk Cloud (8.2.2 Victoria). The old Bitbucket app gets rejected by the Splunk Cloud app upload. I saw the Lantern article (Atlassian: Bitbucket - Splunk Lantern), but it doesn't have any actual information. Does anyone have any working integrations of Splunk Cloud with Bitbucket Cloud? Any sourcetype/data type information? Thanks in advance!
Hey everyone, I'm pretty new to both Splunk and Jira, but I'm trying to integrate them to get real-time events/alerts from Splunk sent over to Jira as a task or open ticket. I've tried an add-on called Atlassian Jira Issue Alerts but can't seem to get it working after the configuration; no alerts have been showing up. Can someone guide me on how to configure it, since the details page for the add-on doesn't really say much? Or suggest a different add-on, or even a way of integrating through the APIs? (Not that I've used one before, but with proper guidance I can do the research.) Thanks in advance!
I have index=main user=Local Domain\abc and it won't return any search results, but if I search with index=main user=Local Domain\\abc it works. I tried rex as well, but it didn't work for my dashboard, as it won't display any results. Is there any solution to search without adding another \ to the search?
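In the search command the backslash is an escape character, so a literal `\` in a field value has to be written as `\\`; the same doubling applies inside eval string literals, so there is no way to write the value with a single backslash in SPL itself. A hedged workaround sketch, if the goal is to avoid typing the doubled backslash at all: match the value in a `where` clause with `like()`, letting the `%` wildcard stand in for the backslash:

```
index=main
| where like(user, "Local Domain%abc")
```

For a dashboard, another common pattern is to keep the doubled form in the saved search string once, and let users supply only the account name (e.g. abc) through a token, so nobody has to type the backslashes interactively.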