All Topics

ITSI alert title: Alert $result.service_name$ on host $result.src$ $result.description$
Message body: An event has been detected: Host: $result.host$ Source: $result.source$ Error Code: $result.error_code$ Description: $result.description$

I'm fairly new to ITSI and Splunk in general, and I couldn't find clear documentation on these tokens. The only token that resolves right now is $result.description$. Any assistance would be much appreciated. Thank you!
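For context on why only $result.description$ would resolve: $result.<field>$ tokens are filled from the first result row of the search that fired the alert, so every field referenced in the message has to survive to the end of that search. A quick sanity check is to end the correlation search with a table of exactly those fields (the names here simply mirror the tokens above):

```
... | table service_name src host source error_code description
```

Any field that comes back empty in that table will render as a literal token or a blank in the alert message.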
Hi, I have an event that contains prod and test objects depending on env. If it is test, I need to go into the nsps [{},{}] array, check each Name (say A, B, C, D), and get its associated ReadOnlyConsumerNames in tabular format.

Expected output:

Name    ReadOnlyConsumerNames
A       Application, Lst, data
B       Application, Lst
C       Lst
D       Lst, Gt, PT

Sample event (reconstructed from the event viewer):

{
  "prod": {},
  "test": {
    "DistinctAdminConsumers": ["App", "pd"],
    "DistinctAdminUser": 2,
    "DistinctReadConsumers": ["Application", "GT", "Technology", "data"],
    "DistinctReadUser": 4,
    "TotalAdminUser": 20,
    "TotalNSPCount": 10,
    "TotalReadUsers": 13,
    "nsps": [
      { "AdminConsumerNames": ["App", "pd"], "AdminUserCount": 2, "Name": "A",
        "ReadOnlyConsumerNames": ["Application", "Lst", "data"], "ReadonlyUserCount": 3 },
      { "AdminConsumerNames": ["App", "Data"], "AdminUserCount": 2, "Name": "B",
        "ReadOnlyConsumerNames": ["Application", "Lst"], "ReadonlyUserCount": 3 },
      { "AdminConsumerNames": ["preprod", "pd"], "AdminUserCount": 2, "Name": "C",
        "ReadOnlyConsumerNames": ["Lst"], "ReadonlyUserCount": 1 },
      { "AdminConsumerNames": [], "AdminUserCount": 2, "Name": "D",
        "ReadOnlyConsumerNames": ["Lst", "Gt", "PT"], "ReadonlyUserCount": 1 }
    ]
  }
}
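Not from the original post, but one way to produce that table in SPL, assuming the JSON above is the raw event (_raw):

```
| spath input=_raw path=test.nsps{} output=nsp
| mvexpand nsp
| eval Name=spath(nsp, "Name")
| eval ReadOnlyConsumerNames=mvjoin(spath(nsp, "ReadOnlyConsumerNames{}"), ", ")
| table Name ReadOnlyConsumerNames
```

mvexpand turns each element of the nsps array into its own row, and the spath eval function then pulls the per-object fields out of each element.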
Hey, I'm trying to play sounds in my Dashboard Studio dashboard. I heard it's not possible because Dashboard Studio is not as customizable as classic dashboards. Does anyone know of a workaround before I have to switch to a classic dashboard?
I am working on a dashboard that has a bunch of fields and will be used by multiple teams, each needing different fields from the table. Is there any way to add a toggle, filter, or anything similar to provide a few presets (e.g. fields A, D, E, H as preset 1 for team 1; fields B, C, D, F, G as preset 2 for team 2; and so on)? I also use filters on fields in the dashboard table; if possible, I would want hiding a field to not impact the filters at all. Thanks in advance.
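One low-tech pattern for this, sketched with hypothetical names: a dropdown input whose token value is the field list for each team, consumed by the table command at the end of the panel search.

```
<base search for the panel>
| table $preset_fields$
```

Here the dropdown would define preset_fields with static options such as "A D E H" (preset 1 for team 1) and "B C D F G" (preset 2 for team 2). Because table only projects columns for display, any filtering done earlier in the search keeps working regardless of which preset is selected.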
Hello everyone, I have built a dashboard with Dashboard Studio, but I have noticed that while the panels expose many properties, you cannot change the position of the markdown text. I have already looked through the documentation, but to no avail (maybe I am missing something). By "changing position" I simply mean aligning the panel text left, centre, or right. Do you have any ideas? Thank you, biwanari
Hello, I have a log file with the same name in both production and stage, but with a different sourcetype name in each. Since I don't want these logs ingested from production, I added the entry below.

props.conf:
[source::<Log file path>]
TRANSFORMS-null = setnull

transforms.conf:
[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

But I want the same log file from stage and not from production. If in props.conf I scope the stanza to the production sourcetype instead, will that restrict only the production logs and still ingest the stage logs, whose sourcetype name is different?

[<Prod Sourcetype>]
TRANSFORMS-null = setnull

In addition, the prod sourcetype covers two other logs, and I don't want those stopped because of this configuration change. Thanks
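For what it's worth, one pattern that matches this intent is to scope the transform to the production sourcetype and have the transform itself match on the source path, so other logs under the same sourcetype are untouched. A sketch, with drop_common_log as a hypothetical transform name and the real sourcetype and path substituted in:

```
# props.conf -- applies only to the production sourcetype
[<Prod Sourcetype>]
TRANSFORMS-dropfile = drop_common_log

# transforms.conf -- match on the event's source path, not its body
[drop_common_log]
SOURCE_KEY = MetaData:Source
REGEX = <Log file path>
DEST_KEY = queue
FORMAT = nullQueue
```

Because the stage sourcetype is different, stage events never hit this transform; and within the prod sourcetype, only events whose source matches the path regex are sent to the null queue.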
The IP address keeps changing, but the error is the same:

Forwarder Ingestion Latency
Root cause(s): Indicator 'ingestion_latency_gap_multiplier' exceeded configured value. The observed value is 272246.
Message from D97C3DE9-B0CE-408F-9620-5274BAC12C72:192.168.1.191:50409

How do you solve this problem?
Hi Community, I have a data source that sometimes submits faulty humidity readings, like 3302.4 percent. To clean out these outlier events, I built a timechart average to get the real humidity curve, and from that curve I take the max and min with stats to get the upper and lower bounds. But my search won't work, and I need your help. Here is a makeresults sample:

| makeresults format=json data="[{\"_time\":\"1729115947\", \"humidity\":70.7},{\"_time\":\"1729115887\", \"humidity\":70.6},{\"_time\":\"1729115827\", \"humidity\":70.5},{\"_time\":\"1729115762\", \"humidity\":30.9},{\"_time\":\"1729115707\", \"humidity\":70.6}]"
[ search
| timechart eval(round(avg(humidity),1)) AS avg_humidity
| stats min(avg_humidity) as min_avg_humidity ]
| where humidity < min_avg_humidity ```| delete ```
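Not the original search, but one common pattern for this kind of cleanup is to compute the spread inline with eventstats and filter on deviation from the median, which is robust against the outliers themselves. A sketch against the sample above; the tolerance of 10 percentage points is an assumption to tune:

```
| makeresults format=json data="[{\"_time\":\"1729115947\", \"humidity\":70.7},{\"_time\":\"1729115887\", \"humidity\":70.6},{\"_time\":\"1729115827\", \"humidity\":70.5},{\"_time\":\"1729115762\", \"humidity\":30.9},{\"_time\":\"1729115707\", \"humidity\":70.6}]"
| eventstats median(humidity) AS med_humidity
| where abs(humidity - med_humidity) <= 10
```

With the sample data, the 30.9 reading sits roughly 40 points below the median of 70.6 and is filtered out, while the readings around 70 all survive.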
Hi Team, I am fetching unique ITEM values with a first SQL query running against one database, then passing those values to a second SQL query to fetch the corresponding values from a second database.

First SQL query:
select distinct a.item from price a, skus b, deps c, supp_country s where zone_id in (5, 25) and a.item = b.sku and b.dept = c.dept and a.item = s.item and s.primary_supp_ind = 'Y' and s.primary_pack_ind = 'Y' and b.dept in (7106, 1666, 1650, 1651, 1654, 1058, 4158, 4159, 489, 491, 492, 493, 495, 496, 497, 498, 499, 501, 7003, 502, 503, 7004, 450, 451, 464, 465, 455, 457, 458, 459, 460, 461, 467, 494, 7013, 448, 462, 310, 339, 7012, 7096, 200, 303, 304, 1950, 1951, 1952, 1970, 1976, 1201, 1206, 1207, 1273, 1352, 1274, 1969, 1987, 342, 343, 7107, 7098, 7095, 7104, 2101, 2117, 7107, 7098, 1990, 477, 162, 604, 900, 901, 902, 903, 904, 905, 906, 908, 910, 912, 916, 918, 7032, 919, 7110, 7093, 7101, 913, 915, 118, 119, 2701, 917) and b.js_status in ('CO');

Second SQL query:
WITH RankedData AS (
  SELECT Product_Id, BusinessUnit_Id, Price, LastUpdated,
         ROW_NUMBER() OVER (PARTITION BY Product_Id, BusinessUnit_Id ORDER BY LastUpdated DESC) AS RowNum
  FROM RETAIL.DBO.CAT_PRICE(nolock)
  WHERE BusinessUnit_Id IN ('zone_5', 'zone_25') AND Product_Id IN ($ITEM$)
)
SELECT Product_Id, BusinessUnit_Id, Price, LastUpdated FROM RankedData WHERE RowNum = 1;

When I use the map command as shown below, the expected results are fetched, but only 10k records, per the map command's limitation. I want to fetch all the records (around 30k).

Splunk query:
| dbxquery query="First SQL query" connection="ABC"
| eval comma="'"
| eval ITEM='comma' + 'ITEM' + 'comma' + ","
| mvcombine ITEM
| nomv ITEM
| fields - comma
| eval ITEM=rtrim(tostring(ITEM),",")
| map search="| dbxquery query=\"Second SQL query\" connection=\"XYZ\""

But when I use the join command as shown below to get all the results (more than 10k), I am not getting the desired output; the output only contains results from the first query. I tried replacing the column name Product_Id in the second SQL with ITEM in all places, but still no luck.

| dbxquery query="First SQL query" connection="ABC"
| fields ITEM
| join type=outer ITEM
    [| dbxquery query="Second SQL query" connection="XYZ"]

Could someone help me understand what is going wrong and how I can get all the matching results from the second query?
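Not a guaranteed fix, but two things commonly bite here: join matches on field names, so the subsearch side must expose a field literally named ITEM (a rename inside the subsearch handles that), and dbxquery inside a subsearch needs a leading pipe. A sketch under those assumptions:

```
| dbxquery query="First SQL query" connection="ABC"
| fields ITEM
| join type=left ITEM
    [| dbxquery query="Second SQL query" connection="XYZ"
     | rename Product_Id AS ITEM]
```

Note that join's subsearch has its own result cap (typically 50,000 rows), so counts above that would still be truncated silently.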
Hello everyone, and thanks in advance for your help. I'm very new to this subject, so if anything is unclear I'll try to explain my problem in more detail. I'm using Splunk 9.2.1, and I'm trying to generate a PDF from one of my dashboards over the last 24 hours, using a Splunk API call. I'm using a POST request to the ".../services/pdfgen/render" endpoint. First, I couldn't find any documentation on this matter. Furthermore, even when looking at $SPLUNK_HOME/lib/python3.7/site-packages/splunk/pdf/pdfgen_*.py (endpoint, views, search, utils), I couldn't really work out what arguments to use to ask for the last 24 hours of data. I know it should be possible, because in the Splunk GUI you can choose a time range and render according to it. I saw something that looks like time-range arguments, et and lt, which should be earliest time and latest time, but I don't know what type of time value they expect, and trying random things didn't get me anywhere. If you know anything on this subject, please help. Thank you.
Hi, I want to know if it is possible to have a line chart with the area between the max and min values filled with color. Example: for the chart below, we will be adding 2 more lines (Max and Min), and we would like the area between the Max and Min lines filled with color. Current query generating the 3 lines:

| table Start_Time CurrentWeek "CurrentWeek-1" "CurrentWeek-2"

The 2 extra lines (Max and Min) need to be added to the line chart above, with the color filled between Max and Min.
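On the data side, the two extra series can be derived per row from the existing week columns; shading the band between them is then a chart-configuration question on top of this. A sketch, assuming the three columns already exist as numeric fields:

```
| eval Max=max(CurrentWeek, 'CurrentWeek-1', 'CurrentWeek-2'),
       Min=min(CurrentWeek, 'CurrentWeek-1', 'CurrentWeek-2')
| table Start_Time CurrentWeek "CurrentWeek-1" "CurrentWeek-2" Max Min
```

The single quotes around 'CurrentWeek-1' and 'CurrentWeek-2' are needed in eval because the field names contain hyphens.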
Dear all, I'm trying to search for denied actions in a subnet, regardless of whether it is the source or destination. I tried these without success; maybe you can help me out. Thank you!

index=* AND src="192.168.1.0/24" OR dst="192.168.1.0/24" AND action=deny
index=* action=deny AND src_ip=192.168.1.0/24 OR dst_ip=192.168.1.0/24

Just found it:

index=* dstip="192.168.1.0/24" OR srcip="192.168.1.0/24" action=deny
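Since AND binds tighter than OR in SPL, explicit parentheses make the intent unambiguous; the field names (src/dst vs. src_ip/dst_ip vs. srcip/dstip) depend on the sourcetype's extractions. A sketch using the srcip/dstip names from the working attempt:

```
index=* action=deny (srcip="192.168.1.0/24" OR dstip="192.168.1.0/24")
```

An equivalent, when the CIDR comparison has to happen after field extraction, is the cidrmatch eval function:

```
index=* action=deny
| where cidrmatch("192.168.1.0/24", srcip) OR cidrmatch("192.168.1.0/24", dstip)
```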
Hello. I installed a trial of Splunk Enterprise 7.3 in my home environment.
Environment: Windows Server 2019 Essentials (192.168.0.x), Active Directory.
I installed it on one machine only; I am not using forwarders or the like.
When accessing Splunk Web, the following addresses connect correctly:
https://localhost:8000
https://127.0.0.1:8000
However, I cannot access it via the server's own IP address or hostname; the connection times out:
https://192.168.0.x:8000
https://foo:8000
Accessing those addresses from other clients does not bring up the Splunk screen either.
Why is this? I suspect some setting is missing. Any advice would be appreciated.
Hello! I'm using Splunk_SA_CIM with ESS, and I'm currently studying most of the ESCU correlation searches for my own purposes. Problem: I discovered that most of my ESCU rules are creating a lot of notable events which, after investigation, were all false positives. All these rules are based on fields coming from the Endpoint data model (for example, Processes.process_path), and because most of the process_path values are equal to "null", the search triggers and creates a notable event. I've already updated every app I use, and to gather Windows data I'm using the Splunk_TA_Windows add-on. Do you have any clue how I can find where the problem is and solve it?
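As a quick way to confirm the diagnosis (and to find which input is producing the bogus values), the data model can be queried for the literal string "null" directly. A sketch, assuming the Endpoint data model is accelerated:

```
| tstats count FROM datamodel=Endpoint.Processes
    WHERE Processes.process_path="null"
    BY index sourcetype
```

Splitting by index and sourcetype points at the source of the bad extractions; correlation searches can then exclude them with Processes.process_path!="null" as a stopgap until the field extraction is repaired.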
I have a saved search that is scheduled, but it is not running at the scheduled time and does not show up as having run.
Is this TA still being developed and supported? https://splunkbase.splunk.com/app/4950/ I followed the 'visit site' link on the Splunkbase page and couldn't see the Enterprise version advertised.
Hi Splunk Community, I am having issues with Splunk DB Connect 3.18.0 not sending data. I was able to connect the DB Connect app to the database and query it properly, but no luck seeing the data in Splunk Cloud. I am able to send other logs and data to Splunk Cloud with no issues. Thanks!
The environment I'm monitoring has a large number of custom database metrics.  For those not familiar, these are queries run against the database by the appdynamics agent, that are then displayed in custom dashboards.  This works great for us.  The problem is, our environment is complex, and frequently changing.  The Custom Metrics are currently maintained by hand (someone has to go in and modify them when the environment changes).  There is no import/export option in the UI.  I've read through the API that is available, but I'm not able to find a way to upload or download a custom database metric.  Alternately, is there a way to perform a variable substitution for the database server and value in the query? Anything that could make this less of a manual process.   Thanks
I'm trying to have Splunk Enterprise log the creation of a user on the same system where Splunk is installed. My Splunk version is 9.3.1. Alongside this install, I've installed the latest Universal Forwarder (Windows) on localhost (127.0.0.1). When installing:
- I skip the SSL page and click "Next"
- select "Local System" and click "Next"
- check all items under "Windows Log Events" and click "Next"
- generate an admin account and password
- leave the "Deployment Server" settings empty
- enter "127.0.0.1:9997" as host and port for "Receiving Indexer"
- finish the installer

Then I create a user (net user /add <user>) in CMD. After this step I return to Splunk Search and enter * as the search criteria, but nothing is found. Even when I search for the username I added, the software finds nothing. Can someone tell me what I'm doing wrong or what the issue might be? Thanks! Gerd
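One thing worth double-checking in a setup like this: the forwarder sends to 127.0.0.1:9997, which only works if the Splunk Enterprise instance is actually listening on that port (Settings > Forwarding and receiving > Configure receiving), and the Security event-log input must be enabled on the forwarder. A sketch of the relevant config, assuming defaults otherwise:

```
# inputs.conf on the Universal Forwarder -- collect the Security log,
# where "net user /add" lands as event ID 4720 (user account created)
[WinEventLog://Security]
disabled = 0

# inputs.conf on the Splunk Enterprise instance -- open the receiving port
[splunktcp://9997]
disabled = 0
```

With the defaults, the events land in the main index, so a targeted check such as index=main source="WinEventLog:Security" EventCode=4720 over a recent time range is more reliable than a bare * search.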
I have the events below, and I need to display only the event that has action=test and category=testdata.

test {
  line1: 1
  "action": "test",
  "category": "testdata",
}
test1 {
  line1: 1
  "action": "event",
  "category": "testdata",
}
test2 {
  line1: 1
  "action": "test",
  "category": "duplicate_data",
}
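Assuming action and category are extracted as fields (Splunk's automatic KV extraction often picks up "key": "value" pairs), the filter is a plain field search; index and sourcetype here are placeholders:

```
index=main sourcetype=my_events action="test" category="testdata"
```

If the fields are not auto-extracted, a rex fallback pulls them out of the raw text first:

```
index=main sourcetype=my_events
| rex "\"action\":\s*\"(?<action>[^\"]+)\""
| rex "\"category\":\s*\"(?<category>[^\"]+)\""
| search action="test" category="testdata"
```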