All Topics

Hi, I am very new to Splunk reporting. I am trying to change a timechart to show free space in GB instead of MB, since we are using a 4 TB disk share. Below is my search query. It works, but I am trying to make it more readable.

index=perfmon host=ServerName eventtype=perfmon_windows NetShare14 | timechart span=24h avg(Free_Megabytes) as MB_Free

Thanks, Kesrich
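One common approach (a sketch; the index, host, and field names are taken from the question) is to convert megabytes to gigabytes with an eval before the timechart:

```
index=perfmon host=ServerName eventtype=perfmon_windows NetShare14
| eval Free_GB = round(Free_Megabytes / 1024, 2)
| timechart span=24h avg(Free_GB) as GB_Free
```

Dividing by 1024 converts MB to GB; round() keeps the chart values to two decimals.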
When attempting to install the Rapid7 TA 1.2.1, I get a 500 internal server error when I attempt to run setup. I am not getting any useful clues in the logs. Is there any solution for this, or advice for troubleshooting?
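A hedged starting point: setup-page failures are usually logged by splunkd rather than by the app itself, so searching Splunk's internal index for errors around the time of the 500 can surface the underlying exception (this search is generic, not specific to this TA):

```
index=_internal sourcetype=splunkd log_level=ERROR
```

splunkd.log and web_service.log under $SPLUNK_HOME/var/log/splunk are also worth checking directly.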
Is there a way to make the first column in the Settings -> Data Inputs -> <app data input> list a hyperlink, so I can edit the Data Input when using a custom UI manager.xml? I would like the server's IP address or FQDN to be a hyperlink that opens the data input settings for editing.

Sample manager.xml:

<endpoint name="data/inputs/servertest">
  <header>App server test</header>
  <breadcrumb>
    <parent hidecurrent="False">datainputstats</parent>
    <name>App server test</name>
  </breadcrumb>
  <elements>
    <element name="server" type="textfield" label="Server FQDN or IP Address">
      <view name="create"/>
      <view name="edit"/>
      <view name="list"/>
    </element>
  </elements>
</endpoint>

https://docs.splunk.com/Documentation/Splunk/8.0.4/AdvancedDev/ModInputsCustomizeUI
Hi! I have a dashboard with two panels whose searches and fields are different. For example, one panel has the search:

index=example sourcetype=abc | table field1 field2 field3

Another panel has the search:

index=example sourcetype=xyz | table field1 somethingElse

What I would like is a text input for field2 in the first panel, and then to populate the second panel with the values of somethingElse and field1 where the field1 values match those from the first panel. Sorry for the simplified example; my real search is too long and complex to post. Thank you very much!
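A common pattern (a sketch; $field2_tok$ is a hypothetical token name for the text input, and the index and field names are taken from the example) is to drive the second panel with a subsearch that reuses the first panel's filter:

```
index=example sourcetype=xyz
    [ search index=example sourcetype=abc field2="$field2_tok$" | fields field1 ]
| table field1 somethingElse
```

The subsearch returns the matching field1 values, which are implicitly combined into the outer search as field1="..." terms, so the second panel only shows rows whose field1 matched in the first panel.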
{
  "DbMaintenanceDailyRoutineSummary": {
    "success": [
      { "server-002": [
          { "vacuum": true, "analyze": true, "warehouse": "mydatabase@aaaaaa" },
          { "vacuum": true, "analyze": true, "warehouse": "mydatabase@bbbbbb" }
      ] },
      { "server-003": [
          { "vacuum": true, "analyze": true, "warehouse": "mydatabase@ccccccc" },
          { "vacuum": true, "analyze": true, "warehouse": "mydatabase@ddddddd" }
      ] }
    ],
    "fail": [
      { "server-002": [
          { "vacuum": true, "analyze": false, "warehouse": "mydatabase@eeeeee" }
      ] },
      { "server-003": [
          { "vacuum": false, "analyze": true, "warehouse": "mydatabase@fffffff" },
          { "vacuum": true, "analyze": false, "warehouse": "mydatabase@gggggg" },
          { "vacuum": true, "analyze": false, "warehouse": "mydatabase@hhhhhh" }
      ] }
    ]
  }
}

I am wondering how I can convert this result into something like the following message, to send it as an alert by email:

DbMaintenanceDailyRoutineSummary
fail:
server002:
mydatabase@eeeeee - analyze: false, vacuum: true
server003:
mydatabase@fffffff - analyze: false, vacuum: true
mydatabase@ggggg - analyze: false, vacuum: true

success:
server002:
mydatabase@aaaaaa - analyze: true, vacuum: true
mydatabase@bbbbbb - analyze: true, vacuum: true
server003:
mydatabase@ccccccc - analyze: false, vacuum: true
mydatabase@dddddd - analyze: false, vacuum: true
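A sketch of one way to flatten this with SPL (assuming the JSON arrives as a single event; the index and sourcetype names here are hypothetical). Because the server names are JSON keys, each server needs its own spath path, for example for the fail list of server-003:

```
index=main sourcetype=db_maintenance
| spath path="DbMaintenanceDailyRoutineSummary.fail{}.server-003{}" output=entries
| mvexpand entries
| spath input=entries
| eval line = warehouse . " - analyze: " . analyze . ", vacuum: " . vacuum
| stats list(line) as server_003_fail
```

Repeating this per server and per status, then stitching the pieces together with eval/mvappend, yields the email body; a custom alert action is another option if the set of server keys varies over time.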
Hi all, I have a problem that is described many times in the Splunk docs, but I didn't find my use case. I have to send all my logs from a Heavy Forwarder to an Indexer and to a third-party system via syslog: the Indexer must receive all the logs, and the third-party system must receive a subset of the data (three sourcetypes) over syslog (UDP). I used the available documentation (https://docs.splunk.com/Documentation/Splunk/8.0.4/Forwarding/Forwarddatatothird-partysystemsd and https://docs.splunk.com/Documentation/Splunk/latest/Admin/Outputsconf), but the result is that I'm sending all the logs both to the Indexer and to syslog; in other words, I am not able to filter the syslog output. These are my conf files on the HF.

outputs.conf:

[tcpout]
defaultGroup = Nothing
indexAndForward = 0

[tcpout:Splunk]
server = 1.1.1.1:9997

[tcpout-server://1.1.1.1:9997]

[syslog]
defaultGroup = syslog

[syslog:syslog]
type = udp
server = 2.2.2.2:514

I tried with and without defaultGroup on Splunk and syslog; then I tried to add syslogSourceType = sourcetype::sourcetype1/2/3 to the syslog stanza to filter data.

props.conf:

[sourcetype1]
TRANSFORMS-routing = Splunk,syslog

[sourcetype2]
TRANSFORMS-routing = Splunk,syslog

[sourcetype3]
TRANSFORMS-routing = Splunk,syslog

I also tried adding TRANSFORMS-routing = Splunk for all the other sourcetypes, to send them to the Indexer but not to syslog. Then I tried to use two TRANSFORMS stanzas.

transforms.conf:

[Splunk]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = Splunk

[syslog]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = syslog

I also tried using three stanzas, one per sourcetype, and adding a regex for each sourcetype. In the end, I still get all the data both on the Indexer and on syslog! Can anyone help me understand where I'm going wrong? Ciao and thanks. Giuseppe
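A configuration along these lines (a sketch; the IPs and sourcetype names come from the question, the group names are illustrative) sends everything to the indexer by default and adds the syslog destination only for the three sourcetypes. The key points are that defaultGroup = syslog in the [syslog] stanza routes every event to syslog regardless of transforms, and that only the sourcetypes destined for syslog need a TRANSFORMS entry:

outputs.conf:

```
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = 1.1.1.1:9997

[syslog:syslog_out]
type = udp
server = 2.2.2.2:514
```

props.conf (only the three sourcetypes, no stanza for the others):

```
[sourcetype1]
TRANSFORMS-routing = send_to_syslog
```

transforms.conf:

```
[send_to_syslog]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = syslog_out
```

With no defaultGroup under [syslog], events reach the syslog group only when _SYSLOG_ROUTING is set by a transform, while the tcpout defaultGroup still delivers all events to the indexer.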
I tried to install Enterprise Security on my own computer, but after the first restart of Splunk Enterprise, the second restart (after the ES initial configuration) didn't complete. I have a PC with 32 GB RAM and 4 CPUs. Is that a problem? Is it possible to install ES in an all-in-one configuration? Thanks, Mauro
Hello Team, as part of our Splunk Enterprise upgrade to version 7.3.5, we have analyzed all the TAs and apps currently installed on our search heads and found that the following apps are not compatible with the version we are planning to upgrade to, and are not supported by Splunk. Please let us know your feedback and possible solutions so that we can ensure they remain functional.

Name | Existing Version | Support | Compatible to 7.3.5 with current version | Upgrade required for 7.3.5 | Upgrade version for 7.3.5
Base64 | 1.1 | Non Splunk | No | Yes | No upgrade version available
Custom Visualization - donut | 1.0.2 | Non Splunk | No | Yes | No upgrade version available
Endace Fusion Connector | 2 | Non Splunk | No | Yes | No upgrade version available
Splunk Add-on for Cisco IPS | 2.1.6 | Non Splunk | No | Yes | No upgrade version available
Splunk Add-on for Microsoft Active Directory | 1.0.0 | Non Splunk | No | Yes | No upgrade version available
Splunk Add-on for Microsoft SQL Server | 1.3.0 | Splunk | No | Yes | No upgrade version available
Splunk Add-on for NetFlow | 3.0.1 | Non Splunk | No | Yes | No upgrade version available
Splunk_TA_oracle |  | Splunk | No | Yes | No upgrade version available
URL Toolbox | 1.6 | Non Splunk | No | Yes | No upgrade version available
XS Visualization | 1.2.1 | Non Splunk | No | Yes | No upgrade version available
Custom Visualization - donut | 1.0.2 | Non Splunk | No | Yes | No upgrade version available
Hadoop Connect | 1.2.5 | Non Splunk | No | Yes | No upgrade version available
Heatmap | 1.0.1 | Non Splunk | No | Yes | No upgrade version available
timewrap | 2.4 | Non Splunk | No | Yes | No upgrade version available
Network Topology for Splunk | 1.1 | Non Splunk | No | Yes | No upgrade version available
Splunk Add-on for Microsoft SQL Server | 1.3.0 | Splunk | No | Yes | No upgrade version available

Please let us know if you require any other details.
Hello, I need your assistance on the following questions: 1. If Appdynamics provides us with a synthetic license, how do we add the new synthetic license to the controller? 2. Is it a new license.lic file, which I have to place in the controller home directory? Please let me know. Thanks, Emrul
Hello, I have a Windows CA Server to sign my own requests. For the Web Certificate I have used the "Web Server" template. What template should I use for the server.pem? What purposes should it include? Thank you
I have an MS SQL server writing audit data to a .sqlaudit file. I need to get this data into Splunk. I have DB Connect installed, but I'm not sure how to ingest the .sqlaudit file data. Do I use DB Connect or the UF?
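One hedged option (assuming the audit files are readable from the SQL Server instance; the file path below is illustrative): .sqlaudit files are binary, so a plain file monitor on a UF will not parse them. SQL Server can read them back with the built-in sys.fn_get_audit_file function, which DB Connect can run as a database input:

```
SELECT event_time, action_id, session_server_principal_name,
       database_name, object_name, statement
FROM sys.fn_get_audit_file('C:\AuditLogs\*.sqlaudit', DEFAULT, DEFAULT);
```

With event_time as the rising column, DB Connect can ingest new audit records incrementally instead of re-reading the files.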
Hello, I have 2 questions about AppDynamics.

1. I want to retrieve the following definitions with the API and then automatically open a ticket in ServiceNow, but I can't get the result I want with the API. How can I get the definitions here? If there are sample documents, can you share them? The "Name" and "CallPerMin" values that appear in the picture are required, with a specific date range. When I look at Chrome F12, these are the parameters that respond when the corresponding URL is called. I want to automatically create a ServiceNow request based on monthly response times.

2. When I test with Postman, I get continuous 500 or 401 errors. I searched on the internet but didn't find much.
I have a user who needs to be able to write one specific lookup table, which has to be shared globally. I have to control this with the permission settings on the lookup, as I have several users/roles where each role must be granted write access to a different lookup table. What I have observed so far: in order to write a lookup table with | outputlookup xxxx.csv, the user needs the output_file capability. This capability is by default granted through the predefined user role. With the output_file capability, the permissions configured in local.meta are ignored: outputlookup happily writes any lookup file, regardless of the permissions, even files which don't exist. What am I missing here?
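For reference, a per-lookup write ACL in the app's metadata/local.meta would look like the sketch below (the role name is illustrative; the lookup file name is the one from the question). This documents the intended permission model; the question reports that outputlookup combined with the output_file capability bypassed it in practice:

```
[lookups/xxxx.csv]
access = read : [ * ], write : [ role_lookup_writer ]
export = system
```

export = system shares the lookup globally, while the write list restricts who should be able to modify it.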
I want to improve the response time of a search on a rare event when filtering on a date field. Note: I am only interested in the latest 500 events. Here "local_date" is a field other than _time (_time is set from another date field in the event, which we can't change).

Scenario 1:
index="ABC" earliest=-120d sourcetype="XYX" flag="Y" xxnumber="8XV5F5FF4" | head 500 | fields id
This search completed and returned 6 results by scanning 18 events in 0.254 seconds.

Scenario 2:
index="ABC" earliest=-120d sourcetype="XYX" flag="Y" local_date="2020-06-01 00:00:00" | head 500 | fields id
This search completed and returned 22 results by scanning 469,911 events in 25.058 seconds.

Scenario 3 (rare events):
index="ABC" earliest=-120d sourcetype="XYX" flag="Y" local_date>="2020-06-15 00:00:00" | head 500 | fields id
This search completed and returned 57 results by scanning 2,943,130 events in 67.789 seconds.

If we use any filter other than the date filter, the search takes less than a second. Can anyone suggest how I can get better response times with date filters?
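One likely reason scenario 1 is fast is that the xxnumber value is a raw indexed token, so the indexer can skip most events, whereas local_date is extracted at search time and forces a scan. A sketch (assuming the date string appears verbatim in the raw events, delimited by spaces) is to add the literal string as an indexed-term filter in front of the field comparison:

```
index="ABC" earliest=-120d sourcetype="XYX" flag="Y"
    TERM(2020-06-01) local_date="2020-06-01 00:00:00"
| head 500
| fields id
```

TERM() matches the hyphenated date as a single indexed token, pruning events before field extraction. For range filters like local_date>=..., indexed terms cannot express the range directly; options include making local_date an indexed field or using an accelerated data model.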
I have the same source path on 2 different hosts and I want to set up a different sourcetype for each server. How do I do this?

host: fela01u
source: /apps/test/*/stage/logs/*.log
sourcetype: fel_log

host: cola01u
source: /apps/test/*/stage/logs/*.log
sourcetype: col_log

When I search using sourcetype 'col_log', it should give me the results for 'cola01u'.
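Since inputs.conf is evaluated on each forwarder independently, one straightforward sketch (assuming a forwarder on each host; paths and sourcetype names are from the question) is to deploy a different inputs.conf to each host, for example via deployment server apps scoped per host:

On fela01u:

```
[monitor:///apps/test/*/stage/logs/*.log]
sourcetype = fel_log
```

On cola01u:

```
[monitor:///apps/test/*/stage/logs/*.log]
sourcetype = col_log
```

The same monitor path then yields a different sourcetype depending on which host ingested the file.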
Hi All, currently we are using 1.0.3, but we want to update to the latest version. What steps do we need to follow? Please suggest. Thanks
Hi @gcusello, we are looking to monitor Azure logs from Splunk. The following are the user's requirements: logs related to our CDN and blob storage to start with, possibly expanding to any Azure alerts or logs relating to our servers, load balancer, or other resources. Can you suggest how we should proceed? Thanks.
Hi, when we open Splunk DB Connect, we see the message "The Java Bridge server is not running". How can I troubleshoot this issue? We are using Splunk Enterprise 7.1.3.
I've created a summary index containing 6 eval cases, for example: eval 1=case(match(something,"a"), ..., "b", "c"), eval 2=case(d,e,f), ..., eval 6=case(x,y,z), where a,b,c,...,x,y,z are the individual detailed functions and 1,2,3,4,5,6 are overall functions. I have combined all the eval results into a single multivalue field using eval Total_Function = mvappend(1,2,3,4,5,6). But I want a table listing both the overall function and the individual detailed function, and I am not sure how to get the individual detail values into the table alongside the overall function. Expected table:

Time | Total_Function | Overallfunction | Individualfunction
XX | Total_Function | 1 | a
YY | Total_Function | 1 | b
ZZ | Total_Function | 1 | c
AA | Total_Function | 6 | x
BB | Total_Function | 6 | y
CC | Total_Function | 6 | z

Kindly help me, please. (Note: there are multiple individual functions in each eval case.)
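One hedged approach (a sketch; function_1...function_6 stand in for the numeric eval field names from the question, which are awkward in SPL) is to encode each overall/individual pair as one multivalue entry, expand, and then split the pair back out:

```
| eval pair_1 = "1," . function_1
| eval pair_6 = "6," . function_6
| eval Total_Function = mvappend(pair_1, pair_6)
| mvexpand Total_Function
| rex field=Total_Function "^(?<Overallfunction>[^,]+),(?<Individualfunction>.+)$"
| table _time Overallfunction Individualfunction
```

Keeping the overall number and the individual value in the same multivalue entry is what preserves their pairing through mvexpand; expanding two separate multivalue fields would lose the association.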
Hello there, I recently created a new trial instance of Splunk Cloud. When I try to log in to the application, it does not allow me to log in; I get a "login failed" error.