All Topics

Issue: unable to add a search peer from a search head using distributed search; I get "No route to host" or "Connection refused".

We have five instances: a search head / license master (SH LM), an indexer, a search head running Enterprise Security (SH ES), a heavy forwarder, and a deployment server. All VM instances are created, and we are now adding the indexer as a search peer from both SH LM and SH ES.

From SH LM we cannot telnet to the indexer on port 8089, but the reverse direction works. Telnet from SH ES to SH LM also connects, yet neither SH LM nor SH ES can telnet to the indexer on port 8089.

While trying to add the new peer: with https://ipaddress:8089 we get "No route to host"; with https://hostnameofindexer:8089 we get "Connection refused".

Splunk version: 8.0, on VMware ESXi, OS: CentOS 8. This issue is very critical, as the whole project is stuck now.
Hi Splunkers, I need to send data through a HEC token to both an on-prem and a Splunk Cloud instance. Please help me with some pointers. Thanks in advance.
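For what it's worth, sending the same event to both instances usually just means POSTing it to two HEC endpoints. Below is a minimal Python sketch using only the standard library; the endpoint URLs and tokens are placeholders, not real values:

```python
import json
import urllib.request

# Placeholder endpoints and tokens -- substitute your own.
ENDPOINTS = [
    ("https://onprem.example.com:8088/services/collector/event", "ONPREM-TOKEN"),
    ("https://http-inputs-mystack.splunkcloud.com:443/services/collector/event", "CLOUD-TOKEN"),
]

def build_hec_request(url, token, event, sourcetype="_json"):
    """Build one HEC POST request with the standard Splunk auth header."""
    body = json.dumps({"event": event, "sourcetype": sourcetype}).encode("utf-8")
    headers = {"Authorization": "Splunk " + token,
               "Content-Type": "application/json"}
    return urllib.request.Request(url, data=body, headers=headers, method="POST")

def send_everywhere(event):
    """POST the same event to every configured HEC endpoint."""
    for url, token in ENDPOINTS:
        req = build_hec_request(url, token, event)
        with urllib.request.urlopen(req) as resp:  # HEC answers 200 on success
            print(url, resp.status)
```

The same pattern works for any number of destinations; in production you would add retries and error handling per endpoint.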
Hi, I want to display two charts, a column chart and a line chart, in a single panel based on a condition: if result=found, show a column chart of each user's values for all months; if result="not found", show a line chart instead.

My data looks like this:

Users  result     Jan  Feb  Mar  Apr  May  Jun  Jul  ...  Dec
ABC    found      100  102  103  102  100  105  200  ...  70
ABC    Not-Found  20   50   22   30   60   22   43   ...  10
XYZ    found      120  80   70   ...
XYZ    Not-Found  24   30   15   ...

So I want a column chart when result=found, with months on the x-axis and the monthly values on the y-axis, split by user, and in the same panel a line chart when result=not found, with the same axes. Please suggest.
Hi, has anyone here experienced the same problem we have? It suddenly happened that the Service Analyzer we're working on turned into a blank page. The search bar and the buttons are still there, and even the tabs for Deep Dive, Glass Table, and so on are still visible, but the Service Analyzer itself is just a blank white page. Hoping to get answers here. By the way, we are using version 4.3.0.
I want to create a single graph, but with a different color for each of two searches. Creating an overlay of the two searches is easy with OR, but how can I give the plots distinct colors? Currently I'm stuck at vhost=x OR vhost=y. Thanks for the help.
Hello, I am trying to collect logs from 25-30 computers on my local LAN and learn how to use Splunk. I have installed a Splunk universal forwarder on one of the Debian PCs (PC1, from which I want to collect the logs) and a Splunk server on another Debian machine (PC2). I am able to see the logs of PC1 on the server, create correlation rules, set alerts, and do the other things I have been reading about in the Splunk documentation. Can anyone please help me with how to install a Splunk universal or heavy forwarder on all 30 PCs on my LAN and start getting logs from all of them? Should I do it manually on every computer, or is there a better way?
Hi all, we've implemented a reverse proxy in our environment for authentication. We're able to log in through our card, but after logging in we can see all the dashboards while the saved searches and queries on the dashboards cannot run. The error shown is "Search could not start/found". This is the change we've made in web.conf:

    root_endpoint = /
    SSOMode = strict
    trustedIP = proxy IP
    remoteUser = user on proxy server
    tools.proxy.on = false
    enableWebDebug = true

Could you please look into this and advise what the reason could be?
I'm calling REST to submit a search job, and it always uses All Time and ignores the earliest. Is there anything wrong with the search query I am posting?

    search=search "index=center host=center* AND sourcetype=abf:afz.log "DebugLogSubmission" "time" earliest=-24h@h | rex time=(?[\d]+) | table _time mytime | head 100"
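One likely cause: when the whole search string is wrapped in quotes, earliest=-24h@h is treated as a literal search term rather than a time modifier. A more reliable pattern is to pass earliest_time (and latest_time) as their own POST parameters to the search/jobs endpoint. A Python sketch of building that request body (the query itself is illustrative):

```python
from urllib.parse import urlencode, parse_qs

def build_search_job_body(query, earliest="-24h@h", latest="now"):
    """Build the form body for POST /services/search/jobs.

    earliest_time/latest_time are sent as separate parameters instead of
    being embedded inside a quoted search string, where they would be
    matched as literal text.
    """
    if not query.lstrip().startswith("search"):
        query = "search " + query  # the jobs endpoint expects a leading command
    return urlencode({
        "search": query,
        "earliest_time": earliest,
        "latest_time": latest,
        "output_mode": "json",
    })

body = build_search_job_body(
    'index=center host=center* sourcetype=abf:afz.log "DebugLogSubmission" | head 100'
)
```

The resulting body would then be POSTed to the Splunk management port (8089) with your session key or credentials.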
Hello, I have to set up AppDynamics with Ruby on Rails. I have added gem 'appdynamics' to the Gemfile and created an appdynamics.yml file with the following details:

    app_name: "My app name"
    tier_name: "ROR"
    node_name: "test"
    controller:
      host: "test"
      port: "test"
      account: "test"
      access_key: "some key"

Please also let me know how I can obtain app_name, tier_name, node_name, and the controller host, port, account, and access_key. I am getting these errors:

1) WARN -- AppDynamics: Unable to start Instrumenter due to a configuration error: app name required
2) AppDynamics: [AppDynamics] [1.1.2] Running AppDynamics in development mode. No data will be reported until you deploy your app.

Can anyone help me with this?
I want to know how to pull data from multiple data sources by ID. The following is an example of the data sources:

    A datasource (A-id, B-id or C-id, B-Manhour or C-Manhour)
    B datasource (B-id, B-subject)
    C datasource (C-id, B-id, C-subject)

Note: the A datasource holds man-hour information, the B datasource holds parent-ticket information, and the C datasource holds child-ticket information. I want the output below:

    id    subject    man-hour
    B-id  B-subject  total of B-Manhour
    B-id  B-subject  total of B-Manhour + total of C-Manhour
    B-id  B-subject  total of C-Manhour

Please tell me how to do it.
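To make the intended roll-up concrete, here is a plain-Python sketch of the same logic with made-up sample rows (the field names and values are assumptions, not the real schema): man-hours are summed per ticket, and child (C) hours are rolled up into their parent (B) ticket.

```python
from collections import defaultdict

# Hypothetical sample rows mirroring the three data sources.
a_rows = [  # man-hour entries, each pointing at a B or C ticket id
    {"ticket": "B1", "hours": 3},
    {"ticket": "C1", "hours": 2},
    {"ticket": "C2", "hours": 4},
]
b_rows = [{"id": "B1", "subject": "parent"}]                      # parent tickets
c_rows = [{"id": "C1", "parent": "B1", "subject": "child-1"},     # child tickets
          {"id": "C2", "parent": "B1", "subject": "child-2"}]

# Sum hours booked directly on each ticket id.
hours = defaultdict(float)
for r in a_rows:
    hours[r["ticket"]] += r["hours"]

# Roll each child's hours up into its parent ticket.
rollup = dict(hours)
for c in c_rows:
    rollup[c["parent"]] = rollup.get(c["parent"], 0) + hours.get(c["id"], 0)

report = [{"id": b["id"], "subject": b["subject"],
           "own_hours": hours.get(b["id"], 0),      # B-Manhour only
           "total_hours": rollup.get(b["id"], 0)}   # B-Manhour + children
          for b in b_rows]
```

In Splunk the equivalent is typically done with lookups or stats over a joined field rather than an explicit loop, but the aggregation being asked for is the same.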
G'day! My Health Post app on my phone shows the data upload succeeded, and the logs show that it's getting 200s in response... but no data shows up in the index configured for my HEC token (per the video). I've checked my token from outside, so it's not a firewall issue. I turned off HTTPS because I'm not currently serving a cert on my HEC port, and I use a reverse proxy to reach the front-end UI. I'm open to suggestions, but I think at this point it may be how the iOS app translates my Splunk URL into a HEC endpoint...
Is there a way I can group a window of 3 time points and add it as a field, with the last two remaining points being ignored? I'm trying to classify time-series patterns using a support vector machine with the Splunk MLTK, and I'm unsure how to get the data into these windows. For example, my data has the _time and amount fields, and I would like to add a windows field:

    _time     amount  windows
    XX:12:XX  6       [6, 8, 4]
    XX:13:XX  8       [8, 4, 4]
    XX:14:XX  4       [4, 4, 3]
    XX:15:XX  4       [4, 3, 2]
    XX:16:XX  3       ...
    XX:17:XX  2       ...
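The windowing itself can be sketched in a few lines of Python, which may help validate the expected output before building the SPL (something like streamstats with window=3 can collect trailing windows in SPL, which you would then align to the leading-window form shown above):

```python
def sliding_windows(values, size=3):
    """Return fixed-size leading windows over a sequence.

    Trailing points that cannot fill a complete window are dropped,
    matching the example table in the question.
    """
    return [values[i:i + size] for i in range(len(values) - size + 1)]

amounts = [6, 8, 4, 4, 3, 2]
windows = sliding_windows(amounts, 3)
# windows -> [[6, 8, 4], [8, 4, 4], [4, 4, 3], [4, 3, 2]]
```

Each row's window starts at that row's own amount and looks forward, so the last size-1 rows have no complete window.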
Hi, I'm getting an installation error on my Windows 10 64-bit machine while installing the .msi package. It says that a DLL required for the setup to complete is not running, and to contact the system administrator. Looking through the forum for answers, I found one from 2018 that suggests installing from the command prompt. I tried that too, but with no luck. Screenshots attached.
How do I auto-refresh a panel in Splunk Enterprise 8.0.0? I tried the refresh.auto.interval option, but it's already deprecated. I tried the refresh attribute too, but it seems to apply only to the whole dashboard, not to an individual panel.
Hello, I created an SPL search that should pull out log entries based on an if-then-else condition, but it does not work correctly. Please find the search below:

    index=cdc_apache OR index=datapower
    | eval tag_name=if(index=="cdc_apache", "ED_ENDI_Digital_Retail_WebS_Test", 'NULL')
    | where (tag=tag_name)
    | where isnum(status)

I need to find log entries from both indexes, index=cdc_apache and index=datapower, but I'm getting log entries only from index=cdc_apache. I realize my SPL query is wrong, but I don't know how to fix it. Please help. Thanks, Sergei Cher
I installed Trend Micro Deep Security on a standalone test server, and everything runs as expected: inputs.conf sets the index to av_int_deepsecurity and the sourcetype to deepsecurity, then props.conf and transforms.conf rewrite the sourcetype to deepsecurity-firewall, deepsecurity-antimalware, etc. Searching in the app shows events with the different sourcetypes: deepsecurity-firewall, deepsecurity-antimalware, etc.

Then I installed Trend Micro Deep Security in a production cluster. I pushed the app to the search heads, indexers, master, and heavy forwarders, but searching in the app shows only events with sourcetype deepsecurity, not deepsecurity-firewall, deepsecurity-antimalware, etc. The standalone test server works fine; the production cluster does not work as expected. What did I do wrong?

Splunk here is 7.1.2. Devices send machine data to a server running syslog-ng, which writes files. These files are monitored by a Splunk forwarder that sends the data to the production cluster's indexers; the same files are also copied by a batch job to the standalone test server. Thank you for your help.
Hi, I just got Splunk Cloud and I want to connect it to a Palo Alto firewall using the API. Which steps do I need to follow? I am a new Splunk user; please let me know.
Currently Splunk's Docker container does not support TZ or TIMEZONE options. I am able to change the time zone for an individual user, but this still causes discrepancies in some log files and times. Additionally, free users who are unable to set a time zone per user are stuck on UTC.
I have events that already carry an EST timestamp, and I don't want Splunk to do any time conversion. What's happening right now is that Splunk converts the time to EST again, thinking it is UTC:

    raw event time:   02/05/2020 02:08:49.074
    Splunk timestamp: 2/4/20 9:08:49.074 PM

I'm looking to have no transformation applied and to keep the raw event time as the timestamp.
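The 5-hour difference in the example is exactly what you'd see if the naive timestamp were read as UTC and then rendered in US/Eastern (UTC-5 in February). A small Python demonstration of that misinterpretation:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

raw = "02/05/2020 02:08:49.074"  # naive timestamp from the raw event
naive = datetime.strptime(raw, "%m/%d/%Y %H:%M:%S.%f")

# If the parser assumes the naive time is UTC...
as_utc = naive.replace(tzinfo=timezone.utc)

# ...then displaying it in US/Eastern shifts it back 5 hours:
shown = as_utc.astimezone(ZoneInfo("US/Eastern"))
print(shown)  # lands on Feb 4, 9:08 PM -- the shifted value the question reports
```

If that's what's happening, the usual remedy is telling Splunk the source's time zone for that sourcetype (e.g. a TZ setting in props.conf on the first full instance that parses the data), though the exact configuration depends on your environment.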
We have a CSV with a field called application and another called ip. Within the ip field there are plain IP addresses and some addresses in CIDR notation. We have hundreds of entries for applications and IPs; below is a smaller dummy version of the list:

    Application       IP
    sec_system        192.168.4.0/26
    sec_system        192.168.1.0/25
    sec_system        192.168.2.0/24
    sec_system        192.168.3.0/24
    internal_system   192.168.2.5
    internal_system   192.168.3.50
    internal_system   192.168.4.32
    internal_system   192.168.1.4
    win_system        192.168.1.50
    win_system        192.168.1.3

Is there a way to match applications/IPs against applications/IPs in CIDR notation? (I've seen people say you need to use transforms.conf or props.conf; I can't use those files because I don't have access to them.)
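The matching logic itself is straightforward, as this Python sketch with a few of the dummy rows shows: treat a bare IP as a /32 network, then test membership. (In SPL, the cidrmatch() eval function performs the same comparison at search time, so no .conf access is needed for that route.)

```python
import ipaddress

# A few rows from the dummy CSV: (application, ip-or-cidr)
rows = [
    ("sec_system", "192.168.2.0/24"),
    ("sec_system", "192.168.4.0/26"),
    ("internal_system", "192.168.2.5"),
    ("win_system", "192.168.1.50"),
]

def matches(ip_str, entry):
    """True if ip_str falls inside entry (a plain IP or a CIDR block)."""
    ip = ipaddress.ip_address(ip_str)
    net = ipaddress.ip_network(entry, strict=False)  # a bare IP becomes a /32
    return ip in net

def applications_for(ip_str):
    """All applications whose IP/CIDR entry covers the given address."""
    return sorted({app for app, entry in rows if matches(ip_str, entry)})

print(applications_for("192.168.2.5"))  # ['internal_system', 'sec_system']
```

Here 192.168.2.5 matches both its exact internal_system entry and the sec_system 192.168.2.0/24 block, which is the overlap the question is asking how to detect.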