
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I have scoured the forums and checked web_service.log, but I can't seem to figure out what my problem is. What should I be looking for?

Checking prerequisites...
	Checking http port [8000]: open
	Checking mgmt port [8089]: open
	Checking appserver port [127.0.0.1:8065]: open
	Checking kvstore port [8191]: open
	Checking configuration... Done.
	Checking critical directories... Done
	Checking indexes...
		Validated: _audit _internal _introspection _metrics _metrics_rollup _telemetry _thefishbucket history main nagios nix perfmon summary windows
	Done
	Checking filesystem compatibility... Done
	Checking conf files for problems... Done
	Checking default conf files for edits...
	Validating installed files against hashes from '/opt/splunk/splunk-8.1.2-545206cc9f70-linux-2.6-x86_64-manifest'
	All installed files intact.
	Done
All preliminary checks passed.
Starting splunk server daemon (splunkd)...
Done
Waiting for web server at https://127.0.0.1:8000 to be available..
I have an alert action that was created with the Splunk Add-on Builder in our test environment, and now that it is in production it does not log any events to the cim_modactions index. The SA_SPLUNK_CIM app is installed and the index exists in Splunk, but it is empty. I copied the SA_SPLUNK_CIM app to our QA environment, and there it does log events. Has anyone had an issue with the cim_modactions index not getting events logged to it?
I have 4 applications integrated with each other; call them A, B, C, and D. Each application has a different index, sourcetype, and source. When I call my API from application A, the call traverses B, C, and D to perform some operations. I want to trace the error logs of all these applications in one place in Splunk. Also, when an error occurs, a common reference id is logged in each application's logs. How do I achieve this? Is there a straightforward concept for this type of scenario? What I am imagining is something like this:

Ref ID | Application A Logs | Application B Logs | Application C Logs | Application D Logs
1234 | <logs of app A for id 1234> | <logs of app B for id 1234> | <logs of app C for id 1234> | <logs of app D for id 1234>
4567 | <logs of app A for id 4567> | <logs of app B for id 4567> | <logs of app C for id 4567> | <logs of app D for id 4567>

Or is there a better way of doing this? I am trying to create a dashboard panel with this result.
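A hedged sketch of one way to pivot the four applications on the shared reference id: search all four indexes at once and group with stats, which avoids join/subsearch row limits. The index names (app_a … app_d) and the extracted field name refId are assumptions; substitute your own.

```spl
index=app_a OR index=app_b OR index=app_c OR index=app_d
| stats values(eval(if(index=="app_a", _raw, null()))) AS "Application A Logs",
        values(eval(if(index=="app_b", _raw, null()))) AS "Application B Logs",
        values(eval(if(index=="app_c", _raw, null()))) AS "Application C Logs",
        values(eval(if(index=="app_d", _raw, null()))) AS "Application D Logs"
        by refId
```

This assumes refId is already extracted in every sourcetype; if not, a rex before the stats can pull it out of _raw.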
Hello All, I need some help forwarding vCenter logs to Splunk. There are multiple articles, which creates more confusion.

- Per the articles, we need a Data Collection Node (DCN). Can I create the DCN on my Heavy Forwarder?
- What is the process to install and configure the .ova file to create a DCN on a Heavy Forwarder?

Thanks in advance,
Noob here. Can anyone tell me why the following search:

search sourcetype=srt | table serialNumber

gives me a one-column table of serial numbers as expected, while the same query in subsearch brackets:

[search sourcetype=srt | table serialNumber]

does not return the table I expect, but rather returns the full records?
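For background: a subsearch is not evaluated as a table. Its result rows are rewritten into search terms for the outer search (roughly (serialNumber="A") OR (serialNumber="B") …), so the outer search matches whole events and returns full records. A hedged sketch of the usual pattern (the index name is a placeholder):

```spl
index=main [ search sourcetype=srt | fields serialNumber | dedup serialNumber ]
| table serialNumber
```

Using fields inside the subsearch keeps the field name, so the generated terms are serialNumber=<value> pairs rather than bare values matched anywhere in the event.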
Hello Fellas! I have been trying for many days to use the values stored in a field as the values to search for in another subset of a multisearch, without any luck; I hope I am making myself understood.

What I want to do:
1) store the IDs from the first search in a field named START
2) use all the IDs in the field START to run another search, which requires the field id_user

What I'm doing:

| multisearch
    [| search index="medi" AND bloodp="high" AND id_user=* AND facility=5 | eval START=id_user]
    [| search index="medi" AND bloodp="high" AND id_user=START AND facility=6 AND trx=* | eval treatmentchose=trx]

I cannot seem to use the IDs from facility 5, stored in the field START, to search for the medication that was given to the patient in facility 6. Can someone please help me?
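For what it's worth, multisearch runs its legs independently, so values cannot flow from one leg to the other. A subsearch can do that hand-off: the inner search's id_user values become search terms for the outer one. A hedged sketch:

```spl
index="medi" bloodp="high" facility=6 trx=*
    [ search index="medi" bloodp="high" facility=5 id_user=*
      | fields id_user
      | dedup id_user ]
| eval treatmentchose=trx
```

Mind the default subsearch limits (on the order of 10,000 rows / 60 seconds); with more patients than that, a lookup- or stats-based correlation is the usual fallback.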
Is it possible to make a legend wider? I'm trying to use HTML in the XML file to assign a wider width, but it doesn't seem to be working.
I have a csv file that is written to once a day. The input points to a custom sourcetype [csvtest], which has appropriate settings for the data within.

inputs.conf:

[monitor://c:\opt\splunk\etc\apps\csvtest\data\csvtest.csv]
index = main
sourcetype = csvtest

props.conf:

[csvtest]
LINE_BREAKER = ([\r\n]+)
SHOULD_LINEMERGE = false
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d
MAX_TIMESTAMP_LOOKAHEAD = 15
TRUNCATE = 10000

and the data looks like this:

2020-01-26,,,1.0,,,,,,,,9,,,,,,,,,,
2020-01-27,,,2.0,,,,,,,,19,,,,,,,,,,
2020-01-28,,,1.0,1.0,,,1.0,,,,11,,,,,,,,,,
2020-01-30,,,0.0,2.0,,,2.0,,,,27,,,,,,,,,,
2020-01-31,,,0.0,2.0,,,2.0,,,,17,,,,,,,,,,
2020-02-03,,,0.0,3.0,,,3.0,,,,29,,,,,,,,,,
2020-02-04,90.0,12.0,0.0,3.0,,,3.0,139.0,,,34,,,,,,,,,,
2020-02-05,96.0,8.0,0.0,3.0,,,3.0,150.0,,,43,,,,,,,,,,
2020-02-06,104.0,0.0,0.0,3.0,,,3.0,169.0,,,62,,,,,,,,,,
2020-02-08,130.0,25.0,0.0,3.0,,,3.0,197.0,,,39,,,,,,,,,,
2020-02-10,167.0,81.0,0.0,3.0,,,3.0,259.0,,,8,,,,,,,,,,
2020-02-11,184.0,79.0,0.0,3.0,,,3.0,285.0,,,19,,,,,,,,,,
2020-02-12,257.0,44.0,0.0,2.0,1.0,,3.0,313.0,,,9,,,,,,,,,,
2020-02-13,306.0,16.0,0.0,2.0,1.0,,3.0,340.0,,,15,,,,,,,,,,
2020-02-14,353.0,0.0,0.0,2.0,1.0,,3.0,364.0,,,8,,,,,,,,,,
2020-02-17,399.0,0.0,0.0,2.0,1.0,,3.0,402.0,,,0,,,,,,,,,,
2020-02-18,418.0,0.0,0.0,2.0,1.0,,3.0,421.0,,,0,,,,,,,,,,
2020-02-19,436.0,0.0,0.0,2.0,1.0,,3.0,456.0,,,17,,,,,,,,,,
2020-02-20,462.0,,0.0,1.0,2.0,,3.0,479.0,,,14,,,,,,,,,,
2020-02-21,483.0,,0.0,0.0,3.0,,3.0,498.0,,,12,,,,,,,,,,
2020-02-22,540.0,,0.0,1.0,3.0,,4.0,553.0,,,9,,,,,,,,,,

Every time a new line is added to the bottom of this file, the whole file is ingested, causing duplicate data in the index. I have created the same scenario on Linux, and there the forwarder appropriately uses the default initCrcLength, identifies that it has seen the file before, and ingests only the new event.
In the _internal index, I am seeing events like this:

03-23-2021 20:00:02.851 -0400 INFO WatchedFile - Will begin reading at offset=0 for file='C:\opt\splunk\etc\apps\csvtest\data\csvtest.csv'.

Is this just a Windows forwarder symptom? Is there an inputs.conf setting I can use to fix this? Thanks in advance!
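offset=0 in that WatchedFile line means the forwarder decided the file is new, which on Windows often happens because the writing application replaces the file (changing its CRC fingerprint) rather than appending. A hedged experiment, not a guaranteed fix, is to lengthen the CRC window on the monitor stanza in inputs.conf:

```ini
[monitor://c:\opt\splunk\etc\apps\csvtest\data\csvtest.csv]
index = main
sourcetype = csvtest
# Fingerprint a longer prefix of the file than the 256-byte default,
# so files that merely share a similar beginning are still told apart:
initCrcLength = 1024
```

If the application truly rewrites the whole file each day, no CRC setting will help; checking whether the file's creation time changes between days (or asking the producer to append instead of rewrite) is a way to confirm which case you have.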
I have a query result. I want to append one of three colors based on the values, and the table is dynamic based on the time frame.

The Messages and Nov 20 columns should stay unchanged:

Messages | Nov 20 | Dec 20 | Jan 20 | Feb 20
Messge 0 | 0 | 1 | 0 | 0
Messge 1 | 1 | 3 | 1 | 1
Messge 2 | 11 | 0 | 0 | 0
Messge 3 | 1 | 0 | 0 | 0
Messge 4 | 9 | 5 | 0 | 0
Messge 5 | 1 | 1 | 0 | 0
Messge 6 | 1 | 1 | 0 | 0
Messge 7 | 0 | 1 | 0 | 0

For the Dec 20, Jan 20, and Feb 20 columns, the field value should be appended with a color value:

Messages | Nov 20 | Dec 20 | Jan 20 | Feb 20
Messge 0 | 0 | 1 GREEN | 0 RED | 0 YELLOW
Messge 1 | 1 | 3 GREEN | 1 RED | 1 YELLOW
Messge 2 | 11 | 0 RED | 0 YELLOW | 0 YELLOW
Messge 3 | 1 | 0 RED | 0 YELLOW | 0 YELLOW
Messge 4 | 9 | 5 RED | 0 YELLOW | 0 YELLOW
Messge 5 | 1 | 1 YELLOW | 0 RED | 0 YELLOW
Messge 6 | 1 | 1 YELLOW | 0 RED | 0 YELLOW
Messge 7 | 0 | 1 GREEN | 0 RED | 0 YELLOW
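A hedged SPL sketch, inferring the rule from the sample (value higher than the previous month = GREEN, lower = RED, unchanged = YELLOW); the month column names are taken literally from the example. Evaluating the latest month first keeps each comparison numeric, since appending the color turns a column into a string:

```spl
| eval "Feb 20" = 'Feb 20'." ".case('Feb 20' > 'Jan 20', "GREEN", 'Feb 20' < 'Jan 20', "RED", true(), "YELLOW")
| eval "Jan 20" = 'Jan 20'." ".case('Jan 20' > 'Dec 20', "GREEN", 'Jan 20' < 'Dec 20', "RED", true(), "YELLOW")
| eval "Dec 20" = 'Dec 20'." ".case('Dec 20' > 'Nov 20', "GREEN", 'Dec 20' < 'Nov 20', "RED", true(), "YELLOW")
```

Since the columns change with the time frame, a dynamic version would need foreach over the month columns; and actually coloring the cells in a dashboard table takes a format/color definition on top of this.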
What .conf files do I change (and at what path) to add new Windows event codes such as 4726? On which Splunk server is this done?
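Assuming you mean the Windows Event Log input: the change goes in inputs.conf on whichever instance collects the events, typically the universal forwarder on the Windows host (or, with a deployment server, in the app pushed to those forwarders). A hedged sketch:

```ini
[WinEventLog://Security]
disabled = 0
# Comma-separated EventCodes (regexes are also accepted); add 4726 etc. here:
whitelist = 4624,4625,4726
```

After editing, restart the forwarder for the change to take effect.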
Hi Community, I am trying to set a token based on a search in an ITSI glass table, but I cannot find any way to do it "dynamically", even by changing the JSON code of the glass table.

My final goal is to use the token value to set the color of an icon based on the value returned by the related search. From what I found, the only option to colorize an icon with a dynamically changing value is to use a token (and this works with a static token set via a text input holding a hex color value, e.g. "#FFFFFF"). The only reference documentation I can find is https://docs.splunk.com/Documentation/ITSI/4.8.0/SI/Inputs#How_inputs_connect_to_visualizations

Below is the code of the icon, with its color set to a token named "vizcolor":

{
    "type": "viz.singlevalueicon",
    "options": {
        "showValue": false,
        "icon": "splunk-enterprise-kvstore://6001e599aea29f5df6382024",
        "color": "$vizcolor$"
    },
    "dataSources": {
        "primary": "ds_XXXXXX"
    }
}

In conclusion, I would like to know if any of you can suggest how to set a token from a search result on a glass table (examples are appreciated). Thanks, G.P.
I have been working on a project in Splunk along with a coworker. He has the ability to convert his dashboard code from Splunk XML to HTML using the ellipsis button in the top right corner, but I don't. I tried deleting and re-installing a fresh instance of Splunk v8.1.3 on my MacBook (macOS Big Sur), but the issue persists. My coworker did the same on his computer and it still worked for him. Does anyone have any solutions to offer, or has anyone run into a similar issue?
Hi all- we want to get a bit more elegant with correlation searching between two different indexes. There seem to be a lot of different approaches, but ultimately this is what we are trying to do:

1) We have a set of events returned from a firewall index search.
EXAMPLE: (index=XXXXXX) level=warning host="XXXXXXXX" category="Malicious Websites" | stats count by srcip

2) We have the record of the IP in question in our DHCP index.
EXAMPLE: index="dhcp" host="XXXXXXXX" | stats count by ip, hostname

What is the most elegant approach to searching so that values from our firewall report are returned together with the hostname information listed in DHCP? I assume I would need to use the rename command to ensure srcip and ip match up. I see a lot of different ways to potentially achieve this and could use some direction on the simplest path to take (i.e., a subsearch?).

Desired end result: a report that lists firewall data including both IP and hostname at the time of the log (versus what a DNS lookup would provide now), preserving and confirming which IP was assigned to which hostname at the time of the firewall log.
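One common pattern is to combine both indexes in a single search and group with stats, which sidesteps subsearch/join row limits. A hedged sketch using your placeholder names (the coalesce assumes srcip and ip carry the same address format):

```spl
((index=XXXXXX) level=warning host="XXXXXXXX" category="Malicious Websites")
OR (index="dhcp" host="XXXXXXXX")
| eval ipaddr=coalesce(srcip, ip)
| stats values(hostname) AS hostname,
        count(eval(index!="dhcp")) AS firewall_events
        by ipaddr
| where firewall_events > 0
```

Strictly matching "the hostname at the time of the log" requires the DHCP lease that was active at the firewall event's timestamp; for that, a time-aware refinement (e.g. streamstats over the combined, time-sorted events to carry the last-seen hostname forward) is the usual next step.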
Hello everyone, I have a situation: I would like to read a lookup and, for each row that matches a search criterion, send an email. For example, I have the lookup mails.csv with content like this:

Email | send
test1@gmail.com | NO
test2@gmail.com | NO

I would like to send an email for each row whose send column contains the word "NO", and after that change the state to "YES". I have this search:

| inputlookup append=t mails.csv
| search send = NO
| eval email_subj = subject
| eval body = email_subj
| eval email_to = "desk@xxx.xx"
| table email_subj, email_to, body
| sendresults    --->>> At this point, it only sends the first result found
    [search lookup cm.csv | fields subject | rename subject AS email_subj | eval send= "YES" | eval asunto=email_subj | table asunto, sender, send]
| outputlookup cm.csv    --->>> Changes the value of any field with the word NO and deletes the others

Thanks in advance.
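If the goal is one email per matching row, a hedged option is map, which re-runs a search once per input row (subject is assumed to be a column in mails.csv; note map's maxsearches cap and the escaped quoting):

```spl
| inputlookup mails.csv
| search send="NO"
| map maxsearches=100 search="| makeresults
    | sendemail to=\"desk@xxx.xx\" subject=\"$subject$\" message=\"$subject$\""
```

A second, separate search (your existing outputlookup step) can then flip send to YES; flipping it in the same pipeline risks marking rows sent before the emails actually go out.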
I have a data source that I collect using DB Connect from an Oracle database, and it brings the information in JSON format. However, using regular expressions on it is not successful, and neither is the field extraction assistant. The problem is that it is a vector of vectors: each log in JSON format in turn contains more JSON logs. Has anyone run into something similar?
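If the events really are JSON-in-JSON (an outer document whose fields hold serialized JSON strings), regex fights the structure; running spath twice is usually easier. A hedged sketch with made-up names (my_index, payload):

```spl
index=my_index sourcetype=oracle:json
| spath output=inner path=payload
| spath input=inner
```

The first spath pulls the embedded JSON string into the field inner; the second parses that string into fields. For arrays, payload{} addresses the elements.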
Hi all, I have a table like this:

_time | file1.txt | file2.txt | file3.txt | *.txt
1472160022 | 1472160022 | 1472160000 | 1472160099 | ...
1472160024 | 1472160100 | 1472160300 | 1472160040 | ...
... | ... | ... | ... | ...

The filename columns are all of the format *.txt, but there are so many of them, and they can change in the future, that I don't want to hardcode them. I would like to subtract each *.txt column from the _time column. I basically want to do:

eval *.txt = _time - *.txt

which would theoretically give these values:

_time | file1.txt | file2.txt | file3.txt | *.txt
1472160022 | 0 | 22 | -77 | ...
1472160024 | -76 | -276 | -16 | ...
... | ... | ... | ... | ...

But I can't seem to use a wildcard in the subtraction in the eval. Any ideas are much appreciated!
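eval itself cannot wildcard field names, but foreach can: the <<FIELD>> token is substituted with each matching column name. A sketch, assuming every *.txt column is numeric (the single quotes on the right-hand side are needed because the field names contain dots):

```spl
| foreach *.txt [ eval <<FIELD>> = _time - '<<FIELD>>' ]
```

This rewrites every *.txt column in place with _time minus its original value, matching the desired output in the question.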
I have a field with similar values:

myField
JCH Corn
JCH Carrot
JCH Apple
ME/Orange

I would like to populate a new field depending on the value:

if myField="JCH Corn" then myNewField="Corn"
if myField="JCH Carrot" then myNewField="Carrot"
if myField="JCH Apple" then myNewField="Apple"
if myField="ME/Orange" then myNewField="Orange"

myNewField
Corn
Carrot
Apple
Orange

I was thinking I could do an eval-if statement, but I am not sure what is best in this situation. Thanks!
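A hedged eval/case sketch that hardcodes the sample mapping:

```spl
| eval myNewField=case(myField=="JCH Corn",   "Corn",
                       myField=="JCH Carrot", "Carrot",
                       myField=="JCH Apple",  "Apple",
                       myField=="ME/Orange",  "Orange")
```

If the real rule is simply "drop everything up to the last space or slash", a single eval myNewField=replace(myField, "^.*[ /]", "") scales better than enumerating every value.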
Hi, please help me with a regex to capture only the highlighted data:

z+o.in_XTY_PREDICTION_S1.gpg.1.txt.1.20210219090217
p+d.zwryun.yhudatei.600.gpg.1.20210127014546.gpg
t+d.tcoyuing.stkmopini.600.2.20210127042957.gpg
a+p.zpitdap1.in0000ci.600.6. 20210127042957.gpg
n+o.in_satght.poi.mo.syh.gpg.1.txt.1
a+o.deniedin_com.dat.1
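Assuming the highlighted portion is the 14-digit timestamp, a hedged rex sketch (the last two lines, which carry no timestamp, simply won't get the field; swap field=_raw for whichever field holds the filename):

```spl
| rex field=_raw "(?<file_ts>\d{14})"
```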