All Posts


There are no packet errors on the UF:

ip -s link show ens192
2: ens192: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000
    link/ether 00:50:56:bb:07:59 brd ff:ff:ff:ff:ff:ff
    RX: bytes    packets  errors  dropped  overrun  mcast
    13730859826  7324421  0       0        0        358
    TX: bytes    packets  errors  dropped  carrier  collsns
    1976804117   6163908  0       0        0        0
Hello, thanks for your help. Until now we were using a single deployment of Splunk (indexer, search head, and data inputs) on the same box. We have just started to split the roles by deploying a new search head. By "the search is not working" I meant that the service is up and running and we can log on to it, but searches are not running. We get this message:

Unable to distribute to peer named [indexer_splunk_instancename] at uri https://[indexer_ip]:8089 because replication was unsuccessful. ReplicationStatus: Failed - Failure info: failed_because_BUNDLE_DATA_TRANSMIT_FAILURE. Verify connectivity to the search peer, that the search peer is up, and that an adequate level of system resources are available.

On the indexer, splunkd.log shows these messages:

File length is greater than 260, File creation may fail.

After reading the docs, I saw the app is supported on the indexers but not required. If we move this application to a heavy forwarder, will it be excluded from the replication bundle between the SH and the indexer?
Some sample logs:

MD Core Data
[INFO ] 2024.04.23 01:02:36.169: (common.update) Metadescriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/metadescriptor', downloadlink='https://xxx.domain.tld:9000/console/core/metadescriptor?version=5.8.0&deployment=MSCW6YaXaCaj1y1gv23U4JxzRHFhNUZLENEX&key=2041ed80a6043bf436fc7be518df4a13&serial=1' [msgid: 622]
[INFO ] 2024.04.23 01:02:36.371: (common.update) Package descriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml', url='https://xxx.domain.tld:9000/console/core/package/bitdefender_1_windows/bitdefender_1_windows-database-1713819646-1713819720.yml' [msgid: 618]
[INFO ] 2024.04.23 01:02:36.442: (common.update) Checksum and digital signature validation of package descriptor is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml' [msgid: 2320]
[INFO ] 2024.04.23 01:02:36.454: (common.update) Package descriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y/packagedescriptor.yml', url='https://xxx.domain.tld:9000/console/core/package/eset_1_windows/eset_1_windows-database-1713821049-1713821161.yml' [msgid: 618]
[INFO ] 2024.04.23 01:02:36.459: (common.update) Checksum and digital signature validation of package descriptor is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y/packagedescriptor.yml' [msgid: 2320]
[INFO ] 2024.04.23 01:02:51.383: (common.update) Package successfully downloaded, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz' [msgid: 671]
[INFO ] 2024.04.23 01:02:51.383: (common.update) Checksum validation of package content is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml', packageName='bitdefender_1_windows-database-1713819646.zip', type='database', filesChecked='951' [msgid: 2321]
[INFO ] 2024.04.23 01:02:57.775: (common.update) Package successfully downloaded, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y' [msgid: 671]
[INFO ] 2024.04.23 01:02:57.775: (common.update) Checksum validation of package content is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y/packagedescriptor.yml', packageName='eset_1_windows-database-1713821049.zip', type='database', filesChecked='36' [msgid: 2321]
[INFO ] 2024.04.23 01:03:00.597: (engines) Default parallel count set for engine, engineId='eset_1_windows', parallelcount='20' [msgid: 4602]
[INFO ] 2024.04.23 01:03:00.718: (engines) Accepting local socket, engine_id='eset_1_windows', socket='\\.\pipe\C:/Windows/Temp/ometascan/9e14Ds_13680', socketDescriptor='5924' [msgid: 4547]
[INFO ] 2024.04.23 01:03:01.415: (engines) Default parallel count set for engine, engineId='bitdefender_1_windows', parallelcount='20' [msgid: 4602]
[INFO ] 2024.04.23 01:03:01.512: (engines) Accepting local socket, engine_id='bitdefender_1_windows', socket='\\.\pipe\C:/Windows/Temp/ometascan/yC8oEL_14344', socketDescriptor='7852' [msgid: 4547]
[INFO ] 2024.04.23 01:03:04.056: (engines) Try to swap engineprocess log, engine_id='eset_1_windows' [msgid: 5594]
[INFO ] 2024.04.23 01:03:10.902: (common.update) Successfully verified product [msgid: 4696]
[INFO ] 2024.04.23 01:05:03.731: (engines) Try to swap engineprocess log, engine_id='bitdefender_1_windows' [msgid: 5594]

Syslog Data
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Metadescriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/metadescriptor', downloadlink='https://xxx.domain.tld:9000/console/core/metadescriptor?version=5.8.0&deployment=MSCW6YaXaCaj1y1gv23U4JxzRHFhNUZLENEX&key=2041ed80a6043bf436fc7be518df4a13&serial=1' [msgid: 622]
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Package descriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml', url='https://xxx.domain.tld:9000/console/core/package/bitdefender_1_windows/bitdefender_1_windows-database-1713819646-1713819720.yml' [msgid: 618]
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Checksum and digital signature validation of package descriptor is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml' [msgid: 2320]
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Package descriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y/packagedescriptor.yml', url='https://xxx.domain.tld:9000/console/core/package/eset_1_windows/eset_1_windows-database-1713821049-1713821161.yml' [msgid: 618]
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Checksum and digital signature validation of package descriptor is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y/packagedescriptor.yml' [msgid: 2320]
Apr 23 01:02:51 10.178.102.75 MSCW[2456] Package successfully downloaded, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz' [msgid: 671]
Apr 23 01:02:51 10.178.102.75 MSCW[2456] Checksum validation of package content is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml', packageName='bitdefender_1_windows-database-1713819646.zip', type='database', filesChecked='951' [msgid: 2321]
Apr 23 01:02:57 10.178.102.75 MSCW[2456] Package successfully downloaded, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y' [msgid: 671]
Apr 23 01:02:57 10.178.102.75 MSCW[2456] Checksum validation of package content is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y/packagedescriptor.yml', packageName='eset_1_windows-database-1713821049.zip', type='database', filesChecked='36' [msgid: 2321]
Apr 23 01:03:00 10.178.102.75 MSCW[2456] Default parallel count set for engine, engineId='eset_1_windows', parallelcount='20' [msgid: 4602]
Apr 23 01:03:00 10.178.102.75 MSCW[2456] Accepting local socket, engine_id='eset_1_windows', socket='\\.\pipe\C:/Windows/Temp/ometascan/9e14Ds_13680', socketDescriptor='5924' [msgid: 4547]
Apr 23 01:03:01 10.178.102.75 MSCW[2456] Default parallel count set for engine, engineId='bitdefender_1_windows', parallelcount='20' [msgid: 4602]
Apr 23 01:03:01 10.178.102.75 MSCW[2456] Accepting local socket, engine_id='bitdefender_1_windows', socket='\\.\pipe\C:/Windows/Temp/ometascan/yC8oEL_14344', socketDescriptor='7852' [msgid: 4547]
Apr 23 01:03:04 10.178.102.75 MSCW[2456] Try to swap engineprocess log, engine_id='eset_1_windows' [msgid: 5594]
Apr 23 01:03:10 10.178.102.75 MSCW[2456] Successfully verified product [msgid: 4696]
Apr 23 01:05:03 10.178.102.75 MSCW[2456] Try to swap engineprocess log, engine_id='bitdefender_1_windows' [msgid: 5594]

Splunk
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Metadescriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/metadescriptor', downloadlink='https://xxx.domain.tld:9000/console/core/metadescriptor?version=5.8.0&deployment=MSCW6YaXaCaj1y1gv23U4JxzRHFhNUZLENEX&key=2041ed80a6043bf436fc7be518df4a13&serial=1' [msgid: 622]
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Package descriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml', url='https://xxx.domain.tld:9000/console/core/package/bitdefender_1_windows/bitdefender_1_windows-database-1713819646-1713819720.yml' [msgid: 618]
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Checksum and digital signature validation of package descriptor is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml' [msgid: 2320]
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Package descriptor received, file_name='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y/packagedescriptor.yml', url='https://xxx.domain.tld:9000/console/core/package/eset_1_windows/eset_1_windows-database-1713821049-1713821161.yml' [msgid: 618]
Apr 23 01:02:36 10.178.102.75 MSCW[2456] Checksum and digital signature validation of package descriptor is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/eset_1_windows_WNDq4Y/packagedescriptor.yml' [msgid: 2320]
Apr 23 01:02:51 10.178.102.75 MSCW[2456] Package successfully downloaded, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz' [msgid: 671]
Apr 23 01:02:51 10.178.102.75 MSCW[2456] Checksum validation of package content is ok, packageDir='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz', descriptor='C:/Program Files/OPSWAT/MetaDefender Data/data/updates/db/bitdefender_1_windows_GiYkXz/packagedescriptor.yml', packageName='bitdefender_1_windows-database-1713819646.zip', type='database', filesChecked='951' [msgid: 2321]
Apr 23 01:05:03 10.178.102.75 MSCW[2456] Try to swap engineprocess log, engine_id='bitdefender_1_windows' [msgid: 5594]

There is no duplicate data in the logs. It happens not only in the first 12 days of the month, as you can see in the example logs above...
Just because events have multiple indexes and source types does not mean you can't use stats to correlate events in the events pipeline. In addition to @richgalloway's request, please also share some representative, anonymised sample events showing how you would like these events to be correlated.
Your first error is deploying Splunk on Windows. See https://community.splunk.com/t5/Getting-Data-In/What-are-the-pain-points-with-deploying-your-Splunk-architecture/m-p/650011

Please elaborate on "the search head is not working". What about it is not working? An error on an indexer does not necessarily mean there's a problem with the SH.

One workaround is to rename the TA so it resides in a directory with a shorter name (by at least 8 characters). Of course, you will have to maintain that forever.
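Since the "File length is greater than 260" warning points at Windows' MAX_PATH limit, it can help to measure how close a bundle's paths already are before deciding how much to shorten the directory name. A minimal sketch, assuming a POSIX shell and a hypothetical app directory named `Splunk_TA_example` (substitute the real add-on folder from the bundle):

```shell
# Print the length and path of every file whose path exceeds 260 characters.
# APP_DIR is a placeholder -- point it at the actual TA directory.
APP_DIR="${APP_DIR:-Splunk_TA_example}"
if [ -d "$APP_DIR" ]; then
    find "$APP_DIR" -type f | awk 'length($0) > 260 { print length($0), $0 }'
fi
```

Note that the paths will only get longer on the search peer, since the bundle is extracted under Splunk's own directory tree there, so leave extra headroom beyond what this reports.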
First, identify what format the data is supposed to be in: JSON, syslog, CEF, etc. Then install the TA you have on a heavy forwarder (HF) and on the cloud search head. Then make sure the TA on the HF has been configured with the correct options; you will most likely also need to ensure the Guardicore system is configured for the format you require, or for the default options - speak to the Guardicore admin. Splunkbase does not appear to have any detailed documentation, but it does state that the TA uses the REST API and processes events received from the syslog exporter. So it sounds like the TA app will have the config options to pull data.
Hey everyone, I currently have a use case for which I set up a Splunk Enterprise environment in an Ubuntu VM (VMware) and want to build an app with the Add-on Builder that uses a Python script as the input method to make an API call and get my data into Splunk. That's the goal, at least.

The VM communicates with the Internet just fine (even if via proxy) and my Python script gets the data from the API endpoint. However, when I try to enter my VM's proxy credentials into the Add-on Builder configuration, I get the following error: "There was a problem connecting to the App Certification service. The service might not be available at this time, or you might need to verify your proxy settings and try again."

Now, assuming that I did not mess up the proxy credentials, my next best bet would be that I need to give my Splunk environment a certificate to adequately communicate with the proxy. So we finally reach my question: where would I need to place such a certificate file in the directory structure so that the Splunk add-on app can find it?
Thank you for the reply
Yes, it is a bit painful. If you have made lots of /local/ based configs in your apps, back up the /opt/splunk/etc/apps folder at a minimum. This way you at least have your app configs backed up and can restore those apps after you re-install Splunk to keep it clean.
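To make that concrete, here is a minimal backup sketch. It assumes a default Linux install under /opt/splunk (adjust SPLUNK_HOME to your environment) and also grabs etc/system/local, since server-wide local settings live there too:

```shell
# Archive etc/apps and etc/system/local so local configs survive a re-install.
SPLUNK_HOME="${SPLUNK_HOME:-/opt/splunk}"
if [ -d "$SPLUNK_HOME/etc" ]; then
    tar -czf "splunk_etc_backup_$(date +%Y%m%d).tar.gz" \
        -C "$SPLUNK_HOME/etc" apps system/local
fi
```

After the clean install, extract the archive into the new etc directory and restart Splunk; it is worth reviewing each restored app's /local/ files rather than restoring blindly, since some settings may conflict with the new version.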
@renjith_nair not exactly. Currently, I am using the checkbox type to filter out error log events, and those need to be pre-defined already. See the whole dashboard:

<form theme="light">
  <label>LDP Apps monitoring</label>
  <fieldset submitButton="false" autoRun="false">
    <input type="dropdown" token="app" searchWhenChanged="true">
      <label>Application</label>
      <choice value="app_1">App 1</choice>
      <choice value="app_2">App 2</choice>
      <choice value="app_3">App 3</choice>
      <default>App 1</default>
      <initialValue>App 1</initialValue>
    </input>
    <input type="dropdown" token="env" searchWhenChanged="true">
      <label>Environment</label>
      <choice value="qa">QA</choice>
      <choice value="uat">UAT</choice>
      <choice value="prod">PROD</choice>
      <default>prod</default>
      <initialValue>prod</initialValue>
    </input>
    <input type="time" token="time_range">
      <label>Time Period</label>
      <default>
        <earliest>-24h@h</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="text" token="search_input" id="search_input" searchWhenChanged="true">
      <label>Search for a certain log message</label>
    </input>
    <html>
      <style>
        div[id^="search_input"]{ width: 1000px !important; }
      </style>
    </html>
    <input type="checkbox" token="selected" searchWhenChanged="true" id="checkboxes">
      <label>Filter out frequent errors:</label>
      <choice value="AND NOT &quot;Error Log Message 1 to filter out&quot;">Error Log Message 1 to filter out</choice>
      <choice value="AND NOT &quot;Error Log Message 2 to filter out&quot;">Error Log Message 2 to filter out</choice>
      <choice value="AND NOT &quot;Error Log Message 3 to filter out&quot;">Error Log Message 3 to filter out</choice>
      <choice value="AND NOT &quot;Error Log Message 4 to filter out&quot;">Error Log Message 4 to filter out</choice>
      <delimiter> </delimiter>
      <default></default>
    </input>
    <html>
      <style>
        div[id^="checkboxes"]{ width: 1000px !important; }
      </style>
    </html>
  </fieldset>
  <row>
    <panel>
      <title>$app$ Access logs - status code</title>
      <chart>
        <title>**hardcoded time period</title>
        <search>
          <query>index="$app$-$env$" access_log status_code!="20*" | timechart span=10m count by status_code</query>
          <earliest>-3d</earliest>
          <latest>now</latest>
        </search>
        <option name="charting.axisTitleX.visibility">visible</option>
        <option name="charting.axisTitleY.visibility">visible</option>
        <option name="charting.axisTitleY2.visibility">visible</option>
        <option name="charting.chart">area</option>
        <option name="charting.chart.nullValueMode">gaps</option>
        <option name="charting.chart.showDataLabels">none</option>
        <option name="charting.chart.stackMode">default</option>
        <option name="charting.drilldown">none</option>
        <option name="charting.layout.splitSeries">1</option>
        <option name="charting.legend.placement">right</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
    <panel>
      <title>$app$ Error Frequency</title>
      <chart>
        <search>
          <query>index="$app$-$env$" logLevel="ERROR" $selected$ | multikv | eval ReportKey="error rate" | timechart span=30m count by ReportKey</query>
          <earliest>$time_range.earliest$</earliest>
          <latest>$time_range.latest$</latest>
          <sampleRatio>1</sampleRatio>
          <refresh>1m</refresh>
          <refreshType>delay</refreshType>
        </search>
        <option name="charting.chart">area</option>
        <option name="charting.chart.nullValueMode">connect</option>
        <option name="charting.chart.showDataLabels">all</option>
        <option name="charting.chart.stackMode">default</option>
        <option name="charting.drilldown">none</option>
        <option name="charting.layout.splitSeries">1</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
  </row>
  <row>
    <panel>
      <title>$app$ Specific Error Logs</title>
      <table>
        <search>
          <query>index="$app$-$env$" logLevel="ERROR" $selected$ | rex mode=sed "s:&lt;1512&gt;:\n:g" | bucket _time span=5m | table _time, logName, logLevel, _raw | sort -_time</query>
          <earliest>$time_range.earliest$</earliest>
          <latest>$time_range.latest$</latest>
          <sampleRatio>1</sampleRatio>
          <refresh>1m</refresh>
          <refreshType>delay</refreshType>
        </search>
        <option name="count">10</option>
        <option name="dataOverlayMode">none</option>
        <option name="drilldown">none</option>
        <option name="percentagesRow">false</option>
        <option name="refresh.display">progressbar</option>
        <option name="rowNumbers">false</option>
        <option name="totalsRow">false</option>
        <option name="wrap">true</option>
      </table>
    </panel>
  </row>
  <row>
    <panel>
      <title>$app$ WARN Frequency</title>
      <chart>
        <search>
          <query>index="$app$-$env$" logLevel="WARN" $selected$ | multikv | eval ReportKey="warn rate" | timechart span=30m count by ReportKey</query>
          <earliest>$time_range.earliest$</earliest>
          <latest>$time_range.latest$</latest>
          <sampleRatio>1</sampleRatio>
          <refresh>1m</refresh>
          <refreshType>delay</refreshType>
        </search>
        <option name="charting.chart">area</option>
        <option name="charting.chart.showDataLabels">all</option>
        <option name="charting.drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
  </row>
  <row>
    <panel>
      <title>$app$ Warn Messages</title>
      <table>
        <search>
          <query>index="$app$-$env$" logLevel="WARN" $selected$ | rex mode=sed "s:&lt;1512&gt;:\n:g" | bucket _time span=5m | table _time, logName, logLevel, _raw | sort -_time</query>
          <earliest>$time_range.earliest$</earliest>
          <latest>$time_range.latest$</latest>
          <refresh>1m</refresh>
          <refreshType>delay</refreshType>
        </search>
        <option name="count">10</option>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
        <format type="color" field="logLevel">
          <colorPalette type="list">[#53A051,#006D9C,#F8BE34,#F1813F,#DC4E41]</colorPalette>
          <scale type="threshold">0,30,70,100</scale>
        </format>
      </table>
    </panel>
  </row>
  <row>
    <panel>
      <title>Specific log event search</title>
      <chart>
        <title>**Copy a log message to search for an error log history, hardcoded time period</title>
        <search>
          <query>index="$app$-$env$" "$search_input$" | eval search_input="$search_input$" | where isnotnull(search_input) AND search_input!="" | multikv | eval ReportKey="searched_event" | timechart span=30m count by ReportKey</query>
          <earliest>$time_range.earliest$</earliest>
          <latest>$time_range.latest$</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="charting.chart">column</option>
        <option name="charting.chart.showDataLabels">all</option>
        <option name="charting.drilldown">all</option>
        <option name="refresh.display">progressbar</option>
      </chart>
    </panel>
  </row>
</form>

I want to use the text box input type to add a specific error message string into a multiselect, and have that multiselect applied to each query provided above, so I can filter out certain events without having them specified in checkboxes. I also want the multiselect to be empty by default each time the dashboard is loaded.
Have you carefully installed and deployed this add-on within your Splunk deployment architecture? Follow the instructions at https://splunkbase.splunk.com/app/4564 - click the link and look at the "where to install this add-on" section first. You would typically install this onto a heavy forwarder if you are using one and set up the inputs there; the HF forwards the data to the indexers, where it gets parsed. The add-on is also required on the search heads for its knowledge objects, so it needs to be installed there too, into the correct path. So install everything as required, configure it, and then look at the logs. If you have already configured it as required, then this log message indicates something else: it states "The system cannot find the path specified". Have you installed it correctly?
@vananhnguyen , Here is a run anywhere example. Are you looking for something like this ? You can change the value in the dropdown and the colors will be reset { "visualizations": { "viz_NJsTjQl4": { "type": "splunk.singlevalue", "options": { "majorColor": "> majorValue | matchValue(majorColorEditorConfig)" }, "dataSources": { "primary": "ds_275I8YNY" }, "context": { "majorColorEditorConfig": [ { "match": "Running", "value": "#118832" }, { "match": "Stopped", "value": "#d41f1f" } ] } } }, "dataSources": { "ds_275I8YNY": { "type": "ds.search", "options": { "query": "| makeresults\n| eval value=\"$status$\"" }, "name": "Search_1" } }, "defaults": { "dataSources": { "ds.search": { "options": { "queryParameters": { "latest": "$global_time.latest$", "earliest": "$global_time.earliest$" } } } } }, "inputs": { "input_global_trp": { "type": "input.timerange", "options": { "token": "global_time", "defaultValue": "-24h@h,now" }, "title": "Global Time Range" }, "input_BHJAbWl2": { "options": { "items": [ { "label": "Running", "value": "Running" }, { "label": "Stopped", "value": "Stopped" } ], "token": "status", "selectFirstSearchResult": true }, "title": "Status", "type": "input.dropdown" } }, "layout": { "type": "grid", "options": { "width": 1440, "height": 960 }, "structure": [ { "item": "viz_NJsTjQl4", "type": "block", "position": { "x": 0, "y": 0, "w": 1440, "h": 400 } } ], "globalInputs": [ "input_global_trp", "input_BHJAbWl2" ] }, "description": "", "title": "single_panel_studio" }   Reference : https://docs.splunk.com/Documentation/Splunk/9.2.1/DashStudio/visualEditDynamic 
Hi, thank you for your suggestion. When I use a scheduled report and look at the recent search, Splunk appends "| summaryindex" at the end of the search, not the "collect" command. So I thought "collect" always referred to a "manual" push, versus "summary index" in a scheduled report.

My understanding is that you have two scheduled reports. Is the following accurate?
1) Roll forward existing data (from and to the same summary index - say index A)
2) Push new data (from a different index to the summary index - say from index X to index A)

In my case, the data from dbxquery always gets rewritten, so I only need the latest data, but I may use your method in the future. Thanks for this.

Based on the link that you sent and the following post, it looks like I still need the CSV file (see below: it does inputlookup from the CSV first, then outputlookup to the KV store). Is this correct? My goal is to avoid having a CSV file, since there is a limit on its size.

https://community.splunk.com/t5/Getting-Data-In/How-to-transfer-existing-CSV-data-to-kvstore/m-p/144641

| inputlookup filename.csv | outputlookup lookup_name

Thank you again.
Huh.  Guess I was just assuming that it needed both, and that's the way I've always done it.  Now I'll have to play around in the lab and see what happens when I remove it.   Thanks!
@Splunkerninja , Can we add another row only for the html panel and hide the result row/panel with a "depends" token? Something like <row> <panel depends="$hide_this_always$"> <table> <search> <done> <eval token="date">strftime(now(), "%d-%m-%Y")</eval> <set token="sid">$job.sid$</set> </done> <query>index=test</query> <earliest>-24h@h</earliest> <latest>now</latest> <sampleRatio>1</sampleRatio> </search> <option name="count">20</option> <option name="dataOverlayMode">none</option> <option name="drilldown">none</option> <option name="percentagesRow">false</option> <option name="refresh.display">progressbar</option> <option name="rowNumbers">false</option> <option name="totalsRow">false</option> <option name="wrap">true</option> </table> </panel> </row> <row> <html> <a href="/api/search/jobs/$sid$/results?isDownload=true&amp;timeFormat=%25FT%25T.%25Q%25%3Az&amp;maxLines=0&amp;count=0&amp;filename=test_$date$.csv&amp;outputMode=csv" class="button js-button">Download</a> <style> .button { background-color: steelblue; border-radius: 5px; color: white; padding: .5em; text-decoration: none; } .button:focus, .button:hover { background-color: #2A4E6C; color: White; } </style> </html> </row>
You can make use of the KV store to store and retrieve information:

https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/kvstore/
https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/kvstore/usetherestapitomanagekv/
https://dev.splunk.com/enterprise/docs/developapps/manageknowledge/kvstore/uselookupswithkvstore/

Splunk JS can then be used to control the button actions and to process the record. There were some dev materials explaining the overall process, but it looks like they have been removed. There is some third-party documentation publicly available for Splunk KV store CRUD.
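For reference, the REST endpoints from those dev docs can be exercised with plain curl against the management port. A hedged sketch - `myapp`, `mycollection`, and the credentials are placeholders, and the collection must already be declared in collections.conf:

```shell
# KV store CRUD over the REST API (default management port 8089).
# All names and credentials below are placeholders for illustration.
BASE="https://localhost:8089/servicesNS/nobody/myapp/storage/collections/data/mycollection"
AUTH="admin:changeme"

# Create a record (Splunk returns the generated _key)
create() { curl -sk -u "$AUTH" -H "Content-Type: application/json" -d "$1" "$BASE"; }
# List all records in the collection
list() { curl -sk -u "$AUTH" "$BASE"; }
# Delete a single record by its _key
delete_key() { curl -sk -u "$AUTH" -X DELETE "$BASE/$1"; }

# Example, against a running Splunk instance:
# create '{"ticket":"T-1","status":"open"}'
```

Dashboard buttons can then call the same operations through Splunk JS, as the post above describes.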
Hi, so far I haven't seen any missing data, so it's not causing a problem apart from the error message itself. I am not sure why Splunk has to throw it. Thank you for your help.
If the step has neither Success nor Failure, the counts for these columns will be zero.
@pevniacik , Are you looking for something like this ? Test by selecting few projects and add a text "Error" to the text box to filter <form version="1.1" theme="light"> <label>MultiSelect_Text</label> <fieldset submitButton="false"> <input type="multiselect" token="Project"> <label>Project</label> <valuePrefix>"</valuePrefix> <valueSuffix>"</valueSuffix> <delimiter>,</delimiter> <fieldForLabel>Project</fieldForLabel> <fieldForValue>Project</fieldForValue> <search> <query>|makeresults count=5|streamstats count |eval Project="Project".count|eval Record="Some records "|eval Record=if(count%2==0,Record,Record."Error")</query> <earliest>-24h@h</earliest> <latest>now</latest> </search> </input> <input type="text" token="text_filter" searchWhenChanged="true"> <label>Text to Filter</label> <default>*</default> </input> </fieldset> <row> <panel> <table> <search> <query>|makeresults count=5|streamstats count |eval Project="Project".count|eval Record="Some records "|eval Record=if(count%2==0,Record,Record."Error") |where Project in ($Project$) AND NOT like (Record,"%$text_filter$%")</query> <earliest>-24h@h</earliest> <latest>now</latest> </search> <option name="drilldown">none</option> <option name="refresh.display">progressbar</option> </table> </panel> </row> </form>