All Posts

The btool command shows *all* of the settings that will be applied the next time Splunk restarts.  It takes file precedence into account when generating the output.  The first column produced by the --debug option is the name of the file from which the setting was read.
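For illustration, a couple of lines of `btool outputs list --debug` output might look like the following (the paths, stanza, and group name are made up for this sketch; your output will show your own files and settings):

```ini
/opt/splunk/etc/system/default/outputs.conf    [tcpout]
/opt/splunk/etc/apps/myapp/local/outputs.conf  defaultGroup = primary_indexers
```

The first column tells you exactly which file "won" for each setting, which is what makes --debug useful for debugging precedence.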
Hello @MihaiGheorghita, You can restore access to older jQuery libraries as follows. On your Search Head, in the Search & Reporting app, select Settings > Server Settings > Internal Library Settings, then use the "jQuery libraries older than 3.5" toggle to restrict or unrestrict the older libraries. Doc: https://docs.splunk.com/Documentation/UpgradejQuery/1/UpgradejQuery/LibrarySettings#Restrict_access_to_older_versions_of_jQuery Thanks.
Hi, first question here! I'm new to Splunk and I have a basic question about btool. With this command line:    /splunk btool outputs list --debug   is the first element in the (long) list the one that is applied if there is no outputs.conf in a deployed app on the Heavy Forwarder? Am I right? Thanks Nico
Correct, but you could reparse the raw event data to extract the nanoseconds and do the extra maths yourself. However, the strptime() and strftime() functions don't support precision beyond microseconds.
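To illustrate the "extract the nanoseconds and do the extra maths" idea, here is a minimal Python sketch (outside Splunk, since SPL's strptime stops at microseconds). It truncates the fractional part to the six digits strptime's %f can handle and carries the leftover nanoseconds separately; the timestamp string is the example from the question:

```python
from datetime import datetime, timezone

def parse_nano(ts: str) -> tuple:
    """Split a nanosecond ISO-8601 string into a microsecond-precision
    datetime plus the leftover nanosecond remainder."""
    base, frac = ts.rstrip("Z").split(".")
    # strptime's %f accepts at most 6 digits, so keep the rest aside
    micros, nanos_rem = frac[:6], frac[6:]
    dt = datetime.strptime(f"{base}.{micros}", "%Y-%m-%dT%H:%M:%S.%f")
    dt = dt.replace(tzinfo=timezone.utc)
    return dt, int(nanos_rem or 0)

dt, nanos = parse_nano("2024-02-15T11:40:26.498494245Z")
print(dt.microsecond, nanos)   # 498494 245
```

The datetime carries everything down to microseconds; the extra 245 nanoseconds are kept as a plain integer for any further arithmetic you need.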
I'll presume you've read the CIM manual at https://docs.splunk.com/Documentation/CIM/5.3.1/User/Overview .  What specific questions do you have about what you read (or couldn't find)? CIM is not an Easy Button.  That is, installing the app will not make your apps CIM-compliant.  Instead, you must add aliases, calculated fields, and/or other KOs so your app will produce CIM-compliant data.  The CIM manual lists the fields expected by each datamodel (not all fields are required). Depending on your app, it's possible no DM will apply.  That's OK.
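As a rough sketch of what "add aliases and calculated fields" looks like in practice, a props.conf stanza in your app might contain something like the following. The sourcetype, vendor field names, and status values here are hypothetical; you would map your own fields onto the CIM names listed in the manual for the data model you target:

```ini
# props.conf -- hypothetical sourcetype and vendor field names
[my_custom:events]
# Alias vendor fields onto CIM field names
FIELDALIAS-cim_src  = source_address AS src
FIELDALIAS-cim_dest = dest_address AS dest
# Calculated field normalizing a vendor status onto CIM's action values
EVAL-action = case(status=="200","allowed", status=="403","blocked", true(), "unknown")
```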
Hi @Mad2, about the Universal Forwarder, as @richgalloway said, you don't need it if you have a full Splunk instance, even if it's a lab installation. About running the Search Head, Indexer, and Monitoring Console on the same server: it's possible if you have a standalone Splunk server, and you don't need to do anything special to get it, just install Splunk. If instead you have a distributed architecture, with multiple SHs and/or multiple indexers, it isn't possible: you must have dedicated systems for SHs and separate dedicated systems for IDXs. The Monitoring Console can share a system with other roles, but not with SHs, IDXs, or a Deployment Server (if you have to manage more than 50 clients). Ciao. Giuseppe
Don't install multiple instances of Splunk on the same server, as that invites trouble.  It can be done, but it requires a lot of customization. There's no need to have a UF on the same server as a full instance of Splunk since the full instance can do everything a UF can do (and more).
... you have to click the data collector rule you want to choose the business transactions for, to highlight it, and then click 'Configure Transactions using this Data Collector' to bring up this box, where you select the business transactions within which the method will be called, move it over to the left, then click 'Save.'  If you haven't done this part, the data will never make it to Analytics, even if it is showing up in Snapshots.
Put the savedsearches.conf file into a custom app and upload it to Splunk Cloud.  The file must be in the app's default folder, which must also contain app.conf.
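A minimal sketch of such an app, assuming a hypothetical app name of "migrated_searches" (any valid app name works):

```ini
# Directory layout:
#   migrated_searches/
#     default/
#       app.conf
#       savedsearches.conf
#
# migrated_searches/default/app.conf
[install]
is_configured = true

[ui]
is_visible = true
label = Migrated Searches
```

Package the directory as a .tar.gz (.spl) and upload it through the Splunk Cloud app management workflow.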
Hello, I am having great difficulty understanding where to begin with the CIM data model. Can anybody clearly summarize the different ways to apply a CIM data model in my own apps? Thanks in advance
Hey Jason.  Did you make sure to pick the business transactions where the methods will show up, after you created the collectors?  If you don't, the data will never make it to Analytics.
As of ES 7.2, an auto-refresh feature is available on the Incident Review page. Auto-refresh pauses while you are selecting or editing a notable, which lets you work the notable without the list refreshing underneath you.  When you manually reload the IR page, auto-refresh turns back on. See the KB below for details on how to enable it: https://docs.splunk.com/Documentation/ES/7.3.0/Admin/CustomizeIR#Configure_auto-refresh_to_update_notables
@ITWhisperer However, I need to create a parser for events that have a timestamp like "time: 2024-02-15T11:40:26.498494245Z". I read on the community that Splunk does not support nanoseconds. So if I put in the microseconds logic (%Y-%m-%dT%H:%M:%S.%6Q%Z), will I get the values down to microseconds?
@ITWhisperer  index="test1" | fieldformat _time=strftime(_time,"%Y-%m-%dT%H:%M:%S.Z")
Can someone help me understand the totalResultCount function? I have looked at the documentation and spent an hour or two fiddling with it, but I can't figure out what it is supposed to do.
The configs may *look* right, but maybe they aren't.  Share the inputs.conf and outputs.conf settings for a second opinion. What gave you the impression that Splunk has no rights to send logs? How are you attempting to send data to the third-party tool?  Have you seen https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Forwarddatatothird-partysystemsd ?
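For reference, a minimal outputs.conf sketch for sending raw (uncooked) data to a non-Splunk receiver looks roughly like this; the group name and destination host/port are placeholders for this example:

```ini
# outputs.conf -- hypothetical group name and destination
[tcpout:third_party]
server = receiver.example.com:514
# Send raw data so a non-Splunk receiver can read it
sendCookedData = false
```

If the third-party tool never sees data, comparing your actual stanza against something like this (and checking network reachability to the destination port) is a good first step.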
I need to install Splunk Enterprise and want to run the SH, indexer, and universal forwarder on the same system. Please advise.
I presume you're referring to the Splunk Add-on for Windows since the app does not have any inputs. It's not enough to change the destination index to a metrics index.  The format of the data must also change. See https://docs.splunk.com/Documentation/AddOns/released/Windows/Configuration#Collect_perfmon_data_and_wmi:uptime_data_in_metric_index for the list of Windows metrics that are available and how to enable them.
Hi! We recently decided to move from Splunk on-prem to Cloud.  Is there any quick way for me to upload my savedsearches.conf file from the on-prem to the Cloud instance?  I am looking for a way where I don't have to manually copy my saved searches.  Thanks!
I have the SolarWinds add-on installed on a Linux HF. I am seeing this error:

+0000 log_level=WARNING, pid=28286, tid=Thread-4, file=ext.py, func_name=time_str2str, code_line_no=321 | [stanza_name="SolarwindAlerts"] Unable to convert date_string "2024-02-15T13:44:46.6370000" from format "%Y-%m-%dT%H:%M:%S.%f" to "%Y-%m-%dT%H:%M:%S.%f", return the original date_string, cause=Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/core/ext.py", line 304, in time_str2str
    dt = datetime.strptime(date_string, from_format)
  File "/opt/splunk/lib/python3.7/_strptime.py", line 577, in _strptime_datetime
    tt, fraction, gmtoff_fraction = _strptime(data_string, format)
  File "/opt/splunk/lib/python3.7/_strptime.py", line 362, in _strptime
    data_string[found.end():])
ValueError: unconverted data remains: 0

Can someone help? I have no data from SolarWinds. I tried reinstalling the add-on and reconfiguring it. It was working with version 8.* of the HF; now we have upgraded to 9.1.3. It shows as supported on Splunkbase.
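The root cause of that traceback can be reproduced outside Splunk: Python's %f directive matches at most six fractional digits, but the SolarWinds timestamp carries seven, so one digit is left over and strptime raises exactly the ValueError shown above. A minimal reproduction (this only demonstrates the cause; the actual fix would have to land in the add-on's parsing code or come from support):

```python
from datetime import datetime

ts = "2024-02-15T13:44:46.6370000"   # 7 fractional digits, as in the error
fmt = "%Y-%m-%dT%H:%M:%S.%f"

# %f consumes at most 6 digits, so the 7th digit remains unparsed
try:
    datetime.strptime(ts, fmt)
    error = None
except ValueError as e:
    error = str(e)   # "unconverted data remains: 0"

# Trimming the fraction to 6 digits parses cleanly:
head, frac = ts.split(".")
dt = datetime.strptime(f"{head}.{frac[:6]}", fmt)
print(error, dt.microsecond)
```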