All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

Hi @andgarciaa, are you speaking of Splunk Cloud or on-premises? If Splunk Cloud, you have to ask your Splunk sales team. If on-premises, the only cost is the additional storage, which you can estimate by doubling the current storage. Ciao. Giuseppe
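One rough way to put a number on this (a sketch; index-name is the placeholder from the question): measure the index's current footprint on disk with dbinspect. Since retention would double from 90 to 180 days, the extra storage is roughly equal to current usage, assuming a steady ingest rate.

```
| dbinspect index=index-name
| stats sum(sizeOnDiskMB) AS currentDiskMB
| eval extraDiskMB_estimate = currentDiskMB
```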
If I have an index with a retention of 90 days, can I make a rough estimate of the cost of increasing the retention of index=index-name by an extra 90 days?
Hi, we decided to create backups and just go for it. It worked fine! After the upgrade, everything is indexing without any issues. Also, no problems during the upgrade from the MSI. Thanks for giving us a little courage, I guess. We decided to "experiment". For the greater good, heh.
<form version="1.1"> <label>My dashboard</label> <fieldset submitButton="false"></fieldset> <row id="mainFilterRow" depends="$showHidePanel$"> <panel id="mainFilterPanel1"> <input t... See more...
<form version="1.1"> <label>My dashboard</label> <fieldset submitButton="false"></fieldset> <row id="mainFilterRow" depends="$showHidePanel$"> <panel id="mainFilterPanel1"> <input type="time" token="time"> <label>DateTime</label> <default> <earliest>@d</earliest> <latest>now</latest> </default> </input> </panel> <panel id="mainFilterPanel21"> <input type="dropdown" token="TimeDrop"> <label>TimeDrop</label> <choice value="+1d">1d</choice> <choice value="+2d">2d</choice> <choice value="+5d">5d</choice> <default>1d</default> <change> <eval token="latest_Time">relative_time($time.latest$, $TimeDrop$)</eval> </change> </input> </panel> </row> <row id="chartRow" depends="$showHidePanel$"> <panel> <search> <query>index=main | stats count by host</query> <earliest>$time.earliest$</earliest> <latest>$latest_Time$</latest> </search> </panel> </row> </form>
This is a broad question. What is the specific use case you are trying to solve?
If you mean to change the standard time picker to include your special options, try adding new time ranges: time ranges are configured in the Settings -> Knowledge -> User interface -> Time ranges section of the Splunk interface.
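For reference, custom ranges created on that page are stored as stanzas in times.conf; a sketch along these lines (the stanza name and label are made-up examples):

```
[last_30_minutes_custom]
label = Last 30 minutes (custom)
earliest_time = -30m
latest_time = now
order = 10
```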
Can someone explain the structure of Splunk configuration files, what components they contain, and what those components are called? [What are the important configuration files in Splunk, and what is the purpose of these different files? If a file such as inputs.conf is present in multiple apps, how will Splunk consolidate it? What is the file precedence order? Can I have my own configuration file name, like mynameinputs.conf; will it work, and how?]
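To illustrate how Splunk layers copies of the same file: when the same stanza appears in more than one location, Splunk merges the settings attribute by attribute, with the more specific location winning (the directory paths are standard; the stanza and values below are made-up examples):

```
# $SPLUNK_HOME/etc/apps/myapp/default/inputs.conf
[monitor:///var/log/app.log]
index = main
sourcetype = app_log

# $SPLUNK_HOME/etc/apps/myapp/local/inputs.conf
[monitor:///var/log/app.log]
index = prod_logs

# Effective configuration (local overrides default, per attribute):
#   index = prod_logs
#   sourcetype = app_log
```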
Please share the source of your dashboard; just sharing screengrabs does not show what is going on behind the scenes!
The statement is not working. According to the selection made above, the earliest time should be 5/8/2024 05:00:00 and the latest time should be 5/9/2024 06:00:00 (because the time span selected is +1d), but it is not working despite using the eval statement below.

<eval token="latest_Time">relative_time($time.latest$, $timedrop$)</eval>

Results:
@Ryan.Paredez I have tried the installation again on a new VM. I did all the steps as mentioned. I am able to see the Custom Metric/Linux Monitor folder under the VM on the AppD dashboard, but under mountedNFSStatus I am not getting any data. Sharing a snapshot below.

Also, I am getting a NullPointerException in the machine agent logs:

vm==> [Agent-Monitor-Scheduler-1] 13 May 2024 05:56:29,943 INFO MetricWriteHelperFactory-Linux Monitor - The instance of MetricWriteHelperFactory is com.appdynamics.extensions.MetricWriteHelper@e8e0a3b
vm==> [Monitor-Task-Thread3] 13 May 2024 05:56:30,446 ERROR NFSMountMetricsTask-Linux Monitor - Exception occurred collecting NFS I/O metrics
java.lang.NullPointerException: null
    at com.appdynamics.extensions.linux.NFSMountMetricsTask.getMountIOStats(NFSMountMetricsTask.java:173) [?:?]
    at com.appdynamics.extensions.linux.NFSMountMetricsTask.run(NFSMountMetricsTask.java:66) [?:?]
    at com.appdynamics.extensions.executorservice.MonitorThreadPoolExecutor$TaskRunnable.run(MonitorThreadPoolExecutor.java:113) [?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
    at java.lang.Thread.run(Thread.java:829) [?:?]
vm==> [Monitor-Task-Thread1] 13 May 2024 05:56:30,763 INFO LinuxMonitorTask-Linux Monitor - Completed the Linux Monitoring task
This is exactly what I was looking for. But is it possible to incorporate it along with the existing Time Range Picker?
Big thanks to you, @ITWhisperer. The solution works flawlessly, and I'm particularly impressed by the elegant use of the foreach command. It perfectly aligns with our exact requirements. Thanks for the guidance and assistance.
@tscroggins, hope the information is helpful, please let me know if you need any additional details
Attached is sample data for the two tables. For each of SNC1 and SNC2, there will be data every 15 minutes, and the values can differ. The idea is to build a time series for each SNC on any of the values, with filtering based mainly on SNC and one or more of the values at the same time.
Report data would be as below:

par1  time             b       e     f     g     l         m         n         r      s
SNC1  12/5/2024 16:30  299367  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.80E-07  13.93  12.91
SNC1  12/5/2024 16:45  299364  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.80E-07  13.95  12.87
SNC1  12/5/2024 17:00  299369  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.90E-07  13.89  12.88
SNC1  12/5/2024 17:15  299364  -7.6  -7.9  -7.7  1.00E-37  1.00E-37  1.80E-07  13.92  12.89
SNC1  12/5/2024 17:30  299368  -7.6  -7.9  -7.7  1.00E-37  1.00E-37  1.90E-07  13.87  12.83
SNC1  12/5/2024 17:45  299362  -7.6  -7.9  -7.7  1.00E-37  1.00E-37  1.90E-07  13.92  12.78
SNC1  12/5/2024 18:00  299368  -7.6  -7.9  -7.7  1.00E-37  1.00E-37  1.80E-07  13.92  12.88
SNC1  12/5/2024 18:15  299371  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.80E-07  13.95  12.88
SNC1  12/5/2024 18:30  299359  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.80E-07  13.94  12.83
SNC1  12/5/2024 18:45  299362  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.80E-07  13.92  12.86
SNC1  12/5/2024 19:00  299369  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.90E-07  13.89  12.85
SNC1  12/5/2024 19:15  299365  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.90E-07  13.92  12.89
SNC1  12/5/2024 19:30  299368  -7.6  -7.9  -7.7  1.00E-37  1.00E-37  1.80E-07  13.9   12.75
SNC1  12/5/2024 19:45  299369  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.90E-07  13.92  12.85
SNC1  12/5/2024 20:00  299363  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.90E-07  13.93  12.89
SNC1  12/5/2024 20:15  299358  -7.7  -7.9  -7.7  1.00E-37  1.00E-37  1.90E-07  13.93  12.85
SNC2  12/5/2024 16:30  259482  -7.6  -6.9  -7.6  9.00E-35  1.00E-34  0.0011    9.58   9.54
SNC2  12/5/2024 16:45  259479  -7.5  -6.9  -7.6  8.00E-35  1.00E-34  0.0011    9.59   9.53
SNC2  12/5/2024 17:00  259478  -7.5  -6.9  -7.6  8.00E-35  1.00E-34  0.0011    9.59   9.56
SNC2  12/5/2024 17:15  259484  -7.5  -6.9  -7.6  5.00E-35  1.00E-34  0.0011    9.61   9.55
SNC2  12/5/2024 17:30  259487  -7.6  -6.9  -7.6  6.00E-35  2.00E-34  0.0011    9.56   9.52
SNC2  12/5/2024 17:45  259480  -7.5  -6.9  -7.6  8.00E-35  1.00E-34  0.0011    9.57   9.53
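Assuming these report rows are indexed as events with the column headers as field names (the index name below is a guess), a per-SNC time series on one of the values could be sketched as:

```
index=reports par1 IN ("SNC1", "SNC2")
| timechart span=15m avg(r) AS avg_r by par1
```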
The reports_metadata file contains data as below (blank cells kept as gaps):

snc_label deployment_state par1 par2 par3 par4 par5 par6 par7 par8 par9 par10 par11 par12 par13 par14 par15 par16 par17 par18 par19
SNC1 discovered L0CP  C4  100 37.5 ABC MOTR ABC-0101  XYZ-0101  1-1-1  15-7-1 15.5       -23.697888 133.879791                        A B AA
SNC2 discovered NL0CP C4  200 37.5 DCE OTR  DCE-0102  CSNO-0101 7-8-1  10-2-2 15.5  15.5 -30.296649 153.113164 -28.864117 153.047084  B B AB
SNC3 discovered L0CP  C74 300 37.5 XYZ MOTR ABC-0101  PTMA-0101 15-7-1 15-7-1 15.5  15.5 -30.296649 153.113164 -31.431357 152.914377  A A AD
SNC4 discovered NL0CP C64 100 37.5 ABC MOTR DCE-0102  BRDE-0102 15-7-1 10-2-2 15.5  15.5 -27.357494 153.022632 -27.471961 153.025407  C C CA
SNC5 discovered L0CP  C44 200 37.5 ABB MOTR CZWX-0201 HABC-0101 10-2-2 1-1-1  15.5  15.5 -33.797823 151.180644 -33.896447 151.193881  D E DZ
What @richgalloway said, but whenever you reference a JSON field containing dots on the right-hand side of an eval, you MUST wrap the field name in single quotes, i.e. the first suggestion should be

eval Error=case(isnotnull('attr.error'), 'attr.error', isnotnull('attr.error.errmsg'), 'attr.error.errmsg')

but for your solution the coalesce() option would make sense; note the use of single quotes there, always on the right-hand side of the eval. This applies not just to JSON field names, but to any field name that contains special characters or starts with a number.
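For completeness, the coalesce() variant mentioned above could be sketched as follows (same single-quote rule on the right-hand side; the field names are the ones from this thread):

```
| eval Error=coalesce('attr.error', 'attr.error.errmsg')
```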
See this: https://docs.splunk.com/Documentation/ES/7.3.1/Admin/Formatassetoridentitylist

So your search will be

index=my_asset_source ... | eval priority="high" | table nt_host priority ... | outputlookup my_asset_definition.csv

You just need to fill in the gaps so you can collect the fields mentioned in the document. Set the priority based on your business rules for how you want to assign it.
Hi @uagraw01, Browsers will not trust your self-signed certificates without additional configuration. In most cases, you'll want to use a certificate signed by a mutually trusted certificate authority. This is not an endorsement of Qualys, but https://www.ssllabs.com/ provides general information on SSL/TLS that you may find beneficial.
Yes, it is possible to upgrade forwarders first.  As you've noted, that is contrary to the published procedures and may not work.  Also, Splunk version 7.3.1 is well outdated so there is no guidance about its compatibility with other versions. This will be an interesting experiment.  Please let us know how it goes.