All Posts



@Ryan.Paredez I have tried the installation again on a new VM and followed all the steps as described. I can see the Custom Metric/Linux Monitor folder under the VM on the AppDynamics dashboard, but under mountedNFSStatus I am not getting any data. Sharing a snapshot below.

I am also getting a NullPointerException in the machine agent logs:

vm==> [Agent-Monitor-Scheduler-1] 13 May 2024 05:56:29,943 INFO MetricWriteHelperFactory-Linux Monitor - The instance of MetricWriteHelperFactory is com.appdynamics.extensions.MetricWriteHelper@e8e0a3b
vm==> [Monitor-Task-Thread3] 13 May 2024 05:56:30,446 ERROR NFSMountMetricsTask-Linux Monitor - Exception occurred collecting NFS I/O metrics
java.lang.NullPointerException: null
    at com.appdynamics.extensions.linux.NFSMountMetricsTask.getMountIOStats(NFSMountMetricsTask.java:173) [?:?]
    at com.appdynamics.extensions.linux.NFSMountMetricsTask.run(NFSMountMetricsTask.java:66) [?:?]
    at com.appdynamics.extensions.executorservice.MonitorThreadPoolExecutor$TaskRunnable.run(MonitorThreadPoolExecutor.java:113) [?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
    at java.lang.Thread.run(Thread.java:829) [?:?]
vm==> [Monitor-Task-Thread1] 13 May 2024 05:56:30,763 INFO LinuxMonitorTask-Linux Monitor - Completed the Linux Monitoring task
This is exactly what I was looking for. But is it possible to incorporate it along with the existing Time Range Picker?
Big thanks to you, @ITWhisperer. The solution works flawlessly, and I'm particularly impressed by the elegant use of the foreach command. It perfectly aligns with our exact requirements. Thanks for the guidance and assistance.
@tscroggins, I hope the information is helpful. Please let me know if you need any additional details.
Attached is sample data from two tables. For each of SNC1 and SNC2, there will be a data point every 15 minutes, and the values can differ. The idea is to build a time series for each SNC over any of the values; filtering will be based mainly on SNC and any of the values (one or more values at the same time).
Report data would be as below:

par1 time b e f g l m n r s
SNC1 12/5/2024 16:30 299367 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.80E-07 13.93 12.91
SNC1 12/5/2024 16:45 299364 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.80E-07 13.95 12.87
SNC1 12/5/2024 17:00 299369 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.90E-07 13.89 12.88
SNC1 12/5/2024 17:15 299364 -7.6 -7.9 -7.7 1.00E-37 1.00E-37 1.80E-07 13.92 12.89
SNC1 12/5/2024 17:30 299368 -7.6 -7.9 -7.7 1.00E-37 1.00E-37 1.90E-07 13.87 12.83
SNC1 12/5/2024 17:45 299362 -7.6 -7.9 -7.7 1.00E-37 1.00E-37 1.90E-07 13.92 12.78
SNC1 12/5/2024 18:00 299368 -7.6 -7.9 -7.7 1.00E-37 1.00E-37 1.80E-07 13.92 12.88
SNC1 12/5/2024 18:15 299371 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.80E-07 13.95 12.88
SNC1 12/5/2024 18:30 299359 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.80E-07 13.94 12.83
SNC1 12/5/2024 18:45 299362 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.80E-07 13.92 12.86
SNC1 12/5/2024 19:00 299369 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.90E-07 13.89 12.85
SNC1 12/5/2024 19:15 299365 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.90E-07 13.92 12.89
SNC1 12/5/2024 19:30 299368 -7.6 -7.9 -7.7 1.00E-37 1.00E-37 1.80E-07 13.9 12.75
SNC1 12/5/2024 19:45 299369 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.90E-07 13.92 12.85
SNC1 12/5/2024 20:00 299363 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.90E-07 13.93 12.89
SNC1 12/5/2024 20:15 299358 -7.7 -7.9 -7.7 1.00E-37 1.00E-37 1.90E-07 13.93 12.85
SNC2 12/5/2024 16:30 259482 -7.6 -6.9 -7.6 9.00E-35 1.00E-34 0.0011 9.58 9.54
SNC2 12/5/2024 16:45 259479 -7.5 -6.9 -7.6 8.00E-35 1.00E-34 0.0011 9.59 9.53
SNC2 12/5/2024 17:00 259478 -7.5 -6.9 -7.6 8.00E-35 1.00E-34 0.0011 9.59 9.56
SNC2 12/5/2024 17:15 259484 -7.5 -6.9 -7.6 5.00E-35 1.00E-34 0.0011 9.61 9.55
SNC2 12/5/2024 17:30 259487 -7.6 -6.9 -7.6 6.00E-35 2.00E-34 0.0011 9.56 9.52
SNC2 12/5/2024 17:45 259480 -7.5 -6.9 -7.6 8.00E-35 1.00E-34 0.0011 9.57 9.53
The reports_metadata file contains data as below (some rows have blank values):

snc_label deployment_state par1 par2 par3 par4 par5 par6 par7 par8 par9 par10 par11 par12 par13 par14 par15 par16 par17 par18 par19
SNC1 discovered L0CP C4 100 37.5 ABC MOTR ABC-0101 XYZ-0101 1-1-1 15-7-1 15.5   -23.697888 133.879791     A B AA
SNC2 discovered NL0CP C4 200 37.5 DCE OTR DCE-0102 CSNO-0101 7-8-1 10-2-2 15.5 15.5 -30.296649 153.113164 -28.864117 153.047084 B B AB
SNC3 discovered L0CP C74 300 37.5 XYZ MOTR ABC-0101 PTMA-0101 15-7-1 15-7-1 15.5 15.5 -30.296649 153.113164 -31.431357 152.914377 A A AD
SNC4 discovered NL0CP C64 100 37.5 ABC MOTR DCE-0102 BRDE-0102 15-7-1 10-2-2 15.5 15.5 -27.357494 153.022632 -27.471961 153.025407 C C CA
SNC5 discovered L0CP C44 200 37.5 ABB MOTR CZWX-0201 HABC-0101 10-2-2 1-1-1 15.5 15.5 -33.797823 151.180644 -33.896447 151.193881 D E DZ
What @richgalloway said, but whenever you reference a JSON field containing dots on the right-hand side of an eval, you MUST wrap the field name in single quotes, i.e. the first suggestion should be

eval Error=case(isnotnull('attr.error'), 'attr.error', isnotnull('attr.error.errmsg'), 'attr.error.errmsg')

For your solution, though, the coalesce() option would make sense. Note the use of single quotes there as well: always quote on the right-hand side of the eval. This applies not just to JSON field names, but to any field name that contains non-simple characters or starts with a number.
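For anyone who wants to sanity-check the null-handling logic outside of Splunk, here is a rough Python model of what case()/coalesce() do with the two candidate fields. The event dicts are made up for illustration; a missing key stands in for a null field.

```python
def coalesce(*values):
    """Return the first value that is not None, mirroring SPL's coalesce()."""
    for v in values:
        if v is not None:
            return v
    return None

# Hypothetical extracted events; absent keys behave like null fields in SPL.
events = [
    {"attr.error": "timeout"},
    {"attr.error.errmsg": "connection refused"},
    {},
]

for event in events:
    # Same precedence as case(isnotnull(...), ...) with attr.error first.
    error = coalesce(event.get("attr.error"), event.get("attr.error.errmsg"))
    print(error)
```

The order of the arguments matters: if both fields exist, attr.error wins, which matches the original case() logic.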
See this: https://docs.splunk.com/Documentation/ES/7.3.1/Admin/Formatassetoridentitylist

So your search will be

index=my_asset_source ...
| eval priority="high"
| table nt_host priority ...
| outputlookup my_asset_definition.csv

You just need to fill in the gaps so you can collect the fields mentioned in the document. Set the priority according to your business rules for how priority should be assigned.
Hi @uagraw01, Browsers will not trust your self-signed certificates without additional configuration. In most cases, you'll want to use a certificate signed by a mutually trusted certificate authority. This is not an endorsement of Qualys, but https://www.ssllabs.com/ provides general information on SSL/TLS that you may find beneficial.
Yes, it is possible to upgrade forwarders first. As you've noted, that is contrary to the published procedures and may not work. Also, Splunk version 7.3.1 is well out of date, so there is no guidance about its compatibility with other versions. This will be an interesting experiment; please let us know how it goes.
Hello, there is an old system where we want to upgrade Splunk to the newest version. First we want to upgrade the forwarders on 3 test servers. The current version of the Splunk universal forwarder is 7.0.3.0, and we want to raise it to 9.2.1. Would that version work for the time being with Splunk Enterprise 7.3.1?

I know it would be better to upgrade Enterprise first, as best practice is to use indexers with versions that are the same as or higher than forwarder versions (but there is hesitation to upgrade the indexers first, as they are also used for data from production). But would it be possible to do the forwarders first?

Edit: The upgrade was successful.
Check for the existence of a field with the isnotnull() function.

eval Error=case(isnotnull(attr.error), 'attr.error', isnotnull(attr.error.errmsg), 'attr.error.errmsg')

Or use the coalesce() function, which does the tests for you and selects the first listed field that is not null.

eval Error=coalesce('attr.error','attr.error.errmsg')
If attr.error exists, then Error should be attr.error. If attr.error does not exist and attr.error.errmsg exists, then Error should be attr.error.errmsg. I have tried the code below; only one case works and the other fails. Please advise.

eval Error=case(NOT attr.error =="*", 'attr.error', NOT attr.error.errmsg =="*", 'attr.error.errmsg')
Hello Giuseppe, Thanks, and will do this Monday.  Best regards, Dennis
Assuming all your "dynamic" fields follow the naming convention, try this

| foreach rw*
    [| eval maxelements=if(isnull(maxelements),mvcount('<<FIELD>>'),if(maxelements<mvcount('<<FIELD>>'),mvcount('<<FIELD>>'),maxelements))]
| eval row=mvrange(0,maxelements)
| mvexpand row
| foreach rw*
    [| eval "<<FIELD>>"=mvindex('<<FIELD>>',row)]
| fields - maxelements row
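For readers who want to see what this pipeline does conceptually, here is a small Python sketch of the same idea: find the longest multivalue field, then emit one row per index, padding shorter fields with None the way mvindex() returns null past the end of a multivalue field. The record contents are invented sample data.

```python
def expand_rows(record, mv_fields):
    """Expand one record with multivalue fields into one row per index."""
    # The longest multivalue field determines the row count (maxelements).
    maxelements = max(len(record[f]) for f in mv_fields)
    rows = []
    for i in range(maxelements):
        # Carry the single-value fields through unchanged, like mvexpand.
        row = {k: v for k, v in record.items() if k not in mv_fields}
        for f in mv_fields:
            values = record[f]
            # mvindex() past the end yields null; model that with None.
            row[f] = values[i] if i < len(values) else None
        rows.append(row)
    return rows

record = {
    "ds_file_path": r"\\swmfs\orca_db_january_2024\topo\raster.ds",
    "rwws01": [0.98, 0.99, 0.56],
    "rwmini01": [5.99, 3.56, 4.78, 9.08],
}
rows = expand_rows(record, ["rwws01", "rwmini01"])
```

Each resulting row pairs the i-th value of every multivalue field, with None filling in where a field ran out of values, which is exactly the single-value layout the question asks for.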
Hi @sanjai, If your original values come from separate events, then a simple table may be all you need: | table ds_file_path rwws01 rwmini01 However, the x-axis is a bit wordy. Can you provide a mock sample of your original data and a drawing of your target visualization?
Hello Splunk Community, I'm encountering challenges while converting multivalue fields to single-value fields for effective visualization in a line chart. Here's the situation. The current output looks like this, with the multivalue fields flattened into single cells:

ds_file_path rwws01 rwmini01
\\swmfs\orca_db_january_2024\topo\raster.ds 0.56 0.98 0.99 5.99 9.04 8.05 5.09 5.66 7.99 8.99

In this output table, the fields rwws01 and rwmini01 are dynamic, so hardcoding them isn't feasible. The current format makes it difficult to visualize the data in a line chart. My required output is:

ds_file_path rwws01 rwmini01
\\swmfs\orca_db_january_2024\topo\raster.ds 0.98 5.99
\\swmfs\orca_db_january_2024\topo\raster.ds 0.99 3.56
\\swmfs\orca_db_january_2024\topo\raster.ds 0.56 4.78
\\swmfs\orca_db_january_2024\topo\raster.ds NULL (or 0) 9.08
\\swmfs\orca_db_january_2024\topo\raster.ds NULL (or 0) 2.98
\\swmfs\orca_db_january_2024\topo\raster.ds NULL (or 0) 5.88

I tried different commands and functions, but nothing gave me the desired output. I'm seeking suggestions on how to achieve this single-value field format, or alternative functions and commands to achieve this output and create a line chart effectively. Your insights and guidance would be greatly appreciated! Thank you.
It's an interesting thought, though the same issue is occurring for me on 9.0.1, but on Server 2022.
The event you have chosen to show does not match "Message=.*", so apiName won't be extracted and your chart will return no results (at least for this event). Also, your lookup appears to use "Client" as a field name, whereas your event appears to use "client"; field names are case-sensitive, so these are two different fields. I hope this helps you resolve your issue.
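A quick way to see why the case mismatch matters: field lookups behave like case-sensitive dictionary keys. Here is a tiny Python illustration with made-up field values:

```python
event = {"client": "acme-app"}   # field name as extracted from the event
lookup_key = "Client"            # field name as defined in the lookup

# "Client" and "client" are different keys, so the lookup finds nothing.
print(event.get(lookup_key))     # no match on "Client"
print(event.get("client"))       # matches
```

Renaming one side so the two field names agree (e.g. with rename or by fixing the lookup header) resolves the mismatch.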