All Posts


There are no defaults for charting.fieldColors.
Hi @gcusello. How are you? I'm the same person who posted this question two years ago. How can I check whether we have an indexer cluster? The same for the search head. About the MC, yes, it is a Monitoring Console. We have a total of 6 servers:
1 server includes Distributed Search, License Manager, Monitoring Console and SHCD (which I don't know exactly what it is)
3 indexer servers
1 Heavy Forwarder
1 Search Head
Regards!
Hello @bowesmana! Thank you for responding quickly, and apologies for the delay. This would work if I didn't have the heatmap to go along with it. To give more information, I will include my original query:

index=Basketball | timechart span=1d count by players limit=100

That is my query. When I add what you typed individually it works, but when I put them together no results appear. I want to use the calculation from what you typed as thresholds in my heat graph. Does that make more sense? For example, let's say Lebron has a total of 100 one week. Once put into the equation, the result would be 7.14. This is the first threshold, and the color would show whether he was below or above it. Another week goes by and this time the total is 300, so the threshold that was 7.14 goes up to 21.43, and the same comparison happens again. Does that make sense? Please let me know. Thank you for your help once again; I hope to hear from you soon!
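As an aside, the arithmetic being described can be sketched in a few lines of Python. Note the divisor of 14 is only an inference from the 100 → 7.14 and 300 → 21.43 examples in the post, so treat it as an assumption:

```python
# Sketch of the dynamic threshold described above.
# Assumption: the divisor 14 is inferred from the 100 -> 7.14
# and 300 -> 21.43 examples; it is not stated in the post.
def weekly_threshold(weekly_total, divisor=14):
    """Derive the heatmap threshold from a player's weekly total."""
    return round(weekly_total / divisor, 2)

print(weekly_threshold(100))  # 7.14
print(weekly_threshold(300))  # 21.43
```

Because the threshold is recomputed from each week's total, it rises and falls with the data rather than staying fixed.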
Thank you, I must have missed this when looking through the documentation. To piggyback: is there a more up-to-date hex code list of the default options?
Please share your current dashboard source (otherwise we have no idea what you might have done wrong!)
Assuming you want all possible combinations of System, Group and Environment for each of the ScanMonth and ScanYear values already present in your results, you could try something like this:

index=sample_index sourcetype=sample_sourcetype AcknowledgedServiceAccount="No" System="ABC"
| eval ScanMonth_Translate=case(
    ScanMonth="1","January", ScanMonth="2","February", ScanMonth="3","March",
    ScanMonth="4","April", ScanMonth="5","May", ScanMonth="6","June",
    ScanMonth="7","July", ScanMonth="8","August", ScanMonth="9","September",
    ScanMonth="10","October", ScanMonth="11","November", ScanMonth="12","December")
| fields ID, System, GSS, RemediationAssignment, Environment, SeverityCode, ScanYear, ScanMonth
| fillnull value="NULL" ID, System, GSS, RemediationAssignment, Environment, SeverityCode, ScanYear, ScanMonth
| foreach System Group Environment ScanMonth, ScanYear, SeverityCode
    [| eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n") ]
| stats count AS Total_Vulnerabilities BY ScanMonth, ScanYear, System, Group, Environment, SeverityCode
| fields System, Group, ScanMonth, ScanYear, Environment, SeverityCode, Total_Vulnerabilities
| stats values(eval(if(SeverityCode="1 CRITICAL",Total_Vulnerabilities, null()))) as "4_CRITICAL"
        values(eval(if(SeverityCode="2 HIGH",Total_Vulnerabilities, null()))) as "3_HIGH"
        values(eval(if(SeverityCode="3 MEDIUM",Total_Vulnerabilities, null()))) AS "2_MEDIUM"
        values(eval(if(SeverityCode="4 LOW",Total_Vulnerabilities, null()))) as "1_LOW"
        sum(Total_Vulnerabilities) AS TOTAL
        by System, Group, ScanMonth, ScanYear, Environment
| fillnull value="0" 4_CRITICAL, 3_HIGH, 2_MEDIUM, 1_LOW
| fields System, Group, Environment, ScanMonth, ScanYear, 4_CRITICAL, 3_HIGH, 2_MEDIUM, 1_LOW, TOTAL
| replace "*PROD*" WITH "1_PROD" IN Environment
| replace "*DR*" WITH "2_DR" IN Environment
| replace "*TEST*" WITH "3_TEST" IN Environment
| replace "*DEV*" WITH "4_DEV" IN Environment
| sort 0 + System, GSS, Environment, ScanMonth, ScanYear
| appendpipe
    [| stats values(System) as System values(Group) as Group values(Environment) as Environment by ScanMonth ScanYear
     | eventstats values(System) as System values(Group) as Group values(Environment) as Environment
     | mvexpand System
     | mvexpand Group
     | mvexpand Environment
     | fillnull value="0" 4_CRITICAL, 3_HIGH, 2_MEDIUM, 1_LOW, TOTAL ]
| stats sum(TOTAL) AS TOTAL sum(1_LOW) AS 1_LOW sum(2_MEDIUM) AS 2_MEDIUM sum(3_HIGH) AS 3_HIGH sum(4_CRITICAL) AS 4_CRITICAL
  by System, Group, ScanMonth, ScanYear, Environment
| sort 0 + System, Group, Environment, ScanMonth, ScanYear
Where is the input defined? You should be able to disable it where it is defined. If you have access to the command line on the machine, run:

splunk btool inputs list --debug | fgrep "<the input name>"

Where the input is defined, you can go to the config file and delete the input, or disable it.
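For reference, disabling rather than deleting is a one-line change in the stanza btool points you to. A minimal sketch, with a placeholder stanza name:

```ini
# inputs.conf in the app's local directory where btool found the input
# (the monitor path below is a placeholder -- use the stanza btool reported)
[monitor:///var/log/example.log]
disabled = 1
```

Disabling keeps the configuration in place so the input can be re-enabled later without recreating it.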
> Are we saying that the following parameter in limits.conf is no longer applied/valid when modified?

Yes, on the UF.
Hi all, I have a situation. Below is my search. The search needs to produce a report for the past 6 months. The goal is to produce zeros for the months with no events. However, the search below is producing results with zeros for the whole year instead of just 6 months. How do I make it do only 6 months? Thank you!

Search:

index=sample_index sourcetype=sample_sourcetype AcknowledgedServiceAccount="No" System="ABC"
| eval ScanMonth_Translate=case(
    ScanMonth="1","January", ScanMonth="2","February", ScanMonth="3","March",
    ScanMonth="4","April", ScanMonth="5","May", ScanMonth="6","June",
    ScanMonth="7","July", ScanMonth="8","August", ScanMonth="9","September",
    ScanMonth="10","October", ScanMonth="11","November", ScanMonth="12","December")
| fields ID, System, GSS, RemediationAssignment, Environment, SeverityCode, ScanYear, ScanMonth
| fillnull value="NULL" ID, System, GSS, RemediationAssignment, Environment, SeverityCode, ScanYear, ScanMonth
| foreach System Group Environment ScanMonth, ScanYear, SeverityCode
    [| eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n")
     | eval <<FIELD>> = split(<<FIELD>>, "\n") ]
| stats count AS Total_Vulnerabilities BY ScanMonth, ScanYear, System, Group, Environment, SeverityCode
| fields System, Group, ScanMonth, ScanYear, Environment, SeverityCode, Total_Vulnerabilities
| stats values(eval(if(SeverityCode="1 CRITICAL",Total_Vulnerabilities, null()))) as "4_CRITICAL"
        values(eval(if(SeverityCode="2 HIGH",Total_Vulnerabilities, null()))) as "3_HIGH"
        values(eval(if(SeverityCode="3 MEDIUM",Total_Vulnerabilities, null()))) AS "2_MEDIUM"
        values(eval(if(SeverityCode="4 LOW",Total_Vulnerabilities, null()))) as "1_LOW"
        sum(Total_Vulnerabilities) AS TOTAL
        by System, Group, ScanMonth, ScanYear, Environment
| fillnull value="0" 4_CRITICAL, 3_HIGH, 2_MEDIUM, 1_LOW
| fields System, Group, Environment, ScanMonth, ScanYear, 4_CRITICAL, 3_HIGH, 2_MEDIUM, 1_LOW, TOTAL
| replace "*PROD*" WITH "1_PROD" IN Environment
| replace "*DR*" WITH "2_DR" IN Environment
| replace "*TEST*" WITH "3_TEST" IN Environment
| replace "*DEV*" WITH "4_DEV" IN Environment
| sort 0 + System, GSS, Environment, ScanMonth, ScanYear
| append
    [| makeresults
     | eval ScanMonth="1,2,3,4,5,6,7,8,9,10,11,12"
     | eval 4_CRITICAL="0"
     | eval 3_HIGH="0"
     | eval 2_MEDIUM="0"
     | eval 1_LOW="0"
     | eval TOTAL="0"
     | makemv delim="," ScanMonth
     | stats count by ScanMonth, 4_CRITICAL, 3_HIGH, 2_MEDIUM, 1_LOW, TOTAL
     | fields - count ]
| fillnull value="0" 4_CRITICAL, 3_HIGH, 2_MEDIUM, 1_LOW, TOTAL
| filldown
| stats sum(TOTAL) AS TOTAL sum(1_LOW) AS 1_LOW sum(2_MEDIUM) AS 2_MEDIUM sum(3_HIGH) AS 3_HIGH sum(4_CRITICAL) AS 4_CRITICAL
  by System, Group, ScanMonth, ScanYear, Environment
| sort 0 + System, Group, Environment, ScanMonth, ScanYear

Output:

System Group ScanMonth ScanYear Environment TOTAL 1_LOW 2_MEDIUM 3_HIGH 4_CRITICAL
A1234 GSS-27 2 2025 3_TEST 216 2 28 155 31
A1234 GSS-27 3 2025 3_TEST 430 4 56 308 62
A1234 GSS-27 1 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 2 2025 4_DEV 222 2 28 161 31
A1234 GSS-27 3 2025 4_DEV 444 4 56 322 62
A1234 GSS-27 4 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 5 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 6 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 7 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 8 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 9 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 10 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 11 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 12 2025 4_DEV 0 0 0 0 0

Desired Output:

System Group ScanMonth ScanYear Environment TOTAL 1_LOW 2_MEDIUM 3_HIGH 4_CRITICAL
A1234 GSS-27 1 2025 3_TEST 0 0 0 0 0
A1234 GSS-27 2 2025 3_TEST 221 3 4 214 0
A1234 GSS-27 3 2025 3_TEST 430 4 56 308 62
A1234 GSS-27 10 2024 3_TEST 0 0 0 0 0
A1234 GSS-27 11 2024 3_TEST 0 0 0 0 0
A1234 GSS-27 12 2024 3_TEST 5 1 2 0 2
A1234 GSS-27 1 2025 4_DEV 10 5 2 2 1
A1234 GSS-27 2 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 3 2025 4_DEV 0 0 0 0 0
A1234 GSS-27 10 2024 4_DEV 12 4 3 2 3
A1234 GSS-27 11 2024 4_DEV 20 10 5 2 3
A1234 GSS-27 12 2024 4_DEV 0 0 0 0 0
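The underlying logic the search needs — always emit exactly the trailing six (year, month) pairs instead of a fixed month list 1..12 — can be sketched in Python (the dates are illustrative):

```python
from datetime import date

def last_n_months(today, n=6):
    """Return (year, month) pairs for the current month and the n-1
    before it, oldest first -- the rows the report should always contain."""
    pairs = []
    y, m = today.year, today.month
    for _ in range(n):
        pairs.append((y, m))
        m -= 1
        if m == 0:
            y, m = y - 1, 12
    return list(reversed(pairs))

print(last_n_months(date(2025, 3, 1)))
# [(2024, 10), (2024, 11), (2024, 12), (2025, 1), (2025, 2), (2025, 3)]
```

In SPL terms, the equivalent fix is to have the appended "zero rows" subsearch generate only these six (ScanYear, ScanMonth) pairs, rather than a hard-coded list of all twelve months.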
Can Splunk read a CSV file located on a remote server using a forwarder and automatically upload it as a lookup? As far as I know there are two options: upload the CSV as a lookup, or read it line by line from the file as a log.
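One common way to combine the two options: monitor the CSV with the forwarder, index it as structured data, and rebuild the lookup with a scheduled search. A minimal sketch — every path, index, sourcetype, and lookup name below is a placeholder:

```ini
# inputs.conf on the forwarder (placeholder path and names)
[monitor:///opt/data/assets.csv]
index = lookup_staging
sourcetype = assets_csv

# props.conf (on the forwarder for structured-data parsing)
[assets_csv]
INDEXED_EXTRACTIONS = csv
```

A scheduled search can then refresh the lookup from the indexed rows, e.g. something like: index=lookup_staging sourcetype=assets_csv | table * | outputlookup assets_lookup.csv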
Just to confirm here. When we say:

"Note: As a side effect of this issue, maxKbps (limits.conf) will also be impacted as it requires thruput metrics to function."

are we saying that the following parameter in limits.conf is no longer applied/valid when modified?

[thruput]
maxKBps

I originally thought this was solely a regression on the thruput maxKBps metric not being displayed in the logs.
I have a requirement to extract a value from an MQTT string before I parse it to JSON. Initially I was using the MQTT Modular Input app to pull each of the topics with their own input. I found that with more than 3 inputs/topics enabled I am dropping some, if not all, data. So I decided to pull all the topics in a single input. This works well, except I still need to be able to separate the topics for searches. I managed to get this working using multiple transforms, then I changed something and now I can't get it to work again.

Using transforms I can parse to JSON with no issues (mqtttojson).

transforms.conf:

[mqtttojson]
REGEX = msg\=(.+)$
FORMAT = $1
DEST_KEY = _raw

[mqtttopic]
CLEAN_KEYS = 0
FORMAT = Topic::"$1"
REGEX = tgw\/data\/0x155f\/(?<Topic>\S*?)\/

props.conf:

[mqtttojson_ubnpfc_all]
DATETIME_CONFIG =
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
TIME_PREFIX = \"ts\":
TZ = Europe/London
category = Custom
pulldown_type = 1
TRANSFORMS-mqtttopic = mqtttojson, mqtttopic

In the example below I need the 4th topic level, i.e. "TransportContextTracking":

Thu Apr 24 12:42:15 GMT 2025 name=mqtt_msg_received event_id= topic=tgw/data/0x155f/TransportContextTracking/MFC/0278494 msg={"data":{"destination":{"locationAddress":"/UrbanUK/PCOTS13/Exit"},"errorCode":null,"event":"Started","loadCarrierId":"0278494","source":{"locationAddress":"/UrbanUK/PCOTS13/Pick"},"transportId":"f0409b2a-e9d4-407c-bd65-48ccea17b520","transportType":"Transport"},"dbid":8104562815,"ts":1745498528217}

What am I missing?
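As a quick sanity check outside Splunk, an equivalent of the [mqtttopic] capture can be run against the sample event — same tgw/data/0x155f/ prefix; [^/]+ is used here in place of the non-greedy \S*?, which captures the same 4th topic level in this case:

```python
import re

# Equivalent of the [mqtttopic] REGEX, applied to the sample raw event above.
# Assumption: [^/]+ stands in for the original non-greedy \S*? capture.
pattern = re.compile(r"topic=tgw/data/0x155f/([^/]+)/")

raw = ("Thu Apr 24 12:42:15 GMT 2025 name=mqtt_msg_received event_id= "
       "topic=tgw/data/0x155f/TransportContextTracking/MFC/0278494 msg={...}")

match = pattern.search(raw)
print(match.group(1))  # TransportContextTracking
```

If the regex captures correctly here, the remaining suspects are the transform ordering and where the props stanza is applied, rather than the pattern itself.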
Thanks for the info.  I had a user wondering why the theme was not changing to dark on the custom app.  
Thank you @livehybrid, it did work after installing a compatible version of urllib3. However, now when I try running the app, I am facing a new issue:

Error: HTTPConnectionPool(host='icia-mesapp1oc.na.pg.com', port=5985): Max retries exceeded with url: /wsman (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f432b936c40>: Failed to establish a new connection: [Errno -2] Name or service not known'))

I have added all the possible compatible modules and dependencies.
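"[Errno -2] Name or service not known" is a DNS resolution failure on the machine running the app, not a urllib3 problem. A quick way to check whether the host in the error resolves from Python (the hostname is taken from the error message above):

```python
import socket

def can_resolve(hostname):
    """Return True if this machine can resolve the hostname to an IP."""
    try:
        socket.gethostbyname(hostname)
        return True
    except socket.gaierror:
        return False

# Hostname taken from the error message; False here confirms a DNS issue.
print(can_resolve("icia-mesapp1oc.na.pg.com"))
```

If this returns False, the fix is on the DNS/hosts side (or the hostname in the app's configuration), not in the Python dependencies.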
Hello, I am trying to install the PHP agent for AppDynamics on an EC2 instance running Amazon Linux 2023. I am getting this error:

[AD Thread Pool-Global0] 24 Apr 2025 13:34:39,645 ERROR com.singularity.CONFIG.ConfigurationChannel - HTTP Request failed: HTTP/1.1 502 badgateway
[AD Thread Pool-Global0] 24 Apr 2025 13:34:39,645 WARN com.singularity.CONFIG.ConfigurationChannel - Could not connect to the controller/invalid response from controller, cannot get initialization information, controller host [appdynamics.dom101.intres], port [8090], exception [null]

Can you help me identify the issue?
Hello all, I have a dashboard that uses a dynamic panel for loading different tables depending on which link is clicked. However, the links are all garbled: the first 4 display, but the others are underneath the dynamic panel. If you press the right arrow after clicking one of the links, you can get to each link, but the display is all messed up. How do I get all of the links onto one single line? Thanks!
Hi, if I understood your requirement right, you can add a button with JS in the table. When the button is clicked, it triggers a Splunk search that updates a lookup where you save the status change of the "solved" field. After the search completes, you re-run the table's search to see the update. You also have to change the table's search so it picks up the last updated value of the "solved" field. This flow can be done using JS:

require([
    'splunkjs/mvc',
    'splunkjs/mvc/searchmanager',
    'splunkjs/mvc/tableview',
    'splunkjs/mvc/simplexml/ready!',
    'jquery'
], function(mvc, SearchManager, TableView, ignored, $) {
    // Define a simple cell renderer with a button
    var ActionButtonRenderer = TableView.BaseCellRenderer.extend({
        canRender: function(cell) {
            return cell.field === 'myfieldwheredispaybutton';
        },
        render: function($td, cell) {
            $td.addClass('button-cell');
            var $btn = $('<button class="btn btn-primary">Execute</button>');
            $btn.on('click', function(e) {
                e.preventDefault();
                e.stopPropagation();
                var rowId = cell.value; // value from the cell (e.g., unique row ID)
                console.log("Button clicked for row:", rowId);
                var searchQuery = `| makeresults | eval row_id=\"${rowId}\", _time=now() | outputlookup append=true custom_lookup.csv`;
                var writeSearch = new SearchManager({
                    id: "writeSearch_" + Math.floor(Math.random() * 100000),
                    search: searchQuery,
                    autostart: true
                });
                writeSearch.on('search:done', function() {
                    console.log("Search completed and lookup updated");
                    var panelSearch = mvc.Components.get('panel_search_id');
                    if (panelSearch) {
                        panelSearch.startSearch();
                        console.log("Panel search restarted");
                    }
                });
            });
            $td.append($btn);
        }
    });

    // Apply the renderer to the specified table
    var tableComponent = mvc.Components.get('generic_table_id');
    if (tableComponent) {
        tableComponent.getVisualization(function(tableView) {
            tableView.table.addCellRenderer(new ActionButtonRenderer());
            tableView.table.render();
        });
    }
});
Hi @JGP  You can use the license API to collect this information if you want to present it or record it elsewhere - check out https://docs.appdynamics.com/appd/24.x/latest/en/extend-splunk-appdynamics/splunk-appdynamics-apis/license-api  Did this answer help you? If so, please consider: Adding karma to show it was useful Marking it as the solution if it resolved your issue Commenting if you need any clarification Your feedback encourages the volunteers in this community to continue contributing
Hi @Mridu27

Use the exact channel name as listed by PowerShell on the server. The names shown in the Event Viewer GUI might not be the programmatic names required by the Splunk forwarder. To find the correct channel name for DSC Operational logs, run this PowerShell command on the Windows server:

Get-WinEvent -ListLog *DSC* | Select-Object LogName

Similarly, for DNS Operational logs:

Get-WinEvent -ListLog *DNS* | Select-Object LogName

Use the LogName value returned by PowerShell in your inputs.conf.

# Example inputs.conf on the Universal Forwarder

# For DSC Operational logs (use the exact name found via PowerShell)
[WinEventLog://Microsoft-Windows-DSC/Operational]
disabled = 0
index = winevents
sourcetype = WinEventLog:Microsoft-Windows-DSC/Operational

# For DNS Server Operational logs (use the exact name found via PowerShell)
[WinEventLog://Microsoft-Windows-DNS-Server/Operational]
disabled = 0
index = winevents
sourcetype = WinEventLog:Microsoft-Windows-DNS-Server/Operational

The Splunk Universal Forwarder requires the precise channel name registered with the Windows Event Log service. Characters like %4 seen in some GUI tools are often display artifacts and not part of the actual channel name. Separators are typically forward slashes (/), not underscores (_), hyphens (-), or spaces. Wildcards (*) are not supported directly within the channel name specification in the stanza header. Ensure the Splunk Universal Forwarder service account has permission to read the specified event log channels. Restart the Splunk Universal Forwarder service after modifying inputs.conf.

See the Splunk docs: Monitor Windows event log data

Did this answer help you? If so, please consider: Adding karma to show it was useful Marking it as the solution if it resolved your issue Commenting if you need any clarification Your feedback encourages the volunteers in this community to continue contributing
@Mridu27  Ensure that the channel name you're using matches exactly what is listed in the Event Viewer. Sometimes, even small discrepancies can cause errors. https://community.splunk.com/t5/Getting-Data-In/Failed-to-find-Event-Log/m-p/363954