All Posts


| stats list(Date) as Date by File | eval row=mvrange(0,2) | mvexpand row | eval Date=mvindex(Date,row) | eval File=if(isnotnull(Date),File,"missing file") | fields - row
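For anyone landing on this reply without context: it pads each File group out to the two expected entries and labels the gap. Below is a minimal runnable sketch of the same pattern, using makeresults with made-up sample values (the File/Date field names follow the missing-file question further down this page):

| makeresults count=3
| streamstats count as n
| eval File=case(n==1, "TI7L", n==2, "TI7L", n==3, "TI8L")
| eval Date=case(n==1, "03-06-2024 06:52", n==2, "03-06-2024 06:55", n==3, "03-06-2024 11:51")
```TI7L received both files, TI8L only one```
| stats list(Date) as Date by File
| eval row=mvrange(0,2) ```one row per expected file```
| mvexpand row
| eval Date=mvindex(Date,row)
| eval File=if(isnotnull(Date), File, "missing file")
| fields - row

TI8L comes back with one real row plus a "missing file" row, while TI7L keeps both of its entries.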
I would like to extract the results of each test within the logs array by distinct count of serial number. That is, for each serial number I want to extract the test_name and result and plot the pass/fail rates of each individual test. The test names are the same in each event, with only a different serial_number. For the raw JSON below there would be 12 tests to extract the results from. How would I go about this in search?

{"serial_number": "VE7515966", "type": "Test", "result": "Pass", "logs": [{"test_name": "UGC Connect", "result": "Pass"}, {"test_name": "Disable UGC USB Comm Watchdog", "result": "Pass"}, {"test_name": "ACR Data Read", "result": "Pass", "received": "ACR model extracted as Test"}, {"test_name": "Thermocouple in Grill", "tc_1": "295", "tc_2": "578", "req_diff": 50, "result": "Pass"}, {"test_name": "Glow Plug in Grill", "m_ugc": "1849", "tol_lower": 0, "tol_upper": 10000, "cond_ugc": "0", "tol_cond": 0, "result": "Pass"}, {"test_name": "Glow Plug Power Toggle", "result": "Pass", "received": "Glow plug power toggled"}, {"test_name": "Glow Plug in Grill", "m_ugc": "2751", "tol_lower": 0, "tol_upper": 10000, "cond_ugc": "0", "tol_cond": 0, "result": "Pass"}, {"test_name": "Motor", "R_rpm_ugc": 3775.0, "R_rpm": 3950, "R_v_ugc": 984.3333333333334, "R_v": 800, "R_rpm_t": 550, "R_v_t": 500, "R_name": "AUGER 640 R", "F_rpm_ugc": 3816.6666666666665, "F_rpm": 3950, "F_v_ugc": 985.3333333333334, "F_v": 800, "F_rpm_t": 550, "F_v_t": 500, "F_name": "AUGER 640 F"}, {"test_name": "Fan", "ugc_rpm": 2117.0, "rpm": 2700, "rpm_t": 800, "ugc_v": 554.3333333333334, "v": 630, "v_t": 200, "result": "Pass"}, {"test_name": "RS 485", "result": "Pass", "received": "All devices detected", "expected": "Devices detected: ['P', 'F']"}, {"test_name": "Close UGC Port", "result": "Pass"}, {"test_name": "Confirm Grill Size", "result": "Pass", "received": "Grill confirmed as Test"}]}
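One possible sketch, under a few assumptions: the JSON above is the event's _raw, and the index/sourcetype names below are placeholders to replace with your own. The idea is to explode the logs{} array into one row per test, then aggregate pass/fail by test_name:

index=your_index sourcetype=your_sourcetype
| spath path=serial_number
| spath path=logs{} output=log_entry
| mvexpand log_entry
| spath input=log_entry path=test_name output=test_name
| spath input=log_entry path=result output=test_result
| stats count(eval(test_result="Pass")) as pass count(eval(test_result="Fail")) as fail dc(serial_number) as distinct_serials by test_name
| eval pass_rate=round(100 * pass / (pass + fail), 1)

From there pass_rate can be charted per test_name; note the Motor entry in the sample has no result field, so it contributes to neither pass nor fail.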
oooh can doo! Format a column first and create a range map or value match, then take note of that in the code. For instance, the "Users in Group" column:

"Users in Group": {
    "data": "> table | seriesByName(\"Users in Group\") | formatByType(Users_in_GroupColumnFormatEditorConfig)",
    "rowColors": "> table | seriesByName('Users in Group') | pick(Users_in_GroupRowColorsEditorConfig)",
    "rowBackgroundColors": "> table | seriesByName(\"Users in Group\") | rangeValue(Users_in_GroupRowBackgroundColorsEditorConfig)"
}

Then find tableFormat and change rowBackgroundColors to the same expression you used to color the column:

"headerVisibility": "fixed",
"backgroundColor": "transparent",
"tableFormat": {
    "rowBackgroundColors": "> table | seriesByName('Users in Group') | rangeValue(Users_in_GroupRowBackgroundColorsEditorConfig)",
    "headerBackgroundColor": "> backgroundColor | setColorChannel(tableHeaderBackgroundColorConfig)",
    "rowColors": "> rowBackgroundColors | maxContrast(tableRowColorMaxContrast)",
    "headerColor": "> headerBackgroundColor | maxContrast(tableRowColorMaxContrast)"
}

And the whole row is now that color.
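Putting it together, a hedged sketch of how the relevant fragments of the dashboard source might look after the change (the Users_in_GroupRowBackgroundColorsEditorConfig thresholds and hex colors are placeholders made up for illustration, not values from the reply above):

"context": {
    "Users_in_GroupRowBackgroundColorsEditorConfig": [
        {"to": 10, "value": "#D41F1F"},
        {"from": 10, "to": 50, "value": "#F4DF7A"},
        {"from": 50, "value": "#118832"}
    ]
},
"options": {
    "columnFormat": {
        "Users in Group": {
            "rowBackgroundColors": "> table | seriesByName(\"Users in Group\") | rangeValue(Users_in_GroupRowBackgroundColorsEditorConfig)"
        }
    },
    "tableFormat": {
        "rowBackgroundColors": "> table | seriesByName('Users in Group') | rangeValue(Users_in_GroupRowBackgroundColorsEditorConfig)",
        "rowColors": "> rowBackgroundColors | maxContrast(tableRowColorMaxContrast)"
    }
}

The key point from the reply is simply that tableFormat.rowBackgroundColors reuses the exact rangeValue expression from the column, so every cell in the row picks up the matched color.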
Unfortunately, it is not possible to get data collector variable information in an Email Digest. The product doesn't support it.
In the Splunk Synthetics doc, it mentions that you need to use the following syntax to reference a custom variable you defined in earlier steps: {{custom.$variable}}. However, when I tried it, that syntax only seems to work in the "Fill in field" sort of steps. If I have a downstream JavaScript step, and in that JavaScript I want to reference the custom variable I defined in my earlier steps, how do I reference it? I tried the {{custom.$variable}} syntax, and it does NOT work in my JavaScript.
I still think it's a mistake. There is a JS SDK separately downloadable from https://dev.splunk.com/enterprise/downloads
Hello, I am getting the below error when I attempt to execute the process of creating a secret storage in /opt/splunk/bin:

[root@ba-log bin]# splunk rdrand-gen -v -n 1M -m reseed_delay -o /opt/splunk/rdrand.bin
Passphrase that unlocks secret storage:
/usr/bin/env: ‘python’: No such file or directory
/opt/splunk/etc/system/bin/gnome_keyring.py: process returned non-zero status: status=done, exit=127, stime_sec=0.005357, max_rss_kb=13716, vm_minor=216, sched_vol=1, sched_invol=7
Error accessing secret storage.

Permissions of the file:
-rwxr-xr-x. 1 root root 42392 Jan 6 2023 /usr/bin/env

I verified with splunk envvars and the path to /usr/bin is in there. I run the splunk command as root, so why isn't splunk able to read the /usr/bin/env file?
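A hedged diagnostic sketch, not a confirmed fix: exit code 127 normally means "command not found", i.e. /usr/bin/env itself ran fine but could not find an executable literally named python on the PATH, which is common on distributions that only ship python3. The commands below are just one way to check that assumption:

# exit 127 = "command not found"; see whether a bare "python" exists at all
command -v python || echo "no 'python' on PATH"
command -v python3

# look at which interpreter the helper script asks /usr/bin/env to launch
head -1 /opt/splunk/etc/system/bin/gnome_keyring.py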
Thank you for the additional detail, Jananie. This is super helpful. We will be storing 10 years of AppD case data in an internal database accessible by Support, so when an issue comes up, the engineers have the ability to pull from 10 years of data for a quick resolution. I do realize that this doesn't help customers and partners self-serve and do that in-depth research themselves, but we're not losing that history for support engineers. Thanks so much for taking the time to provide feedback; we'll weigh this input, balancing cost, resources, and impact to timelines.
I agree that it makes no sense. I opened an SR, and this is in the reply I got back from support:

"The app Splunk SDK for Python must work for .js as well as .py, please attempt using .js and let us know the outcome."

I thought, surely this can't be right. Especially on Python 3.7. There are packages built to allow for the import of Python libraries into Node.js, but the ones I know are designed for 3.8+.
Splunk isn't one product; it is split all over the place. You have to sign up for each product separately. The homepage signs you up for Splunk Cloud, not Splunk Observability. To fix this, sign up for Observability separately.
@Kim.Frazier, Thanks for asking your question on the community. Since it's been a few days with no reply, did you find a solution or any new discoveries you could share? If you are still looking for help, you can contact Cisco AppDynamics Support. How do I submit a Support ticket? An FAQ 
Hi @Justin.Matthew, Thanks for asking your question on the community. It's been a few days, have you discovered anything worth sharing? If not, you can contact Cisco AppDynamics Support for more help. How do I submit a Support ticket? An FAQ 
There are 3 members in the cluster; it has not updated since I made the change yesterday, even on the instance where I made the change.
Haven't tried yet, but wanted to confirm whether it works for a POC. Thanks.
Just using the local Splunk authentication (username and password), nothing external.
Hi @Anthony.Dahanne, Thanks for asking your question on the community. Since it's been a few days, have you discovered a solution or anything worth sharing? 
Hi @iam_ironman, does it run this way? Ciao. Giuseppe
Hi @Haleb, where are you running the Monitoring Console, on the SH? It's better to run it on the Cluster Manager or (better) on a dedicated server. Anyway, if this is a lab, you have to configure your Monitoring Console so it can access all the systems in your infrastructure as search peers. In other words, go to [Settings > Distributed Search > Add Peer] and add the Cluster Manager as a search peer as well (on port 8089), and you'll see it in the Monitoring Console. I made the same mistake some years ago! Ciao. Giuseppe
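If the UI path is awkward to reach in a lab, the same peer can also be added from the CLI on the Monitoring Console instance; a sketch with placeholder host and credentials (adjust to your environment):

/opt/splunk/bin/splunk add search-server https://<cluster-manager-host>:8089 \
    -auth admin:changeme \
    -remoteUsername <remote-admin-user> \
    -remotePassword <remote-password>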
How to add a dummy row to the table in the Splunk dashboard? We are receiving 2 files every day, 4 times, between 6:00-7:30 AM, 11:00 AM-12:30 PM, 6:00-7:30 PM, and 9:00-10:05 PM. I need output like the one below: if only one file is received, it has to display the other as a missing file. Using the | makeresults command we can create a row, but that is only applicable while calculating the timings.

Input:
File          Date
TI7L          03-06-2024 06:52
TI7L          03-06-2024 06:55
TI8L          03-06-2024 11:51
TI8L          03-06-2024 11:50
TI9L          03-06-2024 19:06
TI9L          03-06-2024 19:10
TI5L          03-06-2024 22:16
TI5L          03-06-2024 22:20

Output:
File          Date
TI7L          03-06-2024 06:52
Missing file  Missing file
TI8L          03-06-2024 11:50
TI9L          03-06-2024 19:06
Missing file
TI5L          03-06-2024 22:16
Missing file
Good day. Regarding the message below about adding the IP to the Server Settings: do the Server Settings sit in Power BI or in Splunk? And where do I find List Management? I have exactly the same error trying to connect to a Splunk Cloud connection from Power BI. Any help would be appreciated - Thanks