All Posts


Well, actually, I think so! The logs about the XmlConfig not finding configuration are a red herring: they don't mean anything is wrong with the environment variables. These are the environment variables that ALL need to be set explicitly for the agent to work without an XML config file:
APPDYNAMICS_CONTROLLER_HOST_NAME: <key-value>
APPDYNAMICS_CONTROLLER_SSL_ENABLED: "true"
APPDYNAMICS_CONTROLLER_PORT: "443"
APPDYNAMICS_AGENT_ACCOUNT_ACCESS_KEY: <key-value>
APPDYNAMICS_AGENT_ACCOUNT_NAME: <key-value>
APPDYNAMICS_AGENT_TIER_NAME: <key-value>
APPDYNAMICS_AGENT_NODE_NAME: <key-value>
APPDYNAMICS_AGENT_APPLICATION_NAME: <key-value>
I forgot to add APPDYNAMICS_AGENT_ACCOUNT_NAME and there were no messages in the logs pointing me in the right direction. I would suggest the docs insist on all of these values being set, or the agent won't connect. Thanks!
I have downloaded the Splunk 9.2 RPM and installed it on RHEL 9.2. When I run splunk enable boot-start it throws the error below.
[root@splunk~]# splunk enable boot-start
execve: No such file or directory while running command /sbin/chkconfig
[root@splunk~]#
Can someone help me with this?
| spath logs{} output=logs | spath serial_number | mvexpand logs | spath input=logs
| stats list(Date) as Date by File
| eval row=mvrange(0,2)
| mvexpand row
| eval Date=mvindex(Date,row)
| eval File=if(isnotnull(Date),File,"missing file")
| fields - row
I would like to extract the results of each test within the logs array by distinct count of serial number. That is, for each serial number I want to extract the test_name and result and plot the pass/fail rates of each individual test. The test names are the same in each event, with only a different serial_number. For the raw JSON below there would be 12 tests to extract results from. How would I go about this in search?
{"serial_number": "VE7515966", "type": "Test", "result": "Pass", "logs": [{"test_name": "UGC Connect", "result": "Pass"}, {"test_name": "Disable UGC USB Comm Watchdog", "result": "Pass"}, {"test_name": "ACR Data Read", "result": "Pass", "received": "ACR model extracted as Test"}, {"test_name": "Thermocouple in Grill", "tc_1": "295", "tc_2": "578", "req_diff": 50, "result": "Pass"}, {"test_name": "Glow Plug in Grill", "m_ugc": "1849", "tol_lower": 0, "tol_upper": 10000, "cond_ugc": "0", "tol_cond": 0, "result": "Pass"}, {"test_name": "Glow Plug Power Toggle", "result": "Pass", "received": "Glow plug power toggled"}, {"test_name": "Glow Plug in Grill", "m_ugc": "2751", "tol_lower": 0, "tol_upper": 10000, "cond_ugc": "0", "tol_cond": 0, "result": "Pass"}, {"test_name": "Motor", "R_rpm_ugc": 3775.0, "R_rpm": 3950, "R_v_ugc": 984.3333333333334, "R_v": 800, "R_rpm_t": 550, "R_v_t": 500, "R_name": "AUGER 640 R", "F_rpm_ugc": 3816.6666666666665, "F_rpm": 3950, "F_v_ugc": 985.3333333333334, "F_v": 800, "F_rpm_t": 550, "F_v_t": 500, "F_name": "AUGER 640 F"}, {"test_name": "Fan", "ugc_rpm": 2117.0, "rpm": 2700, "rpm_t": 800, "ugc_v": 554.3333333333334, "v": 630, "v_t": 200, "result": "Pass"}, {"test_name": "RS 485", "result": "Pass", "received": "All devices detected", "expected": "Devices detected: ['P', 'F']"}, {"test_name": "Close UGC Port", "result": "Pass"}, {"test_name": "Confirm Grill Size", "result": "Pass", "received": "Grill confirmed as Test"}]}
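Here is a minimal sketch of one way to do this, building on the spath/mvexpand approach shown above. The index, sourcetype, and the test_result field name are placeholders I've introduced, so adjust them to your data:
index=your_index sourcetype=your_sourcetype
| spath serial_number
| spath path=logs{} output=log
| mvexpand log
| spath input=log path=test_name output=test_name
| spath input=log path=result output=test_result
| stats count(eval(test_result="Pass")) as passed count(eval(test_result="Fail")) as failed dc(serial_number) as units by test_name
| eval pass_rate=round(100*passed/(passed+failed),1)
Extracting the inner result into a separate test_result field avoids clobbering the event's top-level result field; entries without a result (like the Motor test in the sample) simply won't be counted.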
Oooh, can do! Format a column first with a range map or value match, then take note of that in the code, for instance the "Users in Group" column:
"Users in Group": {
"data": "> table | seriesByName(\"Users in Group\") | formatByType(Users_in_GroupColumnFormatEditorConfig)",
"rowColors": "> table | seriesByName('Users in Group') | pick(Users_in_GroupRowColorsEditorConfig)",
"rowBackgroundColors": "> table | seriesByName(\"Users in Group\") | rangeValue(Users_in_GroupRowBackgroundColorsEditorConfig)"
}
Then find tableFormat and change rowBackgroundColors to the same selector you used to color the column:
"headerVisibility": "fixed",
"backgroundColor": "transparent",
"tableFormat": {
    "rowBackgroundColors": "> table | seriesByName('Users in Group') | rangeValue(Users_in_GroupRowBackgroundColorsEditorConfig)",
    "headerBackgroundColor": "> backgroundColor | setColorChannel(tableHeaderBackgroundColorConfig)",
    "rowColors": "> rowBackgroundColors | maxContrast(tableRowColorMaxContrast)",
    "headerColor": "> headerBackgroundColor | maxContrast(tableRowColorMaxContrast)"
}
And the whole row is now that color.
Unfortunately, it is not possible to get data collector variable information in an Email Digest. The product doesn't support it.
The Splunk Synthetics doc mentions that you need to use the following syntax to reference a custom variable you defined in earlier steps: {{custom.$variable}}. However, when I tried it, that syntax only seems to work when you reference the variable in "Fill in Field" type steps. If I have a downstream JavaScript step and want to reference the custom variable I defined in earlier steps from that JavaScript, how do I reference it? I tried the {{custom.$variable}} syntax, and it does NOT work in my JavaScript.
I still think it's a mistake. There is a JS SDK separately downloadable from https://dev.splunk.com/enterprise/downloads
Hello, I am getting the error below when I attempt to create a secret storage in /opt/splunk/bin:
[root@ba-log bin]# splunk rdrand-gen -v -n 1M -m reseed_delay -o /opt/splunk/rdrand.bin
Passphrase that unlocks secret storage:
/usr/bin/env: ‘python’: No such file or directory
/opt/splunk/etc/system/bin/gnome_keyring.py: process returned non-zero status: status=done, exit=127, stime_sec=0.005357, max_rss_kb=13716, vm_minor=216, sched_vol=1, sched_invol=7
Error accessing secret storage.
Permissions of the file:
-rwxr-xr-x. 1 root root 42392 Jan 6 2023 /usr/bin/env
I verified with splunk envvars and the path to /usr/bin is in there. I run the splunk command as the root user, so why isn't Splunk able to read the /usr/bin/env file?
Thank you for the additional detail, Jananie. This is super helpful. We will be storing 10 years of AppD case data in an internal database accessible by Support, so when an issue comes up the engineers have the ability to pull on 10 years of data for a quick resolution. I do realize that this doesn't help customers and partners self-serve and do that in-depth research themselves, but we're not losing that history for support engineers. Thanks so much for taking the time to provide feedback; we'll weigh this input, balancing cost, resources, and impact to timelines.
I agree that it makes no sense. I opened an SR and this is the reply I got back from support: "The app Splunk SDK for Python must work for .js as well as .py, please attempt using .js and let us know the outcome." I thought, surely this can't be right, especially on Python 3.7. There are packages built to allow importing Python libraries into Node.js, but the ones I know of are designed for 3.8+.
Splunk isn't one product; it is split all over the place, and you have to sign up for each product separately. The homepage signs you up for Splunk Cloud, not Splunk Observability. To fix this, sign up for Observability separately.
@Kim.Frazier, Thanks for asking your question on the community. Since it's been a few days with no reply, did you find a solution or any new discoveries you could share? If you are still looking for help, you can contact Cisco AppDynamics Support. How do I submit a Support ticket? An FAQ 
Hi @Justin.Matthew, Thanks for asking your question on the community. It's been a few days, have you discovered anything worth sharing? If not, you can contact Cisco AppDynamics Support for more help. How do I submit a Support ticket? An FAQ 
There are 3 members in the cluster, and it has not updated since I made the change yesterday, even on the instance I made the change on.
Haven't tried yet, but wanted to confirm if it works for POC. Thanks.
Just using the local Splunk authentication (username and password), nothing external.
Hi @Anthony.Dahanne, Thanks for asking your question on the community. Since it's been a few days, have you discovered a solution or anything worth sharing? 
Hi @iam_ironman, does it work this way? Ciao. Giuseppe