All Posts

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


What's the best way to start Splunk? As the root user or as the splunk user?
Hello Team, I have configured a Splunk forwarder and I am getting the error below:

WARN TcpOutputProc [8204 parsing] - The TCP output processor has paused the data flow. Forwarding to host_dest=WALVAU-VIDI-1 inside output group default-autolb-group from host_src=WALVAU-MCP-APP- has been blocked for blocked_seconds=400. This can stall the data flow towards indexing and other network outputs. Review the receiving system's health in the Splunk Monitoring Console. It is probably not accepting data.

Task: I want to send data from the Splunk forwarder to the Splunk Enterprise server (indexer).

1. I opened outbound port 9997 on the UF.
2. I opened inbound port 9997 on the indexer.

outputs.conf on the UF:

[tcpout]
defaultGroup = default-autolb-group

[tcpout:default-autolb-group]
server = WALVAU-VIDI-1:9997

[tcpout-server://WALVAU-VIDI-1:9997]

inputs.conf on the UF:

[monitor://D:\BEXT\Walmart_VAU_ACP\Log\BPI*.log]
disabled = false
index = walmart_vau_acp
sourcetype = Walmart_VAU_ACP

Please help me fix the issue so that the forwarder will send data to the indexer.
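That warning typically means the receiver is not accepting data on port 9997. A minimal sketch of the receiving-side configuration, assuming receiving has not yet been enabled on the indexer (WALVAU-VIDI-1):

```
# inputs.conf on the indexer (or enable via Settings > Forwarding and receiving)
[splunktcp://9997]
disabled = 0
```

After adding the stanza, restart splunkd on the indexer and confirm the port is actually listening (e.g. netstat -an | grep 9997) before retesting the forwarder.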
Hi @Somesh, good for you, see you next time! Let me know if I can help you more, or, please, accept one answer for the other members of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Thanks for the update @gcusello. It worked by running the command "splunk enable boot-start -systemd-managed 1". Also, what is the best practice to start Splunk? Should it be started by the "root" user or the "splunk" user?
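For reference, a common pattern is to install the boot-start unit as root once and let systemd run splunkd as the unprivileged splunk user. A sketch, assuming Splunk is installed under /opt/splunk and a splunk OS user and group already exist:

```
# Run as root (one time). The -user flag tells the generated systemd unit
# to start splunkd as the "splunk" account rather than root.
/opt/splunk/bin/splunk enable boot-start -user splunk -systemd-managed 1

# Make sure the splunk user owns the installation directory
chown -R splunk:splunk /opt/splunk
```

Running splunkd day-to-day as a non-root user is the generally recommended practice; root is only needed for the one-time unit installation.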
Hi @orendado, Usually different types of logs are categorized using sourcetype. The parsing rules and field extractions are usually tied to the sourcetype. Are you using different sourcetypes? If you want to add other data sources, you can create your own sourcetypes, possibly starting from an existing one. The Add Data function is very useful for finding the correct sourcetype to associate with your data sources. Ciao. Giuseppe
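As an illustration, a minimal custom sourcetype definition in props.conf might look like the sketch below; the stanza name and settings are hypothetical and would need to match your actual data:

```
# props.conf -- hypothetical sourcetype for JSON-lines application logs
[acme:app:json]
INDEXED_EXTRACTIONS = json
SHOULD_LINEMERGE = false
TRUNCATE = 10000
```

You would then reference the sourcetype from the corresponding inputs.conf monitor stanza.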
Hi, Let's say I'm ingesting different types of log files in different formats (some are txt, csv, json, xml...) to the same index. How can I add additional data to each datasource/log? I would like to add some extra fields in JSON format, for example: customer name, system name...
Thanks again for the info and clarification   ewholz
Hi @iam_ironman , I usually use this configuration. Ciao. Giuseppe
Hi @Somesh, please try some of these (starting from the first URL):

https://www.aidanwardman.com/enabling-boot-start-in-splunk-on-rhel-9-rocky-9/
https://docs.splunk.com/Documentation/Splunk/9.2.1/Admin/ConfigureSplunktostartatboottime
https://docs.splunk.com/Documentation/Splunk/9.2.1/Admin/RunSplunkassystemdservice

Ciao. Giuseppe
As @ITWhisperer shows, the key is mvexpand. Meanwhile, I believe that you already have serial_number, type, the outer result, and so on, so you don't need to extract them. In addition, you have a whole bunch of logs{}.* fields (including logs{}.result) that you no longer need. You can simplify the search to

| fields - logs{}.*
| spath logs{} output=logs
| mvexpand logs
| spath input=logs

One more thing: is the outer result field important? If it is, you want to rename the outer result field first, because the inner extraction will also produce a field named result.
From what *I* have seen, the machineTypesFilter seems to be at the root of this bug. This is the absolute *WORST* update that I've seen in the 6 years that I've been working with Splunk.

I did read something indicating that the (white|black)list.X settings can also take OS strings, but the docs call it "platform dependent", so I am putting it off until we can actually SEE what's being deployed again.

I have noticed yet *ANOTHER* bug... After a Deployment Server has been running for a bit, ANY call that queries DS client information will TIME OUT. I have multiple scripts that read data from /services/deployment/server/clients, and I've bumped the timeouts to 30 seconds, and it still times out. It used to take < 2s to pull data from THOUSANDS of clients.
Well, actually, I think so! The logs about XmlConfig not finding configuration are a red herring: they don't mean anything is wrong with the env variables. These are the required env variables that ALL need to be set explicitly to make it work without an xml config file:

APPDYNAMICS_CONTROLLER_HOST_NAME: <key-value>
APPDYNAMICS_CONTROLLER_SSL_ENABLED: "true"
APPDYNAMICS_CONTROLLER_PORT: "443"
APPDYNAMICS_AGENT_ACCOUNT_ACCESS_KEY: <key-value>
APPDYNAMICS_AGENT_ACCOUNT_NAME: <key-value>
APPDYNAMICS_AGENT_TIER_NAME: <key-value>
APPDYNAMICS_AGENT_NODE_NAME: <key-value>
APPDYNAMICS_AGENT_APPLICATION_NAME: <key-value>

I forgot to add APPDYNAMICS_AGENT_ACCOUNT_NAME and there were no messages in the logs pointing me in the right direction. I would suggest the doc insist on all of those values being set, or the agent won't connect. Thanks!
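For example, in a Kubernetes deployment these could be set on the container spec. A sketch with placeholder values (the controller host, account name, access key, and the tier/node/application names are all assumptions you would replace):

```
env:
  - name: APPDYNAMICS_CONTROLLER_HOST_NAME
    value: "controller.example.com"
  - name: APPDYNAMICS_CONTROLLER_SSL_ENABLED
    value: "true"
  - name: APPDYNAMICS_CONTROLLER_PORT
    value: "443"
  - name: APPDYNAMICS_AGENT_ACCOUNT_ACCESS_KEY
    value: "your-access-key"
  - name: APPDYNAMICS_AGENT_ACCOUNT_NAME
    value: "your-account"
  - name: APPDYNAMICS_AGENT_TIER_NAME
    value: "web-tier"
  - name: APPDYNAMICS_AGENT_NODE_NAME
    value: "node-1"
  - name: APPDYNAMICS_AGENT_APPLICATION_NAME
    value: "my-app"
```

In practice the access key would normally come from a Secret rather than a literal value.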
I have downloaded the Splunk 9.2 rpm file and installed it on RHEL 9.2. When I run splunk enable boot-start it throws the error below.

[root@splunk~]# splunk enable boot-start
execve: No such file or directory while running command /sbin/chkconfig
[root@splunk~]#

Can someone help me with this?
| spath logs{} output=logs | spath serial_number | mvexpand logs | spath input=logs
| stats list(Date) as Date by File | eval row=mvrange(0,2) | mvexpand row | eval Date=mvindex(Date,row) | eval File=if(isnotnull(Date),File,"missing file") | fields - row
I would like to extract the results of each test within the logs array by distinct count of serial number. That is, for each serial number I want to extract the test_name and result and plot the pass/fail rates of each individual test. The test names are the same in each event, with only a different serial_number. For the raw JSON below there would be 12 tests to extract the results from. How would I go about this in search?

{"serial_number": "VE7515966", "type": "Test", "result": "Pass", "logs": [{"test_name": "UGC Connect", "result": "Pass"}, {"test_name": "Disable UGC USB Comm Watchdog", "result": "Pass"}, {"test_name": "ACR Data Read", "result": "Pass", "received": "ACR model extracted as Test"}, {"test_name": "Thermocouple in Grill", "tc_1": "295", "tc_2": "578", "req_diff": 50, "result": "Pass"}, {"test_name": "Glow Plug in Grill", "m_ugc": "1849", "tol_lower": 0, "tol_upper": 10000, "cond_ugc": "0", "tol_cond": 0, "result": "Pass"}, {"test_name": "Glow Plug Power Toggle", "result": "Pass", "received": "Glow plug power toggled"}, {"test_name": "Glow Plug in Grill", "m_ugc": "2751", "tol_lower": 0, "tol_upper": 10000, "cond_ugc": "0", "tol_cond": 0, "result": "Pass"}, {"test_name": "Motor", "R_rpm_ugc": 3775.0, "R_rpm": 3950, "R_v_ugc": 984.3333333333334, "R_v": 800, "R_rpm_t": 550, "R_v_t": 500, "R_name": "AUGER 640 R", "F_rpm_ugc": 3816.6666666666665, "F_rpm": 3950, "F_v_ugc": 985.3333333333334, "F_v": 800, "F_rpm_t": 550, "F_v_t": 500, "F_name": "AUGER 640 F"}, {"test_name": "Fan", "ugc_rpm": 2117.0, "rpm": 2700, "rpm_t": 800, "ugc_v": 554.3333333333334, "v": 630, "v_t": 200, "result": "Pass"}, {"test_name": "RS 485", "result": "Pass", "received": "All devices detected", "expected": "Devices detected: ['P', 'F']"}, {"test_name": "Close UGC Port", "result": "Pass"}, {"test_name": "Confirm Grill Size", "result": "Pass", "received": "Grill confirmed as Test"}]}
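Building on the spath/mvexpand pattern suggested earlier in the thread, a sketch of a pass/fail-rate search (the index and sourcetype names are placeholders, and the outer result is renamed first so the inner one can be extracted cleanly):

```
index=your_index sourcetype=your_sourcetype
| fields - logs{}.*
| rename result as overall_result
| spath logs{} output=logs
| mvexpand logs
| spath input=logs
| stats count(eval(result=="Pass")) as passed, count as total by test_name
| eval pass_rate=round(passed/total*100, 1)
```

Note that some inner objects (e.g. the "Motor" test in the sample event) have no result field, so they contribute to total but not to passed.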
oooh can do! Format a column first, and make a range map or value match, then take note of that in the code, for instance the "Users in Group" column:

"Users in Group": {
  "data": "> table | seriesByName(\"Users in Group\") | formatByType(Users_in_GroupColumnFormatEditorConfig)",
  "rowColors": "> table | seriesByName('Users in Group') | pick(Users_in_GroupRowColorsEditorConfig)",
  "rowBackgroundColors": "> table | seriesByName(\"Users in Group\") | rangeValue(Users_in_GroupRowBackgroundColorsEditorConfig)"
}

Then find tableFormat and change rowBackgroundColors to the same as you used to color the column:

"headerVisibility": "fixed",
"backgroundColor": "transparent",
"tableFormat": {
  "rowBackgroundColors": "> table | seriesByName('Users in Group') | rangeValue(Users_in_GroupRowBackgroundColorsEditorConfig)",
  "headerBackgroundColor": "> backgroundColor | setColorChannel(tableHeaderBackgroundColorConfig)",
  "rowColors": "> rowBackgroundColors | maxContrast(tableRowColorMaxContrast)",
  "headerColor": "> headerBackgroundColor | maxContrast(tableRowColorMaxContrast)"
}

And the whole row is now that color.
Unfortunately, it is not possible to get data collector variable information in an Email Digest. The product doesn't support it.
In the Splunk Synthetics docs, it is mentioned that you need to use the following syntax to reference a custom variable you defined in earlier steps:

{{custom.$variable}}

However, when I tried, it seems that syntax only works when you reference it in "Fill in field" type steps. If I have a downstream JavaScript step, and in that JavaScript I want to reference the custom variable I defined in my earlier steps, how do I reference it? I tried the {{custom.$variable}} syntax and it does NOT work in my JavaScript.
I still think it's a mistake. There is a JS SDK separately downloadable from https://dev.splunk.com/enterprise/downloads