
Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Posts

How to Create Dataset inside DataModel and add new Fields in Dataset using Splunk SDK for Java
I'm having the same problem on our DCs. Did you find a solution?
We need to monitor the Azure API Management self-hosted gateway and get all the traces. The gateway is an AKS container with image mcr.microsoft.com/azure-api-management/gateway:v2. Inside is a .NET application:

/app $ dotnet --info
Host:
  Version: 6.0.26
  Architecture: x64
  Commit: dc45e96840
.NET runtimes installed:
  Microsoft.AspNetCore.App 6.0.26 [/usr/share/dotnet/shared/Microsoft.AspNetCore.App]
  Microsoft.NETCore.App 6.0.26 [/usr/share/dotnet/shared/Microsoft.NETCore.App]

We would expect to monitor it the same way as any other .NET application, but we don't capture any business transactions (BTs) or traces. We also inject the following environment variables:

- name: CORECLR_PROFILER
  value: "{57e1aa68-2229-41aa-9931-a6e93bbc64d8}"
- name: CORECLR_ENABLE_PROFILING
  value: "1"
- name: CORECLR_PROFILER_PATH
  value: "/opt/appdynamics-dotnetcore/libappdprofiler.so"
- name: LD_DEBUG
  value: all
- name: LD_LIBRARY_PATH
  value: /opt/appdynamics-dotnetcore/dotnet
- name: IIS_VIRTUAL_APPLICATION_PATH
  value: "/"

Please help.
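One thing worth double-checking (an assumption on my part, not something visible in the post): the variables above wire up the CLR profiler, but no controller connection settings are shown. If those are also absent from the pod spec, the agent has nothing to report to. A sketch of the commonly documented AppDynamics .NET agent variables, with placeholder values that must be replaced:

- name: APPDYNAMICS_CONTROLLER_HOST_NAME
  value: "controller.example.com"
- name: APPDYNAMICS_CONTROLLER_PORT
  value: "443"
- name: APPDYNAMICS_CONTROLLER_SSL_ENABLED
  value: "true"
- name: APPDYNAMICS_AGENT_ACCOUNT_NAME
  value: "my-account"
- name: APPDYNAMICS_AGENT_ACCOUNT_ACCESS_KEY
  value: "my-access-key"
- name: APPDYNAMICS_AGENT_APPLICATION_NAME
  value: "apim-gateway"
- name: APPDYNAMICS_AGENT_TIER_NAME
  value: "gateway"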
Yes, we still observe the vulnerability: the OpenSSL library files are at 1.0.2zi FIPS even with the latest SplunkForwarder 9.2.0.1, as shown below.

# cat /opt/splunkforwarder/etc/splunk.version
VERSION=9.2.0.1
BUILD=d8ae995bf219
PRODUCT=splunk
PLATFORM=Linux-x86_64

Library files:
r-xr-xr-x. 1 splunk splunk  475784 Feb 7 00:48 libssl.so.1.0.0
r-xr-xr-x. 1 splunk splunk 2996816 Feb 7 00:48 libcrypto.so.1.0.0

How can we mitigate this vulnerability?
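If it helps to confirm what the forwarder actually ships, the bundled OpenSSL build can be queried through the forwarder's own binary (a standard Splunk technique; the install path is taken from the post above):

/opt/splunkforwarder/bin/splunk cmd openssl version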
Worked for me. But surely this is something that should not happen: there are no warnings in Splunk, it just goes bang and Splunk is down in production.
@uagraw01 Hello. All .xml files whose names start with walmart_, such as /scada_server/walmart_1.xml, /scada_server/walmart_2.xml, /scada_server/walmart_3.xml, and so forth, are matched by walmart_*.xml. Could you please verify the permissions on every file inside this directory? Also, you can try removing the crcSalt setting. Check the document below for more examples: https://docs.splunk.com/Documentation/Splunk/latest/Data/Specifyinputpathswithwildcards
Thanks for your reply, but I have a few questions since I am new to this.
1. On which server should I add the custom add-on (forwarder or DS)? We have hundreds of forwarders pointing to the indexer right now. Do we need to change all of them?
2. Since you are saying I should remove the already existing files in the $SPLUNK_HOME/etc/system/local folder, what should be the contents of the newly added custom add-on files?
3. Also, the indexer discovery feature needs to be installed on the DS, right?
Hi @uagraw01, sorry, but I don't understand your question. Anyway, why are you using crcSalt=<SOURCE>? Please try this:

[monitor://D:\scada_server\walmart_*.xml]
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
CHECK_METHOD = entire_md5

Also, why are you using such a complex index name?
Ciao.
Giuseppe
I'm using Splunk Enterprise v9.0.2.1 with the MQTT Modular Input app installed. When the MQTT modular input receives JSON, I get:

ERROR JsonLineBreaker [1946662 parsing] - JSON StreamId:11645015375736311559 had parsing error:Unexpected character while looking for value: 'W' - data_source="mqtt", data_host="local", data_sourcetype="jklg_json".

jklg_json props.conf:

DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = (\{\"message\":.*\})
NO_BINARY_CHECK = true
category = Custom
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = 1
SHOULD_LINEMERGE = false

Sample JSON:

{"message":"hi","name":"jklg"}

How can I resolve this issue?
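One thing worth checking (my reading of the config, not a confirmed fix): in props.conf, the first capture group of LINE_BREAKER is the text Splunk discards between events, so a pattern that captures the entire {"message":...} object effectively throws the event away. A sketch of the stanza that breaks on newlines instead and keeps the JSON intact:

[jklg_json]
SHOULD_LINEMERGE = false
# the capture group is the event delimiter that gets discarded
LINE_BREAKER = ([\r\n]+)
INDEXED_EXTRACTIONS = json
KV_MODE = none
NO_BINARY_CHECK = true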
Hi @himaniarora20,
the best approach is to create a custom add-on (called e.g. TA_Forwarders) containing at least three files:
- app.conf: containing the name and description of the add-on;
- deploymentclient.conf: containing the address of the Deployment Server;
- outputs.conf: containing the address of the indexers.
Then you should remove the same conf files from the $SPLUNK_HOME/etc/system/local folder on each server. In this way you can dynamically manage the indexer and Deployment Server addressing from the DS.
For indexers, you could also use the Indexer Discovery feature (https://docs.splunk.com/Documentation/Splunk/9.2.0/Indexer/indexerdiscovery), pointing to the Cluster Manager instead of to the indexers.
An indexer cluster must be managed by the Cluster Manager, not by the DS. A search head cluster must be managed by the SHC Deployer, not by the DS.
Ciao.
Giuseppe
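For question 2 above, a minimal sketch of what those three files might contain; the add-on name comes from this reply, while all hostnames and ports are placeholders to replace with your own:

# TA_Forwarders/default/app.conf
[launcher]
description = Forwarder output and deployment client settings
[ui]
is_visible = false
label = TA_Forwarders

# TA_Forwarders/default/deploymentclient.conf
[deployment-client]
[target-broker:deploymentServer]
targetUri = ds.example.com:8089

# TA_Forwarders/default/outputs.conf
[tcpout]
defaultGroup = primary_indexers
[tcpout:primary_indexers]
server = idx1.example.com:9997, idx2.example.com:9997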
Thank you @isoutamo for the response. Here is a more accurate version of the payload:

[
  {
    "Assigned to": "Jones, Francis",
    "Cost": 3,
    "Created date": "2024-02-28 12:52:18",
    "Extraction date": "2024-03-02 13:51:00",
    "ID": 12345,
    "Initial Cost": 3,
    "Location": "Sites",
    "Path": "Sites\\FY1\\S3",
    "Priority": 1,
    "State": "In Progress",
    "Status Change date": "2024-03-05 16:33:23",
    "Tags": "Europe; Finance",
    "Title": "Ensure correct routing of orders",
    "Updated date": "2024-03-05 16:33:23",
    "Warranty": false,
    "Wave Quarter": "Q2 22",
    "Work Item Type": "Request"
  },
  {
    "Assigned to": "Jones, Francis",
    "Cost": 3,
    "Created date": "2024-02-28 18:59:18",
    "Extraction date": "2024-03-05 16:31:00",
    "ID": 12345,
    "Initial Cost": 3,
    "Location": "Sites",
    "Path": "Sites\\FY1\\S3",
    "Priority": 1,
    "State": "In Progress",
    "Status Change date": "2024-03-05 16:33:23",
    "Tags": "Europe; Finance",
    "Title": "Ensure correct routing of orders",
    "Updated date": "2024-03-05 16:33:23",
    "Warranty": false,
    "Wave Quarter": "Q2 22",
    "Work Item Type": "Request"
  },
  {
    "Assigned to": "Jones, Francis",
    "Cost": 3,
    "Created date": "2023-01-28 18:59:18",
    "Extraction date": "2023-02-05 16:31:00",
    "ID": 12345,
    "Initial Cost": 3,
    "Location": "Sites",
    "Path": "Sites\\FY1\\S3",
    "Priority": 1,
    "State": "In Progress",
    "Status Change date": "2023-02-05 16:33:23",
    "Tags": "Europe; Finance",
    "Title": "Ensure correct routing of orders",
    "Updated date": "2024-03-05 16:33:23",
    "Warranty": false,
    "Wave Quarter": "Q2 22",
    "Work Item Type": "Request"
  }
]
Unable to import splunk-sdk and splunklib in Python. Here are the errors I'm getting while installing them. Any suggestions?

splunklib:

error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2022\\BuildTools\\VC\\Tools\\MSVC\\14.39.33519\\bin\\HostX86\\x64\\cl.exe' failed with exit code 2
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pycrypto
Running setup.py clean for pycrypto
Failed to build pycrypto
ERROR: Could not build wheels for pycrypto, which is required to install pyproject.toml-based projects

splunk-sdk:

line 18, in <module>
    from splunklib.six.moves import map
ModuleNotFoundError: No module named 'splunklib.six.moves'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
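Not a definitive fix, but two observations based on the output above. The pycrypto build failure suggests something is pulling in the long-abandoned pycrypto package, which rarely compiles against current MSVC; the official Splunk SDK for Python is the splunk-sdk package (imported as splunklib) and needs no compiler. And the splunklib.six import error typically means old and new copies of splunklib are mixed in the same environment. A sketch of what I would try first (standard pip commands; the diagnosis itself is my assumption):

pip uninstall -y splunklib splunk-sdk
pip install --upgrade splunk-sdk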
walmart_2.xml
walmart_3.xml
walmart_4.xml

Scenario 1: when using the configuration below in inputs.conf, we are able to monitor the file in Splunk.

[monitor://D:\scada_server\walmart_2.xml]
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
crcSalt = <SOURCE>
CHECK_METHOD = entire_md5

Scenario 2: Hello Splunkers!! I need your help to fix this issue. When using the configuration below in inputs.conf, we are not able to monitor the files in Splunk.

[monitor://D:\scada_server\walmart_*.xml]
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
crcSalt = <SOURCE>
CHECK_METHOD = entire_md5

Please suggest a workaround.
Hi @kannu,
I understand: there are no eventtypes.conf and tags.conf (I don't understand how it was declared CIM compliant!). The only way is to consider the sourcetypes as custom and follow the normalization process using the Add-on Builder or the SA-CIM Validator, as sketched below.
Ciao.
Giuseppe
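A minimal sketch of the two missing files, assuming an authentication-style data source; the stanza name, sourcetype, and search are hypothetical and must be adapted to the add-on's actual events:

# eventtypes.conf
[my_ta_authentication]
search = sourcetype=my:custom:sourcetype action=*

# tags.conf
[eventtype=my_ta_authentication]
authentication = enabled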
In the Enterprise Security app: Configure > Incident Management > New Notable Event
Since this looks like JSON, why not use spath? Try something like this:

| spath serviceUserRequests{} output=serviceUserRequests
| mvexpand serviceUserRequests

Obviously you will have to modify the paths to fit your actual events.
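If the nested fields from each expanded element are needed afterwards, a follow-on spath over the expanded field might look like the sketch below; the field names (requestId, commsHubId) are taken from the sample event posted later in this thread and are otherwise assumptions:

| spath serviceUserRequests{} output=serviceUserRequests
| mvexpand serviceUserRequests
| spath input=serviceUserRequests
| rename "deliveryPoints{}.commsHubId" as CH_ID, requestId as suRequestId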
Sorry for the long answer. I tested your settings and I can say with confidence that there is no difference; events still break unexpectedly.
@ITWhisperer Actually, this is my query:

index=fwdl-meter-batching-agent-logs earliest=-7d@h-5d
| rex field=_raw "\"requestId\"(?<x>[\w\W]+?)\]" max_match=0
| table internalFWDLRequestId x
| mvexpand x
| rex field=x "\"commsHubId\"\s+:\s+(?<CH_ID>\d+)" max_match=0
| rex field=x "^\" : \"(?<suRequestId>.+?)\""
| mvexpand CH_ID
| rename internalFWDLRequestId as requestId
| eval x=requestId."-".CH_ID
| fields x suRequestId
@anooshac There is a 2nd part of this query. In this section, based on the month selection, it will set the earliest and latest times in the "start" and "end" tokens. Use these tokens in the respective searches; basically these tokens hold the earliest and latest times in epoch format for the selected month.
PS: I have to use a separate sub-search because $result.token_name$ only works for one entry.

<search base="basesearch_time">
  <query>
    | where Month="$month_token$"
    | table start end
  </query>
  <done>
    <set token="start">$result.start$</set>
    <set token="end">$result.end$</set>
  </done>
</search>
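For completeness, a minimal sketch of a panel search consuming those tokens; the index and sourcetype names are purely illustrative:

<search>
  <query>index=my_index sourcetype=my_sourcetype | timechart count</query>
  <earliest>$start$</earliest>
  <latest>$end$</latest>
</search>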
Here is a runanywhere example using the sample you posted, showing the extraction working. If the sample data does not match your events sufficiently closely, please post a more accurate representation of your raw events, preferably in a code block </> similar to how I have done.

| makeresults
| fields - _time
| eval _raw="{\"batchId\" : \"63361\", \"internalFWDLRequestId\" : \"70-B3-D5-1F-30-5F-30-00:70-B3-D5-1F-30-00-A0-03:519633036\", \"initialJobId\" : 3860464, \"batchCreationDate\" : 1709203012824, \"batchSubmissionDate\" : 1709293013333, \"allowMultipleRequests\" : true, \"abortedCountForDuplicateRepId\" : 0, \"abortedDuplicatesJobId\" : null, \"image\" : { \"approvedFirmwareVersionId\" : \"00070400\", \"fileName\" : \"00070400\", \"imageByteCount\" : 663191, \"mfcImageThumbprint\" : \"663125_675428228_vQhOAh27O+KHxkpO/Qrq0g==\" }, \"serviceUserRequests\" : [ { \"requestId\" : \"70-B3-D5-1F-30-5F-30-00:70-B3-D5-1F-30-00-A0-03:519633036\", \"requestDate\" : 1709203013315, \"imageCRC\" : 2291340038, \"numberOfCommsHubs\" : 3, \"deliveryPoints\" : [ { \"commsHubId\" : 101388585, \"endpointId\" : \"00-1D-24-02-01-0B-11-8E\" }, { \"commsHubId\" : 101762268, \"endpointId\" : \"00-1D-24-02-01-0A-D0-81\" }, { \"commsHubId\" : 102016271, \"endpointId\" : \"00-1D-24-02-01-0A-CF-75\" } ] } ], \"endpointType\" : 1}"
| rex field=_raw "\"requestId\"\s:\s\"(?<x>[^\"]+)"
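If each event can contain several serviceUserRequests entries, a small extension of the same rex (my addition, not part of the original reply) makes x multivalued so that each requestId lands on its own row:

| rex field=_raw "\"requestId\"\s:\s\"(?<x>[^\"]+)" max_match=0
| mvexpand x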