All Topics


I have JSON that looks like this:

{
  "Field1" : [
    {
      "id": 1234,
      "name": "John"
    },
    {
      "id": 5678,
      "name": "Mary",
      "occupation": {
        "title": "lawyer",
        "employer": "law firm"
      }
    }
  ]
}

I want to extract the value of the "name" field from the object that contains an "occupation" field (it could be any of them). In this case I want to get "Mary" and store it inside a variable. How would I do this using Splunk search language?
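One possible SPL sketch, assuming each event's _raw is exactly the JSON above; the makeresults/eval lines only fabricate such an event for testing, so on real data start from your base search instead:

| makeresults
| eval _raw="{\"Field1\":[{\"id\":1234,\"name\":\"John\"},{\"id\":5678,\"name\":\"Mary\",\"occupation\":{\"title\":\"lawyer\",\"employer\":\"law firm\"}}]}"
``` split the Field1 array into one result per object ```
| spath path=Field1{} output=item
| mvexpand item
``` re-parse each object, then keep only those that carry an occupation ```
| spath input=item
| where isnotnull('occupation.title')
| table name

The extracted name can then feed a later step, for example captured into a dashboard token or carried along as a field, since SPL has no standalone variables.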
I configured a macro named securemsg(1), and I use this macro in the following search: ....| eval log_info=_raw | 'securemsg(log_info)' | .... When I run this search I get the following error: Error in 'SearchParser': Missing a search command before '''. Error at position '264' of search query 'search index="linuxos" sourcetype="syslog" host="C...{snipped} {errorcontext = fo=_raw | 'securemsg(}'. Please help. Thanks.
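This parser error is typically raised when a macro is invoked with straight single quotes; macros are expanded with backticks. A minimal sketch of a working pair, assuming the macro should expand to a complete eval command (the sha256 masking below is purely illustrative, not the actual securemsg logic):

In macros.conf:

[securemsg(1)]
args = field
definition = eval secure_msg=sha256('$field$')

And the invocation, with backticks:

.... | eval log_info=_raw | `securemsg(log_info)` | ....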
How would I add a permanent search or field to a sourcetype? For example, I have a set of data from which I have been able to pull a field using this search:

sourcetype="collectedevents" | rex field=_raw "<Computer>(?<Computer>[^<]+)</Computer>"

Our sourcetype is "collectedevents", and I found a way to pull the <Computer> element from the XML data into a field named "Computer". What I would like is to make that field permanent, or to change "host=" so that it is not the host of the WEC but the host of the origin server the event came from. Long story short, we have servers we don't want the Splunk forwarder on, because its ability to execute scripts creates a vulnerability on those servers. Any help is appreciated, thank you!
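A sketch of how this is commonly done: promote the rex to a search-time extraction in props.conf, and add an index-time transform that overrides the host. The transform stanza name is a placeholder, and the transform must live on the instance that first parses the data (indexer or heavy forwarder), since host is assigned at index time:

props.conf:

[collectedevents]
EXTRACT-computer = <Computer>(?<Computer>[^<]+)</Computer>
TRANSFORMS-sethost = wec_set_host

transforms.conf:

[wec_set_host]
REGEX = <Computer>([^<]+)</Computer>
DEST_KEY = MetaData:Host
FORMAT = host::$1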
Hello, we have been investigating why roughly 30% of Splunk logs are missing in our production environment. I suspect it may be due to TIME_FORMAT or to the high log volume in production. Can you please let me know what the key-value for TIME_FORMAT in props.conf should be? The lag (lagSec) is 1.5 seconds on the source logs, and the forwarder log sourcetype we are checking shows 1.13 s. Additionally, the source logs use the timestamp format 05/Mar/2024, while the SplunkForwarder logs use 2024-03-05. The throughput limit is 2048 KBps in both the dev and prod config files. We also have ignoreOlderThan=1d, so I'm looking to remove that parameter, fix TIME_FORMAT, and re-check. Can you please help or provide additional information to check?
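For the two formats described, the strptime strings are %d/%b/%Y and %Y-%m-%d respectively. A props.conf sketch, assuming the timestamp sits at the very start of each event (the stanza name is a placeholder for the affected sourcetype):

[your_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %d/%b/%Y
MAX_TIMESTAMP_LOOKAHEAD = 25

Worth noting: ignoreOlderThan=1d makes the forwarder skip any file whose modification time is older than a day, which can by itself explain missing data on slowly rolling files.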
Thanks in advance.
1. I have a JSON object in content.payload{} and need to extract the values inside the payload. Splunk already extracts the field content.payload{}, and the result is "AP Import flow related results : Extract has no AP records to Import into Oracle". But I want to extract all the details inside content.payload. How can I extract them, from the search or from props.conf? I tried spath but couldn't get it to work.
2. How do I rename the wildcard values of content.payload{}* ?

"content" : {
  "jobName" : "AP2",
  "region" : "NA",
  "payload" : [
    {
      "GL Import flow processing results" : [
        {
          "concurBatchId" : "4",
          "batchId" : "6",
          "count" : "50",
          "impConReqId" : "1",
          "errorMessage" : null,
          "filename" : "CONCUR_GL.csv"
        }
      ]
    },
    "AP Import flow related results : Extract has no AP records to Import into Oracle"
  ]
},
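A sketch of one way to unpack the array, assuming the full JSON sits in _raw: each payload entry is exploded into its own result and re-parsed, and the wildcard rename shortens the long nested names (the gl_ prefix is an arbitrary choice):

| spath path=content.payload{} output=payload_item
| mvexpand payload_item
| spath input=payload_item
| rename "GL Import flow processing results{}.*" AS "gl_*"
| table gl_*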
Hi, we are using the ITSI service map / Service Analyzer to monitor services. We have a use case where the same service needs multiple KPIs, and those KPIs depend on different entities. For example: we have an infrastructure KPI whose entity is the host; another KPI, "Service Up", which checks whether the service is up, and whose entity is the process name; and a garbage-collection KPI with yet another entity. Question: I am trying to understand the best way to handle such a scenario, so that we can add all these KPIs without making the service map too complex.
I want to pass a dynamic value from my search results into the email alert subject. I tried $result.fieldname$, but it does not appear in the email alert. Can someone help me? Thanks.
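$result.fieldname$ resolves against the first row of the final search results, so the field has to survive to the end of the search (for example, be listed in a table or stats output). A sketch with a hypothetical disk-error alert; the index and field names are illustrative:

index=os_logs "disk error"
| stats count BY host
| sort - count

with the alert's subject set to: Disk errors on $result.host$ ($result.count$ events)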
Hi all, I set a cron schedule on an alert. My alert should not trigger between 9 PM and 7 AM. I used the cron expression below, but I am still receiving alerts after 9 PM: 0 0-21, 7-23 5-9 3 1-7 Is this cron expression correct? Do I need to make any changes?
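A cron expression has five fields: minute, hour, day of month, month, and day of week, so the expression above also constrains the month and weekday, and its hour list still includes the evening hours. A sketch that runs at the top of every hour from 07:00 through 20:00, every day, and therefore never fires between 21:00 and 07:00:

# minute hour day-of-month month day-of-week
0 7-20 * * *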
Hello everyone, I followed the steps to install DSDL (https://docs.splunk.com/Documentation/DSDL/5.1.1/User/InstallDSDL) and then worked through this scenario: https://www.sidechannel.blog/en/detecting-anomalies-using-machine-learning-on-splunk/ But when I try to start the container, I get a 403 error. I checked roles and capabilities, I checked all kinds of posts from the community, and I checked the global permissions on DSDL. Is there a known issue about this? Have a good day all, Betty
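As a first diagnostic, a sketch that reports which roles and capabilities the current session actually carries (assuming the REST endpoint is searchable from your role; compare the output against the capabilities the DSDL documentation requires):

| rest /services/authentication/current-context splunk_server=local
| table username roles capabilities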
Hi, I'd like to create a detailed report including the type of forwarder, the average KB/s, the OS, the IP, and the Splunk version, but also showing exactly which indexes each forwarder forwards to. Is it possible to reuse the monitoring console's per-forwarder search and somehow connect it to each index?

`dmc_get_forwarder_tcpin` hostname=*
| eval source_uri = hostname.":".sourcePort
| eval dest_uri = host.":".destPort
| eval connection = source_uri."->".dest_uri
| stats values(fwdType) as fwdType, values(sourceIp) as sourceIp, latest(version) as version, values(os) as os, values(arch) as arch, dc(dest_uri) as dest_count, dc(connection) as connection_count, avg(tcp_KBps) as avg_tcp_kbps, avg(tcp_eps) as avg_tcp_eps by hostname, guid
| eval avg_tcp_kbps = round(avg_tcp_kbps, 2)
| eval avg_tcp_eps = round(avg_tcp_eps, 2)
| `dmc_rename_forwarder_type(fwdType)`
| rename hostname as Instance, fwdType as "Forwarder Type", sourceIp as IP, version as "Splunk Version", os as OS, arch as Architecture, guid as GUID, dest_count as "Receiver Count", connection_count as "Connection Count", avg_tcp_kbps as "Average KB/s", avg_tcp_eps as "Average Events/s"

And probably somehow join it with:

| tstats count values(host) AS host WHERE index=* BY index

The issue I see is that it searches `dmc_get_forwarder_tcpin`, which is equal to index=_internal sourcetype=splunkd group=tcpin_connections (connectionType=cooked OR connectionType=cookedSSL) fwdType=* guid=*, and I cannot find the indexes there. How can I connect it to each index?
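A sketch of one way to attach the indexes, under the assumption that a forwarder's hostname matches the indexed host field (often true, but broken by host overrides or intermediate forwarders):

| tstats count WHERE index=* BY index, host
| stats values(index) AS Indexes BY host
| rename host AS Instance
| join type=left Instance
    [ `dmc_get_forwarder_tcpin` hostname=*
      | stats values(fwdType) AS fwdType, latest(version) AS version, values(os) AS os, values(sourceIp) AS IP, avg(tcp_KBps) AS avg_tcp_kbps BY hostname
      | rename hostname AS Instance ]
| table Instance, Indexes, fwdType, version, os, IP, avg_tcp_kbps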
So, I have a chart function that works perfectly!

| chart sum(transactionMade) over USERNUMBER by POSTDATE

But I want my chart to have both USERNUMBER and USERNAME. They are correlated, so that should not be an issue. I also want to add Team Number, which has no correlation to USERNUMBER or USERNAME. Is it possible to have multiple fields after "over"? I can concatenate all the fields into one string, but it would be easier if they were separate columns. Thank you!
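chart accepts only a single over-field; the usual workaround is stats with multiple BY fields, xyseries on a temporary compound key, then splitting the key back into columns. A sketch, with TEAM_NUMBER standing in for whatever the team field is actually called:

| stats sum(transactionMade) AS total BY USERNUMBER USERNAME TEAM_NUMBER POSTDATE
| eval key=USERNUMBER."|".USERNAME."|".TEAM_NUMBER
| xyseries key POSTDATE total
| rex field=key "^(?<USERNUMBER>[^|]+)\|(?<USERNAME>[^|]+)\|(?<TEAM_NUMBER>.+)$"
| fields - key
| table USERNUMBER USERNAME TEAM_NUMBER *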
We are currently migrating our Splunk server to a new one, and during the change there was a mix-up: about 12 hours' worth of data was sent to the old instance, which we would like to transfer to the new instance. My thought was to run a search on the old one and export the results. When I do this in RAW format and then import it to the new one, the data looks good, but the field extractions for WinEventLog are not applied as they should be (even though I use the same event type). How can I solve this? I've also tried exporting as XML, JSON, and CSV, but the data looks worse than with RAW.
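WinEventLog field extractions are applied at search time based on the sourcetype, so re-ingested raw events usually need the original sourcetype restored rather than a generic one. A sketch using the oneshot CLI on the new instance (the file path, sourcetype, index, and credentials are placeholders; use the sourcetype the events carried on the old instance):

splunk add oneshot /tmp/exported_events.raw -sourcetype WinEventLog -index main -auth admin:changeme

It is worth checking timestamp parsing for that sourcetype in props.conf before importing, so the re-indexed events keep their original event times.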
How do I create a dataset inside a data model, and add new fields to that dataset, using the Splunk SDK for Java?
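The Java SDK is mostly read-oriented for data models, so one workable route is POSTing to the datamodel/model REST endpoint through the SDK's Service object; the datasets ("objects") and their fields are all carried inside the "description" JSON. A rough sketch under those assumptions; the model JSON below is abbreviated and illustrative, not a complete schema:

import com.splunk.Service;
import com.splunk.ServiceArgs;
import java.util.HashMap;
import java.util.Map;

public class CreateDataModel {
    public static void main(String[] args) {
        // Connect to splunkd's management port.
        ServiceArgs login = new ServiceArgs();
        login.setHost("localhost");
        login.setPort(8089);
        login.setUsername("admin");
        login.setPassword("changeme");
        Service service = Service.connect(login);

        // Datasets ("objects") and their fields live inside this JSON.
        String modelJson = "{\"modelName\":\"my_model\",\"displayName\":\"My Model\","
                + "\"objects\":[{\"objectName\":\"my_dataset\",\"displayName\":\"My Dataset\","
                + "\"parentName\":\"BaseEvent\","
                + "\"fields\":[{\"fieldName\":\"status\",\"type\":\"string\"}]}]}";

        Map<String, Object> params = new HashMap<>();
        params.put("name", "my_model");
        params.put("description", modelJson);
        service.post("/servicesNS/nobody/search/datamodel/model", params);
    }
}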
We need to monitor an Azure API Management self-hosted gateway and get all the traces. The gateway is an AKS container with the image mcr.microsoft.com/azure-api-management/gateway:v2. Inside is a .NET application:

/app $ dotnet --info
Host:
  Version: 6.0.26
  Architecture: x64
  Commit: dc45e96840
.NET runtimes installed:
  Microsoft.AspNetCore.App 6.0.26 [/usr/share/dotnet/shared/Microsoft.AspNetCore.App]
  Microsoft.NETCore.App 6.0.26 [/usr/share/dotnet/shared/Microsoft.NETCore.App]

We would expect to monitor it the same way as any other .NET application, but we don't catch any BTs or traces. We also inject the following environment variables:

- name: CORECLR_PROFILER
  value: "{57e1aa68-2229-41aa-9931-a6e93bbc64d8}"
- name: CORECLR_ENABLE_PROFILING
  value: "1"
- name: CORECLR_PROFILER_PATH
  value: "/opt/appdynamics-dotnetcore/libappdprofiler.so"
- name: LD_DEBUG
  value: all
- name: LD_LIBRARY_PATH
  value: /opt/appdynamics-dotnetcore/dotnet
- name: IIS_VIRTUAL_APPLICATION_PATH
  value: "/"

Please help.
I'm using Splunk Enterprise v9.0.2.1 with the MQTT Modular Input app installed. When receiving JSON input through the MQTT modular input, I get:

ERROR JsonLineBreaker [1946662 parsing] - JSON StreamId:11645015375736311559 had parsing error:Unexpected character while looking for value: 'W' - data_source="mqtt", data_host="local", data_sourcetype="jklg_json".

The jklg_json props.conf:

DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = (\{\"message\":.*\})
NO_BINARY_CHECK = true
category = Custom
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = 1
SHOULD_LINEMERGE = false

Sample JSON:

{"message":"hi","name":"jklg"}

How do I resolve this issue?
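Two details stand out. First, the text matched by LINE_BREAKER's first capture group is discarded as the event delimiter, so (\{\"message\":.*\}) throws away the JSON payload itself. Second, INDEXED_EXTRACTIONS=json requires every event to be pure JSON; the 'W' in the error suggests non-JSON text is reaching the parser. A props.conf sketch assuming the broker delivers one JSON object per line:

[jklg_json]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
INDEXED_EXTRACTIONS = json
KV_MODE = none
NO_BINARY_CHECK = true

If some events genuinely carry a non-JSON prefix, dropping INDEXED_EXTRACTIONS and setting KV_MODE = json instead is the more forgiving, search-time option.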
Unable to install splunk-sdk and splunklib for Python. Here are the errors I'm getting. Any suggestions?

splunklib:

error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2022\\BuildTools\\VC\\Tools\\MSVC\\14.39.33519\\bin\\HostX86\\x64\\cl.exe' failed with exit code 2
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pycrypto
Running setup.py clean for pycrypto
Failed to build pycrypto
ERROR: Could not build wheels for pycrypto, which is required to install pyproject.toml-based projects

splunk-sdk:

line 18, in <module>
    from splunklib.six.moves import map
ModuleNotFoundError: No module named 'splunklib.six.moves'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed
× Encountered error while generating package metadata.
╰─> See above for output.
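Two observations, offered as a likely path rather than a guaranteed fix: the pycrypto failure comes from the "splunklib" PyPI package (a different, older project than the official SDK) pulling in the long-abandoned pycrypto, and the splunklib.six.moves import points at an old splunk-sdk release, since current versions dropped the six dependency. A sketch of the usual remedy on a recent Python:

python -m pip uninstall splunklib
python -m pip install --upgrade pip
python -m pip install --upgrade splunk-sdk

The official package installs as splunk-sdk but is imported in code as splunklib:

import splunklib.client as client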
walmart_2.xml walmart_3.xml walmart_4.xml

Scenario 1: When using the configuration below in inputs.conf, we are able to monitor the file in Splunk:

[monitor://D:\scada_server\walmart_2.xml]
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
crcSalt = <SOURCE>
CHECK_METHOD = entire_md5

Scenario 2: Hello Splunkers!! I need your help to fix this issue. When using the configuration below in inputs.conf, we are not able to monitor in Splunk:

[monitor://D:\scada_server\walmart_*.xml]
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
crcSalt = <SOURCE>
CHECK_METHOD = entire_md5

Please suggest a workaround.
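A sketch of a common workaround: monitor the parent directory and filter filenames with a whitelist (a regex matched against the full path), instead of putting the wildcard in the stanza path:

[monitor://D:\scada_server]
whitelist = walmart_\d+\.xml$
disabled = false
host = WALVAU-VIDI-1
index = 2313917_2797418_scada
sourcetype = Scada_walmart_alarm
crcSalt = <SOURCE>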
Hi team, we have a search head cluster and an indexer cluster in our current Splunk environment. Previously, data was sent to a single indexer by multiple forwarders, each configured with that indexer's endpoint. Now that it is a multi-indexer architecture, we need a common point for the forwarders to send data to. Please provide suggestions on how to set up the forwarders -> deployment server -> cluster manager architecture. I came across this thread, but I'm confused by the meaning of "deployment client": https://community.splunk.com/t5/Deployment-Architecture/How-to-set-up-new-deployment-server-in-a-clustered-environment/m-p/514847 Thanks in advance!
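For context: a deployment client is any Splunk instance that polls a deployment server for configuration apps, so the forwarders here would be deployment clients. For the "common point", a sketch of an outputs.conf using indexer discovery, where forwarders fetch the current peer list from the cluster manager instead of hard-coding indexers (the host and key below are placeholders):

[indexer_discovery:cm]
master_uri = https://cluster-manager.example.com:8089
pass4SymmKey = <your_key>

[tcpout:cluster_peers]
indexerDiscovery = cm

[tcpout]
defaultGroup = cluster_peers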
When I try to create a manual notable, I get the following error.
Hi all, I have prepared a dropdown using this solution: https://community.splunk.com/t5/Dashboards-Visualizations/Custom-Time-dropdown/m-p/677806#M55517 From that solution I used this query to populate the dropdown:

| makeresults
| addinfo
| eval date=mvrange(info_min_time,info_max_time,"1mon")
| mvexpand date
| sort - date
| eval Month=strftime(date,"%b-%y")
| table Month date

How do I use the selection in a query so that the search covers only the selected month? Also, for some other charts, the query should take data up to (but not including) the selected month; for example, if Jan 24 is selected, the chart should show data through Dec 23. How can I achieve these two requirements? Can anyone help me?
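A sketch of both searches, assuming the dropdown's value is the epoch "date" column and its token is named month_token (both names are assumptions; adjust to your dashboard). A subsearch that returns fields literally named earliest and latest is folded into the outer search as a time-range filter:

Only the selected month:

index=your_index
    [ | makeresults
      | eval earliest=$month_token$, latest=relative_time($month_token$,"+1mon")
      | return earliest latest ]

Everything before the selected month:

index=your_index
    [ | makeresults
      | eval earliest=0, latest=$month_token$
      | return earliest latest ]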