All Topics

There is probably a simple solution to this, but unfortunately I was not able to find the answer in the documentation, nor by searching the community. I am ingesting events into Splunk with a certain JSON structure, e.g.:

[
  { "foo": { "k1": 1, "k2": 2 }, "bar": { "m1": 5, "m2": 6 }, "string1": "hi", "string2": "bye" },
  { "foo": { "k1": 11, "k2": 22 }, "bar": { "m1": 55, "m2": 66 }, "string1": "hi2", "string2": "bye2" },
  ... and so on ...
]

I can search these events in Splunk quite nicely, e.g. with:

| where 'foo.k1' > 10

Now, when searching through the REST API, I can specify which fields I would like to get, e.g. with:

| fields string1, foo
| fields - _*

The problem I am having is as follows: when specifying the field "foo" (which holds a map, or some other complex structure) in the naive way above, I do not get any of its contents in my search result. The results are perfectly visible in the event view of the Splunk Web UI, just not in the REST API response. When using | fields foo*, I get an expanded, flattened result:

{ "foo.k1": 1, "foo.k2": 2 }

I also tried spath, as in:

| spath output=myfoo path=foo
| fields myfoo
| fields - _*

which gives me a string that contains JSON:

{"myfoo": "{\"k1\": 1,\"k2\": 2}"}

All of the above are sub-optimal. I would like a search result that is pure JSON and preserves the structure of the "foo" field, so that I would get:

{ ..., "foo": { "k1": 1, "k2": 2 }, ... }

Or in other words: I would like to pass some of the event content through to the result as-is, so that I get a nice hierarchical data structure when parsing the JSON search result. Thanks a lot for your valuable advice!
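For completeness, here is a self-contained reproduction that does not need my data (the makeresults payload is a stand-in for one of my events; I am assuming the search-time extractions behave the same as for indexed JSON):

| makeresults
| eval _raw="{\"foo\": {\"k1\": 1, \"k2\": 2}, \"string1\": \"hi\"}"
| spath
| spath output=myfoo path=foo
| table string1, foo.k1, foo.k2, myfoo

Exporting this over the REST API shows the flattened foo.k1/foo.k2 fields and the stringified myfoo, but never a nested "foo" object.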
Hi all,

We are running the latest version of URL Toolbox (at the time of writing, 1.9.1, released in December 2021) on Splunk 8.2.3 with Splunk ES 6.6.2. After the upgrade, we have noticed that the mozilla list is no longer working properly. To test it:

| makeresults
| eval domain="http://www.example.com/123/123.php", list="mozilla"
| `ut_parse_extended(domain,list)`

gives:

domain = http://www.example.com/123/123.php
list = mozilla
ut_domain = None
ut_domain_without_tld = None
ut_fragment = None
ut_netloc = www.example.com
ut_params = None
ut_path = /123/123.php
ut_port = 80
ut_query = None
ut_scheme = http
ut_subdomain = None
ut_subdomain_count = 0
ut_tld = None

With iana there are no problems at all (even if the parsing is a bit different, and mozilla would be the ideal list for our use cases):

| makeresults
| eval domain="http://www.example.com/123/123.php", list="iana"
| `ut_parse_extended(domain,list)`

domain = http://www.example.com/123/123.php
list = iana
ut_domain = example.com
ut_domain_without_tld = example
ut_fragment = None
ut_netloc = www.example.com
ut_params = None
ut_path = /123/123.php
ut_port = 80
ut_query = None
ut_scheme = http
ut_subdomain = www
ut_subdomain_count = 1
ut_subdomain_level_1 = www
ut_tld = com

Is anyone having the same issue and/or a fix that we might apply? Thank you and cheers!
I am working on a Splunk deployment with a cluster of search heads spanning two physical sites. At Site1 there is actually only one search head; at Site2 there are two. The load balancer managing Splunk Web access tends to favor Site1, so almost all users end up landing on the sole search head at Site1 when they access our Splunk Web URL.

What I have noticed, however, is that the search head at Site1 also seems to run almost all of the scheduled searches. And because it is favored by the load balancer for user access, users log into that search head and it takes on most of the ad-hoc searches as well.

I know that this setup is far from optimal, and, as the story goes, it is a mess I inherited recently when taking over Splunk responsibilities. More search heads are actually needed at both physical sites, but in the meantime I am trying to understand why the cluster captain is not distributing the saved searches, alerts, reports, etc. more evenly. Why do they all seem to execute from the sole cluster member at Site1?
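For anyone checking the same thing, this is how I have been confirming where the scheduled searches actually run (just the standard scheduler logs, assuming _internal is searchable across the cluster):

index=_internal sourcetype=scheduler
| stats count by host
| sort - count

Nearly all of the count lands on the Site1 member.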
Hoping that I may be able to get some assistance with this dashboard. Full disclosure: I am not a Splunk aficionado by any stretch, but I am trying to put together a dashboard that:

1. takes an account as input and queries the ports on which it hits the AD domain controller, and
2. drills down with a query against the F5 logs, using the time frame passed from the drilldown selection as well as the AD DC IP and ports.

In a nutshell, we want to follow the service account authentication to the load balancer and identify the actual client, since the logs against AD only show the F5 IP. I have tried a number of different methods for this but can't get the drilldown to process as intended, with the necessary parameters being passed. I am also open to rethinking the approach if necessary; I originally tried using a transaction to capture and associate the events, but there was a lot of noise to filter out. The XML is shown below:

<form theme="dark">
  <label>Account Drilldown</label>
  <fieldset submitButton="false" autoRun="false">
    <input type="time" token="dt" searchWhenChanged="true">
      <label>Timeframe</label>
      <default>
        <earliest>@d</earliest>
        <latest>now</latest>
      </default>
    </input>
    <input type="dropdown" token="directoryValue" searchWhenChanged="true">
      <label>Directory</label>
      <choice value="index=index host=*ad* &quot;Source_Port&quot; OR &quot;Port&quot; Account_Name=&quot;*">Active Directory</choice>
    </input>
    <input type="text" token="accountSearch" searchWhenChanged="true">
      <label>Account Name</label>
      <default>Type Account Here</default>
    </input>
    <input type="multiselect" token="srcport" depends="$dt.earliest$,$dt.latest$" searchWhenChanged="true">
      <label>Domain Controller Ports</label>
      <choice value="*">All</choice>
      <search id="activityList">
        <query>$directoryValue$$accountSearch$*" | fields Source_Network_Address, Port, Source_Port | eval srcip = Source_Network_Address, Port = Source_Port, srcport = Port | table _time, srcip, srcport</query>
        <earliest>@d</earliest>
        <latest>now</latest>
      </search>
      <fieldForLabel>srcport</fieldForLabel>
      <fieldForValue>srcport</fieldForValue>
      <prefix>(</prefix>
      <suffix>)</suffix>
      <delimiter> OR </delimiter>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search base="activityList">
          <query><![CDATA[index=index host=*ad* "Source_Port" OR "Port" Account_Name="*" | eval _querystring=replace(replace(ltrim(rtrim("$srcport$",")"),"("),"srcport=","form.srcport=")," OR ","&")]]></query>
        </search>
        <option name="count">5</option>
        <option name="drilldown">row</option>
        <option name="refresh.display">preview</option>
        <drilldown>
          <link><![CDATA[/app/team/account_activity_drilldown?form.dt.earliest=$earliest$&form.dt.latest=$latest$&form.srcip=$row.srcip$&form.srcport=$row.srcport$]]></link>
        </drilldown>
      </table>
    </panel>
  </row>
  <row>
    <panel id="drilldown" depends="$row.srcip$">
      <table>
        <search>
          <query>index=index type=traffic dstport=* action=* policyid=* srcip="$srcip$" OR srcport="$srcport$"</query>
          <earliest>$dt.earliest$</earliest>
          <latest>$dt.latest$</latest>
        </search>
        <option name="drilldown">row</option>
        <option name="refresh.display">preview</option>
        <option name="rowNumbers">true</option>
      </table>
    </panel>
  </row>
</form>
Hello, I want to create a detection rule for the LLMNR protocol, bearing in mind that I don't have Sysmon, just the regular logs. Can you help me please? Thank you and have a great day.
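For context, the closest I have gotten is looking for LLMNR's well-known port (UDP 5355) in network traffic data; the index, sourcetype, and field names below are placeholders for whatever the environment provides, not a tested detection:

index=network_idx sourcetype=network_traffic transport=udp dest_port=5355
| stats count by src_ip, dest_ip
| sort - count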
Hi, I have created a dashboard to filter firewall statuses. One of the inputs I need is a checkbox to eliminate duplicates based on host, source IP, destination IP, and destination port. However, the checkbox input is not working: every time the user checks or unchecks the box, it has no effect on the dashboard. The following are my dashboard and the XML code, respectively: Can you please help? Thank you!
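For reference, this is the deduplication I want the checkbox to toggle on and off (the index, sourcetype, and field names are my guesses at common firewall field names; substitute your own):

index=firewall_idx sourcetype=firewall
| dedup host, src_ip, dest_ip, dest_port
| table _time, host, src_ip, dest_ip, dest_port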
Hello, I'm experiencing some issues with the KV store:

[conn4556] SCRAM-SHA-1 authentication failed for __system on local from client xxx.xxx.x.xx:xxxxx ; AuthenticationFailed: SCRAM-SHA-1 authentication failed, storedKey mismatch

I followed https://community.splunk.com/t5/Deployment-Architecture/Why-is-the-KV-Store-status-is-showing-as-quot-starting-quot-in/m-p/284690 and for one SH (out of a total of 3) I'm receiving:

This member:
  backupRestoreStatus : Ready
  disabled : 0
  guid : xxxxxxxxxxxxxxxxxxxxxx
  port : 8191
  standalone : 0
  status : starting
  storageEngine : mmapv1

I appreciate any help.
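In case it helps to reproduce, I am checking the status from the search bar like this (assuming the kvstore/status REST endpoint behaves the same on every member; the raw output shows all the current.* fields):

| rest splunk_server=local /services/kvstore/status
| fields current.*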
Hi Team,

We have SaaS-based AppDynamics, where login is through the access key. Now I want to deploy the AppDynamics cluster agent in Kubernetes using the Helm chart. In the Helm chart, the following code is there:

api-user: {{ cat (.username | trim | required "AppDynamics controller username is required!") "@" (.account | trim | required "AppDynamics controller account is required!") ":" (.password | trim | required "Appdynamics controller password is required!") | nospace | b64enc -}}

The values.yaml file looks like:

controllerInfo:
  url: "https://myinstance:443"
  account: "xysystems"
  username:
  password:
  accessKey: "xxxxxxxxxxxxxx"
  globalAccount: null # To be provided when using machineAgent Window Image
  # SSL properties
  customSSLCert: null

Could someone please explain how, instead of using username@account:password, I can use the account name with the access key to log in? Please guide me. Thanks.
Can I implement something like this?

Process: Service Parameters – the average of the percentage reported by the IT application health parameters, i.e. transaction timeouts. We count the number of transactions in a time period and compare it with the transaction timeouts that were reported.

Example: if the number of transactions for 1 day is 5000, and there were 30 transaction timeouts, then the transaction success rate is (5000 - 30) / 5000 = 99.4%.

The colour coding scheme would be:

Green - if, after subtracting the transaction timeouts from the number of transactions for the entire time period, the transaction success rate is more than 98%, the application is considered Green.
Yellow - if the transaction success rate is more than 90% but less than 98%, the application is considered Yellow.
Red - if the transaction success rate is less than 90%, the application is considered Red.
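Something along these lines should be possible in plain SPL; a minimal sketch, assuming there is an index of transaction events and a way to tell timeouts apart (the index, sourcetype, and status value below are placeholders):

index=app_idx sourcetype=transactions earliest=-1d@d latest=@d
| stats count as total, count(eval(status="timeout")) as timeouts
| eval success_rate = round((total - timeouts) / total * 100, 1)
| eval health = case(success_rate > 98, "Green", success_rate > 90, "Yellow", true(), "Red")

The same thresholds could then drive a colour format on the health field in a dashboard.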
Hi Team,

Our team is planning to install Defender for Endpoint on our Splunk servers. Can anyone please confirm whether there are any restrictions on running the Microsoft Defender AV solution on Splunk servers, i.e. whether there would be any impact if we install it on them?

Thanks & Regards
When using the DLTK, I get a lot of different error messages rather frequently, the most common being:

"Could not parse xml reply (no reply from script). See splunkd.log for more info.."

with the corresponding entry in the log file:

02-25-2022 11:21:05.426 +0100 WARN HttpListener [16744 HttpDedicatedIoThread-7] - Socket error from 127.0.0.1:60124 while accessing /services/mltk-container/sync: Winsock error 10053

The state of the DLTK on my machine is that a connection with Docker has been successfully established, but no containers are found in the containers context menu except for the __dev__ container, which can't be started. I have pulled some of the images from Docker Hub as described in the setup guide for air-gapped environments, even though mine is not air-gapped; these images are not found by Splunk. Has anyone got any idea what I might be doing wrong? I have tried to follow the setup guide as closely as possible, without success. Thanks in advance.
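For the record, this is how I have been pulling the related errors out of splunkd.log from the search bar (a generic _internal search, nothing DLTK-specific):

index=_internal sourcetype=splunkd ("mltk-container" OR "Winsock error 10053")
| sort - _time
| table _time, log_level, component, _raw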
I've got this:

| eval status=if(CODE=200,"success","failure")

I need to have 3 fields: success, failure, and total. I'm trying this:

| stats eval(if(c(CODE=200))) as success

but it's not working. Could you please help?
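To make the intent clear, the output I am after would come from something shaped like the following, based on the count(eval(...)) idiom from the stats documentation (I may be misusing it, hence the question):

| eval status=if(CODE=200,"success","failure")
| stats count(eval(status="success")) as success, count(eval(status="failure")) as failure, count as total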
Hi Dears,

I want to know which training course covers how to write a use case in Splunk. I am a beginner and want to get an idea of which use cases are good and beneficial for my organization.

Br
How do I make a search that includes two events? The first event is a 'CALL' with parameters and the second event is the response.
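A minimal sketch of the pairing I have in mind, assuming both events carry a shared correlation id (the index and field names are placeholders):

index=app_idx ("CALL" OR "RESPONSE")
| transaction correlation_id startswith="CALL" endswith="RESPONSE"
| table _time, correlation_id, duration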
Hello Team,

I created an add-on where I configured a REST API data collection input. It executed successfully upon testing, and I saved it. But while creating an input in that add-on app, I am getting the error below, even after selecting the field option in the Global account:

The following required arguments are missing: count.

Kindly assist.

Regards,
Gargi Gharat
An older Splunk instance (6.5.0) was found in my environment, running on a Windows Server 2008 R2 host. The instance was experiencing license breaches, which were resolved by pointing the host to our primary license master. Currently, when searching index=*, no results are found, even though the main index has over 500 million events and data is currently flowing into it. There are no errors when searching the _* indexes.
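In case it is useful, a check that does not depend on the time range picker or on search-time processing (standard commands only):

| eventcount summarize=false index=*
| table index, count, server

This lists the per-index event counts as the indexer reports them, which should confirm whether the 500 million events in main are actually visible to search.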
I am using the REST API knowledge in https://community.splunk.com/t5/All-Apps-and-Add-ons/Splunk-DB-Connect-V3-Automated-Programmatic-cre... but I am not able to update a DB connection. Is there a way to update it without having to delete it? My end goal is to update an identity. For that, I am thinking of updating the DB connection to point to another identity, then deleting the original identity, then creating a new one, and finally pointing my DB connection back to it via the REST API.
I am trying to edit DB connections using the REST API, but I am not able to find a way to edit a DB connection. Is deleting and re-creating it the only way?
Hello Splunk Community,

I am trying to replicate a heat map using the built-in table formats available in Splunk dashboards. I see the coloring of the cells when I use the stats command as below, but I need the data shaped like a chart. The issue is that when I use chart, all the color goes away from the table. Is there a workaround for this problem?

<dashboard>
  <label>Table Formats</label>
  <description>Format columns using built-in table formats (coloring, number formatting).</description>
  <row>
    <panel>
      <table>
        <search>
          <query>
            index="Dept_data_idx" eventType="Created" status="success" host=*
            | bucket _time span=1h
            | stats count by _time host
          </query>
          <earliest>-7d</earliest>
          <latest>now</latest>
        </search>
        <format type="color" field="count">
          <colorPalette type="minMidMax" maxColor="#31A35F" minColor="#FFFFFF"></colorPalette>
          <scale type="minMidMax"></scale>
        </format>
      </table>
      <html>
      </html>
    </panel>
  </row>
</dashboard>
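For reference, this is the chart-shaped version of the search I have been testing (xyseries here is equivalent to chart count over _time by host). As far as I can tell, the color format would then have to target the per-host columns instead of a single count column, which is the part I cannot get working:

index="Dept_data_idx" eventType="Created" status="success" host=*
| bucket _time span=1h
| stats count by _time host
| xyseries _time host count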
Hi, I'm writing a Splunk query to find emails with specific file types attached. I have the regex working: it pulls the files and also extracts the file extensions, which I'll be using for data collection purposes later. I will then use the extracted file extension to search for and return the specific emails containing files with that extension (I hope that makes sense). The problem is that when I use | where FileExtension=".doc", I get events returned that contain a .doc file, which is fine, but they also show all the other files attached, which I do not want. For example, I want my output to be:

sender recipient file.doc

but what I am getting is:

sender recipient file.doc file.a file.b file.c file.d

Is there any way to do some kind of exclusive search that will ignore the extra values in the file field that are not .doc files, as they are of no interest to me at the moment?
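A minimal sketch of what I think I need, using mvfilter to keep only the .doc values in the multivalue file field (the field names are guesses at my extraction; adjust as needed):

| eval FileName=mvfilter(match(FileName, "\.doc$"))
| where isnotnull(FileName)
| table sender, recipient, FileName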