All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi everyone, I am trying to figure out how to set up a dual-forwarding (dual-destination) configuration for universal forwarders. Can someone please guide me, or point me to Splunk docs/articles that would be helpful?
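A minimal outputs.conf sketch of one common reading of "dual forwarding" (cloning the data to two target groups); the group names, hostnames, and ports are placeholders, not taken from the question:

# outputs.conf on the universal forwarder
[tcpout]
defaultGroup = site_a, site_b    # listing two groups clones data to both

[tcpout:site_a]
server = indexer-a1.example.com:9997

[tcpout:site_b]
server = indexer-b1.example.com:9997

If the intent is load balancing rather than sending a full copy to each destination, put all the indexers in a single tcpout group instead.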
Hi all, I wanted to ask: is it possible to execute part of a search based on a condition? For example: if A = B then "rename C as D", else add a column.

My problem: the search usually returns 2 columns (DIRECT, INDIRECT), but in some cases it returns 3 (DIRECT, INDIRECT, SPC), in one case only 1 (INDIRECT), and in another only 1 (DIRECT). When I build the chart, the third field is called "row 3" (I hope to solve that with rename). When I have only one field (INDIRECT) it is called "row 1", but if I rename it to "DIRECT" that is wrong, because the values belong to INDIRECT. The same thing happens in the case with only one field (LIVE). What I would like is for the chart to always have two bars, DIRECT and INDIRECT, even when one of the two is missing.

With this code:

| stats sum(*) by OFFERTA
| transpose
| addtotals fieldname="TOTAL"
| rename "row 1" as "DIRECT"
| rename "row 2" as "INDIRECT"
| rename "row 3" as "SPC"

I get:

Solar year                DIRECT   INDIRECT   TOTAL
sum(00_PREVIOUS_MONTH)         8          4      12
sum(01_PREVIOUS_MONTH)        32         16      48
sum(02_PREVIOUS_MONTH)        42         10      52
sum(03_PREVIOUS_MONTH)        30          8      38

But if I only have 1 field (INDIRECT), I get:

Solar Year                DIRECT   TOTAL
sum(00_PREVIOUS_MONTH)         0       0
sum(01_PREVIOUS_MONTH)         3       3
sum(02_PREVIOUS_MONTH)         1       1
sum(03_PREVIOUS_MONTH)         3       3
sum(04_PREVIOUS_MONTH)         2       2

In this case too, I would like the chart to have both DIRECT and INDIRECT, with DIRECT all zeros.
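One possible approach, assuming OFFERTA holds the DIRECT/INDIRECT/SPC values: let transpose name the columns from OFFERTA itself, then force the two columns to exist with coalesce:

| stats sum(*) by OFFERTA
| transpose header_field=OFFERTA column_name="Solar year"
| eval DIRECT=coalesce(DIRECT, 0), INDIRECT=coalesce(INDIRECT, 0)
| addtotals fieldname="TOTAL"

header_field names each column after the corresponding OFFERTA value instead of "row 1"/"row 2", so a lone INDIRECT column can no longer be mistaken for DIRECT, and the eval creates whichever column is missing, filled with zeros.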
We have just upgraded to v8.1 and, because we have a small license, we are subject to license enforcement. The documentation states that enforcement occurs if you receive 45 warnings over a rolling 60-day window. What is unclear is what counts as a "warning". For example, I have 9 indexers all sharing a single license pool, and when we went over the daily limit we appeared to receive 9 warnings, one per indexer, e.g.: "This pool has exceeded its configured poolsize=xxx bytes. A CLE warning has been recorded for all members." So does the 45-warning limit apply to these pool warnings, to hard warnings, or to license master warnings? I.e. does going over the daily limit count as 1 warning?
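A hedged way to see exactly which warnings have been recorded (run on the license master; the exact fields returned can vary by version, so check the REST API reference for the licenser/messages endpoint):

| rest splunk_server=local /services/licenser/messages

The entries returned there should make it clear whether Splunk recorded one warning for the pool or one per indexer on the day the limit was exceeded.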
Hello Splunk community. I have a search query which I am using to report daily API stats. I have a requirement to send the result of the query below (which is a chart table) to Slack.

Query:

index=api* metaData.pid="apiDdata" | chart count BY apiName status

The result is a table of counts by apiName and status. What I learnt about the Splunk webhooks is that they can send only one row of data at a time, so to send the whole table I would have to send it result by result. My question: is there any way to combine the table into a single value, something like the layout below, so that I can send it to Slack in one shot?

ApiName   Success   NULL
Api 1     123       222
Api 1     123       222
Api 1     123       222
Api 2     123       222

The table above would be a single string value which I am expecting to be sent to Slack. Is it possible? Please help.
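A minimal sketch of one way to collapse the table into a single string field; the Success and NULL column names are assumptions taken from the example layout, so substitute the status values your chart actually produces:

index=api* metaData.pid="apiDdata"
| chart count BY apiName status
| eval row = apiName . " | " . coalesce(Success, 0) . " | " . coalesce('NULL', 0)
| stats list(row) as rows
| eval message = mvjoin(rows, urldecode("%0A"))
| fields message

urldecode("%0A") produces a literal newline, so each row of the table lands on its own line in the single message value.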
We have two sites with two indexers per site, a total of four indexers. I have to set up certificate-based encryption from all forwarders to the indexers. What is the easiest way to go about setting up the certificates? Can I generate one certificate for ALL forwarders and another certificate for ALL indexers? Any assistance is appreciated!
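A minimal sketch of the two sides of the configuration, assuming one shared forwarder certificate and one shared indexer certificate signed by the same CA; the paths, port, and passwords are placeholders, and the attribute set should be checked against the inputs.conf/outputs.conf spec for your Splunk version:

# outputs.conf on every forwarder
[tcpout:ssl_indexers]
server = idx1.example.com:9997, idx2.example.com:9997, idx3.example.com:9997, idx4.example.com:9997
clientCert = $SPLUNK_HOME/etc/auth/mycerts/forwarder.pem
sslPassword = <key password>
sslVerifyServerCert = true

# inputs.conf on every indexer
[splunktcp-ssl:9997]
disabled = 0

[SSL]
serverCert = $SPLUNK_HOME/etc/auth/mycerts/indexer.pem
sslPassword = <key password>
requireClientCert = false

The CA certificate is referenced via sslRootCAPath (in server.conf on recent versions). Reusing one certificate per role generally works as long as you are not also checking common names per host (sslCommonNameToCheck).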
Hello, I am a Splunk user, not an admin, and I seem to be able to run a search like: | rest splunk_server=local servicesNS/-/-/data/ui/views/ Does that mean I have API access, and how can I access this via Excel, Power BI, etc.? What would be the URI, and do I need a separate key? Thanks!
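For orientation, a hedged sketch of what the equivalent direct REST call would look like (the hostname is a placeholder, and whether this works at all depends on whether your admins expose the management port 8089 to you):

https://your-splunk-host:8089/servicesNS/-/-/data/ui/views?output_mode=json&count=0

Authentication is your normal Splunk username/password (HTTP Basic auth) or an authentication token if tokens are enabled; being able to run | rest inside a Splunk search does not by itself guarantee the management port is reachable from Excel or Power BI.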
Hi, sorry if this has been asked before; I have spent a lot of time researching but can't quite find the answer. I have the JSON logged below, and I want to do analysis on the order lines, so I need a search that returns two rows from my example:

Order Reference   Description   Value
XXX               PAUL          35,700
XXX               IS GREAT      42,000

I've tried a million things and can't get it to work. Thanks in advance, Paul (who isn't actually that great!)

The JSON:

{
  "orderReference": "xxx",
  "orderLine": [
    {
      "orderLineUserItemDescription": "PAUL",
      "orderLineUnitPrice": "35700.0"
    },
    {
      "orderLineUserItemDescription": "IS GREAT",
      "orderLineUnitPrice": "42000.0"
    }
  ]
}
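A minimal sketch, assuming the whole event is this JSON object so spath can parse it directly:

... your base search ...
| spath output=order_line path=orderLine{}
| mvexpand order_line
| spath input=order_line
| table orderReference, orderLineUserItemDescription, orderLineUnitPrice
| rename orderReference as "Order Reference", orderLineUserItemDescription as Description, orderLineUnitPrice as Value

spath pulls the orderLine array into a multivalue field, mvexpand turns each array element into its own result row, and the second spath extracts the description and price from each element.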
Need help using the strptime/strftime commands. Example input: December 7, 2021 1:00:01 PM. Desired output: 12/1/2021 13:00:01
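A minimal eval sketch, assuming the target is a month/day/year date followed by a 24-hour time (adjust the output format string if the intended day differs from the input):

| eval ts = strptime("December 7, 2021 1:00:01 PM", "%B %d, %Y %I:%M:%S %p")
| eval formatted = strftime(ts, "%m/%d/%Y %H:%M:%S")

%B matches the full month name and %I/%p handle the 12-hour clock with AM/PM; strftime then re-renders the parsed epoch time in the new format.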
How do I export a list of triggered alerts as a CSV for a certain period of time from Splunk Cloud? It should be something like the view on the Activity > Triggered Alerts screen. The important fields are the trigger time and the title of the alert. Thank you.
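A hedged sketch of one way to get this as a search (which can then be exported to CSV from the search UI); it assumes your role can read the _audit index and that the field names match what your version logs:

index=_audit action=alert_fired
| eval triggered_time = strftime(trigger_time, "%Y-%m-%d %H:%M:%S")
| table triggered_time, ss_name, severity

ss_name is the saved-search (alert) name; set the time range picker to the period you want before exporting.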
Need help removing only the word "Endpoint" from the data set.

Input                     Output
Endpoint CD/DVD           CD/DVD
Endpoint Cloud Storage    Cloud Storage
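A minimal sketch, assuming the values live in a field called policy (a placeholder name; substitute your own field):

| eval policy = replace(policy, "^Endpoint\s+", "")

replace() with an anchored regex strips only the leading "Endpoint " and leaves any other occurrence of the word untouched.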
I searched the documentation and the splunk-docker related code, but did not find the relevant configuration. How can I enable indexer acknowledgment via environment variables? Many thanks!
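For reference, this is the setting the container configuration ultimately needs to produce in outputs.conf on the forwarding instance; whether a dedicated environment variable maps to it depends on your docker-splunk/splunk-ansible version, so you may need to supply it via a default.yml or a mounted outputs.conf instead:

# outputs.conf (group name and server are placeholders)
[tcpout:my_indexers]
server = idx1.example.com:9997
useACK = true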
Hello,

{
  guessedService: ejj
  logGroup: /aws/ejj/cluster
  logStream: kube-apt-15444d2f8c4b216a9cb69ac
  message: {"kind":"Event","stage":"ResponseComplete","requestURI":"/api/v1/namespaces/jej/endpoints/eji.com-aws-eji","verb":"update","user":{"username":"system:serviceaccount:efs:efs-provisioner","uid":"ab5d27b4c-71a4f77323b0","groups":["system:serviceaccounts","system:serviceaccounts:eji","system:authenticated"]},"sourceIPs":["10.0.0.0"],"userAgent":"eji-provisioner/v0.0.0 (linux/amd64) kubernetes/$Format","objectRef":{"resource":"endpoints","namespace":"edd","name":"dds.com-aws-edds","uid":"44ad8-899f-fbc1f4befb2f","apiVersion":"v1","resourceVersion":"8852157"},"responseStatus":{"metadata":{},"code":200}}

I already use the props and transforms below to extract all the fields from message.

props.conf:

[json_no_new]
REPORT-json = report-json,report-json-new
KV_MODE = none
INDEXED_EXTRACTIONS = json
LINE_BREAKER = ^{
NO_BINARY_CHECK = true
disabled = false
pulldown_type = true

transforms.conf:

[report-json]
SOURCE_KEY = message
REGEX = (?P<json2>{.+)
DEST_KEY = _raw

[report-json-new]
REGEX = \\*"([^"]+)\":[\s]*"*(\[.*?\]|\{.*?\}"*\}*|[^"]+|\d+),*
FORMAT = $1::$2
SOURCE_KEY = json2

Now, in the result, I have a field whose value is itself JSON:

user = {"username":"system:serviceaccount:efs:efs-provisioner","uid":"ab5d27b4c-71a4f77323b0","groups":["system:serviceaccounts","system:serviceaccounts:eji","system:authenticated"]}

Again with props and transforms, I want to extract the values from the user field. Please let me know if that is possible.

Thanks
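A hedged sketch of a third transform chained onto the existing ones; I have not verified that SOURCE_KEY can reference the user field produced by the earlier transform in every version, so treat this as a starting point (the simple search-time alternative is | spath input=user):

# transforms.conf (addition)
[report-json-user]
SOURCE_KEY = user
REGEX = "([^"]+)":\s*"([^"]+)"
FORMAT = user.$1::$2

# props.conf (changed line)
REPORT-json = report-json,report-json-new,report-json-user

The regex only captures simple "key":"value" pairs, so the groups array would need its own pattern.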
Hi team, I am checking for an update on whether the Splunk application is also exposed to threat due to the Apache Log4j vulnerability. Please let us know the workaround if there is any impact. Thanks, User
Hello all, I need a hand with a basic Splunk search. I appreciate this is Splunk 101 basics, but with other commitments I am struggling to remember all the commands. What I need is to edit the following search (or for someone to be kind enough to completely improve it) so that, rather than showing the same day next to each URL in the "day" field, it shows all URLs next to a single entry for the day. Hopefully that makes sense?

index=netproxy user=(user) (url=https://www* OR url=http://www.) | stats count by day, url

day                  url                                   count
Mon, 13 Dec 2021     https://www.bizographics.com/         1
Mon, 13 Dec 2021     https://www.bleacherbreaker.com/      2788
Mon, 13 Dec 2021     https://www.google-analytics.com/     3
Mon, 13 Dec 2021     https://www.google.co.uk/             5
Mon, 13 Dec 2021     https://www.googletagmanager.com/     10
Mon, 13 Dec 2021     https://www.googletagservices.com/    6
Sat, 11 Dec 2021     https://www.capitalfm.com/            23
Sat, 11 Dec 2021     https://www.capitalxtra.com/          26
Sat, 11 Dec 2021     https://www.globalplayer.com/         8

Basically, what I am after is a search that shows:

day                  url                                   times visited (count)
Mon, 13 Dec 2021     https://www.bizographics.com/         1
                     https://www.bleacherbreaker.com/      2788
                     https://www.google-analytics.com/     3
                     https://www.google.co.uk/             5
                     https://www.googletagmanager.com/     10
                     https://www.googletagservices.com/    5

If anyone can improve on my basic search it would be greatly appreciated.
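A minimal sketch that keeps your base search and simply regroups the results so each day appears once, with the URLs and counts as multivalue columns:

index=netproxy user=(user) (url=https://www* OR url=http://www.)
| stats count by day, url
| stats list(url) as url, list(count) as "times visited" by day

The second stats collapses the per-day rows into one row per day; because url and count come from the same rows, list() keeps them aligned within each day.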
Hello, I have the data below:

{
  guessedService: ejj
  logGroup: /aws/ejj/cluster
  logStream: kube-apt-15444d2f8c4b216a9cb69ac
  message: {"kind":"Event","stage":"ResponseComplete","requestURI":"/api/v1/namespaces/jej/endpoints/eji.com-aws-eji","verb":"update","user":{"username":"system:serviceaccount:efs:efs-provisioner","uid":"ab5d27b4c-71a4f77323b0","groups":["system:serviceaccounts","system:serviceaccounts:eji","system:authenticated"]},"sourceIPs":["10.0.0.0"],"userAgent":"eji-provisioner/v0.0.0 (linux/amd64) kubernetes/$Format","objectRef":{"resource":"endpoints","namespace":"edd","name":"dds.com-aws-edds","uid":"44ad8-899f-fbc1f4befb2f","apiVersion":"v1","resourceVersion":"8852157"},"responseStatus":{"metadata":{},"code":200}}

I need to extract just the user part, user:{username:...}, and I need to write one regex that extracts all the values inside the user object, i.e. user.username, user.uid, user.groups.
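A minimal rex sketch, assuming the key order inside the user object is fixed as in the sample (username, then uid, then groups), and that the JSON lives in a field called message; change field= to _raw if that is where it sits:

... your base search ...
| rex field=message "\"user\":\{\"username\":\"(?<user_username>[^\"]+)\",\"uid\":\"(?<user_uid>[^\"]+)\",\"groups\":\[(?<user_groups>[^\]]*)\]"
| eval user_groups = split(replace(user_groups, "\"", ""), ",")

The single rex captures all three values; the eval turns the captured groups list into a proper multivalue field.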
I am attempting to upgrade the Database Agents because of the Log4j issue. In AppDynamics I can see four servers that have the Database Agent installed and listed as 'Active'. I've connected to the first Windows server to stop the process so that I can upgrade, but I can't see 'db-agent' listed as a running process. How do I find the process ID on Windows so I can shut it down?
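A hedged PowerShell one-liner, assuming the Database Agent was started the usual way as a Java process with db-agent in its command line (if it was installed as a Windows service instead, look for it in services.msc):

Get-CimInstance Win32_Process |
  Where-Object { $_.CommandLine -match "db-agent" } |
  Select-Object ProcessId, Name, CommandLine

The process will typically show up as java.exe rather than db-agent, which is why it does not appear under that name in Task Manager; once you have the ProcessId you can stop it with Stop-Process -Id <pid>.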
Hi folks, I have been trying to pull the data associated with the latest run ID (associated with an execution), and I am having a hard time writing the query for it. Any help would be appreciated.

Base data:

index="unified-tests" dataType="TestRail"

To this I need to apply a filter for runId = the result of the query below:

index="unified-tests" dataType="TestRail"
| stats last(runId) as latestRunID by brand, platform
| stats last(latestRunID) by latestRunID
| table latestRunID

These are some failed attempts:

index="unified-tests" dataType="TestRail" runId=*[search index="unified-tests" dataType="TestRail" | stats last(runId) as latestRunID by brand, platform | stats last(latestRunID) by latestRunID | table latestRunID]

index="unified-tests" dataType="TestRail"
| join left=L right=R where L.runID = R.latestRunID [search index="unified-tests" dataType="TestRail" | stats last(runId) as latestRunID by brand, platform | stats last(latestRunID) by latestRunID | table latestRunID]
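A minimal sketch of the usual subsearch pattern, assuming "latest" means the most recent event by _time: if the subsearch returns a field literally named runId, Splunk turns each returned row into a runId=<value> filter on the outer search automatically:

index="unified-tests" dataType="TestRail"
    [ search index="unified-tests" dataType="TestRail"
      | stats latest(runId) as runId
      | table runId ]

If you need the latest run per brand/platform combination, change the subsearch to | stats latest(runId) as runId by brand, platform and keep only runId in the final table; the outer search is then filtered to any of the returned values.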
Hello everybody, I have been struggling with a serious problem recently. My Splunk version is 7.2. When I use the span option (with tstats or bin), the time buckets start on the half hour instead of on the hour. For example:

| tstats count as count from datamodel=Log where Log.FinalStatus!=61 by _time span=1h

The resulting buckets start at the half hour (e.g. 10:30, 11:30), while I want time sections like 10, 11, 12, and so on. What should I do? Thank you.
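A hedged sketch of one workaround, assuming the cause is a half-hour timezone offset (e.g. UTC+03:30 or +04:30), since hourly span buckets are aligned to UTC epoch boundaries: shift the events by 30 minutes (1800 seconds), bin, then shift back so the buckets land on local hour boundaries:

| tstats count as count from datamodel=Log where Log.FinalStatus!=61 by _time span=1m
| eval _time = _time + 1800
| bin _time span=1h
| eval _time = _time - 1800
| stats sum(count) as count by _time

The tstats span is reduced to 1m so that re-bucketing to shifted hours does not mix data across hour boundaries.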
Hi Splunkers,

The vulnerability was disclosed by the Apache Log4j project on Thursday, December 9, 2021. If exploited, it could potentially allow a remote attacker to execute code on the server if the system logs an attacker-controlled string value on an affected endpoint. Can you please help me determine whether the add-ons and apps below are impacted or not?

Splunk Add-on for Microsoft SCOM
Okta Identity Cloud Add-on for Splunk
Lookup Editor
Number Display Viz
Splunk Dashboard Examples
Tanium App
Splunk Enterprise Dashboards Beta
Python for Scientific Computing
Solarwinds Add on for splunk
Tanium Technology Add on
100_genpact_splunkcloud
Splunk DB Connect
Tanium App
Microsoft windows DHCP Add on for splunk
Website Monitoring
rest_ta

These are not listed in the links below:

Splunk Security Advisory for Apache Log4j (CVE-2021-44228) | Splunk
https://www.splunk.com/en_us/blog/security/log-jammin-log4j-2-rce.html