All Topics

Hi all, I'm struggling to remove everything before the date using SED. Example:

    | makeresults
    | eval _raw="Feb 2 14:27:50 test.dh.test.com named[123456]: [ID xxxxx local6.info] 02-Feb-2021 14:27:50.448 queries: info: client @172ac618 x.x.x.x#61721 (cd1ae518178748adbca4cff52b24b791.fp.measure.office.com): query: cd1ae51817ffdfdfhhh8748adbca4cff52b24b791.fp.measure.office.com IN A + (x.x.x.x)"

1. I want to remove everything up to 02-Feb-2021 (the date should be included). I tried this, with no luck:

    | rex mode=sed field=_raw "s/[^[0-9]{2}-[\w]{3}-[0-9]{4}]*//g"

The desired result would be something like this:

    02-Feb-2021 14:27:50.448 queries: info: client @172ac618 x.x.x.x#61721 (cd1ae518178748adbca4cff52b24b791.fp.measure.office.com): query: cd1ae51817ffdfdfhhh8748adbca4cff52b24b791.fp.measure.office.com IN A + (x.x.x.x)

2. I want to keep only the last 3 parts of the domain. Example: cd1ae51817ffdfdfhhh8748adbca4cff52b24b791.fp.measure.office.com. Desired result: fp.measure.office.com. I have a working rex that does that, but how can I use SED to avoid indexing anything before the last parts of the domain?

    | rex field=_raw "query:\s(?:[^\s.]+\.)*(?P<query_dns1>[^\s.]+\.[^\s.]+\.[^\s.]+)"

Thank you!
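One way to prototype the two substitutions outside Splunk is with Python's re module, whose syntax is close to what rex mode=sed accepts. This is a sketch, not the exact SPL answer: the sample string is shortened, the keep_last_labels helper is hypothetical, and since the stated desired result (fp.measure.office.com) actually keeps four labels, the label count is a parameter here.

```python
import re

raw = ("Feb 2 14:27:50 test.dh.test.com named[123456]: [ID xxxxx local6.info] "
       "02-Feb-2021 14:27:50.448 queries: info: query: a.fp.measure.office.com IN A")

# 1) Drop everything before the first dd-Mon-yyyy date, keeping the date itself.
#    A sed-style equivalent to try in rex mode=sed: s/^.*?(\d{2}-\w{3}-\d{4})/\1/
trimmed = re.sub(r"^.*?(\d{2}-\w{3}-\d{4})", r"\1", raw, count=1)

# 2) Keep only the last n dot-separated labels of a domain.
def keep_last_labels(domain, n):
    # Greedy prefix eats leading labels; the group keeps exactly n at the end.
    pattern = r"^(?:[^.]+\.)*((?:[^.]+\.){%d}[^.]+)$" % (n - 1)
    return re.sub(pattern, r"\1", domain)

short = keep_last_labels("cd1ae51817ffdfdfhhh8748adbca4cff52b24b791.fp.measure.office.com", 4)
```

The character class in the original attempt (`[^[0-9]{2}-...]`) negates individual characters, not the whole date sequence, which is why it never anchored on the date; a lazy `.*?` followed by a captured date group avoids that problem.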
Hi, I want to make a search out of events
I am trying to get network outage totals by domain. I have four domains: A, B, C, D. The problem is that sometimes there are outages that affect 2-3 domains and are reported as a separate event, not by domain.

    | chart count(event_id) as Count by Domain

    A          4
    B          7
    C          2
    D          5
    A, B       2
    A, D       4
    C, B       3
    A, D, B    6

I want to display the outages that affect each domain, so anything that includes A (A; A, B; A, D; A, D, B) will be added and the count for A will be 16. Same for the other domains. The end result should be:

    A    16
    B    18
    C    5
    D    15

I've tried

    | eval domain=if(domain="A" OR domain="A, B" OR domain="A, D" OR domain="A, D, B", "A", domain)

but that only works for the first domain. The combined domains aren't included in the totals for the subsequent if statements.
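The usual SPL approach is to split the multi-valued Domain field before counting (for example makemv delim=", " followed by mvexpand), so that each event contributes to every domain it names; treat that as a sketch to verify. The aggregation logic itself, modeled in Python with the chart output above as input:

```python
from collections import Counter

# Output of `chart count(event_id) as Count by Domain` from the question.
rows = {"A": 4, "B": 7, "C": 2, "D": 5,
        "A, B": 2, "A, D": 4, "C, B": 3, "A, D, B": 6}

# Credit each row's count to every domain listed in its key.
totals = Counter()
for domains, count in rows.items():
    for d in domains.split(","):
        totals[d.strip()] += count
```

Each combined row is counted once per member domain, which reproduces the desired totals (A=16, B=18, C=5, D=15) without a chain of if() statements.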
Hi, I have this table of data:

    Name       Age  Address
    Mark       21   1 st xxxxx
    Elisabeth  21   2 st xxxxx
    Jane       22   3 st xxxxx
    Bryan      24   4 st xxxxx

I want to list only the rows having a specific age. For example, the list of people with Age=21:

    Name       Age  Address
    Mark       21   1 st xxxxx
    Elisabeth  21   2 st xxxxx

Thanks for your help.
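In SPL this kind of selection is typically just a filter appended to the search that builds the table, e.g. `| where Age=21` or `| search Age=21` (sketches to verify against the actual field names). The selection logic, shown in Python using the table above:

```python
people = [
    {"Name": "Mark", "Age": 21, "Address": "1 st xxxxx"},
    {"Name": "Elisabeth", "Age": 21, "Address": "2 st xxxxx"},
    {"Name": "Jane", "Age": 22, "Address": "3 st xxxxx"},
    {"Name": "Bryan", "Age": 24, "Address": "4 st xxxxx"},
]

# Keep only the rows whose Age matches the requested value.
age_21 = [p for p in people if p["Age"] == 21]
```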
Hi, I have this data:

    URI  TXT                 Param
    A    My text and others  param 1
    A    My text and others  param 2
    A    My text             param 3
    A    My text             param 4
    B    My text and others  param 2
    B    My text             param 3
    C    My text and others  param 1
    C    My text             param4

I'd like to extract the data according to these rules:

- If URI = A, extract only the rows whose TXT does not contain "others"
- Else extract everything

to obtain this result:

    URI  TXT                 Param
    A    My text             param 3
    A    My text             param 4
    B    My text and others  param 2
    B    My text             param 3
    C    My text and others  param 1
    C    My text             param4

I don't know how to build my request. I tried to use the "case" command like this, but it returns a logical error (Error in 'EvalCommand': The arguments to the 'like' function are invalid.):

    ... | eval TXT=case(URI="A", NOT LIKE("%others%"), 1=1, "others") ...

How do I use a negative search in a case command? Thanks
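Rather than rewriting TXT with case(), this reads as a row filter; one SPL sketch to try is `| where URI!="A" OR NOT like(TXT, "%others%")` (note that like() takes the field as its first argument, which is what the error message is pointing at). The same predicate applied in Python to the sample rows:

```python
rows = [
    ("A", "My text and others", "param 1"),
    ("A", "My text and others", "param 2"),
    ("A", "My text", "param 3"),
    ("A", "My text", "param 4"),
    ("B", "My text and others", "param 2"),
    ("B", "My text", "param 3"),
    ("C", "My text and others", "param 1"),
    ("C", "My text", "param4"),
]

# Drop a row only when URI is "A" AND its TXT mentions "others".
kept = [r for r in rows if r[0] != "A" or "others" not in r[1]]
```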
Hi Splunk Community, I am trying to overwrite fields using an if clause. The data I have looks like the table below:

    Sourcetype   Index   LastSeenDate
    clarity-A    abc123  2-6-2023
    clarity-B    abc123  1-15-2023
    clarity-C    abc123  12-1-2022
    DR:101:405   abc123  2-4-2023
    BillingTool  abc123  2-2-2023

I want to overwrite the current LastSeenDate only for clarity-B and clarity-C, so that their last seen date equals the LastSeenDate of clarity-A. The table below is an example of what I am trying to achieve:

    Sourcetype   Index   LastSeenDate
    clarity-A    abc123  2-6-2023
    clarity-B    abc123  2-6-2023
    clarity-C    abc123  2-6-2023
    DR:101:405   abc123  2-4-2023
    BillingTool  abc123  2-2-2023
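A common SPL pattern for this is to copy clarity-A's date onto every row with eventstats and then apply a conditional eval; that is one approach among several, so verify it against the real data. The row-level logic, sketched in Python:

```python
rows = [
    {"Sourcetype": "clarity-A", "Index": "abc123", "LastSeenDate": "2-6-2023"},
    {"Sourcetype": "clarity-B", "Index": "abc123", "LastSeenDate": "1-15-2023"},
    {"Sourcetype": "clarity-C", "Index": "abc123", "LastSeenDate": "12-1-2022"},
    {"Sourcetype": "DR:101:405", "Index": "abc123", "LastSeenDate": "2-4-2023"},
    {"Sourcetype": "BillingTool", "Index": "abc123", "LastSeenDate": "2-2-2023"},
]

# Look up clarity-A's date, then overwrite it onto B and C only.
a_date = next(r["LastSeenDate"] for r in rows if r["Sourcetype"] == "clarity-A")
for r in rows:
    if r["Sourcetype"] in ("clarity-B", "clarity-C"):
        r["LastSeenDate"] = a_date
```

All other sourcetypes keep their original dates, matching the desired table.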
I have logs with the following three fields:

- category
- price
- requestID (unique per entry)

I want to find all requestIDs for entries that have BOTH the same category and price within a 1-hour time span. I started off with this query:

    index=foo component="shop-service"
    | streamstats count as dupes by category, price
    | search dupes > 1

But I cannot seem to calculate the duplicate entries, nor tie them to the requestID.
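streamstats accepts a time_window option, so something along the lines of `| streamstats time_window=1h count as dupes values(requestID) as ids by category, price` is probably the missing piece; treat that as a sketch to verify. The intended grouping, modeled in Python with hypothetical events:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical events; only the (category, price, time) shape matters.
events = [
    {"time": datetime(2023, 2, 1, 10, 0),  "category": "books", "price": 9.99, "requestID": "r1"},
    {"time": datetime(2023, 2, 1, 10, 30), "category": "books", "price": 9.99, "requestID": "r2"},
    {"time": datetime(2023, 2, 1, 12, 0),  "category": "books", "price": 9.99, "requestID": "r3"},
    {"time": datetime(2023, 2, 1, 10, 15), "category": "toys",  "price": 5.00, "requestID": "r4"},
]

# Flag requestIDs whose (category, price) repeats within one hour.
dupe_ids = set()
last_seen = defaultdict(list)
for e in sorted(events, key=lambda e: e["time"]):
    key = (e["category"], e["price"])
    if last_seen[key] and e["time"] - last_seen[key][-1]["time"] <= timedelta(hours=1):
        dupe_ids.update({last_seen[key][-1]["requestID"], e["requestID"]})
    last_seen[key].append(e)
```

Here r1 and r2 share a category/price pair 30 minutes apart and are flagged, while r3 falls outside the hour and r4 has no partner.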
Hi team, I am using the Splunk search below in a dashboard query:

    index=BigIt log_severity=INFO OR WARN app_name=test-cap-generator country_code=USA error_code=COA-1004 earliest=-5d
    | rex " total number where indicator I is Z(?<Counts>\d)"
    | stats count by _time, Counts
    | table _time, Counts

Requirement: I have to keep only the runs that occur once daily and ignore the duplicate runs showing up in the query. Anything that ran after 07:00 AM should be ignored for that particular day. For example, for date 2023-02-02 I have to keep only the 06:28 run. Any suggestion on how I can ignore those duplicate runs?
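One way to express "first run of the day, and only before 07:00" in SPL is an `eval hour=tonumber(strftime(_time, "%H"))`, a `where hour<7`, and a dedup on the day; verify that against your data. The filtering rule itself, sketched in Python with hypothetical run timestamps:

```python
from datetime import datetime

# Hypothetical run timestamps pulled from the search results.
runs = [
    datetime(2023, 2, 2, 6, 28),
    datetime(2023, 2, 2, 7, 15),   # after 07:00 -> ignored
    datetime(2023, 2, 2, 6, 45),   # duplicate pre-07:00 run -> ignored
    datetime(2023, 2, 3, 6, 50),
]

# Keep the earliest pre-07:00 run per day.
first_per_day = {}
for t in sorted(runs):
    if t.hour < 7 and t.date() not in first_per_day:
        first_per_day[t.date()] = t
```

For 2023-02-02 only the 06:28 run survives, matching the example in the question.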
The issue happens after the Windows server is restarted. Restarting the Splunk Universal Forwarder fixes the issue. Either the component that raises this event is not installed
Hello team! After unsuccessful research on the Internet and in the Splunk docs, I am turning to you with my question:

- Let's say I have 50 alerts in a single app, all stored in the file $SPLUNK_HOME$/etc/apps/<appname>/default/savedsearches.conf.
- For version control / code management, I want to split this single savedsearches.conf into multiple savedsearches.conf files, so that developers can work with a folder layout like this:

    default/
        alerts/
            category_1_alerts/
                savedsearches.conf
            category_2_alerts/
                savedsearches.conf
    ...

- I tried without success on my Splunk instance. I don't know if it is possible, and if it is, I don't know whether some statement has to be made in the code (e.g. #include <filename>).

Have a nice day.

PS: In my version control / code management tool, I can always resort to concatenating all my files together when packaging the Splunk code if I don't find a better answer.
Currently I am using the query below to extract the rows where the Employee_Id column has fewer than 9 digits. However, I have another requirement on the same table: extract the employee IDs that are alphanumeric (like N0001234) or contain any special characters. So overall we need the IDs that are less than 9 digits, more than 9 digits, alphanumeric, or contain special characters.

    index=QQQQQ sourcetype="XXXXX*" source=TTTTTT Extension="*" MSID="*" Employee_Active="*" Employee_Id=* last_name="*" first_name="*"
    | rename Extension as DN
    | dedup Employee_Id
    | eval emplength=len(Employee_Id)
    | stats count by DN, MSID, Employee_Active, emplength, Employee_Id, last_name, first_name
    | where emplength>9
    | table DN, MSID, Employee_Active, emplength, Employee_Id, last_name, first_name
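All four conditions collapse into a single rule: flag anything that is not exactly nine digits. In SPL that could be something like `| where NOT match(Employee_Id, "^\d{9}$")` (a sketch to verify against the real field values). The same check prototyped in Python with hypothetical sample IDs:

```python
import re

# Hypothetical sample IDs: one valid nine-digit ID, then the four bad cases
# (too short, too long, alphanumeric, special character).
ids = ["123456789", "12345678", "1234567890", "N0001234", "12#45678A"]

# Valid means exactly nine digits; everything else is flagged.
flagged = [i for i in ids if not re.fullmatch(r"\d{9}", i)]
```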
I have a query that returns two values ("A" and "B"), and I want to make the field display dynamic: when "A" is bigger than "B", show "A" in green; when "A" is lower than "B", show it in red. I have managed to create dashboards like that, but only with fixed threshold values; in my case "B" is dynamic and comes from the same query as "A" (using two queries would be fine with me too). How can I do that?

    {
        "type": "splunk.singlevalue",
        "options": {
            "majorColor": "> majorValue | rangeValue(majorColorEditorConfig)"
        },
        "dataSources": {
            "primary": "ds_N2TXpjLO"
        },
        "context": {
            "majorColorEditorConfig": [
                { "value": "#D41F1F", "to": "B" },
                { "value": "#118832", "from": "B" }
            ]
        },
        "showProgressBar": false,
        "showLastUpdated": false
    }
When upgrading the Universal Forwarder using the .tgz on macOS, a pop-up appears and states the following:

    The "DeRez" command requires the command line developer tools. Would you like to install the tools now?

If 'Cancel' is selected, it appears not to affect anything, but I am unsure why this is happening. It appears to happen while the configuration is being migrated during a Splunk UF version upgrade on macOS. What is the "DeRez" command, and what is not being migrated when this happens? Thanks!

    -- Migration information is being logged to '/Applications/splunkforwarder/var/log/splunk/migration.log.2023-02-01.10-15-52' --
    This appears to be an upgrade of Splunk.
    --------------------------------------------------------------------------------
    Splunk has detected an older version of Splunk installed on this machine. To finish upgrading to the new version, Splunk's installer will automatically update and alter your current configuration files. Deprecated configuration files will be renamed with a .deprecated extension.
    You can choose to preview the changes that will be made to your configuration files before proceeding with the migration and upgrade:
    If you want to migrate and upgrade without previewing the changes that will be made to your existing configuration files, choose 'y'.
    If you want to see what changes will be made before you proceed with the upgrade, choose 'n'.
    Perform migration and upgrade without previewing configuration changes? [y/n] y
    Migrating to:
    VERSION=9.0.2
    BUILD=17e00c557dc1
    PRODUCT=splunk
    PLATFORM=Darwin-universal
    It seems that the Splunk default certificates are being used. If certificate validation is turned on using the default certificates (not-recommended), this may result in loss of communication in mixed-version Splunk environments after upgrade.
    "/Applications/splunkforwarder/etc/auth/ca.pem": already a renewed Splunk certificate: skipping renewal
    "/Applications/splunkforwarder/etc/auth/cacert.pem": already a renewed Splunk certificate: skipping renewal
    [DFS] Performing migration.
    [DFS] Finished migration.
    [Peer-apps] Performing migration.
    [Peer-apps] Finished migration.
    Init script installed at /Library/LaunchDaemons//com.splunk.plist.
    Init script is configured to run at boot.
    Splunk> Another one.
    Checking prerequisites...
    Management port has been set disabled; cli support for this configuration is currently incomplete.
    Invalid key in stanza [webhook] in /Applications/splunkforwarder/etc/system/default/alert_actions.conf, line 229: enable_allowlist (value: false).
    Your indexes and inputs configurations are not internally consistent. For more information, run 'splunk btool check --debug'
    Checking conf files for problems...
    Done
    Checking default conf files for edits...
    Validating installed files against hashes from '/Applications/splunkforwarder/splunkforwarder-9.0.2-17e00c557dc1-darwin-universal2-manifest'
    PYTHONHTTPSVERIFY is set to 0 in splunk-launch.conf disabling certificate validation for the httplib and urllib libraries shipped with the embedded Python interpreter; must be set to "1" for increased security
    All installed files intact.
    Done
    All preliminary checks passed.
    Starting splunk server daemon (splunkd)...
    Done
Hello all, I'm new to Splunk. I have the below table. I want to show the Previous Month Actual Cost in a single value panel, with the difference as a subscript and a trend indicator showing whether it increased or decreased compared to the Current Month Forecast Cost. How can I structure the Simple XML to get the desired output?
Hello, we have installed a Splunk Heavy Forwarder in IBM Cloud, communicating with the indexers, but we are experiencing network flickering; in other words, strange things are happening on the network. There is a warning message that the Splunk internal data sent from the forwarder to the indexers is too much for the indexers to handle. We have a firewall installed in front of the indexers that is denying traffic, stating 'TCP anomaly': 'Non-compliant TCP packets coming from multiple external sources were detected'. Could someone help me with this topic: is the forwarder sending too much data, or is there a network issue?
I am facing an issue where, for certain sourcetypes, the indexed events have a future timestamp. The data for these sourcetypes is indexed into Splunk via a HF and forwarded to the IDX. The props are defined from the SH GUI. Please help me understand and eradicate this issue. Example event data:

    12/12/2024 10:08:24 PM
    LogName=Application
    SourceName=Galaxy
    EventCode=1
    EventType=4
    Type=Information
    ComputerName=testserver.gtest.com
    TaskCategory=None
    OpCode=None
    RecordNumber=8425512
    Keywords=Classic
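Future timestamps often mean the timestamp settings are not applied where parsing actually happens: index-time props must live on the first full Splunk instance in the data path (here the HF), not only on the SH. A sketch of a props.conf stanza matching the example event's timestamp format (the stanza name is a placeholder; adjust it and the format to your data):

```ini
# props.conf on the HF that parses this data (stanza name is hypothetical)
[galaxy:application]
TIME_PREFIX = ^
TIME_FORMAT = %m/%d/%Y %I:%M:%S %p
MAX_TIMESTAMP_LOOKAHEAD = 25
# Reject timestamps more than 2 days in the future (2 is also the default)
MAX_DAYS_HENCE = 2
```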
We are running Splunk Cloud version 9.0.2208.4, and all the other components such as HFs and other client machines are running version 9.0 or above, but we have a few critical Windows client machines running the Windows 2008 R2 OS, and very important critical logs need to be ingested into Splunk from those machines. Can I install Splunk UF version 9.0.3 on those Windows 2008 R2 machines, will it be able to collect logs, and is it supported? Or do I need to install some lower version and ingest the logs with that? What is the recommended solution to get the logs ingested into Splunk? Kindly help on the same.
Hi team, I have downloaded the 9.0.3 UF Windows 64-bit package, installed the Splunk UF on my Microsoft Windows Server 2016 Datacenter machine, and started the services. After that, when I try to check the Splunk status or restart the Splunk UF, I get the warning below in the command prompt. How do I overcome this?

    Warning: overriding %SPLUNK_HOME% setting in environment ("C:\Program Files\SplunkUniversalForwarder\bin\") with "C:\Program Files\SplunkUniversalForwarder". If this is not correct, edit C:\Program Files\SplunkUniversalForwarder\etc\splunk-launch.conf

When I checked the splunk-launch.conf file, I can see the following:

    # SPLUNK_HOME=C:\Program Files\SplunkUniversalForwarder

When I enter %SPLUNK_HOME% in Run, it navigates directly to the directory C:\Program Files\SplunkUniversalForwarder\bin, and when I check the environment variables I can see SPLUNK_HOME has been set to C:\Program Files\SplunkUniversalForwarder\bin. So how do I get rid of the warning?
I am trying to extract IPs from a field called Text, where the field contains IPs and some string values. The field does not always contain just one IP; it may contain 2, 3, 5, or more. The IPs are not the same across events, while the string "value" is the same in every event. E.g.:

    Text="value 127.0.0.1,10.x.x.x, 10.x.x.1,10.x.x.3"
    Text="value 145.X.X.2, 19.x.x.3"
    Text="value 123.X.X.X"

So I need to extract only the IPs, separately (irrespective of how many there are), and "value" into one field.
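In SPL, rex with max_match=0 extracts every match into a multivalue field, so something like `| rex field=Text max_match=0 "(?<ip>\d{1,3}(?:\.\d{1,3}){3})"` is one sketch to verify. The extraction pattern prototyped in Python; the sample IPs below are concrete placeholders, since the ones in the question are masked with x's:

```python
import re

# Placeholder events standing in for the masked samples in the question.
texts = [
    "value 127.0.0.1,10.1.2.3, 10.4.5.1,10.6.7.3",
    "value 145.6.7.2, 19.3.4.3",
    "value 123.4.5.6",
]

# Four dot-separated 1-3 digit groups, word-bounded so longer runs don't match.
ip_re = re.compile(r"\b\d{1,3}(?:\.\d{1,3}){3}\b")
ips_per_event = [ip_re.findall(t) for t in texts]
```

Each event yields a list of all its IPs regardless of count, which is the multivalue behavior max_match=0 gives in rex.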
How do I generate a threshold-breach health rule report for a certain month using the API?