All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hello all! This will be a doozy, so get ready. We are running a search with tstats-generated results; from various troubleshooting we simplified it to the following:

| tstats count by host
| rename host as hostname
| outputlookup some_kvstore

The config of the kvstore is as follows:

# collections.conf
[some_kvstore]
field.hostname = string

# transforms.conf
[some_kvstore]
collection = some_kvstore
external_type = kvstore
fields_list = hostname

When you run the first 2 lines of the SPL, you will get quite a few results, as it queries the internal db for hosts and retrieves a count of their logs. After you add the outputlookup command, it removes all your results and will not add them to the kvstore. As my coworker found, there is a way to write the results to the kvstore after all; however, the SPL for that is quite cursed, as it involves joining the original search back in, but the new results will be written to the kvstore:

| tstats count by host
| rename host as hostname
| table hostname
| join hostname [ tstats count by host | rename host as hostname ]
| outputlookup some_kvstore

As far as I'm aware, 9.1.2, 9.0.6, and the latest versions of Cloud have this issue even on fresh installs of Splunk; however, it does work on 8.2.1 and 7.3.3 systems (don't ask). The Splunk user owns everything in the Splunk directory, so there is no problem with writing to any files; the kvstore permissions are global, and any user can read or write to it. So after several hours of troubleshooting, we are stumped and not sure where to look next. Changing to a csv is unfortunately not an option.
Things we have tried so far, that I can remember:
- Completely fresh installs of Splunk
- Cleaning the kvstore via `splunk clean kvstore -local`
- Outputting to a csv (works)
- Using makeresults to create the fields manually and add to the kvstore (works)
- Using the noop command to disable all search optimization
- Writing to the kvstore via API (works)
- Reading data from the kvstore via inputlookup (works)
- Modifying an entry in the kvstore via the lookup editor app (works)
- Testing with all search modes (fast, smart, verbose)
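Since the poster notes that writing to the KV store via the REST API does work, here is a hedged Python sketch of that fallback path, building the documented single-record insert endpoint and JSON body. The host, app name ("search"), and record are assumptions for illustration; no request is actually sent.

```python
import json

# Hedged sketch of the KV store REST insert that the poster says works.
# Host, app, and record values below are assumptions for illustration.
def build_kvstore_insert(host, app, collection, record):
    """Return the endpoint URL and JSON body for a KV store single-record insert."""
    url = (f"https://{host}:8089/servicesNS/nobody/{app}"
           f"/storage/collections/data/{collection}")
    return url, json.dumps(record)

url, body = build_kvstore_insert("localhost", "search", "some_kvstore",
                                 {"hostname": "web-01"})
# POST `body` to `url` with Content-Type: application/json and an auth header.
```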
I am trying to remove Windows EventCodes 4688 and 4627. Nothing I have tried has worked. Here are the things that I have tried, all in inputs.conf:

blacklist = EventCode="4688" Message="(?:New Process Name:).+(?:SplunkUniversalForwarder\bin\splunk.exe)|.+(?:SplunkUniversalForwarder\bin\splunkd.exe)|.+(?:SplunkUniversalForwarder\bin\btool.exe)|.+(?:Splunk\bin\splunk.exe)|.+(?:Splunk\bin\splunkd.exe)|.+(?:Splunk\bin\btool.exe)|.+(?:Agent\MonitoringHost.exe)"
blacklist1 = EventCode="4688"
blacklist2 = EventCode="4627"
blacklist = EventCode=4627,4688
blacklist = EventCode=4627|4688
blacklist = EventCode=%^(4627|4688)$%
blacklist = EventCode=%^4627$%
blacklist = EventCode=%^4688$%
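Splunk's event log blacklist values are regular expressions, so one way to debug is to check the regex itself outside Splunk first. A hedged Python sketch (the sample codes are invented); the anchored alternation is the form that matches exactly 4627 or 4688 and nothing longer:

```python
import re

# Anchored alternation: matches exactly 4627 or 4688, and nothing longer
# such as 46270. The sample codes below are invented for the check.
pattern = re.compile(r"^(4627|4688)$")

codes = ["4688", "4627", "4624", "46270"]
blocked = [c for c in codes if pattern.match(c)]
```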
I have some search before this, and after I extract fields (name, status) from json and mvzip them together, I get this table:

_time                name   status      nameStatus
2023-12-06 16:06:20  A B C  UP DOWN UP  A,UP B,DOWN C,UP
2023-12-06 16:03:20  A B C  UP UP UP    A,UP B,UP C,UP
2023-12-06 16:00:20  A B C  DOWN UP UP  A,DOWN B,UP C,UP

I want to get only the latest record, so I pipe in the command ...|stats latest(nameStatus). However, the result comes out only as A,UP. How can I fix this? Thank you!
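A hedged sketch of the intended result, to make the goal concrete: keep every nameStatus value from the single newest record, rather than a single latest value overall. The times and values are the ones from the table above.

```python
# Hedged sketch: select the whole newest record, keeping all of its
# multivalue entries. Times and values come from the table above.
records = [
    ("2023-12-06 16:06:20", ["A,UP", "B,DOWN", "C,UP"]),
    ("2023-12-06 16:03:20", ["A,UP", "B,UP", "C,UP"]),
    ("2023-12-06 16:00:20", ["A,DOWN", "B,UP", "C,UP"]),
]

# Timestamps in this format sort lexicographically, so max() picks the newest row.
latest_time, latest_values = max(records, key=lambda r: r[0])
```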
I'm going crazy trying to troubleshoot this error with Eventgen. I'm only using one mvfile replacement type and it is not working. The SA-Eventgen logs tell me this:

time="2023-12-06T19:42:32Z" level=warning msg="No srcField provided for mvfile replacement: "

In my $SPLUNK_HOME/etc/apps/<app>/default/eventgen.conf file, I have:

...
token.2.token = "(\$customer_name\$)"
token.2.replacementType = mvfile
token.2.replacement = $SPLUNK_HOME/etc/apps/eventgen_yogaStudio/samples/customer_info.txt:1
...

My customer_info.txt file contains:

JoeSmith,43,Wisconsin,Pisces
JaneDoe,25,Kentucky,Gemini
...

I'm getting JSON-formatted events, but customer_name is just blank:

{
  membership: gold
  customer_name:
  item: 30-day-pass
  quantity: 4
  ts: 1701892130
}

I've tried the following sample file names: customer_info.txt, customer_info.sample, customer_info.csv. Nothing seems to work. I'm going crazy!
Hi all, I published a new version of my app, https://splunkbase.splunk.com/app/7087, version 1.2.0 (invisible for now because of the issue below). When I tried to install it on my cloud instance through Splunkbase, I got this error:

X509 certificate (CN=splunkbase.splunk.com,O=Splunk Inc.,L=San Francisco,ST=California,C=US) common name (splunkbase.splunk.com) did not match any allowed names (apps.splunk.com,cdn.apps.splunk.com)

That's weird, because I did not change anything about certificates or the packaging process... I just fixed one more bug in the app (about missing data) and bumped the app version. I tried other apps on Splunkbase and the old version of my app; they all work fine... Does anyone have an idea what happened to my 1.2.0 app? Your help will be appreciated very much!
Hi, I have a problem excluding or including only entries that contain specific string values in the msg field. For example, there are two (maybe more) definite string values contained in the msg field:

1. "GET /ecc/v1/content/preLoginBanners HTTP/1.0"
2. "GET /ecc/v1/content/category/LegalTerms HTTP/1.0"

I need 3 statements like the following:

1. Include ONLY 1 above in the msg field.
2. Include ONLY 2 above in the msg field.
3. Exclude both 1 and 2 above, to determine if there are more unknown values in the msg field.

I imagine I will be using this type of logic more on other output fields as time goes on. I am new to this and I am using the XML-based ad hoc search input/output form. Any help is greatly appreciated!
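The three statements amount to two substring filters and their combined negation. A hedged Python sketch of that logic; the first two strings are from the question, and the third is invented to stand in for "anything else":

```python
# Hedged sketch of the three filters. The third msg value is invented to
# represent an unknown entry.
msgs = [
    '"GET /ecc/v1/content/preLoginBanners HTTP/1.0"',
    '"GET /ecc/v1/content/category/LegalTerms HTTP/1.0"',
    '"GET /ecc/v1/content/somethingElse HTTP/1.0"',
]

banners = [m for m in msgs if "preLoginBanners" in m]   # statement 1
legal = [m for m in msgs if "LegalTerms" in m]          # statement 2
other = [m for m in msgs                                # statement 3
         if "preLoginBanners" not in m and "LegalTerms" not in m]
```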
Hello, I am trying to find a command that will allow me to create a table that only displays non-null values. When using the user agent field in my table, some of the values are null. I only want the non-null values to display.
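In SPL this is typically a `where isnotnull(...)` step before the table; the filtering itself is just "drop rows where the field is missing", sketched here in Python with invented row data:

```python
# Hedged sketch: keep only rows where the user agent field is present.
# The row data is invented.
rows = [
    {"host": "a", "user_agent": "Mozilla/5.0"},
    {"host": "b", "user_agent": None},
    {"host": "c", "user_agent": "curl/8.4"},
]

with_agent = [r for r in rows if r["user_agent"]]
```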
Hi, I am seeing an aggregation issue for one of my Cisco sourcetypes. How can I fix this issue in my Splunk Cloud?

12-06-2023 17:42:27.004 +0000 ERROR AggregatorMiningProcessor [82698 merging_0] - Uncaught exception in Aggregator, skipping an event: Can't open DateParser XML configuration file "/opt/splunk/etc/peer-apps/Splunk_TA_cisco-ise/default/datetime_udp.xml": No such file or directory - data_source="/syslog/nac/ise.log", data_host="ise-xx", data_sourcetype="cisco:ise:syslog"

Thanks...
Is there any mechanism to monitor a Salesforce URL beyond single sign-on? We tried to set this up using Splunk Website Monitoring, but that app only monitors the single sign-on page, not the actual page behind it. Please suggest a method to monitor a URL beyond single sign-on. Thanks.
For example: if "fieldX" has many possible values (e.g. 1 2 3 4 a b c d ...), we want Splunk to send an alert email whenever any of these values is seen more than 10 times in 60 minutes. Does anyone know a search that will work for this? Thanks in advance!
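In SPL this is commonly a `stats count by fieldX` with a `where count > 10` condition, scheduled over a 60-minute window. The underlying condition, sketched in Python with an invented event stream:

```python
from collections import Counter

# Hedged sketch of the alert condition: count each value of fieldX seen in
# the window and flag any value seen more than 10 times. Events are invented.
events = ["1"] * 12 + ["a"] * 3 + ["2"] * 11
counts = Counter(events)

over_threshold = {value: n for value, n in counts.items() if n > 10}
```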
Do you need to return output from one section of a chained search to another, like when writing a function in a programming language? I've assumed that a chained search would, from a user's point of view, act much like concatenating both searches, but with really DRY efficiency, which makes it a superb fit for dashboarding, since the material being presented often shares a common subject. There are certain queries I am running that break when used in a chained order. Am I missing some kind of return function?
Hello all, can someone help me with where I can download the Splunk Tools 6.3 package for Linux?
When an upstream error is logged in our Splunk, it has two fields that contain all the information about the error, so I created a nice little query to show a simple table of the two fields:

stats values(errorMessage) by errorCode

However, one of the error messages in the errorMessage field can contain an id for the current transaction with the server, so when we scale up and release, this table will contain hundreds of values for a single error type. Examples of the types of errors (obviously sanitized, without actual data):

errorCode: Not Required, errorMessage: [Error: Not Required] 400: Downgrade for transactionId=00000000000: type=01 country=GB
errorCode: Not Required, errorMessage: [Error: Not Required] 400: Downgrade for transactionId=00000000001: type=01 country=GB
errorCode: Invalid Request Parameters, errorMessage: [Error: Invalid Request Parameters] 400: Value of 30 for field not valid
errorCode: undefined, errorMessage: [Error: undefined] 400: undefined
errorCode: undefined, errorMessage: [Error: undefined] 500: undefined

I would like values(errorMessage) to group the first two items as a single entry. If I could create a new field without the transactionId, or with it replaced by the same placeholder value, the information would be much easier to read and present for error triage in our dashboard, because the transaction id is not important for seeing an error trend. I'm not super great with regex, but I feel there is something that would work to just find a run of digits of a specific length and remove or replace it. Is that possible? Thanks
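This is possible; the idea is a regex substitution that collapses the variable transactionId to a fixed placeholder before the stats, so identical errors group together (in SPL, `rex mode=sed` can perform the same substitution). A hedged Python sketch using the first two messages from the question:

```python
import re

# Hedged sketch: replace the variable transactionId with a fixed placeholder
# so the two messages normalize to the same string.
msgs = [
    "[Error: Not Required] 400: Downgrade for transactionId=00000000000: type=01 country=GB",
    "[Error: Not Required] 400: Downgrade for transactionId=00000000001: type=01 country=GB",
]

normalized = [re.sub(r"transactionId=\d+", "transactionId=<id>", m) for m in msgs]
```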
Hello, the rex command to catch and group the multivalue Accesses field is not working, even though the results in regex101 are fine. Could you tell me what I am missing?

Test log:

12/12/2012 04:25:13 PM
LogName=Security
EventCode=5145
EventType=0
ComputerName=test.corp
SourceName=Microsoft Windows security auditing.
Type=Information
RecordNumber=2049592111
Keywords=Audit Success
TaskCategory=Detailed File Share
OpCode=Info
Message=A network share object was checked to see whether client can be granted desired access.
Subject:
  Security ID: User\Test
  Account Name: Test
  Account Domain: Test
  Logon ID: 0x117974CE
Network Information:
  Object Type: File
  Source Address: ::1
  Source Port: 51234
Share Information:
  Share Name: \\*\C$
  Share Path: \??\C:\
  Relative Target Name: Users\Test\Desktop
Access Request Information:
  Access Mask: 0x100081
  Accesses: SYNCHRONIZE
    ReadData (or ListDirectory)
    ReadAttributes
Access Check Results:
  -

Splunk rex query:

... | rex field=Body ".*Access Mask.*\sAccesses:\s(?<Accesses2>.+?)Access\sCheck Results\:.*"

Thanks, regards,
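One likely cause: the Accesses block spans several lines, and regex101's default flags can differ from Splunk rex, where "." does not match newlines unless you opt in with `(?s)`. A hedged Python sketch of the same pattern with the dotall flag, on a body trimmed from the event above:

```python
import re

# Hedged sketch: (?s) lets "." cross newlines, so the lazy group can span
# the multi-line Accesses block. The body is trimmed from the sample event.
body = ("Access Request Information: "
        "Access Mask: 0x100081\n"
        "Accesses: SYNCHRONIZE\n"
        "ReadData (or ListDirectory)\n"
        "ReadAttributes\n"
        "Access Check Results: -")

m = re.search(r"(?s)Accesses:\s(?P<Accesses2>.+?)Access\sCheck\sResults:", body)
accesses = m.group("Accesses2").split()
```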
We have recently switched from one proxy to another in our organisation. When we put the new proxy details into the relevant add-ons (ServiceNow, Cisco Umbrella, etc.), the data feeds stop. The network team inform me that we need to use the CA file that they supply. Does anyone know where this needs to be installed in Splunk? I thought in /etc/auth/, but I'm not sure how we point the config to it.
Hello, I would like to ask if there is a way to restore the splunk user's password. During the deployment of the UF on a client, a splunk user was created to deploy the UF. Unfortunately this password is not working anymore. How can I restore the password for this user? What would happen if the new UF version 9.1.2 is deployed? Would it help to create a new user? Thanks in advance
Hi there! I would like to find the values of host that were in macro 1 but not in macro 2.

Search 1: `macro 1` | fields host
Search 2: `macro 2` | fields host

macro 1 hosts: a, b, c, d
macro 2 hosts: a, b, e, f

Result: count = 2, because hosts c and d were not in macro 2. Thanks in advance!
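What's being asked for is a set difference: hosts from macro 1 that never appear in macro 2. A hedged Python sketch with the host lists from the question, to pin down the expected answer:

```python
# Hedged sketch: set difference between the two host lists from the question.
macro1_hosts = {"a", "b", "c", "d"}
macro2_hosts = {"a", "b", "e", "f"}

missing = sorted(macro1_hosts - macro2_hosts)
count = len(missing)
```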
Hello Splunkers!!

index=messagebus "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"="ASR/Hb/*/Entry*" OR "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"="ASR/Hb/*/Exit*"
| stats count by "AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName"
| fields - _raw
| fields AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName
| rex field=AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName "(?<location>Aisle\d+)"
| fields - AsrLocationStatusUpdate.AsrLocationStatus.LocationQualifiedName
| strcat "raw" "," location group_name
| stats count BY location group_name

This is the current visualization I am getting from the above search, as a column chart. I want to obtain the visualization shown below. Please guide me on what changes I need to make to my current SPL to obtain it.
I am using Splunk 9.0.4 and I need to make a query where I extract data from a main search. So I am interested in results from the main search:

stage=it sourcetype=some_type NOT trid="<null>" reqest="POST /as/*/auth *"

But then I need to filter the results of the main search using a subsearch that operates on a different data set, using a value from a field in the main search, let's call it trid; trid is a string that might be part of a value called message in the subsearch. There might be more results in the subsearch, but if there is at least one matching result in the subsearch, then the result from the main search stays; if not, it should not be included. So I am interested only in the results from the main search, and the subsearch is only used to filter out those that do not match.

stage=it sourcetype=some_type NOT trid="<null>" reqest="POST /as/*/auth *"
| fields trid
  [ search stage=it sourcetype=another_type
    | eval matches_found=if(match(message, "ID=PASSLOG_" + trid), 1, 0)
    | stats max(matches_found) as matches_found ]
| where matches_found>0

After a few hours I cannot figure out how to make it work. What is wrong with it? Please advise.
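One reason the posted SPL cannot work as written: a subsearch runs once, before the main search, so it cannot see per-event values like trid from the outer search. The filter the poster actually wants is a per-result membership test, sketched here in Python with invented trids and an invented message:

```python
# Hedged sketch of the intended filter: keep a main-search result only if
# its trid appears inside some message from the other data set. Sample
# trids and the message are invented.
main_results = [{"trid": "abc123"}, {"trid": "zzz999"}]
messages = ["2023-12-06 ... ID=PASSLOG_abc123 status=OK"]

kept = [r for r in main_results
        if any(f"ID=PASSLOG_{r['trid']}" in msg for msg in messages)]
```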
Hi all, I have been trying to extract userids which have special characters in them, with no luck. For example, let's say a field named uid contains two userids: one is "roboticts@gmail.com" and the other is "difficult+1@gmail.com". Now I want to write a query that extracts only the uids with a + sign in them. Please help with this.
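The usual pitfall is that "+" is a regex quantifier, so it must be escaped (or placed in a character class) to match a literal plus sign. A hedged Python sketch with the two uids from the question:

```python
import re

# Hedged sketch: \+ matches a literal plus; an unescaped + would be a
# quantifier and fail to compile or match as intended.
uids = ["roboticts@gmail.com", "difficult+1@gmail.com"]

plus_uids = [u for u in uids if re.search(r"\+", u)]
```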