All Posts

Anything new you've found? Still couldn't solve the issue.
Thank you for this answer. It is perfect. I had a lot of the right code, I just did not know how to use the <change> tag correctly. It works exactly as I envisioned.
Hopefully this will set the issue out clearly. I have two sources, Transaction and Request. The Transaction holds the transaction ID, date and time, and user details of a user transaction. The Request holds the request ID, transaction ID, and an XML string with details of a user's search.

I have a query that searches the Request source and returns those searches which contain specific strings. However, I need to show the user details in the results table.

index="PreProdIndex" source="Request" "<stringCriterion fieldName=\"Product\" operator=\"equals\" value=\"Soup\"/>" OR "<stringCriterion fieldName=\"Product\" operator=\"equals\" value=\"Biscuits\"/>"
| table REQUEST_DATE_TIME REQUEST

So I need to add USER_DETAILS from the Transaction source onto the table in the above query, based on the common key of the transaction ID. In SQL I would simply put in a join on Transaction.ID=Request.Transaction_ID and all would be good, but I have failed to find anything that gives a Splunk solution yet.
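For what it's worth, the usual Splunk alternative to a SQL join is to search both sources at once and roll the events up by the shared key with stats. A minimal sketch, assuming the Request events carry a TRANSACTION_ID field and the Transaction events carry ID and USER_DETAILS (those field names are illustrative, not confirmed):

index="PreProdIndex" (source="Request" ("<stringCriterion fieldName=\"Product\" operator=\"equals\" value=\"Soup\"/>" OR "<stringCriterion fieldName=\"Product\" operator=\"equals\" value=\"Biscuits\"/>")) OR source="Transaction"
| eval txn_id=coalesce(TRANSACTION_ID, ID)
| stats values(REQUEST_DATE_TIME) as REQUEST_DATE_TIME values(REQUEST) as REQUEST values(USER_DETAILS) as USER_DETAILS by txn_id
| where isnotnull(REQUEST)

The final where keeps only transactions that actually matched a Request, which roughly mimics an inner join.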
Great, thanks Rich. It would be good if Splunk enabled the new geo-location DB that ships with Splunk Enterprise 9.0.0 or later, dbip-city-lite.mmdb, to be updated on a regular basis, instead of requiring us to replace it with MaxMind's or some other vendor's DB. Splunk could build that update functionality in behind the scenes if divulging the new vendor is top secret for some reason. Otherwise, the update procedure for the new DB could be added to the iplocation documentation page, as was done for MaxMind's update procedure.
Hi all, how can we implement wait logic in a Splunk query? We primarily monitor Service Down traps and create Splunk alerts from them. We now have a requirement to wait for a time interval and check whether a Service Up trap was received: if yes, don't create an alert; otherwise, create one. How can we implement this in a single query? Any suggestions please. Example:

If ServiceDown trap received:
    Wait for 5 minutes.
    If Good trap received:
        Return
    Else:
        Create alarm.

Thanks!
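One common way to approximate this in a single scheduled search is to look back over a window and alert only on Down traps that are old enough to have had their grace period, with no later Up trap for the same host. A rough sketch, assuming fields named trap_type and host and an index called traps (all illustrative):

index=traps (trap_type="ServiceDown" OR trap_type="ServiceUp") earliest=-15m
| stats latest(eval(if(trap_type="ServiceDown", _time, null()))) as last_down latest(eval(if(trap_type="ServiceUp", _time, null()))) as last_up by host
| where isnotnull(last_down) AND last_down <= relative_time(now(), "-5m") AND (isnull(last_up) OR last_up < last_down)

Scheduled every 5 minutes, this fires only for hosts whose most recent trap is a Down trap at least 5 minutes old.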
You can replace the geo-ip file with an MMDB file from any vendor, including MaxMind.  It does not have to be from the same vendor as the one that shipped with Splunk.
Make sure the Cisco IOS TA is installed and enabled.  If it is, go to Settings->Event types and make sure the eventtype itself is enabled.
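If it helps, you can also check the eventtype's status from the search bar with the rest command (assuming your role is allowed to run it):

| rest /services/saved/eventtypes
| search title="cisco_ios"
| table title search disabled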
I re-downloaded the TA_cisco-ios add-on and it finally started working! I think the one I downloaded to install offline may not have gotten all the files needed.
Thanks Rich! That answered all my questions, but brought up two new ones. We are running Splunk Enterprise 9.0.5, so we have the new $SPLUNK_HOME/share/dbip-city-lite.mmdb geo-location DB as you mentioned. The reason for this new question is that I noticed an IP address yesterday whose City seems to be outdated compared with the results from iplocation.net. Guessing there is no way to update the new dbip-city-lite.mmdb DB after the initial install, since Splunk has not divulged the vendor?

I went to the link you provided, and to the 9.0.5 page for iplocation, which does state the new vendor's mmdb file name, but the content after that shows how to update MaxMind's DBs, GeoLite2-City.mmdb and GeoIP2-City.mmdb, which, as you said, were replaced in 9.0.0 and are not shipped with version 9.0.5. Is this an oversight in the documentation?

iplocation - Splunk Documentation:

"Usage
The iplocation command is a distributable streaming command. See Command types. The Splunk software ships with a copy of the dbip-city-lite.mmdb IP geolocation database file. This file is located in the $SPLUNK_HOME/share/ directory.

Updating the IP geolocation database file
Through Splunk Web, you can update the .mmdb file that ships with the Splunk software. The file you update it with can be a copy of one of the following two files. Only those two files are supported. To use these two files, you must have a license for the GeoIP2 City database.

GeoLite2-City.mmdb - This is a free IP geolocation database that is updated on its download page on a weekly basis.
GeoIP2-City.mmdb - This is a paid version of the GeoLite2-City IP geolocation database that is more accurate than the free version.

Replacing your mmdb file with one of these two files reintroduces the Timezone field that is absent in the default .mmdb file, but does not reintroduce the MetroCode field.

Prerequisites
You must have a role with the upload_mmdb_files capability.

Steps
1. Go online and find a download page for the binary .tar.gz versions of the GeoLite2-City or the GeoIP2-City database files.
2. Download the binary .tar.gz version of the file (GeoLite2-City or GeoIP2-City) that is most appropriate for your needs.
3. Expand the binary .tar.gz version of the file. The .tar.gz file expands into a folder which contains the GeoLite2-City.mmdb file, or the GeoIP2-City.mmdb file, depending on the download you selected.
4. In Splunk Web, go to Settings > Lookups > GeoIP lookups file.
5. On the GeoIP lookups file page, click Choose file.
6. Select the .mmdb file.
7. Click Save. The page displays a success message when the upload completes."
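As an aside, a quick way to spot-check what the currently installed .mmdb resolves for a given address (8.8.8.8 here is just a placeholder IP):

| makeresults
| eval ip="8.8.8.8"
| iplocation ip
| table ip City Region Country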
See the https://docs.splunk.com/Documentation/Splunk/latest/Data/Applytimezoneoffsetstotimestamps article to understand how Splunk applies timezone information. It can be done in several different places; most probably you'd want to set the TZ on the forwarder so that it doesn't interfere with other components' settings.
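For example, a props.conf stanza on the forwarder might look like this (the sourcetype name and timezone are illustrative):

[my_sourcetype]
TZ = Europe/Amsterdam

TZ takes a standard zoneinfo name and is applied to events whose raw timestamps don't already carry explicit timezone information.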
Never figured out the "why" part, but there is a "how" part. Finding no explanation or solution, and with no "force push" option, having exhausted all other options I ended up manually transferring the lookup files to the appropriate locations in the cluster with the correct ownership etc., and it "just worked". So, "problem solved".
Pretty sure the forwarder can pass the event log as either XML or JSON from a host. If that's correct, could anyone consider sharing a bit of event log in "Splunk native" JSON format as raw text? I have some log samples in JSON format, though in a non-standard layout with some added metadata. What I'm looking for is a sample of the event log in JSON format which might be accepted by TA_windows and other apps, to compare against. Hopefully someone has some sample log they could share and spare me the need to generate samples. Best regards
Looking to create a search / report showing the ingest by source ingestion method in the last 24 hours. I am looking for the amount of data in GB being ingested per source method. So, for example, how much data in GB is being ingested for each of the following ingest methods:

UFs
Syslog
API
HEC
DBX
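There is no built-in "ingestion method" field, but one starting point is the license usage log, mapping sources/sourcetypes to methods yourself. A sketch, where the case() buckets are illustrative and would need adjusting to your environment's naming conventions:

index=_internal source=*license_usage.log* type="Usage" earliest=-24h@h
| eval method=case(match(st, "(?i)syslog"), "Syslog", match(s, "(?i)http"), "HEC", match(st, "(?i)dbx|database"), "DBX", true(), "UF/Other")
| stats sum(b) as bytes by method
| eval GB=round(bytes/1024/1024/1024, 2)
| table method GB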
Does it have to do with the highlighted parameter INDEXED_EXTRACTIONS? Example:Isolation:Web doesn't have any SEDCMDs:

[Example:Isolation:Web]
EVAL-vendor_region = lower('region'."-".'zone')
FIELDALIAS-aob_gen_Example_Isolation_Web_alias_1 = userName AS user
FIELDALIAS-aob_gen_Example_Isolation_Web_alias_2 = disposition AS action
FIELDALIAS-aob_gen_Example_Isolation_Web_alias_4 = categories{} AS category
FIELDALIAS-aob_gen_Example_Isolation_Web_alias_5 = fileName AS file_name
FIELDALIAS-aob_gen_Example_Isolation_Web_alias_6 = fileSize AS file_size
FIELDALIAS-aob_gen_Example_Isolation_Web_alias_7 = fileMimeType AS http_content_type
FIELDALIAS-aob_gen_Example_Isolation_Web_alias_8 = parentPageURL AS http_referrer
FIELDALIAS-aob_gen_Example_Isolation_Web_alias_9 = classification AS type
INDEXED_EXTRACTIONS = json
AUTO_KV_JSON = 0
KV_MODE = none
SHOULD_LINEMERGE = 0
TIMESTAMP_FIELDS = date
category = Example Web Isolation
pulldown_type = 1

local/props.conf:

[Example:Isolation:Url]
SEDCMD-sanitize_jsessionid = s/jsessionid=[0-9A-Za-z]+/jsessionid=masked_by_splunk/g
SEDCMD-sanitize_url_parameter = s/([#&])(access_token|id_token)=[^\s&",]+/\1\2=masked_by_splunk/g
SEDCMD-sanitize_url_parameters_password = s/([Pp][Aa][Ss][Ss][Ww][Oo][Rr][Dd])=[^\s"&']+/\1=masked_by_splunk/g
So I just tried to search for the eventtype cisco_ios and it's telling me it does not exist or is disabled? Any suggestions on how I get that eventtype enabled?
The value of the diff field is in seconds.  The strftime function adds that value to 1 Jan 1970 to come up with a timestamp.  Obviously, that is not the goal.  Expressing diff in days can be done in a couple of ways:

Divide seconds by 86400 to get a number of days:
| eval days=round(diff/86400,0)

Use the tostring function to convert seconds into d:H:M:S format:
| eval days=tostring(diff, "duration")
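Put together with the stats from the original question, the whole search might look like this sketch (keeping the asker's Gebeurtenis field name):

Gebeurtenis=000057927_018448922
| stats min(_time) as start max(_time) as end range(_time) as diff by Gebeurtenis
| eval start=strftime(start, "%d/%m/%Y"), end=strftime(end, "%d/%m/%Y")
| eval days=round(diff/86400,1)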
Apparently, the Cisco app is not performing the same search you are performing manually.  Examine the searches the app uses (click the magnifying glass icon on a panel or use the CLI to view the dashboard code) and compare it to your manual search.  Based on your findings, adjust how the data is onboarded or modify the queries.
The Cisco app shows no data from the syslog, but if I run a search, my network devices are sending syslogs to the correct indexer (UDP:514 - cisco:ios). My Splunk infrastructure is just a single server performing all functions. Please give me some suggestions to troubleshoot! I have tried deleting the data inputs and re-adding them, but with no luck.
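One quick comparison that often narrows this down is to count what is actually indexed versus what the app's eventtype matches (the sourcetype below is taken from your input config; the time range is arbitrary):

sourcetype="cisco:ios" earliest=-1h
| stats count by index, host

eventtype=cisco_ios earliest=-1h
| stats count

If the first search returns events but the second returns nothing, the gap is in the eventtype or the TA's knowledge objects rather than in the data itself.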
I am trying to extract the difference in time (duration) of two events, in days. I have two separate events for the same ID: one is the starting event and the second is the ending event. They look as follows:

Event 1 (start): [2023-05-24 12:02:24.674 CEST_] ID:1234
Event 2 (end): [2023-05-30 6:13:04:954 CEST_] ID:1234

The following query I tried:

Gebeurtenis(=id) =000057927_018448922
| stats min(_time) as start, max(_time) as end, range(_time) as diff by Gebeurtenis
| eval start=strftime(Aanmelden, "%d/%m/%Y")
| eval end=strftime(Afmelden, "%d/%m/%Y")
| eval diff=strftime(diff, "%d/%m/%Y")

The result I get: diff resolves to a date at the very start of the epoch, not the 6 days of difference. Any help is welcome.
Splunk hasn't disclosed the new vendor of geo-ip data, which changed with version 9.0. The file is $SPLUNK_HOME/share/dbip-city-lite.mmdb. You can read more about it in the iplocation documentation at https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Iplocation