All Topics

I've been wanting to build some integrity checking and other functionality based on knowing the fields in a sourcetype for a while now. At my company we've built a data dictionary of the indexes and sourcetypes of interest to the SOC. They can search the dictionary to help them remember the important data sources. I'd like to augment/use this info in a couple of new ways: 1) give them a field list for all of these sourcetypes so they could search for which sourcetypes have a relevant field (like src_ip); 2) note the fields that appear in 100% of records for a sourcetype, and then every day find out if the sourcetype is missing any of those fields. This would quickly clue me into data issues related to the events sent, parsing, or knowledge objects.

I know how to get a list of fields for one sourcetype and store that info, and I know how to compare a sourcetype's past set of fields to a current set. My challenge now is how to get the list of fields for the 100 sourcetypes of interest. So far my best idea is to create 100 jobs, one per sourcetype, something like:

```1-get the sourcetypes of interest and pull back data for them```
[| inputlookup dataDictionary.csv where imf_critical=true
 | eval yesterday=relative_time(now(),"-1d@d")
 | where evalTS>yesterday
 | dedup sourcetype
 | sort sourcetype
 | head 5 | tail 1
 | table sourcetype] earliest=-2d@d latest=-1d@d
```2-get samples for all indexes in which the sourcetype appears```
| dedup 10 index sourcetype
| fieldsummary
```3-determine field coverage so we can pick the hallmark fields```
| eventstats max(count) as maxCount
| eval pctCov=round(count/maxCount,2)*100
| table field pctCov
```4-add back in the sourcetype name```
| append
    [| inputlookup dataDictionary.csv where imf_critical=true
     | eval yesterday=relative_time(now(),"-1d@d")
     | where evalTS>yesterday
     | dedup sourcetype
     | sort sourcetype
     | head 5 | tail 1
     | table sourcetype]
| eventstats first(sourcetype) as sourcetype
| eval evalTS=now()
| table sourcetype evalTS field pctCov
```5-collect the fields to a summary index daily```
| collect index=soc_summary marker="sumType=dataInfo, sumSubtype=stFields"

If I ran 100 jobs like this, the number after head would increment to give me the next sourcetype. But I feel like there has to be a better way to do fieldsummary on a lot of sourcetypes. Any ideas?
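One possible consolidation, sketched under the assumption that wildcard expansion in stats (count(*) as *) behaves as in recent Splunk versions: run a single job over all the critical sourcetypes, count non-null values per field, and unpivot with untable so coverage can be computed per sourcetype. This replaces the head 5 | tail 1 loop entirely; the lookup and field names come from the post, everything else is illustrative and uses the same max(count) proxy for total events as the original.

index=* earliest=-2d@d latest=-1d@d
    [| inputlookup dataDictionary.csv where imf_critical=true
     | eval yesterday=relative_time(now(),"-1d@d")
     | where evalTS>yesterday
     | dedup sourcetype
     | table sourcetype ]
```sample up to 10 events per index/sourcetype pair, as in the original```
| dedup 10 index sourcetype
```one column per field, holding the number of sampled events where it is non-null```
| stats count(*) as * by sourcetype
```unpivot to one row per sourcetype/field pair```
| untable sourcetype field count
| eventstats max(count) as maxCount by sourcetype
| eval pctCov=round(count/maxCount,2)*100
| eval evalTS=now()
| table sourcetype evalTS field pctCov
| collect index=soc_summary marker="sumType=dataInfo, sumSubtype=stFields"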
Hi to all, is it possible to invert the y1 and y2 axes? Second question: if the y1 axis shows a percentage value and the y2 axis shows a count value, is it possible to add the symbol "%" to y1? Thanks to all!
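For the first question, a minimal Simple XML sketch, assuming a classic (non-Studio) dashboard chart; the series name below is a placeholder. Whichever fields are listed in charting.axisY2.fields are drawn against the right-hand (y2) axis, so swapping which series sits on y1 versus y2 is just changing that list:

<option name="charting.axisY2.enabled">true</option>
<option name="charting.axisY2.fields">count</option>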
Hi, I am trying to create an alert and a weekly scheduled report for the user "us.admin" in Splunk. I want to get an alert on this user's logins and, if possible, their activities. I am already monitoring the path and pushing the data into Splunk. What are the appropriate search strings to do this? Thanks
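A minimal sketch, assuming the logins land in Windows Security event logs (the index name is a placeholder; EventCode 4624 is the standard Windows successful-logon event):

index=wineventlog EventCode=4624 user="us.admin"
| table _time, host, src, action

Saved as an alert with a "number of results > 0" trigger, this fires on each login; loosening the base search to user="us.admin" alone and scheduling it weekly gives the activity report.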
Hi Splunkers! Is there a way to automatically retrieve entity information like OS, IP address, OS version, etc. and add it as dimensions? All my entities are retrieved via the Splunk Add-on for *nix and the Splunk Add-on for Windows. All my entities are correctly imported into ITEW, but none of them has any other information, such as its IP or its OS, in the Entity Info Fields. I have about 1200 entities, so I'm looking for a way to add this information to all my entities automatically. All the needed data is correctly indexed, and my saved searches ITSI Import Objects - TA *Nix and ITSI Import Objects - Perfmon get this info correctly. Can somebody help me with this issue? Happy Splunking!
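A hedged sketch of the general shape such an import search can take (the indexes, sourcetypes, and field names below are all placeholders; the point is one row per host, with each extra column available for mapping during import setup):

index=windows OR index=os
| stats latest(os) as os, latest(os_version) as os_version, latest(ip) as ip by host
| rename host as entity_title

Scheduled as a recurring entity import, the os/os_version/ip columns can then be mapped to entity informational fields so all ~1200 entities pick them up automatically on each run.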
I am trying to use Splunk Dashboard Studio. I have a search for a single value viz:

| makeresults
| eval Date=strftime(now(),"%Y-%m-%d %H:%M:%S")
| table Date
| rename Date AS UTC-DateTime

The single value viz always renders the time in the format "2022-12-02T20:39:21", ignoring the strftime format in my search. I can apply a format to a table column with no problem. How can I format the value in the single value viz as "2022-12-02 20:39:21", and how can I modify or refresh the query so that it gets the time every second? I saw a YouTube tutorial on this, but the author did not explain the query, the refresh process, or how to apply a different format to the value. Please advise, thanks, eholz1
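For the refresh part, a hedged sketch of the data source definition in the dashboard's JSON source, assuming Dashboard Studio's ds.search options support refresh/refreshType as shown (the data source name is a placeholder):

"ds_utc_time": {
    "type": "ds.search",
    "options": {
        "query": "| makeresults | eval Date=strftime(now(),\"%Y-%m-%d %H:%M:%S\") | table Date | rename Date AS UTC-DateTime",
        "refresh": "1s",
        "refreshType": "delay"
    }
}

"refresh" re-runs the search on the given interval, and "refreshType": "delay" counts that interval from when the previous run finishes.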
We are looking to see the size of all the fields in a particular index. We have come up with this search to see the size of a particular field, but we would like to see the size of all the fields in the index, in order to understand where the bulk of the data is sitting.

index=index_name
| eval raw_len=(len(_raw)/1024/1024/1024)
| stats sum(raw_len) as GB by field_name
| sort -GB
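A hedged sketch of one way to size every field in one pass, assuming foreach's <<FIELD>> token substitution; note it measures the total length of each extracted field's values, which approximates (rather than exactly equals) that field's share of _raw:

index=index_name
```compute a length column for every field present in each event```
| foreach * [ eval len_<<FIELD>> = len('<<FIELD>>') ]
| stats sum(len_*) as len_*
```flip the single row of sums into one row per field```
| transpose column_name=field
| rename "row 1" as bytes
| eval field=replace(field,"^len_",""), GB=round(bytes/1024/1024/1024,3)
| sort -GB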
I want to change the column cell background based on the value, but I also want to use a wildcard. Example field values:

Passed (12:20)
Failure (2:30)
Passed (4:40)

I want to change the cell color based only on Passed and Failure, and ignore the rest of the string.
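One safe way around the wildcard limitation, sketched assuming the search can be edited (the Result field name is a placeholder): derive a clean status field in SPL and drive the coloring off its exact values.

| eval status=case(match(Result,"^Passed"), "Passed",
                   match(Result,"^Failure"), "Failure",
                   true(), "Other")

With only Passed/Failure/Other as values, the standard exact-match cell color formatting applies, and the original column can still be shown alongside.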
My query:

index=primary eventType=ConnectionTest msg="network check results"
| spath output=connectError details.error.connectionError
| fillnull value=false connectError
| dedup visitId
| stats count as total, count(eval(connectError==true)) as errors

If I run this, "errors" always returns 0. However, if I run

index=primary eventType=ConnectionTest msg="network check results"
| spath output=connectError details.error.connectionError
| fillnull value=false connectError
| dedup visitId
| stats count by connectError

connectError properly returns the set of values in each bucket of connectError. My dataset will sometimes contain the object "details.error"; I tried fillnull to resolve this, but that didn't work. If I look at the events for either query, I do see "connectError" in the "Interesting Fields" list on the left-hand side. How do I get the first query to work so that I can get errors and total errors? I want to follow it up with | eval percentErrors=errors/total, but I first need to get the stats to work properly.
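A likely fix: spath extracts connectError as the string "true", and inside eval the unquoted connectError==true compares against a field named true, which doesn't exist, so the comparison never succeeds. Quoting the literal should make the first query count properly:

| stats count as total, count(eval(connectError=="true")) as errors
| eval percentErrors=errors/total

The stats count by connectError version works because it groups on the values and never compares against the bare word true.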
Hi, I want to index a simple XML file:

<?xml version="1.0" encoding="utf-8"?>
<unitData xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xmlns:xsd="http://www.w3.org/2001/XMLSchema"
          xsi:noNamespaceSchemaLocation="unitData-1.0.xsd"
          unit="0000006000" equipment="W052A-22G0014" operator="admin"
          starttime="2022-11-22T06:10:53+01:00" endtime="2022-11-22T06:15:07+01:00"
          state="ok">
</unitData>

Before indexing, I would like to create a new additional attribute, machine, whose value depends on these conditions:

case equipment="W052A-22G0014": machine=machine1
case equipment="W052A-22G0013": machine=machine2

Can anybody help, please?
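A hedged sketch of the index-time route, assuming an INGEST_EVAL transform on the indexer or heavy forwarder (the sourcetype and transform names are placeholders):

# props.conf
[your_xml_sourcetype]
TRANSFORMS-set_machine = set_machine

# transforms.conf
[set_machine]
INGEST_EVAL = machine=case(match(_raw, "equipment=\"W052A-22G0014\""), "machine1", match(_raw, "equipment=\"W052A-22G0013\""), "machine2")

If the value only needs to exist at search time, the same case() expression works as EVAL-machine in props.conf on the search head, which is usually the lighter-weight option.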
What is the query to set up a report to log all activity from a user? Basically, any time they access the VPN and log into the network, plus all activity they are doing.
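A minimal sketch, assuming the VPN and authentication data are already indexed and share a normalized user field (the index names and username are placeholders):

index=vpn OR index=auth user="jdoe"
| table _time, index, sourcetype, action, src, dest

Scheduled as a report, this lists everything indexed for that user; the real work is identifying which indexes and sourcetypes actually hold the VPN and login events.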
We've got Splunk_TA_Windows installed on a number of our servers sending data to our Splunk Cloud instance. However, there is far too much WinEventLog data being sent, pushing us to the limits of our ingest volume. What are best practices to lower this volume? I've already updated the props.conf file with the recommendations from the app installation, and we've made adjustments to winnetmon to lower that volume. Are there any other best practices out there? We don't want to just disable it entirely.
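One more lever, sketched as an inputs.conf override on the forwarders, assuming the noisy event codes have been identified first (the codes below are placeholders; WinEventLog stanzas accept blacklist and blacklist1..blacklist9 entries):

# inputs.conf, e.g. in Splunk_TA_windows/local/
[WinEventLog://Security]
# drop high-volume, low-value events before they leave the host
blacklist1 = EventCode="4662"
blacklist2 = EventCode="5156|5158"

A quick way to find candidates is a volume search such as index=wineventlog | stats count by EventCode | sort -count.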
Dear all, I have the use case that my Splunk universal forwarder does not continuously monitor my logs. Because of this, I am using batch mode to have the files deleted after ingestion. Now, I occasionally receive log files which I have already received at an earlier point in time. The problem is: the features crcSalt, initCrcLength, etc. are only available in monitor mode. This means that I am not able to benefit from Splunk's features to prevent duplicate ingestion of the same data. Any help on a solution for this is greatly appreciated.
I have two Splunk Enterprise environments, both at 9.0.2. For users in one environment, search history goes back only two days. For users in the other environment, search history goes back more than 8 months. Any clue about what could cause that? Both environments are using a single search head. Users are set up the same in each environment. The limits.conf on both search heads is identical. I verified that the user's search history .csv file goes back two days on one and 8 months on the other.
Hello Splunkers!! We have a dashboard which works off loadjob. When users try accessing the dashboard, they get a "No results found" message. First I thought it was a problem with permissions, but out of 4 colleagues with the same admin access as mine, 3 are able to see the dashboard results, so it seems it is not a permissions problem. To figure out the problem in the query, we traced the logic back line by line and found the line from which the affected user gets 0 results.

Search query:

| loadjob reportname
..... some evals & lookups ....
| eval valid=if(match(backlog_dates,e_time),"yes","no")
| search valid=yes ---> no results from this line

I replaced 'match' with 'like' but still no results. I also tried the line below, but had the same issue:

| where backlog_dates like e_time

I checked the logs for both the users who get results and the users who don't, but there is nothing to suspect and no errors in the logs. It is very strange that it works for some users. Please help me figure out the issue. Below is the sample data
Hi, I am sending Windows system and security data to Splunk Cloud. The data is collected using a UF and forwarded to the cloud through a HF. I want to get rid of extra text in the Windows data (example: 4624). I saw the SEDCMD stanzas in the documentation and tried placing them in the sourcetype on Splunk Cloud, but it is not working, although the same thing works on my on-prem indexer. Not sure what is wrong. Any suggestions would be appreciated.
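A likely cause, given the HF in the path: SEDCMD is applied where events are first parsed, and with a heavy forwarder in front of Splunk Cloud that is the HF itself, so props applied only in Cloud never see this data raw. A minimal sketch for the HF (the sourcetype name is a placeholder; the sed expression is whatever already worked on-prem):

# props.conf on the heavy forwarder
[XmlWinEventLog]
SEDCMD-strip_extra_text = s/your-pattern-here//g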
Hi, I have a string in my Splunk logs something like the below:

msg.message="Matches Logs :: Logger{clientId='hFKfFkF-K7jlp5epzCnZASazoYmXxgUzBLQ8cixb7f23afb8', apiName='Matches', apiStatus='Success', error='NA', locationIdMerchDetail=[6d65fcb6-8885-4f56-93c1-7050c8bef906 :: QUALITY COLLISION 1 LLC :: 1, e5ff5b47-839c-4ed0-86a3-87fc18f4bfda :: P A JOLLY'S LLC :: 2, 2053428f-f6ba-4038-a03e-4dbc8737c37d :: CREATIVE EXCELLENCE SALON LLC :: 3, c3e9e6fc-8388-49fd-ba7b-3b9d76f5f9ea :: QUALITY SERVICES AND APP :: 4, 75ca5712-f7a1-4a63-a69f-d73c8e7d187b :: FREEDOM COMICS LLC :: 5, e87a96e8-de73-47f8-bfbd-6099c83376f7 :: S AND G STORES LLC :: 6, 732f9d61-3916-4664-9601-dd0745b68837 :: QUALITY RESALE :: 7, d666bef7-e2fa-498f-a74f-e80f6d2701e7 :: CAKE ART SUPPLIES LLC :: 8, 23ca4856-5908-4bd6-b90d-cace07036b05 :: INTUIT PAYMENT SOLUTIONS, LLC :: 9, b583405f-bb3d-4dba-9bb3-ee9b3713b8f7 :: LA FIESTA TOLEDO LLC :: 10], numReturnedMatches='10'}"

My string contains locationIdMerchDetail as highlighted above. I need to extract locationId and rank into a table, the first item of each comma-separated entry being locationId and the last being rank. For example, in

6d65fcb6-8885-4f56-93c1-7050c8bef906 :: QUALITY COLLISION 1 LLC :: 1

locationId: 6d65fcb6-8885-4f56-93c1-7050c8bef906
rank: 1

I am able to extract the locationIds into a table using the query below, but I am not sure how to include the corresponding rank:

index=app_pcf AND cf_app_name="credit-analytics-api" AND message_type=OUT AND msg.logger=c.m.c.d.MatchesApiDelegateImpl
| rex field=msg.message "(?<LocationId>[0-9a-f]{8}-([0-9a-f]{4}\-){3}[0-9a-f]{12})"
| table LocationId

I want a table something like the below:

LocationId                            rank
6d65fcb6-8885-4f56-93c1-7050c8bef906  1
e5ff5b47-839c-4ed0-86a3-87fc18f4bfda  2
2053428f-f6ba-4038-a03e-4dbc8737c37d  3
... and so on

Any regex to filter these into a table? Please help.
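A minimal sketch, assuming each "id :: name :: rank" triple contains no stray colons inside the merchant name: capture every triple with max_match=0, expand the multivalue result, then split each triple.

index=app_pcf AND cf_app_name="credit-analytics-api" AND message_type=OUT AND msg.logger=c.m.c.d.MatchesApiDelegateImpl
```grab every id :: name :: rank triple as a multivalue field```
| rex field=msg.message max_match=0 "(?<pair>[0-9a-f]{8}(?:-[0-9a-f]{4}){3}-[0-9a-f]{12}\s*::[^:]+::\s*\d+)"
| mvexpand pair
```split each triple into its id and trailing rank```
| rex field=pair "^(?<LocationId>\S+)\s*::[^:]+::\s*(?<rank>\d+)$"
| table LocationId rank

Matching whole triples rather than splitting on commas also sidesteps merchant names that contain commas, like "INTUIT PAYMENT SOLUTIONS, LLC".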
We are using Splunk React. May I have some sample Splunk React code that queries Splunk data, please?
index="*dockerlogs*" source="*gps-request-processor-test*" OR source="*gps-external-processor-test*" OR source="*gps-artifact-processor-test*" event="*Request" | eval LabelType=coalesce(labelType, do... See more...
index="*dockerlogs*" source="*gps-request-processor-test*" OR source="*gps-external-processor-test*" OR source="*gps-artifact-processor-test*" event="*Request" | eval LabelType=coalesce(labelType, documentType) | eval event = case (like(event,"%Sync%"),"Sync",like(event,"%Async%"),"Async") | stats count(eval(status="Received")) as received count(eval(status="Failed")) as failed by sourceNodeCode geoCode LabelType event where as the source : - is my application name event :- Type of request whether synchronous request or Asynchronous request labeltype : - Different type of label sourcenodecode and geocode :- is the shopcode and shopregion from where the label is requested received - no of label request received failed - no of label request failed Now i want to find the received and failed request count based on sourceNodeCode, geoCode, LabelType, event But for failed request count i want to add condition - in case of synchronous request or event the failed count should fetch from '*gps-request-processor-test*' application in case of asynchronous request or event the failed count should fetch from "*gps-external-processor-test*" OR "*gps-artifact-processor-test*" application The output should look something similar to this attached o/p.
I want to match one field's value against another field's values: if the value in the btc field is present in NEB_Sales_Oppy_Business_Type, I should get TRUE, otherwise FALSE. I tried the following query:

| eval Is_businees_type_matching=if(match(NEB_Sales_Oppy_Business_Type, btc), "TRUE", "FALSE")

Why am I getting FALSE for 3 rows even though the value is available in both fields?
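A hedged guess at the usual culprits, since match() treats its second argument as a regular expression: regex metacharacters in btc, stray whitespace, or case differences can all make an apparently-identical value fail. A sketch that normalizes both sides and does a plain substring test instead:

| eval btc_norm=lower(trim(btc)), nsobt_norm=lower(trim(NEB_Sales_Oppy_Business_Type))
| eval Is_businees_type_matching=if(like(nsobt_norm, "%" . btc_norm . "%"), "TRUE", "FALSE")

If an exact match is wanted rather than a substring test, if(nsobt_norm == btc_norm, ...) is the stricter version.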
Hi Splunkers, we are getting the below value inside one field, "data", in tabular format:

   Source         success  Total_Count
0  abc.csv        True     200
1  some_string_1  False    34
2  some_string_2  True     12
3  some_string_3  False    4
4  some_string_4  True     63
5  some_string_5  False    2
6  some_string_6  True     108

Can we extract these values into different fields? Thank you in advance for your reply.
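A minimal sketch, assuming the rows really are "index source True/False count" separated by whitespace as shown:

```one multivalue entry per table row```
| rex field=data max_match=0 "(?<row>\d+\s+\S+\s+(?:True|False)\s+\d+)"
| mvexpand row
```split each row into its three columns, dropping the leading row index```
| rex field=row "^\d+\s+(?<Source>\S+)\s+(?<success>True|False)\s+(?<Total_Count>\d+)$"
| table Source success Total_Count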