Activity Feed
- Got Karma for Re: Splunk Add-on for Salesforce - how to add a filter to the query. 12-06-2024 01:17 PM
- Got Karma for Re: Setting up HEC (HTTP Event Collector) in a indexer cluster. 05-02-2024 03:07 AM
- Posted Re: Phantom health alert notification on Splunk SOAR. 04-21-2024 02:07 PM
- Got Karma for Re: Convert multiple value to single value.. 12-06-2023 12:09 PM
- Posted Re: Incident Review -> History page is empty on Splunk Enterprise. 12-04-2023 05:59 PM
- Got Karma for Re: Cron Scheduling. 07-04-2023 02:48 AM
- Posted Re: Cron Scheduling on All Apps and Add-ons. 07-03-2023 07:19 PM
- Posted Re: How to configure heavy forwarder to extract multivalue nested json ? on Getting Data In. 07-03-2023 03:59 PM
- Posted Re: [WinEventLog://Security] logs to Splunk on Getting Data In. 07-02-2023 09:42 PM
- Posted Re: Crowdstrike to deprecate some v1 API Endpoints in February 2023- Is there a plan to update? on All Apps and Add-ons. 05-21-2023 11:34 PM
- Got Karma for Re: Announcing Our Splunk MVPs. 04-03-2023 10:38 AM
- Got Karma for Re: Announcing Our Splunk MVPs. 03-09-2023 10:15 AM
- Got Karma for Re: Announcing Our Splunk MVPs. 03-07-2023 04:08 AM
- Got Karma for Re: Announcing Our Splunk MVPs. 03-06-2023 06:33 AM
- Got Karma for Re: UF not sending Internal logs after DS app update. 01-30-2023 04:01 AM
- Got Karma for Re: can we get rid of [lmpool:auto_generated_pool_download-trial] from our server.conf. 12-29-2022 05:50 AM
- Posted Re: What date format does splunk HTTP Event Collector use? on Getting Data In. 09-19-2022 10:38 PM
- Posted Re: Need help with regex to read all the value in all the lines on Splunk Search. 09-19-2022 09:55 PM
- Posted Re: Is it possible to extract a field across multiple indexes and multiple sourcetypes? on Splunk Enterprise. 09-19-2022 07:47 PM
- Posted Need help with regex to read all the value in all the lines on Splunk Search. 09-19-2022 07:39 PM
05-19-2021
10:14 PM
1 Karma
Hi @yuanliu As per the Splunk docs, lookups are executed after calculated fields:

Splunk software processes calculated fields after field extraction and field aliasing but before lookups. This means that:
* You can use a field alias in the eval statement for a calculated field.
* You cannot use a field added through a lookup in an eval statement for a calculated field.

Ref. link - props.conf - Splunk Documentation

You can do this in the search query instead of props.conf, the same way you wrote your second query; unfortunately there seems to be no way to invoke the lookup first inside an eval.
-----------------------------------------------------------
An upvote would be appreciated if it helps!
05-17-2021
04:38 PM
1 Karma
Hi @fnetechnology default.meta is the right file to control access to knowledge objects. Follow the link below and set read-only permissions on the object types (i.e., knowledge objects) that your app contains. Set permissions using RBAC | Documentation | Splunk Developer Program
-----------------------------------------------------
An upvote would be appreciated if it helps!
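A minimal default.meta sketch, assuming you want every role to read the app's saved searches and lookups but only admin to modify them (stanza names correspond to object types):

[savedsearches]
access = read : [ * ], write : [ admin ]
export = system

[lookups]
access = read : [ * ], write : [ admin ]
export = system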
04-25-2021
08:43 PM
One more link, recently updated - Using Syslog-ng with Splunk | Splunk
-------------------------------------------------------
An upvote would be appreciated if it helps!
04-25-2021
08:41 PM
Hi @pmarceau Hope these links help - Community:Best Practice For Configuring Syslog Input - Splunk Wiki (see Section 3). The common options are to have a dedicated syslog server write to files that a forwarder monitors, or to have Splunk listen on a syslog port directly, as sketched below.
-------------------------------------------------------
An upvote would be appreciated if it helps!
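A minimal inputs.conf sketch of the direct-listening option (the port and sourcetype are assumptions; for production, the wiki generally favors a dedicated syslog server writing to monitored files):

[udp://514]
sourcetype = syslog
connection_host = ip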
04-25-2021
08:07 PM
1 Karma
Hi @the_rains Lookups are only applied at search time, on the search head component, whereas what you are trying to achieve happens before indexing, at the HF layer. Try regex-based index-time transforms or another method along those lines.
-----------------------------------------------------
An upvote would be appreciated if it helps!
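For illustration, a minimal index-time rewrite sketch for the HF (the stanza names and regex are hypothetical; this sets the host metadata from a value found in the raw event, one common enrichment that can be done at parse time):

props.conf:
[your_sourcetype]
TRANSFORMS-set_host = set_host_from_event

transforms.conf:
[set_host_from_event]
# Capture the value after "hostname=" and write it into the host metadata field
REGEX = hostname=(\S+)
DEST_KEY = MetaData:Host
FORMAT = host::$1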
04-25-2021
07:54 PM
Hi @cleelakrishna Can you share the monitor stanza configured in inputs.conf, and the outputs.conf, on the UF? For reference, a typical monitor stanza looks like the sketch below.
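A minimal inputs.conf monitor sketch (the path, index, and sourcetype are placeholders):

[monitor:///var/log/myapp/*.log]
index = main
sourcetype = myapp_logs
disabled = false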
04-22-2021
07:45 PM
Hi @k31453 The following query should help:

index="<your_index>" sourcetype="<sourcetype>"
| stats count(user) as "Total_Customer", count(eval(all_flag_updated=="yes")) as "updated_flag", count(eval(all_flag_updated=="no")) as "not_updated_flag" by region
| eval updated_flag_perc = (updated_flag/Total_Customer) * 100, no_updated_flag_perc = (not_updated_flag/Total_Customer) * 100
| table Total_Customer updated_flag not_updated_flag updated_flag_perc no_updated_flag_perc region
-------------------------------------------------
An upvote would be appreciated if it helps!
04-21-2021
05:00 PM
Hi, Splunk's monitor input avoids re-indexing duplicate data by default - it tracks how far it has read into each file, so unchanged content is not ingested twice. If you have configured the monitors, please share your inputs.conf and sample data files.
04-21-2021
12:43 AM
Hi @don12 The following config should work for the above log format. Copy it into a props.conf file at the HF or indexer layer, under /opt/splunk/etc/apps/<app_name>/local or /opt/splunk/etc/system/local, followed by a restart.

[hadoop_logs]
# Break events on newlines rather than merging lines
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)
# Timestamp is at the start of each event, with millisecond precision
TIME_PREFIX=^
TIME_FORMAT=%Y-%m-%d %H:%M:%S,%3Q
----------------------------------------------------
An upvote would be appreciated if it helps!
04-21-2021
12:26 AM
Hi @ebaileytu There could be several possibilities:
- Check the _internal index, sourcetype splunkd, for any errors.
- Run ./splunk list monitor to find which files are being monitored.
- Make sure the files have sufficient read permissions for the Splunk UF to read them.
- Enable debug flags if required - Community:Troubleshooting Monitor Inputs - Splunk Wiki
- Check the file tail processor state via the REST endpoint, e.g. curl -k -u <user>:<pass> https://serverhost:8089/services/admin/inputstatus/TailingProcessor:FileStatus
- Docs - Troubleshoot the input process - Splunk Documentation
04-21-2021
12:12 AM
Hi @vikram1583 What does the data look like in the two files - does it change every time the script runs? Consider indexing both files and removing duplicates at search time with Splunk commands such as dedup or dc(), depending on your use case.
----------------------------------------------
An upvote would be appreciated if it helps!
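For example, a minimal dedup sketch (the source paths and the key field record_id are hypothetical):

index=your_index source="/data/file1.csv" OR source="/data/file2.csv"
| dedup record_id
| table record_id, field1, field2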
04-20-2021
12:49 AM
Hi, There is no straightforward approach, as Splunk prefers text; you can force it to ingest binary, but that doesn't really help with search. Instead, pre-process the binary into text via a separate process, or use a Splunk scripted input backed by a custom script that you write to perform the binary-to-text conversion. The following link points to such solutions. https://community.splunk.com/t5/Archive/How-to-use-splunk-for-binary-log-file/m-p/41784
____________________________
An upvote would be appreciated if it helps!
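A minimal scripted-input sketch in inputs.conf, assuming a hypothetical conversion script you supply that writes text to stdout for Splunk to index:

[script://$SPLUNK_HOME/etc/apps/my_app/bin/binlog_to_text.sh]
# Run the converter every 300 seconds; app, script, sourcetype, and index names are placeholders
interval = 300
sourcetype = binlog_text
index = main
disabled = false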
04-20-2021
12:35 AM
Hi @gilsegev468 There are two ways of extracting fields: search-time and index-time extraction (index-time meaning fields are extracted during parsing and written to the indexer). In your case a search-time extraction is fine, using a combination of an inline rex (the same can be configured via props.conf on the search head) and spath, since the inner JSON is what you want to extract fields from.

| makeresults
| eval log_data="19-04-2021 gil-server-1 {\"systemId\":\"1254\", \"systemName\":\"coffe\", \"message\":\"hello dor\"}"
| rex field=log_data "(?<inner_json>\{\".*)"
| spath input=inner_json
| table systemId systemName message

To use the inline rex against real events, use field=_raw (the default). The same regex can be configured against your source/sourcetype in props.conf for search-time extraction and deployed to the search head, as sketched below.
---------------------------------------------
An upvote would be appreciated if it helps!
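A minimal props.conf sketch of that search-time extraction (the stanza name is hypothetical; EXTRACT-<name> applies the regex at search time on the SH):

[your_sourcetype]
EXTRACT-inner_json = (?<inner_json>\{".*)

You would still pipe to spath input=inner_json in the search to expand the JSON fields.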
04-18-2021
05:56 PM
Hi, If .ldf is a text file format, then you can install a Splunk Universal Forwarder on the host where the file exists and configure it to ingest the file into Splunk Enterprise.
-----------------------------------------------------------
An upvote would be appreciated if it helps!
04-18-2021
05:40 PM
Hi @shinobu A subsearch might work in your case:

index=index2 [search index=index1 | fields your_attribute_field | rename your_attribute_field as search]

With the above query, the events from index2 are filtered down to only those containing your required attribute values; renaming the field to "search" makes the subsearch results get substituted into the outer search as raw search terms.
----------------------------------------------
An upvote would be appreciated if it helps!
04-15-2021
11:30 PM
Hi, Since you are using the default sourcetype syslog, it comes with a default configuration that extracts the host from the _raw data:

[syslog]
pulldown_type = true
maxDist = 3
TIME_FORMAT = %b %d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 32
TRANSFORMS = syslog-host
REPORT-syslog = syslog-extractions
SHOULD_LINEMERGE = False
category = Operating System
description = Output produced by many syslog daemons, as described in RFC3164 by the IETF

Path of default config - /opt/splunk/etc/system/default/props.conf
-------------------------------------
An upvote would be appreciated if it helps!
03-24-2021
05:15 PM
Hi @vidhya In the following panel code, $customer_id_tok$ gets updated with the selected dropdown value, so the output looks like index=main customerid="123", index=main customerid="1234", etc. I have used static customerid values in the dropdown for testing; you can populate the same list with a dynamic search. In the XML, the <prefix> and <suffix> elements are the key to making this work.

<row>
<panel>
<input type="dropdown" token="customer_id_tok">
<label>dropdown menu</label>
<choice value="123">123</choice>
<choice value="1234">1234</choice>
<choice value="12345">12345</choice>
<prefix>index=main customerid="</prefix>
<suffix>"</suffix>
</input>
<table>
<title>Panel title</title>
<search>
<query> [your query goes here] $customer_id_tok$ </query>
<earliest>-24h@h</earliest>
<latest>now</latest>
</search>
<option name="drilldown">none</option>
</table>
</panel>
</row>
------------------------------------------------
An upvote would be appreciated if it helps!
03-24-2021
04:26 PM
Hi @aferns0804 Please try this: instead of the fields command, use table.

<your query> | table field1, field2, field3, field4... | outputlookup <your_saturday_report>.csv
-----------------------------------------------------
An upvote would be appreciated if it helps!
03-22-2021
08:21 PM
Hi @ethanthomas props.conf can be deployed on the Splunk UF, HF, SH, and indexer. What you are referring to belongs to the parsing layer, which runs on either the HF or the indexer - timestamp extraction only happens there, and the extracted value is assigned to _time, which you see when searching events on the SH. Assuming you are going to use the props.conf during parsing:

[your_sourcetype_name]
TIME_PREFIX=\"timestamp\":\s+\"
TIME_FORMAT=%Y-%m-%d %H:%M:%S

NOTE: Additional settings can be used alongside these depending on the data; KV_MODE works on the SH, INDEXED_EXTRACTIONS works on the UF. You can refer to - Configure timestamp recognition - Splunk Documentation, props.conf - Splunk Documentation, TIME_FORMAT - Date and time format variables - Splunk Documentation
---------------------------------------------------
An upvote would be appreciated if it helps!
03-18-2021
10:50 PM
Hi @splunkdivya You can use a Splunk DB Connect batch input to import an entire table; it can take a while depending on the input schedule and the table size. You will need one input for each table in the schema, unless you write a SQL query that combines the information from multiple tables - which might be complex, so I wouldn't do that. Note: "Batch input mode is ideal for unchanging historical data that will be indexed into Splunk once. It can also be useful for re-indexing data that updates through time". This Splunk docs link would help you start: How Splunk DB Connect works - Splunk Documentation
----------------------------------------------
An upvote would be appreciated if it helps!
03-18-2021
10:04 PM
Hi @Muwafi CSV lookups are intended for small sets of data; 1 million rows is definitely large, so the KV store is the right choice for your case. Since, as you mentioned, CSV is the only way you get the details out of your systems on a scheduled basis, you should research how to import a CSV into the KV store. KV stores are relatively fast and accept large datasets, which helps your dashboards/queries load faster. A KV store lookup requires at least one KV store collection, which is a database that stores data in key/value pairs; the docs cover how to set one up if it does not already exist in your environment. The following links are a good starting point: About lookups - Splunk Documentation, Define a KV Store lookup in Splunk Web - Splunk Documentation
--------------------------------------------------------
An upvote would be appreciated if it helps!
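A minimal sketch of the KV store lookup configuration (the collection and field names are hypothetical):

collections.conf:
[customer_details]

transforms.conf:
[customer_details_lookup]
external_type = kvstore
collection = customer_details
fields_list = _key, customer_id, name, region

You could then load each scheduled CSV into the collection with a search such as: | inputlookup customer_details.csv | outputlookup customer_details_lookup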
03-18-2021
09:41 PM
Hi @aferns0804 If your report is relatively small, then go with CSV lookups. The following example query would help to create one, assuming the user running the search job has sufficient rights to run outputlookup:

<your query> | fields field1, field2, field3, field4... | outputlookup <your_saturday_report>.csv

You can read more about it here - About lookups - Splunk Documentation
-----------------------------------------------------
An upvote would be appreciated if it helps!
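To read the saved report back later (same placeholder file name):

| inputlookup <your_saturday_report>.csv
| table field1, field2, field3, field4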
- Tags:
- lookups
- outputlookup
03-18-2021
08:35 PM
Hi @timgren It looks like the solution in this other post might satisfy your requirement. Solved: Re: How to extract fields from child node - Splunk Community
-------------------------------------------------------------
An upvote would be appreciated if it helps!
03-18-2021
08:22 PM
1 Karma
Hi @luna There are multiple combinations of time modifiers in Splunk; Specify time modifiers in your search - Splunk Documentation helps to understand them. The docs say that a time range specified in the search bar with earliest= and latest= (meaning the time between them, inclusive), or in a saved search, overrides the time range selected in the Time Range Picker in the UI. To satisfy your requirement, refer to the absolute time window notes in the above link. What you tried, @w0, is a relative time range; there is no harm in using it, but it is a little tricky - the time/date should be carefully calculated together with the snap.
--------------------------------------------------
An upvote would be appreciated if it helps!
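For example, a minimal sketch of an absolute time window inside the search string itself (the dates are hypothetical; the format is %m/%d/%Y:%H:%M:%S):

index=your_index earliest="03/15/2021:00:00:00" latest="03/16/2021:00:00:00"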
03-18-2021
05:05 PM
Hi @mdmosaraf There are no official Splunk docs supporting WSL; however, this link has some discussion around it (about installing Splunk Enterprise): IS it possible to install Splunk on Ubuntu on Wind... - Splunk Community. If your requirement is to monitor WSL2 and a Splunk Enterprise set-up is already running on a different host in your network, then I would try installing the Splunk Universal Forwarder (UF); which Linux build you need depends on whether your WSL2 OS is 64-bit or 32-bit. If that is successful, the Splunk Add-on for Unix and Linux | Splunkbase, installed on top of the UF, helps extract useful logs from Linux. Note: this is not officially documented, just trial and test; Splunk might not support you if you run into issues with it. It may also degrade WSL2 performance if you are running critical apps, so keep that in mind.
-------------------------------------------------------------
An upvote would be appreciated if it helps!