How do I print a Splunk default variable in a search query? I have two variables, $job.earliestTime$ and $job.latestTime$, and I want to use them in an alert so that the PDF shows the date range. Can somebody please suggest an approach?
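One common approach (a sketch, assuming the goal is to surface the search's own time bounds in the results rather than the literal dashboard tokens) is the `addinfo` command, which attaches `info_min_time` and `info_max_time` to every result. Those can then be formatted and shown in the alert's table:

```
index=your_index your_search_terms
| addinfo
| eval search_earliest=strftime(info_min_time, "%Y-%m-%d %H:%M:%S"),
       search_latest=strftime(info_max_time, "%Y-%m-%d %H:%M:%S")
| table search_earliest search_latest
```

Here `your_index` and `your_search_terms` are placeholders; the `info_*` fields are the epoch boundaries of the search's time range, so the table (and hence the emailed/PDF output) carries the date range explicitly.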
Hi All, we are currently using the OPSEC add-on to get data into Splunk while forwarding a duplicate copy of the same data to a third party (Logstash). Now we want to use the existing OPSEC add-on to pull the data and forward it only to the third party, and stop forwarding to the Splunk indexers.

Data flow: OPSEC Manager >> Splunk HWF (OPSEC add-on) >> forward to indexer >> forward to third party (Logstash)

Question: Is it possible to pull the data and forward it only to the third party, without ingesting it into Splunk? If yes, how?

Below is our existing config for forwarding to the third party.

props.conf on the HWF:
[opsec]
TRANSFORMS-routing = routelogstash

transforms.conf on the HWF:
[routelogstash]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = syslog_tcp_test

outputs.conf on the HWF:
[syslog:syslog_tcp_test]
server = thirdpartyservername:port

We have a separate outputs.conf for forwarding data to the indexers. We tried setnull and setparsing to stop ingestion into Splunk, but no luck. Please suggest.
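One sketch for the question above (an assumption-laden outline, not a verified fix): a forwarder does not index locally by default, and it only sends cooked data to indexers if outputs.conf defines a tcpout group pointing at them. So keeping only the syslog output and removing the indexer tcpout group should leave the third party as the sole destination:

```
# outputs.conf on the HWF -- keep only the syslog output
[syslog:syslog_tcp_test]
server = thirdpartyservername:port

# Remove (or comment out) the [tcpout:<indexer_group>] stanza and any
# defaultGroup that points at the indexers; with no tcpout target
# defined, nothing is forwarded to the Splunk indexers.

# Belt-and-braces: keep local indexing off (the default on a forwarder)
[indexAndForward]
index = false
```

The group name `syslog_tcp_test` is taken from the post; `<indexer_group>` is a placeholder for whatever the separate outputs.conf currently defines. Note that setnull-style nullQueue routing drops events before they reach the output processors, which may be why that attempt did not behave as hoped.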
I have 2 queries. The first,

| rest /services/data/indexes | fields title | dedup title | table title

gives me all the indexes in my environment. The second,

| rest /servicesNS/-/-/saved/searches | rex field=search "index=(?P<title>[^ ]+)" | stats count by title | sort -count | table title

gives me all the indexes that at least one saved search is built on. Now I want to subtract the second result set from the first, so that I see only the indexes that have no saved searches in the environment. I have tried placing "NOT" between the queries but cannot get the desired result. Please help. Thanks in advance.
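One way to take that set difference (a sketch, assuming the saved-search list stays under the subsearch result limit) is to run the second query as a subsearch and negate it with NOT, so only index titles absent from the subsearch survive:

```
| rest /services/data/indexes
| dedup title
| fields title
| search NOT [
    | rest /servicesNS/-/-/saved/searches
    | rex field=search "index=(?P<title>[^ ]+)"
    | dedup title
    | fields title ]
| table title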
I have two questions here:
1. I am planning to upgrade my environment to 8.0 and am not 100% sure about my upgrade plan. Can anyone confirm whether this is the correct strategy?
2. I am also not sure how to upgrade the Heavy Forwarders; any steps would help here.

Role | Current Version | Intermediate Update | Target Version
Search Head, Deployment, License Mgr. | 7.2.6 | NA | 8.0
Indexer | 7.2.6 | NA | 8.0
Heavy Forwarder | 6.5.2 | 7.2 | 8.0
Heavy Forwarder | 6.5.2 | 7.2 | 8.0
Universal Forwarders | 6.5.2 | 7.2 | 8.0

Planned order:
1. Upgrade Heavy Forwarders to 7.2
2. Upgrade UFs to 7.2
3. Upgrade all Splunk apps
4. Upgrade Search Head to 8.0
5. Upgrade Indexer to 8.0
6. Upgrade Heavy Forwarders to 8.0
7. Upgrade Universal Forwarders to 8.0
According to this ultimate Jenkins certification course, the following steps will help you build jobs in Jenkins:
Step 1: Go to the Jenkins dashboard and click New Item.
Step 2: Enter the item name and pick the 'Freestyle project' option.
Step 3: Specify the details of the job.
Step 4: Next, specify the location of the files that should be built.
Step 5: If your repository is hosted on GitHub, you can also enter the URL of that repository here.
Step 6: Go to the Build section and click Add build step.
Step 7: In the command window, enter the following commands and then click the Save button: javac HelloWorld.java and java HelloWorld.
Step 8: Click the Build Now option to check whether you have configured the job correctly.
Step 9: Once the build is scheduled, it will run.
Step 10: Click the Console Output link to see the details of the build.
Hi team, the Default Analyzer is showing grey in the tree view and no results in the grid view, although the KPI and service are working fine. Can you help us understand why this is happening? We are using ITSI on-prem.
Hello, I have a scripted input with a cron schedule of 50 5-23 * * * so that it "sleeps" between midnight and 6 AM. I noticed, however, that the script is actually "sleeping" between the hours of 2 AM and 8 AM. After investigating, I discovered that the cron schedule was being evaluated in UTC rather than in my time zone, which is currently CEST. How do I make it run according to my time zone instead of UTC? With other objects like saved searches, the time zone is taken from the account that owns the object, but I don't see that option for scripted inputs. Thanks! Andrew
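If the scheduler is indeed evaluating the expression in UTC, one pragmatic workaround (a sketch, assuming a UTC+2 offset while CEST is in effect; the script path is hypothetical) is to shift the hour range in the cron expression itself:

```
# inputs.conf -- hour range expressed in UTC
# local 05:50-23:50 CEST (UTC+2) becomes 03:50-21:50 UTC
[script://./bin/my_script.py]
interval = 50 3-21 * * *
```

The obvious drawback is that the offset changes when DST flips between CEST (UTC+2) and CET (UTC+1), so the expression would need adjusting twice a year; a timezone-aware guard inside the script itself avoids that maintenance.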
Hi, we are working on setting up the Splunk O365 add-on. It looks like our tenant is used by multiple groups/domains; how do we filter so that only events for a specific group/domain are indexed into Splunk? I assume we have to filter the data in step 2 or 3 of the steps below, but I have no idea about the O365 side of things.
1. Add the Splunk Add-on for Microsoft Office 365
2. Turn on Office 365 audit logging
3. Create the application in Azure AD
4. Configure the Splunk Add-on for Microsoft Office 365
5. Verify logging
6. Add the Microsoft 365 App for Splunk
https://docs.splunk.com/Documentation/AddOns/released/MSO365/About
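If filtering on the O365/Azure side turns out not to be possible, a Splunk-side fallback is index-time filtering with nullQueue. A sketch, with several assumptions flagged: the sourcetype name, the domain pattern, and the idea that the domain appears in the raw event text all need verifying against your data.

```
# props.conf (on the indexer or heavy forwarder receiving the data;
# sourcetype name is an assumption -- check what the add-on actually sets)
[o365:management:activity]
TRANSFORMS-filter_domain = drop_all, keep_mydomain

# transforms.conf
# First send everything to the null queue...
[drop_all]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

# ...then rescue only events matching your domain (pattern is a placeholder)
[keep_mydomain]
REGEX = @example\.com
DEST_KEY = queue
FORMAT = indexQueue
```

The two transforms run in order, so the net effect is "discard by default, keep matches". Note the discarded events still count as pulled from the API; filtering at the tenant/subscription level, where available, is cleaner.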
Hi guys, how can I query an automatic lookup? To be clear, I don't mean the fields created by an automatic lookup, but the name given to the automatic lookup itself. My automatic lookup is src_top_sales_by_model. I have tried:

index=blah model=* | stats values(src_top_sales_by_model)

The dilemma: I have two different automatic lookups that both output the same field name but different values, so I was thinking of querying the automatic lookup names in order to then find the values. Is this even possible?
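If the goal is to list the automatic lookup definitions themselves (rather than their output fields), one approach worth trying (a sketch; verify the endpoint is visible to your role) is the REST endpoint that backs the props LOOKUP- settings:

```
| rest /servicesNS/-/-/data/props/lookups
| table title stanza attribute transform
| search title=*src_top_sales*
```

Here `title`/`attribute` carry the automatic lookup name, `stanza` is the sourcetype/source it is attached to, and `transform` is the lookup definition it invokes, which should let you tell the two same-field lookups apart.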
How do I generate licensing reports using the Meta Woot app? Does anyone know where I could learn to use the main features of this app? I can't find much information on the internet about how to use it.
Hi Community, I encountered the following error message when using the ML Toolkit: 'Error in 'fit' command: Invalid message received from external search command during setup, see search.log.' I've tried uninstalling/reinstalling the app and updating the "Python for Scientific Computing" add-on, but still couldn't get it to work. I'd appreciate any advice on this. Thanks! R
Issues with the SSL Checker app modes:

Auto mode: SSL Checker is not capturing all the PEM files located in the /opt/splunk/etc/auth directory.

Manual mode: SSL Checker works fine and returns the end dates of the certs when given the cert paths as comma-separated values, but it throws the following error:

Error: Invalid key in stanza [SSLConfiguration] in /opt/splunk/etc/apps/ssl_checker/local/ssl.conf

@jkat54, can you please look into this? We are using ssl_checker app version 3.2 on Splunk Enterprise 7.3.8. Thanks
When I manually created indexes on-prem, I would create a stanza in indexes.conf on the indexers and a separate one in indexes.conf on the search heads. The documentation calls the latter "search head volume settings": https://docs.splunk.com/Documentation/Splunk/8.1.3/Indexer/Configurethesearchhead. The SH uses this index list to validate the target of summary-indexed data and to provide typeahead for users typing index=*. My current understanding, based on on-prem testing, is that | rest /services/data/indexes is also calculated from it. My concern is that Splunk Cloud does not appear to create these entries on the search heads other than the one I created the index from. The issue is that for things like multi-select dashboard inputs that use this API to select an index, and for IDM input setup, Splunk doesn't know about indexes that I created on my Search Head/IDM/ES server. Originally, Support told me to delete the index and recreate it on the IDM in order to set up the modular input to use it. Users are complaining about apps we use that rely on the REST API query for indexes. Have others dealt with this and found solutions with Splunk Support?
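As a stopgap for dashboard inputs, one search-time alternative to the REST endpoint is `eventcount`, which asks the indexer tier directly and therefore sees indexes regardless of what the search head's indexes.conf lists (a sketch; it only returns indexes the running user can search):

```
| eventcount summarize=false index=* index=_*
| dedup index
| fields index
```

Swapping this in as the populating search for a multi-select input sidesteps the missing search-head stanzas, at the cost of a slightly heavier dispatch than the pure REST call.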
I created a correlation search with only two pipes, table and rename. I added an inline table to the email notification, but it is not showing. I have another alert with a similar search for which this works; the only difference is that another user created it. Is there any role dependency involved here? I'm not sure what else could cause this.
In my add-on, I have a regex that contains 3 capture groups. However, one of the capture groups is only used within the regex as a numeric backreference and is not needed in the FORMAT field. Unfortunately, I can't change it to a non-capturing group, because the regex depends on being able to use the numeric backreference, so that change breaks it. As a result, my add-on always fails AppInspect, and I can't seem to find a workaround. Here's my transforms.conf entry for reference:

[replace-customfields]
REGEX = (\S+)=([^=]*)\s+(?:\1Label)=([^=]+)(?:(?:\s\w+=)|$)
FORMAT = $3::$2
KEEP_EMPTY_VALS = True

Any ideas would be greatly appreciated.
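One variant that may be worth trying (untested against AppInspect, and offered purely as a sketch) is a named group with a named backreference: the group still captures, but it carries a self-documenting name, and since named groups keep their numeric positions, FORMAT is unchanged:

```
[replace-customfields]
# (?<key>...) names group 1; \k<key> backreferences it by name
REGEX = (?<key>\S+)=([^=]*)\s+(?:\k<key>Label)=([^=]+)(?:(?:\s\w+=)|$)
FORMAT = $3::$2
KEEP_EMPTY_VALS = True
```

Whether this satisfies the particular AppInspect check depends on how that check counts "unused" groups, so it is a candidate to test rather than a confirmed fix.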
Hi all, I'm trying to use a transaction to get multiple pairs of events (the selection and release of a node). I have multiple events like these:

2021-04-05 14:34:23 Node 123 selected for User 1
2021-04-05 14:34:17 User 1 released Node 118
2021-04-05 14:34:11 Node 123 selected for User 1
2021-04-05 14:34:05 Node 118 selected for User 1
2021-04-05 14:33:46 Node 118 selected for User 1
2021-04-05 14:33:29 User 1 released Node 103
2021-04-05 14:33:23 Node 118 selected for User 1
2021-04-05 14:33:21 Node 103 selected for User 1
2021-04-05 14:33:08 User 1 released Node 118
2021-04-05 14:33:02 Node 103 selected for User 1
2021-04-05 14:32:53 Node 118 selected for User 1

I am trying to get groups of events that pair the start and end of each node selection/release. In the above example I would want to retrieve 2 complete pairs:
1) The events at 14:33:23 and 14:34:17
2) The events at 14:33:02 and 14:33:29

My current query looks like this:

index=INDEX host=HOSTNAME sourcetype=SOURCETYPE
| rex field=_raw "Node\s(?<node>\d+)\sselected\sfor\sUser\s(?<user_id>\d+)"
| rex field=_raw "User\s(?<user_id>\d+)\sreleased\sNode\s(?<node>\d+)"
| where isnotnull(node)
| transaction node user_id startswith=selected endswith=released
| table user_id, node, duration

I am getting pairs of events with the correct user IDs and nodes, but the transaction spans from the earliest event where the user ID and node match up to the release, and I don't want the earliest selection. So instead of the events at 14:33:23 and 14:34:17, my current query gives me the events at 14:32:53 and 14:34:17. How can I adjust this query to get the results I'm after?
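One alternative to transaction that always pairs each release with the most recent preceding selection is streamstats (a sketch; it relies on stats functions ignoring null values, and sorts the events oldest-first so "latest" means "most recent so far"):

```
index=INDEX host=HOSTNAME sourcetype=SOURCETYPE
| rex field=_raw "Node\s(?<node>\d+)\sselected\sfor\sUser\s(?<user_id>\d+)"
| rex field=_raw "User\s(?<user_id>\d+)\sreleased\sNode\s(?<node>\d+)"
| where isnotnull(node)
| sort 0 _time
| eval action=if(match(_raw, "released"), "released", "selected")
| eval select_time=if(action="selected", _time, null())
| streamstats latest(select_time) as last_selected by user_id node
| where action="released"
| eval duration=_time-last_selected
| table user_id node last_selected _time duration
```

Because `select_time` is null on release events, `latest(select_time)` carries forward the time of the most recent selection per user/node, so each surviving release row holds exactly the (last selected, released) pair you described.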
Hi all, I have a simple Splunk app that monitors a folder and indexes a text file that is overwritten every hour. It works fine; then all of a sudden it just stops indexing, even as new files are created. Below are my configs. Any suggestions are appreciated.

inputs.conf:

[monitor://E:\Splunk\ccure\AllSites\*.txt]
sourcetype = ccure:allsite:csv
index = security
disabled = false

[monitor://E:\Splunk\ccure\Forced_Held\*.txt]
sourcetype = ccure:door_csv
index = security
disabled = false

props.conf:

[ccure:allsite:csv]
SHOULD_LINEMERGE = false
NO_BINARY_CHECK = true
CHARSET = UTF-8
INDEXED_EXTRACTIONS = csv
KV_MODE = none
category = Structured
disabled = false
pulldown_type = true
CHECK_METHOD = modtime

[ccure:door_csv]
SHOULD_LINEMERGE = true
NO_BINARY_CHECK = true
CHARSET = UTF-8
LINE_BREAKER = ([\r\n]+)
MAX_TIMESTAMP_LOOKAHEAD = 180
disabled = false
CHECK_METHOD = modtime
SEDCMD-crop_extra_line = s/(?!match)Door Forced Report - InfoSec(?!match)($|([\r\n]+))//g
TRANSFORMS-set = setnull
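One common cause when a file is overwritten in place is that the new content begins with the same bytes (e.g. a fixed CSV header), so the tailing processor's initial CRC matches a file it has already read and the new content is skipped. A sketch of the usual mitigations (the values are assumptions to adapt, and only one input is shown):

```
# inputs.conf
[monitor://E:\Splunk\ccure\AllSites\*.txt]
sourcetype = ccure:allsite:csv
index = security
# widen the CRC window so an identical header alone no longer collides
initCrcLength = 1024
# or salt the CRC with the full file path
crcSalt = <SOURCE>
```

Separately, it may be worth double-checking what the setnull transform referenced by TRANSFORMS-set does: if it routes matching events to nullQueue, some or all ccure:door_csv events would be silently discarded at parse time, which can look like indexing having stopped.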