All Topics

Hello! I have a lookup in CSV format of about 1,900 users. I have a panel that shows application usage for every user, but it's built as a pivot. I'd like to make a new panel that only shows application usage for the users in this specific lookup table. If a pivot weren't involved, I'd use an inputlookup followed by an index subsearch, but I'm having a hard time figuring out how to do this with the pivot. This is the code for the current panel:

| pivot Process_Detail dc(AppVersion) as "#Versions" dc(ProcUser) as "#Users" dc(host) as "#Hosts" splitrow AppName as Name filter SessionID > 0 filter AppName is "*"
| eval sortfield = lower('Name')
| sort limit=0 sortfield
| table Name "#Versions" "#Users" "#Hosts"
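One possible approach, sketched with caveats: go at the data model with tstats instead of pivot, so the user filter can come from an inputlookup subsearch. This assumes the lookup file is named users.csv with a ProcUser column (both assumptions); the data model and field names are taken from the pivot above, and acceleration makes it fast but is not strictly required.

| tstats dc(Process_Detail.AppVersion) as "#Versions" dc(Process_Detail.ProcUser) as "#Users" dc(Process_Detail.host) as "#Hosts" from datamodel=Process_Detail where Process_Detail.SessionID>0 [| inputlookup users.csv | fields ProcUser | rename ProcUser as "Process_Detail.ProcUser"] by Process_Detail.AppName
| rename Process_Detail.AppName as Name
| eval sortfield=lower(Name)
| sort limit=0 sortfield
| table Name "#Versions" "#Users" "#Hosts"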
Hi, I have a requirement to open an additional non-SSL HTTP REST port in Splunk and bind it to localhost, for security reasons. The Splunk documentation for server.conf mentions the following. How do I open a non-SSL port bound to localhost?

############################################################################
# Open an additional non-SSL HTTP REST port, bound to the localhost
# interface (and therefore not accessible from outside the machine). Local
# REST clients like the CLI can use this to avoid SSL overhead when not
# sending data across the network.
############################################################################
[httpServerListener:127.0.0.1:8090]
ssl = false
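A minimal sketch, assuming a standalone instance: add the stanza from the docs to the local server.conf and restart splunkd. Port 8090 is just the documentation's example; any free port works.

# $SPLUNK_HOME/etc/system/local/server.conf
[httpServerListener:127.0.0.1:8090]
ssl = false

Then restart and verify (plain http, not https, since ssl = false):

$SPLUNK_HOME/bin/splunk restart
curl http://127.0.0.1:8090/services/server/info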
Hi, we have two Splunk authentication systems: SAML and Splunk (default). We want an alert that fires when a user logs in to Splunk via the "Splunk" authentication system. Is there a way to do that? Can we do this with a Splunk query? Need help on this. Thanks, Mala S
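A starting point, sketched with heavy caveats: successful UI logins are recorded in the _audit index, so one approach is to alert on successful logins and restrict them to accounts that exist only in native Splunk authentication. The lookup of local account names (local_splunk_accounts.csv with a user column) is an assumption you would maintain yourself.

index=_audit action="login attempt" info=succeeded
| search [| inputlookup local_splunk_accounts.csv | fields user]
| stats count by user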
Hi Splunkers, I have a question about forwarding Splunk data to third-party systems. I know this task can be performed with forwarders; what I'm not able to work out is whether it can be performed after the data have arrived in Splunk and been ingested, parsed, and aggregated. Let me explain. In the usual scenario, forwarders sit in front of the Splunk environment, something like:

Data sources -> Forwarders (on DS or intermediate) -> Splunk environment (SH + Indexer)

With forwarders I can achieve this scenario:

Data sources -> Forwarders (on DS or intermediate) -> Splunk environment (SH + Indexer)
                                                   -> Other systems

So I can fork the data before they arrive at the SH + indexer. But what if I need to do this:

Data sources -> Forwarders (on DS or intermediate) -> Splunk environment (SH + Indexer) -> parsing, aggregation and filtering -> forwarding to a third-party system

Is that possible? The requirement is that the data must complete the full Splunk lifecycle: they arrive raw (or only partially manipulated) from Mulesoft, then must be parsed, filtered, and aggregated, and only after that be sent back to Mulesoft, which forwards them to the final systems. Is this something I can achieve with Splunk?
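Partly, sketched as follows: index-time routing on the indexer (or a heavy forwarder) can send data onward after parsing and filtering, via props/transforms plus an extra tcpout group. The sourcetype, group name, and Mulesoft host/port below are assumptions. Search-time aggregation is a different story: aggregated results can't be "forwarded" as a stream, so the usual pattern there is a scheduled search whose results are exported (REST export, outputlookup, or a custom alert action).

# props.conf (indexer or heavy forwarder)
[mulesoft:raw]
TRANSFORMS-route_third_party = route_to_mulesoft

# transforms.conf
[route_to_mulesoft]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = mulesoft_out

# outputs.conf
[tcpout:mulesoft_out]
server = mulesoft.example.com:9997
sendCookedData = false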
Hi, I want to change the date format and find the difference in days from the current day. The date format I have now is:

Timestamp
10/31/2022 1:28:20 PM

index=s sourcetype=Resources
| fillnull
| eval Timestamp=strftime(strptime('Timestamp',"%m/%d/%Y"),"%Y-%m-%d")
| eval diff=now()-Timestamp
| table Name _time Timestamp diff

Here the diff field is not populated with the difference in number of days; it shows as blank. The format I expect for the Timestamp column is %Y-%m-%d. Please help me achieve this.
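A sketch of one likely fix, using the index and field names from the post: the strptime format has to match the whole value (including the time portion), and the difference has to be computed against the epoch value before it is reformatted into a string; subtracting a "%Y-%m-%d" string from now() yields null, which is why diff comes back blank.

index=s sourcetype=Resources
| fillnull
| eval ts=strptime('Timestamp', "%m/%d/%Y %I:%M:%S %p")
| eval Timestamp=strftime(ts, "%Y-%m-%d")
| eval diff=floor((now() - ts) / 86400)
| table Name _time Timestamp diff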
Hello Community, I'm currently trying to configure the Splunk Add-on for Microsoft Azure. The add-on is installed on the heavy forwarder in my environment, and my goal is to forward the data from the API to an index in the indexer cluster. The problem I'm having is that the add-on's configuration only accepts indexes that are visible locally on the machine. Since indexes configured on the cluster don't show up in the local index list, I'm not able to choose mine. If I write the index into the config anyway, the add-on reports status=false and doesn't pull from the API. My question: how can I make the index from the cluster visible on the heavy forwarder so I can select it in the add-on? Or is there another way to forward the data from the add-on that I have missed? Any advice or pointers to documentation I might have missed are appreciated!
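One common workaround, sketched here with assumed app and index names: define the index name locally on the heavy forwarder so the UI can list it. As long as outputs.conf forwards everything and local indexing is off, no events are actually stored in the stub index.

# $SPLUNK_HOME/etc/apps/my_azure_settings/local/indexes.conf (on the HF)
[azure_activity]
homePath   = $SPLUNK_DB/azure_activity/db
coldPath   = $SPLUNK_DB/azure_activity/colddb
thawedPath = $SPLUNK_DB/azure_activity/thaweddb

# outputs.conf — forward to the cluster, don't index locally
[indexAndForward]
index = false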
Hello, I'm following the steps here: https://docs.splunk.com/Documentation/Splunk/9.0.1/Installation/InstallonLinux#Next_steps

After installing and starting the service, I'm of course unable to reach the web interface on port 8000 because the system firewall is blocking connections. Besides port 8000, what other ports should I open through the firewall, and why isn't this documented on the above page? If anyone has a link to Splunk documentation about the ports used, please let me know. I've seen lots of Splunk community answers showing different ports, but others say they are user-defined, like port 9997 for forwarders to send data to the Splunk server. I haven't configured that yet (it wasn't in the above documentation). According to the output of "sudo ss -tunlp", my Splunk server is currently listening on ports 8000, 8089, and 8191:

tcp LISTEN 0 128 0.0.0.0:8089 0.0.0.0:* users:(("splunkd",pid=1806,fd=4))
tcp LISTEN 0 128 0.0.0.0:8191 0.0.0.0:* users:(("mongod",pid=2285,fd=9))
tcp LISTEN 0 128 0.0.0.0:8000 0.0.0.0:* users:(("splunkd",pid=1806,fd=100))

I tried opening a support case, but apparently I can't do that either. I'm really not sure where to ask this question, or who to ask in order to get the installation documentation updated. If I should post this somewhere else, please let me know. Thank you, Jonathan
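A sketch using firewalld (assuming a RHEL-family distribution); which ports you actually need depends on your topology. 8000 is Splunk Web; 8089 is the splunkd management/REST port (only needed externally if deployment clients, distributed search, or remote REST calls use it); 9997 is the conventional forwarder receiving port (only after you enable one); and 8191 is the KV store (mongod), which normally stays local to the machine.

sudo firewall-cmd --permanent --add-port=8000/tcp   # Splunk Web UI
sudo firewall-cmd --permanent --add-port=8089/tcp   # splunkd management/REST, if needed remotely
sudo firewall-cmd --permanent --add-port=9997/tcp   # forwarder receiving port, once configured
sudo firewall-cmd --reload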
We have MFA logs being sent to one of our indexes, and the field I'm looking at is as follows:

message: MFA challenge issued for account 1234556. Email is example@example.com

I want to be able to count the number of times "MFA challenge issued" occurs in the logs. I'm trying to use the rex command, but it's just so confusing at the moment. Any help would be great!
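A sketch, with the index name as a placeholder: for a plain count, a string match plus stats is enough; rex is only needed if you also want to pull out the account or email for grouping.

index=mfa "MFA challenge issued"
| rex field=message "MFA challenge issued for account (?<account>\d+)\. Email is (?<email>\S+)"
| stats count as mfa_challenges

Drop the rex line if you only need the total, or change the last line to "| stats count by account" for a per-account breakdown.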
Hi, I don't understand the goal of the summary range in an accelerated search. What is the difference from the report's time range? For example, if I run my report over the last 30 days and put 7 days in the summary range, what does that mean exactly? Rgds
We have a dropdown field with four static values:

1. cloud
2. hana
3. copy2
4. principal

Each dropdown value has a separate query. We're looking to create a dashboard with a single panel where changing the dropdown value changes which search query's output is shown in that panel:

1. cloud -> Query1
2. hana -> Query2
3. copy2 -> Query3
4. principal -> Query4
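A minimal Simple XML sketch of the usual token-swapping pattern; the four searches below are placeholders standing in for Query1 through Query4.

<form>
  <fieldset submitButton="false">
    <input type="dropdown" token="selection">
      <label>Source</label>
      <choice value="cloud">cloud</choice>
      <choice value="hana">hana</choice>
      <choice value="copy2">copy2</choice>
      <choice value="principal">principal</choice>
      <change>
        <condition value="cloud"><set token="panel_query">index=main sourcetype=cloud | stats count</set></condition>
        <condition value="hana"><set token="panel_query">index=main sourcetype=hana | stats count</set></condition>
        <condition value="copy2"><set token="panel_query">index=main sourcetype=copy2 | stats count</set></condition>
        <condition value="principal"><set token="panel_query">index=main sourcetype=principal | stats count</set></condition>
      </change>
      <default>cloud</default>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <query>$panel_query$</query>
          <earliest>-24h</earliest>
          <latest>now</latest>
        </search>
      </table>
    </panel>
  </row>
</form>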
Hi everyone, I'm struggling with Splunk DB Connect and HEC. I have a single-instance Splunk that holds all roles. I have multiple UFs and use a deployment server with HEC. I'm currently trying to use Splunk DB Connect on that instance to read some DB data and write it to an index, and I keep getting the error "Unsupported or unrecognized SSL message". I checked the port (8088); it seems fine. SSL is enabled on HEC and splunkd (default parameters). I wonder whether Splunk DB Connect can use the HEC input on localhost:8088 if I have moved the HEC input onto the UFs with "useDeploymentServer". Could that explain the SSL error? I tried to use dbx_settings.conf, but it does not seem to use my second entry:

[hec]
maxRetryWhenHecUnavailable = 3
hecUris = localhost:8088,splunk-hec.qualgend:8088

Logs sample:

127.0.0.1 - - [03/Nov/2022:15:30:08 +0000] "GET /api/taskserver HTTP/1.1" 200 414 "-" "python-requests/2.25.0" 2
2022-11-03 16:30:00.286 +0100 [Scheduled-Job-Executor-4] DEBUG c.s.d.s.dbinput.recordreader.DbInputRecordReader - action=closing_db_reader task=rising_mcipe_qualif
2022-11-03 16:30:00.286 +0100 INFO c.s.dbx.server.task.listeners.JobMetricsListener - action=collect_job_metrics connection=mcipe_qualif jdbc_url=null db_read_time=0 hec_record_process_time=3 format_hec_success_count=69 status=FAILED input_name=rising_mcipe_qualif batch_size=1000 error_threshold=N/A is_jmx_monitoring=false start_time=2022-11-03_04:30:00 end_time=2022-11-03_04:30:00 duration=21 read_count=69 write_count=0 error_count=0
2022-11-03 16:30:00.285 +0100 [Scheduled-Job-Executor-4] INFO org.easybatch.core.job.BatchJob - Job 'rising_mcipe_qualif' finished with status: FAILED
2022-11-03 16:30:00.285 +0100 [Scheduled-Job-Executor-4] ERROR org.easybatch.core.job.BatchJob - Unable to write records
javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
    at java.base/sun.security.ssl.SSLSocketInputRecord.handleUnknownRecord(SSLSocketInputRecord.java:451)
    at java.base/sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:175)
    at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:111)
2022-11-03 16:30:00.285 +0100 [Scheduled-Job-Executor-4] ERROR c.s.d.s.dbinput.recordwriter.CheckpointUpdater - action=skip_checkpoint_update_batch_writing_failed
javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
    at java.base/sun.security.ssl.SSLSocketInputRecord.handleUnknownRecord(SSLSocketInputRecord.java:451)
    at java.base/sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:175)
    at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:111)
2022-11-03 16:30:00.285 +0100 [Scheduled-Job-Executor-4] ERROR c.s.d.s.task.listeners.RecordWriterMetricsListener - action=unable_to_write_batch
javax.net.ssl.SSLException: Unsupported or unrecognized SSL message
    at java.base/sun.security.ssl.SSLSocketInputRecord.handleUnknownRecord(SSLSocketInputRecord.java:451)
    at java.base/sun.security.ssl.SSLSocketInputRecord.decode(SSLSocketInputRecord.java:175)
    at java.base/sun.security.ssl.SSLTransport.decode(SSLTransport.java:111)
2022-11-03 16:30:00.279 +0100 [Scheduled-Job-Executor-4] INFO c.s.d.s.dbinput.recordwriter.HttpEventCollector - action=writing_events_via_http_event_collector record_count=69
2022-11-03 16:30:00.279 +0100 [Scheduled-Job-Executor-4] INFO c.s.d.s.dbinput.recordwriter.HttpEventCollector - action=writing_events_via_http_event_collector
2022-11-03 16:30:00.279 +0100 [Scheduled-Job-Executor-4] INFO c.s.dbx.server.dbinput.recordwriter.HecEventWriter - action=write_records batch_size=69

Do you have any idea what I did wrong? Any clue would be greatly appreciated! Thanks in advance, Ema
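"Unsupported or unrecognized SSL message" usually means an HTTPS client reached a plaintext port (or nothing HEC-like at all). Since the HEC inputs were deployed to the UFs, there may be no HEC listening on the heavy instance's own localhost:8088. A sketch of a local HEC input for DB Connect to use; the token and index values are placeholders:

# $SPLUNK_HOME/etc/apps/splunk_httpinput/local/inputs.conf
[http]
disabled = 0
enableSSL = 1
port = 8088

[http://dbconnect]
token = 00000000-0000-0000-0000-000000000000
index = db_data

Also worth checking that the scheme DB Connect uses matches enableSSL on whichever host it targets (https when SSL is on, http when it is off); a mismatch in either direction produces exactly this exception.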
Hello! I need your help, please. I need to be able to view the logs as complete processes, not as fractions of them: Splunk shows me each event belonging to the same process separately, split across time intervals. It only happens with one specific sourcetype; the others are fine.
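If a single multi-line process log is being broken into several events, that is usually event breaking in props.conf on the parsing tier. A sketch, assuming (hypothetically) that each real event starts with a date like 2022-11-03; the sourcetype name is a placeholder:

# props.conf
[my:problem:sourcetype]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)\d{4}-\d{2}-\d{2}
TRUNCATE = 100000

At search time you can also regroup the pieces with something like "| transaction process_id", if the events carry a common process identifier.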
Hi all, we are planning to update to version 9.0.1, and I was wondering whether the KV store migration from MMAPv1 to WiredTiger is obligatory. The first sentence of the Splunk documentation says "Splunk Enterprise versions 9.0 and higher require the WiredTiger storage engine and server version 4.2", but I was wondering if something will go wrong if we don't perform the KV store migration before the Splunk upgrade, or if we don't perform the migration at all. Thanks!
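For reference, the current engine can be checked, and the migration run, from the CLI on the instance hosting the KV store (a sketch; the migrate command is documented for pre-9.0 versions still running MMAPv1):

$SPLUNK_HOME/bin/splunk show kvstore-status
$SPLUNK_HOME/bin/splunk migrate kvstore-storage-engine --target-engine wiredTiger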
Hi All, the application flow map lists the calls/min and errors/min. When we click in, it lists the errors, but I would like to know where and how this is being captured. For example:

Name                        Count   Errors/min
Internal server error: 500  22      0
xxxxx                       xx      xx

The Business Transactions tab lists the BT name and all the other details. It has a drill-down option via "view business transaction dashboard", but sometimes it does not show the same error counts inside. I'd appreciate your input to understand this better.
Hi, I have the query below, and the alert should trigger only when data is not available from any one of the hosts. I set the time range to last 24 hours to now, the trigger condition to 0, and the cron expression to */30 * * * *. I'm getting mail every 30 minutes, even when data is available.

index=advcf request=* host IN(abgc, efgh, jhty, hjyu, kjnb)
| eval event_ct=1
| append
    [| makeresults
     | eval host="abgc, efgh, jhty, hjyu, kjnb"
     | rex field=host mode=sed "s/\s+//g"
     | eval host=split(host,",")
     | mvexpand host
     | eval event_ct=0]
| stats sum(event_ct) AS event_ct BY host
| where event_ct=0
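Since this search already returns only the hosts with no data, the alert should fire when the result count is greater than 0; a trigger condition of "equal to 0" inverts the logic and fires exactly when every host is healthy, which matches the symptom. A sketch of the relevant savedsearches.conf settings (the stanza name is a placeholder):

[Alert - host missing data]
cron_schedule = */30 * * * *
dispatch.earliest_time = -24h
dispatch.latest_time = now
counttype = number of events
relation = greater than
quantity = 0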
We are looking into setting up a way to test Splunk dashboard screens. Is there any tool that can do this? We are looking for functional testing and performance testing. The reason: the data is changing and we want to version different Splunk dashboards, so we are looking to set up testing on the dashboards to see what breaks (or not) when the data changes.
I have a request from one of our service managers about getting an inventory of all hosts logging to Splunk. Using tstats gets the results we need via

| tstats values(host) by host

or, drilling down per index,

| tstats values(host) as hosts where index=idxname by index

but exporting to a CSV file or emailing the results won't work for our current needs. He would like the exported CSV results to be stored on a network drive on a weekly basis, or possibly in some other format if that's an option. I'm not sure this is possible with the report actions currently available, as I only see webhook, emailing results, etc. I'm wondering if there is a way to do this with an add-on alert action, or possibly another way.
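One approach, sketched under the assumption that a host that mounts the network share can reach the search head's management port: pull the results on a weekly cron job through the REST export endpoint and write the CSV onto the share. The hostname, service account, and mount point below are placeholders.

# weekly cron job on a host that mounts the network share
curl -k -u svc_account:password \
  "https://splunk-sh.example.com:8089/services/search/jobs/export" \
  --data-urlencode search='| tstats values(host) as hosts where index=* by index' \
  -d output_mode=csv \
  -o /mnt/network_share/splunk_host_inventory.csv

Alternatively, a scheduled search ending in "| outputlookup host_inventory.csv" writes the file under the app's lookups directory on the search head, from where a local cron job can copy it to the share.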
I have been reviewing the countless other postings on subsearches, but I can't pull them all together to figure out our issue. This first search builds a list of carts whose contents we need to find:

index="name" "Authorization was not successful!" AND /placeorder
| rex field=_raw "/carts/(?<cart>.+)/placeorder"
| dedup cart
| table cart

This is where I run into issues. I need to take the table created in that search and find all of the items contained in those carts. Here is the search for a single cart from that list:

index="name" "3322830131/processCheckout" AND "\"paymentProvider\":\"PayPal\""

My thought is that I need to cycle through the table from the subsearch, replacing the number in this search, then finally build a visualization that shows the contents of each cart using the most recent event from the second search. Am I way off? This seems pretty easy, but I can't figure it out. TYIA
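A sketch of one way to avoid looping, assuming the cart ID is the token right before /processCheckout: extract the cart ID in the outer search too, then let the subsearch supply the list of failed carts as a field filter, keeping only the most recent event per cart.

index="name" "/processCheckout" "\"paymentProvider\":\"PayPal\""
| rex field=_raw "(?<cart>[^\s/]+)/processCheckout"
| search
    [ search index="name" "Authorization was not successful!" "/placeorder"
      | rex field=_raw "/carts/(?<cart>[^/]+)/placeorder"
      | dedup cart
      | fields cart ]
| dedup cart sortby -_time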
I received the following error from the Splunk team, which failed the cloud compatibility check. Any suggestions on how to resolve the errors? I have updated the Splunk SDK for Python packages to the latest version. Thanks