All Topics


Does each search head cluster need its own dedicated deployer server? For example, if we have a three-server search head cluster called Search Head Cluster A with a deployer server, can we use that same deployer server for another three-server cluster, say Search Head Cluster B?
My predecessor installed a search head cluster and used the indexer cluster master as the deployer server for the search head cluster. I would like to move that role to a separate machine. Can that be done without having to rebuild the search head cluster?
If I have a retention of 90 days on an index, but events come in with a broken timestamp that says 2018 or something, what counts for retention now: index time or _time from the event?
Hello,

Is there a way to use the transaction command to let us know whether an activity/attack is ongoing?

Scenario: create a search that detects ongoing DDoS activity. I have the following search that detects DoS activity events and tracks them using transaction. I see there is a maxspan option available, but there is no minspan. Even if I schedule this to run every 1h, maxspan will also include results that span less than 1h. Since there is no minspan option, how do I make it detect an ongoing activity? Hope I am clear.

My search:

index=arbor ... | transaction eventID startswith=starting endswith=end maxspan=1h | eval starttime = _time | eval duration = "Ongoing" | convert ctime(starttime) | table starttime, duration, condition

Maybe my above approach is wrong. How else can we accomplish this?
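A hedged sketch of one possible direction (untested against this data, not necessarily *the* answer): transaction keeps incomplete transactions when keepevicted=true is set, and marks finished ones with closed_txn=1, so filtering on closed_txn=0 would surface transactions that saw a starting event but no end event yet:

```spl
index=arbor ...
| transaction eventID startswith=starting endswith=end maxspan=1h keepevicted=true
| where closed_txn=0
| eval starttime = _time, duration = "Ongoing"
| convert ctime(starttime)
| table starttime, duration, condition
```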
I have a real-time Splunk index pushing records into two sourcetypes. Sourcetype 1 holds fields including assignmentgroup, manager name, entity, etc. Sourcetype 2 holds fields including ticketnumber, assignmentgroup, priority, etc. Sourcetype 2 has ticket updates coming in, and each ticket can move from one assignmentgroup to another assignmentgroup, which may or may not be present in Sourcetype 1.

I would like to find out how many tickets there are in Sourcetype 2 that moved out of the assignmentgroups of Sourcetype 1. In other words, how many tickets are present in Sourcetype 2 whose assignmentgroup doesn't belong to the assignmentgroups present in Sourcetype 1? Any leads would be helpful. TIA!

Just an update: Sourcetype 1 is actually pushed to a lookup file (that has the same columns as Sourcetype 1), hence I intend to use this lookup in the search query.
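One hedged way to sketch this, assuming the lookup file is named assignmentgroups.csv and has an assignmentgroup column (the index and lookup names here are placeholders): take each ticket's latest assignmentgroup from Sourcetype 2, look it up, and keep the tickets where no match is found:

```spl
index=your_index sourcetype=sourcetype2
| stats latest(assignmentgroup) AS assignmentgroup BY ticketnumber
| lookup assignmentgroups.csv assignmentgroup OUTPUT assignmentgroup AS matched_group
| where isnull(matched_group)
| stats dc(ticketnumber) AS tickets_moved_out
```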
Hello Experts, I am using Splunk Dashboard Studio with Splunk Enterprise version 8.6.2. I have a simple table in my dashboard showing some search results; no scrollbars are seen in the dashboard itself, and it works fine. When I download the dashboard as a PDF (or PNG), the table has a horizontal scroll bar even though the data displayed is small enough to fit into the table. I have tried the "overflow": "none", "overflow-x": "none", and font-size properties in the code editor, but the PDF export still has the scrollbar.

Attaching some screenshots: the dashboard with no horizontal scroll bar, and the same dashboard after using download as PDF.

Any ideas on how to remove this? Appreciate your help in advance. Regards
Hi All, I have a fresh Splunk install with the Splunk ITSI app running that I will use for a proof of concept at a customer, so in this case I used a Sales Trial license for Splunk Enterprise and a Splunk IT Service Intelligence Trial license for ITSI. On the Splunk Enterprise side there is no issue; I can ingest the data needed. Now, when I choose the ITSI app, the first dashboard shows me the notification "Unable to retrieve subscription data" (see the picture below). I don't know what subscription data means. And when I tried to choose another menu in the ITSI app, I got an Internal Server Error (500 response code).

I want to know if anybody here has experienced the same thing as me; if so, please tell me how you fixed it. Thanks
What are the best HEC performance tuning configurations?
Trying to get the SSL Monitoring Extension working, and I am seeing this error:

unable to load certificate 140343873517384:error:0906D06C:PEM routines:PEM_read_bio:no start line:pem_lib.c:703:Expecting: TRUSTED CERTIFICATE

I am installing on Linux. Thanks
Hi everyone, I would like to know if it is possible to export the alerts created in a Splunk Cloud instance. I want to export the queries for all alerts at once. Is that possible? Regards
Hi, I have a timechart with the revenue of several shops (each shop is a field) over the month. I want to know the accumulated revenue of each shop over time, so that if a shop earned $5 on Monday and $7 on Tuesday, then on Tuesday the graph will show $12. I know that the accum command does that for a given field, but I don't know ahead of time how many fields there will be.

Example (input -> accumulated):

A B C      A  B  C
8 3 5  ->  8  3  5
6 7 4      14 10 9
2 5 9      16 15 18

This is my code until now:

<something> | timechart span=1d sum(revenue) by shop | accum A | accum B | accum C

The goal is for the fields to be dynamic and not hardcoded! Thank you
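A hedged sketch: streamstats accepts wildcards, so summing every column that timechart produces avoids hardcoding the shop names (fields beginning with an underscore, such as _time, are not matched by the wildcard):

```spl
<something>
| timechart span=1d sum(revenue) BY shop
| streamstats sum(*) AS *
```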
Is there a best-practice search to find the last event sent at the start of an outage and the first event to come in after the outage for a specific data source was rectified? Basically, what is the best way to identify the outage window in one search?
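One hedged approach (a sketch, with an assumed one-hour gap threshold and placeholder index/sourcetype names): sort by time and use delta to measure the gap between consecutive events; any gap larger than the threshold brackets an outage window, with the last event before the outage at outage_start and the first event after it at outage_end:

```spl
index=your_index sourcetype=your_sourcetype
| sort 0 _time
| delta _time AS gap
| where gap > 3600
| eval outage_start = _time - gap, outage_end = _time
| convert ctime(outage_start) ctime(outage_end)
| table outage_start, outage_end, gap
```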
Hi All, I am trying to use a drop-down to run a search based on what is currently selected (in the drop-down) in Dashboard Studio.

Here is what I have so far: a drop-down that's populated by an input lookup table, and four distinct searches. Every time an item is selected in the drop-down, the associated search needs to run and the data displayed needs to change. The searches are very different from one another, so passing a value in a $token$ will not be enough.

What can I do to show data in the form of a table based on the item selected in the drop-down?
Hello there, I want to search for the list of users whose accounts were disabled, along with their account names, and make it into a report.
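A hedged starting point, assuming Windows Security logs are already being collected: event code 4725 is logged when a user account is disabled. The index, sourcetype, and field names below are assumptions that depend on your inputs and add-ons, so adjust them to match your environment:

```spl
index=wineventlog sourcetype="WinEventLog:Security" EventCode=4725
| table _time, user, src_user, ComputerName
| rename user AS "Disabled Account", src_user AS "Disabled By"
```

Saving this search as a report (Save As > Report) would cover the reporting part.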
index=abc | stats latest(_time) AS Last_time by day | convert ctime(Last_time) | sort by Last_time desc

For example:

Monday    06/13/2022 13:03:11
Tuesday   06/13/2022 13:03:11
Wednesday 06/13/2022 13:03:11
Thursday  06/13/2022 13:03:11
Friday    06/12/2022 13:03:11
Saturday  06/13/2022 13:03:11
Sunday    06/13/2022 13:03:11

I want the search to return 0 (or something else) if there was no event today:

Monday    06/13/2022 13:03:11
Tuesday   06/13/2022 13:03:11
Wednesday 06/13/2022 13:03:11
Thursday  06/13/2022 13:03:11
Friday    0 (or something else)
Saturday  06/13/2022 13:03:11
Sunday    06/13/2022 13:03:11

Is that possible?
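A hedged sketch of one way to do it: append a placeholder row for every day name so that days with no events still appear, then fill the gaps. The day-name strings are assumed to match whatever your day field contains:

```spl
index=abc
| stats latest(_time) AS Last_time BY day
| append
    [| makeresults
     | eval day=split("Monday,Tuesday,Wednesday,Thursday,Friday,Saturday,Sunday", ",")
     | mvexpand day
     | fields day]
| stats max(Last_time) AS Last_time BY day
| eval Last_time=if(isnull(Last_time), "0", strftime(Last_time, "%m/%d/%Y %H:%M:%S"))
```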
Hello from the Splunk Cloud Monitoring Console (CMC) Team,

The Cloud Monitoring Console (CMC) app comes preinstalled with all Splunk Cloud stacks and is accessible by customers with the Splunk Cloud admin role. CMC helps Splunk Cloud administrators (sc_admins) identify performance issues, view license usage, and gather general information about their stack (e.g., user details). We bring new capabilities to CMC regularly and would love to answer any questions you might have about the CMC app. Before you search through previous conversations looking for assistance, we want to provide you with some basic information and quick resources.

Want to access product documentation? CMC docs offer detailed guidance on each stage of using the Cloud Monitoring Console (CMC).

Want to find out what the latest releases of CMC contain? CMC Release Notes contain information about new features, known issues, and issues resolved for the Cloud Monitoring Console (CMC) app, grouped by release version and the generally available release date.

Want to request more features? Add your ideas and vote on other ideas at the CMC Ideas Portal.

Please reply to this thread with any questions or to get extra help!
Thanks in advance. I have a search set up to see whenever someone accesses a certain document. This works just fine; the issue comes with the results. Looking at the extracted fields, I get the user's SID instead of their username. I do, however, have the Splunk Supporting Add-on for Active Directory installed and configured. I have a report that pulls a CSV (users.csv) giving me everyone's sAMAccountName as well as their SIDs, and puts it in the location of my lookup table.

I'm trying to figure out how to use |inputlookup to compare the search result's SID with my CSV and return the account name from that specific row as well. Any help?

I have this (minus the output to create my users.csv):

|ldapsearch search="(&(objectclass=user)(!(objectClass=computer)))" attrs="userAccountControl,sAMAccountName,objectSid,displayName,givenName,sn,mail,telephoneNumber,mobile,manager,department,whenCreated,accountExpires" |makemv userAccountControl |search userAccountControl="NORMAL_ACCOUNT" |eval suffix="" |eval endDate="" |table sAMAccountName,objectSid,displayName,givenName,sn,whenCreated

and my main search:

source="WinEventLog:Microsoft-Windows-AppLocker/EXE and DLL" NOT %SYSTEM32*

I just need an input that takes my result's SID, finds it in the "objectSid" column (column B) of the CSV, and gives me the sAMAccountName (column A) in my search results... if possible!
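A hedged sketch, assuming the lookup file is available to the search as users.csv and the extracted field in the AppLocker events is literally named Sid (both names are assumptions to adjust): the lookup command can match the event's SID against the CSV's objectSid column and emit the sAMAccountName, without needing inputlookup at all:

```spl
source="WinEventLog:Microsoft-Windows-AppLocker/EXE and DLL" NOT %SYSTEM32*
| lookup users.csv objectSid AS Sid OUTPUT sAMAccountName
```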
Hi all, I encountered a problem in MLTK where the data from the search is passed in multiple chunks to my custom classifier (using the apply command). Interestingly enough, the fit command passes the entire dataframe to the apply method of the custom classifier, as shown below.

Search with apply:

index=test | apply model

PID 10489 2022-06-12 22:35:13,604 INFO [mlspl.ECOD Logger] [apply] Length of dataframe: 50
PID 10489 2022-06-12 22:35:13,730 INFO [mlspl.ECOD Logger] [apply] Length of dataframe: 205
PID 10489 2022-06-12 22:35:13,821 INFO [mlspl.ECOD Logger] [apply] Length of dataframe: 41

Search with fit:

index=test | fit ECOD date_hour into model

PID 8345 2022-06-12 22:27:50,867 INFO [mlspl.ECOD Logger] [apply] Length of dataframe: 296

The second one is the behavior I want, since I need the data as a single batch. Setting "chunked=false" in commands.conf to use the legacy protocol does not work because MLTK is not compatible with v1. Setting "streaming=false" also has no effect. Does anyone know how I can prevent Splunk from splitting the data into multiple chunks? Any help is appreciated! Thanks.
Hi, we have set up the SNMP Modular Input to begin ingesting traps. Traps are hitting the listener, but upon ingestion, the SNMPv2-SMI::enterprises.6876.4.3.306.0 portion is unreadable (see attached screenshot). We have validated that the MIBs are converted into the proper Python format. Any thoughts? Thanks!
Hello, I have some use cases where we need to delete files right after they are read/pushed by the UF. How would I do it? Is there any way to have the UF do this task using a batch input in inputs.conf? Any recommendation would be highly appreciated. Thank you!
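For what it's worth, the UF's batch input with move_policy = sinkhole does exactly this: it indexes each file once and then deletes it. A minimal inputs.conf sketch (the path and sourcetype here are placeholders):

```
[batch:///var/log/myapp/*.log]
move_policy = sinkhole
sourcetype = myapp_logs
disabled = false
```

Note that batch inputs are for one-time, destructive reads; files that are still being written to should use a monitor input instead.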