All Topics

I would like to update my universal forwarders to send data to two separate endpoints for two separate Splunk environments. How can I do this using my deployment server? I already have an app that I will use for the UF update.
Hi, before asking I did try to search, but I wasn't able to locate a thread that covers this kind of datetime value, so I had to start this new thread. I have datetime values in string format like Thu 10 Oct 2024 08:48:12:574 EDT (sometimes the value may be null; that's just how the data is). What I need to do is get/derive the following into separate columns:
- day name, like Thursday
- day of month, like 10
- month, like Oct
- year, like 2024
- week number, like 2 or 3
- the time part as a separate column, like 08:48:12 (not worried about EDT)
- the time components separated again: 08 as hour, 48 as minute, 12 as second (not worried about ms)
I'm still looking for threads on this kind of question. Again, sorry, this is a basic one that just needs more searching.
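In SPL this kind of split is usually done with strptime/strftime inside eval; purely for illustration, here is a minimal Python sketch of the same parsing, assuming the exact format shown above (day name, day, month, year, H:M:S:ms, then a timezone abbreviation that is simply dropped). The function name and null handling are my own, not from the post.

```python
from datetime import datetime

def split_parts(s):
    """Split 'Thu 10 Oct 2024 08:48:12:574 EDT' into named parts."""
    # Tolerate nulls/empty strings, since the post says some values may be null
    if not s:
        return None
    # Drop the trailing timezone abbreviation (e.g. "EDT"); it isn't needed
    core = s.rsplit(" ", 1)[0]            # "Thu 10 Oct 2024 08:48:12:574"
    dt = datetime.strptime(core, "%a %d %b %Y %H:%M:%S:%f")
    return {
        "day_name": dt.strftime("%A"),    # e.g. Thursday
        "day": dt.day,
        "month": dt.strftime("%b"),
        "year": dt.year,
        "week": int(dt.strftime("%V")),   # ISO week number
        "time": dt.strftime("%H:%M:%S"),  # ms intentionally dropped
        "hour": dt.hour,
        "minute": dt.minute,
        "second": dt.second,
    }
```

The equivalent SPL would pair strptime with strftime format strings along the same lines; the key detail either way is that the milliseconds are colon-separated rather than dot-separated, so the format string must use a ":" before the fractional part.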
I am having trouble getting a .json file into Splunk through the backend to help support a customized dashboard. Is there a particular step I need to follow to get it in through the deployer?
I need a query that lists the URLs a particular host has reached out to in a particular time window, e.g. the last 24 hours. Please help.
I have onboarded data from a system that scatters actual events over many logging events. Successful and failed logins in particular cause me some headache. Successful login: <timestamp> Connection 'id123' from '192.168.1.100' has logged onto the server
 <timestamp> User 'johndoe' logged on (Connection id='id123')
 [ Time passes until John eventually decides to logoff again] 
<timestamp> Connection 'id123' from has logged off the server
Failed login: <timestamp> Connection 'id123' from '192.168.1.100' has logged onto the server
 <timestamp> Connection 'id123' from has logged off the server   Of course, I can fiddle around with transaction, or even stats, to list successful and failed logins or create an alert for them. However, that is not at all elegant. What is the best practice for getting this data nicely streamlined with eventtypes and tags?
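For illustration of the pairing logic being asked about, here is a small Python sketch (not SPL): treat a connection as a successful login only when a matching "User ... logged on" event appears for the same connection id before logoff. The sample events, timestamps, and regexes are hypothetical, modeled on the snippets above.

```python
import re

# Hypothetical events modeled on the post: id123 succeeds, id456 fails
events = [
    "10:00:00 Connection 'id123' from '192.168.1.100' has logged onto the server",
    "10:00:01 User 'johndoe' logged on (Connection id='id123')",
    "10:05:00 Connection 'id123' from has logged off the server",
    "10:06:00 Connection 'id456' from '192.168.1.101' has logged onto the server",
    "10:06:02 Connection 'id456' from has logged off the server",
]

def classify(events):
    """Return {connection_id: 'success' | 'failed'} by pairing events."""
    result, authed = {}, set()
    for line in events:
        if m := re.search(r"Connection '(\w+)' .*has logged onto", line):
            result[m.group(1)] = "failed"   # pessimistic until a user event arrives
        elif m := re.search(r"Connection id='(\w+)'", line):
            authed.add(m.group(1))          # a user authenticated on this connection
        elif m := re.search(r"Connection '(\w+)' .*has logged off", line):
            if m.group(1) in authed:
                result[m.group(1)] = "success"
    return result
```

In Splunk terms this is the same grouping that transaction or stats by the connection id performs; the sketch only shows the classification rule itself.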
Where can I download the installer for Splunk Enterprise 9.2.1?  
I am trying to verify the username by editing the code, but I do not know where to submit the code. I checked the documentation, but it only advises on how to edit the code; it does not mention where to submit it.
Hi, I have a large number of queries which need to be created as metrics in Analytics (because we can't retain data for more than 8 days in Analytics, we are creating metrics to retain it). Is there any tool, API, or curl command we can use to create these metrics by providing the query and metric name as a payload/arguments? Creating them manually is error-prone and time-consuming.
I am looking to append a value from a lookup CSV to an existing search: index=* | fields _time,x | chart count(_raw) by X. I want to replace (or append to) the X with a value (name) from a CSV so I can table the results.
Dear all, I need your help. We have achieved the visualization shown in image 1, but I'm expecting the results shown in image 2 (a semicircle donut or pie chart). Thanks in advance.
I recently upgraded Splunk Enterprise from version 9.1.0.2 to 9.3.1, and I've encountered an issue where the menu bar is no longer visible in the Search and Reporting UI.
Issue details:
- Previous version: 9.1.0.2
- Current version: 9.3.1
- Issue: The menu bar has disappeared, and to access menus, users must use the 'Find' box in the top right corner. For example, if a user wants to view dashboards, they need to type "dashboards" into the search box and select it from the results.
Screenshots: before the upgrade (9.1.0.2) with the menu bar; after the upgrade (9.3.1) with no menu bar.
Request: Is there a way to restore the traditional menu bar in the Search and Reporting window? Thank you.
Hello, how do I send an email alert if one or more subsearches exceed 50000 results? For example, below I have 4 subsearches. If subsearches 1 and 4 exceed 50000, I would like to get an email alert stating that subsearches 1 and 4 exceeded 50000. Please suggest. Thank you so much.
| base search [| subsearch 1] [| subsearch 2] [| subsearch 3] [| subsearch 4]
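The shape of the alert condition can be sketched in Python rather than SPL (the subsearch names and the 50000 limit mirror the post; the counts themselves are hypothetical): label each subsearch's result count, then report every label over the threshold.

```python
# Hypothetical per-subsearch result counts; in SPL each subsearch could
# append its own count with a label before the alert condition runs.
counts = {"subsearch 1": 61234, "subsearch 2": 120,
          "subsearch 3": 4000, "subsearch 4": 50001}

LIMIT = 50000

# Collect every subsearch over the limit, in a stable order
over = sorted(name for name, n in counts.items() if n > LIMIT)

# An empty message means no alert should fire
message = f"{' and '.join(over)} exceeded {LIMIT} results" if over else ""
```

The alert email body would then carry the message only when it is non-empty.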
Hello, how do I change the font size on a column chart in Splunk Dashboard Studio? See below: the 170500 label is overlapping with 170400. Also, how do I display 0 on the column chart? Currently 0 is ignored. Please suggest. Thank you for your help.
Hello, I developed a Splunk add-on that is working well. I attempted to set up several event types and data model mappings, but the Add-on Builder page fails to load after creating the event types. It never loads the model mapping page, and then displays a blank page with no event types even though they are present in the system. I can see the data models and the event types in the system, just not in the Add-on Builder. I've attached a screenshot for reference. Any ideas? I noticed that developer tools indicate a 500 error for get_eventtype_info and get_model_tree.
I'm looking to see if Splunk can highlight a row in an output table, based on a value in that row, in a dashboard built with Dashboard Studio. I created a dashboard to show printers using a lookup, plus the number of print jobs associated with each printer pulled from indexed print logs. I know how to highlight a single cell value based on a condition, but I want to know if the whole row can be highlighted based on a value in the row. I used the color and style option to set conditions on the Jobs field to highlight when the print count = 0:

Printer    Jobs   Prints
Pntr_01    149    285
Pntr_02    25     78
Pntr_03    0
Pntr_04    75     528
Pntr_05    85     149
Pntr_06    0

I would like to highlight the printer name in red as well when the value = 0. I searched the Splunk community as well as other areas of the Splunk matrix with no luck. If someone has some insight, or a reference on whether this can be done, it would be greatly appreciated. Thanks.
I would like to apply a custom style to a set of inputs. How do I correctly write this code? I'm aware of the option to create one style clause for each input ID, but that seems ridiculous and the wrong way to do it for, say, 20 inputs. Cheers.

<form version="1.1" theme="light">
  <fieldset submitButton="false">
  </fieldset>
  <row>
    <panel>
      <html>
        <style>
          #LineByLine {
            display: flex !important;
            padding-right: 10px;
            padding-top: 5px;
          }
        </style>
      </html>
    </panel>
  </row>
  <row>
    <panel>
      <input id="input1" type="text" token="1">
        <label>1</label>
      </input>
    </panel>
    <panel>
      <input id="input2" type="text" token="2">
        <label>2</label>
      </input>
    </panel>
  </row>
</form>
Hello, we are on Splunk 9.3 on-prem. I am unable to remove a server from the Splunk forwarder management list after it has been decommissioned and the Universal Forwarder uninstalled. I get an error stating that the DELETE option is deprecated, but what has it been replaced with? I have a server that has not logged to Splunk in 9 days (and never will again); how do I remove it correctly? (screenshot attached)
Looking for help running a stats count and a stats sum referencing a lookup, using print logs. I want to output all printers from a lookup, giving a "total jobs" count (counting each record in the query for a single printer) and a "total pages" count of all pages printed, for each printer listed in the lookup.

Logs from my index:

date          printer_name   user    pages_printed
2024_10_09    prnt_01        user1   10
2024_10_09    prnt_02        user4   15
2024_10_09    prnt_01        user6   50
2024_10_09    prnt_04        user9   25
2024_10_09    prnt_01        user2   20

Data from my lookup, file name printers.csv:

printer_name   printer_location
prnt_01        main office
prnt_02        front desk
prnt_03        breakroom
prnt_04        hallway

Looking for output similar to:

Printer Name   Location      Print Jobs   Pages Printed
prnt_01        main office   3            80
prnt_02        front desk    1            15
prnt_03        breakroom     0            0
prnt_04        hallway       1            25

I have two separate queries for each respectively, and I'm having issues merging them together.
My individual queries are below.

Working query that gives me the job count and the sum of total pages:

index=printer sourcetype=printer:logs
| stats count sum(pages_printed) AS pages_printed by printer_name
| lookup printers.csv printer_name AS printer_name OUTPUT printer_location
| table printer_name, printer_location, count, pages_printed
| rename printer_name AS "Printer Name", printer_location AS "Location", count AS "Print Job", pages_printed AS "Pages Printed"

Results:

Printer Name   Location      Print Jobs   Pages Printed
prnt_01        main office   3            80
prnt_02        front desk    1            15
prnt_04        hallway       1            25

Working query that gives me a list of all printers and the job count:

index=printer sourcetype=printer:logs
| eval printer_name=lower(printer_name)
| stats count BY printer_name
| append [| inputlookup printers.csv | eval printer_name=lower(printer_name), count=0 | fields printer_name count]
| stats sum(count) AS print_jobs by printer_name
| table printer_name, print_jobs
| rename printer_name AS "Printer Name", print_jobs AS "Print Job"

Results:

Printer Name   Print Jobs
prnt_01        3
prnt_02        1
prnt_04        1

Again, I'm trying to merge the two to give me Printer Name, Location, number of print jobs, and total pages printed. Any assistance will be greatly appreciated.
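The shape of the merge being asked for — aggregate the logs, then left-join from the lookup so printers with no activity still appear with zeros — can be sketched in Python using the sample rows from the post (stand-in data, not the actual index):

```python
import csv
import io
from collections import defaultdict

# Stand-ins for the indexed print events: (printer_name, pages_printed)
log_rows = [
    ("prnt_01", 10), ("prnt_02", 15), ("prnt_01", 50),
    ("prnt_04", 25), ("prnt_01", 20),
]

# Stand-in for the printers.csv lookup
lookup_csv = """printer_name,printer_location
prnt_01,main office
prnt_02,front desk
prnt_03,breakroom
prnt_04,hallway
"""

# Aggregate jobs and pages per printer (the "stats count sum(...)" step)
jobs, pages = defaultdict(int), defaultdict(int)
for name, page_count in log_rows:
    jobs[name] += 1
    pages[name] += page_count

# Left-join from the lookup so unused printers appear with zeros
report = [
    (r["printer_name"], r["printer_location"],
     jobs.get(r["printer_name"], 0), pages.get(r["printer_name"], 0))
    for r in csv.DictReader(io.StringIO(lookup_csv))
]
```

The point of the sketch is the join direction: starting from the lookup (every printer) and filling in aggregates with a default of 0 is what makes prnt_03 show up at all.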
I recently upgraded my deployment from 9.0.3 to 9.2.2. After the upgrade, the KV store stopped working. Based on my research, I found that the KV store version reverted to version 3.6 after the upgrade, causing the KV store to fail:

"__wt_conn_compat_config, 226: Version incompatibility detected: required max of 3.0 cannot be larger than saved release 3.2"

I looked through the bin directory and found the following binaries:
1. mongod-3.6
2. mongod-4.6
3. mongodump-3.6
Will removing mongod-3.6 and mongodump-3.6 from the bin directory resolve this issue?
Hello, this is the result from one of my rows in Search & Reporting (web), with each job code on its own line:

Job Code
039081934400000 (4)
082441325900000 (199)

However, when my code is used in a classic dashboard, the results are this:

Job Code
039081934400000 (4) 082441325900000 (199)

How do I control my dashboard output to display like my search output?

| inputlookup job_codes_2024.csv
```all fields in the lookup above begin with the letter j, except for the field cntrl```
| foreach j* ```add line feed at the end of all fields beginning with the letter j```
    [| rex field=<<FIELD>> mode=sed "s/$/\n/g"]
```group all fields by the cntrl value```
| stats values(*) as * by cntrl

Thanks and God bless, Genesius