All Topics


Hi, I have a timechart with the revenue of several shops (each shop is a field) over the month. I want to know the accumulated revenue of each shop over time, so that if a shop earned $5 on Monday and $7 on Tuesday, then on Tuesday the graph shows $12. I know that the accum command does this for a given field, but I don't know ahead of time how many fields there will be. Example:

A  B  C          A   B   C
8  3  5    ->    8   3   5
6  7  4          14  10  9
2  5  9          16  15  18

This is my code so far:

<something> | timechart span=1d sum(revenue) by shop | accum A | accum B | accum C

The goal is for the fields to be dynamic and not hardcoded! Thank you
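A minimal sketch of one way to keep the fields dynamic (assuming the timechart above): unpivot the table with untable, accumulate per shop with streamstats, and pivot back with xyseries:

<something>
| timechart span=1d sum(revenue) as revenue by shop
| untable _time shop revenue
| streamstats sum(revenue) as revenue by shop
| xyseries _time shop revenue

This works for any number of shops because the per-field accumulation happens while the shop names are row values rather than column names.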
Is there a best-practice search to find the last event sent at the start of an outage, and the first event to come in after the outage for a specific data source was rectified? Basically, what is the best way to identify the outage window in one search?
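A sketch of one common approach, assuming placeholder index/sourcetype names and treating any gap longer than an hour as an outage: sort ascending, carry the previous event's timestamp forward with streamstats, and keep the pairs whose gap exceeds the threshold. prev_time is then the last event before the outage and _time the first event after it:

index=your_index sourcetype=your_sourcetype
| sort 0 _time
| streamstats current=f window=1 last(_time) as prev_time
| eval gap_seconds=_time-prev_time
| where gap_seconds > 3600
| eval outage_start=strftime(prev_time, "%F %T"), outage_end=strftime(_time, "%F %T")
| table outage_start outage_end gap_seconds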
Hi All, I am trying to use a drop-down to run a search based on what's currently selected in the drop-down, in Dashboard Studio. Here is what I have so far: a drop-down that's populated by an inputlookup table. There are 4 distinct searches. Every time an item is selected in the drop-down, the associated search needs to run and the data displayed needs to change. The searches are very different from one another, so passing a value in a $token$ will not be enough. What can I do to show data in the form of a table based on the item selected in the drop-down?
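One workaround, sketched under the assumption that the whole SPL string can live in the lookup (hypothetical file my_queries.csv with columns label and query): let the drop-down's dynamic-options search return the query text as the option value, then make the table's data source nothing but that token:

| inputlookup my_queries.csv
| table label query

With label mapped to the option label and query to the option value, the table's search string can be just $selected_query$, so the entire query is swapped whenever the selection changes.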
Hello there, I want to search for the list of users whose accounts were disabled, along with their account names, and turn it into a report.
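A minimal sketch, assuming Windows Security event logs are indexed (the index and sourcetype names here are placeholders, and the exact field holding the disabled account depends on your add-on's extractions): EventCode 4725 ("A user account was disabled") carries the affected account:

index=wineventlog sourcetype=WinEventLog EventCode=4725
| stats latest(_time) as disabled_time by user
| convert ctime(disabled_time)
| table user disabled_time

Save the result as a report via Save As > Report.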
index=abc | stats latest(_time) AS Last_time by day | convert ctime(Last_time) | sort by Last_time desc

For example:

Monday 06/13/2022 13:03:11
Tuesday 06/13/2022 13:03:11
Wednesday 06/13/2022 13:03:11
Thursday 06/13/2022 13:03:11
Friday 06/12/2022 13:03:11
Saturday 06/13/2022 13:03:11
Sunday 06/13/2022 13:03:11

I want the search to return 0 (or something else) if there was no event that day:

Monday 06/13/2022 13:03:11
Tuesday 06/13/2022 13:03:11
Wednesday 06/13/2022 13:03:11
Thursday 06/13/2022 13:03:11
Friday 0 (or something else)
Saturday 06/13/2022 13:03:11
Sunday 06/13/2022 13:03:11

Is that possible?
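One way to sketch this (assuming day holds the weekday name): append a placeholder row per weekday with Last_time=0, take the max per day so real timestamps win, and format at the end. Run it over the window you care about (e.g. the last 7 days) so a day with no events in that window falls back to 0:

index=abc
| stats latest(_time) AS Last_time by day
| append
    [| makeresults
    | eval day=split("Monday,Tuesday,Wednesday,Thursday,Friday,Saturday,Sunday", ",")
    | mvexpand day
    | eval Last_time=0
    | table day Last_time]
| stats max(Last_time) as Last_time by day
| eval Last_time=if(Last_time=0, "0", strftime(Last_time, "%m/%d/%Y %H:%M:%S"))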
Hello from the Splunk Cloud Monitoring Console (CMC) Team,

The Cloud Monitoring Console (CMC) app comes preinstalled with all Splunk Cloud stacks and is accessible by customers with the Splunk Cloud admin role. CMC helps Splunk Cloud administrators (sc_admins) identify performance issues, view license usage, and gather general information about their stack (e.g. user details). We bring new capabilities to CMC regularly and would love to answer any questions you might have about the CMC app. Before you search through previous conversations looking for assistance, we want to provide you with some basic information and quick resources.

Want to access product documentation? CMC docs offer detailed guidance on each stage of using the Cloud Monitoring Console (CMC).

Want to find out what the latest releases of CMC contain? CMC Release Notes contain information about new features, known issues, and resolved issues for the Cloud Monitoring Console (CMC) app, grouped by release version and the generally available release date.

Want to request more features? Add your ideas and vote on others' ideas at the CMC Ideas Portal.

Please reply to this thread with any questions or to get extra help!
Thanks in advance. I have a search set up to see whenever someone accesses a certain document. This works just fine; the issue comes with the results. Looking at the extracted fields, I get the user's SID instead of their username. I do, however, have the Splunk Supporting Add-on for Active Directory, and have it configured. I have a report that pulls a CSV (users.csv) that gives me everyone's sAMAccountName as well as their SIDs and puts it in the location of my lookup table. I'm trying to figure out how to get |inputlookup to compare the search result's SID with my users.csv and give me the account name from that specific row as well. Any help?

I have this (minus the output step that creates my users.csv):

|ldapsearch search="(&(objectclass=user)(!(objectClass=computer)))" attrs="userAccountControl,sAMAccountName,objectSid,displayName,givenName,sn,mail,telephoneNumber,mobile,manager,department,whenCreated,accountExpires"
|makemv userAccountControl
|search userAccountControl="NORMAL_ACCOUNT"
|eval suffix=""
|eval endDate=""
|table sAMAccountName,objectSid,displayName,givenName,sn,whenCreated,

and my main search:

source="WinEventLog:Microsoft-Windows-AppLocker/EXE and DLL" NOT %SYSTEM32*

I just need a way to take my result's SID, find it in the "objectSid" column (column B) of the CSV, and add the sAMAccountName (column A) to my search results... IF POSSIBLE!
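Assuming the extracted field in the AppLocker events is named Sid and users.csv is already in place as a lookup table file, the lookup command (rather than inputlookup) can do the SID-to-name mapping inline; a sketch:

source="WinEventLog:Microsoft-Windows-AppLocker/EXE and DLL" NOT %SYSTEM32*
| lookup users.csv objectSid AS Sid OUTPUT sAMAccountName
| table _time Sid sAMAccountName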
Hi all, I encountered a problem in MLTK where the data from the search is passed in multiple chunks to my custom classifier (using the apply command). Interestingly enough, the fit command passes the entire dataframe to the apply method of the custom classifier, as shown below.

Search with apply: index=test | apply model

PID 10489 2022-06-12 22:35:13,604 INFO [mlspl.ECOD Logger] [apply] Length of dataframe: 50
PID 10489 2022-06-12 22:35:13,730 INFO [mlspl.ECOD Logger] [apply] Length of dataframe: 205
PID 10489 2022-06-12 22:35:13,821 INFO [mlspl.ECOD Logger] [apply] Length of dataframe: 41

Search with the fit command: index=test | fit ECOD date_hour into model

PID 8345 2022-06-12 22:27:50,867 INFO [mlspl.ECOD Logger] [apply] Length of dataframe: 296

The second one is the behavior I want, since I need the data as a single batch. Setting "chunked=false" in commands.conf to use the legacy protocol does not work because MLTK is not compatible with v1. Setting "streaming=false" also has no effect. Does anyone know how I can prevent Splunk from splitting the data into multiple chunks? Any help is appreciated! Thanks.
Hi, we have set up the SNMP Modular Input to begin ingesting traps. Traps are hitting the listener, but upon ingestion, the SNMPv2-SMI::enterprises.6876.4.3.306.0 portion is unreadable. We have validated that the MIBs are converted into the proper Python format. Any thoughts? Thanks!
Hello, I have some use cases where we need to delete files right after they are read and pushed by the UF. How would I do that? Is there any way to let the UF handle this using a batch input in the inputs.conf file? Any recommendation would be highly appreciated, thank you!
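Yes: the batch input with move_policy = sinkhole does exactly this; the UF indexes each file once and then deletes it. A sketch for inputs.conf (path, sourcetype, and index are placeholders):

[batch:///var/log/myapp/*.log]
move_policy = sinkhole
sourcetype = myapp_logs
index = main
disabled = false

Since sinkhole deletes the files after reading, point it only at copies you can afford to lose.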
Hello, I'm not super familiar with Splunk yet, but I have the following scenario:

1 - Several applications on a public cloud provider
2 - Heavy Forwarder deployed on the public cloud provider
3 - Splunk Cloud

The application log volume in the cloud is huge, but largely unrelated to and not wanted in the SIEM, and we cannot filter the data at the source itself. Is it possible to receive the full volume on the Heavy Forwarder and have the HF select and DISCARD data before sending to Splunk Cloud? Or maybe we can configure the HF to query the log sources for a specific string and bring in only what we want?

Thank you!
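Yes, discarding on the HF is a standard pattern: route unwanted events to nullQueue with props/transforms before they are forwarded. A sketch with placeholder sourcetype and pattern:

props.conf:
[my_noisy_sourcetype]
TRANSFORMS-dropnoise = drop_noise

transforms.conf:
[drop_noise]
REGEX = pattern_to_discard
DEST_KEY = queue
FORMAT = nullQueue

Events matching REGEX are dropped on the HF and never reach Splunk Cloud. Note that an HF does not actively query log sources, so the second idea would need a scripted or modular input instead.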
In this scenario, each HOST_NAME has many HOME_LOCATIONs. Each HOME_LOCATION has unique info - in this case, the RDBMS_VERSION and the DATABASE_RELEASE. I am trying to produce a simple statistics table that shows each unique HOME_LOCATION (and accompanying info) for each HOST_NAME.

When I run the below (1st screenshot), the data is aligned as I'd expect:

| stats values(HOME_LOCATION) values(RDBMS_VERSION) by HOST_NAME

When I run the below (2nd screenshot) and add the third values field, the data becomes misaligned for some rows:

| stats values(HOME_LOCATION) values(RDBMS_VERSION) values(DATABASE_RELEASE) by HOST_NAME

What am I missing or doing incorrectly?
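This is expected stats behavior rather than a bug: values() builds each column as its own deduplicated, lexicographically sorted multivalue list, so the rows of the three lists do not correlate with each other. Grouping by HOME_LOCATION as well keeps each location's info on its own row; a sketch:

| stats values(RDBMS_VERSION) as RDBMS_VERSION values(DATABASE_RELEASE) as DATABASE_RELEASE by HOST_NAME HOME_LOCATION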
Hi, we have Server Visibility enabled, and I can see the processes for nodes. We want to monitor whether a process is running or not. Is there some trick to finding the metric in the Metric Browser? With 5-8 thousand nodes it's extremely difficult to navigate the UI. Does anyone have a base metric path as a starting point, or some trick to copy the metric URL (similar to BTs and other objects)? Thanks, Chris
I was trying the operation below but am not getting the expected result:

1. I need ID from the subsearch, which is the join parameter.
2. Apply stats on the outer query.

The stats are getting applied to the inner query instead:

index=* methodName=* success=true | join ID [|SEARCH index=* Success=true ]|stats count(eval(Success="true")) as SuccessRate,count(eval(Success="false")) as FailureRate by Action

I'd appreciate quick help on this.
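A sketch of a likely fix, under two assumptions: SPL field names are case-sensitive (success and Success are different fields), and a subsearch should begin with a lowercase search command. Keeping the subsearch minimal lets stats run on the joined result:

index=* methodName=* success=true
| join type=inner ID
    [ search index=* | fields ID Success ]
| stats count(eval(Success="true")) as SuccessRate count(eval(Success="false")) as FailureRate by Action

Note that if the subsearch kept only Success=true rows, FailureRate would always be 0, so that filter is dropped here.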
Hello, I am interested in knowing the best storage option among these three (DAS, NAS, or SAN) for storing the data from indexers for the long term.

Thanks,
The query that is generated by Splunk is quite convoluted, and I would like to provide my own query for the "Open in Search" link on one of the panels in my dashboard. Is it possible to do so?

edit: Corrected to "Open in Search"
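If the dashboard is Simple XML, the panel's generated link can be overridden with the link.openSearch.* panel options; a sketch for a chart panel (the replacement query is a placeholder, and these options should be verified against the Simple XML reference for your Splunk version):

<chart>
  <search>
    <query>...your existing panel search...</query>
  </search>
  <option name="link.openSearch.text">Open in Search</option>
  <option name="link.openSearch.search">index=web sourcetype=access_combined | stats count by status</option>
</chart>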
Hi Team, we are constantly getting the errors below in the forwarders' splunkd.log:

ERROR TCPOutputQ - Unexpected event id=4
ERROR TCPOutputQ - Unexpected event id=7

However, we have observed that data is getting ingested to the Splunk indexers without any issue. Can anyone please help us understand what exactly this error relates to?

With regards, Krishna.
I have my SonicWall log files coming into Splunk. When searching this index I want to strip "dst" (destination IP address) of its port number and interface, using (for example) regex. Note that the format used for "src" and "dst" is (ip address):(port number):(interface). So I run a search like this (NOTE: the eval on the second line is my own attempt; however, it does not give the result I had in mind):

index=sonicwall msg="Connection Opened" OR msg="Connection Closed" earliest=-2m latest=-1m
| eval dst=if(match(dst, "\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3}:\d{1,5}:X\d{1}"), dst, replace(dst, "(\d{1,3}.\d{1,3}.\d{1,3}.\d{1,3}):\d{1,5}:X\d{1}","\1"))
| stats first(_time) as _time by src dst proto msg
| inputlookup append=t firewall_open_connections
| fillnull msg value="Connection Opened"
| eval closed=if(msg="Connection Closed",_time,"1")
| eval open=if(msg="Connection Opened",_time,"1")
| stats first(open) as open first(closed) as closed by src dst proto
| where open > closed
| rename open as _time
| fields src dst proto _time
| outputlookup firewall_open_connections

which results in:

src                      dst                      proto      _time
10.0.1.5:50492:X2        8.8.8.8:53:X1            udp/dns    2022-06-14 15:40:08
192.168.1.100:37016:X0   54.81.233.206:443:X1     tcp/https  2022-06-14 15:39:01
192.168.1.100:38376:X0   104.244.42.130:443:X1    tcp/https  2022-06-14 14:49:14
192.168.1.100:38611:X0   172.217.132.170:443:X1   udp/https  2022-06-14 15:37:51

Now I would like the "dst" results to be stripped of :(port number):(interface) or :(interface); in other words, only the IP address should remain. How do I do that within my query in Splunk, with for example regex (or another method)? Any tip is welcome; I am very new to Splunk.
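A minimal sketch of just the stripping step: the if(match(...), dst, replace(...)) above returns dst unchanged precisely when the pattern matches (the branches are inverted), and the unescaped dots match any character. An anchored replace that keeps only the leading IP address handles both the :port:interface and :interface forms:

| eval dst=replace(dst, "^(\d{1,3}(?:\.\d{1,3}){3}).*$", "\1")
| eval src=replace(src, "^(\d{1,3}(?:\.\d{1,3}){3}).*$", "\1")

Placing these right after the base search, before the first stats, means the lookup keys are already bare IP addresses.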
I would like to extract a specific part of the data from the raw event. The data to be extracted is the ID, which is highlighted:

"aepassword": "kmdAkcu)n>Ec_.a(m5P7?8-n",
"aeci": {
    "outgoing_server": "mailrv.aaa.com",
    "email_footer": "C:\\ProgramData\\bbb\\AutomationNote\\Email\\aa_Mail_Footer.png",
    "email_header": "C:\\ProgramData\\bbb\\AutomationNote\\Email\\aa_Mail_Header.png",
    "signature": "C:\\ProgramData\\bbb\\Automation\\Email\\bb_Email_Signature.txt",
    "requires_authentication": "false",
    "reply-to": "us@aaa.com",
    "primaryaddress": "ussdev@aaa.com",
    "host": "ussdev@bbb.com",
    "entity_alternate_names": "usdev@aaa.com",
    "outgoing_port": "2675",
    "entityid": "wmid-1607548215055521",
    "name": "bbb_MailBox",
    "entitytype": "Sub-System",
    "entitytype": "Workplace",
    "technology": "O736i85",
    "tenantid": 1000011,
    "cloudprovider": "",
    "satellite": "sat-16107579705752592",
    "resourceid": null,
    "UDetails": {
        "creds": {
            "email": "NA"
        },
        "id": 14,
        "name": "N/A"
    },
    "encryptionKey": "5inqhg7ckj7klk2w4osk0",
    "user": {
        "id": 5,
        "name": "CRI Admin",
        "employeecode": "125",
        "email": "admin@aaa.com"
    },
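A hedged sketch - it is not clear which id is meant (UDetails.id = 14, user.id = 5, or entityid), and the fragment starts mid-object, so the exact path is an assumption that depends on the full event. If the event is valid JSON, spath can address the nested field; otherwise a rex can grab the first numeric id:

<your search>
| spath path=UDetails.id output=ID

or

<your search>
| rex "\"id\":\s*(?<ID>\d+)"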
Hi all, I uploaded some data earlier with the fields task name, execution date, and link. Now I want to add some more data related to the task name: component name and number of components. If I upload the 2nd data set in the form (task name, component name, number of components), will I be able to get all the data together based on the one common field, task name? Does anyone know whether there is a solution for this?
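A minimal sketch, assuming both uploads live as lookup files (hypothetical names tasks.csv and components.csv) sharing a task_name column:

| inputlookup tasks.csv
| lookup components.csv task_name OUTPUT component_name number_of_components

Every task row gains its matching component fields; when a task has several components, lookup returns them as multivalue fields.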