All Topics



Good morning to all, I have a newbie question. I know I'm missing something simple; I'm wondering if someone could point me in the right direction. I currently use syslog as an input stream and write to the main index. My Cisco applications appear to be working just fine, but I cannot get data into the same tables so that the CIM-type applications can see the data.
I'm running Splunk 8.2.2.1 on a MacBook with Apple Silicon and macOS 12.2. I've installed the Splunk Dashboard Examples app (v8.2.2). When I navigate to the app and then to "Examples", all I get is an empty area below the headline "Examples". I've tried Firefox, Safari, and Chromium. Any ideas?
I have a KPI dashboard with all single-value panels. I'm passing the time token from this dashboard to another dashboard using drilldown options, but I'm also trying to get click.value from a single-value panel and pass it as an input token for another dashboard, and it's not working. Is there any other way to capture the value of a single-value panel and pass it to the other dashboard? Any help is highly appreciated. Thanks.
Hi, I am using the following search on Windows Event Viewer System logs that I extracted for testing:

index="503461" host="hp-laptop" "Sleep Time"

The log looks like this:

Information,4.2.2022 г. 12:55:47,Microsoft-Windows-Power-Troubleshooter,1,None,"The system has returned from a low power state. Sleep Time: ‎2022‎-‎02‎-‎04T10:38:18.391571900Z Wake Time: ‎2022‎-‎02‎-‎04T10:55:46.701556600Z Wake Source: Device -USB Composite Device"

I am trying to calculate the total duration between the two timestamps. Can someone help with the search string? Thank you.
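One way to approach this (a sketch; the field names sleep_time and wake_time are made up for illustration) is to extract both timestamps with rex, convert them to epoch time with strptime, and subtract. Note that the pasted log contains invisible left-to-right marks (U+200E) between the digits, which can break the regex, so they are stripped first:

```
index="503461" host="hp-laptop" "Sleep Time"
| eval clean=replace(_raw, "\x{200E}", "")
| rex field=clean "Sleep Time:\s+(?<sleep_time>\S+)"
| rex field=clean "Wake Time:\s+(?<wake_time>\S+)"
| eval sleep_epoch=strptime(sleep_time, "%Y-%m-%dT%H:%M:%S.%9NZ")
| eval wake_epoch=strptime(wake_time, "%Y-%m-%dT%H:%M:%S.%9NZ")
| eval duration=tostring(wake_epoch - sleep_epoch, "duration")
| table sleep_time wake_time duration
```

tostring(..., "duration") renders the difference as HH:MM:SS; keep the raw seconds instead if you want to aggregate later.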
Does Splunk have an SPL command like punct? The default punct field captures the pattern of the _raw field. Is there a command I can use to get a similar pattern for a custom field instead of _raw?

Example:

description="User: ABC Project: XYZ Company Name: JKLM Short Description: Project is so and so"
description="User: ABC Company Name: JKLM Project: XYZ Employee Level: 7 Short Description: Project is so and so User Designation: Splunk Consultant"
description="User: ABC Project: Jkl Company Name: JKLM Short Description: Project: Automation"

and so on. I cannot use the extract command, because the subfields I want to extract are not in a fixed order and the keys can be 2/3/4/5 words long. The only key-value delimiter I can see is the colon (:), and sometimes users might also type ":" inside certain subfields.
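There is no punct-style command for arbitrary fields, but you can approximate what punct does with an eval replace that strips the alphanumerics and keeps the punctuation skeleton (a sketch; the output will not match punct's exact encoding, such as its truncation to the first characters of the event):

```
... | eval description_punct=replace(replace(description, "[A-Za-z0-9]", ""), " ", "_")
    | stats count by description_punct
```

The inner replace removes letters and digits, the outer one turns spaces into underscores, leaving only the pattern of punctuation for grouping.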
Hi, I need your help. I have a standard query like this:

index=a foo

and I need to return only the records that match a list of values in a CSV (or even in a lookup table). Please note that index=a does not contain a specific field matching the info in the CSV.

Example of CSV:

Name
Andrew
John
Michael

Thanks in advance.
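A common pattern for this (a sketch, assuming the CSV is available as a lookup file called names.csv with a Name column) is to feed the lookup into the base search as a subsearch. Renaming the column to "search" makes Splunk use the values as raw search terms ORed together, which works even though no matching field exists in index=a:

```
index=a foo
    [| inputlookup names.csv
     | fields Name
     | rename Name as search]
```

This only matches events whose raw text contains one of the names; if the values can collide with unrelated text, extract a field first and filter on that instead.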
I am looking for something like this: I have a search string = rubi and want to check for this string's presence in a lookup table = metals.csv

Name        date        region
rubi        12122021    abc
diamond     12122022    def
platinum    12122023    ghi

What would my Splunk query be to show the presence of my search string in the lookup table? Since in the above example rubi is present in metals.csv, my result table should have an extra column Present with status Yes:

Name    Present
rubi    Yes

If the search string is not present, for example searchstring=copper, which is not in metals.csv, then the output table should be:

Name      Present
copper    No

Note: I am entering the search string in a text box on a dashboard and want a result table as above.
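One way to build the Yes/No table (a sketch; $searchstring$ stands for the dashboard's text-box token) is to start from the token itself with makeresults and check it against the lookup:

```
| makeresults
| eval Name="$searchstring$"
| lookup metals.csv Name OUTPUT date
| eval Present=if(isnotnull(date), "Yes", "No")
| table Name Present
```

If metals.csv is defined as a lookup definition rather than a plain lookup file, use that definition's name in the lookup command instead.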
My query is:

index=windows Type=Disk host IN (abc) FileSystem="*" DriveType="*" Name="*" | dedup host, Name | table _time, host, Name | sort host, Name | join type=left host [| search index=perfmon source="Perfmon:CPU" object=Processor collection=CPU counter="% Processor Time" instance=_Total host IN (abc) | convert num(Value) as value num(pctCPU) as value | stats avg(value) as "CPUTrend" max(value) as cpu_utz by host | eval "Max Peak CPU" = round(cpu_utz, 2) | eval "CPUTrend"=round(CPUTrend, 2) | fields - cpu_utz | sort -"Peak CPU" | rename "Max Peak CPU" AS "maxCPUutil" | dedup "maxCPUutil" | table _time, host, "maxCPUutil"] | table host, "maxCPUutil", Name

I have the output below:

host    maxCPUutil    Name
abc     5.59          c:
abc     5.59          E:
abc     5.59          F:

What I want is:

host    maxCPUutil    Name
abc     5.59          C:
                      E:
                      F:
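If the goal is purely cosmetic (showing host and maxCPUutil only on the first row of each group), a sketch that can be appended to the existing search numbers the rows per host with streamstats and blanks out the repeats. Note this turns the blanked fields into display-only strings:

```
... | streamstats count as row by host
    | eval host=if(row=1, host, ""), maxCPUutil=if(row=1, maxCPUutil, "")
    | fields - row
```

An alternative that keeps the values intact is to collapse the drive letters instead: `| stats values(Name) as Name by host, maxCPUutil`, which puts C:, E:, and F: into one multivalue cell on a single row.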
I have an application named "TA-training_samaksh_for_splunk". I have to run the following query:

index="training_samaksh" source="/home/devuser/tutorialdata/www1/access.log" | table ip_address, request_method, time_taken | outputlookup createinapp=true testwritecsv_lookup

transforms.conf has the following lookup defined:

[testwritecsv_lookup]
filename = test.csv

The "test.csv" file always gets created/updated in "/splunk/etc/apps/search/lookups" or "/splunk/etc/users/<username>/TA-training_samaksh_for_splunk/lookups", and not in "/splunk/etc/apps/TA-training_samaksh_for_splunk/lookups", even though I am running the search within the app. Any solution for this?
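outputlookup writes the file wherever the lookup definition it resolves to lives, so behavior like this usually means testwritecsv_lookup is being resolved from the Search app (shared globally) or created as a private user-level object rather than picked up from the TA. A sketch of a possible fix, assuming the stanza should live in the TA: keep the transform in the TA and share it at app level (owner nobody, not exported globally) so the app-local copy is the one the search resolves:

```
# $SPLUNK_HOME/etc/apps/TA-training_samaksh_for_splunk/local/transforms.conf
[testwritecsv_lookup]
filename = test.csv

# $SPLUNK_HOME/etc/apps/TA-training_samaksh_for_splunk/metadata/local.meta
[transforms/testwritecsv_lookup]
owner = nobody
export = none
```

It is also worth checking Settings > Lookups > Lookup definitions for a duplicate testwritecsv_lookup shared from the search app, which would shadow the TA's copy.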
We are using SAP Business Technology Platform (Cloud Foundry) as a PaaS, and we want to drain application logs to the Splunk Cloud Platform. Please provide implementation steps. Currently we use the Kibana service for log monitoring on SAP BTP (Cloud Foundry). Now we want to drain syslog and application logs from SAP BTP (Cloud Foundry) to the Splunk Cloud Platform, and we need all the necessary steps to set up the integration. We are new to Splunk and want to do a simple PoC of the integration setup.
I am new to Splunk, and my use case is to send a file to Splunk so that Splunk can parse it. Can someone please help me with the code to upload a file from my local machine to the Splunk server using the API? I want to automate this task.
Dear All, I need your help. I have a case where I compare transaction data with a lookup file. For example, I have a lookup file account.csv that contains:

Name    AccountNo
Jack    1234
Bobby   4321
Bobby   3214
Donny   7890

and then I have daily transactions like:

Name    AccountNo    Amount
Bobby   4321         1000
Jack    1234         500
Donny   7890         500
Bobby   8888         5000

I want to mark each daily transaction whose Name/AccountNo combination is not in the lookup table account.csv; the marking can be a note or something else. Thanks in advance for your help. Rahmat
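A sketch of one approach (the index name for the transaction data is an assumption): look each transaction's AccountNo up in account.csv and add a note when it has no match:

```
index=transactions
| lookup account.csv AccountNo OUTPUT Name as known_name
| eval note=if(isnull(known_name), "AccountNo not found in account.csv", "ok")
| fields - known_name
| table Name AccountNo Amount note
```

In the sample data, the Bobby/8888 row would get the "not found" note while the other three come back as "ok".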
Good morning everyone. For my customer, I have a Splunk deployment as follows:

1 search head
3 indexers in a cluster
1 monitoring console / license master / master node

I need to integrate our Qualys solution with Splunk, but I'm reading that the technology add-on should be installed on a forwarder. However, we do not have a heavy forwarder. Could I install it on an indexer instead? Would data replication still be available for the qualys index? Thanks in advance, Luca
Hello everyone, I'm trying to get a list of IP addresses from a web page and then put them into a lookup table. My issue is that I can't use mvexpand to put each IP address into its own row. Here is my search:

| curl method=get uri=https://feodotracker.abuse.ch/downloads/ipblocklist_recommended.txt | fields curl_message | rex field=curl_message mode=sed "s/.*#//g" | rex field=curl_message mode=sed "s/DstIP//g" | rex field=curl_message mode=sed "s/^\s+//g"

As a result I get one big block of data in a single row. How can I split it into multiple rows? Thank you all for the support.
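Instead of sed-style cleanup followed by mvexpand, a sketch that pulls the individual addresses straight out of the one big field with rex max_match=0 and then expands the resulting multivalue field (the lookup name feodo_blocklist.csv is made up for illustration):

```
| curl method=get uri=https://feodotracker.abuse.ch/downloads/ipblocklist_recommended.txt
| fields curl_message
| rex field=curl_message max_match=0 "(?<ip>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| mvexpand ip
| table ip
| outputlookup feodo_blocklist.csv
```

max_match=0 collects every match into a multivalue field, so the comment lines and headers in the blocklist are ignored without any sed passes.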
I have JSON that is really an array of values but has been encoded as objects, something like this:

{
  "metrics": {
    "timers": {
      "foo_timer": {
        "count": 1,
        "max": 452603,
        "mean": 452603,
        "min": 452603
      },
      "bar_some_other_timer": {
        "count": 1,
        "max": 367110,
        "mean": 367110,
        "min": 367110
      }
    }
  }
}

I can display this in a table by iterating with foreach, but what I really want to do is search for events where max > 400000 and then display them with the name of the timer; in the example above that would match foo_timer. The timer names can be anything, and their order is not guaranteed. I've tried all sorts of things today and keep coming up short.
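One way to do this without knowing the timer names in advance (a sketch that assumes each timer object serializes a numeric "max" key as in the sample) is to pull out name/max pairs with rex, zip and expand them, and then filter:

```
... | rex max_match=0 "\"(?<timer>[^\"]+)\"\s*:\s*\{[^{}]*\"max\"\s*:\s*(?<timer_max>\d+)"
    | eval pair=mvzip(timer, timer_max, "=")
    | mvexpand pair
    | eval timer=mvindex(split(pair, "="), 0), timer_max=tonumber(mvindex(split(pair, "="), 1))
    | where timer_max > 400000
    | table _time timer timer_max
```

The [^{}]* in the regex keeps each match inside a single timer object, so the outer "metrics" and "timers" keys never pair up with a "max" value.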
Unable to capture the application data when the sample app is running with the Java agent. It worked last week, but for the last two days it has stopped working.
Hi, I need help solving an issue. I have a table (shown below), and I just want to hide the rows whose name contains "Raju". However, if we export this table to CSV, it should export all the results, including the rows with "Raju". Can anyone please help us solve this? Thank you.
Dear Team, I just want to use the simple search below to see which indexes have a zero count for a given day/week/whichever time period:

index=* | stats count by index | where count=0

However, the search is not returning anything, and if I remove the where count=0 it only returns indexes with more than zero events. How do I make sure that indexes with count=0 are included? Thank you. Warm regards.
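stats can only count events it actually sees, so indexes with zero events never make it into the results at all. A sketch of a workaround: get the full list of indexes from the REST endpoint, append it with a zero count, and merge (this requires permission to read /services/data/indexes, and index=* covers only non-internal indexes):

```
| tstats count where index=* by index
| append
    [| rest /services/data/indexes
     | fields title
     | rename title as index
     | eval count=0]
| stats max(count) as count by index
| where count=0
```

Every index appears at least once via the appended zero row, so max(count) stays 0 only for indexes that contributed no events in the search time range.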
I created a trial account. Is it possible to configure Synthetic jobs using a trial account? I can see in the license that it is possible to create them.
I have a data model called Web_Events with a root object called Access. There is a field in Access called 'status_category' with the values "client error", "server error", "okay" or "other". I am trying to list, hour by hour, the count of events whose 'status_category' is "client error" or "server error". So I want to generate a table in the following format:

_time                  client_error_count         server_error_count
2022-01-26:17:30:00    <count of client error>    <count of server error>
2022-01-26:18:30:00    <count of client error>    <count of server error>

Can anyone help me with this? The closest I could achieve was the following:

_time                  Access.status_category    error_count
2022-01-26:17:30:00    server error              2
2022-01-26:18:30:00    client error              6
2022-01-26:18:30:00    server error              7

with the help of this query (status_code is another field which contains HTTP status code values):

| tstats count(Access.status_code) as error_count from datamodel=Web_Events.Access where Access.status_code!=200 earliest="01/26/2022:00:00:00" latest="02/02/2022:23:59:59" BY Access.status_category _time span=1h | table _time, Access.status_category, error_count | sort _time
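Building on that tstats search, a sketch that pivots the categories into columns with xyseries (the column names like client_error_count are produced by the eval; adjust to taste):

```
| tstats count as error_count from datamodel=Web_Events.Access
    where Access.status_code!=200 earliest="01/26/2022:00:00:00" latest="02/02/2022:23:59:59"
    by Access.status_category _time span=1h
| search Access.status_category="client error" OR Access.status_category="server error"
| eval category=replace('Access.status_category', " ", "_") . "_count"
| xyseries _time category error_count
| fillnull value=0
```

xyseries turns each distinct category value into its own column, and fillnull puts a 0 into hours where one of the categories had no events.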