All Topics

Hi, I want to create a Splunk table using multiple fields. Let me explain the scenario. I have the following fields:

Name
Role (multiple roles will exist for each name)
HTTPrequest (there are multiple response codes: 2xx, 3xx, 4xx and 5xx)

When the query is run, my final output should group the data in the format below for every day:

Date       Name    Role       Success  Failed  Total  Failed %
01-Jan-23  Rambo   Team lead  100      0       100    0
01-Jan-23  Rambo   Manager    100      10      110    10
01-Jan-23  King    operator   2000     100     2100   5
02-Jan-23  King    Manager    100      0       100    0
03-Jan-23  cheesy  Manager    100      10      110    10
04-Jan-23  cheesy  Team lead  4000     600     4600   15

What I tried is:

index=ABCD | bucket _time span=1d | eval status=case(HTTPrequest < 400,"Success", HTTPrequest > 399,"Failed") | stats count by _time Name Role status

This produces something like the table below, but I need Success and Failed in two separate columns as shown above, and I also need to add the Failed % and the Total:

Date       Name    Role       HTTPStatus  COUNT
01-Jan-23  Rambo   Team lead  Success     100
01-Jan-23  Rambo   Team lead  Failed      0
01-Jan-23  Rambo   Manager    Success     100
01-Jan-23  Rambo   Manager    Failed      10
01-Jan-23  King    operator   Success     2000
01-Jan-23  King    operator   Failed      200
02-Jan-23  King    Manager    Success     10
03-Jan-23  cheesy  Manager    Success     300
04-Jan-23  cheesy  Team lead  Success     400

I used chart count over X by Y, but that only allows two fields, not more than two. Could you please suggest how to get this sorted?
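A minimal sketch of one way to get there, assuming the field names above and that a 1-day bucket and a rounded percentage are what you want:

index=ABCD
| bucket _time span=1d
| eval status=if(HTTPrequest < 400, "Success", "Failed")
| stats count(eval(status=="Success")) as Success count(eval(status=="Failed")) as Failed by _time Name Role
| eval Total=Success+Failed
| eval failed_pct=round(Failed*100/Total, 2)
| eval Date=strftime(_time, "%d-%b-%y")
| table Date Name Role Success Failed Total failed_pct
| rename failed_pct as "Failed %"

The count(eval(...)) calls split the single status field into two separate columns, which is what chart over/by cannot do once more than two group-by fields are involved.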
Hello, I have a couple of Splunk columns that look as follows:

server:incident:incident#:severity
severity

This object is then fed to another system, which separates it and generates incidents:

server: hostname
incident: category of incident
incident#: the incident number
severity: Critical/Warning/Clear

Example:

serverA:zabbix:123456:Warning   Warning
serverA:zabbix:123456:Critical  Critical

The objective is to keep each incident unique (if Warning, then create a ticket; if Critical, then call out). All works well with the separate Critical and Warning alerts. However, when one Clear is generated, I need to generate two records that look as follows:

serverA:zabbix:123456:Warning   Clear
serverA:zabbix:123456:Critical  Clear

This way, the object that has been sent will get the clear. Is there a way to achieve this? Thanks, David
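One possible sketch of the fan-out, assuming the composite field is called object and the severity field is called severity; mvexpand turns the single Clear event into one record per original severity:

... | where severity=="Clear"
| eval clear_for=mvappend("Warning", "Critical")
| mvexpand clear_for
| rex field=object "(?<object_base>.+):[^:]+$"
| eval object=object_base.":".clear_for
| eval severity="Clear"
| fields object severity

The rex strips the trailing severity from the original object string so it can be rebuilt once with Warning and once with Critical; the field names object_base and clear_for are made up for the sketch.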
The prerequisites indicate that the Splunk DB Connect add-on will not work with systems that are FIPS compliant. Will this change in a future release, and if so, is there a timeframe for that release?
Hi Everyone, I want to plot a chart according to the calendar week. I plotted a timechart like this:

|timechart span=7d distinct_count(Task_num) as Tasks by STATUS

But this doesn't give the exact calendar weeks. I am also keeping this chart's data to the last 3 months. Does anyone have an idea how to plot a bar chart based on calendar weeks? Instead of the date, I want to see the data for the calendar weeks of the last 3 months. I found out from the Splunk community how to get the calendar week, but I am not able to plot a graph out of it:

| eval weeknum=strftime(strptime(yourdatefield,"%d-%m-%Y"),"%V")
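A minimal sketch of one approach, assuming Task_num and STATUS are your real field names and that event _time (rather than a separate date field) drives the week:

index=your_index earliest=-3mon@w latest=@w
| eval weeknum=strftime(_time, "%Y - CW%V")
| chart dc(Task_num) as Tasks over weeknum by STATUS

Using chart ... over weeknum instead of timechart makes each ISO calendar week one x-axis category, so the bars line up with calendar weeks rather than with 7-day windows counted from the search start.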
Hello, I'm trying to create a timechart that compares two date/time ranges. I want to see the values of last Sunday (10.9) between 15:00 and 16:30 and compare them with the values for the same time window on the Sunday of the week before (3.9). How can I do it? Thanks
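One possible sketch, assuming your own index and that the two Sundays are 10 September and 3 September 2023; the older window is shifted forward by exactly one week (604800 seconds) so both land on the same x-axis as separate series:

index=your_index earliest="09/10/2023:15:00:00" latest="09/10/2023:16:30:00"
| eval range="Sunday 10 Sep"
| append
    [ search index=your_index earliest="09/03/2023:15:00:00" latest="09/03/2023:16:30:00"
      | eval range="Sunday 3 Sep"
      | eval _time=_time+604800 ]
| timechart span=5m count by range

The range field name and the 5-minute span are assumptions; swap count for whatever aggregation you actually need.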
Hi Team, I am looking for help creating a search query for my daily report, which runs 3 times a day. We are putting files into a directory that we are monitoring in Splunk. Is there any way we can grab events from only the latest source file? For example:

Index=abc sourcetype=xyz
source=/opt/app/file1_09092023.csv
source=/opt/app/file2_09102023.csv
source=/opt/app/file3_09112023.csv ...

New files can be placed from time to time. I want the report to show only events from the latest file. Is that possible? Thank you
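A minimal sketch of one way to do it, assuming the files all live under /opt/app/ and that "latest" means the most recently indexed file; the subsearch finds that source and feeds it back as a filter to the outer search:

index=abc sourcetype=xyz
    [ search index=abc sourcetype=xyz source="/opt/app/*"
      | stats max(_indextime) as last_seen by source
      | sort - last_seen
      | head 1
      | fields source ]

If "latest" should instead be derived from the date embedded in the file name, the subsearch can sort on the source string (or on a rex-extracted date) rather than on _indextime.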
Hi Splunkers, I have to forward data inside CSV files from an on-prem HF to Splunk Cloud, and I'm facing some issues because the data seems not to be forwarded. Let me share some additional details.

Info about the data:
- Source data are on a cloud instance (Forcepoint) provided by the vendor
- A script has been provided by the vendor to pull data from the cloud
- The script is installed and configured on our Splunk HF
- Data are saved locally on the HF
- Data are in .csv files

Info about the HF configuration:
- We created a new data input under Settings -> Data inputs -> Local inputs -> Files & Directories
- We set as data input the path where the .csv files are saved after script execution
- We set the proper sourcetype and index
- Of course, we configured the HF to send data to Splunk Cloud. We downloaded the file from the Cloud "Universal Forwarder" app and installed it as an app on the HF: outputs.conf is properly configured, and other data are sent to Splunk Cloud without problems (for example, Network inputs go to the Cloud without issues; same for the Windows ones)

Info about the sourcetype and index and their deployment:
- We created a custom add-on that simply provides the sourcetype "forcepoint"
- The sourcetype is configured to extract data from CSV; that means we set the parameter INDEXED_EXTRACTIONS=csv
- We installed the add-on on both the HF and Splunk Cloud
- The index, simply called "web", has been created on both the HF and Splunk Cloud

By the way, it seems that data are not sent from the HF to the Cloud. So, did I forget some steps? Or did I get one of the above steps wrong?
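For reference, a minimal sketch of what the HF-side configuration could look like; the monitor path here is an assumption, while the index and sourcetype names come from the post:

# inputs.conf on the HF (adjust the path to where the script writes the CSVs)
[monitor:///opt/forcepoint/output/*.csv]
index = web
sourcetype = forcepoint
disabled = false

# props.conf on the HF -- with INDEXED_EXTRACTIONS the parsing happens on the
# instance that reads the file, so this stanza must be present on the HF itself
[forcepoint]
INDEXED_EXTRACTIONS = csv

Two things worth double-checking in this kind of setup: that the index "web" really exists on the Cloud side (events sent to a missing index are dropped unless a last-chance index is configured), and whether the script rewrites files with content Splunk has already seen, since the monitor input skips files it considers already indexed.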
Hi, too many hours spent on such a simple question... It is supposed to be a basic thing. I want to present both percentages and regular values in a bar chart (it can be in the tooltip, like it exists in a pie chart). If that is not possible, I would present only percentages but add the "%" symbol (when I tried to add %, it converted the fields to strings and nothing was shown in the chart). I can't add a JS script, as I have no access to the server. This is my query:

| stats sum(CountEvents) by CT
| rename "sum(CountEvents)" as "countE"
| eventstats sum(countE) as Total
| eval perc=round(countE*100/Total,2)
| chart sum(perc) as "EventsPercentages[%]" over CT

Thanks a lot
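One possible sketch that keeps both series numeric, so the bar chart still renders and the raw count can ride along as a second series or a chart overlay (field names are the ones from your query):

| stats sum(CountEvents) as countE by CT
| eventstats sum(countE) as Total
| eval perc=round(countE*100/Total, 2)
| rename perc as "EventsPercentages[%]"
| fields CT countE "EventsPercentages[%]"

With the percentage plotted as bars and countE set as a chart overlay in the visualization options, both values appear on the same panel. Appending a literal "%" to the value does turn the field into a string, which is why the chart went blank; keeping the number numeric and putting "%" in the column or axis label is the usual workaround.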
In the System Center dashboard, only *NIX system data is available, not Windows system data. I've already installed the Splunk Add-on for Microsoft Windows and have run searches against the Inventory and Performance data models successfully. When I check the search behind the System Center dashboard, it refers to All_Inventory.OS.os; when I run this search, it only returns the *NIX systems. What can I do to populate the data from Microsoft Windows into the System Center dashboard? I've found a link, but it seems to be for quite an old version: https://community.splunk.com/t5/Splunk-Enterprise-Security/Enterprise-Security-System-Center-or-Update-Center-only-have/m-p/136434
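One quick check that might help narrow it down (the tag names follow the CIM Inventory/OS dataset, and WinHostMon is the host-monitoring sourcetype from the Windows add-on; both are assumptions about your environment):

index=* sourcetype=WinHostMon tag=inventory tag=os
| stats count by host sourcetype

If this returns nothing while raw WinHostMon events exist, the Windows events are probably not getting the inventory/os tags (or are not in an index the data model searches), which would explain why All_Inventory.OS.os only shows the *NIX hosts.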
May I ask what is causing this?
We are running Splunk ES and trying to provide log search and app interfaces for each company. Let's call them CompanyA, CompanyB and CompanyC. Each company has to see its own data and also its own notable events in ES. As the holding company, we need to access and see all data. What is the best way to achieve this goal? Please advise.
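A rough sketch of the usual starting point (one index per company plus restricted roles; every index and role name below is an assumption, and notable-event separation in ES typically needs additional restrictions on the notable index or on the correlation searches):

# authorize.conf sketch
[role_companya_analyst]
srchIndexesAllowed = companya_*
srchIndexesDefault = companya_*

[role_holding_analyst]
srchIndexesAllowed = companya_*;companyb_*;companyc_*
srchIndexesDefault = companya_*;companyb_*;companyc_*

The holding-company role simply gets all three index patterns, while each company role is limited to its own.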
How do I calculate the centroid of each cluster after using the KMeans clustering algorithm? I have tried the following, but neither of them worked:

1 - | inputlookup iris.csv
    | fit KMeans k=3 petal*
    | eval point_size = 1
    | appendpipe
        [| stats mean(petal*) as petal* by cluster
         | eval species = "Centroid: ".cluster
         | eval point_size = 2]
    | fields species petal* point_size

2 - showcentroid = t
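For what it's worth, a minimal sketch that just computes one centroid row per cluster (iris.csv and the petal* fields are taken from your example; the wildcard rename in stats keeps the original field names):

| inputlookup iris.csv
| fit KMeans petal* k=3
| stats avg(petal*) as petal* by cluster
| eval label="Centroid ".cluster

If you want the centroids drawn on the same scatter plot as the points, the appendpipe variant you tried is the right idea; fit KMeans adds the cluster field to every row, and the stats-by-cluster step is where the centroid coordinates actually come from.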
I have configured a Splunk alert with the trigger condition set to "Trigger for each result", but every time I only get the alert for one of those results. Any idea why? Below is a screenshot of the alert, and below that is a sample result from the alert query.
Hello Splunk Family, I am looking for help on making a graph in Splunk. I am trying to monitor the number of transactions by different method names with different objects, separated by date. Here is an example of the data I have:

Date   Object Type  Object Name      Total Transactions
Aug 1  LibPush      Root             15
Aug 1  LibPush      ProcessQueue     12
Aug 1  LibPush      Failed           2
Aug 1  Company      ChangeConfigSet  34
Aug 1  Company      CleanUpMsg       15
Aug 1  Company      GetMsg           32
Aug 1  Company      SendMSG          13
Aug 2  LibPush      Root             15
Aug 2  LibPush      ProcessQueue     12
Aug 2  LibPush      Failed           2
Aug 2  Company      ChangeConfigSet  34
Aug 2  Company      CleanUpMsg       15
Aug 2  Company      GetMsg           32
Aug 2  Company      SendMSG          45
Aug 3  LibPush      Root             15
Aug 3  LibPush      ProcessQueue     12
Aug 3  LibPush      Failed           2
Aug 3  Company      ChangeConfigSet  34
Aug 3  Company      CleanUpMsg       15
Aug 3  Company      GetMsg           32
Aug 3  Company      SendMSG          45

The only thing is that there are a lot of Object Types and Object Names, so maybe the top 10 object types per day. (I also made a rough drawing of what I want.) Here is the code I have so far:

[mycode] | bin _time span=1d | chart count(indexid) over actionelementname by actionelementtype

but it is missing the date and it is not stacked. Any help would be deeply appreciated!
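A minimal sketch of one way to get a stacked, per-day chart (field names are the ones from your search; combining type and name into a single series label, and capping the series at 10, are assumptions about what you want):

[mycode]
| eval series=actionelementtype.":".actionelementname
| timechart span=1d limit=10 useother=f count(indexid) by series

With the visualization set to a stacked column chart, each day becomes one column and the top 10 type:name combinations become the stacked segments; raising or lowering limit changes how many series survive.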
I have a CSV file which has data like this, and I am using

| inputlookup abc.csv | search _time >= "2023-09-10"

but it is not showing any data:

_time       client   noclient
2023-09-10  iphone   airpord
2023-09-11  samsung  earbud

How do I get the data only for the selected date with a query like the one above?
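A minimal sketch, assuming the _time column in the CSV is a plain string in YYYY-MM-DD format (if it already holds epoch seconds, drop the strptime calls):

| inputlookup abc.csv
| where strptime(_time, "%Y-%m-%d") >= strptime("2023-09-10", "%Y-%m-%d")
    AND strptime(_time, "%Y-%m-%d") < strptime("2023-09-11", "%Y-%m-%d")

For an exact single-day match on a string column, | where _time=="2023-09-10" also works. Note that the original search mixes quote characters around the date, which breaks the syntax, and search-style >= comparisons generally only behave as expected on numeric values, which is why the where/strptime approach is more reliable here.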
In my organizational environment, there are a few alerts in the enabled state. I would like to keep an inventory of all the enabled alerts and their important fields on GitHub. Is there a way to automate the transfer to GitHub without requiring manual effort? All the alerts are on Splunk Cloud.
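For the inventory part, one possible sketch is a scheduled REST search like the one below (the filter for what counts as an "alert" is an assumption; pushing the exported list to GitHub would still need something external, such as a small CI job or script calling the Splunk REST API, since Splunk Cloud cannot run git itself):

| rest /servicesNS/-/-/saved/searches splunk_server=local
| search disabled=0
| where alert_type!="always" OR 'alert.track'=="1"
| table title eai:acl.app eai:acl.owner cron_schedule actions search

Scheduling this and sending the result somewhere reachable (email, outputlookup, or a webhook) gives an automatically refreshed inventory that the external job can commit to the repository.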
So I'm working to implement a clear-filters button on a Simple XML dashboard. I'm unable to add any custom JavaScript, so I've been doing all of it within the XML. I have the functionality I'm looking for using a link list input with condition changes that unset the tokens back to their defaults, but I'm having issues with my submit button lining back up. No matter what I do, I can't get the submit button to come in line with the "Clear Filters" button. If anyone could help me get the Submit button in line with my link list input, that would be greatly appreciated. I have some instance-agnostic XML code below so you can see what I'm talking about. Thanks!

<form theme="dark">
  <label>Clear Filters</label>
  <fieldset submitButton="true">
    <input type="multiselect" token="Choice">
      <label>Choices</label>
      <choice value="*">All</choice>
      <choice value="Choice 1">Choice 1</choice>
      <choice value="Choice 2">Choice 2</choice>
      <choice value="Choice 3">Choice 3</choice>
      <default>*</default>
      <initialValue>*</initialValue>
    </input>
    <input type="link" token="Clearer" searchWhenChanged="true" id="list">
      <label></label>
      <choice value="Clear">Clear Filters</choice>
      <change>
        <condition value="Clear">
          <unset token="form.Choice"></unset>
          <unset token="form.Clearer"></unset>
        </condition>
      </change>
    </input>
    <html>
      <style>
        #list button{
          color: white;
          background: green;
          width:50%;
          display: inline-block;
        }
      </style>
    </html>
  </fieldset>
  <row>
    <panel>
      <single>
        <search>
          <query>| makeresults | eval Message="Thanks for the help!" | table Message</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="drilldown">none</option>
      </single>
    </panel>
  </row>
</form>
Hello All, I need to monitor a MongoDB replica set for its status. For this I have to run the rs.status command in the admin DB for MongoDB, which gives me JSON output; I then need to look at the replica set status in that output and trigger the alert. I would appreciate any pointers on this, and if someone could take a look at the code below and provide feedback, that would be helpful. This one is for triggering the alert based on a condition; I am trying to use case for this.

index=XXXX
| eval rs_status=case(status == "Primary", "OK", status == "ARBITER", "OK", status == "SECONDARY", "OK", status == "STARTUP", "KO", status == "RECOVERING", "KO" status == "STARTUP2", "KO", status == "UNKNOWN", "KO", status == "DOWN", "KO", status == "ROLLBACK", "KO", status == "REMOVED", "KO")
| sort - _time
| where status="KO"

Let me know if you see any issues here.

Regards
Amit
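Two things stand out in that search, shown corrected in this sketch: the case() list is missing a comma after the RECOVERING branch, and the final filter presumably needs to test the computed rs_status rather than the raw status (which is never "KO"). Note also that case() string matches are case-sensitive, so the literals must match exactly what the JSON contains (rs.status() normally reports "PRIMARY" in upper case):

index=XXXX
| eval rs_status=case(
    status=="PRIMARY",    "OK",
    status=="ARBITER",    "OK",
    status=="SECONDARY",  "OK",
    status=="STARTUP",    "KO",
    status=="RECOVERING", "KO",
    status=="STARTUP2",   "KO",
    status=="UNKNOWN",    "KO",
    status=="DOWN",       "KO",
    status=="ROLLBACK",   "KO",
    status=="REMOVED",    "KO",
    true(),               "KO")
| sort - _time
| where rs_status=="KO"

The trailing true() branch is an assumption that any unrecognized state should also alert; drop it if unknown states should be ignored.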
I am looking for a Splunk query that gives me all the use-cases in both the enabled and disabled state.
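If "use-cases" here means ES correlation searches, one possible sketch is the REST query below (the action.correlationsearch.enabled filter is an assumption about how the content is tagged in your environment; for plain alerts and reports, drop that filter):

| rest /servicesNS/-/-/saved/searches splunk_server=local
| search action.correlationsearch.enabled=1
| eval state=if(tonumber(disabled)==1, "disabled", "enabled")
| table title eai:acl.app state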
Hi, I just deployed the latest version 2 of SC4S, and I am sending it syslog events from our Stormshield firewall. I checked and didn't see a specific source for this firewall brand. The box is capable of sending logs in RFC 5424 format over UDP/514. I did not configure a custom filter for it, and the logs are automatically recognized as UNIX OS syslog events, which is wrong: they are indexed in osnix instead of netfw. I would like to create a filter based on the source host, but I can't find any examples in the official GitHub documentation. For version 1 there are some, but I am not sure whether they apply to version 2: https://splunk.github.io/splunk-connect-for-syslog/1.110.1/configuration/#override-index-or-metadata-based-on-host-ip-or-subnet-compliance-overrides Any suggestion? Many thanks