All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I cannot find data in a field named Version in my request. Please help me. See the request below:

|mstats min(cpu_metric.pctIdle) as val WHERE `itsi_entity_type_ta_nix_metrics_indexes` AND CPU="all" by host span=1m
|eval val=100-val
|lookup Serveurs-applications-Document-travail.csv "Nom du serveur" AS host OUTPUTNEW Version
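A common cause is that the host values simply do not match the "Nom du serveur" values in the lookup (case, whitespace, FQDN vs. short name). A minimal troubleshooting sketch, reusing the file and field names from the post, that lists the hosts the lookup fails to match:

|mstats min(cpu_metric.pctIdle) as val WHERE `itsi_entity_type_ta_nix_metrics_indexes` AND CPU="all" by host span=1m
|lookup Serveurs-applications-Document-travail.csv "Nom du serveur" AS host OUTPUTNEW Version
|where isnull(Version)
|stats values(host) as unmatched_hosts

Comparing that list against | inputlookup Serveurs-applications-Document-travail.csv | stats values("Nom du serveur") usually shows where the two sets of names differ.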
Hi, I am new to Splunk so please forgive me. I created a field where, if the hostname contains "*-us*", then region=NA. In search, I enter the query region=NA and I see thousands of events matched, but there are no results in the current time range. I am hoping for assistance with understanding why this is, or whether I am going about this the wrong way. Thank you in advance.
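One thing worth checking: wildcard patterns like "*-us*" only work as search-time terms; inside an eval-based calculated field you need like() or match(). A quick sketch to sanity-check the logic at search time (the index name is a placeholder, and host is assumed to hold the hostname):

index=your_index
| eval region=if(like(host,"%-us%"), "NA", "OTHER")
| stats count by region

If this returns NA rows but region=NA as a search term still does not, the calculated field definition itself (its eval expression, app scope, or permissions) is the next thing to look at.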
Dear AppD Users and Admins, we are working on a business-oriented custom dashboard and can't find how to put the Availability metric on a dashboard. The metric should be shown as a single number, as you can see on the screenshot. I tried all kinds of options - min, max, current, count, and value - and I can't get the number the way it appears on the default dashboard in the screenshot. Maybe some of you have had this question and know how to resolve it.
Hi, I'm running REST queries to retrieve containers that need to be reprocessed based on the values of some of their artifacts. My approach is to query the artifacts REST endpoint in this way: /rest/artifact/?page_size=3000&_filter_name="my artifact of interest"&_filter_update_time__gt="2023-01-01T00:00:00"&_filter_[othercriteria] The thing is, these artifacts are quite heavy, and in this particular case I only need their container ID field, so there is no point in retrieving all the other irrelevant field data. If I were querying a single known artifact I could use the object detail specification documented at https://docs.splunk.com/Documentation/SOARonprem/5.5.0/PlatformAPI/RESTQueryData#Requesting_Object_Detail but I haven't seen any similar way to specify which fields should be retrieved when querying for an object list. Is there any way to do this? Also, is there any way to query artifacts whose associated container has certain properties? Right now I'm doing a massive artifact query, a massive container query, and matching the results in a playbook. That's something that would be trivial and much lighter to do by SQL-querying the underlying PostgreSQL database. Hints about this would be much appreciated.
I have data coming from a single source, but I want to send the events that match a REGEX to one index and all the others that do not match it to another index. I have already tried to change the order of the transforms in the TRANSFORMS_ setting, but it still puts the events in both indexes. This is the content of the props.conf file:

[tmpproxy]
TRANSFORMS_routing1 = CIDR_Routing_matched, CIDR_Routing_others

and this is the content of the transforms.conf file:

[CIDR_Routing_matched]
REGEX = src_host\=(?:10\.10\.10\.\d{1,3}|)
FORMAT = tmp_matched_proxy
DEST_KEY = _MetaData:Index
WRITE_META = true

[CIDR_Routing_others]
REGEX = .+
FORMAT = tmp_others_proxy
DEST_KEY = _MetaData:Index
WRITE_META = true

Is it possible to stop the TRANSFORMS_ list in props.conf after the first good match?
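A sketch of the ordering pattern usually used for this, assuming the goal is "everything to tmp_others_proxy unless the 10.10.10.x regex matches". Transforms in a class run in order and each match overwrites _MetaData:Index, so the catch-all goes first and the specific rule last. Two things in the original are worth noting: the documented setting name uses a hyphen (TRANSFORMS-<class>), and the empty alternative at the end of the posted regex makes it match every event.

# props.conf
[tmpproxy]
TRANSFORMS-routing = CIDR_Routing_others, CIDR_Routing_matched

# transforms.conf
[CIDR_Routing_others]
# catch-all: route everything to the "others" index first
REGEX = .
DEST_KEY = _MetaData:Index
FORMAT = tmp_others_proxy

[CIDR_Routing_matched]
# events whose src_host is in 10.10.10.0/24 overwrite the destination set above
REGEX = src_host=10\.10\.10\.\d{1,3}
DEST_KEY = _MetaData:Index
FORMAT = tmp_matched_proxy

Because the later transform overwrites the index for matching events, there is no need to stop after the first match.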
Indicates which Business Day the transaction is considered to belong to. Weekends and public holidays are often rolled forward to the next working day, but it can vary according to the institution. This is stored in the Java internal time format (epoch milliseconds). An example value is 1585180800000, which represents the date 26/03/2020. This is the simple Java function; I need a Splunk solution to convert the long date to a (YYYY)MMDD value:

long julianDateTime = 1585180800000l;
DateFormat fmt1 = new SimpleDateFormat("yyyy-MM-dd");
String dateTime = fmt1.format(new Date(julianDateTime));
System.out.println(dateTime);
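A minimal SPL sketch of the equivalent conversion, assuming the value sits in a field called julianDateTime (a placeholder name) and UTC output is acceptable; strftime() expects epoch seconds, hence the division by 1000:

| makeresults
| eval julianDateTime=1585180800000
| eval business_day=strftime(julianDateTime/1000, "%Y%m%d")

For the example value this yields 20200326; use "%Y-%m-%d" for the hyphenated form the Java snippet prints.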
Hi, I have a dashboard in Dashboard Studio, but I am unable to use the magnifying glass to see the search in View mode. The dashboard has Read and Write permissions for everyone, so I do not know what the problem may be. Can you please help? Thanks
Hi, I have been tasked to investigate what is needed to receive SAP logs in Splunk. The first thing I find when I make my first queries on Google is that there is a connector called "SAP PowerConnect for Splunk", but when I go to https://splunkbase.splunk.com/app/3153 and try to download it, I get a message saying that the download is restricted. I also found this step-by-step guide, and I would like to know whether you think the information is still current, because much of what we find about Splunk on the internet is very old and perhaps obsolete: https://www.wallsec.de/blog/siem-your-sap-security-audit-log-with-splunk#h.p_2Y3sy8TDSHCy And in this last link I see a process, and the truth is that the matter is complex. Solved: How to Splunk the SAP Security Audit Log - Splunk Community
Hello, I'm having a problem where the _time field of events does not match the timestamps inside the actual events. This happened after I rebooted the Splunk server. As you can see from the pictures, before the reboot the timestamp _time matches the time field; after the reboot, _time is 2 hours before the time field. I checked the local Linux server time and the user's Splunk time zone, and they're all OK. Where does Splunk change the time of the events?
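A 2-hour offset that appears after a restart usually points at time-zone interpretation rather than the clock itself. A small diagnostic sketch (index and sourcetype are placeholders) that puts the parsed time, the index time, and the raw event side by side:

index=your_index sourcetype=your_sourcetype
| eval parsed_time=strftime(_time, "%Y-%m-%d %H:%M:%S %z")
| eval indexed_time=strftime(_indextime, "%Y-%m-%d %H:%M:%S %z")
| table _raw parsed_time indexed_time date_zone

If date_zone is empty or unexpected, the sourcetype's TZ and TIME_FORMAT settings in props.conf on the parsing tier are the usual place to look.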
Hello, I need your help on a subject. I want to combine two search results, and I need your help because I have a problem. I tried this:

index="nexthink" sourcetype="st_nexthink_device"
| append
    [| spath "asset.last_boot_duration" output=boot
     | spath "asset.last_logon_duration" output=logon
     | spath "asset.name" output=PC
     | eval demarrage=boot+logon
     | eval date=strftime(now(),"%d/%m/%Y")
     | eval annee_now=mvindex(split(date,"/"),-1)
     | fields demarrage, PC]
    [| search index=easyvista sourcetype=st_easyvista_generic "Identifiant réseau"="PCW-*" Catégorie="Borne tactile" OR Catégorie="All in One" OR Catégorie="Convertible" OR Catégorie="Odinateurs de bureau" OR Catégorie="Ordinateurs portables" OR Catégorie="Ordinateurs format micro" OR Catégorie="Workstation"
     | rename "Identifiant réseau" as PC "Date d'installation" as dd
     | eval annee=mvindex(split(dd,"/"),-1)
     | eval date=strftime(now(),"%d/%m/%Y")
     | eval annee_now=mvindex(split(date,"/"),-1)
     | eval difference=annee_now-annee
     | fields difference, PC]
| table difference PC demarrage

But I have a problem with the "demarrage" field: I can't get it when combining the two searches. I want to mention that I need:
-> from index="nexthink" sourcetype="st_nexthink_device":
-----> asset.last_boot_duration
-----> asset.last_logon_duration
-----> demarrage = BOOT + LOGON
-> from index=easyvista sourcetype=st_easyvista_generic:
-----> date: this year
-----> annee: the year of the installation of the PC
-----> demarrage = BOOT + LOGON

Thank you for your help
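A sketch of one way to line the two sources up per PC, reusing the field names from the post (the Catégorie filters are omitted here for brevity). The spath extractions have to run on the nexthink events themselves; in the posted search they sit inside an append with no base search, so boot and logon are never populated:

index="nexthink" sourcetype="st_nexthink_device"
| spath "asset.last_boot_duration" output=boot
| spath "asset.last_logon_duration" output=logon
| spath "asset.name" output=PC
| eval demarrage=boot+logon
| fields PC demarrage
| append
    [ search index=easyvista sourcetype=st_easyvista_generic "Identifiant réseau"="PCW-*"
      | rename "Identifiant réseau" as PC, "Date d'installation" as dd
      | eval annee=tonumber(mvindex(split(dd,"/"),-1))
      | eval difference=tonumber(strftime(now(),"%Y"))-annee
      | fields PC difference ]
| stats values(demarrage) as demarrage, values(difference) as difference by PC

The final stats collapses the appended rows so each PC ends up with both demarrage and difference on one line.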
Hi folks, I have a question. I have 2 HFs and I have to configure a HEC source, and I would like to balance the data across the two HFs. Do you know the best practices to do this? Do I have to create the same input with the same token on both HFs and use a load balancer in front of them? Thanks in advance
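One common pattern is exactly what you describe: the same HEC input (same token) defined on both heavy forwarders, with a load balancer in front of the HEC port. A minimal inputs.conf sketch, with the token value, index, and sourcetype as placeholders:

# inputs.conf, identical on both heavy forwarders
[http]
disabled = 0
port = 8088

[http://my_hec_input]
disabled = 0
token = 11111111-2222-3333-4444-555555555555
index = main
sourcetype = my_sourcetype

Keeping the token identical means clients only need one URL (the load balancer's) and either HF can accept any request.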
Hello everyone, I have this SPL I am using:

index=abcde*
| eval logtype = if(match(_raw,".*?LTStamp.*?ConnID.*?Exp"),"browser"," ")
| eval logtype = if(match(_raw,".*?MT.*?CTime.*?MBy"),"admin"," ")
| eval logtype = if(match(_raw,".*?LTStamp.*?Customer.*?SID.*?InReason"),"useractivity"," ")
| eval logtype = if(match(_raw,".*?LTStamp.*?Cust.*?SID.*?SessType"),"appconnector"," ")
| eval logtype = if(match(_raw,".*?LTStamp.*?Customer.*?Uname.*?SID"),"userstatus"," ")

When I use this in a search, the new field "logtype" is created, but the field value is just empty with a count, and it only takes the first eval statement and omits the rest. If I use only one eval statement, for example the third one, | eval logtype = if(match(_raw,".*?LTStamp.*?Customer.*?SID.*?InReason"),"useractivity"," "), it gives me the value "useractivity" against the newly created "logtype" field. Now, my question is how I can join all these different eval statements into a single eval statement using the condition I have used in the SPL above, [eval logtype = if(match,(regex), "X"," ")]. Note: the regexes (.*?LTStamp.*?ConnID.*?Exp) used in the match condition are hardcoded from the events we receive in Splunk. Or can we use another condition such as case, like, etc., so that I can get all these field values (browser, adminlogs, useractivity, appconnector and userstatus) under the "logtype" field, like I mention below?

logtype          count    %
browser          xx       xx%
adminlogs        xx       xx%
useractivity     xx       xx%
appconnector     xx       xx%
userstatus       xx       xx%

Hope the above question makes sense; any help on this will be much appreciated. Thanks!
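A sketch of the single-eval version using case(), with the regexes copied from the post and a catch-all value of "other" added as an assumption. case() evaluates its branches in order, so the order matters when patterns overlap:

index=abcde*
| eval logtype=case(
    match(_raw,".*?LTStamp.*?ConnID.*?Exp"), "browser",
    match(_raw,".*?MT.*?CTime.*?MBy"), "admin",
    match(_raw,".*?LTStamp.*?Customer.*?SID.*?InReason"), "useractivity",
    match(_raw,".*?LTStamp.*?Cust.*?SID.*?SessType"), "appconnector",
    match(_raw,".*?LTStamp.*?Customer.*?Uname.*?SID"), "userstatus",
    true(), "other")
| stats count by logtype
| eventstats sum(count) as total
| eval percent=round(count*100/total,2)
| fields logtype count percent

The chained if() version in the post never works because each later eval overwrites the logtype value set by the earlier ones.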
Hello, I'm new to Splunk (internship) and couldn't find an answer. I need a way to filter my search. I'm currently using a ".... | ... | stats count by RequestPath" search. The problem is that the "RequestPath" can contain variable/random data at the end. Example:
x/y/first
x/y/second/randomText
x/y/second/randomText
x/y/third
There are millions of outputs and I would like to filter them so I only keep:
x/y/first
x/y/second
x/y/third
Thanks
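A sketch using replace() to keep only the first three path segments before counting, assuming the variable part always starts at the fourth segment (the index name is a placeholder; adjust the pattern if your paths are deeper):

index=your_index
| eval RequestPathBase=replace(RequestPath, "^([^/]+/[^/]+/[^/]+).*$", "\1")
| stats count by RequestPathBase

For "x/y/second/randomText" this keeps "x/y/second", while shorter paths like "x/y/first" pass through unchanged.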
I have a dashboard which receives a token named "time" by drilldown, which stores a specific time in epoch format. Now I want the searches in my dashboard to have a time range based on this epoch value. I tried to use this token in "earliest" and "latest", i.e. <earliest>$time$</earliest>. It worked only when I put the token as-is, but not with any kind of simple arithmetic like <earliest>$time$ - 100000</earliest>. How can I use my epoch token to set the time ranges in my panels?
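A sketch of one common workaround: do the arithmetic with <eval> at the point where the drilldown sets $time$, and use the derived tokens in the panel's time range (time_earliest and time_latest are made-up token names, and the offsets are just examples):

<drilldown>
  <set token="time">$click.value$</set>
  <!-- derived tokens: simple arithmetic is allowed inside <eval> -->
  <eval token="time_earliest">$click.value$ - 100000</eval>
  <eval token="time_latest">$click.value$ + 100000</eval>
</drilldown>
...
<search>
  <query>index=...</query>
  <earliest>$time_earliest$</earliest>
  <latest>$time_latest$</latest>
</search>

The <earliest>/<latest> elements only substitute tokens, they do not evaluate expressions, which is why the arithmetic has to happen when the token is set.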
Hello Splunk Community! I have an EC2 instance of Windows Server 2022 with Splunk Enterprise (9.0.4) installed. Within a few minutes of installing, all of the processing queues are 100% blocked and it places all indexers in quarantine. It is currently outputting to 3 different indexers, and the only logs it is supposed to send are internal logs. I am 100% positive the indexers are not the issue. I think the problem is potentially a connection issue to these indexers, as I cannot ping these machines. There is no firewall blocking traffic between them, so I'm thinking it might be an issue with a setting in Server 2022 somewhere. I made sure to install through an admin CMD line, and for testing, this EC2 instance has all outbound connections open. Does anyone have any ideas or seen this before? I had this happen on another box, but after messing with the CMD line and different install flags it finally started working; here, it seems like no matter what flags I use, it doesn't work.
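One way to see where the backpressure starts is the standard queue-fill check against the instance's own _internal data, assuming it is searchable locally:

index=_internal source=*metrics.log group=queue
| eval fill_pct=round(current_size_kb*100/max_size_kb,2)
| timechart max(fill_pct) by name

If only the output-side queues are full while earlier queues are empty, it points back at the forwarding path; a keyword search like index=_internal source=*splunkd.log* TcpOutputProc should then show the connection errors to the indexers.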
Hi, can someone help me with an "AAD SSO last reset date" query?
Hi, I am facing an issue while implementing two textboxes in a Splunk dashboard. The requirement is to make the submit button work in the following scenarios:
1. Either of the textboxes is empty
2. Both of the textboxes are not empty
The submit button is not working as expected. I am attaching the code here. Please have a look and let me know what's going wrong.

<row>
  <panel>
    <title>MAKE THE TEXTBOX WORK</title>
    <input type="text" token="text1" searchWhenChanged="true">
      <label>TEXTBOX1</label>
      <default></default>
      <change>
        <unset token="isLifeCycleSubmit"></unset>
        <set token="submitLifeCycle">true</set>
        <unset token="setSubmitLifeCycle"></unset>
        <unset token="form.setSubmitLifeCycle"></unset>
      </change>
    </input>
    <input type="text" token="text2" searchWhenChanged="true">
      <label>TEXTBOX2</label>
      <default></default>
      <change>
        <unset token="isLifeCycleSubmit"></unset>
        <set token="submitLifeCycle">true</set>
        <unset token="setSubmitLifeCycle"></unset>
        <unset token="form.setSubmitLifeCycle"></unset>
      </change>
    </input>
    <input type="time" token="lifeTime">
      <label>Select Time</label>
      <default>
        <earliest>@d</earliest>
        <latest>now</latest>
      </default>
      <change>
        <unset token="isLifeCycleSubmit"></unset>
        <set token="submitLifeCycle">true</set>
        <unset token="setSubmitLifeCycle"></unset>
        <unset token="form.setSubmitLifeCycle"></unset>
      </change>
    </input>
    <input id="submitLifeCycle" type="link" token="submitLifeCycle" searchWhenChanged="true" depends="$submitLifeCycle$">
      <label></label>
      <choice value="true">Submit</choice>
      <change>
        <set token="isLifeCycleSubmit">true</set>
        <unset token="submitLifeCycle"></unset>
        <unset token="form.submitLifeCycle"></unset>
        <set token="setSubmitLifeCycle">true</set>
      </change>
    </input>
    <input id="setSubmitLifeCycle" type="link" token="setSubmitLifeCycle" searchWhenChanged="true" depends="$setSubmitLifeCycle$">
      <label></label>
      <choice value="true">Submit</choice>
    </input>
  </panel>
</row>
<row>
  <panel>
    <table>
      <title>QUERY</title>
      <search>
        <query>index=abc sourcetype ...|eval isSubmit=$isLifeCycleSubmit$</query>
      </search>
    </table>
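For comparison, a much smaller pattern that tolerates empty textboxes, assuming the goal is simply that Submit always works and an empty box means "match everything": give each textbox a wildcard default and rely on the form-level submit button instead of hand-rolled link inputs.

<fieldset submitButton="true" autoRun="false">
  <!-- an empty box keeps the default "*" so the search token is never blank -->
  <input type="text" token="text1">
    <label>TEXTBOX1</label>
    <default>*</default>
  </input>
  <input type="text" token="text2">
    <label>TEXTBOX2</label>
    <default>*</default>
  </input>
  <input type="time" token="lifeTime">
    <label>Select Time</label>
    <default>
      <earliest>@d</earliest>
      <latest>now</latest>
    </default>
  </input>
</fieldset>

Whether this fits depends on what the search does with $text1$ and $text2$; if "*" is a valid match-all value there, the token gymnastics above may not be needed at all.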
I have HTML sections relying on some custom CSS in a dashboard, and I can make them look great in either dark or light mode like so:

<form version="1.1" stylesheet="foobar_light.css" theme="light">

Or:

<form version="1.1" stylesheet="foobar_dark.css" theme="dark">

I would ideally like not to specify which mode the dashboard should be viewed in and leave it to whatever the user's preference is. The problem is that dark mode with the light CSS looks awful/unreadable, and vice versa. Is there any way I can get Splunk to choose the right CSS depending on the user's theme preference?
After the update to v7.1 of Splunk ES, in the Incident Review channel, when selecting events and choosing Edit Selected, it presents a popup/overlay window where we can change the Status (Analyzing, Closed, etc.) and assign ourselves as the Owner. When clicking on Save Changes, the overlay window does not auto-close, and we have to manually click on the Close button. In the previous version this overlay auto-closed and the Incident Review page refreshed after clicking on Save Changes (or Save). Is there some configuration setting that will once again enable this auto-close after making the Status changes?
Hi, I'm wondering whether the syslog output feature described in the [syslog] stanza of outputs.conf supports TLS encryption. I see no mention of it in the docs.