All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


Hi all, kindly share your thoughts on the questions below.
1. How do I create a high-level use case to detect malicious activity on a Splunk indexer that sits in a weak network? Which log sources should be considered?
2. How do I create a high-level use case to detect malicious activity in the base OS of the Splunk host? Which log sources should be considered?
Thank you.
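A possible starting point, sketched under assumptions (the index and field names below are Splunk defaults and thresholds are illustrative), is to watch Splunk's own _audit index for repeated failed logins to the indexer:

```spl
index=_audit action="login attempt" info=failed
| stats count AS failures BY user, clientip
| where failures > 5
```

For the base OS layer, the same pattern applies to authentication logs (e.g. a `linux_secure` or Windows Security sourcetype), keying on failed-authentication counts per source IP and account.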
Hi Community, I was trying to pull the logs in the format _time, src, dest, src_port, dest_port using the stats command, but I was not able to get the source port and destination port details in the results. Please help me out with the query.
Query used:
index="notable" search_name="XXXXXXXXXXXXXX" | stats count by _time app src dest src_port dest_port
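One likely explanation: `stats ... by` silently drops any event in which one of the by-fields is null, so if src_port or dest_port is not extracted, the whole row disappears. A sketch of a fix (assuming the ports are simply missing on some events rather than named differently):

```spl
index="notable" search_name="XXXXXXXXXXXXXX"
| fillnull value="unknown" src_port dest_port
| stats count BY _time, app, src, dest, src_port, dest_port
```

If the port fields are never extracted at all, check the field extractions for the notable events first.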
Hello, the query below in my wmi.conf file is not returning any events, but other queries are working. Please suggest if anything is wrong.
[WMI:Services]
interval = 60
wql = SELECT Name, State, Status FROM Win32_Service WHERE (Name = '*DynamicsNav*' OR Name = '*SQL*' OR Name = '*MsDts*')
disabled = 0
index="myIndex"
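The likely culprit (an inference from WQL semantics, not confirmed in the post) is the wildcard syntax: WQL does not treat `*` as a wildcard, and `=` performs an exact match, so those predicates never match anything. Pattern matching in WQL uses `LIKE` with `%`. A corrected stanza might look like:

```ini
[WMI:Services]
interval = 60
wql = SELECT Name, State, Status FROM Win32_Service WHERE (Name LIKE '%DynamicsNav%' OR Name LIKE '%SQL%' OR Name LIKE '%MsDts%')
disabled = 0
index = myIndex
```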
Hello, I have a case where I share the UFs with the Splunk SIEM solution, but I work for another project collecting Unix/database log details. I have no access to the SIEM, and there is basically little chance of reusing the data from there for our purpose. So I would like to collect, for example, /var/log/messages from the Unix/VM machines and send it to my own indexer. I thought I would create a custom app, say called VARLOG, consisting of inputs.conf and outputs.conf, and forward /var/log/messages to my Splunk. The questions that come to mind are:
- How does it actually work when there are multiple inputs.conf/outputs.conf files in different apps on the forwarder?
- Is it possible to have it that way at all? Would my inputs.conf/outputs.conf be valid only for my VARLOG app, since they sit in the corresponding app folder on the forwarder? Or will the inputs/outputs files be merged by the forwarder based on the precedence rules, so that I really need to be careful about what goes where?
In short: how would I take /var/log/messages and forward it somewhere else when it is already being collected by another app?
Kind regards, Kamil
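For context: configuration files on a forwarder are not scoped to the app that contains them; at startup, all inputs.conf and outputs.conf files are merged according to Splunk's precedence rules. Routing a single input to a different destination is therefore usually done per stanza with `_TCP_ROUTING`. A sketch, with the output group name and indexer address as placeholders:

```ini
# VARLOG/local/inputs.conf
[monitor:///var/log/messages]
index = varlog
_TCP_ROUTING = varlog_indexers

# VARLOG/local/outputs.conf
[tcpout:varlog_indexers]
server = my-indexer.example.com:9997
```

Note that stanzas for the same monitored path merge across apps under the precedence rules, so check what other apps already define for /var/log/messages (e.g. with `splunk btool inputs list --debug`).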
Hi, the following SPL returns records to me as shown below.

index="uf_basickpi" host!=DS-* (sourcetype="CPU" counter="% Processor Time") OR (sourcetype="Memory" counter="Available MBytes") OR (sourcetype="DiskStuff" counter="% Free Space" instance=C:) | stats latest(Value) as Value by host, counter | eval "CPU Time" = if(counter="% Processor Time",Value,0) | eval "RAM Available" = if(counter="Available MBytes",Value,0) | eval "C Free Space" = if(counter="% Free Space",Value,0) | table host, "CPU Time", "RAM Available", "C Free Space"

Rows 1, 2 and 3 are from the same server; rows 4, 5 and 6 are from the second server. What I would like is a single row per server with the three values. What would be the best way to do this?
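Since `stats ... by host, counter` produces one row per host/counter pair, the later evals can never merge the rows. One sketch of a fix is to fold the per-counter logic into the stats call itself, so each host collapses to a single row:

```spl
index="uf_basickpi" host!=DS-* (sourcetype="CPU" counter="% Processor Time") OR (sourcetype="Memory" counter="Available MBytes") OR (sourcetype="DiskStuff" counter="% Free Space" instance=C:)
| stats latest(eval(if(counter="% Processor Time", Value, null()))) AS "CPU Time"
        latest(eval(if(counter="Available MBytes", Value, null()))) AS "RAM Available"
        latest(eval(if(counter="% Free Space", Value, null()))) AS "C Free Space"
        BY host
```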
I installed the TA on a sandbox standalone machine with Splunk 7.3.3. If I try to configure anything in the "Configuration" view, the "Accounts" tab just shows an animated circle and nothing else. A restart of splunkd shows the following message:
Unable to initialize modular input "microsoft_defender_atp_alerts" defined in the app "TA-MS_Defender": Introspecting scheme=microsoft_defender_atp_alerts: script running failed (exited with code 1)
I reviewed all permissions on the TA and elevated permissions to read/write/execute for the splunk user and "everybody", with no effect. This installation is running on a Windows Server 2016 box. Does anybody have an idea how to fix this?
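When a modular input fails introspection with exit code 1, the script's stderr usually lands in splunkd.log, which often names the real cause (a missing Python dependency, for example). A hedged first step is to pull those lines (source path assumed to be the default):

```spl
index=_internal source=*splunkd.log* microsoft_defender_atp_alerts (ERROR OR WARN)
```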
I'm struggling to extract the underlined items (see attached screenshot) as a field named RUN NAME.
Hi all, I'm trying to build a (categorical) choropleth map where the 3 ranges show specific colors. Part of my code is:
<query>... | eval count=case(count<=-100,"-100",(count>=-100 AND count<=50),"-100 to -50",count>50,">50") | sort + count | geom geo_countries featureIdField=country</query>
<option name="mapping.type">choropleth</option>
<option name="mapping.choroplethLayer.colorBins">3</option>
<option name="mapping.choroplethLayer.colorMode">categorical</option>
<option name="mapping.seriesColors">{"-100":#ff9900,"-100 to 50":#ffcc00,">50":#fff0b3}</option>
However, I'm just seeing the countries colored black, regardless of which bin they fall into. Without specifying "mapping.seriesColors", the countries are colored automatically by Splunk. Am I using the seriesColors option wrongly? Thank you.
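Two things stand out (both inferences, not confirmed fixes): the hex values in `mapping.seriesColors` are not quoted, so the option value is not valid JSON, and the category label "-100 to 50" does not match the "-100 to -50" string produced by the `case()` call; the labels must match the series names exactly. A corrected option might look like:

```xml
<option name="mapping.seriesColors">{"-100":"#ff9900","-100 to -50":"#ffcc00",">50":"#fff0b3"}</option>
```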
Our data input contains two timestamp fields, creation_time and modification_time, both formatted in line with ISO 8601 (yyyy/mm/dd hh:mm:ss.ms). Splunk parses modification_time as _time but, in doing so, it applies the system-default timestamp format, in our case the British one (dd/mm/yyyy hh:mm:ss.ms). Is there any way that we can either:
- change the timestamp format of _time (not "eval time = _time" etc.) so that they match? or
- hide or replace _time in search results, dashboard table panels, etc., so that we can use the original modification_time field instead?
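For the display side: _time is stored as an epoch and only rendered in a locale format, so it can be reformatted at search time without touching the stored value. A sketch (index and sourcetype names are placeholders):

```spl
index=your_index sourcetype=your_sourcetype
| fieldformat _time = strftime(_time, "%Y/%m/%d %H:%M:%S.%3N")
```

`fieldformat` changes only how the value is rendered in results and table panels; the underlying epoch, and therefore sorting and time-range behavior, is untouched.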
Hi, we have a requirement to monitor Azure logs from Splunk. For that we need to complete the following steps:
1. Configure a Storage Account in Microsoft Cloud Services.
2. Connect to your Azure Storage account with the Splunk Add-on for Microsoft Cloud Services.
3. Configure the inputs.conf file.
https://docs.splunk.com/Documentation/AddOns/released/MSCloudServices/Configureinputs5

Step 1 (configure a Storage Account in Microsoft Cloud Services) is completed:
Storage account name: aeadsplunklog
key1: +e7xkNRz5S8XBgnX1IEfMHiJqWrjpuxu2HQg30jmhe/EvjLOR+BoK5thr2aBfPanNkAINgqXuphMbGxNwdl7uA==

For step 2, we have configured mscs_storage_accounts.conf under /opt/splunk/etc/apps/Splunk_TA_microsoft-cloudservices/local:
[root@indexer server]# cat mscs_storage_accounts.conf
account_name = aeadsplunklog
account_secret = +e7xkNRz5S8XBgnX1IEfMHiJqWrjpuxu2HQg30jmhe/EvjLOR+BoK5thr2aBfPanNkAINgqXuphMbGxNwdl7uA==
account_secret_type = 1
account_class_type = 1

Restarted Splunk. Now, how can I validate that the Azure storage account is connected with the Splunk Add-on for Microsoft Cloud Services?
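Two things worth checking (both hedged suggestions, not confirmed from the post): the .conf file as shown has no stanza header, and accounts in mscs_storage_accounts.conf are normally defined under a named stanza, for example:

```ini
# mscs_storage_accounts.conf - the stanza name is a placeholder
[my_storage_account]
account_name = aeadsplunklog
account_secret = <key1 value>
account_secret_type = 1
account_class_type = 1
```

For validation, the add-on writes its own logs to _internal, so a search such as `index=_internal source=*splunk_ta_microsoft-cloudservices* (ERROR OR WARN)` should show whether the connection succeeds, and once an input is configured, a search against your target index confirms data is arriving.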
Hi, this is absolutely strange. I have a search which writes its output to a lookup file. If I run this search in the search head, it writes the output to the lookup file, BUT if I save this search as an alert and run it, it does not write to the lookup file, even though the alert is triggered and the search has run successfully. Any ideas? Thanks.
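Scheduled alerts run under the alert owner's user and app context, so the usual suspects are lookup write permissions or an outputlookup resolving to a different app's copy of the file. A hedged first check is the scheduler's view of the run (the alert name is a placeholder):

```spl
index=_internal sourcetype=scheduler savedsearch_name="your_alert_name"
| table _time, status, result_count, run_time
```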
Hi, I have a lookup file like this - users:
User1
User2
User3
User4
...

I need to count the events by user:

index=myindex
| stats count as count by user
| inputlookup append=true userlist.csv
| fillnull count
| stats sum(count) as count by user
| table user count

It shows me the number of events per user in the CSV file. If a user has no events, the count is 0:

user    count
User1   2593
User2   301
User3   0
User4   1284

But I need the output additionally split over time (span=1h). The output should look like this:

time                     user   count
11.08.2020 11:00:00.000  User1  1023
11.08.2020 11:00:00.000  User2  190
11.08.2020 11:00:00.000  User3  0
11.08.2020 11:00:00.000  User4  1284
11.08.2020 12:00:00.000  User1  1570
11.08.2020 12:00:00.000  User2  111
11.08.2020 12:00:00.000  User3  0
11.08.2020 12:00:00.000  User4  0
time + 1h                ...    ...

I saw a few other questions on Splunk Answers, but they didn't work for me. I hope you can help me. Thanks a lot!
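One way to sketch this (assuming a fixed 24-hour window, so the hour buckets can be generated explicitly with `mvrange`) is to append a zero-count row for every user and hour from the lookup, then sum:

```spl
index=myindex earliest=-24h@h latest=@h
| bin _time span=1h
| stats count by _time, user
| append
    [| inputlookup userlist.csv
     | eval _time=mvrange(relative_time(now(),"-24h@h"), relative_time(now(),"@h"), 3600)
     | mvexpand _time
     | eval count=0
     | fields _time, user, count ]
| stats sum(count) as count by _time, user
| sort _time, user
```

The subsearch builds the full cross product of users and hour buckets with count 0; the final `stats sum(count)` merges it with the real counts, leaving 0 only where no events occurred.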
Hi all, I'm trying to set the search period such that "earliest" is a specific day and "latest" is 7 days after that. For this example, I chose "earliest" to be 1595952000, which is Jul 29, and I would like the search period to be 7/29/20 12AM to 8/5/20 12AM. My query is:

index=test earliest="1595952000" | eval latest=relative_time(earliest,"+7d") | ... | bin _time span=7d | dedup _time fieldA | stats count by _time fieldB

Splunk searches from 7/29/20 to 8/11/20 (i.e. now). It seems to ignore "latest". I've also tried:

index=test | eval earliest="1595952000" | eval latest=relative_time(earliest, "+7d") | ...

In this case, Splunk searches from 7/29/20 to 7/30/20, i.e. only for 1 day. What is wrong with my query, and why is my "latest" ignored? Thank you.
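The time range is fixed before the search pipeline runs, so an `eval` producing a field named `latest` is just an ordinary field and never feeds back into the range. The modifiers have to be literals (or injected by a subsearch). For a fixed 7-day window starting at epoch 1595952000, the end is 1595952000 + 7*86400 = 1596556800, with your intermediate steps in between:

```spl
index=test earliest=1595952000 latest=1596556800
| bin _time span=7d
| dedup _time fieldA
| stats count by _time, fieldB
```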
I'm facing a problem with Splunk: I have an index whose data input is a folder of CSV files. When I add another CSV file to that folder, the new source's data does not show up in the index. I have restarted Splunk many times and even deleted and recreated the index, but the problem is still there. Any help would be great. Thanks.
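A common cause (an assumption, since the inputs.conf isn't shown) is that the CSV files begin with identical header rows: the monitor input fingerprints the first bytes of each file, so a new file with the same header can be treated as already indexed. Salting the CRC with the source path avoids that (the path, index, and sourcetype below are placeholders):

```ini
[monitor:///path/to/csvfolder]
index = myindex
sourcetype = csv
crcSalt = <SOURCE>
```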
Hi, my issue is: I have a panel (screenshot attached), and what I want is to dynamically change its color (red, for example) when the value is not equal to the current time. My search is:

index=abc | stats count by Timestamp | eval epochtime=strptime(Timestamp, "%Y-%m-%dT%H:%M:%S.%N") | eval desired_time=strftime(epochtime, "%d/%m/%Y %H:%M:%S") | rename desired_time as timestamp | fields timestamp | sort timestamp desc | eval now = now() | fieldformat now = strftime(now, "%d/%m/%Y %H:%M:%S") | head 1 | where timestamp<now

The format option is not useful in this case. How can I do this? Thanks!
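One issue in the search is that `timestamp<now` compares a formatted string against an epoch number, which never behaves as intended. A sketch that keeps the comparison in epoch seconds and exposes a flag the dashboard can color on (the 60-second staleness threshold is an assumption):

```spl
index=abc
| stats latest(Timestamp) as Timestamp
| eval epochtime=strptime(Timestamp, "%Y-%m-%dT%H:%M:%S.%N")
| eval stale=if(now() - epochtime > 60, "red", "green")
| eval timestamp=strftime(epochtime, "%d/%m/%Y %H:%M:%S")
| table timestamp, stale
```

The `stale` field can then drive the panel color, e.g. via a colorPalette expression in the dashboard XML.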
Hi team, I have been asked to use datetime.xml for my logs. May I know how to use it? Do I need to configure datetime.xml in props.conf, or do I need to point to the datetime.xml that ships with Splunk? If I need to point to it, how do I write the configs?
index=ps_main sourcetype=psps
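Timestamp recognition files are wired up through the DATETIME_CONFIG setting in props.conf, applied per sourcetype. A sketch (the path shown is Splunk's shipped default, relative to $SPLUNK_HOME; a custom file would live in an app and be referenced by its own relative path):

```ini
# props.conf on the parsing tier (indexer or heavy forwarder)
[psps]
DATETIME_CONFIG = /etc/datetime.xml
```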
Hello friends, I am not able to see custom indexes on the Splunk cluster master. Can you please help me?
Regards, Shivanand
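For context (a sketch, assuming a standard indexer cluster): custom indexes for clustered indexers are defined on the cluster master under master-apps and pushed out as a configuration bundle; an index created on a single node will not appear cluster-wide. The index name and paths below are placeholders:

```ini
# on the cluster master: $SPLUNK_HOME/etc/master-apps/_cluster/local/indexes.conf
[my_custom_index]
homePath   = $SPLUNK_DB/my_custom_index/db
coldPath   = $SPLUNK_DB/my_custom_index/colddb
thawedPath = $SPLUNK_DB/my_custom_index/thaweddb
repFactor  = auto
```

After editing, the bundle is applied with `splunk apply cluster-bundle` on the master.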
Hi, I am trying to execute a simple Splunk search from the command prompt using curl, with a command like this:

curl -u username:password -k https://splunkhost:8089/services/search/jobs -d search='search *'

I am receiving this response:

<response>
<messages>
<msg type='WARN'>insufficient permission to access this resource</msg>
</messages>
</response>

Am I missing some permissions?
Hi @gcusello, I need your help on this. I am trying to configure the Azure Storage Blob modular input for the Splunk Add-on for Microsoft Cloud Services. It seems that I have configured it properly, but unfortunately nothing is showing up in Splunk, even though I have created the index and sourcetype. This is the documentation I am referring to:
https://docs.splunk.com/Documentation/AddOns/released/MSCloudServices/Configureinputs5
Please help.
Regards, Rahul
Hi Team, I have audited the user-behavior data below from a web UI. ACT=OPEN_PAGE means the user opened a web UI page, and based on the log audit mechanism:
1. Sometimes two events are returned. The differences between these 2 events are: one contains "DT=PAGEPERFORMANCE" and the other doesn't, and the sub-number in the CAID field differs; the first generated is always CAID=xxxxxxxx-X, and the second generated is always CAID=xxxxxxxx-0.
2. Sometimes only one is returned, and it cannot be foreseen which one: either the one with CAID=xxxxxxxx-X or the one with CAID=xxxxxxxx-0.

1. 2020-08-11 02:46:49,435 DT=PAGEPERFORMANCE  httpsessionID=sid1 UID=userid1 UN=username1 LOC=en_US CAID=8513937907-X  ACT=OPEN_PAGE
2. 2020-08-11 02:46:49,438  httpsessionID=sid1 UID=userid1 UN=username1 LOC=en_US CAID=8513937907-0 ACT=OPEN_PAGE
3. 2020-08-11 02:46:49,467  httpsessionID=sid1 UID=userid1 UN=username1 LOC=en_US CAID=8513937907-1  ACT=SEARCH
4. 2020-08-11 02:46:50,222 DT=PAGEPERFORMANCE  httpsessionID=sid2 UID=userid2 UN=username2 LOC=en_US CAID=1512937904-X ACT=OPEN_PAGE
5. 2020-08-11 02:46:50,333  httpsessionID=sid2 UID=userid2 UN=username2 LOC=en_US CAID=1512937904-0 ACT=START_OVER
6. 2020-08-11 02:46:52,321  httpsessionID=sid3 UID=userid3 UN=username3 LOC=en_US CAID=3903937111-X ACT=OPEN_PAGE
7. 2020-08-11 02:46:52,469  httpsessionID=sid3 UID=userid3 UN=username3 LOC=en_US CAID=3903937111-0 ACT=SEARCH

Questions:
1. If there are two events with ACT=OPEN_PAGE (CAID=xxxxxxxx-X and CAID=xxxxxxxx-0), how can I keep the one with DT=PAGEPERFORMANCE and remove the other one?
2. If there is only one event with ACT=OPEN_PAGE, there is no need to remove it.

The expected result looks like this (the 2nd event is removed):
1. 2020-08-11 02:46:49,435 DT=PAGEPERFORMANCE  httpsessionID=sid1 UID=userid1 UN=username1 LOC=en_US CAID=8513937907-X  ACT=OPEN_PAGE
2. 2020-08-11 02:46:49,467  httpsessionID=sid1 UID=userid1 UN=username1 LOC=en_US CAID=8513937907-1  ACT=SEARCH
3. 2020-08-11 02:46:50,222 DT=PAGEPERFORMANCE  httpsessionID=sid2 UID=userid2 UN=username2 LOC=en_US CAID=1512937904-X ACT=OPEN_PAGE
4. 2020-08-11 02:46:50,333  httpsessionID=sid2 UID=userid2 UN=username2 LOC=en_US CAID=1512937904-0 ACT=START_OVER
5. 2020-08-11 02:46:52,321  httpsessionID=sid3 UID=userid3 UN=username3 LOC=en_US CAID=3903937111-X ACT=OPEN_PAGE
6. 2020-08-11 02:46:52,469  httpsessionID=sid3 UID=userid3 UN=username3 LOC=en_US CAID=3903937111-0 ACT=SEARCH

Thanks, Cherie
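One way to sketch this (the base search is a placeholder, and the CAID-prefix extraction is an assumption about the field format): count OPEN_PAGE events per session and CAID prefix with eventstats, then drop the member of a pair that lacks DT:

```spl
index=web_audit
| rex field=CAID "^(?<caid_base>\d+)-"
| eventstats count(eval(ACT="OPEN_PAGE")) as open_count by httpsessionID, caid_base
| where NOT (ACT="OPEN_PAGE" AND open_count=2 AND isnull(DT))
```

When both OPEN_PAGE events exist (open_count=2), only the one without DT=PAGEPERFORMANCE is removed; a lone OPEN_PAGE event (open_count=1) and all non-OPEN_PAGE events pass through untouched.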