All Topics

We are currently running ES 8.2.2.1, and Visual SPL (version 1.0.1) shows as not compatible with Python 3. The app shows in a failed state, and I want to know whether it will be updated at some point. I couldn't find anything when searching, so I'm posting here for some possible guidance. Thanks!
https://docs.splunk.com/Documentation/SCS/current/Search/Comments says that we may use block comments or line comments in SPL2. When trying to learn how to count the number of objects in a JSON array returned from json_extract, I came across this post, which has an extended multiline Splunk query. I wanted to see how the command worked, so I tried using both block and line comments to comment out the end of the query and replace it with a command to view the intermediate output, e.g.

index=_internal | head 1 | fields _raw _time
| eval _raw="{ \"cities\": [ { \"name\": \"London\", \"Bridges\": [ { \"name\": \"Tower Bridge\", \"length\": 801 }, { \"name\": \"Millennium Bridge\", \"length\": 1066 } ] }, { \"name\": \"Venice\", \"Bridges\": [ { \"name\": \"Rialto Bridge\", \"length\": 157 }, { \"name\": \"Bridge of Sighs\", \"length\": 36 }, { \"name\": \"Ponte della Paglia\" } ] }, { \"name\": \"San Francisco\", \"Bridges\": [ { \"name\": \"Golden Gate Bridge\", \"length\": 8981 }, { \"name\": \"Bay Bridge\", \"length\": 23556 } ] } ] }"
| rename COMMENT as "the logic"
| spath cities{} output=cities
/*
| stats count by cities
| spath input=cities Bridges{} output=Bridges
| mvexpand Bridges
| spath input=cities name output=city
| spath input=Bridges
| table city name length
*/
| table cities

Both commenting schemes generate an error: Error in 'spath' command: Invalid argument: '/*'

This error occurs no matter which step I try to introspect. The error is prevented by cutting the commented code out. For now, my workaround is to keep another text editor open and gradually copy and paste in the lines I want. This works, but it's slower than it needs to be, relative to other programming and query languages.

Key question: How can I use block or line comments to test the intermediate output of a multiline Splunk query?
Hello everyone, I have installed Splunk Stream in a distributed environment. All Stream forwarders talk to the deployment server and have the "es" template applied/enabled, as I use Enterprise Security. Streams are populated normally (tcp, http, dns and many more). I have added a Stream Capture as an adaptive response action on my correlation searches (a 15-minute capture for dest_ip). A notable event is triggered, and under Adaptive Responses I see mode:saved and status:success. When I click on the "Stream Capture" link, though, I get zero results. I followed this answer and saw that there was already an event type modmakestreams_results tagged as modaction_result, as expected. My final search is like:

tag=modaction_result orig_sid=scheduler__admin_REE.................... orig_rid=2 orig_action_name=makestreams

I see, though, that the event type of interest:

source=stream:makestreams_* orig_sid=*

brings no results. If I remove the orig_sid=* part, I get results related to the notable but not related to a stream capture. As a final piece of information, stream capture on demand is working fine, and pcaps are stored in the NFS share I created, from which I am able to download them. Any help would be appreciated. With kind regards, Chris
I work in a large environment including Splunk Enterprise & ES. We are planning to upgrade from 7.x.x to 8.2.2.1. Are there any optimizations to perform? Any best practices to follow? Should we upgrade ES (Enterprise Security 6.4) before or after the Splunk Enterprise upgrade? Thanks a million for your help in advance.
Hi, I have logs coming in with server names listed in them, and my requirement is to get the distinct count of servers by assigning a region to them. For example, the entries are like:

{"server":"abc.uk" "details": xxxx"}
{"server":"abc.uk" "details": yyyy"}
{"server":"xyz.uk" "details": xxxx"}
{"server":"abc.us" "details": xxxx"}
{"server":"xyz.us" "details": xxxx"}
{"server":"xyz.us" "details": yyyy"}
{"server":"abc.hk" "details": xxxx"}

So from the above list we have 2 unique servers from the UK, 2 unique servers from the US and 1 from HK, and I need them shown as below:

North America : 2
Europe : 2
Asia : 1

I have tried count(eval(searchmatch("*.us*"))) AS "North America", but this will not give me the count of unique servers.
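A minimal sketch of one way to do this, assuming the server field is already extracted and that the domain suffix is the only region indicator (the suffix-to-region mapping below is an assumption):

<your base search>
| eval region=case(match(server, "\.us$"), "North America",
                   match(server, "\.uk$"), "Europe",
                   match(server, "\.hk$"), "Asia",
                   true(), "Other")
| stats dc(server) AS unique_servers BY region

dc(server) counts each server only once per region, which is what a plain count over searchmatch cannot do here.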
@sensitive-thug Since the .conf21 wrap-up, I would love to watch some of the breakout sessions that I missed, or rewatch some. While I do see lots online, why are none of the FEA* sessions (the non-Splunk-topic ones) available? I really want to listen to FEA1966 again and share it with some coworkers, but it's not available. Any thoughts?
Hey all, I am starting to work with dashboards and I have a table that I would like to display that has a bunch of data in it. Unfortunately, it seems like there isn't a clear-cut way to change the font size for a table in Dashboard Studio. I've tried a number of different things within the JSON source, but nothing I am doing seems to manipulate the font size in the table. Is there a way to do this? I would like to reduce the font size so some longer values appear without forcing multiple carriage returns in the row. Thanks in advance!
I have an instance of a Java application running on my local machine at the URL http://localhost:8080. Since it's a local instance, the Java application can be seen running in the CMD window, and if I perform some functionality (e.g. download a file), I can see the live logs in the CMD (e.g. "Starting download of file... Download Completed..."). I want to know if Splunk can monitor this localhost URL so that I can see these live logs in the Splunk Enterprise application (or the Website Monitoring app). I have tried the Website Monitoring app, but Splunk returns a 404 for localhost. Kindly help with this issue.
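Splunk's monitor input tails files rather than URLs, so one hedged alternative, assuming the application also writes its console output to a log file (the path, index and sourcetype below are assumptions), is a file monitor in inputs.conf on the local Splunk instance or a forwarder:

[monitor://C:\myapp\logs\application.log]
index = java_app
sourcetype = java_app_log
disabled = 0

The Website Monitoring app generally polls a URL for status and response time rather than streaming the application's log output, which is why it does not show the live log lines.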
Greetings dear Splunk Community,

I'll try to keep it short and simple: I have a query that returns multiple fields, but only 2 really matter for this question: eventName and eventResult. The issue is that the very first and last eventResult entries of a given eventName are different from all the other eventResult entries, so you can imagine it looking like this:

eventName   eventResult
A           1
A           Data
A           Data
A           Data
A           2
B           3
B           Data
B           Data
B           4

I require the value of the first entry as an extra field next to the actual data, for computational purposes, for each individual eventName. There are over 100 different eventName possibilities that also change over time, so nothing hard-coded is possible, and no lookup tables either. Also, no joins, since a join would require far too much performance due to the size of these tables. So I'd like:

eventName   eventResult   additionalColumn
A           1             1
A           Data          1
A           Data          1
A           Data          1
A           2             1
B           3             3
B           Data          3
B           Data          3
B           4             3

Is this possible? I looked into mapping functions (to try to map the first eventResult to the eventName) but couldn't figure out anything that worked in a way that would make this possible. I cannot change anything about the data structure, nor did I develop it. I'd be very appreciative of any ideas. I feel like I'm just missing something small.

Best regards, Cyd
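One join-free approach is eventstats, which attaches an aggregate back onto every event in the group. A minimal sketch using the field names from the post, assuming the events for each eventName already appear in the order shown:

<your base search>
| eventstats first(eventResult) AS additionalColumn BY eventName
| table eventName eventResult additionalColumn

Note that first() means the first value encountered in the current result order (by default the most recent event), so a preliminary | sort 0 _time, or swapping first() for last(), may be needed depending on how the results arrive.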
Hi, could you kindly provide the configuration for Splunk login UI SSO to authenticate with a Google account for a Kubernetes deployment?
Hi experts, I have the table below. How do I change the background colour of the row where Error_categories = Total_error_rate?

Error_categories        Percentage%
error_rate_Error        0.1138498
error_rate_Warning      0.0011737
error_rate_Critical     0.0000000
error_rate_HTTP         6.5950704
Total_error_rate        6.7100939

Thank you
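In Simple XML, a cell can be coloured based on its own value with an expression colorPalette; colouring an entire row usually needs the table-row-highlighting JavaScript from the Splunk Dashboard Examples app. A minimal sketch of the cell-level option, assuming a Simple XML dashboard (the colour codes are arbitrary):

<table>
  <search> ... </search>
  <format type="color" field="Error_categories">
    <colorPalette type="expression">case(value == "Total_error_rate", "#FFCCCC", 1==1, "#FFFFFF")</colorPalette>
  </format>
</table>

This only colours the Error_categories cell; colouring the Percentage% cell based on another column's value is where the custom JavaScript approach comes in.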
Hello everyone, I am in a situation where I send results to a lookup file, and from there I need to take the last 2 rows to display as a summary in my dashboard. Below is the exact scenario.

I have a search that compares last week's and this week's data and produces results like this:

Date        Active   Inactive   Deleted   Added
10/25/2021  80       20         10        15

I need to send the results calculated in the above search to a lookup file, and I will keep appending to it every week. After, say, 3 weeks it will look like this:

Date        Active   Inactive   Deleted   Added
10/25/2021  80       20         10        15
11/1/2021   78       22         8         11
11/8/2021   83       18         9         6

So the above is the lookup file. Then I need to use the created lookup as input in the same query to perform some calculations (i.e. take the last 2 rows and display them as a summary of the last 2 weeks). I tried something like below, but it didn't work. Could someone help me with this?

<search > | outputlookup  test1.csv | search inputlookup test1.csv | tail 2
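A hedged sketch of how these two steps are usually split, keeping the test1.csv name from the post. outputlookup with append=true accumulates one row per week (without it the file is overwritten each run), and inputlookup is a generating command started with a leading pipe rather than placed after | search:

Weekly scheduled search that appends the new row:
<weekly comparison search> | outputlookup append=true test1.csv

Dashboard panel that summarises the last two weeks:
| inputlookup test1.csv | tail 2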
Does Splunk SOAR operate in the cloud, or just on-premises?
This question is related to my previous post: https://community.splunk.com/t5/Splunk-Search/XML-field-Extraction/m-p/571944#M199301

My source has a date, which I extract using the rex command. I want my table data to be shown under those respective dates. I have used xyseries, but I cannot add other fields to the table.

source="weekly_report_20211025_160957*.xml" | rex field=source "weekly_report_(?<Date>\w.*)\.xml" | .... | table suitename name "Time taken(s)" status | xyseries name Date status

My final table should contain suitename, name, "Time taken(s)", and status (under the Date field). Is there any method to append all these table fields after applying xyseries?
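One common workaround, assuming the field names above, is to merge the extra columns into the row key and the cell value with eval before pivoting, since fields not passed to xyseries are dropped. The wildcarded source and the separator strings are assumptions for illustration:

source="weekly_report_*.xml"
| rex field=source "weekly_report_(?<Date>\w.*)\.xml"
| eval row=suitename." / ".name
| eval cell=status." (".'Time taken(s)'." s)"
| xyseries row Date cell

Each Date column then carries both the status and the time taken for that suitename/name combination.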
Hello, I use a dropdown list in my dashboard like this:

<input type="dropdown" token="web_domain" searchWhenChanged="true">
  <choice value="*www.colis.fr*">Colis</choice>

And I retrieve the token in my panel title like this:

<panel>
  <title>Application $web_domain$ - Evolution moyenne des appels</title>

Instead of $web_domain$, I would like to retrieve the generic name of the $web_domain$, meaning that instead of displaying "www.colis.fr" I would like to display only "Colis". How can I do this, please?
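Simple XML exposes the selected choice's label as $label$ inside a <change> handler, so a second token can carry the display name. A minimal sketch, where web_domain_label is an assumed token name:

<input type="dropdown" token="web_domain" searchWhenChanged="true">
  <choice value="*www.colis.fr*">Colis</choice>
  <change>
    <set token="web_domain_label">$label$</set>
  </change>
</input>

The panel title can then reference the new token:

<title>Application $web_domain_label$ - Evolution moyenne des appels</title>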
Hi all, my customer has a requirement to have the "index" field in each data model used in ES. Obviously, this additional field doesn't affect CIM compliance, but it's needed to apply an additional filter to the data. The question is: at the next upgrade of ES, will this customization be maintained or not? Bye. Giuseppe
Hi, I have configured a Splunk heavy forwarder on 2 machines. I want to send logs from one machine to the other and expect the receiver to store all the received logs in an index called "receivedlogs". This is the video I followed to configure Splunk: https://www.youtube.com/watch?v=S4ekkH5mv3E&t=454s&ab_channel=Splunk%26MachineLearning Thank you.
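A hedged sketch of the usual wiring, with the hostname, port and monitored path as assumptions. Because the index is normally assigned where the data is first read, the simplest route is to set index=receivedlogs in the inputs on the sending heavy forwarder, forward via outputs.conf, and open a splunktcp listener on the receiver:

Sending heavy forwarder, outputs.conf:
[tcpout]
defaultGroup = receiver

[tcpout:receiver]
server = receiver-host:9997

Sending heavy forwarder, inputs.conf:
[monitor:///var/log/myapp/app.log]
index = receivedlogs

Receiving instance, inputs.conf:
[splunktcp://9997]
disabled = 0

The receiver also needs an index named receivedlogs (created in the UI or in indexes.conf); changing the index of already-parsed data on the receiving side is more involved, so assigning it at the source is the simpler route.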
Good day team, I have an application which contains 5 servers. Each server has a different path, but the goal in each case is to read error.log and wrapper.log:

/log/apple/production/A1/error.log
/log/ball/production/A2/error.log
..

Here I can use a wildcard like this in the monitor stanza: /log/*/production/*/error.log. The problem is that each server has many folders matching those wildcards, and I don't want all of them, only a few. Take the first star: I want only apple, ball or cat; any other name on any server can be ignored. Similarly, for the second star I want only A1, A2 or A3; B1, C1 and so on can be ignored.

So is it possible to write something like that using a regex, either in inputs itself or using props?
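The monitor path itself only takes simple wildcards, but a whitelist in the same inputs.conf stanza is a regex matched against the full file path and can narrow down which files are actually read. A minimal sketch using the folder names from the post (the index name is an assumption):

[monitor:///log/]
whitelist = /log/(apple|ball|cat)/production/(A1|A2|A3)/(error|wrapper)\.log$
index = app_logs
disabled = 0

Because monitor is recursive by default, this single stanza covers all five servers' paths; a blacklist could additionally exclude noisy directories if scanning /log/ recursively turns out to be too broad.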