All Topics

I am trying to set up a regex that works when I test it on, say, regexr.com, but doesn't apply in my transforms/props files. I want to avoid ingesting any Apache logs that contain assets/js, assets/css, or assets/img. I can set one up for a single pattern and it works fine, but the two commented-out lines below, even though they work in a regex tester, don't seem to apply in my transforms file. Any insight into what I may be doing wrong? Thank you for any assistance.

[drop_assets]
REGEX = .*assets\/js.*
#REGEX = .*(assets\/js|assets\/css|assets\/img).*
#REGEX = .*assets/js.*|.*assets/css.*|.*assets/img.*
DEST_KEY = queue
FORMAT = nullQueue

[apache]
TRANSFORMS-drop = drop_assets
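For reference, a minimal sketch of how the alternation form is usually written (the stanza names and the apache sourcetype are taken from the question; treat this as a sketch to verify, not a confirmed fix):

```ini
# transforms.conf -- PCRE alternation; Splunk's REGEX is unanchored,
# so the surrounding .* are unnecessary, and "/" needs no escaping.
[drop_assets]
REGEX = assets/(js|css|img)
DEST_KEY = queue
FORMAT = nullQueue

# props.conf -- the stanza name must match the events' sourcetype
# (or a source::/host:: pattern) exactly.
[apache]
TRANSFORMS-drop = drop_assets
```

If the single-pattern version works but the alternation doesn't, it is also worth confirming both files sit on the parsing tier (indexers or heavy forwarders, not a universal forwarder) and that Splunk was restarted after the change.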
Hi! I have a dropdown where one of the values is "Unknown", and I would like the dropdown to offer each individual value, an "All" option, and also an "All except Unknown" option. Has anyone been able to do this? Thank you very much!
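A Simple XML sketch of one way to get such an option (assuming a Simple XML dashboard; the field name `status` and the populating search are hypothetical). Each choice's value is a complete filter clause that the panel search consumes as $status_filter$, so "All except Unknown" simply carries a NOT clause:

```xml
<input type="dropdown" token="status_filter">
  <label>Status</label>
  <!-- static choices: full filter clauses -->
  <choice value="status=*">All</choice>
  <choice value="status=* NOT status=&quot;Unknown&quot;">All except Unknown</choice>
  <!-- dynamic choices: one per observed value, emitted as a filter string -->
  <fieldForLabel>status</fieldForLabel>
  <fieldForValue>filter</fieldForValue>
  <search>
    <query>index=main | stats count by status | eval filter="status=\"".status."\""</query>
  </search>
</input>
```

The panel's search would then include the token directly, e.g. `index=main $status_filter$ | ...`.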
Hi Team, I want to collect SMB server audit logs in Splunk, via a UF and inputs.conf or similar. Can anyone help with the configuration steps, or point to any Splunk docs for reference? Thanks in advance! Thanks, Sharada Pandilla
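Assuming a Windows file server, one common approach (a sketch, not verified against your environment): SMB share access is audited through the Windows Security event log once the "Audit File Share" / "Audit Detailed File Share" policies are enabled via Group Policy (which emits events 5140/5145), so the UF only needs a Windows event log input:

```ini
# inputs.conf on the universal forwarder (sketch; index name is a placeholder)
[WinEventLog://Security]
disabled = 0
index = wineventlog
# optional: keep only the SMB share-access event IDs
whitelist = 5140,5145
```

On a Samba/Linux SMB server the equivalent would instead be enabling the audit VFS module and monitoring its log file.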
Hi, I have many "Caused by:" lines in (single or multiple) events. How can I extract every line that contains "Caused by:", like this: Caused by: java.sql.SQLException: ISAM error: duplicate value for a record with unique key. Any ideas? Thanks,
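One common sketch (the index and sourcetype are placeholders): rex with max_match=0 captures every matching line into a multivalue field, and (?m) makes ^ match at each line start within a multiline event:

```
index=app sourcetype=java_logs "Caused by:"
| rex max_match=0 "(?m)^(?<caused_by>Caused by:[^\r\n]+)"
| table _time caused_by
```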
Hi - I have a command to clean the fishbucket on a forwarder, if I want to re-ingest data for testing etc.:

cd var/lib/splunk/
rm -r fishbucket/

or, as a one-liner:

bin/splunk stop; cd var/lib/splunk/; rm -r fishbucket/; cd -; rm -r var/; bin/splunk start

But is there any way to clean the fishbucket for only one sourcetype?
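As an aside with a caveat: the fishbucket tracks checkpoints per monitored source file, not per sourcetype, so a per-sourcetype reset isn't directly available. The closest tool I'm aware of is the unsupported btprobe utility, which can reset the checkpoint for a single file; the flags below are from memory and should be verified against your version's `btprobe --help` (the log path is an example):

```
# stop the forwarder first, then reset the fishbucket entry for one file
$SPLUNK_HOME/bin/splunk cmd btprobe \
  -d $SPLUNK_HOME/var/lib/splunk/fishbucket/splunk_private_db \
  --file /var/log/myapp/app.log --reset
```

Resetting every file of a given sourcetype would then mean running this once per matching path.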
We have Splunk Enterprise 8.0 and ES 6.4. What is the proper procedure to upgrade to Splunk Enterprise 8.2.2.1 while retaining the settings and configurations we have made in ES (Enterprise Security)? What about the Security Essentials app we have installed? Any directions are much appreciated. Thanks a million.
Currently running Splunk Enterprise 8.2.2.1, and Visual SPL (version 1.0.1) shows as not compatible with Python 3. The app shows in a failed state, and I want to know if it will be updated at some point. I couldn't find anything by searching, so I'm posting here for some possible guidance. Thanks!
https://docs.splunk.com/Documentation/SCS/current/Search/Comments says that we may use block comments or line comments in SPL2. While trying to learn how to count the number of objects in a JSON array returned from json_extract, I came across this post, which has an extended multiline Splunk query. I wanted to see how the command worked, so I tried using both block and line comments to comment out the end of the query and replace it with a command to view the intermediate output, e.g.:

index=_internal | head 1 | fields _raw _time
| eval _raw="{ \"cities\": [ { \"name\": \"London\", \"Bridges\": [ { \"name\": \"Tower Bridge\", \"length\": 801 }, { \"name\": \"Millennium Bridge\", \"length\": 1066 } ] }, { \"name\": \"Venice\", \"Bridges\": [ { \"name\": \"Rialto Bridge\", \"length\": 157 }, { \"name\": \"Bridge of Sighs\", \"length\": 36 }, { \"name\": \"Ponte della Paglia\" } ] }, { \"name\": \"San Francisco\", \"Bridges\": [ { \"name\": \"Golden Gate Bridge\", \"length\": 8981 }, { \"name\": \"Bay Bridge\", \"length\": 23556 } ] } ] }"
| rename COMMENT as "the logic"
| spath cities{} output=cities
/*
| stats count by cities
| spath input=cities Bridges{} output=Bridges
| mvexpand Bridges
| spath input=cities name output=city
| spath input=Bridges
| table city name length
*/
| table cities

Both commenting schemes generate the same error:

Error in 'spath' command: Invalid argument: '/*'

This error occurs no matter which step I try to introspect. The error is prevented by cutting the commented code out. For now, my workaround is to keep another text editor open and gradually copy and paste in the lines I want. This works, but it's slower than it needs to be, relative to other programming and query languages.

Key question: How can I use block or line comments to test the intermediate output of a multiline Splunk query?
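As an aside, and as an assumption to verify against your version's documentation: the /* */ and // syntax on that SCS page belongs to SPL2 on the Splunk Cloud Services platform, and classic SPL in Splunk Enterprise does not parse it, which would be consistent with the spath error above. Classic SPL in recent Splunk Enterprise releases does accept comments wrapped in triple backticks, so the tail of a pipeline can be disabled in place, e.g.:

````
index=_internal | head 1 | fields _raw _time
| spath cities{} output=cities
``` temporarily disabled while inspecting intermediate output:
| stats count by cities
| mvexpand Bridges ```
| table cities
````

A triple-backtick comment may span lines and contain pipes, so it serves the same "comment out the rest of the query" purpose as a block comment.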
Hello everyone, I have installed Splunk Stream in a distributed environment. All stream forwarders talk to the deployment server and have the "es" template applied/enabled, as I use Enterprise Security. Streams are populated normally (tcp, http, dns and many more). I have added a Stream Capture as an adaptive response action on my correlation searches (a 15-minute capture for dest_ip). A notable event is triggered, and under Adaptive Responses I see mode:saved and status:success. When I click on the "Stream Capture" link, though, I get zero results. I followed this answer and saw that there was already an event type modmakestreams_results tagged as modaction_result, as expected. My final search is like:

tag=modaction_result orig_sid=scheduler__admin_REE.................... orig_rid=2 orig_action_name=makestreams

I see, though, that the event type of interest:

source=stream:makestreams_* orig_sid=*

brings no results. If I remove the orig_sid=* part, I get results related to the notable but not related to a stream capture. As a last piece of information: stream capture on demand works fine, and pcaps are stored in the NFS share we created, and I am able to download them. Any help would be appreciated. With kind regards, Chris
We work in a large environment including Splunk Enterprise & ES. We are planning to upgrade from 7.x.x to 8.2.2.1. Any optimizations to perform? Any best practices to follow? Should we upgrade ES (Enterprise Security 6.4) before or after the Splunk Enterprise upgrade? Thanks a million for your help in advance.
Hi, I have logs coming in with server names listed in them, and my requirement is to get the distinct count of servers, assigning a region to each. For example, the entries look like:

{"server":"abc.uk", "details": "xxxx"}
{"server":"abc.uk", "details": "yyyy"}
{"server":"xyz.uk", "details": "xxxx"}
{"server":"abc.us", "details": "xxxx"}
{"server":"xyz.us", "details": "xxxx"}
{"server":"xyz.us", "details": "yyyy"}
{"server":"abc.hk", "details": "xxxx"}

From the above list we have 2 unique servers from the UK, 2 unique servers from the US, and 1 from HK, so I need them shown as below:

North America : 2
Europe : 2
Asia : 1

I have tried a search with count(eval(searchmatch("*.us*"))) AS "North America", but that does not give me the count of unique servers.
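A sketch of one way to do this (the index/sourcetype and the extraction of the server field are assumptions; the suffix-to-region mapping mirrors the example): map each server's domain suffix to a region with case(), then count distinct servers per region with dc():

```
index=app sourcetype=server_logs
| eval region=case(
    match(server, "\.us$"), "North America",
    match(server, "\.uk$"), "Europe",
    match(server, "\.hk$"), "Asia",
    true(), "Other")
| stats dc(server) AS unique_servers BY region
```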
@sensitive-thug Since the .conf21 wrap-up, I would love to watch some of the breakout sessions that I missed, or rewatch some. While I do see lots online, why are none of the FEA*, non-Splunk-topic ones available? I really want to listen to FEA1966 again, and share it with some coworkers, but it's not available. Any thoughts?
Hey all, I am starting to work with dashboards, and I have a table with a lot of data that I would like to display. Unfortunately, it seems there isn't a clear-cut way to change the font size for a table in Dashboard Studio. I've tried a number of different things within the JSON source, but nothing I do seems to change the table's font size. Is there a way to do this? I would like to reduce the font size so longer values fit on a row without forcing multiple line wraps. Thanks in advance!
I have an instance of a Java application running on my local machine at the URL http://localhost:8080. Since it's a local instance, the Java application can be seen running in the CMD window, and if I perform some functionality (e.g. download a file), I can see the live logs in the CMD (e.g. "Starting download of file...", "Download Completed..."). I want to know if Splunk can monitor this localhost URL so that I can see these live logs in the Splunk Enterprise application (or the Website Monitoring app). I have tried the Website Monitoring app, but Splunk returns a 404 for localhost. Kindly help with this issue.
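As a possible direction rather than a confirmed answer: a URL checker such as the Website Monitoring app only records HTTP response status, not the application's console output (and "localhost" from the Splunk server's point of view is the Splunk server itself, which can explain the 404). The usual pattern is to have the Java app write its logs to a file (e.g. via log4j/logback) and monitor that file with Splunk; the path, sourcetype and index below are placeholders:

```ini
# inputs.conf sketch -- monitor the application's log file directly
[monitor://C:\apps\myapp\logs\app.log]
disabled = 0
sourcetype = myapp:log
index = main
```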
Greetings dear Splunk Community,

I'll try to keep it short and simple: I have a query that returns multiple fields, but only 2 really matter for this question: eventName and eventResult. The issue is that the very first and last eventResult entries of a given eventName are different from all the other eventResult entries, so you can imagine it looking like this:

eventName  eventResult
A          1
A          Data
A          Data
A          Data
A          2
B          3
B          Data
B          Data
B          4

I require the value of the first entry as an extra field next to the actual data, for computational purposes, for each individual eventName. There are over 100 different eventName possibilities that also change over time, so nothing hard-coded is possible, and no lookup tables either. Also no joins, since a join would require far too much performance given the size of these tables. So I'd like:

eventName  eventResult  additionalColumn
A          1            1
A          Data         1
A          Data         1
A          Data         1
A          2            1
B          3            3
B          Data         3
B          Data         3
B          4            3

Is this possible? I looked into mapping functions (to try to map the first eventResult to the eventName) but couldn't figure out anything that worked in a way that would make this possible. I cannot change anything about the data structure, nor did I develop it. I'd be very appreciative of any ideas.

Best regards, Cyd
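A sketch that avoids joins and lookups, on the assumption that the "first" row per eventName is the chronologically earliest one (hence the explicit sort; eventstats first() takes the first value in the current event order and copies it onto every event in the group):

```
... base search returning eventName and eventResult ...
| sort 0 _time
| eventstats first(eventResult) AS additionalColumn BY eventName
| table eventName eventResult additionalColumn
```

If the events already arrive in the desired order, the sort can be dropped; if "first" means something other than earliest, the sort key would change accordingly.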
Hi, this is a request to kindly provide us the configuration for Splunk login UI SSO to authenticate with a Google account, for Splunk running on Kubernetes.
Hi experts, I have the table below. How do I change the background colour of the row where Error_categories = Total_error_rate?

Error_categories      Percentage%
error_rate_Error      0.1138498
error_rate_Warning    0.0011737
error_rate_Critical   0.0000000
error_rate_HTTP       6.5950704
Total_error_rate      6.7100939

Thank you
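For what it's worth (assuming a Simple XML dashboard): the built-in table formatting colors individual cells rather than whole rows; true row-level coloring generally needs a small JavaScript extension (the "table row highlighting" example in the Splunk Dashboard Examples app is the usual starting point). The built-in per-cell alternative looks like this sketch:

```xml
<!-- Simple XML sketch: color the Error_categories cell for the total row.
     The search query is a placeholder. -->
<table>
  <search>
    <query>... your search producing Error_categories and Percentage% ...</query>
  </search>
  <format type="color" field="Error_categories">
    <colorPalette type="map">{"Total_error_rate": #DC4E41}</colorPalette>
  </format>
</table>
```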
Hello everyone, I am in a situation where I will send results to a lookup file, and from there I need to take the last 2 rows to display as a summary in my dashboard. Below is the exact scenario.

I have a search which compares last week's and this week's data and produces results like the following:

Date        Active  Inactive  Deleted  Added
10/25/2021  80      20        10       15

I need to send the results calculated in the above search to a lookup file, and I will keep appending to it every week. After some weeks, say 3, it will look like this:

Date        Active  Inactive  Deleted  Added
10/25/2021  80      20        10       15
11/1/2021   78      22        8        11
11/8/2021   83      18        9        6

So the above is the lookup file. Then I need to use the created lookup as input in the same query to perform some calculations (i.e. I need to take the last 2 rows and display them as a summary of the last 2 weeks). I tried something like the one below, but it didn't work. Could someone help me with this?

<search> | outputlookup test1.csv | search inputlookup test1.csv | tail 2
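A sketch of a likely fix (lookup name taken from the post): inputlookup is a generating command, so it cannot follow | search; it has to start its own search with a leading pipe. Also, plain outputlookup overwrites the file each run, so accumulating a row per week would use append=true (assumed to be the intent here). The weekly write would look like:

```
<weekly comparison search>
| outputlookup append=true test1.csv
```

and the two-week summary would be a separate search (e.g. its own dashboard panel):

```
| inputlookup test1.csv
| tail 2
```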