All Topics

Hi, I'm still new to Splunk. I understand that I can extend a search or report lifetime either through the GUI or by changing dispatch.ttl when scheduling a report. What happens when I have hundreds of searches and reports with an extended lifetime (7 days or more)? Will there be any impact on hardware resources when Splunk holds that much data for these reports and searches?
Hi, does anyone know of a summary index that Splunk uses to retain index sizes? I can calculate an index size using the internal index, but I need to go back further than the last month. Any other method is welcome as well. Thanks
Hello, I've got a Lambda function exporting AWS logs via HEC through my heavy forwarders to my indexers. Unfortunately, the AWS logs are coming in with event.* as all of the field names, whereas the Splunk_TA_aws add-on expects *. I can easily do a rename event.* as *, but that happens too late for the out-of-the-box props.conf settings to take effect. This causes entries like "FIELDALIAS-eventName-for-aws-cloudtrail-command = eventName AS command" in props.conf to fail unless I go in and modify them to use event.eventName. I'd like to fix this before it reaches SPL. Is there a way to do this easily? Thanks!
Since upgrading to 9.1.2, I am no longer able to see table output in Splunk Search, even with the most simplistic search. I receive the message "Failed to load source for Statistics Table visualization." I am able to see "Events" and to use "fields", just not "table". Note that this works when viewing a Studio dashboard, so the issue seems to be limited to the Search app.
How can I show how long an alert took to trigger from the time the event occurred? To calculate the "diff" in times, I want to subtract either (_time - event_time) or, if event_time is null, (_time - orig_time), and then calculate the average time it took each rule to fire, over time. I have tried to calculate the diff, but event_time and orig_time are present in the same event, and some events don't have them. Please help me identify the delay between the event time and the alert trigger time.  index=notable | eval diff = _time - event_time | convert ctime(diff), ctime(orig_time) | table event_time orig_time _time diff search_name
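The fallback logic the question describes (use event_time when present, otherwise orig_time — what SPL's coalesce() does) can be sketched in plain Python; timestamps are assumed to be epoch seconds, and the field names mirror the post:

```python
def alert_delay(alert_time, event_time=None, orig_time=None):
    """Seconds between the event and the alert firing,
    preferring event_time and falling back to orig_time."""
    base = event_time if event_time is not None else orig_time
    if base is None:
        return None  # neither field present in this event
    return alert_time - base

# Mirrors SPL: eval diff = _time - coalesce(event_time, orig_time)
delays = [alert_delay(1700000300, event_time=1700000000),
          alert_delay(1700000120, orig_time=1700000100)]
avg = sum(d for d in delays if d is not None) / len(delays)
```

In SPL this corresponds to `eval diff = _time - coalesce(event_time, orig_time)` followed by a `stats avg(diff) by search_name`; note that `convert ctime(diff)` formats a duration as a wall-clock time, which is usually not what is wanted for a diff.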
I'm using the search below and basically want a chart showing the last 12 dates, oldest to newest, left to right by date.  | inputlookup running_data.csv | eval EP=strptime(Date, "%m/%d/%Y") | eval Date=strftime(EP, "%m/%d/%Y") | chart sum(sats) over team by Date useother=false limit=12 | fillnull value=0  The search was working fine until January and the year change; now it only shows the last date in December and is missing the newest 01/02/2024 date. If I change the limit to be large enough to include all date entries in the CSV file, I see the following: it puts the 01/02/2024 date before the oldest date in the CSV, instead of placing that 01/02/2024 column after the 12/18/2023 column. So it's as if it is ignoring the year and sorting chronologically by month. I've done quite a bit of searching on this to no avail, and it seems like this should be an easy thing to do. I'm not opposed to dropping "chart" if someone has a better way. Ideally the search returns the last 12 dates, oldest to newest, in the columns, and the team name and sats numbers for each date in the rows. Thanks for any suggestions!
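A plausible explanation, illustrated in Python: once the dates are re-rendered as "%m/%d/%Y" strings, any plain sort is lexicographic, so "01/02/2024" sorts before every December 2023 date — exactly the month-first ordering the post describes. Sorting on the parsed datetime restores chronological order:

```python
from datetime import datetime

dates = ["12/04/2023", "12/18/2023", "01/02/2024"]

lex = sorted(dates)  # string sort: January 2024 lands first
chrono = sorted(dates, key=lambda d: datetime.strptime(d, "%m/%d/%Y"))

# lex    -> ['01/02/2024', '12/04/2023', '12/18/2023']
# chrono -> ['12/04/2023', '12/18/2023', '01/02/2024']
```

A year-first string format such as "%Y/%m/%d" makes lexicographic and chronological order coincide, which is one common workaround when the column labels themselves drive the sort.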
I have a field "myfield" for the last update, in the format 2020-11-25T11:40:42.001198Z. I want to create two new fields, UpdateDate and UpdateTime. I used "eval" + "substr":  -------- | eval UpdateDate=substr("myfield",1,10) | eval UpdateTime=substr("myfield",12,10) --------  But in the table, UpdateDate and UpdateTime are empty, while "myfield" has a value as shown above. Any suggestions? Thank you.
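One thing worth noting: in the SPL above the field name is wrapped in double quotes, so substr is likely operating on the literal string "myfield" rather than the field's value (unquoted field names are the usual form). The intended character positions themselves can be checked with a quick Python sketch — Python slices are 0-based while SPL's substr is 1-based:

```python
ts = "2020-11-25T11:40:42.001198Z"

update_date = ts[0:10]   # SPL 1-based equivalent: substr(myfield, 1, 10)
update_time = ts[11:19]  # SPL: substr(myfield, 12, 8) for "11:40:42"

# update_date -> "2020-11-25"
# update_time -> "11:40:42"
```

With a length of 10 starting at position 12 (as in the original), the time part would also pick up the first fractional digit, "11:40:42.0".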
Hi, I am trying to create a Splunk Classic dashboard but am struggling with setting the earliest values. The goal is to run a search that pulls data from 2 different indexes while controlling the data pulled from the first index with a dropdown time parameter. The SPL is part of a radio button option in the dashboard and starts like this: ( (index=first_index source="first_file_location" $time_range$ latest=-1d@d) OR (index=second_index earliest=@d latest=now() source="second_file_location") ... rest of the SPL. The time_range token is the dropdown input, with a static value set to earliest=-7d@d. When I run the search, it does not substitute the static value into the search and instead shows the token name with the $ sign. Is it possible to set the earliest value for the first index using a dropdown menu? Any assistance would be greatly appreciated. Thanks
How do I set up an email notification that is triggered by a user add/update/delete/activate?
Hello, I created a dashboard with a timestamp column, EndTimeUTC, which stores AH_TIMESTAMP4 or EHActivityItem.EH_ENDTIME. That works: result = 2024-01-01 10:09:28.  Now the customer wants to see the time in the CET timezone. I can show the offset with  | eval "EndTime (CET)"=strftime(strptime(EndTimeUTC,"%Y-%m-%d %T"),"%Y-%m-%d %T %z")  result: 2024-01-01 10:09:28 +0100  But how can I show the time with the offset actually applied? Expected result: 2024-01-01 11:09:28  Thanks!
Hi all, in Splunk Dashboard Studio I want my dashboard to be colored consistently. For example, if the field is priority, the dashboard should show "High" in the same color everywhere (I don't mind which color right now, but I might in the future), and if a new value is added to the priority field, it should get a new, different color in all charts. Is this possible? Thanks, Tamar
The Splunk query "Orca High Alerts" is connected to the ServiceNow TEST environment. It is showing many more close records than open records. When filtering the Splunk query results with a wide time window and a unique event ID, both the open and close lines appear on the Splunk side, but both have the exact same timestamp. I suspect Splunk only sends the close if the open and the close have the exact same timestamp. Is there a way to validate this?
Hi Splunkers! I would like to filter a field when I receive a specific value from a multiselect input dropdown. I have a field "Type" whose multiselect values are passed to a search via a macro. In that search, I would like to filter "Assetname" to names having Z as the 3rd letter, but only when the value ADZ is selected in the "Type" field. When ADZ is not selected, I need all values of the Assetname field. Type: Indus, ADZ, Stan. Assetname: abZahd-2839, so Assetnames with Z as the 3rd letter need to be filtered. Thanks in advance! Manoj Kumar S
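The conditional filter being described — restrict by the 3rd character only when a particular type is selected, otherwise pass everything through — can be sketched language-agnostically in Python (function and variable names here are hypothetical, chosen to mirror the post's fields):

```python
def filter_assets(assets, selected_types):
    """Keep only names whose 3rd character is 'Z' (case-insensitive)
    when 'ADZ' is among the selected types; otherwise keep everything."""
    if "ADZ" in selected_types:
        return [a for a in assets if len(a) >= 3 and a[2].lower() == "z"]
    return list(assets)

filter_assets(["abZahd-2839", "xxy-1"], ["Indus", "ADZ"])  # ['abZahd-2839']
filter_assets(["abZahd-2839", "xxy-1"], ["Indus"])         # both kept
```

In SPL the equivalent condition is typically expressed inside the macro, e.g. matching Assetname against a pattern like `??Z*` only when the token's selected values include ADZ.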
Hi, we accidentally uploaded a Personalized Dev/Test license file instead of a Developer license in our Splunk Enterprise environment. After restarting Splunk, we found that it is not accepting any user credentials and is allowing login for the admin user only. Please let us know how we can fix this, given that no users are registered in Splunk Web now and no one is able to log in. Thanks
Hello, when I test actions in the in-app editor (view mode), the Console Output is shown as dark text on a dark theme, which makes the output difficult to read. How can I change the text or the theme to light? Thank you in advance.
Hello, I can successfully disable/enable tokens using the web interface, but the curl command fails when trying to disable a token using the REST API. Executing the GET method works OK: curl -k -u USER1:USER_PASW -X GET https://localhost:8089/services/authorization/tokens -d id=80e7402b9940a7ac761f259d1e3e49bad1417394924ad0909c8edfd8eb92800e  But PUT fails with no clear error message: curl -k -u USER1:USER_PASW -X PUT https://localhost:8089/services/authorization/token/ron -d id=80e7402b9940a7ac761f259d1e3e49bad1417394924ad0909c8edfd8eb92800e -d status=disabled  The result is: <?xml version="1.0" encoding="UTF-8"?> <response> <messages> <msg type="ERROR">Not Found</msg> </messages> </response>  I tried switching the username between ron and david. What's wrong, and how can I get a more informative problem description? Thanks in advance, David
Hello Splunkers! Is there a way to collect iPad logs? I saw the Mint iOS SDK documentation, but I don't find it clear.
Hi, after downloading the Python agent and extracting the .tar file, we are getting a .jar file, which is related to Java. There were no .py files. We just want to know how to install the Python agent from AppDynamics Downloads. Thanks, Anusha
Is it possible to store regex patterns in a lookup table so that they can be used in a search? For example, let's say I have regexes like "(?<regex1>hello)" and "(?<regex2>world)" (my actual regexes are not simple word matches). I want to write another query that runs a series of rex commands, like  | rex field=data "regex1" | rex field=data "regex2"  and so on. Is it possible to use a subsearch to extract the regexes and then use them as commands in the main query? I was trying something like  | makeresults 1 | eval data="Hello world" [| inputlookup regex.csv | streamstats count | strcat "| rex field=data \"" regex "\"" as regexstring | table regexstring | mvcombine regexstring]  so that the subsearch outputs the following:  | rex field=data "(?<regex1>hello)" | rex field=data "(?<regex2>world)"
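The underlying idea — read named-group regexes from a CSV lookup and apply each one to an event field, merging whatever groups match — can be demonstrated in Python. Note the group syntax is adapted from SPL/PCRE's (?<name>...) to Python's (?P<name>...), and the lookup contents here are hypothetical:

```python
import csv, io, re

# Hypothetical regex.csv lookup: one named-group regex per row.
lookup_csv = "regex\n(?P<regex1>hello)\n(?P<regex2>world)\n"

data = "Hello world"
extracted = {}
for row in csv.DictReader(io.StringIO(lookup_csv)):
    m = re.search(row["regex"], data, flags=re.IGNORECASE)
    if m:
        # Merge any non-empty named captures, like successive rex commands.
        extracted.update({k: v for k, v in m.groupdict().items() if v})

# extracted -> {'regex1': 'Hello', 'regex2': 'world'}
```

In Splunk itself, a subsearch's output is spliced into the outer search as *search terms*, not as new commands, which is why generating "| rex ..." strings in a subsearch does not execute them; iterating over lookup-stored patterns generally needs a different approach (e.g. a multivalue field of patterns, or an external/custom command).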
Hello everyone, we have a Splunk server installed and working. However, a month ago our license expired, and we just renewed it. Unfortunately, it didn't go without issues: we renewed the license, but now the system isn't working and we are getting the following error message: "Error in 'litsearch' command: Your Splunk license expired, or you have exceeded your license limit too many times." We found the article below; however, we are unable to access it, as it requires a Salesforce login. Error in 'litsearch' command: Your Splunk license expired or you have exceeded your license limit too many times. | Splunk (site.com) Can anyone help us figure out how to fix this issue? Thank you, Richard