All Topics

Hello, I have 2 charts created with Splunk searches: a. a pie chart showing all failed task names in a system; b. a line chart showing the number of failed tasks per day over the past 30 days. I want to connect these 2 charts so that when I click a task name in the pie chart, that task name is highlighted in the line chart. Is that an option? If yes, how do I do it? Thanks in advance.
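One common pattern (a sketch in Simple XML; the field name task_name is hypothetical, substitute whatever your searches split by) is to set a token in the pie chart's <drilldown> and reference it in the line chart's search:

```
<chart>
  <search><query>... | stats count by task_name</query></search>
  <option name="charting.chart">pie</option>
  <drilldown>
    <!-- store the clicked slice's label in a token -->
    <set token="selected_task">$click.value$</set>
  </drilldown>
</chart>
...
<chart>
  <search>
    <!-- filter the line chart on the clicked task -->
    <query>... task_name="$selected_task$" | timechart span=1d count by task_name</query>
  </search>
  <option name="charting.chart">line</option>
</chart>
```

This filters the line chart down to the clicked task rather than highlighting it in place; true in-place highlighting would need custom JS or an overlay series.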
Hello, I have a lookup file with different time ranges (start, end) that looks like this:

Debut, Fin
2020-12-05 12:00:00, 2020-12-05 18:00:00
2021-01-24 08:00:00, 2021-01-24 18:00:00
2021-02-10 19:00:00, 2021-02-10 21:00:00
2021-02-02 19:00:00, 2021-02-02 21:00:00

I'd like to match events which are not included in any of the ranges from the lookup. I tried this, but it didn't work:

index="my_index" [ inputlookup my_lookup.csv | eval start=strptime(Debut,"%Y-%m-%d %H:%M:%S") | eval end=strptime(Fin,"%Y-%m-%d %H:%M:%S") | table start end ] | search _time < start AND _time > end

Any idea? Thanks for the help.
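One approach that may work (a sketch; it relies on the subsearch returning `earliest`/`latest` fields, which Splunk interprets as time bounds) is to have the subsearch emit one time window per lookup row and exclude them all with NOT. Note that the original comparison is also inverted: an event outside a single range satisfies `_time < start OR _time > end`, not AND.

```
index="my_index" NOT
    [ | inputlookup my_lookup.csv
      | eval earliest=strptime(Debut, "%Y-%m-%d %H:%M:%S"),
             latest=strptime(Fin, "%Y-%m-%d %H:%M:%S")
      | table earliest latest
      | format ]
```

The subsearch expands to `((earliest=... latest=...) OR (earliest=... latest=...) ...)`, so the NOT drops every event that falls inside any of the lookup's windows.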
Hi, as the subject says, I'm trying to figure out the difference between lookup and inputlookup, because I don't think I get it. Take this search, for example:

index=windows EventCode=4624 [ | inputlookup damtest2.csv | rename Server AS Workstation_Name | fields Workstation_Name ] | lookup damtest2.csv Server AS Workstation_Name OUTPUT os | table Workstation_Name os Package_Name__NTLM_only_ | dedup Workstation_Name | sort Workstation_Name

Also, what is the use case for a lookup definition? The command above works without a lookup definition, for example. Regards
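In short (a sketch to illustrate the distinction): inputlookup is a generating command that returns the rows of the lookup file itself as search results, while lookup enriches events already in the pipeline by matching a field and adding output fields.

```
| inputlookup damtest2.csv

index=windows EventCode=4624
| lookup damtest2.csv Server AS Workstation_Name OUTPUT os
```

The first search returns one result per CSV row; the second keeps the Windows events and adds an os field wherever Workstation_Name matches a Server value in the CSV. A lookup definition becomes useful when you need options a bare file reference cannot express, such as case-insensitive or wildcard matching, time-based lookups, or enabling an automatic lookup.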
Hi, we are looking for a way to integrate Checkmarx with Splunk. What would be the best way?
Hi Splunkers, for an add-on I'm making, I need to perform a sourcetype override. The general mechanism is clearly explained in the documentation (Override source types on a per-event basis), and I have used it with mixed results. If I use a sourcetype as the <spec> in props.conf, it works fine: if my data is born with sourcetype A, A is put in props.conf as the spec, and I want to override it with B, where B is set in transforms.conf under the proper regex, nothing goes wrong and I achieve the desired result. Now suppose I want to switch the <spec> in props.conf from a sourcetype to a source, where the source is a file in a specific location. Of course, I could put the full path of the source; but for various reasons this path may change in our production environment, so I need to switch from a full path to a partial one. The worst case is where we must change from:

C:\sub1\sub2\sub3\test_file.txt

to:

...\test_file.txt

So my question is: what is the proper wildcard syntax to achieve this? So far I have tried:

...\test_file.txt
C:\...\test_file.txt
//C:\...\test_file.txt

but they do not work and the sourcetype is not overridden.
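For source:: stanzas, the props.conf spec documents `...` as a wildcard matching any number of characters (including directory separators) and `*` as matching anything except a directory separator. A sketch, assuming a transform named force_sourcetype_b already exists in transforms.conf (the exact backslash escaping for Windows paths in stanza names is worth testing in your environment):

```
# props.conf
[source::...\\test_file.txt]
TRANSFORMS-override = force_sourcetype_b
```

Also worth checking: source:: overrides are applied at parse time on the first full Splunk instance that sees the data, and props/transforms changes only affect data indexed after a restart.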
Hello community, we are ingesting an sftp log. The logfile rotates once every 24h, and "headers" are written at the top of the new file on every rotation, which get indexed. Unlike every other indexed event, the "linecount" for these events is 2 instead of 1, so they are pretty easy to spot:

#Date: Mon Jan 10 00:00:00 CEST 2020
#Fields: date time ip port .........

I've seen examples about skipping header lines in CSV files, but this is a plain text file. It is not a huge issue, just a bit irritating. Is it possible to skip these lines so they are not forwarded/indexed? How would I go about accomplishing this? Thank you in advance.
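One way (a sketch; the sourcetype name below is hypothetical, and the props/transforms would live on the indexer or first heavy forwarder that parses this data) is to route events starting with the header marker to the nullQueue:

```
# props.conf
[your_sftp_sourcetype]
TRANSFORMS-drop_headers = sftp_header_null

# transforms.conf
[sftp_header_null]
REGEX = ^#(Date|Fields):
DEST_KEY = queue
FORMAT = nullQueue
```

Since both header lines apparently end up in one 2-line event, a regex anchored at the start of the event should catch the whole event.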
Hi, is there any reference available that describes how to interpret different charts and graphs, with practical examples? E.g. when memory usage increases step by step, it means we have a memory leak. Thanks
I have a standalone instance of Splunk. I am running both the Splunk Add-on for Unix and Linux and the Splunk App for Unix. Since the Splunk App for Unix has reached End-of-Life and is no longer required in my deployment, I am looking to remove it. Initially I tried just using the Splunk command:

./splunk remove app splunk_app_for_nix

However, I noticed that this impacts the "os" index used by the Splunk Add-on for Unix and Linux: the index no longer appears in the web GUI under Settings > Indexes. If I look in the CLI, I can still see data in /opt/splunk/os/db, so the data still appears to be there, but it is apparently not being used. I am getting a message saying "Received event for unconfigured/disabled/deleted index=os ...", so I'm not entirely sure what the status of this index is now. What is the best way to remove this app without affecting the index? Thanks,
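One plausible explanation (an assumption worth verifying) is that the "os" index was defined in an indexes.conf inside splunk_app_for_nix itself, so removing the app removed the index definition while leaving the buckets on disk. Re-declaring the index elsewhere should bring it back; a sketch with the default path layout, which you should adjust to match where the data actually lives:

```
# indexes.conf (e.g. in $SPLUNK_HOME/etc/system/local/)
[os]
homePath   = $SPLUNK_DB/os/db
coldPath   = $SPLUNK_DB/os/colddb
thawedPath = $SPLUNK_DB/os/thaweddb
```

Before removing the app, copying its [os] stanza out of the app's own indexes.conf would preserve the original paths exactly.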
Hi All, I'm seeking a little help to drop/offboard a device. We don't have any HF in our environment; we use our indexer as our HF as well. There is a Windows device xyz in our environment, we don't want any logs at all from this xyz server, and it sends logs directly to the indexer, not via the deployment server. So I created 2 files, props.conf and transforms.conf.

In props.conf:
[sourcetype name]
TRANSFORMS-win=eventlogs

In transforms.conf:
[eventlogs]
REGEX=xyz
DEST_KEY=queue
FORMAT=nullQueue

And I restarted the indexer, but it is not working; I can still see the logs. Can anyone please suggest where I went wrong? Thank you in advance.
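When the goal is to drop everything from one host, this is usually keyed on the host metadata stanza in props.conf rather than on a string in the raw event text. A sketch (assumes the host value really is "xyz" as Splunk sees it, and that this indexer is the first full Splunk instance parsing the data):

```
# props.conf
[host::xyz]
TRANSFORMS-null = drop_xyz

# transforms.conf
[drop_xyz]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue
```

With a [sourcetype] stanza instead, the stanza header must exactly match the events' sourcetype, and the transforms.conf stanza name must match the name referenced by TRANSFORMS-, so both are worth double-checking.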
Hi Team, I have a dashboard with form inputs, and once the user enters them, around 5-6 panels are loaded. Is there a way to send the dashboard results by email? The "Schedule PDF Delivery" option is disabled on this dashboard for me. How can we schedule email delivery for this type of dashboard? Thanks in advance.
Hi, in the past, if a user developed a scheduled report whose results are used by other users and dashboards, we would normally change the ownership to "nobody" so that the report would keep running if the person ever left. We don't like using service accounts, as our security team is very much against them. But in Splunk Cloud, if we reassign the saved search to nobody, it seems to apply the cron schedule based on UTC rather than the timezone of the previous owner (in our case, New Zealand). Is there some way we can set the timezone of the nobody user? We are reluctant to have to specify cron schedules in UTC when everything we do is in NZ time. Any suggestions? Thanks
I've got a JSON array I ingest that I want to extract certain fields from to save into a lookup table. Here's an example of the JSON:

{
  "Response": {
    "results": [
      {
        "memberType": 2,
        "isOnline": false,
        "lastOnlineStatusChange": "1657176499",
        "groupId": "1234567",
        "destinyUserInfo": {
          "LastSeenDisplayName": "UserName1",
          "LastSeenDisplayNameType": 1,
          "iconPath": "/img/theme/bungienet/icons/xboxLiveLogo.png",
          "crossSaveOverride": 1,
          "applicableMembershipTypes": [2, 1],
          "isPublic": false,
          "membershipType": 1,
          "membershipId": "1234567890123456789",
          "displayName": "UserName1",
          "bungieGlobalDisplayName": "UserName1",
          "bungieGlobalDisplayNameCode": 9999
        },
        "bungieNetUserInfo": {
          "supplementalDisplayName": "UserName1#9999",
          "iconPath": "/img/profile/avatars/default_avatar.gif",
          "crossSaveOverride": 0,
          "isPublic": false,
          "membershipType": 254,
          "membershipId": "12345678",
          "displayName": "UserName1",
          "bungieGlobalDisplayName": "UserName1",
          "bungieGlobalDisplayNameCode": 9999
        },
        "joinDate": "2021-10-27T20:56:48Z"
      },
      {
        "memberType": 2,
        "isOnline": false,
        "lastOnlineStatusChange": "1657390180",
        "groupId": "1234567",
        "destinyUserInfo": {
          "LastSeenDisplayName": "UserName2",
          "LastSeenDisplayNameType": 1,
          "iconPath": "/img/theme/bungienet/icons/xboxLiveLogo.png",
          "crossSaveOverride": 1,
          "applicableMembershipTypes": [2, 3, 1],
          "isPublic": false,
          "membershipType": 1,
          "membershipId": "4611686018431599324",
          "displayName": "UserName2",
          "bungieGlobalDisplayName": "UserName2",
          "bungieGlobalDisplayNameCode": 8888
        },
        "bungieNetUserInfo": {
          "supplementalDisplayName": "UserName2#8888",
          "iconPath": "/img/profile/avatars/HaloRingcopy.gif",
          "crossSaveOverride": 0,
          "isPublic": false,
          "membershipType": 254,
          "membershipId": "1990219",
          "displayName": "UserName2",
          "bungieGlobalDisplayName": "UserName2",
          "bungieGlobalDisplayNameCode": 8888
        },
        "joinDate": "2020-04-07T15:07:21Z"
      }
    ],
    "totalResults": 2,
    "hasMore": true,
    "query": {
      "itemsPerPage": 2,
      "currentPage": 1
    },
    "useTotalResults": true
  },
  "ErrorCode": 1,
  "ThrottleSeconds": 0,
  "ErrorStatus": "Success",
  "Message": "Ok",
  "MessageData": {}
}

(I truncated the results array to 2; there are normally many more.) I want to write to a lookup table like this:

_time               | membershipId        | joinDate
2022-07-17 16:20:28 | 1234567890123456789 | 2021-10-27T20:56:48Z
2022-07-17 16:20:28 | 9876543210123456789 | 2020-04-07T15:07:21Z

I can get something close into a table with:

index=myindex | rename Response.results{}.destinyUserInfo.membershipId as membershipId Response.results{}.joinDate as JoinDate | table _time ID JoinDate

but saving that to a lookup table makes the membershipId and joinDate into multivalue fields and stores all of the values accordingly. I need them separate. Help?
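A common pattern for keeping the values paired (a sketch; assumes each raw event is one JSON document like the above, and my_members.csv is a hypothetical lookup name) is to expand the results array into separate events first, then extract fields from each element:

```
index=myindex
| spath path=Response.results{} output=result
| mvexpand result
| spath input=result path=destinyUserInfo.membershipId output=membershipId
| spath input=result path=joinDate output=joinDate
| table _time membershipId joinDate
| outputlookup my_members.csv
```

mvexpand turns the multivalue array into one event per member, so membershipId and joinDate stay aligned row by row instead of collapsing into multivalue fields.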
In the Overview tab, it shows 25 TB of total ingest volume. This is incorrect; we should be at ~4 TB. This is important for our licensing and storage levels.
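One way to cross-check what Splunk itself metered (a sketch; assumes you can search the _internal index on the license master) is to sum the daily license usage log:

```
index=_internal source=*license_usage.log type=Usage
| timechart span=1d sum(b) AS bytes
| eval GB = round(bytes / 1024 / 1024 / 1024, 2)
| fields _time GB
```

Comparing this against the Overview figure should show whether the 25 TB is a display/metering issue or reflects data actually being ingested more than once.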
Hi All, I want to filter URLs that contain an IP. One way is to write a regex for it, extract the IP into another field, and then filter on that field, but I also want to save run time. As I don't have a fixed index, I need to search all indexes that have a url field containing an IP address, so I want to apply the filter in the search itself instead of extracting IPs first.

Raw url formats are: http://1.1.1.1/, 1.1.1.1:443, http://1.1.1.1/xyc.co

What I tried so far:

(index=*) | fields url | where match(url, "^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}")

It is working, but only matches urls like 1.1.1.1:443, not the rest of the formats. Any idea what I can do? Note: I don't want to write a regex to extract IPs into a new field first and then filter on it (new field=*); it would work, but the query would take time, as it would first fetch all the urls, then apply the regex, and then apply the filter.
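Allowing an optional scheme before the anchored IP should cover all three formats in a single match(), still without a separate extraction step; a sketch:

```
index=* url=*
| where match(url, "^(?:https?://)?\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}(?::\d+)?(?:/|$)")
```

Adding url=* to the base search also narrows the events fetched to those that have a url field at all, which is usually where most of the run time goes.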
Is there a form element that accepts only a single datetime value? The user will enter two fields: field 1 as a single datetime, and field 2 as a duration like "1h". Based on these two inputs, I need to calculate and set time values as below.

For example:
User input field 1 [TimeTokenMid] --> 07/12/2022:18:00:00
User input field 2 [Duration] --> "1h"

I need to calculate and set time ranges like this using the relative_time function:
Token_1 = 07/12/2022:17:00:00 [TimeTokenMid - Duration using relative_time]
Token_2 = 07/12/2022:19:00:00 [TimeTokenMid + Duration using relative_time]

Currently I am taking field 1 [TimeTokenMid] as text and doing the calculation, because the timepicker control does not allow setting a single custom time value; it only has options for selecting a range (last 60 mins, between x and y, etc.). In my use case, I want to take a single time value as input and calculate the time ranges (+/-) dynamically based on the duration input. Could you please suggest any solution?
Hi Team, can someone who has already enabled the attribute below ([config_change_audit]) in version 8.2.4 or 8.2.7 help me? The caution below is mentioned in the documentation:

* CAUTION: This setting is experimental and is related to a feature that is still under development. Using the setting might increase resource usage.

What is the experimental feature? What is the risk to the environment, apart from resource usage? And if it is enabled in the environment, what will the resource utilization be?

[config_change_audit]
disabled = <boolean>
* Whether or not splunkd writes configuration changes to the configuration change log at $SPLUNK_HOME/var/log/splunk/configuration_change.log.
* If set to "false", configuration changes are captured in $SPLUNK_HOME/var/log/splunk/configuration_change.log.
* If set to "true", configuration changes are not captured in $SPLUNK_HOME/var/log/splunk/configuration_change.log.
* Default: true

mode = [auto|track-only]
* Set to "auto" or "track-only" to get a log of .conf file changes under $SPLUNK_HOME/etc/system, $SPLUNK_HOME/etc/apps, $SPLUNK_HOME/etc/users, $SPLUNK_HOME/etc/slave-apps, or changes to $SPLUNK_HOME/etc/instance.cfg.
* The values "auto" and "track-only" are identical in their effects. Set mode to "auto" to auto-enroll this deployment into all the latest features.
* CAUTION: This setting is experimental and is related to a feature that is still under development. Using the setting might increase resource usage.
* Default: auto
Hi team, as per my requirement, when a particular form element [Token1] changes, a set of other tokens [Token2, Token3, etc.] needs to be recalculated and set first, and then, on clicking submit, all my panels should load using the recalculated tokens. I have added the tokens to be evaluated within the <change> tag under the <input> for Token1, but if I change the form element value, the other tokens are still not recalculated. Could you please help with this? Sample:

<input type="text" token="timeTokenMid" searchWhenChanged="false">
  <label>Start Time</label>
  <change>
    <eval token="formatted_token">strptime($timeTokenMid$,"%m/%d/%Y:%T")</eval>
    <eval token="timeTokenSt">relative_time($formatted_token$,"-1h")</eval>
    <eval token="timeTokenSt_datetime">strftime($timeTokenSt$,"%m/%d/%Y:%T")</eval>
    <eval token="timeTokenEnd">relative_time($formatted_token$,$obtDuration$)</eval>
    <eval token="timeTokenEnd_datetime">strftime($timeTokenEnd$,"%m/%d/%Y:%T")</eval>
  </change>
</input>
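Two details may explain this (a sketch of a corrected version; it assumes obtDuration holds a plain relative-time string such as "1h"): inside <change>, the input's own new value is exposed as $value$, since $timeTokenMid$ may not be updated yet when <change> fires; and relative_time expects its second argument as a quoted string, so a bare $obtDuration$ substitution fails to evaluate.

```
<input type="text" token="timeTokenMid" searchWhenChanged="false">
  <label>Start Time</label>
  <change>
    <!-- use $value$, the new value of this input -->
    <eval token="formatted_token">strptime($value$, "%m/%d/%Y:%T")</eval>
    <eval token="timeTokenSt">relative_time($formatted_token$, "-1h")</eval>
    <eval token="timeTokenSt_datetime">strftime($timeTokenSt$, "%m/%d/%Y:%T")</eval>
    <!-- quote the duration token so relative_time receives a string -->
    <eval token="timeTokenEnd">relative_time($formatted_token$, "$obtDuration$")</eval>
    <eval token="timeTokenEnd_datetime">strftime($timeTokenEnd$, "%m/%d/%Y:%T")</eval>
  </change>
</input>
```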
Hi, novice Splunker here. My search only extracts the first 10-digit number, and my data contains at least 4 or more 10-digit numbers. I need help making a list of the numbers from the following example text.

Example data in field2:
1111111111,Standard,R9801,
2222222222,Standard,S9801,
3333333333,Standard,T9801,
4444444444,Standard,U9801,
5555555555,Standard,V9801,
For order and enquiries contact 111xxxxxx and mention the 15 digit ID "ACOOLXXXXXX"

|rex field=Data "(?<Numbers>\d{10,})"
|table Numbers

The output of the above is only the first number in the Numbers field >> 1111111111. Thanks.
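rex only keeps the first match unless told otherwise; max_match=0 captures all matches into a multivalue field. A sketch (using the same field name as the rex above):

```
| rex field=Data max_match=0 "(?<Numbers>\b\d{10}\b)"
| table Numbers
```

The \b boundaries and exact {10} keep it from also matching part of a longer digit run (like the 15-digit ID), which \d{10,} would otherwise swallow. If you need one row per number instead of one multivalue cell, follow with | mvexpand Numbers.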
I am attempting to use the Splunkbase timeline visualization. I am trying to create a timeline where:
Y axis is the months (Jan - Dec)
X axis is the days of the month (1-31)

I need to display the date a patch comes out, the days we are testing the patch, and when we roll it out to the company. I have never done a timeline in Splunk and could use some help. Thanks to the community in advance! If anyone has suggestions on other ways I could visualize this without using a timeline, that would be great as well.
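The Timeline custom visualization generally expects one row per bar: a _time start, a label field for the lane, and an optional duration. A sketch built from a lookup (patch_schedule.csv and its columns patch_name, start, end are all hypothetical stand-ins for wherever the schedule lives; the duration unit should be checked against the app's format notes):

```
| inputlookup patch_schedule.csv
| eval _time = strptime(start, "%Y-%m-%d")
| eval duration = (strptime(end, "%Y-%m-%d") - _time) * 1000
| table _time patch_name duration
```

This gives one lane per patch rather than one lane per month; a Gantt chart visualization from Splunkbase is another option for release/testing/rollout phases.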
Hi, I want to connect to InfluxDB via Splunk DB Connect, but in the connection types I can't find InfluxDB. Is there any JDBC or ODBC driver for InfluxDB?

FYI, I found this repo: https://github.com/influxdata/influxdb-java

Any idea? Thanks