All Topics



Hello everyone! After a few hours of research I'm asking for your help. Here is my data:

Username_column    clientip_column
username1          xxx.xxx.xxx.xxx
username1          xxx.xxx.xxx.xxx
username1          xxx.xxx.xxx.xxx
username1          yyy.yyy.yyy.yyy
username2          xxx.xxx.xxx.xxx
username2          zzz.zzz.zzz.zzz
username3          yyy.yyy.yyy.yyy
username3          xxx.xxx.xxx.xxx

What I would like to do is create another column called "countUsername" which contains the number of distinct usernames per clientip (no duplicate usernames counted). Here is my dream table (what I want):

Username_column    clientip_column    countUsername
username1          xxx.xxx.xxx.xxx    3
username1          xxx.xxx.xxx.xxx    3
username1          xxx.xxx.xxx.xxx    3
username1          yyy.yyy.yyy.yyy    2
username2          xxx.xxx.xxx.xxx    3
username2          zzz.zzz.zzz.zzz    1
username3          yyy.yyy.yyy.yyy    2
username3          xxx.xxx.xxx.xxx    3

I tried various things, like:

| eventstats values(count(Username_column)) as countUsername by clientip_column

creating a multivalue column and trying mvdedup(), and combining my Username_column and my clientip_column like so:

| eval countUsername=Username_column . " " . clientip_column

and then doing lots of things on that: if(match()), regex, ... But nothing I tried worked. The best I can get is:

| eventstats count(Username_column) as countUsername by clientip_column

But with this line, duplicate usernames are counted (like the table below; I tried a few things with this result but got nowhere):

Username_column    clientip_column    countUsername
username1          xxx.xxx.xxx.xxx    5
username1          xxx.xxx.xxx.xxx    5
username1          xxx.xxx.xxx.xxx    5
username1          yyy.yyy.yyy.yyy    2
username2          xxx.xxx.xxx.xxx    5
username2          zzz.zzz.zzz.zzz    1
username3          yyy.yyy.yyy.yyy    2
username3          xxx.xxx.xxx.xxx    5

Maybe you are wondering why I'm using eventstats instead of stats. The reason is that before this line I have a large search with multiple stats commands, and if I don't use eventstats, all my other columns at the end of the large search won't show up.

Kind regards,
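A minimal sketch of one way this is often approached (not tested against the actual data; field names are taken from the question): eventstats with dc() counts distinct values per group while keeping every existing column.

| eventstats dc(Username_column) as countUsername by clientip_column

Unlike count(), dc() ignores duplicate usernames within the same clientip, which matches the desired table above.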
Hi splunkers, I want to handle null values in the query below. If message is null, it should be replaced with the message built below; otherwise it should display the already extracted message.

| eval message= if(Actor="superman","super hero", if(Actor="emma watson","model"))

Thanks.
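A sketch of one way to express this, assuming the field is genuinely null (missing) rather than the literal string "null":

| eval message=coalesce(message, case(Actor="superman","super hero", Actor="emma watson","model"))

coalesce() keeps the already extracted message when it exists and only falls back to the case() expression when message is null.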
I am new to Splunk and I need help with a query that lists all the domains in my logs (domains that were accessed from my network or that accessed my network) for any given time period or range.
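The exact search depends on the data sources involved, but as a rough sketch, assuming the events already have a domain-like field extracted (url_domain here is a placeholder for whatever field your proxy, DNS, or firewall data actually uses):

index=* earliest=-24h url_domain=*
| stats count by url_domain
| sort - count

This lists each domain seen in the chosen time range along with how often it appeared.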
Hi everyone, I'm sure this is a question that's been answered before, but my google-fu is failing me. I am running Splunk Cloud 8.2 (Victoria), Salesforce App for Splunk version 4.11, and Splunk Add-on for Salesforce version 4.4.0-1651043262. I have the Salesforce app configured and data inputs set, and I have data in my index from all the sources. What I don't have, however, is any data populating any of my dashboards. I'm wondering if it has anything to do with the lookup tables being broken; the dashboards show: Could not load lookup=LOOKUP-SFDC-USER_NAME. I have enabled the saved search and run it successfully (per the Add-on docs); however, the saved lookup searches from the App docs return no data when I run them, so the lookups aren't populated. Also, there are only 3 lookups, not 4 as in the docs. I'm sure I'm missing something *very* simple, but does anyone have any ideas?
Hello, in ES, when we run the following macro for a Last 30 minutes or Last 24 hours time range, Splunk ends up displaying results from far back in time, as much as the last 6 months. Why is that? It's as if it completely ignores whatever date/time range we specify. By the way, this is an out-of-the-box macro.

| `incident_review`
| table _time owner rule_id rule_name status_label

My requirement is to show the notables triggered within the date range we select. Secondly, does anyone know how to show the number of incidents (notable alerts) worked on by each SOC analyst? Basically I'm trying to generate performance metrics for each analyst: how many alerts they worked on, time to close each alert, details of each status change, and so on. The default SOC operations dashboard sucks.
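For the per-analyst counts, a rough sketch (assuming the owner and rule_id fields returned by the macro are the ones you want to group on; this is not the full time-to-close metric, just the starting aggregation):

| `incident_review`
| stats count as notables_worked dc(rule_id) as distinct_notables by owner

Time-to-close would need the status-change timestamps from the same data, for example the span between the earliest and latest _time per rule_id before grouping by owner.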
Hi, I have two config files that I need to monitor, to answer these questions: who, what, and when changed the file? I also need content monitoring, like git, showing the differences between versions and the history of the file. Any ideas? Thanks
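Part of this can be sketched with a plain file monitor in inputs.conf on the forwarder (the path, index, and sourcetype below are placeholders). It captures the content and the "when" each time the file is read again, while the "who" generally has to come from OS-level auditing such as auditd or Windows file-access auditing:

[monitor:///etc/myapp/myapp.conf]
index = config_changes
sourcetype = myapp_conf

Whether a modified-in-place file is re-read in full depends on how it is rewritten, so this is only a starting point, not a complete versioning solution.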
Hi all, I'm trying to get SFTP network protocol logs from an SFTP server (a Windows server) that has a universal forwarder on it. I found the Splunk App for Stream: https://splunkbase.splunk.com/app/1809/ I have configured everything in its place, but the issue is that this app can monitor several network protocols but not SFTP; the closest protocol it offers is FTP. I enabled FTP but I can't see any traffic from it, even though I enabled some other protocols and did see their traffic. What can I do to get the SFTP logs? Thanks.
Hi, I want to convert my classic dashboard to Dashboard Studio and I have some questions about that. On the classic dashboard I have several panel options that I want to migrate to the new dashboard:

<option name="count">20</option>
<option name="dataOverlayMode">none</option>
<option name="drilldown">none</option>
<option name="percentagesRow">false</option>
<option name="refresh.display">progressbar</option>
<option name="rowNumbers">false</option>
<option name="totalsRow">false</option>
<option name="wrap">false</option>
<format type="color" field="Action">
  <colorPalette type="map">{"allowed":#99ff99,"blocked":#ff4d4d,"dropped":#ff4d4d,"monitor":#ffc44d}</colorPalette>
</format>

When I tried to use these in Dashboard Studio they didn't work. Can someone please share the list of available options with me? I didn't find it in the documentation. Also, the font size of the search results is quite large and I want to reduce it. Which option should I use for that? Thanks!
I am getting the error "check_hostname requires server_hostname" with Splunk 9 when using requests.post() with a proxy over HTTPS. How do I resolve this error?
I have a field with IDs and I want to keep only the 3-digit part of the ID. For example:
for t0123-123 I want to remove the leading t0,
for t456-456 I want to remove the leading t,
for t1023-023 I want to remove the leading t1.
The expected output is shown below:

ID       expected ID
t0123    123
t456     456
t1023    023
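A sketch of one way, assuming the goal is simply the last three digits of the ID value (the field name ID is taken from the question; adjust it to your actual field):

| rex field=ID "(?<expected_ID>\d{3})$"

This captures the trailing three digits, so t0123 becomes 123 and t1023 becomes 023, with the leading zero preserved because the result stays a string.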
I am trying to use a search to find fields that I want to use in another search as table fields. The first search should return all fields that are used in a data model. It looks like this:

| datamodel "Authentication"
| spath output=foo path=objects{}
| spath input=foo output=calc_field path=calculations{}.outputFields{}.displayName
| spath input=foo output=field path=fields{}.displayName
| eval fields = mvappend(calc_field , field)
| mvexpand fields
| table fields

Then I want to use that list of fields in the table command. I am doing this to check the coverage of the CIM fields in the search. Unfortunately, no success so far, so I am grateful for all ideas and any kind of input. My first guess was something like:

index="main" sourcetype="XmlWinEventLog" tag="authentication"
| table
    [| datamodel "Authentication"
     | spath output=foo path=objects{}
     | spath input=foo output=calc_field path=calculations{}.outputFields{}.displayName
     | spath input=foo output=field path=fields{}.displayName
     | eval fields = mvappend(calc_field , field)
     | mvexpand fields
     | format "" "" "," "" "" ""
     | rex mode=sed field=search "s/fields=//g"
     | rename search as table ]
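A sketch of an alternative subsearch ending (untested against this data model): when a subsearch returns a field literally named search, its value is substituted verbatim into the outer search, so building a space-separated list in that field can feed the table command directly.

index="main" sourcetype="XmlWinEventLog" tag="authentication"
| table
    [| datamodel "Authentication"
     | spath output=foo path=objects{}
     | spath input=foo output=calc_field path=calculations{}.outputFields{}.displayName
     | spath input=foo output=field path=fields{}.displayName
     | eval fields=mvappend(calc_field, field)
     | stats values(fields) as search
     | eval search=mvjoin(search, " ")]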
I have a CSV with numerous fields that have bad field names: they contain spaces and special characters such as up and down arrows. I don't know ahead of time what the field names will be. How do I locate and rename all of them to "safer" Splunk field names that work easily in all Splunk commands without awkward quoting?
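A sketch of a way to at least locate the offending field names at search time (renaming them still has to be done explicitly, or by cleaning the header before ingestion; this is a starting point, not a full solution):

| fieldsummary
| fields field
| regex field="[^A-Za-z0-9_]"

fieldsummary emits one row per field in the results, and the regex command then keeps only the rows whose field name contains a character outside the usual safe set of letters, digits, and underscores.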
I am trying to get my query to work correctly and display the results in a table for easy analysis. The fields I am using are: host, device_active, device_enabled, _time.

I am trying to track changes from device_active being enabled ("2") to becoming disabled ("1"). I want a table that shows which hostnames, within the last 2-4 hours, have changed from enabled to disabled, and if possible adds traceability.

device_active="1" means disabled
device_active="2" means enabled

I tried following some tutorials but could not get it to work correctly:
https://splunkonbigdata.com/find-out-the-errors-occurring-2-or-more-times-consecutively/
https://community.splunk.com/t5/Splunk-Search/How-to-count-how-many-times-a-field-value-has-changed-from-one/td-p/202299

Currently, I have the following query:

index="log-main" sourcetype=monitoring device_active earliest=-4h latest=-2h
| table host, device_active, device_enabled, _time
| dedup host
| streamstats current=f window=1 max(device_active) as prev_status
| eval isConsecutive = if (device_active == Previous_error, 1, 0)
| streamstats count as count by device_active reset_before=(isConsecutive==0)
| streamstats count(eval(isConsecutive==0)) as #ofdisconnects

which produces a table with the columns host, device_enabled, device_active, _time, #ofdisconnects, count, isConsecutive, and prev_status.

This currently shows all hostnames rather than just the ones that have changed status. I'd like to display the same information, but filtered down to just the hosts where device_active is disabled now but was recently enabled.
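A rough sketch of one approach (field names taken from the question; it sorts events oldest-first per host so that streamstats can compare each event with the one before it, then keeps only the enabled-to-disabled transitions):

index="log-main" sourcetype=monitoring device_active earliest=-4h
| sort 0 host _time
| streamstats current=f window=1 last(device_active) as prev_status by host
| where prev_status="2" AND device_active="1"
| table _time host device_enabled prev_status device_active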
Hi, I have JSON coming from CI with this template:

{"source":"1","sourcetype":"json","event":{"type":"build","id":"061","duration":"48","run_id":"1","paths":["value1",".value2","value3"]}}

The fields are listed in Splunk as id, duration, sourcetype, and paths{}, and I can list all the values. My issue is that I want to count paths{} (more than 11k values). I tried using mvcount:

| eval totalpaths = mvcount(paths)      returns nothing
| eval totalpaths = mvcount(paths{})    returns 1

Is there a way to return the total number of paths? How can I list all paths? I tried:

| stats values(paths{}) as paths
| stats count(eval(paths)) AS totalbazelpaths

which returns 378, while the actual value is above 11k. When expanding the paths{} field I can see all 11k paths. What am I doing wrong here? Thanks
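Two things that might explain this, sketched below and untested against the actual events: field names containing {} have to be wrapped in single quotes inside eval, and stats values() de-duplicates (which could account for the 378).

| eval totalpaths = mvcount('paths{}')

or, to total across all events without de-duplicating:

| stats sum(eval(mvcount('paths{}'))) as totalpaths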
I am trying to extract _time from the log line

Jul 28 12:00:49 104.128.100.1 420391: Jul 28 06:30:25.023: %Sample: Sample: cp :  QFP:0.0

but Splunk is extracting _time as 2022-07-28T12:00:49.000+05:30. I want it to extract the second timestamp from the log, i.e. Jul 28 06:30:25.023. I tried this approach in props.conf:

[sourcetype]
TIME_PREFIX = ^\S{3}\s\d{1,2}\s[^\s]+\s[^\s]+\s[^\s]+\s
TIME_FORMAT = %b %d %H:%M:%S.%3Q
MAX_TIMESTAMP_LOOKAHEAD = 30
TZ = UTC

but it is still not extracting the second timestamp. Can someone please help?
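A sketch of an alternative props.conf stanza to try (the sourcetype name is a placeholder; the prefix anchors on the sequence number and colon that precede the second timestamp, and %3N is the millisecond specifier):

[your_sourcetype]
# skip "Jul 28 12:00:49 104.128.100.1 420391: " and start at the second timestamp
TIME_PREFIX = ^\w{3}\s+\d{1,2}\s+\S+\s+\S+\s+\d+:\s+
TIME_FORMAT = %b %d %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 25
TZ = UTC

Note that timestamp settings only take effect at parse time on the indexing tier (indexers or heavy forwarders), not on a universal forwarder, and only for events indexed after the change.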
I have newly installed the latest version of the Splunk Supporting Add-on for Microsoft Active Directory on Splunk ES. Firstly, no matter how many times I click the "+" sign to add a new domain, nothing happens. Even if I add details in the default section and click "Test Connection", literally nothing happens. The last time I worked with this add-on was 4-5 years ago; even then its UI behavior was odd, but at least the connection test used to work. Now it seems to have become even worse, and despite so many releases I don't know what improvements have actually been made. Can anyone please help with how to fix this?
Hi all, I know the Splunk ODBC driver can be used on Windows or macOS, and I have tried it successfully. On Linux, however, Splunk ODBC is not supported. If Tableau is installed on Linux, how can I use Splunk ODBC?
Hi, I have a multivalue field, numbers, with each of its values in the format of two numbers separated by a comma (for example 52,29). For all of these values, I want an if statement that compares both the first number and the second number and then returns either "true" or "false". Currently I have been using the foreach loop in multivalue mode. However, when debugging the error I receive, I found that the default template value <<ITEM>> appears to always return null instead of the values of numbers (isnotnull('<<ITEM>>') returns false). Shown below is how I am trying to extract the leftmost number using regex with replace and then check whether it is greater than 5. Is there something wrong with this search?

| foreach mode=multivalue numbers
    [| eval results=if(tonumber(replace('<<ITEM>>'),  ",\d+",  "")) > 5, "true", "false")]

The search above produces an error. Thanks in advance.
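A sketch of the same idea with the parentheses rearranged so that replace() receives all three of its arguments (as an assumption, "both numbers greater than 5" is used as the condition; adjust the comparison to whatever you actually need):

| foreach mode=multivalue numbers
    [ eval first_num = tonumber(replace('<<ITEM>>', ",\d+$", "")),
           second_num = tonumber(replace('<<ITEM>>', "^\d+,", "")),
           results = mvappend(results, if(first_num > 5 AND second_num > 5, "true", "false")) ]

Because an eval inside foreach mode=multivalue runs once per value, mvappend() is used here to accumulate one "true"/"false" result per value of numbers instead of overwriting the field on each iteration.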
I run this query to extract all IP addresses from the events (there can be multiple IPs per event):

index=*
| rex max_match=0 field=_raw "(?<ipaddr>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| dedup ipaddr
| table _time, ipaddr

My question is: how do I exclude private IP addresses from the result? Thanks!
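A sketch of one way, using cidrmatch() against the RFC 1918 ranges; the multivalue matches are expanded first so each address is tested individually:

index=*
| rex max_match=0 field=_raw "(?<ipaddr>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
| mvexpand ipaddr
| where NOT (cidrmatch("10.0.0.0/8", ipaddr) OR cidrmatch("172.16.0.0/12", ipaddr) OR cidrmatch("192.168.0.0/16", ipaddr))
| dedup ipaddr
| table _time, ipaddr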