All Topics


Well, there's no good section for this so I'll just post it here. I'm trying to do some drawings using the stencils from https://docs.splunk.com/Documentation/Community/current/community/Resources Aaaand they don't work very well with Office 365 Visio. If I pull an icon onto my drawing it's either completely filled with a solid colour (the default option), or, if I change the fill to "none" and the line colour to black, I get a silly-looking "hollow" icon (on both screenshots it's the same icon of multiple indexers). Fiddling with styles doesn't help much either. Oh, and by default the caption text is not visible at all. So the question is - does anyone have experience-based hints on how to properly use the icons in Visio? As a side rant - who thought it would be a good idea to make editing the caption text require going Group->Open->Edit text?
Hi all, I have just downloaded the app "SSL Certificate lookup" from Splunkbase and it's working fine with the following query:

| makeresults
| eval dest="myhost1, myhost2", dest = split(dest,",")
| lookup sslcert_lookup dest
| eval dayleft=round(ssl_validity_window/86400)
| table dest, dayleft, ssl_is_valid, ssl_issuer_common_name, ssl_self_issued, ssl_self_signed, ssl_version

However, myhost1 and myhost2 are hardcoded in the initial query, and I would like to dynamically pass as parameters all hosts matching a specific search: index=* host=*myserver*. I tried several things without success (subsearch, saved search, macro...), any idea how I could achieve that? Any help would be greatly appreciated!
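One possible approach, sketched on the assumption that the hosts you need are simply the host values returned by that search (the stats/rename step is my assumption about how to turn them into the dest field the lookup expects):

index=* host=*myserver*
| stats count by host
| rename host AS dest
| lookup sslcert_lookup dest
| eval dayleft=round(ssl_validity_window/86400)
| table dest, dayleft, ssl_is_valid, ssl_issuer_common_name, ssl_self_issued, ssl_self_signed, ssl_version

Running the lookup directly against the distinct hosts avoids the hardcoded list entirely.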
Dear Splunkers, we are using Splunk in a distributed environment with an SHC; what is the best approach for data inputs? For example: can I create a TCP or UDP input on one of the search heads? And can I create an HEC input in an SHC environment - will it replicate to the remaining SHs? Your help is very much appreciated.
Hi All, I am appending two macros with the append command to generate the following result set. The 1st row comes from one macro while the 2nd row comes from the other. The field rule_id is common to both macro result sets. How can I achieve the following? The end goal is to show this in a dashboard, so I am looking to consolidate the data into one common row. Any suggestions? I have tried using eval as recommended by @gcusello in Solved: Merging events from two indexes - Splunk Community, but it's not working out in my case.

Desired Output:

Triggered_time        Acknowledged_time     difference         rule_id
2022-08-03 23:27:13   2022-08-03 23:28:37   00:01:24.9021888   xxxxx
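One way to collapse the two rows, sketched on the assumption that each macro populates only its own time field and that both time fields are strings in %Y-%m-%d %H:%M:%S format (the macro names below are placeholders for your real ones):

`macro_one`
| append [ `macro_two` ]
| stats values(Triggered_time) AS Triggered_time values(Acknowledged_time) AS Acknowledged_time by rule_id
| eval difference = tostring(strptime(Acknowledged_time, "%Y-%m-%d %H:%M:%S") - strptime(Triggered_time, "%Y-%m-%d %H:%M:%S"), "duration")

The stats-by-rule_id step is what actually merges the rows; the eval just recomputes the difference once both times sit on the same row.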
Hi Team, can we monitor lookup files from an updates perspective, i.e. who updates what in a lookup file, or even in a KV store? This is one of our monitoring requirements, so that if something is needed tomorrow we can backtrack and answer who, what and when. Thanks in advance.
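One possible starting point is the internal REST access log, sketched here on the assumption that edits go through Splunk's lookup and KV store REST endpoints (direct file edits on disk would not show up here, and the uri patterns are my assumption about which endpoints your users touch):

index=_internal sourcetype=splunkd_access (method=POST OR method=DELETE) (uri="*/storage/collections/data/*" OR uri="*/data/lookup-table-files/*")
| table _time user method uri status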
I have a search like this:

sourcetype=Grandstream
| stats count by _time phone starttime answer endtime

result:

_time                 phone        starttime            answer               endtime              count
2022-08-09 14:30:42   xxx39xxxx    2022-08-04 14:33:58  2022-08-04 14:34:02  2022-08-04 14:34:02  1
2022-08-09 14:30:42   xxx394xxxx   2022-08-04 14:34:02  2022-08-04 14:34:02  2022-08-04 14:34:02  1
2022-08-09 14:30:42   xxx1394xxx   2022-08-04 14:34:03  2022-08-04 14:34:03  2022-08-04 14:34:09  1
2022-08-09 14:30:42   xxx1382xx    2022-08-09 14:28:52  2022-08-09 14:28:52  2022-08-09 14:29:25  1

But _time and starttime don't match because the log time is pushed wrong. Is there a way to filter on the starttime field so that it falls within a week running from 0h Friday to 24h Thursday? Thanks.
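A sketch of one way to do the filtering, assuming starttime is a string in %Y-%m-%d %H:%M:%S format and that "a week from 0h Friday to 24h Thursday" means the current such window relative to when the search runs:

sourcetype=Grandstream
| eval start_epoch = strptime(starttime, "%Y-%m-%d %H:%M:%S")
| eval week_start = relative_time(now(), "@w5")
| eval week_end = week_start + 604800
| where start_epoch >= week_start AND start_epoch < week_end
| stats count by _time phone starttime answer endtime

The @w5 snap rolls back to the most recent Friday at midnight, and adding 604800 seconds carries the window through the following Thursday 24:00.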
Hi, I have a line in the event like "/v1/locations/7b-cec6-4820-b699-ec". I need to extract 7b-cec6-4820-b699-ec, or whatever comes after /v1/locations/ and before a ". Please help with the same. Thank you.
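A sketch using rex, assuming the value always sits between /v1/locations/ and the closing double quote (location_id is just an example field name):

| rex field=_raw "/v1/locations/(?<location_id>[^\"]+)\""

If the path can be followed by something other than a quote, the character class would need adjusting.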
Does rex in Splunk support a variable in the regular expression? For example, a user could input text from the UI; usually I need a variable like $kw$ to get the input from the user and then use $kw$ in the rex command. Can Splunk support this, and how? Thanks.
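In a dashboard this generally works because the token is substituted into the search string before the search runs; a sketch, assuming a text input with token kw whose value is a valid regex fragment (the index and the extracted field name are placeholders):

index=my_index
| rex field=_raw "(?<extracted>$kw$)"
| table _time extracted

If users may type characters with special regex meaning, you would want to sanitise or anchor the token value first.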
Hi Splunkers, I am planning to integrate Splunk with our AWS environment, but I'm a beginner on AWS, so could you please help me with the AWS sourcetype details and let me know which ones are required from a security perspective? And if you have use cases about security, please share them with me.
Hi Team, good day! Just wanted to check if you can share the links for older versions of the Splunk Enterprise/UF installers:

Splunk Enterprise v6.6.2 (Linux 32 and 64 bit)
Splunk Enterprise v7.1.10 (Linux 32 and 64 bit)
Splunk Enterprise v8.0.10 (Linux 32 and 64 bit)

Splunk UF v6.6.2 (Linux 32 and 64 bit)
Splunk UF v7.1.10 (Linux 32 and 64 bit)
Splunk UF v8.0.10 (Linux 32 and 64 bit)

Also, the final target version is Splunk Enterprise 8.2.7. Can you confirm which order below would work?

Option 1: Splunk v6.6.2 -> upgrade to v7.1.10 -> upgrade to v8.0.10 -> upgrade to v8.2.7
Option 2: Splunk v6.6.2 -> upgrade to v7.1.10 -> upgrade to v8.2.7 (can we upgrade directly from v7.1.10 to v8.2.7?)

We are planning to upgrade some very old Splunk instances and we want to simulate it first in our test environment. We are looking forward to your assistance on this. Thank you.
Hello, I'm an employee of MEGAZONE CLOUD. We recently decided to conduct a test with Splunk and received a 50GB license, but I don't know how to register it. Please tell me how to register the license.
(New Splunk user) I want to use the CyberArk REST API login events in Splunk. Is there a way to access REST API data directly from Splunk? Or do I need to use the REST API with some programming, like Python, to get the data and then send it to Splunk? Or is a direct REST API endpoint binding available to use in a Splunk installation? Is there any idea about a flowchart for connecting this to Splunk?
Any recommendations out there on which existing data model would be best to match Qumulo log events (network drive file access, mods, deletes, reads, and so on) to for CIM compliance? One might think the "Data Access" DM, but the fields are not even close; the Endpoint.Filesystem dataset appears to be my best option.
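If you do settle on Endpoint.Filesystem, one quick way to sanity-check the mapping once your tags and field aliases are in place - a sketch, assuming your Qumulo events land in a sourcetype literally called qumulo (a placeholder):

| tstats count from datamodel=Endpoint.Filesystem where sourcetype=qumulo by Filesystem.action, Filesystem.file_path
| rename Filesystem.* AS *

Empty action or file_path columns usually point at the fields that still need aliases or EVAL-based normalisation.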
Running a dbxquery through jobs.export, my results are limited to 100k rows. Do I need to paginate streaming results? Here's my code:

data = {
    'adhoc_search_level': 'fast',
    'search_mode': 'normal',
    'preview': False,
    'max_count': 500000,
    'output_mode': 'json',
    'auto_cancel': 300,
    'count': 0
}
job = service.jobs.export(<dbxquery>, **data)
reader = results.JSONResultsReader(job)
lst = [result for result in reader if isinstance(result, dict)]

This runs correctly except that the results always stop at 100k rows; there should be over 200k.
When I installed it, I got login credentials, and with those I have logged in to the Splunk website and asked this question: when I enter the credentials it doesn't let me log in; it always shows "login failed".
I am trying to build an alert which will trigger whenever one of our AWS-hosted Active Directory domains gets replacement Domain Controllers, i.e., we don't control if/when they replace the servers. I already have a simple alert which counts how many unique DCs it sees per hosted domain, and then I can do a simple:

index=os sourcetype="xmlwineventlog"
(here I perform some clean-up to identify the 2 desired fields...)
| stats count by Domain, DC_hostname
| stats count by Domain
| where count>2

(and where the default number of DCs = 2, i.e., if there are more than that, AWS is in the process of replacing one or both.) The problem is that I lose the list of DCs. How can I filter out all the domains that just have the typical 2 DCs while still keeping the complete list of DCs from the non-typical domain?

FYI - this is what the search looks like before my final filter:

Domain         DC_hostname
----------     -----------
domain1        DC1
domain1        DC2
domain2        DC3
domain2        DC4
domain2        DC5

My current alert returns simply:

domain2

whereas I want it to return:

domain2        DC3
domain2        DC4
domain2        DC5
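A sketch of one way to keep the DC rows while dropping the typical two-DC domains, assuming Domain and DC_hostname are extracted as in the search above (with your existing clean-up between the base search and the first stats):

index=os sourcetype="xmlwineventlog"
| stats count by Domain, DC_hostname
| eventstats dc(DC_hostname) AS dc_count by Domain
| where dc_count > 2
| fields - dc_count

eventstats attaches the per-domain DC count to every row, so the where clause can filter whole domains without collapsing the DC_hostname list.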
As far as I know, the mv commands only create an MV field out of values from a single field - in a column, for example. I need to combine several fields into a single MV_field, but all these fields have different names. For example, I have field1, field2, field3, and I need a single MV_field containing the values of all of them. Also, it would be nice if this could be dynamic, in a way that lets me combine 'field*' into 'MV_field' with all the values. I am able to combine the different fields using eval's mvappend function, but it doesn't take wildcards. For example, "| eval MV_field=mvappend(field1,field2,field3)" works, but there isn't always the same number of fields. It would be really nice to be able to do "| eval MV_field=mvappend(field*)" to simply catch all the fields that exist and throw them into a single MV_field. Is this possible?
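A sketch using foreach to get the wildcard behaviour; this relies on mvappend skipping null arguments, so the first matching field simply seeds MV_field:

| foreach field* [ eval MV_field = mvappend(MV_field, '<<FIELD>>') ]

The <<FIELD>> token is foreach's placeholder for each matched field name, so every field matching field* gets appended regardless of how many exist.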
I have a field named "code_value" which has values as follows:

code_value
ABC-123 JHLIK
ABC-456 LKJF
ABC-781 klklk
ABC-22 olsd

Now, how do I extract from the code_value field anything that comes before a space? Something like below:

new_field_derived_from_code_value
ABC-123
ABC-456
ABC-781
ABC-22
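Two equivalent sketches (new_field is just an example name):

| rex field=code_value "^(?<new_field>\S+)"

or

| eval new_field = mvindex(split(code_value, " "), 0)

Both keep everything up to the first space and drop the rest.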
I'm deploying Splunk to monitor pods on Kubernetes, but we want to capture every event from every pod (standard output). Is it possible to capture that without creating a persistent volume? Regards
Hello everyone, I have built a search that returns the email sender address as sender, its recipient list as recipient, and the number of emails received. One event looks like this:

sender                            recipient            nr of emails sent
user.sender@outsidecompany.com    user1@company.com    16
                                  user2@company.com
                                  user3@company.com
                                  user4@company.com
                                  user5@company.com
                                  user6@company.com
                                  user7@company.com

I want to restrict the recipient field to 10 recipients or more, because, let's say, I'm not interested in seeing outside emails from a sender that has sent an email to fewer than 10 people inside company.com. Do you have any idea? Best regards.
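A sketch, appended to your existing search and assuming recipient is a multivalue field on each result row exactly as in the example:

| eval recipient_count = mvcount(recipient)
| where recipient_count >= 10
| fields - recipient_count

If instead each recipient comes back as its own row, an eventstats dc(recipient) by sender before the where would play the same role.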