All Topics

Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.


I am attempting to take traffic logs over an arbitrary period of time, use the number of accesses and the times of those accesses, and map that to a normal distribution model. My experience with Splunk is limited, but I am sure there must be a way to do this, right? My current line of thought is to discretize the data into some span of time, take the count of traffic in each period, and put it into buckets, then use those buckets to map the data to a normal distribution, from which I should be able to detect shifts in the data.

index="*" (host="192.168.0.*" OR dest="192.168.0.*") AND (dest!="198.24.14.0" AND src!="198.24.14.0")
| timechart span=15min distinct_count(src)

The above search is the baseline of what I have and represents the data I am working with (obscured, of course). Assume arbitrary use of the span=<time> and dc(<field>) arguments; this won't be the only way the search is applied. I've searched many previous threads and have tried using bin in nearly every way I can find. Unfortunately, it appears that timechart interferes with bin and returns output I can't make use of. I have also tried manipulating _time with host/src/dest in intervals of time, but I don't know a way to get that into a variable to use later in the search. Any help would be appreciated, but I am trying not to use apps outside of Search & Reporting. Thanks in advance.
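The bucket-then-fit idea can be prototyped outside Splunk to sanity-check the approach. A minimal Python sketch, assuming the timechart output is a list of per-bucket counts (the numbers below are made-up sample data, not from the post):

```python
from statistics import mean, stdev

# Made-up distinct-source counts per 15-minute bucket (the timechart output).
bucket_counts = [12, 15, 11, 14, 13, 16, 12, 14, 15, 13]

# Fit a normal distribution to the baseline buckets.
mu = mean(bucket_counts)
sigma = stdev(bucket_counts)

def z_score(count):
    """How many standard deviations a new bucket is from the baseline."""
    return (count - mu) / sigma

# A bucket far outside the fitted normal (e.g. |z| > 3) flags a shift in traffic.
print(z_score(25))
```

In SPL the same idea is usually expressed by computing avg and stdev of the bucketed counts (e.g. with eventstats) and flagging buckets whose deviation exceeds a threshold.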
Hi all,

We are in the process of upgrading Splunk from version 7.3.3 to 8.0.6. After running the Upgrade Readiness app and resolving all the blocker/warning items, we proceeded with the upgrade. We are able to start the splunkd service, but the web UI is stuck at "waiting for web server" forever. Checking web_service.log, we find this error trace:

2021-01-07 14:30:20,566 ERROR [5ff70cec707fc132868e50] root:772 - Unable to start splunkweb
2021-01-07 14:30:20,566 ERROR [5ff70cec707fc132868e50] root:773 - 'dict' object has no attribute 'iteritems'
Traceback (most recent call last):
  File "/opt/splunk/searchhead/lib/python3.7/site-packages/splunk/appserver/mrsparkle/root.py", line 132, in <module>
    from splunk.appserver.mrsparkle.controllers.top import TopController
  File "/opt/splunk/searchhead/lib/python3.7/site-packages/splunk/appserver/mrsparkle/controllers/top.py", line 27, in <module>
    from splunk.appserver.mrsparkle.controllers.admin import AdminController
  File "/opt/splunk/searchhead/lib/python3.7/site-packages/splunk/appserver/mrsparkle/controllers/admin.py", line 17, in <module>
    import formencode
  File "/opt/splunk/searchhead/lib/python3.7/site-packages/formencode/__init__.py", line 9, in <module>
    from formencode import validators
  File "/opt/splunk/searchhead/lib/python3.7/site-packages/formencode/validators.py", line 15, in <module>
    import dns.resolver
  File "/opt/splunk/searchhead/etc/apps/generateblocklist_app/bin/dns/resolver.py", line 32, in <module>
    import dns.flags
  File "/opt/splunk/searchhead/etc/apps/generateblocklist_app/bin/dns/flags.py", line 51, in <module>
    _by_value = dict([(y, x) for x, y in _by_text.iteritems()])
AttributeError: 'dict' object has no attribute 'iteritems'

Does anyone know how to solve this issue? Thanks in advance.

Best regards.
Hi, I'm having an issue with setting the sourcetype in transforms.conf when sending the data of a single file to an index. The data is sent to the other index successfully, but with the wrong sourcetype. Here are my conf files:

props.conf:

[snmp-traps_cisco-prime]
DATETIME_CONFIG =
NO_BINARY_CHECK = true
category = Custom
description = Sourcetype Generico SNMP TRAPS CISCO PRIME
pulldown_type = true
disabled = false
TRANSFORMS-reenvioindexes_cambiostype = aruba

transforms.conf:

[aruba]
REGEX = \[UDP\:\s\[115\.100\.9\.100\]
DEST_KEY = _MetaData:Index
FORMAT = aruba
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype: stm

P.S.: I'm trying to assign an Aruba Networks sourcetype to an SNMP trap. Thanks in advance. Diego
I have a standalone Splunk Enterprise instance and have been having an issue with my licensing. I've been receiving the following error:

"Error in 'litsearch' command: Your Splunk license expired or you have exceeded your license limit too many times. Renew your Splunk license by visiting www.splunk.com/store or calling 866.GET.SPLUNK."

I checked my licensing, and there's an error regarding an indexer: "slave had no matching license pool for the data it indexed". The license pool is still valid, and the indexer is already part of the license pool and is the local server. What might the problem be?
client_type = 'JDBC_DRIVER' , client_version = '3.9.2'

The above is the exact value in the lookup.

| rex field=clienttype_minimumversion_details max_match=0 "client_type\s=\s'(?<REPORTED_CLIENT_TYPE>.*?(?='\s,))"
| rex field=clienttype_minimumversion_details max_match=0 "client_version\s=\s'(?<MINIMUM_VERSION_REQUIRED>.*?(?='))"

Using the above, I am extracting two fields.

| eval version=tonumber(trim(MINIMUM_VERSION_REQUIRED))
| eval type=typeof(version)

The output of typeof(version) is Invalid. I need MINIMUM_VERSION_REQUIRED in number format so that I can compare it to another numeric field in the logs. I tried tonumber and convert; neither works.
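For what it's worth, a dotted version string like '3.9.2' is not a single number (it has two decimal points), which is why a straight numeric conversion fails in any language. Comparing versions usually means splitting on the dots and comparing the segments as integers. A Python sketch of that idea:

```python
def parse_version(s):
    """Split a dotted version string into a tuple of ints, e.g. '3.9.2' -> (3, 9, 2)."""
    return tuple(int(part) for part in s.strip().split("."))

# float("3.9.2") would raise ValueError: two dots make the string non-numeric.
# Tuple comparison is segment-wise, which is what version ordering needs:
assert parse_version("3.9.2") < parse_version("3.10.0")
# A plain string comparison would get this one wrong ("10..." < "9..."):
assert parse_version("10.0.0") > parse_version("9.99.99")
```

In SPL the analogous trick is to split the version on "." and compare the pieces with tonumber segment by segment.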
Hi, I tried building a time series forecasting model using the Kalman algorithm. However, it produces date-wise predictions, while my business requirement is that prediction should happen by date and time: a forecast for a given day and time. Can you please help me accomplish a date-and-time forecast model? Regards, Rajyalakshmi Alluri
Hi, I need to replace a string in a field value:

role_seu_458137407337_prd-sso-data-science-752-2205-compute-role

The suffix compute-role should be replaced to produce the two other field values below:

role_seu_458137407337_prd-sso-data-science-752-2205-pl
role_seu_458137407337_prd-sso-data-science-752-2205-ds

I have tried the replace command (replace compute-role WITH pl IN ad); it didn't work. What would be the solution for this? Please help.
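As an illustration of the transformation being asked for, replacing the fixed trailing suffix to derive the two variants might look like this in Python (the field name ad and the suffixes are the ones quoted in the post):

```python
import re

ad = "role_seu_458137407337_prd-sso-data-science-752-2205-compute-role"

# Replace the trailing "compute-role" with each desired suffix.
variants = [re.sub(r"compute-role$", suffix, ad) for suffix in ("pl", "ds")]
print(variants)
```

The anchored pattern matters: only the suffix at the end of the value is swapped, and the rest of the role name is preserved unchanged.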
Hi, we would like to move our Grafana dashboards to Splunk. I'm fairly new to Splunk, with limited experience. Any ideas? Thanks in advance! Br /RR
Hello helpful people,

I'm afraid I have an issue that is related to many questions already asked, but I have not been able to come up with a solution. I have a log file that creates large events, more than 257 lines at a time. To test the file, I took an extract and uploaded it manually. Using this file, I was able to create the props.conf entry shown below, and the events ingested correctly, without breaking. When I applied this to our clustered environment, the breaking returned.

Events:

++++ information 2021-01-06 16:38:53 host = xxxx.xxxx.net process = 00002fa8 thread = 73ffe380 context = Server::calculate(), module
Request failed with error(s):
<?xml version='1.0'?>
[031004] Variable has no value.
[035006] Cannot have child <xxxxx[E.3] (B6I2)> (xxx) on link xxxxxxxxxxxxxxxxxxx (B6I1)>
</clc:Error>
</xxxx__xxxxx_xxxx_xxx_f123_2>

props.conf:

[source::///xxxx/Log/xxxxServer.log]
SHOULD_LINEMERGE = true
MAX_EVENTS = 10000
TIME_PREFIX = \+\+\+\+ \w+

The reason I am using source and not sourcetype is that this source file is common to a number of environments, and I am already changing sourcetype using props and transforms to determine the sourcetype per server name.

Thanks in advance for the help, much appreciated.
So basically I have some network logs, and my base search filters down to source IP, destination IP, destination port, and protocol. I am trying to figure out a way to iterate over all events and group them based on the 4-tuple data mentioned above. Below is an image of what I am thinking. You'll notice I added an ID field; I figured that would somehow be the equivalent of ID += 1 at the end of a loop or something.

The original data is Bro logs. If you need a list of all the fields, below is a link; the source is conn.log (the green box on the first page):
http://gauss.ececs.uc.edu/Courses/c6055/pdf/bro_log_vars.pdf
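The grouping being described, giving every event that shares the same (src, dst, dport, proto) 4-tuple the same ID and incrementing the ID for each new tuple, can be sketched in plain Python (the field names here are illustrative, not the actual Bro/Zeek conn.log names):

```python
def assign_group_ids(events):
    """Give each distinct (src, dst, dport, proto) 4-tuple an incrementing ID."""
    ids = {}  # 4-tuple -> assigned ID
    for event in events:
        key = (event["src"], event["dst"], event["dport"], event["proto"])
        if key not in ids:
            ids[key] = len(ids) + 1  # the "ID += 1" step from the post
        event["id"] = ids[key]
    return events

events = [
    {"src": "10.0.0.1", "dst": "10.0.0.9", "dport": 443, "proto": "tcp"},
    {"src": "10.0.0.2", "dst": "10.0.0.9", "dport": 53,  "proto": "udp"},
    {"src": "10.0.0.1", "dst": "10.0.0.9", "dport": 443, "proto": "tcp"},
]
print([e["id"] for e in assign_group_ids(events)])  # prints [1, 2, 1]
```

In SPL, grouping by the 4-tuple is normally done declaratively (e.g. stats by the four fields) rather than with an explicit loop; the dictionary above plays the role the by-clause plays there.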
How can I retrieve data from a Splunk dashboard or saved searches using SSIS? I am able to create the connection string to connect to the Splunk server using the Splunk ODBC Driver, but when I try to use the ODBC source in the data flow with the same connection manager, I am not able to get the list of views. Please let me know how I can achieve this task.
Hello, Splunk experts!

I'm currently working on making an awesome dashboard with Splunk Enterprise. I was searching for a way to accomplish this goal and found Splunk Dashboards (beta). Although I read the documentation, I still have some questions.

1. What does "beta" mean here? Does it mean it doesn't come with any guarantees or anything?
2. I need to embed my dashboard in another web server, and I found that EDFs are the way to do it, but I couldn't find documentation on using them. Also, is it possible to use not only Search & Reporting app dashboards but also Splunk Dashboards app dashboards? Does anybody have experience with this?
3. My dashboard is mostly about server and storage usage and events. I need to make some alerts on the usage trend. Is it possible to do that from a dashboard, or do I have to use another app or create a custom alert?

Thank you!
Hi all, I want to eliminate TrustedLocation = Zscaler from my Splunk search results. Below is my query and a screenshot. Please help me with the Splunk query. Thanks in advance.

index=test "vendorInformation.provider"=IPC
| eval Event_Date=mvindex('eventDateTime',0)
| eval UPN=mvindex('userStates{}.userPrincipalName',0)
| eval Logon_Location=mvindex('userStates{}.logonLocation',0)
| eval Event_Title=mvindex('title',0)
| eval Event_Severity=mvindex('severity',0)
| eval AAD_Acct=mvindex('userStates{}.aadUserId',0)
| eval LogonIP=mvindex('userStates{}.logonIp',0)
| eval Investigate=+"https://portal.azure.com/#blade/Microsoft_AAD_Acct
| stats count by Event_Date, Event_Title, Event_Severity UPN Logon_Location LogonIP Investigate
| lookup WeirMFAStatusLookup.csv userPrincipalName as UPN
| lookup Lookup_EMPADInfo.csv userPrincipalName as UPN
| lookup WeirSiteCode2IP.csv public_ip as LogonIP
| lookup ZscalerIP CIDR_IP as LogonIP
| lookup WeirTrustedIPs.csv TrustedIP as LogonIP
| fillnull value="Unknown Site" site_code
| eval AD_Location=st + ", " + c
| fillnull value="OK" MFAStatus
| eval TrustedLocation=if(isnull(TrustedLocation), ZLocation, TrustedLocation)
| rename site_code as LogonSiteCode
| table Event_Date, Event_Title, Event_Severity UPN LogonIP LogonSiteCode Logon_Location AD_Location TrustedLocation MFAStatus count Investigate
| sort - Event_Date

@isoutamo @saravanan90 @thambisetty @ITWhisperer @gcusello @to4kawa
Hi,

As everyone knows, there are multiple user agents, depending on the user's device. I am trying to produce the output below from the user agent using the table command.

Sample output fields:
os_family, os_version, device_brand_model, browser_engine, brow_engine_version, hardware_type, browser, browser_version

User agents and rex commands:

iPhone - Mozilla/5.0 (iPhone; CPU iPhone OS 14_2_1 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.0.1 Mobile/15E148 Safari/604.1
REX - \((?<hardware_type>\w+);\s+[^ ]+\s(?<os_family>\w+\s[^ ]+)\s+(?<os_version>\w+)\s[^ ]+\s[^ ]+\s\w+\s\w.\s(?<browser_engine>\w+)\/(?<brow_engine_version>\w+[^ ]+)\s+\(.+\)\s+(?<browser_version>\w+\/[^ ]+)\s+\w+\/\w+\s(?<browser>\w+)

Xiaomi - Mozilla/5.0 (Linux; U; Android 9; en-gb; Redmi Note 6 Pro Build/PKQ1.180904.001) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/71.0.3578.141 Mobile Safari/537.36 XiaoMi/MiuiBrowser/12.7.4-gn
REX - \(\w+;\s\w;\s(?<os_family>\w+)\s(?<os_version>\w+);\s[^ ]+\s(?<device_brand_model>\w+\s[^ ]+\s[^ ]+)\s[^ ]+\s[^ ]+\s(?<browser_engine>\w+)\/(?<brow_engine_version>\w+[^ ]+)\s\(.+\)\s\w+\/[^ ]+\s[^ ]+\s(?<hardware_type>\w+)\s[^ ]+\s(?<browser>\w+\/\w+)\/(?<browser_version>\w+[^ ]+)

OnePlus - Mozilla/5.0 (Linux; Android 10; ONEPLUS A6013) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.101 Mobile Safari/537.36
REX - \(\w+;\s(?<os_family>\w+)\s(?<os_version>\w+[^ ]+)\s+(?<device_brand_model>\w+\s[^ ]+)\s(?<browser_engine>\w+)\/(?<brow_engine_version>\w+[^ ]+)\s\(.+\)\s(?<browser>\w+)\/(?<browser_version>\w+[^ ]+)\s(?<hardware_type>\w+)

Windows - Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36 Edge/87.0.664.66
REX - \((?<os_family>\w+)\s+\w+\s+(?<os_version>[^;]+)[^\)]+\)\s(?<browser_engine>\w+)\/(?<brow_engine_version>\w+[^ ]+)\s\(.+\)\s[^ ]+\s[^ ]+\s(?<browser>\w+)\/(?<browser_version>\w+[^ ]+)

Macintosh - Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.0.2 Safari/605.1.15
REX - \((?<hardware_type>\w+);\s\w+\s+(?<os_family>\w+)\s(?<os_version>\w+\s[^ ]+\s[^ ]+)\s(?<browser_engine>\w+)\/(?<brow_engine_version>\w+[^ ]+)\s\(.+\)\s(?<browser_version>\w+\/[^ ]+)\s(?<browser>\w+)

Lenovo - Mozilla/5.0 (Linux; Android 6.0.1; Lenovo YT3-X90F) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.101 Safari/537.36
REX - \(\w+;\s(?<os_family>\w+)\s(?<os_version>\w+[^ ]+)\s+(?<device_brand_model>\w+\s\w+[^ ]+)\s+(?<browser_engine>\w+)\/(?<brow_engine_version>\w+[^ ]+)\s\(.+\)\s(?<browser>\w+)\/(?<browser_version>\w+[^ ]+)

Like the above, I have created multiple rex commands (for iPad, HP, Meizu, Vivo, Motorola, Lenovo, ZTE Blade, OnePlus, Xiaomi, Google Pixel, Android, LG, and Asus).

I would like to know whether I can run an SPL search with multiple rex commands in a single search, or how else I can obtain the expected output for all user agent details. Thanks.
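The per-device approach above amounts to trying each pattern in turn until one matches and keeping that pattern's named groups. A small Python sketch of that dispatch idea (the two patterns here are simplified stand-ins, not the full expressions from the post):

```python
import re

# Simplified per-device patterns; each yields the same named groups where possible.
PATTERNS = [
    # Windows desktop, e.g. "Mozilla/5.0 (Windows NT 10.0; Win64; x64) ..."
    re.compile(r"\((?P<os_family>Windows) NT (?P<os_version>[\d.]+)"),
    # Android device, e.g. "Mozilla/5.0 (Linux; Android 10; ONEPLUS A6013) ..."
    re.compile(r"\(Linux; (?P<os_family>Android) (?P<os_version>[\d.]+); "
               r"(?P<device_brand_model>[^)]+)\)"),
]

def parse_user_agent(ua):
    """Return the named groups of the first pattern that matches, else {}."""
    for pattern in PATTERNS:
        match = pattern.search(ua)
        if match:
            return match.groupdict()
    return {}

ua = "Mozilla/5.0 (Linux; Android 10; ONEPLUS A6013) AppleWebKit/537.36"
print(parse_user_agent(ua))
```

In SPL the analogous construction is a chain of rex commands over the same field: a rex that does not match simply leaves its fields empty, so later patterns fill in what earlier ones missed.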
Hi there! I am kind of new to Splunk, so I apologize if my wording is off, but I am trying to collect metrics from an unusual phone system, and the way it presents data is like this:

{
  "response": {
    "method": "switchvox.callQueues.getCurrentStatus",
    "result": {
      "call_queue": {
        "extension": "***",
        "strategy": "ring_all",
        "queue_members": {
          "queue_member": [
            {
              "paused_time": "15911",
              "completed_calls": "8",
              "paused_since": "",
              "talking_to_name": "",
              "login_type": "login",
              "order": "1",
              "login_time": "32963",
              "extension": "***",
              "max_talk_time": "661",
              "time_of_last_call": "2021-01-06 13:52:31",
              "paused": "0",
              "account_id": "***",
              "missed_calls": "14",
              "logged_in_status": "logged_off",
              "fullname": "***",
              "talking_to_number": "",
              "avg_talk_time": "443"
            },

It restates the variable names with each record. So far I have the REST API modular input pulling the data and doing some light translation on it (it still looks like an array, but it is at least identifying each extension). Is there a way to get Splunk to use the first portion of each response as a field name while tying it to the same record? For example, a record like the one above getting converted to:

Account_ID  login_time  missed_calls  paused_since  avg_talk_time  max_talk_time
**1         32963       14                          443            661
**2         32945       0                           250            450
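The reshaping being asked for, one row per queue_member entry with the JSON keys as column names, can be sketched in plain Python (the key names and the path into the payload are the ones from the sample above; the values are abbreviated):

```python
import json

payload = json.loads("""
{"response": {"result": {"call_queue": {"queue_members": {"queue_member": [
  {"account_id": "**1", "login_time": "32963", "missed_calls": "14",
   "paused_since": "", "avg_talk_time": "443", "max_talk_time": "661"},
  {"account_id": "**2", "login_time": "32945", "missed_calls": "0",
   "paused_since": "", "avg_talk_time": "250", "max_talk_time": "450"}
]}}}}}
""")

# Walk down to the array of per-member records.
members = payload["response"]["result"]["call_queue"]["queue_members"]["queue_member"]

# Each record already carries its field names; a table is just one row per record.
columns = ["account_id", "login_time", "missed_calls", "avg_talk_time", "max_talk_time"]
rows = [[m[c] for c in columns] for m in members]
print(rows)
```

In Splunk, field names extracted from such JSON typically come out with the full path (e.g. queue_member{}.account_id), and the multivalue results are expanded into one row per member; the sketch above just makes the intended end shape concrete.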
Hello,

I created an app, and now I am trying to figure out the permissions it needs, because it is getting deployed on my client's server with permissions drwx------. What are the recommended settings for the directory and the .conf files being deployed?
Hi there, I have a Splunk search whose result is a table, and I want to render the "99.99(Fail)" cell in red. I have tried the methods shown in the screenshots below, but they do not work. Is there a method that sets a column cell to red only if it contains "Fail"? Thanks!
Hi Splunkers, currently we have a single-instance deployment, i.e., a Splunk Enterprise instance with both the indexer and the search head on the same machine. We are planning to set up high availability; can you please point me to the correct documentation to look into?
Does anyone have an accurate logon/logoff duration SPL query that they would not mind sharing?
I have set my forwarder to fetch data from the Jenkins jobs folder. It fetches all the files except build.xml, which has content as shown below:

<?xml version='1.1' encoding='UTF-8'?>
<build>
  <actions>
    <hudson.model.CauseAction>
      <causeBag class="linked-hash-map">
        <entry>
          <hudson.model.Cause_-UserIdCause>
            <userId>xxxx</userId>
          </hudson.model.Cause_-UserIdCause>
          <int>1</int>
        </entry>
      </causeBag>
    </hudson.model.CauseAction>
  </actions>
  <queueId>103</queueId>
  <timestamp>1609965592275</timestamp>
  <startTime>1609965592280</startTime>
  <result>SUCCESS</result>
  <description>&lt;table cellpadding=&apos;4&apos; cellspacing=&apos;2&apos; bgcolor=&apos;#517cc1&apos;&gt;&lt;tr bgcolor=&apos;#FFFFFF&apos;&gt;&lt;td&gt;Build&lt;/td&gt;&lt;td&gt;6312&lt;/td&gt;&lt;/tr&gt;&lt;tr bgcolor=&apos;#FFFFFF&apos;&gt;&lt;td&gt;Build status&lt;/td&gt;&lt;td&gt;&lt;font color=&apos;green&apos;&gt;PASSED&lt;/font&gt;&lt;/td&gt;&lt;/tr&gt;&lt;tr bgcolor=&apos;#FFFFFF&apos;&gt;&lt;td&gt;Target Branch&lt;/td&gt;&lt;td&gt;c2c&lt;/td&gt;&lt;/tr&gt;&lt;tr bgcolor=&apos;#FFFFFF&apos;&gt;&lt;td&gt;Source Branch&lt;/td&gt;&lt;td&gt;hwwang_AOS-214546&lt;/td&gt;&lt;/tr&gt;&lt;tr bgc ......

How do I get the build.xml file parsed?
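For reference, the interesting fields in build.xml are plain child elements of <build>, which any XML parser can read. A quick Python sketch with the standard library, using a trimmed-down copy of the sample record above (element names are the ones shown in the post):

```python
import xml.etree.ElementTree as ET

# Trimmed-down build.xml containing the flat fields from the sample record.
build_xml = """<?xml version='1.1' encoding='UTF-8'?>
<build>
  <queueId>103</queueId>
  <timestamp>1609965592275</timestamp>
  <startTime>1609965592280</startTime>
  <result>SUCCESS</result>
</build>"""

root = ET.fromstring(build_xml)
# Map each direct child element to its text content.
record = {child.tag: child.text for child in root}
print(record["result"], record["queueId"])  # prints: SUCCESS 103
```

On the Splunk side, the equivalent concern is ingesting the whole file as one event (XML spans many lines) and then extracting these elements; the sketch just shows that the structure itself is straightforward to parse.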