All Topics



In the current version 3.3.1 of Splunk DB Connect, this bug makes adding a new database source impossible, since the 'Execute SQL' preview is a requirement for adding inputs in the App (there doesn't seem to be a way to bypass it). Any word on a possible upgrade to the app to fix this? Or have people been successful in reverting to an older version of the App (and if so, which one)? Many thanks, Jon
Hello, I recently asked a question about how to extract fields or get a table from a JSON input (https://community.splunk.com/t5/Splunk-Search/Field-extraction/m-p/517524#M145531). The solution, provided in one of the responses, was: index=_internal | head 1 | eval _raw = "[{\"Type\":\"Attention\",\"ABUSE\":18,\"GSD 24x7\":1,\"CLOUD\":0,\"DC\":0,\"ECL\":0,\"ITMS\":0,\"NET\":0,\"RFO\":17,\"Total\":36},{\"Type\":\"Active\",\"ABUSE\":0,\"GSD 24x7\":22,\"CLOUD\":38,\"DC\":5,\"ECL\":1,\"ITMS\":0,\"NET\":12,\"RFO\":2,\"Total\":80},{\"Type\":\"Total\",\"ABUSE\":18,\"GSD 24x7\":23,\"CLOUD\":38,\"DC\":5,\"ECL\":1,\"ITMS\":0,\"NET\":12,\"RFO\":19,\"Total\":116},{\"Type\":\"P1\",\"ABUSE\":0,\"GSD 24x7\":0,\"CLOUD\":0,\"DC\":0,\"ECL\":0,\"ITMS\":0,\"NET\":0,\"RFO\":6,\"Total\":6},{\"Type\":\"P2\",\"ABUSE\":0,\"GSD 24x7\":1,\"CLOUD\":0,\"DC\":0,\"ECL\":0,\"ITMS\":0,\"NET\":0,\"RFO\":10,\"Total\":11},{\"Type\":\"P3\/4\",\"ABUSE\":18,\"GSD 24x7\":0,\"CLOUD\":0,\"DC\":0,\"ECL\":0,\"ITMS\":0,\"NET\":0,\"RFO\":1,\"Total\":19}]" | rename COMMENTS AS "Previous lines generate your sample data, you get it by indes=xxx" | spath | rename {}.* as json_* | table json_* (Thank you @isoutamo). Now I have a follow-up request. I have the table, but I would like to add some stats to the numbers. This information is related to the shifts we have in rotation, and I would like to add, for example, a simple difference between how many events there were at the beginning and at the end of the shift. The issue is that even with the table, the fields are not exactly fields I can filter on. Any ideas? Thank you in advance!
PS: this is the raw data: [{"Type":"Attention","ABUSE":6,"GSD 24x7":3,"CLOUD":1,"DC":0,"ECL":0,"ITMS":0,"NET":0,"RFO":15,"Total":25},{"Type":"Active","ABUSE":0,"GSD 24x7":12,"CLOUD":44,"DC":9,"ECL":2,"ITMS":0,"NET":13,"RFO":1,"Total":81},{"Type":"Total","ABUSE":6,"GSD 24x7":15,"CLOUD":45,"DC":9,"ECL":2,"ITMS":0,"NET":13,"RFO":16,"Total":106},{"Type":"P1","ABUSE":0,"GSD 24x7":0,"CLOUD":0,"DC":0,"ECL":0,"ITMS":0,"NET":0,"RFO":6,"Total":6},{"Type":"P2","ABUSE":0,"GSD 24x7":1,"CLOUD":0,"DC":0,"ECL":0,"ITMS":0,"NET":0,"RFO":9,"Total":10},{"Type":"P3\/4","ABUSE":6,"GSD 24x7":2,"CLOUD":1,"DC":0,"ECL":0,"ITMS":0,"NET":0,"RFO":0,"Total":9}]
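A possible starting point for the shift delta, sketched under the assumption that each event carries one of these JSON arrays and that the index/sourcetype names are placeholders. In both samples the third array element is the "Type":"Total" row, so mvindex(..., 2) picks its grand total; earliest/latest then give the values of the first and last event in the searched shift window:

```
index=my_index sourcetype=my_json earliest=-8h@h
| spath
| eval grand_total = mvindex('{}.Total', 2)
| stats earliest(grand_total) AS shift_start latest(grand_total) AS shift_end
| eval delta = shift_end - shift_start
```

Adjust the earliest/latest time bounds to match the actual shift boundaries; if the row order in the JSON is not stable, match on the Type field instead of using a fixed index.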
Hi, I am getting the following error on my application/dashboard: "Error in 'eval' command: The expression is malformed." The query being triggered is: | makeresults count=1 | eval id=$incident_id$ | sendalert canary_acknowledge_incident param.incident_id=$incident_id$ param.index_name="main" This is triggered when a Submit button is clicked. The Submit button is tied to a dropdown of values, defined as below:
<input type="dropdown" token="incident_id" searchWhenChanged="false">
  <label>Incident to Close</label>
  <fieldForLabel>id</fieldForLabel>
  <fieldForValue>id</fieldForValue>
  <search>
    <query>`canary_tools_index` sourcetype="canarytools:incidents" | stats values(id) as id | mvexpand id</query>
    <earliest>-30d@d</earliest>
    <latest>now</latest>
  </search>
</input>
Running that dropdown-populating query in the Search tool returns values such as: incident:canarytoken:80f36193721b94fb268bb6df:<source_ip>:<epoch_timestamp> incident:canarytoken:80f36193721b94fb268bb6df:<source_ip>:<epoch_timestamp> Previous questions on this forum point towards field names in the `eval` command not working whenever they start with a numeric character, but that is not the case in my issue, as I am using `eval id=$incident_id$`. This is happening on Splunk 8.0.0.
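One thing worth checking (a sketch, not a confirmed fix): the dropdown values contain colons, so when the token is substituted unquoted, eval sees something like id=incident:canarytoken:..., which is a malformed expression. Quoting the token turns it into a string literal:

```
| makeresults count=1
| eval id="$incident_id$"
| sendalert canary_acknowledge_incident param.incident_id="$incident_id$" param.index_name="main"
```

The sendalert parameters are quoted here as well so the same substitution problem does not recur in the alert action arguments.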
Hello. I'm currently somewhat confused using Splunk Enterprise, version 7.2.8. I need to use the table command like below, but the command outputs each value twice, as you can see below. [command] host="myhost" index="myindex" sourcetype="mytype" source="mysource" | table field1, field2 [results - table] For your information, our org configured the Splunk system like this: splunk universal forwarder -> splunk heavy forwarder -> splunk indexer <- search head. The above information is sent from a server running a Splunk universal forwarder. Any idea how to solve this problem? Thanks.
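If the underlying events are genuinely duplicated (for example, ingested twice somewhere along the forwarding chain), a sketch like the following, reusing the placeholder host/index/sourcetype values from the question, collapses identical rows:

```
host="myhost" index="myindex" sourcetype="mytype" source="mysource"
| dedup field1 field2
| table field1, field2
```

Note that dedup only hides the duplicates at search time; checking inputs.conf on the forwarders for overlapping monitor stanzas would address the root cause.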
I have an average of 100 events per minute coming into the Splunk _internal index on an instance that is not very busy and is used by 2 people. I reduced the bucket size to allow the data to roll over sooner and avoid a disk space error. Are there any configurations I'm missing that could slow down the incoming events?
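Before changing any configuration, it helps to see what is actually filling _internal. A quick breakdown by sourcetype, source, and component (the component field is populated for splunkd logs; fillnull keeps rows without it from being dropped by the stats by-clause):

```
index=_internal earliest=-60m@m
| fillnull value="n/a" component
| stats count by sourcetype, source, component
| sort - count
```

Whichever sourcetype or component dominates the output points at the logging category to tune.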
Hello everyone, I want to install the CTF Scoreboard in Splunk. I followed the steps here: https://github.com/splunk/SA-ctf_scoreboard. However, I don't get the log files scoreboard.log and scoreboard_admin.log in the directory /opt/splunk/var/log. After uploading all the questions and answers in my scoreboard application, I can't submit my responses in the app: I get a "404 Not Found" error (see the attached figures). Can anyone help me, please? It's urgent! Thanks a lot.
Hi Team, When I search Splunk for Windows events, I get the results in XML format. Is there any way to change the view from XML to a more easily readable format?
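If the events come in through the Windows event log input, the XML rendering is controlled by the renderXml setting in inputs.conf on the forwarder. A sketch, assuming the Security channel (the stanza name is a placeholder for whichever channels you collect):

```
[WinEventLog://Security]
renderXml = false
```

This only affects newly indexed events. For events already indexed as XML, the search-time xmlkv command can pull key/value pairs out at search time.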
Hi Splunkers, We recently migrated to Splunk Search Head Clustering. We are using a load balancer in front of 3 clustered search heads so that users can access the set of search heads through a single interface, without needing to specify a particular one. Now we notice that the email alerts we receive come randomly from any of the 3 clustered search heads; that is, the email alert sender for the Splunk SHC can be any of the 3 members. Is there a way to make the alert appear to be sent by the load balancer name, so that when users receive alerts, the email sender is uniform?
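A sketch of one approach, assuming the goal is a uniform From address and a uniform host in alert links: the email alert action reads both from alert_actions.conf, which can be set identically on all three members (pushed via the deployer). The addresses and URL below are placeholders:

```
[email]
from = splunk-alerts@example.com
hostname = https://splunk-lb.example.com:8000
```

The hostname setting controls the host used in links embedded in alert emails; from controls the sender address shown to recipients.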
Hello, Has anyone integrated AppDynamics with Cherwell? Kindly share the approach for how to integrate them. Regards, Rashmi
Hi Everyone, Can anyone help me create indexed fields in accelerated data models in order to work with tstats? Thanks & Regards, Manikanth  
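For context, once a data model is accelerated, tstats can query its fields directly from the summaries. A minimal sketch against the CIM Authentication model, assuming it exists and is accelerated in your environment:

```
| tstats summariesonly=true count FROM datamodel=Authentication
    WHERE Authentication.action="failure"
    BY Authentication.user, Authentication.src
```

summariesonly=true restricts the search to accelerated data, which is what makes tstats fast here.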
Can someone help with a query to identify any events which could align with existing data models, i.e. events that contain information like users, actions performed, session IDs, etc., for a specific sourcetype and index? Do we need to extract fields like user, action performed, and session ID, and can someone help with the query?
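As a first step toward mapping a sourcetype to a data model, a field inventory shows which fields (user, action, session_id, and so on) already exist versus which still need extraction; the index and sourcetype names below are placeholders:

```
index=my_index sourcetype=my_sourcetype
| fieldsummary
| table field, count, distinct_count
| sort - count
```

Fields that appear here can be aliased or tagged to the CIM field names the target data model expects; missing ones need extractions first.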
I have a script that can be run by an alert, but if I run the script manually, the following error is shown. Any idea how I can run it manually? I have added the Python location in the first line: #!/data/splunk/bin/python
[root@hostname scripts]# ./REST_1.py
ERROR:root:code for hash sha1 was not found.
Traceback (most recent call last):
  File "/data/splunk/lib/python2.7/hashlib.py", line 147, in <module>
    globals()[__func_name] = __get_hash(__func_name)
  File "/data/splunk/lib/python2.7/hashlib.py", line 97, in __get_builtin_constructor
    raise ValueError('unsupported hash type ' + name)
ValueError: unsupported hash type sha1
Traceback (most recent call last):
  File "./REST_1.py", line 7, in <module>
    import requests
  File "/data/splunk/lib/python2.7/site-packages/requests/__init__.py", line 58, in <module>
    from . import utils
  File "/data/splunk/lib/python2.7/site-packages/requests/utils.py", line 25, in <module>
    from .compat import parse_http_list as _parse_list_header
  File "/data/splunk/lib/python2.7/site-packages/requests/compat.py", line 7, in <module>
    from .packages import chardet
  File "/data/splunk/lib/python2.7/site-packages/requests/packages/__init__.py", line 3, in <module>
    from . import urllib3
  File "/data/splunk/lib/python2.7/site-packages/requests/packages/urllib3/__init__.py", line 16, in <module>
    from .connectionpool import (
  File "/data/splunk/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 36, in <module>
    from .connection import (
  File "/data/splunk/lib/python2.7/site-packages/requests/packages/urllib3/connection.py", line 43, in <module>
    from .util import (
  File "/data/splunk/lib/python2.7/site-packages/requests/packages/urllib3/util/__init__.py", line 10, in <module>
    from .ssl_ import (
  File "/data/splunk/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py", line 2, in <module>
    from hashlib import md5, sha1
ImportError: cannot import name sha1
Hi, I tried to create an alert for when a host listens on a suspicious port. Can anyone help me out with this?
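A sketch of one approach, under the assumption that listening-port data is populated into the CIM Endpoint data model (for example by the *nix or Windows add-ons) and that the port list below is a placeholder for your own suspicious-port list:

```
| tstats count FROM datamodel=Endpoint.Ports
    WHERE Ports.state="listening" Ports.dest_port IN (1337, 4444, 31337)
    BY Ports.dest, Ports.dest_port
```

Saved as an alert that triggers when the result count is greater than zero, this fires whenever a monitored host is seen listening on one of the listed ports.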
Hi, I want to create a choropleth map dashboard highlighting 2 countries, for example India in red and the US in green. Please help with a query where we can hardcode the values for the countries.
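A minimal hardcoded sketch using the built-in geo_countries geometry lookup; the numeric values are arbitrary markers, and the red/green coloring is then configured in the choropleth visualization's format options with one color range per value:

```
| makeresults
| eval country="India", value=1
| append [| makeresults | eval country="United States", value=2]
| geom geo_countries featureIdField=country
```

The country names must match the feature names in the geo_countries lookup for the shapes to render.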
I have an IP address after the word "Source" that I want to extract into a field (e.g. clientIP), then use that field with the iplocation database to look up city and country information and put it in a table.
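A sketch, assuming the IP immediately follows the literal word "Source" in the raw event (the base search is a placeholder, and the regex should be adjusted to the real event layout):

```
index=my_index "Source"
| rex "Source\s+(?<clientIP>\d{1,3}(?:\.\d{1,3}){3})"
| iplocation clientIP
| table clientIP, City, Country
```

iplocation adds City and Country (among other fields) from Splunk's built-in geolocation database, so they can go straight into the table.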
Hi Splunk team, I would like to ask if we can alert users on 2-out-of-3 OOC (out of control) points, grouped by host and ordered by time. Given time-based values, we would like to: 1. group by host first; 2. sort each host group by time; 3. split into smaller groups of size 3 and check whether 2 out of 3 points in the smaller group have OOC data (3 out of 3 also counts as a true condition for the alert). For example, with the OOC condition "value greater than 2":
host  time  value  OOC
A     0:01  1      NO
A     0:02  3      YES
A     0:02  1      NO
A     0:03  3      YES
A     0:04  3      YES
A     0:06  3      YES
B     0:06  1      NO
B     0:08  3      YES
B     0:09  5      YES
(In the original post the three groups of rows were highlighted in orange, green, and blue.) Given this data, we would alert, because the second and third groups fulfill the alert condition; the first group does not, because only 1 of its 3 points is OOC. Please let me know if it's possible to alert this way. Thank you.
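A sliding-window sketch of the 2-of-3 check using streamstats; the index and the value field name are placeholders, and the threshold of 2 comes from the example's OOC condition:

```
index=my_index
| sort 0 host, _time
| eval ooc = if(value > 2, 1, 0)
| streamstats window=3 sum(ooc) AS ooc_in_window by host
| where ooc_in_window >= 2
```

One caveat on the design: streamstats window=3 evaluates every sliding window of three consecutive events per host, not strictly disjoint groups of three, which may or may not match the intended grouping.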
Hi, I have a need for field extraction. I have a sourcetype that holds compliance-related information for our use case. This data has a field named "Text", and the data arrives in several variations; below are two of the many variations. I need a regex extraction that can detect the fields within the tags and parse them out. Data cardinality will be keyed by: <cm:compliance-check-id>36c4d07cc410439bf3bf79f7f5942672</cm:compliance-check-id> Sample 1: <cm:compliance-result>WARNING</cm:compliance-result> <cm:compliance-actual-value>Error -- evaluation period has ended</cm:compliance-actual-value> <cm:compliance-check-id>36c4d07cc410439bf3bf79f7f5942672</cm:compliance-check-id> <cm:compliance-policy-value>WARNING</cm:compliance-policy-value> <cm:compliance-check-name>Connection error</cm:compliance-check-name> Sample 2: <compliance>true</compliance> <cm:compliance-check-name>WN10-00-000005 - Domain-joined systems must use Windows 10 Enterprise Edition 64-bit version - 64-bit</cm:compliance-check-name> <cm:compliance-audit-file>DISA_STIG_Windows_10_v1r20.audit</cm:compliance-audit-file> <cm:compliance-check-id>55aeff4f26d6b8307f6f9672750a5548</cm:compliance-check-id> <cm:compliance-actual-value>'64-bit'</cm:compliance-actual-value> <cm:compliance-policy-value>'64-bit'</cm:compliance-policy-value> <cm:compliance-info> Features such as Credential Guard use virtualization based security to protect information that could be used in credential theft attacks if compromised. There are a number of system requirements that must be met in order for Credential Guard to be configured and enabled properly. Virtualization based security and Credential Guard are only available with Windows 10 Enterprise 64-bit version. 
</cm:compliance-info> <cm:compliance-result>PASSED</cm:compliance-result> <cm:compliance-reference>800-171|3.4.1,800-53|CM-8,CAT|II,CCI|CCI-000366,CN-L3|8.1.10.2(a),CN-L3|8.1.10.2(b),CSF|DE.CM-7,CSF|ID.AM-1,CSF|ID.AM-2,CSF|PR.DS-3,ISO/IEC-27001|A.8.1.1,ITSG-33|CM-8,NESA|T1.2.1,NESA|T1.2.2,NIAv2|NS35,Rule-ID|SV-77809r3_rule,STIG-ID|WN10-00-000005,Vuln-ID|V-63319</cm:compliance-reference> <cm:compliance-see-also>https://dl.dod.cyber.mil/wp-content/uploads/stigs/zip/U_MS_Windows_10_V1R20_STIG.zip</cm:compliance-see-also> Thanks in advance!
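A sketch of targeted search-time extractions for the tags that appear in both samples; [^<]+ stops at the closing tag, and similar rex lines can be added for the remaining cm: fields as needed:

```
| rex field=Text "<cm:compliance-check-id>(?<compliance_check_id>[^<]+)</cm:compliance-check-id>"
| rex field=Text "<cm:compliance-result>(?<compliance_result>[^<]+)</cm:compliance-result>"
| rex field=Text "<cm:compliance-check-name>(?<compliance_check_name>[^<]+)</cm:compliance-check-name>"
| table compliance_check_id, compliance_result, compliance_check_name
```

Since compliance-check-id is the cardinality key, a stats or dedup by compliance_check_id after these extractions gives one row per check.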
I am from Hyderabad, India. Can you please advise how to get a Splunk internship?
Hi, I am trying to set up IIS logs forwarded to Splunk Enterprise. I am a bit confused as I'm new to Splunk, but I have installed the IIS add-on on Splunk. Do I need to copy the Splunk_TA_microsoft-iis folder to the server with the IIS logs, or do I just configure the inputs.conf on the forwarder?
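For reference, the usual pattern is to install the add-on on the universal forwarder on the IIS server (so its parsing settings travel with the data) and enable a monitor input there. A sketch of inputs.conf; the log path and sourcetype are assumptions, so check the add-on's documentation for the exact sourcetype it expects:

```
[monitor://C:\inetpub\logs\LogFiles]
sourcetype = ms:iis:auto
recursive = true
disabled = 0
```

The add-on should also be installed on the search head so its search-time field extractions apply.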
I have been trying to figure out a search that can track failed logon events over time, but I'm really struggling to identify a workable solution (if there is one). My initial search query was: index=wineventlog EventCode=4625 NOT TargetUserName="*$" | eval User=TargetDomainName."/".TargetUserName | transaction User EventCode maxspan=1d | stats values(User) by signature Reading some other threads indicated that 'transaction' isn't very efficient and to use streamstats or eventstats instead, so I came up with: index=wineventlog EventCode=4625 NOT TargetUserName="*$" | eval User=TargetDomainName."/".TargetUserName | eventstats sum(User) as Failed_Count by signature | where Failed_Count >=3 | table User signature Failed_Count However, this doesn't give me any results. My aim is to search over a 7-day period and show stats per day for each user by signature. This would help with identifying bad scripts or possible brute-force attempts, including spray attacks over a long period.
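One likely reason the eventstats attempt returns nothing: sum() over a string field like User produces no numeric result, so Failed_Count is null and the where clause filters everything out. A sketch that counts events per day instead (bin buckets events by day, matching the 7-day-with-daily-stats goal):

```
index=wineventlog EventCode=4625 NOT TargetUserName="*$"
| eval User=TargetDomainName."/".TargetUserName
| bin _time span=1d
| stats count AS Failed_Count by _time, User, signature
| where Failed_Count >= 3
```

Run over a 7-day time range, this yields one row per day, user, and signature with at least 3 failures, which suits both the bad-script and spray-attack cases.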