All Topics



I am doing a CTF that provides logs to filter and work through. One of the questions asks for the time period between when the brute-force attack was carried out and when the last request was sent. To find the first timestamp I used

```
index=botsv1 imreallynotbatman.com source="stream:http" form_data=*username*passwd*
| regex "passwd=batman"
| table _time
| sort by _time
| head 1
```

and, similarly, for the last timestamp:

```
index=botsv1 imreallynotbatman.com source="stream:http" form_data=*username*passwd*
| regex "passwd=batman"
| table _time
| sort by _time
| tail 1
```

Each search query works fine by itself, but they don't work when used together. Also, trying

```
eval start_time = index=botsv1 imreallynotbatman.com source="stream:http" form_data=*username*passwd*
| regex "passwd=batman"
| table _time
| sort by _time
| head 1
```

throws the error: "Comparator '=' has an invalid term on the left hand side: start_time=index." How do I choose the first and last datetime from the resulting table without using two queries?
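A single-search approach (a sketch, untested against the BOTS v1 dataset; the field names first_seen and last_seen are arbitrary) is to let `stats` compute both ends of the range and the difference at once:

```
index=botsv1 imreallynotbatman.com source="stream:http" form_data=*username*passwd*
| regex "passwd=batman"
| stats earliest(_time) as first_seen latest(_time) as last_seen
| eval duration_seconds = last_seen - first_seen
| eval first_seen = strftime(first_seen, "%Y-%m-%d %H:%M:%S"),
       last_seen  = strftime(last_seen, "%Y-%m-%d %H:%M:%S")
```

Because `stats` collapses everything into one row, no `sort`, `head`, or `tail` is needed.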
I have two columns: one has the application name, the other the number of instances. I want to remove duplicate application names, but at the same time the instance count should show the sum of all instances for the same application name. I'm using dedup, but summing the instance count needs some other logic.

```
APPNAME  INSTANCECOUNT
sap      2
oracle   4
sap      2
git      2
oracle   4
```
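If the goal is one row per application with the summed count, a `stats sum(...) by ...` replaces both the dedup and the extra logic (a sketch; field names are taken from the sample above):

```
| stats sum(INSTANCECOUNT) as INSTANCECOUNT by APPNAME
```

Note that if the repeated rows are exact duplicates that should be counted once rather than added, a `| dedup APPNAME INSTANCECOUNT` before the `stats` would be needed instead.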
Currently we are able to ingest AWS CloudWatch logs into Splunk. Is it possible to ingest AWS X-Ray logs into Splunk in a similar way?
Hi, all! Here's my current time format. How could I adjust it from 2022-01-20 18:21:19,448 to 2022-01-20 18:00?
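Assuming the timestamp is already parsed into `_time`, one way (a sketch) is to snap to the hour with `relative_time` and re-render with `strftime`:

```
| eval rounded = strftime(relative_time(_time, "@h"), "%Y-%m-%d %H:%M")
```

If the value lives in a string field instead, it would first need `strptime(field, "%Y-%m-%d %H:%M:%S,%3N")` to convert it to epoch time before the rounding step.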
Hello, I have a condition: when the variable new_tag of the previous row equals 1 and the variable test_tag of the current row equals 1, I must subtract the start value of the current row from the start value of the previous row. I want the result of the subtraction written into the result column of the previous row. Unfortunately, I could only get the subtraction written into the result column of the current row. Could someone please help me? Thank you very much.
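One common pattern (an untested sketch, assuming events are sorted by _time and the fields are literally named new_tag, test_tag, and start) is to reverse the event order so that streamstats can pull the following row's values into the row that should receive the result:

```
| reverse
| streamstats current=f window=1 last(start) as next_start last(test_tag) as next_test_tag
| eval result = if(new_tag=1 AND next_test_tag=1, start - next_start, null())
| reverse
```

After the first `reverse`, "the previous row in the stream" is the next row in the original order, so the `eval` runs on the earlier row and computes previous start minus current start, exactly where the result should land.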
I'm receiving the below error. I am on Splunk Enterprise 8.1.3 and using the Solarwinds add-on version 1.2.0. The below is being generated on the heavy forwarder. The heavy forwarder is using Python version 3. We can successfully ping the Solarwinds server from this heavy forwarder. Does anybody have an idea as to what the issue might be here?

```
2022-02-01 11:59:28,107 +0000 log_level=ERROR, pid=5855, tid=Thread-4, file=ta_data_collector.py, func_name=index_data, code_line_no=113 | [stanza_name="OT_sowin_query"] Failed to index data
Traceback (most recent call last):
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/splunktacollectorlib/data_collection/ta_data_collector.py", line 109, in index_data
    self._do_safe_index()
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/splunktacollectorlib/data_collection/ta_data_collector.py", line 129, in _do_safe_index
    self._client = self._create_data_client()
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/splunktacollectorlib/data_collection/ta_data_collector.py", line 99, in _create_data_client
    self._data_loader.get_event_writer())
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/splunktacollectorlib/ta_cloud_connect_client.py", line 20, in __init__
    from ..core.pipemgr import PipeManager
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/core/__init__.py", line 1, in <module>
    from .engine import CloudConnectEngine
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/core/engine.py", line 6, in <module>
    from .http import HttpClient
  File "/opt/splunk/etc/apps/Splunk_TA_SolarWinds/bin/splunk_ta_solarwinds/aob_py3/cloudconnectlib/core/http.py", line 26, in <module>
    'http_no_tunnel': socks.PROXY_TYPE_HTTP_NO_TUNNEL,
AttributeError: module 'socks' has no attribute 'PROXY_TYPE_HTTP_NO_TUNNEL'
```
Good morning. I need some advice. We have several sources of information about our company assets — I know that's not ideal, but it's better than having none. So I wrote a script that collects everything from these asset sources and writes the info to a big KV store (1.5 GB) on the Splunk ES SH. The script does that every 6h. Now I want to add this info to the Splunk ES Asset and Identity Management. How do I alias a KV store field name so it is CIM compliant with the required field names as stated here: https://docs.splunk.com/ ? I thought about field aliases in a props.conf, as with normal data sources, but I'm not sure whether to use the collection name as a source in the stanza:

```
[source::ipam_assets_collection]
FIELDALIAS-asset_ip = Address AS ip
```

Is there a better way?
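For a KV store, the renaming usually happens in the lookup definition rather than in props.conf, since the collection is consumed as a lookup, not as a source. A sketch (the lookup name and the fields other than Address are assumptions; adjust to the collection's actual schema):

```
# transforms.conf (on the ES search head)
[ipam_assets_lookup]
external_type = kvstore
collection    = ipam_assets_collection
fields_list   = _key, Address, nt_host, owner
```

The search that feeds the ES asset lookup can then rename the fields to the CIM-compliant names, e.g. `| inputlookup ipam_assets_lookup | rename Address as ip`, before Asset and Identity Management merges it.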
Hi, I am using Splunk 8.2.1 and I have configured the Docker daemon to send logs to Splunk via an HTTP Event Collector. I have set up the sourcetype "swarm:docker" with the following props.conf:

```
[swarm:docker]
DATETIME_CONFIG =
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Structured
description = Log Swarm
disabled = false
pulldown_type = true
```

The logs arrive in Splunk with the right sourcetype, but I don't get the field extractions... I don't understand what's wrong. Can you help me?
How do I parse this XML data?

```
<v8e:Event>
  <v8e:Level>Information</v8e:Level>
  <v8e:Date>2022-01-26T16:20:24</v8e:Date>
  <v8e:ApplicationName>Job</v8e:ApplicationName>
  <v8e:ApplicationPresentation>Фоновое</v8e:ApplicationPresentation>
  <v8e:Event>Finish</v8e:Event>
  <v8e:EventPresentation>Сеанс</v8e:EventPresentation>
  <v8e:User>Jong Wik</v8e:User>
  <v8e:UserName>Корот</v8e:UserName>
  <v8e:Computer>srv-2-srv</v8e:Computer>
  <v8e:Metadata/>
  <v8e:MetadataPresentation/>
  <v8e:Comment/>
  <v8e:Data xsi:nil="true"/>
  <v8e:DataPresentation/>
  <v8e:TransactionStatus>NotApplicable</v8e:TransactionStatus>
  <v8e:TransactionID/>
  <v8e:Connection>0</v8e:Connection>
  <v8e:Session>5146</v8e:Session>
  <v8e:ServerName/>
  <v8e:Port>0</v8e:Port>
  <v8e:SyncPort>0</v8e:SyncPort>
</v8e:Event>
```
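If each `<v8e:Event>` element arrives as one Splunk event, `spath` can extract the namespaced fields (a sketch; the index and sourcetype are placeholders, and the renamed field list is just an example):

```
index=your_index sourcetype=your_xml_sourcetype
| spath
| rename "v8e:Event.v8e:User" as user,
         "v8e:Event.v8e:Date" as event_date,
         "v8e:Event.v8e:Computer" as computer
| table _time user event_date computer
```

The namespace prefix stays part of the path, so the quoted `"v8e:..."` names are required.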
Hi, I launch a dashboard from another dashboard when I click on the field "Site":

/app/spl_pub_dashboard/bib_reg?Site=$click.value$

Now I need to retrieve the field "Site" in the dropdown list of the destination dashboard while at the same time keeping the site values from "site.csv". Can anybody help, please?

```
<input type="dropdown" token="Site" searchWhenChanged="true">
  <label>Site</label>
  <fieldForLabel>Site</fieldForLabel>
  <fieldForValue>Site</fieldForValue>
  <search>
    <query>| inputlookup site.csv</query>
  </search>
</input>
```
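One approach (a sketch) is to pass the token with the `form.` prefix in the drilldown URL; Simple XML then pre-selects that value in the destination dashboard's dropdown, while the `<search>` on site.csv still supplies the full list of choices:

```
/app/spl_pub_dashboard/bib_reg?form.Site=$click.value$
```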
Hello everyone. I'm looking for some assistance with a problem where I get differing search results from what should be the same search. Backstory: I'm testing changes to the "ESCU - Malicious PowerShell Process - Execution Policy Bypass - Rule" so that I can filter out known PowerShell events. Using the same search head, user, date and time range, and what should be two identical macros, I get different search results.

The original search uses this macro: `malicious_powershell_process___execution_policy_bypass_filter`. The original search is:

```
| tstats `security_content_summariesonly` values(Processes.process_id) as process_id, values(Processes.parent_process_id) as parent_process_id values(Processes.process) as process min(_time) as firstTime max(_time) as lastTime from datamodel=Endpoint.Processes where Processes.process_name=powershell.exe (Processes.process="* -ex*" OR Processes.process="* bypass *") by Processes.process_id, Processes.user, Processes.dest
| `drop_dm_object_name(Processes)`
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`
| `malicious_powershell_process___execution_policy_bypass_filter`
```

The test search uses this macro: `malicious_powershell_process___execution_policy_bypass_filter-test`. The test search is:

```
| tstats `security_content_summariesonly` values(Processes.process_id) as process_id, values(Processes.parent_process_id) as parent_process_id values(Processes.process) as process min(_time) as firstTime max(_time) as lastTime from datamodel=Endpoint.Processes where Processes.process_name=powershell.exe (Processes.process="* -ex*" OR Processes.process="* bypass *") by Processes.process_id, Processes.user, Processes.dest
| `drop_dm_object_name(Processes)`
| `security_content_ctime(firstTime)`
| `security_content_ctime(lastTime)`
| `malicious_powershell_process___execution_policy_bypass_filter-test`
```

Both macros contain the same content to exclude Splunk Universal Forwarder PowerShell scripts:

```
search (process!="C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\powershell.exe -executionPolicy RemoteSigned -command \". 'C:\\Program Files\\SplunkUniversalForwarder\\etc\\apps\\Splunk_TA_windows\\bin\\powershell\\nt6-health.ps1'\""
AND process!="C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\powershell.exe -executionPolicy RemoteSigned -command \". 'c:\\Program Files\\SplunkUniversalForwarder\\etc\\apps\\Splunk_TA_windows\\bin\\powershell\\nt6-repl-stat.ps1'\""
AND process!="C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\powershell.exe -executionPolicy RemoteSigned -command \". 'c:\\Program Files\\SplunkUniversalForwarder\\etc\\apps\\Splunk_TA_windows\\bin\\powershell\\nt6-siteinfo.ps1'\""
AND process!="C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\powershell.exe -executionPolicy RemoteSigned -command \". 'C:\\Program Files\\SplunkUniversalForwarder\\etc\\apps\\Splunk_TA_windows\\bin\\powershell\\dns-zoneinfo.ps1'\""
AND process!="C:\\Windows\\system32\\WindowsPowerShell\\v1.0\\powershell.exe -executionPolicy RemoteSigned -command \". 'C:\\Program Files\\SplunkUniversalForwarder\\etc\\apps\\Splunk_TA_windows\\bin\\powershell\\dns-health.ps1'\"")
```

When I run both searches I get different results and I'm unsure why. The macro with -test appended works fine. When I copy its contents to the original macro, that search does not seem to use the new contents. I made these changes last week and today get the same results. Any ideas as to what might be causing this?
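To rule out permission or app-context issues (e.g. two macros with the same name in different apps), it may help to check which definitions the search head actually resolves, for example with a REST search (a sketch):

```
| rest /servicesNS/-/-/configs/conf-macros
| search title="malicious_powershell_process___execution_policy_bypass_filter*"
| table title definition eai:acl.app eai:acl.sharing
```

If the original macro name appears more than once, the search may be expanding a different copy than the one that was edited.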
Hello, we recently installed Splunk. We thought we had a free license; however, we got a notice that we have exceeded the quota and the license has been blocked. We have changed the license group to Free, but search is still blocked. How can we unlock it? Thank you very much and greetings!
Hi, I have created a watchlist, AWS_IPs, and added IP addresses to it. Further, I have created an anomaly action rule to reduce the anomaly score by 3 and added the AWS_IPs watchlist to it. But I do not see this AAR getting applied to anomalies that have IP addresses listed in the watchlist. Can anyone please suggest what could be the reason behind it and how I can resolve it? Thanks!
Hi All, can we leverage AppDynamics to update a CMDB (ServiceNow or any other CMDB) based on the auto-discovery it does for applications?
I have a dashboard with two panels; one shows the errors and the count of how many times a similar error has occurred. The other panel shows the details of an error when it is clicked. This error string sometimes has characters like <, >, \, etc., and Splunk does not read them as a plain string, which results in an error on the panel. I would like to use something that escapes all the problematic characters in one go. Currently, my query looks like this:

```
index=index_name sourcetype="sourcetype_name"
    [ | makeresults
      | eval cleansearch=replace(replace("$search$", "<", "\<"), ">", "\>")
      | return $cleansearch ]
```

Is there anything simpler that would escape all the possible problem characters at once instead of replacing each character individually?
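Assuming the token is set in a Simple XML drilldown, the built-in `|s` token filter may do this in one go: it wraps the token value in quotes and escapes embedded quotes so the whole string is treated as a literal search term (a sketch):

```
index=index_name sourcetype="sourcetype_name" $search|s$
```

Related filters exist for other contexts (`|h` for HTML, `|u` for URL encoding), depending on where the token is consumed.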
Hi all, I was wondering what steps are involved in developing a Splunk alert that creates an incident ticket in ServiceNow, where the ticket is updated if further alerts are triggered by the same folder/account. E.g., a Splunk alert is triggered and an incident ticket is created; when the same Splunk alert is triggered for the same person/folder/entity, instead of creating a new incident ticket it appends to the one already created and open.

I've found some docs on integrating Splunk and SNOW together, but regarding the alert side of things, the documentation seems outdated: "Scripted Alert is now a deprecated Splunk feature. The best practice is to use custom alert actions instead." If that's the case, how should I implement the Splunk alert?
I have a Splunk dashboard with some input fields. By default, the input value for the Host field is the wildcard *. If I click the "Show" checkbox under Latency Validation Check, a panel is displayed below it. As good practice, how do I prevent the search from executing when someone leaves the wildcard in the Host input, given that the input is used as a token in the search? In other words, the Latency Validation Check search should only execute when a valid hostname is given as the Host input. Note: I need the wildcard as the default entry for the Host input because some other panels use it; I only want to restrict the wildcard for this one particular panel. The code can be found below:

```
<input type="text" token="host">
  <label>Host</label>
  <default>*</default>
</input>
<input type="checkbox" token="overview">
  <label>Latency Validation Check</label>
  <choice value="true">Show</choice>
  <change>
    <condition label="Show">
      <set token="overview">true</set>
    </condition>
  </change>
</input>
```
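One pattern (an untested sketch; the host_valid token name is an assumption) is to derive a second token from the text input's own `<change>` handler and make the panel depend on both tokens, so its search only runs when the host is not the wildcard:

```xml
<input type="text" token="host">
  <label>Host</label>
  <default>*</default>
  <change>
    <condition match="&quot;$value$&quot; != &quot;*&quot;">
      <set token="host_valid">true</set>
    </condition>
    <condition>
      <unset token="host_valid"></unset>
    </condition>
  </change>
</input>

<!-- the Latency Validation panel then requires both tokens to be set -->
<panel depends="$overview$ $host_valid$">
  <!-- panel content unchanged -->
</panel>
```

Since `depends` suppresses the panel (and its search) until every listed token is set, the other panels that use the wildcard are unaffected.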
I am trying to set up an index page in a Splunk Studio dashboard that redirects to other Splunk dashboards based on a dropdown. The dropdown has three values; based on the selected value, the page should show that category's dashboard list, where a click redirects to the corresponding Splunk dashboard. For example, in image 1 below the type is B-004, and both boxes are updated with the same link given in the static input. I was expecting BOX 1 to be updated with https://B-004/logs and BOX 2 with https://B-004/results, as in image 2. Please find the code below. Also, what would the SPL be if I want to display a string name (anything) instead of a number for the URL? Appreciate your help, thanks in advance.

image 1:

image 2:

code for image 1:

```json
{
    "visualizations": {
        "viz_Hpfgo6KN": {
            "type": "viz.rectangle",
            "options": { "stroke": "transparent", "fill": "#1A1C20", "rx": 4 }
        },
        "viz_lto5aays": {
            "type": "viz.text",
            "options": { "content": "box1", "fontSize": 14, "fontWeight": "bold", "textColor": "#FBFBFB" }
        },
        "viz_AJxRjGHA": {
            "type": "abslayout.line",
            "options": { "strokeColor": "#33343B" }
        },
        "viz_ZI34ttXn": {
            "type": "viz.singlevalue",
            "options": { "backgroundColor": "#ffffff" },
            "dataSources": { "primary": "ds_pQ72LSdV" },
            "eventHandlers": [
                {
                    "type": "drilldown.customUrl",
                    "options": { "url": "$myWebsiteToken$", "newTab": true }
                }
            ]
        },
        "viz_Nd21m86c": {
            "type": "viz.rectangle",
            "options": { "stroke": "transparent", "fill": "#1A1C20", "rx": 4 }
        },
        "viz_icJmS4Ub": {
            "type": "viz.text",
            "options": { "content": "box2", "fontSize": 14, "fontWeight": "bold", "textColor": "#FBFBFB" }
        },
        "viz_mSmorwn2": {
            "type": "viz.singlevalue",
            "options": { "backgroundColor": "#ffffff" },
            "dataSources": { "primary": "ds_gy5Vi8dr_ds_pQ72LSdV" },
            "eventHandlers": [
                {
                    "type": "drilldown.customUrl",
                    "options": { "url": "$myWebsiteToken$", "newTab": true }
                }
            ]
        }
    },
    "dataSources": {
        "ds_pQ72LSdV": {
            "type": "ds.search",
            "options": {
                "query": "index= _internal\n| stats count",
                "queryParameters": { "earliest": "-4h@m", "latest": "now" }
            },
            "name": "Search_1"
        },
        "ds_gy5Vi8dr_ds_pQ72LSdV": {
            "type": "ds.search",
            "options": {
                "query": "index= _internal\n| stats count",
                "queryParameters": { "earliest": "-4h@m", "latest": "now" }
            },
            "name": "Copy of Search_1"
        }
    },
    "inputs": {
        "staticInput": {
            "type": "input.dropdown",
            "options": {
                "items": [
                    { "label": "A-002", "value": "https://A-002/" },
                    { "label": "B-004", "value": "https://B-004/" },
                    { "label": "C-005", "value": "https://C-005/" }
                ],
                "token": "myWebsiteToken"
            },
            "title": "type"
        }
    },
    "layout": {
        "type": "absolute",
        "options": {
            "height": 1100,
            "showTitleAndDescription": false,
            "backgroundColor": "#111215",
            "width": 1500,
            "display": "auto-scale"
        },
        "globalInputs": [ "staticInput" ],
        "structure": [
            { "item": "viz_Hpfgo6KN", "type": "block", "position": { "x": 0, "y": 0, "w": 390, "h": 360 } },
            { "item": "viz_lto5aays", "type": "block", "position": { "x": 0, "y": 0, "w": 300, "h": 50 } },
            { "item": "viz_AJxRjGHA", "type": "line", "position": { "from": { "x": 1496, "y": 212 }, "to": { "x": -3, "y": 212 } } },
            { "item": "viz_ZI34ttXn", "type": "block", "position": { "x": 70, "y": 120, "w": 250, "h": 130 } },
            { "item": "viz_Nd21m86c", "type": "block", "position": { "x": 400, "y": 0, "w": 390, "h": 360 } },
            { "item": "viz_icJmS4Ub", "type": "block", "position": { "x": 430, "y": 20, "w": 300, "h": 50 } },
            { "item": "viz_mSmorwn2", "type": "block", "position": { "x": 500, "y": 140, "w": 250, "h": 130 } }
        ]
    },
    "description": "Viz Description Here",
    "title": "testing_index"
}
```
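In Dashboard Studio, tokens can be concatenated with literal text inside the customUrl option, so one way (an untested sketch) is to keep the single dropdown token holding the base URL and append a different suffix in each box's handler:

```json
"eventHandlers": [
    {
        "type": "drilldown.customUrl",
        "options": {
            "url": "$myWebsiteToken$logs",
            "newTab": true
        }
    }
]
```

with `"url": "$myWebsiteToken$results"` in BOX 2's handler. For displaying a name instead of a number, the dropdown's items already separate label from value, so the label can be any string while the value stays the URL.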
Hi, I am monitoring Postgres databases using a Prometheus server and have set up alerts using Alertmanager. Now I am trying to integrate the alerting from Prometheus Alertmanager into Splunk using an HTTP endpoint and HEC in Splunk. Please suggest whether this is possible and, if yes, how.
Hello All, I am trying to calculate the average of a column, but I want it to ignore all values equal to 0. This is what I currently have:

```
stats avg(ComplianceScore) as CS by GeoLocation
```

But I need it to calculate the average only when ComplianceScore is not zero. Thank you, Marco
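One way (a sketch) is an eval inside stats that nulls out the zeros, since avg ignores null values:

```
| stats avg(eval(if(ComplianceScore!=0, ComplianceScore, null()))) as CS by GeoLocation
```

An equivalent alternative is filtering first with `| where ComplianceScore != 0`, though that also drops those rows from any other aggregations computed in the same stats.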