All Topics

How come this doesn't work, given that indexers.csv is a list of Splunk servers with the Indexer role?

| inputlookup indexers.csv
| rename splunk_server as Indxr
| foreach Indxr [search index=_introspection sourcetype=splunk_resource_usage component=IOStats host=Indxr | eval reads_ps = 'data.reads_ps' | eval writes_ps = 'data.writes_ps' | eval writes_ps=avg(write_ps) | eval reads_ps=avg(reads_ps)]
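For what it's worth, foreach iterates over field names within each result row and substitutes them into its template; it does not run a new search per value, so host=Indxr is taken literally. A hedged sketch of the usual alternative, assuming the splunk_server values in indexers.csv match the host field in _introspection:

```
index=_introspection sourcetype=splunk_resource_usage component=IOStats
    [| inputlookup indexers.csv | rename splunk_server as host | fields host ]
| eval reads_ps='data.reads_ps', writes_ps='data.writes_ps'
| stats avg(reads_ps) as reads_ps avg(writes_ps) as writes_ps by host
```

The subsearch expands to (host=a OR host=b OR ...) in the outer search, which restricts results to the hosts listed in the lookup.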
How can I find out how often the forwarders are sending their logs to the indexers? How can I search for this in Splunk Enterprise? Thanks, RPM
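One hedged starting point, assuming you can search your indexers' _internal index: each forwarder connection is recorded in metrics.log under the tcpin_connections group, so you can chart how often each forwarder connects and sends data:

```
index=_internal source=*metrics.log* group=tcpin_connections
| timechart span=5m limit=20 count by sourceHost
```

The sourceHost field name comes from the tcpin_connections metrics events; swap in hostname if that matches your data better.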
Hi Splunkers, I am using the geostats SPL below, but I am unable to get a result. Can anyone help me see where I am making a mistake? I chose the .csv source type, and when I try to run the SPL it says "no data found":

index="main" | geostats latfield=vendorlatitude longfield=vendorlongtitude count by vendorcountry

I would appreciate your kind support. Thanks in advance.
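A quick diagnostic sketch before suspecting geostats itself: check whether the lat/long fields are actually extracted at search time (the field names below are the ones from your search; note the spelling vendorlongtitude — if the CSV header actually says vendorlongitude, the field will never match):

```
index="main"
| stats count as events count(vendorlatitude) as has_lat count(vendorlongtitude) as has_long
```

If events is greater than 0 but has_lat/has_long are 0, the problem is field extraction (or spelling), not geostats.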
I can't figure out a search to generate a history of URLs generated from a PC in a domain on a closed network.
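A hedged sketch of how this is usually approached, assuming the closed network has some source of web activity already indexed (proxy, firewall with URL logging, or DNS data) — the index and field names below are placeholders you would need to adapt:

```
index=proxy src_ip="10.0.0.42" earliest=-30d
| stats count earliest(_time) as first_seen latest(_time) as last_seen by url
| sort -count
```

If no proxy/URL data is being collected on that network, no search can reconstruct the history; the data source has to exist first.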
Is there a way to journal all Exchange 365 messages to Splunk for archiving/compliance purposes? The actual messages, so that if there were a lawsuit down the road, we could bring up the actual messages and read them.
Hi all, I have an app in Splunk and created inputs. Under the input I have this Jira Query Language (JQL): project=IIT%20AND%20updated%3E-6h. Now I need to add another project, IIM, but I am not sure how to add it to this JQL query. Please help me with this.
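Decoded, the current JQL is project = IIT AND updated > -6h. JQL supports in lists, so a sketch of the extended query (assuming the second project's key is literally IIM):

```
project in (IIT, IIM) AND updated > -6h
```

URL-encoded the same way as the existing value, that would be roughly project%20in%20(IIT%2C%20IIM)%20AND%20updated%3E-6h — check how your particular Jira add-on expects the encoding.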
I don't want incidentid/checkpoint type. Hi Team, I have created an application which has key, name, etc. But some extra fields like checkpoint type are popping up whenever I create a new input for that app. How do I change these settings on the backend? Please help! Regards, Samhitha.
I have tried the following to send the included Windows event to the nullQueue, but it does not work. I have tried props.conf and transforms.conf in both system\local and apps\"appname"\local.

Raw event:

<Event xmlns='http://schemas.microsoft.com/win/2004/08/events/event'><System><Provider Name='Microsoft-Windows-Sysmon' Guid='{5770385f-c22a-43e0-bf4c-06f5698ffbd9}'/><EventID>13</EventID><Version>2</Version><Level>4</Level><Task>13</Task><Opcode>0</Opcode><Keywords>0x8000000000000000</Keywords><TimeCreated SystemTime='2023-02-22T16:39:16.083750800Z'/><EventRecordID>18650882160</EventRecordID><Correlation/><Execution ProcessID='2496' ThreadID='3780'/><Channel>Microsoft-Windows-Sysmon/Operational</Channel><Computer>site-wec.site.lan</Computer><Security UserID='S-1-5-18'/></System><EventData><Data Name='RuleName'>-</Data><Data Name='EventType'>SetValue</Data><Data Name='UtcTime'>2023-02-22 16:39:16.081</Data><Data Name='ProcessGuid'>{4bf925e4-0d0b-63e5-4100-000000002000}</Data><Data Name='ProcessId'>2688</Data><Data Name='Image'>C:\Windows\system32\svchost.exe</Data><Data Name='TargetObject'>HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\EventCollector\Subscriptions\Sysmon\EventSources\site-wec.site.lan\Bookmark</Data><Data Name='Details'>&lt;BookmarkList&gt;&lt;Bookmark Channel="Microsoft-Windows-Sysmon/Operational" RecordId="18650811531" IsCurrent="true"/&gt;&lt;/BookmarkList&gt;</Data><Data Name='User'>NT AUTHORITY\NETWORK SERVICE</Data></EventData></Event>

props.conf:

[XmlWinEventLog:Microsoft-Windows-Sysmon/Operational
TRANSFORMS-sysmon13Bookmark = sysmon13-Bookmark

transforms.conf:

[sysmon13-Bookmark]
REGEX = (<EventID>13<\/EventID>).+Bookmark
DEST_KEY = queue
FORMAT = nullQueue
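One thing that stands out in the pasted config: the props.conf stanza header is missing its closing bracket, so the transform would never be applied. A sketch of the corrected pair (assuming the events really arrive with sourcetype XmlWinEventLog:Microsoft-Windows-Sysmon/Operational, and that these files live on the first full Splunk instance in the path — an indexer or heavy forwarder, not a universal forwarder):

```
# props.conf
[XmlWinEventLog:Microsoft-Windows-Sysmon/Operational]
TRANSFORMS-sysmon13Bookmark = sysmon13-Bookmark

# transforms.conf
[sysmon13-Bookmark]
REGEX = <EventID>13</EventID>.+Bookmark
DEST_KEY = queue
FORMAT = nullQueue
```

REGEX is matched against _raw by default, and a restart (or config reload) is needed after the change.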
Try this request in Splunk:

| makeresults | eval redir="../../app"

My request is automatically transformed into:

| makeresults | eval redir="app"

How can I work around this? I tried in Chrome and Firefox without success.
I am trying to create a report that will take a username (user) and look for the most recent IP address (src_ip) they used, then take the IP address that was found and look back 7 days for all events where src_ip is the same. I've attempted to use join in different ways but have been unsuccessful. Here is the latest try, where I get a return of everything, but I just need a complete list of users using that same IP. Any help is appreciated, thanks!

index=firewall
| where isnotnull(src_ip)
| join type=left [ | search (index=firewall user=mrusername) | eval ip=src_ip | where isnotnull(ip) | fields ip ]
| where ip=src_ip
| table user IP src_ip
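A subsearch usually handles the "find one value, then search on it" pattern more simply than join. A sketch, assuming src_ip is extracted at search time and events are returned newest-first (so head 1 keeps the most recent one):

```
index=firewall earliest=-7d
    [ search index=firewall user=mrusername src_ip=*
      | head 1
      | fields src_ip ]
| table _time user src_ip
```

The subsearch returns src_ip=<most recent value>, which becomes a filter on the outer search, so the outer results are exactly the events (from any user) sharing that IP over the last 7 days.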
Hi everybody, I would like to duplicate the data coming from my sourcetype as follows: send the original data to Splunk for indexing, and send the duplicated events to an external server with a "<DNS>" prefix string. How should I modify transforms.conf to do that? Another question: is there a better way to forward logs to an external server while keeping the original source host (source IP), instead of adding prefixes as I am trying to do? Thanks in advance, Angelo
Hi everybody, I've been struggling for hours to install Splunk's universal forwarder on Windows Server 2022. Here are the msiexec logs: https://drive.google.com/file/d/1NtNN9mT97-gbwprIc4cCAec5mi7Jhl6H/view?usp=sharing  Help!
We have been trying to address a problem that exists between our Splunk deployment and AWS Firehose, namely that Firehose adds 250 bytes of useless JSON wrapper to every log event (which, multiplied by millions or billions of events, increases our storage and license costs enormously). To address this, we turned to a combination of INGEST_EVAL rules on our heavy forwarders which will:

1. Strip the JSON envelope from the event.
2. Unescape all of the JSON quotes in the actual log data, making it parseable JSON once again.
3. Assign the logStream/logGroup values to host/source respectively.

This is somewhat working, and when we look in Splunk our events appear with all the appropriate fluff removed. For example, this is what our events used to look like (logGroup, logStream, message and timestamp are all values added by AWS Firehose): [screenshot] After the processing they now look like this: [screenshot] As you can see, the event is much smaller without losing any necessary information. However, to our surprise this has not had any impact on ingestion levels. It seems to be exactly the same.
We also noticed that all of these fields, even though they do not appear in the event view, are actually available and indexed in the "interesting fields" area, which seems to explain why our ingestion/storage has not decreased at all. For reference, these are the props/transforms I'm using to accomplish this:

props.conf:

[source::http:AWS2Splunk]
TRANSFORMS-hostname = changehost
TRANSFORMS-sourceinfo = changesource
priority = 100

[aws:firehose:json]
priority = 1000
TRANSFORMS-stripfirehosewrapper = stripfirehosewrapper

transforms.conf:

[changehost]
DEST_KEY = MetaData:Host
REGEX = \,.logStream...([^\"]+)\"\,\"timestamp
FORMAT = host::$1

[changesource]
DEST_KEY = MetaData:Source
REGEX = \,.logGroup...([^\"]+)\"\,\"logStream
FORMAT = source::$1

[stripfirehosewrapper]
INGEST_EVAL = _raw=replace(replace(replace(replace(replace(_raw,"\{\"message\"\:\"",""),"..\"logGroup\"\:\".*",""),"\\\\\"","\""),"\\\{2}","\\"),"\"stream\":\"\w+\"\,","")

Does anyone have any thoughts as to what we're doing wrong? Is this possibly a conflict between DEST_KEY and INGEST_EVAL? Will these two not play nicely together?

UPDATE: I changed from DEST_KEY to using INGEST_EVAL entirely... the issue still seems the same:

[changehost]
#DEST_KEY = MetaData:Host
#REGEX = \,.logStream...([^\"]+)\"\,\"timestamp
#FORMAT = host::$1
INGEST_EVAL = host=replace(replace(_raw,".*\"\,\"logStream\"\:\"",""),"\"\,\"timestamp\".*","")

[changesource]
#DEST_KEY = MetaData:Source
#REGEX = \,.logGroup...([^\"]+)\"\,\"logStream
#FORMAT = source::$1
INGEST_EVAL = source=replace(replace(_raw,".*logGroup\"\:\"",""),"\"\,\"logStream\"\:.*","")
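One hedged way to confirm whether those fields are really written to the index (as opposed to being extracted at search time) is walklex, which reads the index's lexicon directly — the index name below is a placeholder:

```
| walklex index=your_index type=field
| stats sum(count) as terms by field
```

If logGroup/logStream/timestamp show up there, they are being indexed despite the INGEST_EVAL — commonly because the sourcetype has a structured-data setting such as INDEXED_EXTRACTIONS=json applied earlier in the pipeline, which is worth checking with btool on the heavy forwarder.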
Hi, I'm filtering a search to get results for specific values by checking them manually this way:

.... | stats sum(val) as vals by value | where value="v1" OR value="v2" OR value="v3"

I'm wondering if it is possible to do the same by checking whether the value exists in a list coming from another index, something like this:

.... | append [search index=another_index | stats values(remote_value) as values_list] | stats sum(val) as vals by value | where (value in values_list)
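A subsearch can generate the value list, and a | search after the stats filters on it. A sketch, assuming remote_value in the other index holds the same strings as value:

```
....
| stats sum(val) as vals by value
| search [ search index=another_index
    | dedup remote_value
    | rename remote_value as value
    | fields value ]
```

The subsearch expands to (value="v1") OR (value="v2") OR ..., which is exactly the manual where clause, just generated dynamically (subject to the default subsearch limits of 10,000 results / 60 seconds).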
Hi everyone, I'm trying to install the AppDynamics SaaS agent in a container running on a Graviton Linux instance (ARM64). Has anyone done this before? Or is it not possible to run it on Linux ARM64? Thanks!
We are using Splunk in an enterprise environment with a very large-scale operation. Management has asked why the above-mentioned app is included in the package. We are not using it and would prefer not to have it in the future.
Hello, I need to ingest Cynet XDR audit and alert events into a Splunk Cloud solution, but I cannot find any procedure docs, neither from Cynet nor from Splunk. Does anyone know how, or can you point me to a starting point? Thank you.
Before upgrading in production, can I ask for the update notes/changelog for Splunk Add-on for ServiceNow 7.4.1 > 7.5.0? Thanks.
Hello, I have a data model named firewall_logs with firewall data, in which the interesting fields are file_hash, url and source/dest IP. I also have a dataset named intel_indicators with a column named ioc, containing hashes, IPs, domains, and a timestamp.

What I want to do is compare the data (hashes, IPs, domains) from the ioc column with the fields file_hash, url, and dest_ip. If there is a match, it should be visible. Any idea how I can accomplish this?

| tstats summariesonly=t allow_old_summaries=t  ...interesting fields.... from datamodel="firewall_logs"

... and here I'm stuck.
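One common pattern, sketched below under the assumption that intel_indicators is available as a lookup and that the data-model fields are named firewall_logs.file_hash / firewall_logs.url / firewall_logs.dest_ip (adjust the node names to your model):

```
| tstats summariesonly=t allow_old_summaries=t count
    from datamodel=firewall_logs
    by firewall_logs.file_hash firewall_logs.url firewall_logs.dest_ip
| rename firewall_logs.* as *
| lookup intel_indicators ioc as file_hash OUTPUT ioc as hash_match
| lookup intel_indicators ioc as url OUTPUT ioc as url_match
| lookup intel_indicators ioc as dest_ip OUTPUT ioc as ip_match
| where isnotnull(hash_match) OR isnotnull(url_match) OR isnotnull(ip_match)
```

Each lookup call compares one event field against the ioc column; only rows where at least one comparison matched survive the final where.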
Hello, I have a Python script like this:

#!/bin/python
import os
import json
import datetime

HOMEPATH = '/opt/monitor_dirs/SomeDir'

def path_to_dict(path, depth=1, first=False):
    for base, dirs, files in os.walk(path):
        r = {'name': base, 'dirs': len(dirs), 'files': len(files)}
        if first:
            r['datetime'] = datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S%z")
        if depth > 0:
            r['subdirs'] = {}
            for subdir in dirs:
                r['subdirs'][subdir] = path_to_dict(os.path.join(path, subdir), depth - 1)
        return r

#print path_to_dict(HOMEPATH, 1)
result = path_to_dict(HOMEPATH, 1, True)
if result:
    print(json.dumps(result, sort_keys=True, indent=4))

And I get this output:

# ./file_count.py
{
    "datetime": "2023-02-22T21:10:49",
    "dirs": 9,
    "files": 0,
    "name": "/opt/monitor_dirs/SomeDir",
    "subdirs": {
        "XXXX": {
            "dirs": 0,
            "files": 63,
            "name": "/opt/monitor_dirs/XXXX"
        }
    }
}

The problem is that in the index I get 2 events instead of just one:

1. {
2. "datetime": "2023-02-22T21:10:49", "dirs": 9, "files": 0, and so on, but without the '{'

How can I get only one event with my JSON?
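One low-effort fix, sketched below under the assumption that the script's stdout is what Splunk ingests: print the JSON compactly on a single line instead of with indent=4, so the default line breaking cannot split it. (The sample dict mirrors the output from the question; only the final json.dumps call is the actual change.)

```python
import json

# Sample result dict, copied from the script's output in the question.
result = {
    "datetime": "2023-02-22T21:10:49",
    "dirs": 9,
    "files": 0,
    "name": "/opt/monitor_dirs/SomeDir",
    "subdirs": {"XXXX": {"dirs": 0, "files": 63, "name": "/opt/monitor_dirs/XXXX"}},
}

# Without indent, json.dumps emits no newlines, so Splunk's default
# line breaking sees the whole JSON document as a single event.
line = json.dumps(result, sort_keys=True)
print(line)
```

Alternatively, keep the pretty-printed output and configure the sourcetype with SHOULD_LINEMERGE = false and a LINE_BREAKER such as ([\r\n]+)(?=\{) in props.conf — those are standard props.conf settings, but the exact regex depends on your data.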