All Topics

_time               device1_avg   device2_avg   device3_avg   device4_avg
2022-04-07 00:00    34            3             11            22
2022-04-07 01:00    21            76            41            87
2022-04-07 02:00    2             18            32            32
2022-04-07 03:00    12            3             36            54
2022-04-07 04:00    7             8             21            43
2022-04-07 05:00    11            3             17            21
2022-04-07 06:00    19            12            19            16
2022-04-07 07:00    15            10            12            19
2022-04-07 08:00    4             2             19            6

I have a table of hourly averages for an arbitrary number of devices, as shown above. How do I use these averages as thresholds for alerts about these devices? I'm trying to build a search that runs every 15 minutes and checks which devices have exceeded these averages. For example, if a search runs at 06:45 and returns a count of 10 for device1, 15 for device2, 21 for device3, and 2 for device4, it should send an alert saying that device2 and device3 have exceeded the averages listed for the 06:00 hour (i.e., 12 and 19, respectively).
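
One possible approach is sketched below. It assumes the averages are kept in a lookup (device_hourly_avg.csv here, with the layout above), that the live events carry a device field whose values match the column names minus the _avg suffix, and that the threshold to use is the row for the current hour of day; the index name device_data is also a placeholder. The idea is to untable the averages, keep only the current hour, and compare it against a 15-minute count per device:

    index=device_data earliest=-15m
    | stats count by device
    | append
        [| inputlookup device_hourly_avg.csv
         | untable _time device avg
         | where strftime(strptime(_time, "%Y-%m-%d %H:%M"), "%H") = strftime(now(), "%H")
         | eval device=replace(device, "_avg$", "")
         | fields device avg]
    | stats max(count) as count, max(avg) as avg by device
    | where count > avg

Any device left in the result has exceeded its average for the current hour, so the alert can trigger when the number of results is greater than zero.
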
Hi all, I am doing a very simple search over All Time:

    index=index=orafin sourcetype=ORAFIN2

It returns 26 rows and, as the results show, all of them have a transaction_type value. If I then select the value D, it is added to the search but returns NO rows. Oddly, if I change the search to a double negative, I get my data. What's going on? Hoping to be enlightened, Keith
Here's the text string from the log I'm searching:

    store license for Store 123456 2022-04-07 19:17:44,360 ERROR path not found

Here's my Splunk search:

    index=* host="storelog*" "store license for "
    | rex field=_raw "Store\s123456\n\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\,\d{3}\s(?P<errortext>.*)path"
    | stats count by errortext

Why am I getting "No results found." when I search?
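
For comparison, here is a sketch of a looser pattern, assuming the store number varies from event to event and that the separator between the store number and the timestamp may be an ordinary space rather than a literal newline (\s+ matches either):

    index=* host="storelog*" "store license for"
    | rex field=_raw "Store\s+\d+\s+\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2},\d{3}\s+(?P<errortext>.*?)path"
    | stats count by errortext
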
Hey all, I just need a little regex help; I'm trying to pull an IP address out and it's not working. Here is my rex:

    | rex field=_raw "Remote host:(?<Remotehost>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"

Here is an example event:

    4/7/22 3:11:32.000 PM
    04/07/2022 03:11:32 PM
    LogName=Security
    EventCode=4779
    EventType=0
    ComputerName=BPSQCP00S080.rightnetworks.com
    SourceName=Microsoft Windows security auditing.
    Type=Information
    RecordNumber=115076290
    Keywords=Audit Success
    TaskCategory=Other Logon/Logoff Events
    OpCode=Info
    Message=A session was disconnected from a Window Station.
    Subject:
        Account Name: 705628
        Account Domain: RIGHTNETWORKS
        Logon ID: 0x13887BFB
    Session:
        Session Name: RDP-Tcp#81
    Additional Information:
        Client Name: DESKTOP-PIT40LB
        Client Address: 73.175.205.64
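
As a point of comparison, a minimal sketch that anchors on the label actually present in this event (Client Address) rather than Remote host, which does not appear in the sample:

    | rex field=_raw "Client\s+Address:\s+(?<Remotehost>\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})"
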
I'm trying to make a visualization showing our number of signatures, but the data is not very well organized because I have 20+ results with variations of the name Generic, for example:

    Generic.TC.ldrvmp 1
    Generic.TC.ligldq 1
    Generic.TC.ljhook 1
    Generic.TC.lmzdbq 1
    Generic.TC.lnionm 1
    Generic.TC.lniqpu 1
    Generic.TC.lxboaq 1
    Generic.TC.mpneia 1
    Generic.TC.mpngod

I want to group all of these results under the name "generic", but if I try to use wildcards in the search below it gives me an error. I could write out each signature individually in the eval command, but that seems very inefficient. Is it possible to group these results under the same name?

    | eval signature=case(signature="Generic.*", "generic")
    | stats count by signature
    | sort -count
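
A sketch of one way to do the grouping: case() compares strings literally and does not expand wildcards, but like() (with % as the wildcard) or match() (regex) can be used instead:

    | eval signature=if(like(signature, "Generic.%"), "generic", signature)
    | stats count by signature
    | sort -count
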
Hi, I have documents similar to the one below:

    request_id: 12345
    revision: 123
    other_field: stuff
    my_precious: {
        1648665400.774453: {
            keys: [ key:key1, size: 329 ]
            op: operation_1
        }
        1648665400.7817056: {
            keys: [ key:key2, size: 785 ]
            op: operation_2
        }
        1648665400.7847242: {
            keys: [ key:key4, size: 632 ]
            op: operation_1
        }
        1648665400.7886434: {
            keys: [ key:key5, size: 1938 ]
            op: operation_3
        }
        1648665400.7932374: {
            keys: [ key:key3, size: 23 ]
            op: operation_2
        }
    }

I currently have a query to get the frequency of a certain key, but how can I display the "size" information alongside it? My query right now is:

    | rex "(?<keys>(?<=key:).*?(?=,))"
    | stats count by keys
    | sort -count
    | head 10

which displays the keys with the highest count, but it doesn't show each key's associated "size". I can't quite figure this out; any help is appreciated!
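
A sketch of one way to keep the sizes paired with their keys (extract both with max_match=0, zip them together, then expand the pairs), assuming the raw text keeps each key:... and size: ... adjacent as in the sample above:

    | rex max_match=0 "key:(?<keys>[^,]+),\s*size:\s*(?<sizes>\d+)"
    | eval pairs=mvzip(keys, sizes, "|")
    | mvexpand pairs
    | eval keys=mvindex(split(pairs, "|"), 0), size=mvindex(split(pairs, "|"), 1)
    | stats count, values(size) as size by keys
    | sort -count
    | head 10
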
Is it possible to get all App Insights data using Data Manager in Splunk Cloud (Victoria Experience)?
Hey Community, I am trying to get my head around this query. My subsearch is below; it looks for the API path, src, and IPs, and I am doing a DNS lookup to get the hostname, which is present in a different index:

    site = "friendly" index=traffic_log src="*" uri="*"
    | eval date = date_month + "/" + date_mday + "/" + date_wday + "/" + date_year
    | mvexpand date
    | dedup src
    | dedup uri
    | lookup dnslookup clientip as src OUTPUT clienthost as ComputerName
    | where like(ComputerName, "p%")
    | dedup ComputerName
    | table ComputerName, src, uri, date

Main query: ComputerName is the only field from the subsearch that is also present in the main index, and I want to use it for the search, which gives me the owner details for the hostname. But I also want the src, uri, and date fields from the subsearch to be added to the table:

    index="wineventlog" source="WinEventLog:Application"
        [ search site = "friendly.org" index=traffic_log src="*" uri="*"
          | eval date = date_month + "/" + date_mday + "/" + date_wday + "/" + date_year
          | mvexpand date
          | dedup src
          | dedup uri
          | lookup dnslookup clientip as src OUTPUT clienthost as ComputerName
          | where like(ComputerName, "p%")
          | dedup ComputerName
          | fields ComputerName, src, uri, date ]
    | dedup ComputerName
    | dedup ownerEmail
    | dedup ownerFull
    | dedup ownerName
    | dedup ownerDept
    | table ComputerName, ownerEmail, ownerFull, ownerName, ownerDept, src, uri, date

Can someone give me some insight into this query?
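
When a subsearch returns several fields, they all become filters on the outer search, so src, uri, and date will not survive into the wineventlog results on their own. A sketch of one common pattern, using the field names above: filter the outer search on ComputerName only, then join the src/uri/date columns back in afterwards:

    index="wineventlog" source="WinEventLog:Application"
        [ search site="friendly.org" index=traffic_log src="*" uri="*"
          | lookup dnslookup clientip as src OUTPUT clienthost as ComputerName
          | where like(ComputerName, "p%")
          | dedup ComputerName
          | fields ComputerName ]
    | join type=left ComputerName
        [ search site="friendly.org" index=traffic_log src="*" uri="*"
          | eval date = date_month + "/" + date_mday + "/" + date_wday + "/" + date_year
          | lookup dnslookup clientip as src OUTPUT clienthost as ComputerName
          | where like(ComputerName, "p%")
          | dedup ComputerName
          | fields ComputerName, src, uri, date ]
    | dedup ComputerName
    | table ComputerName, ownerEmail, ownerFull, ownerName, ownerDept, src, uri, date
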
Hello, many thanks in advance for taking the time to read/consider my question; it's always appreciated! I'm currently working on reducing the overhead of our existing Windows UFs by adding to our blacklist in a way that effectively blacklists all Windows logon events where the Process Name is "-", since these are extremely voluminous and often don't directly correlate with actual user logins (feel free to correct me if I'm wrong here). These events are also indicated by a "Process ID" of "0x0", which is reflected in the blacklists I've attempted below.

The blacklists that I have tried, to no avail, are as follows:

    blacklist = EventCode="4624" Message="(?:Process Name:).+(?:C:\\Windows\\System32\\services.exe)|.+(?:C:\\Windows\\System32\\winlogon.exe)|.+(?:C:\\Windows\\CCM\\CcmExec.exe)|.+(?:[-]\sNetwork)"
    blacklist = EventCode="4624" Message="Process\sID:\s0x0"
    blacklist = EventCode="4624" Message="(?:Process\sName:\s[-])"

Please let me know if I'm missing anything with any of the blacklists above, but so far none of the ones I've tested actually stop events with a "Process Name" of "-" from being sent to Splunk, which increases our license ingestion while providing essentially no new information or value. Thanks in advance; any and all answers will be rewarded with karma! Charlie
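
For reference, a sketch of how these might be combined, assuming they live under the [WinEventLog://Security] stanza in inputs.conf on the UF. When more than one blacklist line is used in a stanza they need to be numbered (blacklist1, blacklist2, ... up to blacklist9); repeating the bare blacklist key means only one of the entries actually takes effect, which may be why the filters above appear to do nothing:

    [WinEventLog://Security]
    blacklist1 = EventCode="4624" Message="Process\sName:\s+-"
    blacklist2 = EventCode="4624" Message="Process\sName:\s+(?:C:\\Windows\\System32\\services\.exe|C:\\Windows\\System32\\winlogon\.exe|C:\\Windows\\CCM\\CcmExec\.exe)"
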
I am not able to get rid of the EDT timezone using the strftime command; the output looks like 2022-04-07 07:00:11.028-EDT. Any suggestions?
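
A sketch of two options, depending on where the value comes from. If the string is built from _time, a format string without a timezone specifier avoids the suffix entirely; if the value is already a string that ends in -EDT (the field name time_string below is a placeholder), the suffix can be stripped with replace():

    | eval display_time = strftime(_time, "%Y-%m-%d %H:%M:%S.%3N")
    | eval display_time = replace(time_string, "-[A-Z]{3,4}$", "")
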
Greetings, a network engineer recently modified the firewall rules for our ports, and we are getting this error message on our search head, cluster manager, and indexers:

    Auto Load Balanced TCP Output
    Root Cause(s): More than 70% of forwarding destinations have failed. Ensure your hosts and ports in outputs.conf are correct. Also ensure that the indexers are all running, and that any SSL certificates being used for forwarding are correct.
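
A sketch of a search that can help narrow down which destinations are failing, assuming _internal data from the affected instances is still reachable; the rex assumes the warning text names the destination as host=<ip>:<port> or ip=<ip>:<port>, so adjust it to whatever the actual messages contain:

    index=_internal sourcetype=splunkd component=TcpOutputProc (log_level=WARN OR log_level=ERROR)
    | rex "(?:host|ip)=(?<destination>[^\s,]+)"
    | fillnull value="unknown" destination
    | stats count as failures, latest(_time) as last_seen by host, destination
    | convert ctime(last_seen)
    | sort -failures
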
When I used the following code to perform a query:

    import splunklib.client as client
    import splunklib.results as res

    # Connect to the Splunk management port and run a oneshot search.
    # Note: if ENTRY is extracted by knowledge objects scoped to the ics_analytics app,
    # connecting with app='ics_analytics' (and an owner) may be needed so the search
    # runs in that namespace.
    service = client.connect(
        host='splunk.bart.gov',
        port='8089',
        username='userid',
        password='secrete',
    )

    query = "search index=slog_ics sourcetype=occ_mgr | table _time, ENTRY | head 3"
    query_results = service.jobs.oneshot(query)

    reader = res.ResultsReader(query_results)
    results = []
    for item in reader:
        print(item)
        results.append(item)

    print("results[1]:")
    print(results[1])

I cannot see the value of the ENTRY field in the results. ENTRY is a field defined by the sourcetype occ_mgr in my application ics_analytics. In the Splunk web UI, in the context of the application ics_analytics and using the same query, I can see the field value of ENTRY:

    index=slog_ics sourcetype=occ_mgr | fields _time, ENTRY | head 3

with the result:

    _time                    ENTRY
    4/6/22 2:11:00.000 AM    EOR.
    4/6/22 1:48:00.000 AM    (ref 0120) T203 released ATO, (762) second delay.
    4/6/22 1:36:00.000 AM    CORE Blanket established.

What could be the root cause of the problem?
Does anyone have a solution for a query that will return the daily event count of every index, index by index, even the ones that have ingested zero events?

    | tstats count WHERE index=* OR index=_* by index

only returns indexes that have > 0 events.
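
A sketch of one way to include the zero-event indexes: append the full index list from the REST API and sum the two sets, assuming the search account is allowed to read /services/data/indexes. A _time span=1d split can be added to the tstats by clause for the daily breakdown:

    | tstats count where index=* OR index=_* by index
    | append
        [| rest /services/data/indexes
         | rename title as index
         | eval count=0
         | fields index count]
    | stats sum(count) as count by index
    | sort index
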
Hello, I'm trying to create a dashboard with a statistics table that will show a list of domains/hosts 10 minutes before and after a user connected to a specific domain. For example: a user connected to abc.com at 12pm EST on 4/7/2022. I want to be able to input the user's ID and the host (abc.com) into text fields, and when I submit/search I want the results to show, sorted by time, all of the domains/hosts the user went to in the 10 minutes before and leading up to abc.com, as well as all of the domains/hosts after abc.com. I am stuck and have tried different variations of the query below using sort and desc. I've gotten results; however, they only show the specific host that was entered into the text field and not the other hosts around that searched host. This is what I've started with and, as previously mentioned, I've tried altering it quite a few times:

    index=proxy userID=$user_id$ host=$host_id$
    | table _time, userID, host, ip
    | sort host span=10m, -host span=10m

Any assistance is appreciated!
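
A sketch of one approach, using the field names and tokens from the query above: let a subsearch find the time the user hit the chosen host (the most recent visit, via head 1), widen that time by 10 minutes on each side, and return it as earliest/latest so the outer search is bounded to that window without being filtered to the one host:

    index=proxy userID=$user_id$
        [ search index=proxy userID=$user_id$ host=$host_id$
          | head 1
          | eval earliest=_time-600, latest=_time+600
          | return earliest latest ]
    | table _time, userID, host, ip
    | sort 0 _time
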
Hi, I am trying to send data to the HTTP Event Collector (HEC) on a free trial via Postman, but I am getting the error "SSL Error: Self signed certificate in certificate chain". After searching online, I learned that some self-signed certs are required, so I added the certs in Postman, but now I am getting another error: "SSL Error: Hostname/IP does not match certificate's altnames". Could anyone help with this issue for a HEC free trial account? I have a hunch that the free trial may only support HTTP and that a paid cloud account is required for HTTPS. Kindly correct me if I am going in the wrong direction. Also, can anyone share certs if the free trial supports HTTPS?

    Endpoint: https://prd-***.splunkcloud.com:8088/services/collector/event
    Authorization: "Splunk **********"
Hello, I want to know if it's possible to upload files to Splunk Cloud through the HTTP Event Collector or some other way. Right now I have a file with lines as events, and I'm making an HTTP request for each line to load the events into Splunk. Do you have another solution, please? Thanks!
My Splunk access token seems to have been revoked today. My admin generated a new one but I don't see it. I have read permission.
Hello, everyone! I collect script logs from light forwarders directly to the indexers. The logs look like:

    0348788934="Y";
    0304394493="N";
    0874844788="Y";
    etc.

When Splunk parses them automatically, I get fields like 348788934=Y, 304394493=N, and so on. I added this to props.conf on the indexers:

    [my_sourcetype]
    FIELD_DELIMETERS=;

but it still does not work. Can anybody help? Thank you
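
For reference, a sketch of the usual delimiter-style, search-time setup. props.conf has no FIELD_DELIMETERS setting; a REPORT extraction in props.conf points at a transforms.conf stanza, and since these are search-time extractions they belong on the search head rather than the indexers. The regex below assumes every pair looks like the sample, i.e. digits, an equals sign, and a quoted value:

    # props.conf (on the search head)
    [my_sourcetype]
    REPORT-script_kv = script_kv

    # transforms.conf (on the search head)
    [script_kv]
    REGEX = (\d+)="([^"]*)"
    FORMAT = $1::$2
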
How do I find the time events were sent in over the last 3 days? I want to see the times that 53 different events came in.
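
A sketch that shows both the event time and the time Splunk indexed each event, assuming "sent in" means index time; the index, sourcetype, and whatever filter narrows things down to the 53 events of interest are placeholders:

    index=your_index sourcetype=your_sourcetype earliest=-3d
    | eval received_at=strftime(_indextime, "%Y-%m-%d %H:%M:%S")
    | table _time, received_at, host, source
    | sort 0 _time
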
We are trying to run some custom commands that require Cython, but Splunk's Python doesn't support it. We tried creating an Anaconda environment inside the app, just like the MLTK and Python for Scientific Computing apps do, but some issues appeared regarding Anaconda symlinks. This is being discussed in another thread: https://community.splunk.com/t5/Developing-for-Splunk-Enterprise/Why-when-installing-custom-made-app-that-contains-symlinks-the/td-p/592751 Has anyone managed to run Python custom commands that require Cython?