All Posts

Sorry I missed the definition here. This is easily fixable:

index=owner ``` where source != host.csv ```
| rename ip as ip_address
| append [inputlookup host.csv | eval source = "host.csv"]
| stats values(owner) as owner values(source) as source by host ip_address
| where source == "host.csv"
| fields - source

Here, I am back to using the side effect of Splunk's multivalue equality. Here is the full emulation:

| makeresults format=csv data="ip, host, owner
10.1.1.3, host3, owner3
10.1.1.4, host4, owner4
10.1.1.5, host5, owner5"
| eval source = "not-host.csv"
``` the above emulates index=owner ```
| rename ip as ip_address
| append [makeresults format=csv data="ip_address, host
10.1.1.1, host1
10.1.1.2, host2
10.1.1.3, host3
10.1.1.4, host4"
``` the above emulates | inputlookup host.csv ```
| eval source = "host.csv"]
| stats values(owner) as owner values(source) as source by ip_address host
| where source == "host.csv"
| fields - source
Hello Cisco Security team,

First, I'd like to say thank you for creating such a great Splunk app! I am now playing with it and found that the app receives syslog directly on the Splunk combined instance itself. I would like to install it in a test network where the FMC generates approximately 300-500 MB of syslog per hour. Assuming 700 bytes per event, that could reach roughly 200 events per second.
https://community.cisco.com/t5/network-security/fmc-connection-events-log-size-and-location/td-p/4769765

What event rate is this application designed to handle? Any advice on performance, such as using multiple sockets or increasing the receive buffer size, would be appreciated.

Thank you, Urikura
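For reference, a quick back-of-envelope version of that estimate (a sketch only, using the 500 MB/hour and ~700 bytes/event figures from the post):

# Rough events-per-second estimate from the volume figures above.
mb_per_hour = 500          # upper end of the stated 300-500 MB/hour
bytes_per_event = 700      # assumed average event size

events_per_hour = (mb_per_hour * 1024 * 1024) / bytes_per_event
events_per_second = events_per_hour / 3600
print(f"~{events_per_second:.0f} events/sec")  # roughly 200 EPS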
It didn't help... What I mean is that a user has not made a check for 365 (or more) days up to now.
Hi All,

Deployment: single-instance Splunk Enterprise
What I want: install Splunk_TA_stream on my universal forwarder to capture DNS traffic with Stream

The docs I followed:
https://docs.splunk.com/Documentation/StreamApp/8.1.3/DeployStreamApp/Deploymentrequirements
https://lantern.splunk.com/Data_Descriptors/DNS_data/Installing_and_configuring_Splunk_Stream

The app and add-on are already installed, and Splunk_TA_stream has been deployed to the UF. However, streamfwd.exe is not running, and I don't see the UF in the dashboard (only the Splunk single instance itself is present, and it is even in Error status).

Any insights to help me figure out what went wrong?

Thank you in advance.
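A quick local check you could run on the forwarder host (a sketch only; the install path and the TA's binary location below are assumptions about a default Windows UF layout, so adjust them to your setup):

import os
import subprocess

splunk_home = r"C:\Program Files\SplunkUniversalForwarder"  # assumed default install path
streamfwd = os.path.join(splunk_home, "etc", "apps", "Splunk_TA_stream",
                         "windows_x86_64", "bin", "streamfwd.exe")  # assumed binary location

# Was the binary actually deployed with the TA?
print("streamfwd.exe deployed:", os.path.exists(streamfwd))

# Is the process currently running on this UF?
tasks = subprocess.run(["tasklist"], capture_output=True, text=True).stdout
print("streamfwd.exe running:", "streamfwd.exe" in tasks.lower())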
I've got a string that contains CSV contents. How do I send an email that has an attachment which is made from my string variable?
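One generic way to do this, if you can run a small script, is with Python's standard library (a minimal sketch only; the SMTP server, addresses, filename, and CSV content are placeholders):

import smtplib
from email.message import EmailMessage

# The CSV content lives in a plain string, as described in the question.
csv_string = "host,ip\nhost1,10.1.1.1\nhost2,10.1.1.2\n"

msg = EmailMessage()
msg["Subject"] = "CSV report"
msg["From"] = "sender@example.com"      # placeholder
msg["To"] = "recipient@example.com"     # placeholder
msg.set_content("Report attached as CSV.")

# Attach the string as a file instead of pasting it into the body.
msg.add_attachment(csv_string.encode("utf-8"),
                   maintype="text", subtype="csv",
                   filename="report.csv")

with smtplib.SMTP("smtp.example.com", 25) as smtp:  # placeholder server
    smtp.send_message(msg)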
Thank you ever so much
I appreciate the explanation and example. The search that I have is very long and does a lot of calculation, so it's not that easy to apply your suggestion. I've been doing something similar, but much simpler: I just decode the URL using a URL decoder, then open a new search and paste the search in. Thank you for your suggestion.
Specifically, in my use case, let's just say the display returns "Status failed: unique id". Can I still pattern match the "Status failed" part?
I just had this exact issue installing Splunk on a Windows 2022 Server running on ESXi. Followed your advice and worked like a charm. Thank you, sir.
Interesting, because I didn't decide to uninstall first; I was told by support to do it.
It's not necessary to uninstall a universal forwarder before upgrading it.  Just run the installer and it will perform the steps needed for the upgrade.
|union
    [ search index=osp source=xxx EVENT_TYPE=xxx EVENT_SUBTYPE=xxx field1=* field3=xxx field4=""
      | eval DATE = strftime(strptime(xxx, "%Y%m%d"), "%Y-%m-%d")
      | stats latest(source) as example1 by field5 field6 DATE]
    [ search index=osp source=xxx EVENT_TYPE=xxx EVENT_SUBTYPE=xxx field1=* field3=xxx field3=xxx field4=""
      | eval DATE = strftime(strptime(xxx, "%Y%m%d"), "%Y-%m-%d")
      | stats latest(source) as example2 by field5 field6 DATE]
    [ search index=osp source=xxx EVENT_TYPE=xxx EVENT_SUBTYPE=xxx field1=* field3=xxx NOT field3=xxx field4=""
      | eval DATE = strftime(strptime(xxx, "%Y%m%d"), "%Y-%m-%d")
      | stats latest(source) as example3 by field5 field6 DATE]
| stats count(example1) as "example 1", count(example2) as "example 2", count(example3) as "example 3" by DATE

The data is populating correctly for example 1 and example 3 individually, and if I just use two queries. However, I need all 3 queries for my data, but data is missing from example 2.
I have created a simple add-on on Splunkbase that can identify bad CSV files in your environment: https://splunkbase.splunk.com/app/7497
Verify your network allows connections *out* to your Splunk Cloud stack's port 8089.
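If it helps, a quick way to test that outbound path from the affected machine (a sketch; the stack hostname below is a placeholder):

import socket

host = "yourstack.splunkcloud.com"  # placeholder for your Splunk Cloud stack
port = 8089                         # Splunk management port

try:
    # Attempt a plain TCP connection; success means the outbound path is open.
    with socket.create_connection((host, port), timeout=5):
        print(f"TCP connection to {host}:{port} succeeded")
except OSError as err:
    print(f"TCP connection to {host}:{port} failed: {err}")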
Also, I am trying to refrain from using CSS styling. Is there an alternative way?
Is there a way for me to match a background color if the output from the panel involves rex? For example, if the output displays a unique error, how do I still match the background color to red without changing the display text for single visualization panels?
Also, in A&I:

08-03-2024 03:38:37.525 INFO ChunkedExternProcessor [25501 searchOrchestrator] - Running process: /opt/splunk/bin/python3.9 /opt/splunk/etc/apps/SA-IdentityManagement/bin/entitymerge_command.py
08-03-2024 03:38:37.845 ERROR ChunkedExternProcessor [25506 ChunkedExternProcessorStderrLogger] - stderr: (AttributeError) module 'time' has no attribute 'clock'

I searched around and changed the following:

vi /opt/splunk/etc/apps/SA-Utils/lib/SolnCommon/cexe.py

replacing time.clock with time.time.
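For context, time.clock() was removed from the standard library in Python 3.8, which is why the script fails under /opt/splunk/bin/python3.9. A minimal compatibility sketch of the same idea (illustration only; editing shipped app code can be overwritten by app upgrades):

import time

# time.clock() was deprecated in Python 3.3 and removed in 3.8, so code
# that still calls it raises AttributeError under Python 3.9.
# This shim mirrors the manual edit above (time.clock -> time.time);
# time.perf_counter is the usual drop-in for timing measurements.
if not hasattr(time, "clock"):
    time.clock = time.perf_counter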
Hi, I think this will help you: https://community.splunk.com/t5/Splunk-Search/What-is-the-relation-between-the-Splunk-inner-left-join-and-the/m-p/391288/thread-id/113948 That answer defines those joins and shows how you should do them in Splunk. r. Ismo
Hi, if I understood your issue correctly, you have used too much SPLUNK_DB space on your system. There are many instructions on the net for how to move/change SPLUNK_DB to another drive on Windows, which is what you need to do. Another option is to lower your disk space limits, but that will probably prevent you from collecting enough events into your system. You could also ask a local Splunk partner to fix this for you. r. Ismo