All Posts

I'm trying to discover the source inputs.conf file that is responsible for pulling in the WinEventLogs. Our original implementation was back in 2019 and was completed by another SME who has since moved on. When we implemented Splunk Cloud there were many other onsite components implemented, including an IDM server. Since moving to the Victoria Experience we no longer utilize an IDM server, but we have the rest of the resources in place as shown in my attachment. That said, I'm just trying to confirm where to filter my oswin logs from, but I'm not convinced I have identified the source. While I found the inputs.conf file under Splunk_TA_windows (where I'd expect it to be) on the deployment server, I'm not confident it's responsible for this data input, because all of the entries in the stanza specific to WinEventLog ... have disabled = 1. So while I want to believe, I cannot. I've looked over multiple files, but more importantly: where are my WinEventLogs truly being sourced from (which inputs.conf)? I've reviewed my resources on the Deployment Server, DMZ Forwarder and Syslog UFW Server and am not finding anything else that would be responsible, nor anything installed regarding Splunk_TA_windows. However, I am indeed getting plenty of data, and I'm trying to be more efficient with our ingest and looking to filter some of these types of logs out. TIA
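A minimal way to trace which inputs.conf Splunk is actually honoring (a sketch, assuming shell access and a default $SPLUNK_HOME on each candidate host; on Windows forwarders use splunk.exe instead) is btool with --debug, which prints the file each stanza and attribute is read from:

    # --debug prefixes every line with the config file it came from
    $SPLUNK_HOME/bin/splunk btool inputs list --debug | grep -i WinEventLog

Note that the attribute Splunk honors is "disabled" (not "disable"), so it's worth double-checking the exact spelling in the stanza before concluding the input is off.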
I am having the same issue.  I have tried all the recommendations above.  Thank you in advance for any assistance.
I'm wondering if anyone could advise on how to best standardize a log of events with different fields. Basically, I have a log with about 50 transaction types (same source and sourcetype), and each event can have up to 20 different fields based on a specific field, ActionType. Here are a few sample events with some sample/generated data:

2025-02-10 01:09:00, EventId="6", SessionId="123abc", ActionType="Logout"
2025-02-10 01:08:00, EventId="5", SessionId="123abc", ActionType="ItemPurchase", ItemName="Item2", Amount="200.00", Status="Failure", FailureReason="Not enough funds"
2025-02-10 01:07:00, EventId="4", SessionId="123abc", ActionType="ItemPurchase", ItemName="Item1", Amount="500.00", Status="Success"
2025-02-10 01:06:00, EventId="3", SessionId="123abc", ActionType="ProfileUpdate", ElementUpdated="Password", NewValue="*******", OldValue="***********", Status="Failure", FailureReason="Password too short"
2025-02-10 01:05:00, EventId="2", SessionId="123abc", ActionType="ProfileUpdate", ElementUpdated="Email", NewValue="NewEmail@somenewdomain.com", OldValue="OldEmail@someolddomain.com", Status="Success"
2025-02-10 01:04:00, EventId="1", SessionId="123abc", ActionType="Login", IPAddress="10.99.99.99", Location="California", Status="Success"

I'd like to put together a table with a user-friendly EventDescription, like below:

Time | SessionId | Action | EventDescription
2025-02-10 01:04:00 | 123abc | Login | User successfully logged in from IP 10.99.99.99 (California).
2025-02-10 01:05:00 | 123abc | ProfileUpdate | User successfully updated email from OldEmail@someolddomain.com to NewEmail@somenewdomain.com
2025-02-10 01:06:00 | 123abc | ProfileUpdate | User failed to update password (Password too short)
2025-02-10 01:07:00 | 123abc | ItemPurchase | User successfully purchased Item1 for $500.00
2025-02-10 01:08:00 | 123abc | ItemPurchase | User failed to purchase Item2 for $200.00 (Not enough funds)
2025-02-10 01:09:00 | 123abc | Logout | User logged out successfully

Given that each action will have different fields, what's the best way to approach this, given that there could be about 50 different actions (possibly more in the future)? I was initially thinking this could be done using a series of case statements, like the one below. However, this approach doesn't seem too scalable or maintainable given the number of actions and the possible fields for each one:

eval EventDescription=case(ActionType="Login", case(Status="Success", "User successfully logged in from IP ".IPAddress." (".Location.")", 1=1, "User failed to login"), ActionType="Logout", ...etc.)

I was also thinking of using a macro to extract the fields and compose an EventDescription, which would be easier to maintain since the code for each Action would be isolated, but I don't think executing 50 macros in one search is the best way to go. Is there a better way to do this? Thanks!
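One maintainable pattern (a sketch under assumptions, not a tested answer) is to move the per-action wording out of SPL and into a lookup. Assume a hypothetical lookup file action_descriptions.csv, uploaded as a lookup, with columns ActionType, Status, template, where template holds placeholders like {IPAddress}:

    ActionType,Status,template
    Login,Success,"User successfully logged in from IP {IPAddress} ({Location})."
    ItemPurchase,Failure,"User failed to purchase {ItemName} for ${Amount} ({FailureReason})"

A single foreach can then substitute whichever fields each event happens to carry:

    | lookup action_descriptions.csv ActionType Status OUTPUT template
    | eval EventDescription=template
    | foreach IPAddress Location ItemName Amount FailureReason ElementUpdated NewValue OldValue
        [ eval EventDescription=replace(EventDescription, "\{<<FIELD>>\}", coalesce('<<FIELD>>', "")) ]
    | table _time SessionId ActionType EventDescription

New action types then only need a new lookup row, not new SPL.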
Try like this - note that your search needs to use the $app_name_choice$ token not $app_name$

<input type="multiselect" token="app_name">
  <label>Application Name</label>
  <choice value="All">All</choice>
  <default>All</default>
  <initialValue>*</initialValue>
  <fieldForLabel>app_name</fieldForLabel>
  <fieldForValue>app_name</fieldForValue>
  <search base="base_search">
    <query>|stats count by app_name</query>
  </search>
  <valuePrefix>app_name="</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter> OR </delimiter>
  <change>
    <eval token="form.app_name">case(mvcount('form.app_name')=0,"All",mvcount('form.app_name')&gt;1 AND mvfind('form.app_name',"All")&gt;0,"All",mvcount('form.app_name')&gt;1 AND mvfind('form.app_name',"All")=0,mvfilter('form.app_name'!="All"),1==1,'form.app_name')</eval>
    <eval token="app_name_choice">if('form.app_name'=="All","app_name=\"*\"",'app_name')</eval>
  </change>
</input>
Yes, there is at least one firewall between the client network and the intermediate forwarder network. I did a quick and dirty test like you did by making a PowerShell script that ran on the client subnet and simply opened as many connections to the IF as it could. I created a corresponding server script to listen on a port. As expected, the server maxed out at 16000 connections. This confirms that there is not a networking device between the client network and the IF network that would limit the total number of connections.

The inputs and outputs that you have are effectively the same as what I have. I am not doing anything special with them and it is just about as basic as it comes. The next hop from the IF to the indexers needs to go through a NAT, as my IF is a private address and the indexers are public. I suspect the IF server itself would allow more than 1k connections and that something upstream is limiting them, but I don't have an easy way to verify this. I don't control the indexers, so I can't do a similar end-to-end connection test with a lot of ports.

I am still scratching my head on this, and like I said, I am not satisfied with the suggestion of just building more IF servers and limiting them to 1k clients each.
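One quick check worth trying (a sketch, assuming the IF's splunktcp input listens on the default port 9997 and you have shell access on the IF) is to count the established inbound sessions directly, to see whether the plateau sits exactly at the 1k mark, and to confirm the process file-descriptor ceiling isn't the limiter:

    # established TCP sessions on the splunktcp listening port (subtract 1 for the header line)
    ss -tan state established '( sport = :9997 )' | wc -l

    # splunkd's open-files limit
    cat /proc/$(pgrep -f splunkd | head -1)/limits | grep 'open files'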
That's a warning, not an error.  The file will be ingested, but while Splunk is busy with it other monitored files are ignored. Consider standing up a separate UF on that server just for the large files. Also, make sure maxKBps in limits.conf is set to 0 or the largest value the network can support.
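For reference, a minimal sketch of that setting, placed on the forwarder (e.g. in $SPLUNK_HOME/etc/system/local/limits.conf or an app's local directory):

    [thruput]
    # 0 removes the forwarder's default throughput cap (256 KBps on a UF);
    # otherwise set the highest KBps the network and indexers can absorb
    maxKBps = 0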
For a particular sourcetype I am facing a log ingestion issue and getting the error below. As checked with the team, this log file cannot be split. Is there any solution to resolve this issue?
I agree!  Oh well self service CIM it is.
Thanks for that information. We actually had the previous version installed and the upgrade wouldn't work, so we deleted it and intended to re-install. Didn't even think to check compatibility since the previous version was installed already.
Hello All, We’ve recently been encountering an issue when editing a classic dashboard in Splunk. Whenever we try to edit a dashboard containing a "mailto" protocol, we receive the following error:

Uses scheme: "mailto", but the only acceptable schemes are: {"https", "http"}

However, dashboards without the "mailto" protocol are working fine and we are able to edit them without any issues. Has anyone experienced this before? Is there a known solution or workaround to bypass or resolve this issue, allowing us to edit dashboards that include the "mailto" protocol? I would appreciate any guidance or suggestions. Thanks in advance!
Sorry, I don’t know why it’s not available in the trial. It could be that the trial version is lower than the production version, which is why it’s not displaying for you?
Hi @Ankur.Sharma, Thanks for checking out the Community. Given how old this post is, it may not get a reply. If the community does not jump in soon, you can reach out to AppDynamics Support: How do I open a case with AppDynamics Support? 
Also... INDEXED_EXTRACTIONS uses up disk space... so I almost never use it 
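For anyone weighing the trade, a minimal props.conf sketch of the two approaches (the sourcetype name is hypothetical; pick one or the other):

    # index-time parsing, set where the data is parsed (UF/HF):
    # fields are written into the index, which costs disk
    [my_structured_sourcetype]
    INDEXED_EXTRACTIONS = json

    # search-time alternative, set on the search head:
    # same fields at search time, no extra index footprint
    [my_structured_sourcetype]
    KV_MODE = json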
Thank you @gcusello. Our Proofpoint account manager said the following:

"There is an API but no mail flow API, so Splunk wouldn't have anything on the Essentials side. Enterprise side - Remote Syslog gets them all sorts of mail flow details! Having said that, the only way to get an integration with Splunk would be to upgrade from Essentials to our Enterprise email."

Is there a way to get the Proofpoint data without an upgrade?
Hi @Kenny_splunk - other people will still be able to reply, but the accepted answer will be at the top to allow others to see it easily if they come across the same question. Thanks! will
LOL/SOB I really wish there was more compliance around CIM, especially for these TAs built by big industry types... but yes...self-CIM
I don't think you actually want to remove "ALL" from the multi-select... it makes it so people can go back to the default when they are done with whatever choice they made originally. I would say if you don't want people to go back to the default, then maybe you don't want it there at all in the first place? Or maybe you don't actually want a multi-select but just a regular drop-down list? But it seems to me if you want it there in the first place, you actually want it there always so people can revert back to the default behavior of the dashboard when they are done messing around (IMO).
Here is my existing multiselect XML..

<input type="multiselect" token="app_name">
  <label>Application Name</label>
  <choice value="*">All</choice>
  <default>*</default>
  <initialValue>*</initialValue>
  <fieldForLabel>app_name</fieldForLabel>
  <fieldForValue>app_name</fieldForValue>
  <search base="base_search">
    <query>|stats count by app_name</query>
  </search>
  <valuePrefix>app_name="</valuePrefix>
  <valueSuffix>"</valueSuffix>
  <delimiter> OR </delimiter>
</input>
@ITWhisperer thanks for the reply. Where do I need to put this? In my existing multiselect input?
I just removed the complete kvstore folder from "/opt/splunk/var/lib/splunk/" after taking a backup, and then restarted the Splunk services.
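For anyone repeating this, a sketch of the sequence (assuming the default /opt/splunk install path; stop Splunk first so the KV store files are not in use):

    /opt/splunk/bin/splunk stop
    # keep a backup copy in case anything needs to be restored
    cp -a /opt/splunk/var/lib/splunk/kvstore /opt/splunk/var/lib/splunk/kvstore.bak
    rm -rf /opt/splunk/var/lib/splunk/kvstore
    # Splunk recreates an empty kvstore directory on startup
    /opt/splunk/bin/splunk start

Keep in mind this wipes any data stored in KV store collections (e.g. KV-store-backed lookups), so only do it if that loss is acceptable.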