All Posts


Generally speaking: 1) Alerts are created to monitor a threshold and send an email notification (or run other actions, such as incident creation). 2) Reports are created for daily/weekly/monthly report generation (usually over a large dataset) and to email the results. 3) Dashboards are created for viewing/checking/showcasing the current status of a search query or system. So you generally won't need email notifications from a dashboard. Hope that helps. Thanks.    As you are a new member, please note that karma points / upvotes are appreciated by everybody. Thanks.
Hi, if unnecessary fields are displayed from the token, please check whether you unset token 'nf' when 'sf' is present, and vice versa: if 'nf' is present, unset the 'sf' fields.
Hi @jerome ... troubleshooting this requires more details from you. 1. From the UF, are you able to receive other logs on the indexer? 2. Were these Java logs showing up on the indexer previously, or did it never work since you configured it? 3. Is this a prod or test system? 4. Please share your inputs.conf configuration from the UF.
The most common way to handle this is to use append instead.  The following example uses eventstats. Note that eventstats groups by exact field values, so the subsearch normalizes the MAC format rather than wrapping it in wildcards, and the username field is spelled consistently as Username throughout.

index=aruba sourcetype="aruba:stm" "*Denylist add*" OR "*Denylist del*"
| eval stuff=split(message," ")
| eval mac=mvindex(stuff,4)
| eval mac=substr(mac,1,17)
| eval denyListAction=mvindex(stuff,3)
| eval denyListAction=replace(denyListAction,":","")
| eval reason=mvindex(stuff,5,6)
| dedup mac, denyListAction, reason
| append
    [ search index=main host=thestor Username="*adgunn*"
      | dedup Client_Mac
      | eval mac=replace(Client_Mac,"-",":")
      | fields mac Username ]
| eventstats values(Username) as Username by mac
| where isnotnull(Username)
| table _time, mac, denyListAction, reason, Username
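To make the append-plus-eventstats pattern concrete, here is a small Python sketch of the same grouping logic: both result sets land in one pipeline, usernames are gathered per MAC, written back onto every row with that MAC, and rows with no username are dropped. The sample MACs and the username are made up for illustration, and the sketch assumes exact MAC matches.

```python
from collections import defaultdict

# Toy rows standing in for the main search results (deny-list events)
deny_events = [
    {"mac": "aa:bb:cc:dd:ee:01", "denyListAction": "add"},
    {"mac": "aa:bb:cc:dd:ee:02", "denyListAction": "del"},
]

# Toy rows standing in for the appended subsearch (MAC-to-user mapping)
user_rows = [
    {"mac": "aa:bb:cc:dd:ee:01", "Username": "adgunn"},
]

# append: both result sets end up in one pipeline
combined = deny_events + user_rows

# eventstats values(Username) by mac: gather the usernames seen per MAC...
users_by_mac = defaultdict(set)
for row in combined:
    if "Username" in row:
        users_by_mac[row["mac"]].add(row["Username"])

# ...then write them back onto every row with that MAC
for row in combined:
    row["Username"] = sorted(users_by_mac[row["mac"]]) or None

# where isnotnull(Username): keep only rows that picked up a username
result = [r for r in combined if r["Username"]]
```

As in SPL, the subsearch's own rows also survive the final filter (they carry a Username too); the deny event for the unmatched MAC is discarded.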
I created the index via Splunk and have a log4j-spring.xml file with the necessary configuration for Splunk; see below. I'm using Log4j as the logging mechanism in my application.

<?xml version="1.0" encoding="UTF-8"?>
<Configuration>
  <Appenders>
    <Console name="console" target="SYSTEM_OUT">
      <PatternLayout pattern="%style{%d{ISO8601}} %highlight{%-5level }[%style{%t}{bright,blue}] %style{%C{10}}{bright,yellow}: %msg%n%throwable" />
    </Console>
    <SplunkHttp name="splunkhttp"
                url="http://localhost:8088"
                token="*******"
                host="localhost"
                index="gam_event_pro_dev"
                type="raw"
                source="gameventpro"
                sourcetype="log4j"
                messageFormat="text"
                disableCertificateValidation="true">
      <PatternLayout pattern="%m" />
    </SplunkHttp>
  </Appenders>
  <Loggers>
    <!-- LOG everything at INFO level -->
    <Root level="info">
      <AppenderRef ref="console" />
      <AppenderRef ref="splunkhttp" />
    </Root>
  </Loggers>
</Configuration>

I have admin access to our Splunk account, so permissions should not be an issue.
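One way to narrow this down is to take the appender out of the loop and test the HTTP Event Collector endpoint directly. The sketch below builds (but does not send) the request that HEC's raw endpoint expects, using the URL, index, and sourcetype from the config above; the token is a placeholder you would replace with your real one. Sending the built request with urllib.request.urlopen(req) should return a small JSON success body if the token and index are valid.

```python
import urllib.request

# Values below mirror the log4j-spring.xml config; the token is a placeholder.
HEC_URL = "http://localhost:8088/services/collector/raw"  # raw endpoint, matches type="raw"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

def build_hec_request(event_text: str) -> urllib.request.Request:
    """Build (without sending) a HEC raw-endpoint request.

    HEC authenticates via the 'Authorization: Splunk <token>' header;
    index and sourcetype can be passed as query parameters on the raw endpoint.
    """
    return urllib.request.Request(
        HEC_URL + "?index=gam_event_pro_dev&sourcetype=log4j",
        data=event_text.encode("utf-8"),
        headers={"Authorization": "Splunk " + HEC_TOKEN},
        method="POST",
    )

req = build_hec_request("test event from HEC smoke test")
```

If a direct POST like this succeeds but the appender's events never show up, the problem is on the Log4j side (e.g. the appender not firing); if the POST also fails, check the token, the index name, and whether HEC is enabled on port 8088.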
And what is it that you did? Because "all required integration steps" doesn't tell us anything. Are you writing your logs to files and ingesting events from those files? Are you sending directly to Splunk from your app? If so, how, and to which component? If you configured the process with a specific destination index, are you sure the user you're checking with has the permissions to access that index? Just a few questions to start.
Hi, I'm trying to integrate Splunk with our Spring Boot Java application. I believe I have completed all the required integration steps, but the logs are not showing up in our Splunk account.    Thanks,   Jerome
It would be shocking if this vendor decided to abandon PowerShell.
Example https://www.tewari.info/2016/02/22/using-powershell-with-splunk/
Ten years ago there was a module called, of all things, "splunk" that you could use to connect to an instance and pull data out. The connect cmdlet was called Connect-Splunk and was part of the module set.
Will do, thank you.
The certificate on Splunk's download site expired.  Give them a day or two to fix it and try again.
What exactly are you looking at/for?  Have you checked Splunkbase (apps.splunk.com)? What problem are you trying to solve?
I couldn't find those details, either.  I think they're one of those companies that hide their documentation, so you may need to sign in to their customer portal (there's a link at the bottom of the page).
Is there no current PowerShell module support for Splunk?  I am only finding old articles about this on various sites.
Hey guys, I keep getting this privacy error every time I attempt to download Splunk Enterprise on a Mac. I read somewhere that removing the "s" after "http" should resolve the issue, but I still keep getting the error. Thanks for any help.

https://download.splunk.com/products/splunk/releases/9.1.1/osx/splunk-9.1.1-64e843ea36b1-darwin-64.tgz

"download.splunk.com normally uses encryption to protect your information. When Chrome tried to connect to download.splunk.com this time, the website sent back unusual and incorrect credentials. This may happen when an attacker is trying to pretend to be download.splunk.com, or a Wi-Fi sign-in screen has interrupted the connection. Your information is still secure because Chrome stopped the connection before any data was exchanged."
I haven't used importtool myself, but the logical thing to do is to run it on an indexer. If you run it on the master, it has no way of replicating the data to the indexers, because the master is not part of the replication group.
Hi @sigma, have you tried using the collect command (https://docs.splunk.com/Documentation/Splunk/9.1.1/SearchReference/Collect) from the Search Heads? In other words, you run a search on one index and then use the collect command: index=old_index | collect index=new_index Obviously you can define the time period to export. Ciao. Giuseppe
Hi all, I deployed Splunk and enabled indexer clustering. Then I created an index in master-apps, and it was replicated to the peer nodes. Now I want to export some events from an index and import them into the newly created index. I tested multiple methods. I exported events using the following command:

./splunk cmd exporttool /opt/splunk/var/lib/splunk/defaultdb/db/db_1305913172_1301920239_29/ /myexportpath/export1.csv -et 1302393600 -lt 1302480000 -csv

and imported the result using the following command:

./splunk cmd importtool /opt/splunk/var/lib/splunk/defaultdb/db /myexportpath/export1.csv

but the data was not replicated to the indexers. I tried another method using the UI on the cluster master: I imported my events into the newly created index. When searching on the cluster master everything is OK, but these events are not replicated to the indexers. Note that my new index is not shown in the Indexes tab under Indexer Clustering: Manager Node. There are just three indexes: _internal, _audit, _telemetry. I think I'm going about this the wrong way. Does anyone have an idea?