All Posts


I suspect you want to know about priority alerts, but how will Splunk magically know about this? It's always better to give good context to the Splunk community, so what is P1C? And JMET sounds like some internal company environment code (which you should anonymise). Unless you have P1C in the saved search title, for example my_search_P1C, Splunk will not be able to find it or filter on it. Otherwise you will need to use the eval command and, for each saved search that you know is a P1C, assign an eval field called priority, but that will require a lot of work. Tip: as ever, it is best practice to have good business naming conventions; it makes things easier in the long run. Example using makeresults to assign P1C:
| makeresults count=2
| streamstats count as search_num
| eval title=case(search_num=1, "my_savedsearch1", search_num=2, "my_savedsearch2")
| eval priority=if(title=="my_savedsearch1", "P1C", null())
| fields - search_num
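If the naming convention is in place, you can also list the matching saved searches directly with the REST command. A minimal sketch, assuming a `*_P1C` suffix convention (adjust the wildcard to whatever convention you actually use):

```spl
| rest /services/saved/searches splunk_server=local
| search title="*_P1C"
| table title, search, cron_schedule
```

This returns one row per saved search whose name ends in _P1C, along with its search string and schedule, which you can then join or tag as needed.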
Please read this: https://docs.splunk.com/Documentation/Splunk/9.3.0/Installation/ChoosetheuserSplunkshouldrunas  It describes how Splunk works on Windows and which kind of user you should select to fulfill your requirements.
Have you read and followed the steps in this document: https://docs.splunk.com/Documentation/Splunk/9.3.0/Indexer/Migratenon-clusteredindexerstoaclusteredenvironment ? Based on your comments and questions and where you currently are, I doubt it! We cannot help you without knowing exactly what you have done. I hope you have kept a journal of your steps and can share it. You should also check what is in each node's splunkd.log.
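As a starting point for checking splunkd.log across nodes, you can search the internal index from any search head that can see the cluster members (this assumes _internal forwarding from those nodes is in place):

```spl
index=_internal sourcetype=splunkd log_level=ERROR
| stats count by host, component
| sort - count
```

This gives a quick per-host breakdown of which components are throwing errors, which is usually faster than reading raw log files node by node.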
If DB Connect is not supported with your current Splunk version, then plan to upgrade Splunk to a supported level to do what you want. If it is not supported, sometimes you can still install it and it may work, but you take that risk, and it is not advisable for production environments.
Hey Giuseppe, I followed all of your steps for HA configuration. I built a new search head, master node, and new indexer. I enabled clustering and added the search head and the newly built indexer to the cluster. Once I added the once-standalone Splunk server to the cluster, the Splunk service wouldn't start, and it now fails every time I try to start it. Any idea why this would be?
Getting data in requires a number of steps and some investigation. Some high-level notes/tips:
1. First, determine what data you want from Cloudflare; they offer a number of services.
2. Investigate what options they provide for getting that data: logs, API, syslog, etc.
3. Explore Splunkbase (search for Cloudflare) and see if there is an add-on (this is what typically helps you collect the data); do some homework to find out whether it supports the collection methods from step 2.
Once you have this, deploy the TA per its instructions and connect it to the data source.
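Once the add-on is deployed and configured, a quick sanity check that events are actually arriving looks like the search below. The sourcetype pattern is an assumption; the actual sourcetype names depend on which add-on you choose, so check its documentation:

```spl
index=* sourcetype=cloudflare* earliest=-1h
| stats count by index, sourcetype, source
```

If this returns nothing, check the add-on's input configuration and the _internal index for collection errors before assuming the data source itself is at fault.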
Hello, thank you for the response. I had taken captures; there are only two lines, followed by an ACK and a FIN, ACK:
TLSv1.2 Client Hello
TLSv1.2 Server Hello, Certificate, Server Key Exchange, Server Hello Done
TCP [ACK]
TCP [FIN, ACK]
I understand the issue is with the client certificate. Can you kindly help me answer the following:
Where do I find the certificate that is used by TA-cisco-cloud-security-umbrella-addon in Splunk?
What is the path/location of the certificate store used by the TA-cisco-cloud-security-umbrella-addon?
Can I keep my once-standalone server, now one of my indexers, as the deployment server? Or do I need to designate another server as the deployment server?
Currently working on these steps. I have copied indexes.conf from the standalone to the master node. Will I need to copy that config file to the new indexer as well?
This indicates that the SSL certificate is either missing from the certificate store or has expired in the add-on. Additionally, if the server is configured to use a self-signed or third-party certificate, it may not be included in the certificate store used by the add-on.
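To narrow down which component is rejecting the handshake, the add-on's errors logged into splunkd.log (and any add-on-specific log files ingested into _internal) can be searched. The source wildcard is an assumption about the add-on's log file naming, and extracted field names can vary by Splunk version:

```spl
index=_internal (source=*umbrella* OR (sourcetype=splunkd log_level=ERROR (SSL OR certificate)))
| table _time, host, source, component, event_message
```

The error text usually says whether the problem is an expired certificate, an unknown CA, or a missing client certificate, which tells you which store to fix.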
Hi @Jonathan.Wang, I found this existing post that talks about the same issue. Check it out and let me know if it helps. https://community.appdynamics.com/t5/Java-Java-Agent-Installation-JVM/Install-Events-service-Error/m-p/52419
@ankitarath2011 Please have a look  https://www.splunk.com/en_us/blog/tips-and-tricks/collecting-docker-logs-and-stats-with-splunk.html?locale=en_us  https://www.tekstream.com/blog/containerization-and-splunk-how-docker-and-splunk-work-together/ 
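As the first blog post describes, one common approach is Docker's built-in splunk logging driver, which forwards container logs to a HEC endpoint. A minimal /etc/docker/daemon.json sketch; the URL and token are placeholders you would replace with your own HEC endpoint and token:

```json
{
  "log-driver": "splunk",
  "log-opts": {
    "splunk-url": "https://splunk.example.com:8088",
    "splunk-token": "00000000-0000-0000-0000-000000000000",
    "splunk-format": "json",
    "splunk-insecureskipverify": "false"
  }
}
```

After editing daemon.json you need to restart the Docker daemon; the driver can also be set per container with `docker run --log-driver=splunk --log-opt ...`.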
Hi @Srinath.S, I found this AppD Docs page: https://docs.appdynamics.com/appd/24.x/24.7/en/cisco-appdynamics-essentials/dashboards-and-reports Let me know if it helps with your question. 
Many thanks! I was troubleshooting why Splunk was not reading out the Security event log. After adding "NT Service\SplunkForwarder" to the "Event Log Readers" group, it finally works.
I'm trying to call the nslookupsearch custom command. All that does is an nslookup for an IP or computer name. But I'm trying to use it in a search because some of the data we ingest doesn't contain the information we need, so we implemented the custom command to do an nslookup and populate a table with the retrieved data.
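As an alternative worth knowing about: Splunk ships a built-in external lookup, dnslookup, that does forward and reverse DNS resolution from within a search (it only works if the search head itself can resolve the names or IPs in question). A sketch, where the index name and src_ip field are placeholders for your own data:

```spl
index=my_index
| lookup dnslookup clientip AS src_ip OUTPUT clienthost AS resolved_host
| table src_ip, resolved_host
```

Because lookup runs per result row, this enriches the events in place, which may be simpler than chaining a separate custom generating command.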
You could try something along these lines:
| makeresults format=csv data="index,1-Aug,8-Aug,15-Aug,22-Aug,29-Aug
index1,5.76,5.528,5.645,7.666,6.783
index2,0.017,0.023,0.036,0.033,14.985
index3,2.333,2.257,2.301,2.571,0.971
index4,2.235,1.649,2.01,2.339,2.336
index5,19.114,14.179,14.174,18.46,19.948"
``` the lines above simulate your data (without the calculations) ```
| untable index date size
| eval date=strptime(date."-2024","%d-%b-%Y")
| fieldformat date=strftime(date,"%F")
| sort 0 index date
| streamstats last(size) as previous window=1 global=f current=f by index
| eval relative_size = 100 * size / previous
| fields - previous
| appendpipe [| eval date=strftime(date, "%F")." change" | xyseries index date relative_size]
| appendpipe [| eval date=strftime(date, "%F") | xyseries index date size]
| fields - date size relative_size
| stats values(*) as * by index
Hi @ferdousfahim, I usually apply these transformations at search time, but to apply them on forwarders you have to use INDEXED_EXTRACTIONS=CSV in props.conf; for more info see https://docs.splunk.com/Documentation/Splunk/9.3.0/Data/Extractfieldsfromfileswithstructureddata Ciao. Giuseppe
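A minimal props.conf sketch for this; the sourcetype stanza name and timestamp field are placeholders for your own data. Note that INDEXED_EXTRACTIONS is applied at input time, so this stanza must live on the forwarder that monitors the file, not only on the indexers:

```
[my_csv_sourcetype]
INDEXED_EXTRACTIONS = csv
HEADER_FIELD_LINE_NUMBER = 1
FIELD_DELIMITER = ,
TIMESTAMP_FIELDS = timestamp
```

After deploying this, restart the forwarder and verify the extracted fields appear on newly indexed events (already-indexed events are not reprocessed).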
Hi @MoeTaher, please try something like this:
index=EDR
| stats count
| eval Status=if(count > 0, "Compliant", "Not Compliant"), Solution="EDR"
| fields - count
| append [ | inputlookup compliance.csv | fields Solution Status ]
| stats first(Status) AS Status BY Solution
| outputlookup compliance.csv
Ciao. Giuseppe
Thanks, it worked! All I have to do is convert it to a percentage and we're all good to go. I'll pass along the karma.
Hi @MK3, good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma points are appreciated by all the contributors.