All Posts

Thank you. I really appreciate it.
You can do this in one alert but it gets a bit messy - you would probably be better off using two alerts with different schedules, time periods, and alert criteria.
What is the pattern? Please describe it in more detail. (Regular expressions work by finding patterns but you have to be able to precisely describe the pattern.)
I have a Splunk universal forwarder installed, and Splunk Enterprise is seeing the forwarder. Now I want to send network firewall logs to the host running the forwarder so they can be forwarded on to the Enterprise platform.
I'm trying to produce an architecture diagram of our Splunk environment and I want to know what each of our universal forwarders and heavy forwarders are ingesting and sending. I'm looking in inputs and outputs.conf but they are of no use. Is there a way to view what each forwarder is ingesting and sending, whether that be via the command line or in Splunk itself?
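One way to sketch this (assuming your forwarders send their internal logs to the indexers, which is the default) is to query the forwarders' metrics.log for per-sourcetype throughput, which shows what each forwarder host is actually sending:

```
index=_internal source=*metrics.log* group=per_sourcetype_thruput
| stats sum(kb) as total_kb by host, series
| sort - total_kb
```

Here `host` is the forwarder and `series` is the sourcetype it is forwarding. For the configured (as opposed to observed) inputs, you can also run `splunk btool inputs list --debug` on each forwarder from the command line to see the merged inputs.conf.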
I want a condition like: when severity=ERROR, show its received payload event; if it has a sync/C2V event then it is a COO error, and if it does not have that then it is an RDR error. Is there any way to do this? Please help me with this. Thanks
Hello, I'm using Splunk Cloud. I have Jenkins logs indexed into my system, but for some reason the event breaking is wrong. I took an output example and added it to Splunk with the "Add Data" option, and there it looks OK, but when I search for the sourcetype it is still broken. What is the best way to parse Jenkins logs? This is my sourcetype configuration:

[console_logs]
CHARSET = UTF-8
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
category = Structured
disabled = false
pulldown_type = true

and I want it to be shown in chunks like:

<time> Started by user
<time> Finished:
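A minimal sketch of a fix, assuming each Jenkins console line begins with an ISO-style timestamp (adjust the regex to whatever timestamp format your logs actually use): turn off line merging and make LINE_BREAKER break only before a timestamp, so multi-line output stays in one event.

```
[console_logs]
SHOULD_LINEMERGE = false
# break before any line that starts with a timestamp (hypothetical format)
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2})
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 25
```

Note that in Splunk Cloud these props must be applied where parsing happens (an uploaded app, or a heavy forwarder in front of Cloud), not only in the UI.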
Also, can we define 2 different search run intervals in this query? Like below:

index=ABC sourcetype=XYZ login | stats count | where count=0
between 23:00 and 07:00, the search can be run every 2 hours to check the last 2 hours of events

AND

index=ABC sourcetype=XYZ login | stats count | where count<=20
between 07:00 and 23:00, the search can be run every 1 hour to check the last 1 hour of events
Thanks, but I have a number of URIs with the same pattern.
Hi, I checked this doc https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Forwarddatatothird-partysystemsd#Send_a_subset_of_data_to_a_syslog_server and changed from TCP to SYSLOG, and it also works:

outputs.conf:

[syslog]
forwardedindex.3.blacklist = (.*)
forwardedindex.4.whitelist = (indexA)

[syslog:syslog_qradar_10_10_10_10_514]
disabled = false
sendCookedData = false
server = 10.10.10.10:514

props.conf:

[source::9997]
TRANSFORMS-routing = send_to_qradar_syslog_10_10_10_10_514

transforms.conf:

[send_to_qradar_syslog_10_10_10_10_514]
DEST_KEY = _SYSLOG_ROUTING
FORMAT = syslog_qradar_10_10_10_10_514
REGEX = .

And the question is - how do I change this config (what should I add) in order to send logs from indexA to port 514 and logs from indexB to port 12468?

regards, pawelF
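One possible sketch (not tested against your environment; the second destination's server/port and the stanza names are placeholders, and it assumes the index has already been assigned when the transform runs): define a second syslog output group, then key each routing transform on the event's index via SOURCE_KEY = _MetaData:Index instead of using a catch-all REGEX.

```
# outputs.conf: one syslog group per destination
[syslog:syslog_qradar_10_10_10_10_514]
disabled = false
sendCookedData = false
server = 10.10.10.10:514

[syslog:syslog_dest_12468]
disabled = false
sendCookedData = false
server = 10.10.10.10:12468

# props.conf: apply both routing transforms
[source::9997]
TRANSFORMS-routing = route_indexA_514, route_indexB_12468

# transforms.conf: match on the event's index
[route_indexA_514]
SOURCE_KEY = _MetaData:Index
REGEX = ^indexA$
DEST_KEY = _SYSLOG_ROUTING
FORMAT = syslog_qradar_10_10_10_10_514

[route_indexB_12468]
SOURCE_KEY = _MetaData:Index
REGEX = ^indexB$
DEST_KEY = _SYSLOG_ROUTING
FORMAT = syslog_dest_12468
```

You would also need to widen the [syslog] forwardedindex whitelist to cover indexB, since your current config whitelists only indexA.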
Thank you @gcusello,  Its Working
| rex "(?<uri>/ready/term/planess)"
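If the other URIs follow the same /ready/term/<something> shape, a generalized pattern might work — a sketch only, since I'm assuming the final segment never contains spaces, quotes, or further slashes; adjust the character class to your actual URIs:

```
| rex "(?<uri>/ready/term/[^\s\"/]+)"
```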
Since you renamed the count field, you have to use the new name in the calculation:

[search]
| stats count as EventCount by ClientName Outcome
| eventstats sum(EventCount) as total by ClientName
| eval percent=100*EventCount/total
index=ABC sourcetype=XYZ login
| stats count
| where count=0 OR (count<=20 AND tonumber(strftime(now(),"%H")) >= 7 AND tonumber(strftime(now(),"%H")) < 23)
Hi @cbiraris, yes, you have to define a threshold value using eval:

index=ABC sourcetype=XYZ login
| stats count
| eval time_hour=strftime(now(),"%H")
| eval threshold=if(time_hour>22 OR time_hour<8,0,20)
| where count<=threshold

Ciao. Giuseppe
Hello, I'm currently exploring the integration of Splunk with SAP Analytics Cloud for our data analysis and visualization needs. While I've found some documentation on the topic, I'm looking for practical advice and insights from those who have successfully implemented this integration. Specifically, I'd like to know:

- What are the key considerations when setting up the integration between Splunk and SAP Analytics Cloud?
- Are there any best practices or recommendations for optimizing data transfer and visualization between the two platforms?
- How can I ensure that real-time data from Splunk is effectively utilized in SAP Analytics Cloud for timely decision-making?
- Are there any common challenges or pitfalls I should be aware of during this integration process, and how can I mitigate them?

I have checked https://community.splunk.com/t5/Community/ct-p/en-us/SAP Analytics Cloud Course for guidance. If you have experience with this integration or can point me to valuable resources, I would greatly appreciate your insights. Thank you!
Dear All, I have a dashboard with a Choropleth map presenting established connections from various countries.

index=*** sourcetype=*** bla bla
| bla bla bla
| iplocation IP
| table Time Username IP Country
| stats count by Country
| eval count=case(count < 10, "1:Less than 10", (count > 10 AND count <= 20), "2:Between 10 to 20", (count > 20 AND count <= 50), "3:Between 21 to 50", (count > 51 AND count <= 100), "4:Between 51 to 100", (count > 100 AND count <= 500), "5:Between 101 to 500", (count > 500), "6:More than 500")
| sort +count
| geom geo_countries featureIdField=Country

In the legend, I see colours with the count of established connections. May I edit the legend so that the name of each country shows up along with the count? I have spent many days googling but unfortunately I am unable to get the answer. I also tried the Cluster Map visualization, but unfortunately, no luck for me. Thank you very much in advance for your advice.

index=*** sourcetype=*** bla bla
| bla bla bla
| rename ext_device as VPN, ext_when as Time, ext_Username as Username, ext_IP_addr as IP
| iplocation IP
| geostats count by Country
Oh, I see what I did, but I'm not sure why it works the way it does. This is what I did:

[search]
| stats count as EventCount by ClientName Outcome
| eventstats sum(EventCount) as total by ClientName
| eval percent=100*count/total

It works when I do what you said exactly! Thanks!
/ready/term/planess
Which part is the uri field?