All Topics


I have an indexer, a search head, and a heavy forwarder for a small installation. How do I configure them to communicate correctly?
I'm in the process of creating a small Splunk installation and I would like to know where I can download syslog-ng for Ubuntu Linux 20.x.
Splunk streamlines the process of extracting insights from large volumes of data. In this fast-paced world, using data wisely sets industry leaders apart. Splunk allows financial organizations to improve processes, boost security, and fine-tune customer interactions. In this article, discover Splunk's role in banking and changing financial services data management. Alternatively, for a more hands-on approach, try out the Splunk FSI Essentials app in this click-through demo. For more information on the app, see the Splunkbase page.

Digital Complexities in Financial Services

Every day, financial institutions create vast amounts of digital information from customer transactions, online banking, security systems, and business operations. Splunk allows financial firms to sort, understand, and, most importantly, gain insights from this data. The finance industry must follow strict government rules to ensure customer data privacy and security while using that data to make smarter business decisions. Organizations that can successfully manage their data have an advantage in terms of time to insights, understanding customer needs, and staying ahead of their competition in the fast-changing world of financial services.

Splunk & Financial Services

The financial services sector depends on Splunk for security and strict compliance with government regulations. Splunk tracks and reports critical information required for adherence. In daily operations, Splunk monitors system performance, identifies potential slowdowns, and finds ways to enhance customer service. Splunk also predicts potential problems, identifying issues early in the lifecycle. One of Splunk's most powerful features is its ability to present data in a single pane of glass. Instead of monitoring various systems, administrators view all essential information on easy-to-read dashboards. This improved efficiency accelerates problem-solving and enhances decision-making and customer understanding.
Risk Management

Financial institutions are increasingly focusing on risk management. Splunk's risk management capabilities enable quick and informed decisions.

Real-Time Monitoring and Alerts

Monitoring finances in real time is key to managing risks effectively. Splunk provides instant alerts about potential problems and catches issues early in the lifecycle. It prevents minor issues from escalating, safeguards assets, and builds trust.

Data-Driven Decision-Making

Splunk uses complex analytics to turn data into valuable insights. These insights help leaders make informed choices to lower risks. Splunk's comprehensive data guarantees that every key detail is monitored and presented to facilitate easy decision-making.

Enhancing Customer Experience

Enhancing customer experience is a crucial element of growth in the financial sector. The top financial organizations use data insights to predict client requirements. Studies show customers are more satisfied and loyal when banks provide services tailored to individual needs. A loyal customer base enables financial firms to differentiate themselves in a competitive market. By analyzing customer data, services and offers that resonate effectively are identified, enhancing relevance and engagement. By incorporating advanced data analytics, banks enhance efficiency and strengthen customer relationships through personalization. Leveraging data insights boosts customer retention and satisfaction. Banking services must be personalized, timely, and aligned with customers' needs in our fast-paced society.

Financial Fraud

In today's financial landscape, Splunk is crucial for combating fraud. Splunk's analytics empower banks to detect and prevent fraudulent activities, identify banking anomalies, and enhance fraud prevention measures.

Detecting Anomalies and Suspicious Activities

Splunk's tools facilitate the identification of unusual financial trends, which is crucial for rapidly detecting banking problems.
These tools also prevent illicit transactions, ensuring the safety of customer data and information while safeguarding the bank's reputation.

Machine Learning for Predictive Fraud Prevention

Splunk uses machine learning to predict fraud and stop it before it happens. As machine learning models evolve, financial firms have the upper hand over criminals.

Regulatory Compliance and Reporting

Splunk assists with regulatory compliance and transforms the industry by automating report generation and data analysis.

Automating Compliance Processes

Automated reporting reduces the time and resources required to manage financial data. Companies use real-time data to comply with regulations and avoid fines.

Regulatory Requirements

The finance landscape is ever-evolving, requiring companies to stay vigilant and flexible. Analytics enables organizations to comprehend and implement new regulations quickly. Splunk facilitates rapid adjustments, ensuring uninterrupted daily operations.

Operational Efficiency

Banks and financial companies strive to maximize efficiency for success. By leveraging advanced analytics and automation, companies are more competitive in today's fast-moving digital environment.

Splunk FSI Essentials App

The Splunk FSI Essentials app was created specifically to showcase example use cases for the financial vertical. The app covers 144 use cases that serve as core examples of how Splunk maximizes efficiency and allows practitioners to consolidate data into a single pane of glass. The app delivers in-depth insights and includes the following:

Comprehensive Dashboards: It provides examples of highly detailed dashboards with important quick-decision metrics.
Enhanced Financial Analytics: The app enhances financial services analytics applications by showcasing advanced tools for analyzing economic data.
Security Features: Security is key in finance. Splunk FSI Essentials demonstrates how strong security is required to protect sensitive information from threats.

The financial sector constantly evolves, demanding dependable tools customized to specific requirements. Splunk FSI Essentials demonstrates how combining traditional banking methods with modern analytics allows firms to tackle future challenges.

Conclusion

The financial world is evolving at an unprecedented pace, transforming how organizations handle data, manage security requirements, and navigate regulatory compliance. Traditional approaches are becoming obsolete as financial companies seek sophisticated solutions to address their complex digital ecosystems. In the Splunk FSI Essentials app, see how Splunk converts vast amounts of intricate data into actionable intelligence. The dashboards highlight Splunk's key use cases in the financial sector. Test drive a click-through demo of the FSI Essentials app here: https://cisco-full-stack-observability.navattic.com/1z3r0q6i
In my logs I am getting 4 events for 1 id:

1) Updating DB record with displayId=ABC0000000; type=TRANSFER
2) Updating DB record with displayId=ABC0000000; type=MESSAGES
3) Updating DB record with displayId=ABC0000000; type=POSTING
4) Sending message to topic ver. 2.3 with displayId=ABC0000000

Sample log:

[13.01.2025 15:45.50] [XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX] [XXXXXXXXXXXXXXXXXXXXXX] [INFO ] [Application_name]- Updating DB record with displayId=ABC0000000; type=TRANSFER

I want to get the list of all those ids which have all 3 "Updating DB........." events but are missing the "Sending message to topic ........." event.
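A sketch of one possible approach with stats, assuming displayId and type are extracted with rex (the index name and the exact rex patterns are placeholders to adjust for your data):

```
index=your_index ("Updating DB record" OR "Sending message to")
| rex "displayId=(?<displayId>[A-Z0-9]+)"
| rex "type=(?<type>\w+)"
| eval kind=if(like(_raw, "%Sending message to%"), "sent", "updated")
| stats dc(eval(if(kind=="updated", type, null()))) as update_types,
        sum(eval(if(kind=="sent", 1, 0))) as sent_count
        by displayId
| where update_types=3 AND sent_count=0
| table displayId
```

The idea is to count the distinct "Updating DB" types per id and keep only ids with all three update types and zero "Sending message" events.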
Hi All, I have a main search where the name1 field will have multiple values, and I need to run a sub search based on the value of name1. The structure goes like this:

main_search which has name1=a
if name1=a then run search1
if name1=b then run search2

I have tried this with the following code:

| makeresults
| eval name1="a"
| eval condition=case(name1="a", "index=_internal | head 1 | eval val=\"Query for a1\" | table val",
    name1="b", "index=_internal | head 1 | eval val=\"Query for b\" | table val",
    1=1, "search index=_internal | head 1 | eval val=\"Default query\" | table val")
| table condition
| map search=$condition$

I am getting the following error:

Unable to run query '"index=_internal | head 1 | eval val=\"Query for a1\" | table val"'.
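One likely explanation for the error above: when map substitutes a field value into its search template, quotes inside the value are escaped, so storing an entire query (which itself contains quotes) in a field tends to break. A more robust pattern is to give map a single parameterized template and let the branching happen inside it via the substituted token. A sketch under that assumption (index=_internal and the val strings are just stand-ins for the real search1/search2):

```
| makeresults
| eval name1="a"
| map maxsearches=1 search="search index=_internal | head 1 | eval val=if(\"$name1$\"==\"a\", \"Query for a\", \"Query for b\") | table val"
```

Here map substitutes $name1$ from each input row, and the template itself decides which branch to take, so no query text ever has to survive a round trip through a field value.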
Hi, I have a requirement to mask any sensitive data, such as credit card numbers or Social Security Numbers, that might be ingested into Splunk. I can write the props to handle data masking, but the challenge is that I do not know where or if the sensitive data will appear. Although the data we currently have doesn't contain any sensitive information, compliance mandates require us to implement controls that detect and mask such data before it is ingested into Splunk. Essentially, the props need to be dynamic. Is there a way to achieve this?   Thanks.
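There is no truly dynamic masking in props/transforms: SEDCMD rules are static and scoped per sourcetype (or source/host). A common approximation is to apply broad pattern-based rules to every sourcetype that might carry sensitive data. A sketch of props.conf entries, where the sourcetype name is a placeholder and the regexes are illustrative and can over-match (e.g. a 16-digit order ID would also be masked), so they need testing against real data:

```
[your_sourcetype]
# Mask 13-16 digit card-like numbers (may over-match other long numbers)
SEDCMD-mask_cc = s/\b\d{13,16}\b/CC-MASKED/g
# Mask SSN-like patterns of the form 123-45-6789
SEDCMD-mask_ssn = s/\b\d{3}-\d{2}-\d{4}\b/XXX-XX-XXXX/g
```

Depending on your version and deployment, Ingest Actions or Edge Processor may be a better fit, since they let you define masking rules centrally and preview them against sample data before ingestion.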
Hi everyone, recently we had a use case where we had to use the scheduled PNG export function from a Dashboard Studio dashboard (Enterprise 9.4). Unfortunately it's only working with some limitations (bug?). If you change the custom input fields of the export for subject and message, they are not considered in the mail. In the Dev Tools you will find something like "action.email.subject" for the subject and "action.email.message" for the message, with the information written to the export schedule; that seems to be okay so far. But if you start the export again, only the following fields are considered in the email, which seem to be predefined and not changeable at all via the GUI:

"action.email.message.view": "Splunk Dashboard: api_modify_dashboard_backup"
"action.email.subject.view": "A PDF was generated for api_modify_dashboard_backup" (even if you select the PNG export)

Has anyone else experienced this, or even better, found a solution? Thanks to all!
Hi, I have a problem with log parsing in the Splunk Add-on for Check Point Log Exporter. I have installed it on both the SH and the HF, but logs from Check Point are not parsing properly. I have tried changing REGEX to ([a-zA-Z0-9_-]+)[:=]+([^|]+) and changing DEPTH_LIMIT to 200000 as the troubleshooting guide suggests, but it is still not working. Can you give me some advice? Thank you so much!
Hello, I was trying to ingest snmptrapd logs with self file monitoring (only one Splunk instance in my environment). Here is the log format:

<UNKNOWN> - 2025-01-13 10:55:44 UDP: [10.0.216.39]:53916->[10.0.214.14]:162 SNMPv2-SMI::mib-2.1.3.0 30:17:26:51.00 SNMPv2-SMI::snmpModules.1.1.4.1.0 CYBER-ARK-MIB::osDiskFreeSpaceNotification CYBER-ARK-MIB::osDiskDrive "C:\\" CYBER-ARK-MIB::osDiskPercentageFreeSpace "71.61" CYBER-ARK-MIB::osDiskFreeSpace "58221" CYBER-ARK-MIB::osDiskTrapState "Alert"
<UNKNOWN> - 2025-01-13 10:55:44 UDP: [10.0.216.39]:53916->[10.0.214.14]:162 SNMPv2-SMI::mib-2.1.3.0 30:17:26:51.00 SNMPv2-SMI::snmpModules.1.1.4.1.0 CYBER-ARK-MIB::osMemoryUsageNotification CYBER-ARK-MIB::osMemoryTotalKbPhysical 16776172 CYBER-ARK-MIB::osMemoryAvailKbPhysical 13524732 CYBER-ARK-MIB::osMemoryTotalKbSwap 19266540 CYBER-ARK-MIB::osMemoryAvailKbSwap 3660968 CYBER-ARK-MIB::osMemoryTrapState "Alert"
<UNKNOWN> - 2025-01-13 10:55:44 UDP: [10.0.216.39]:53916->[10.0.214.14]:162 SNMPv2-SMI::mib-2.1.3.0 30:17:26:51.00 SNMPv2-SMI::snmpModules.1.1.4.1.0 CYBER-ARK-MIB::osSwapMemoryUsageNotification CYBER-ARK-MIB::osMemoryTotalKbPhysical 16776172 CYBER-ARK-MIB::osMemoryAvailKbPhysical 13524732 CYBER-ARK-MIB::osMemoryTotalKbSwap 19266540 CYBER-ARK-MIB::osMemoryAvailKbSwap 3660968 CYBER-ARK-MIB::osMemoryTrapState "Alert"

I tried to use "<UNKNOWN>" as the line breaker, but it does not work reliably and events break in a weird way (sometimes it works, most of the time it doesn't). Please find the props.conf settings below:

[cyberark:snmplogs]
LINE_BREAKER = \<UNKNOWN\>
NO_BINARY_CHECK = true
SHOULD_LINEMERGE = true
category = Custom
pulldown_type = 1
BREAK_ONLY_BEFORE = \<UNKNOWN\>
MUST_NOT_BREAK_BEFORE = \<UNKNOWN\>
disabled = false
LINE_BREAKER_LOOKBEHIND = 2000

Line Breaking Result in Splunk:
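Two things in the posted props.conf work against each other: LINE_BREAKER requires a capturing group around the text that is consumed as the break (the bare regex above has none), and SHOULD_LINEMERGE = true then re-merges lines using the BREAK_ONLY_BEFORE / MUST_NOT_BREAK_BEFORE rules on top of that, which often produces the intermittent behavior described. A sketch that breaks before each <UNKNOWN> with line merging off (the TIME_* values assume the timestamp format shown in the samples, so verify them against your data):

```
[cyberark:snmplogs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\<UNKNOWN\>)
TIME_PREFIX = \<UNKNOWN\>\s-\s
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 25
```

The capturing group ([\r\n]+) is the text removed between events, and the lookahead keeps each <UNKNOWN> at the start of its own event.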
I am watching the training for the Core User certification path on STEP and they are using an index that has the usage field. I have uploaded the tutorial data from the community site but it doesn't have the usage field. I don't know how to rectify this and I cannot replicate the activity in the learning material. Does anyone have a suggestion?

EDIT - I just made up my own CSV and imported that data. ggwp
Hello everyone, I am in the process of installing a Java agent on Linux (RHEL 8) for webMethods. It's pretty straightforward in the documentation. However, there is a difference between the AppDynamics documentation and the webMethods one.

In AppD, it says (quoting from webMethods Startup Settings): "For webMethods servers that use the Tanuki Java service wrapper for start-up, you need to configure the agent in the wrapper.conf file. See Tanuki Service Wrapper Settings."

Yet the webMethods documentation (My webMethods Server Webhelp) says: "There are some parameters that do not relate to My webMethods Server but to the JVM itself. You set custom JVM parameters in the custom_wrapper.conf file for My webMethods Server, using the following syntax: wrapper.java.additional.n=parameter"

Which configuration method is correct, and if both are correct, which one is recommended? Could the AppD documentation also be updated to include the default paths/locations of the .conf files in webMethods?
We are currently using the Config Explorer app to update configurations across our deployments. My question is: how can I run a CLI command in Config Explorer? I need to run a CLI command on the deployer to deploy apps across SH cluster members. We don't have backend server access as of now. Is it possible to run a CLI command through Config Explorer, or do we definitely need backend server access for that?
Hi, I have JSON data structured as follows:

{
    "payload": {
        "status": "ok" # or "degraded"
    }
}

I'm trying to use the stats command to count the "ok" and "degraded" events separately. I am using the following query:

index=whatever
| eval is_ok=if(payload.status=="ok", 1, 0)
| stats count as total, count(is_ok) as ok_count

I have tried passing it through spath, using a single "=" in the if condition, and several other approaches. What always happens is that both counts contain all elements, despite there being different numbers of them. Please help!
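Two details explain the identical counts. First, field names containing dots must be wrapped in single quotes inside eval, otherwise payload.status is not resolved as a field reference. Second, count(is_ok) counts every event where is_ok is non-null, and both 0 and 1 are non-null, so it always equals the total; sum(is_ok) gives the number of "ok" events. A sketch of the corrected query:

```
index=whatever
| eval is_ok=if('payload.status'=="ok", 1, 0)
| stats count as total, sum(is_ok) as ok_count
```

If payload.status is not auto-extracted from the JSON, adding | spath before the eval should extract it.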
Is it possible to execute a script through a button click and display the script's output on a Splunk dashboard? Has anyone implemented something similar before? Any guidance would be greatly appreci... See more...
Hello everyone! I would like to ask about the Splunk Heavy Forwarder Splunk-side config: https://splunk.github.io/splunk-connect-for-syslog/main/sources/vendor/Splunk/heavyforwarder/ With those settings it will send the metadata in the format of key::value. Is it possible to reconfigure it to send metadata key-value pairs with some other key-value separator instead of "::"? If yes, how exactly?
Hi All, As per the exam blueprint for "SPLK-3001: Splunk Enterprise Security Certified Admin" it says that there is a prerequisite of "Splunk Core Certified Power User". However, while booking the exam, I am able to see the booking option directly for SPLK-3001. Can I safely book the SPLK-3001 exam then? Anything I am missing here?
I'm trying to create a simple status page visualization that mimics the style I've seen from Atlassian Statuspage. You can see it on the status pages for Discord and Wiz. Currently, I have a timechart: if status=1 then it's up, but if status=0 then it's down. When the app is down, there is simply no bar on the graph. How do I "force" a value for the bar to appear, and then color each bar based on the status value? I think I'm missing something really simple and hoping someone can point me in the right direction.

Current SPL:

index=main app="myApp"
| eval status=if(isnull(status), "0", status)
| timechart span=1m max(status) by app

Current XML:

<dashboard version="1.1" theme="light">
  <label>Application Status</label>
  <row>
    <panel>
      <chart>
        <search>
          <query>index=main app="myApp" | eval status=if(isnull(status), "0", status) | timechart span=1m max(status) by app</query>
          <earliest>-60m@m</earliest>
          <latest>now</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="charting.axisLabelsX.majorLabelStyle.overflowMode">ellipsisNone</option>
        <option name="charting.axisLabelsX.majorLabelStyle.rotation">0</option>
        <option name="charting.axisLabelsY.majorUnit">1</option>
        <option name="charting.axisTitleX.visibility">collapsed</option>
        <option name="charting.axisTitleY.visibility">collapsed</option>
        <option name="charting.axisTitleY2.visibility">visible</option>
        <option name="charting.axisX.abbreviation">none</option>
        <option name="charting.axisX.scale">linear</option>
        <option name="charting.axisY.abbreviation">none</option>
        <option name="charting.axisY.maximumNumber">1</option>
        <option name="charting.axisY.minimumNumber">0</option>
        <option name="charting.axisY.scale">linear</option>
        <option name="charting.axisY2.abbreviation">none</option>
        <option name="charting.axisY2.enabled">0</option>
        <option name="charting.axisY2.scale">inherit</option>
        <option name="charting.chart">column</option>
        <option name="charting.chart.bubbleMaximumSize">50</option>
        <option name="charting.chart.bubbleMinimumSize">10</option>
        <option name="charting.chart.bubbleSizeBy">area</option>
        <option name="charting.chart.columnSpacing">0</option>
        <option name="charting.chart.nullValueMode">gaps</option>
        <option name="charting.chart.showDataLabels">none</option>
        <option name="charting.chart.sliceCollapsingThreshold">0.01</option>
        <option name="charting.chart.stackMode">default</option>
        <option name="charting.chart.style">shiny</option>
        <option name="charting.drilldown">none</option>
        <option name="charting.seriesColors">[0x459240]</option>
        <option name="charting.layout.splitSeries">0</option>
        <option name="charting.layout.splitSeries.allowIndependentYRanges">0</option>
        <option name="charting.legend.labelStyle.overflowMode">ellipsisMiddle</option>
        <option name="charting.legend.mode">standard</option>
        <option name="charting.legend.placement">none</option>
        <option name="charting.lineWidth">2</option>
        <option name="trellis.enabled">0</option>
        <option name="trellis.scales.shared">1</option>
        <option name="trellis.size">medium</option>
      </chart>
    </panel>
  </row>
</dashboard>
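One way to get a bar in every interval and still distinguish up from down is to plot two stacked series of constant height: one populated when the app is up, the other when it is down, each with its own color. A sketch of the SPL (the down color 0xD94E4E is an arbitrary choice; the chart would also need charting.chart.stackMode set to stacked and a second entry added to charting.seriesColors, e.g. [0x459240,0xD94E4E]):

```
index=main app="myApp"
| timechart span=1m max(status) as status
| fillnull value=0 status
| eval up=if(status > 0, 1, 0), down=if(status > 0, 0, 1)
| fields _time up down
```

fillnull forces a 0 for intervals with no events, so every minute renders a bar, and the up/down split drives the per-bar color.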
Hi, I have two indexes: "cart" and "purchased". In the "cart" index there is a field "cart_id", and in "purchased" there is a field "pur_id". If payment is successful for a cart, the cart_id value is stored as a pur_id in the "purchased" index.

cart          purchased
cart_id 123   payment received: pur_id 123
cart_id 456   no payment: no record for 456

Now I want to display the percentage of carts for which payment is done. I wonder if anyone can help here. Thank you so much!
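A sketch of one way to compute this, searching both indexes at once and counting distinct ids per index (field names are taken from the post; the percentage assumes every pur_id corresponds to some cart_id):

```
(index=cart cart_id=*) OR (index=purchased pur_id=*)
| eval id=coalesce(cart_id, pur_id)
| stats dc(eval(if(index=="cart", id, null()))) as carts,
        dc(eval(if(index=="purchased", id, null()))) as paid
| eval pct_paid=round(100 * paid / carts, 2)
```

Using dc() rather than count() keeps repeated events for the same cart from inflating either number.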
Is it possible to create a button in a Splunk dashboard that, when clicked, runs a script to export logs from Zabbix and display them on the dashboard? The dashboard should only be visible after the button is clicked. Has anyone implemented something like this before? Please help, as I’m really stuck on this!
Remember Splunk Community member, Pedro Borges? If you tuned into Episode 2 of our Smartness interview series, you know just how inspiring his journey with Splunk has been! Now, we’re excited to share a new companion video that dives even deeper into his story.   Pedro shares how Splunk Education helped him transform his career and optimize Splunk for his organization. From leveraging top-tier training to tapping into the incredible Splunk Community, Pedro’s story is proof that the right resources can make a world of difference. Ready to follow Pedro’s lead? Take your Splunk skills to the next level by exploring the tools that made a difference for Pedro: Splunk Lantern: Your guide to real-world use cases and solutions. Splunk Docs: The ultimate knowledge base for everything Splunk. Splunk Education: Courses to help you master Splunk. Splunk Community: Join discussions and connect with Splunk enthusiasts. Splunk Certifications: Showcase your expertise and grow your career. We hope Pedro's story helps to inspire your next steps with Splunk Education and nurturing your growth mindset! -Callie Skokos on behalf of the Splunk Education Crew