All Topics

Hi everyone, has anyone managed to successfully use the "Akamai Prolexic DNS GTM and SIEM API (Unofficial)" app? I keep getting this error when testing the Prolexic API data input:

    Traceback (most recent call last):
      File "C:\Program Files\Splunk\etc\apps\akamai-api-integration\bin\akamai_api_integration\aob_py3\urllib3\connection.py", line 175, in _new_conn
        (self._dns_host, self.port), self.timeout, **extra_kw
      File "C:\Program Files\Splunk\etc\apps\akamai-api-integration\bin\akamai_api_integration\aob_py3\urllib3\util\connection.py", line 72, in create_connection
        for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
      File "C:\Program Files\Splunk\Python-3.7\lib\socket.py", line 752, in getaddrinfo
        for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
    socket.gaierror: [Errno 11001] getaddrinfo failed

The official Akamai SIEM app was not designed to ingest the Prolexic API, so unfortunately it is of no use to me. Many thanks.
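For anyone hitting the same thing: [Errno 11001] is Windows' "name not resolved" error, so the failure happens before any API authentication — the Splunk server cannot resolve the Akamai API hostname via DNS. A quick diagnostic sketch is to run the same lookup the traceback shows, outside the app (the hostname below is a placeholder for whatever host your data input is configured with):

    import socket

    # Placeholder: substitute the API hostname from your Prolexic data input.
    host = "akab-example.luna.akamaiapis.net"
    try:
        # The same call that fails in the traceback (urllib3 -> socket.getaddrinfo).
        print(socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP))
    except socket.gaierror as err:
        # Errno 11001 here confirms a DNS/proxy problem on the Splunk host itself.
        print(f"DNS lookup failed: {err}")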
Hello! I am new to Splunk and attempting the BOTS workshop, Hunting an APT with Splunk - Reconnaissance, and have encountered an issue. Following the video, I tried to access the identity centre, the asset centre, and the Frothly environment network diagram, but none of them work for me. The Frothly environment shows a blank screen, and the identity and asset centres show an error from the 'inputlookup' command: External command based lookup 'identity_lookup_expanded' is not available because KV store initialisation has failed. Does anyone have any idea how to get around this, or has anyone else encountered this error?
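In case it helps anyone answering: a quick way to confirm the KV store state from the search bar is a REST call against the local server-info endpoint, where kvStoreStatus should read "ready" on a healthy instance (a sketch, assuming your role can run the rest command):

    | rest /services/server/info splunk_server=local
    | fields splunk_server, kvStoreStatus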
Hello! I have a date showing in this string format: 20240324090045.560961-240. I've been trying to convert it into a readable date format for my dashboards, with no success so far. Does someone have a solution? I'm getting nowhere with the Splunk docs or other solved questions. Thanks!
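That pattern looks like a DMTF/WMI-style timestamp: yyyymmddHHMMSS, then microseconds, then a UTC offset in minutes (-240 = UTC-4). strptime cannot consume a minutes-based offset directly, so a minimal sketch is to parse the first 14 characters and ignore the sub-seconds and offset (the field name raw_time is a placeholder):

    | eval ts = strptime(substr(raw_time, 1, 14), "%Y%m%d%H%M%S")
    | eval readable = strftime(ts, "%Y-%m-%d %H:%M:%S")

If the offset matters for your dashboards, you could rex out the trailing signed minutes and adjust ts by that many *60 seconds before formatting.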
I'm trying to achieve the following search and hoped others might have some helpful suggestions. I have two event types in a summary index, `type_A` and `type_B`, which share a common field `entity_id` that may or may not match. I want all events of `type_B` for which a `type_A` event with a matching `entity_id` exists. From that result, the `type_B` events carry some wildcard fields (a common `wildcard_field` name with different sub-fields, such as `wildcard_field.field1` and `wildcard_field.field2`) whose data I want to extract into a table for visualisation. Example of the event structure:

    { event: type_A; entity_id: 123; }
    { event: type_B; entity_id: 123; // Matches a type_A event
      wildcard_field.field1: val1; wildcard_field.field2: val2; }
    { event: type_B; entity_id: 345; // This one won't have a matching type_A event
      wildcard_field.field1: val1; wildcard_field.field2: val2; }

Thank you for any suggestions.
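One join-free sketch, assuming the index name is a placeholder and that `event` and `entity_id` are extracted as shown: collect both types, record per `entity_id` which types are present, then keep only `type_B` rows whose id also has a `type_A` event:

    index=summary_index (event=type_A OR event=type_B)
    | eventstats values(event) as types_seen by entity_id
    | where event="type_B" AND mvfind(types_seen, "type_A") >= 0
    | table entity_id, wildcard_field.*

`table` accepts wildcards, so `wildcard_field.*` pulls every sub-field into its own column without naming them individually.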
I have two dropdown lists, and the second dropdown should show or hide based on the first dropdown's value (specifically, based on two values). With one value it works fine:

    <input type="dropdown" token="sourceToken" depends="$t1_token$" searchWhenChanged="false">

I have tried the following for two values, but neither works:

    <input type="dropdown" token="sourceToken" depends="$t1_token$,$t2_token$" searchWhenChanged="false">
    <input type="dropdown" token="sourceToken" depends="t1_token,t2_token" searchWhenChanged="false">

Please advise.
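Note that a comma-separated `depends` list requires all listed tokens to be set, and it only checks set/unset, never specific values. One sketch that keys off values instead: have the first dropdown set a dedicated display token via a `<change>` handler, and make the second dropdown depend on that. The choice values here are hypothetical:

    <input type="dropdown" token="t1_token" searchWhenChanged="false">
      <choice value="valueA">Value A</choice>
      <choice value="valueB">Value B</choice>
      <choice value="other">Other</choice>
      <change>
        <condition value="valueA"><set token="show_second">true</set></condition>
        <condition value="valueB"><set token="show_second">true</set></condition>
        <condition><unset token="show_second"></unset></condition>
      </change>
    </input>
    <input type="dropdown" token="sourceToken" depends="$show_second$" searchWhenChanged="false">
      <!-- choices for the second dropdown here -->
    </input>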
I am trying to get the count of hits to a particular API, and based on a field called execution-time I am calculating an SLA. I can see the number of requests coming to the API, but I am not able to get the SLA count using the query below. Can someone help me with where I am going wrong?

    index=* uri=validate
    | eval SLA=1000
    | stats count as total_calls count(eval(execution-time < SLA)) as sla_compliant_count
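One likely culprit, offered as a guess: inside eval, the hyphen in execution-time is parsed as subtraction (the field execution minus the field time), so the comparison never matches. Field names containing hyphens need single quotes when read in eval, along these lines:

    index=* uri=validate
    | eval SLA=1000
    | stats count as total_calls, count(eval('execution-time' < SLA)) as sla_compliant_count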
This is the query. I'm still a baby in this world (so I'm sorry if there are dumb mistakes that might drive you crazy when you read it). I'm trying to join the SourceProcessId (from event code 10) with the ProcessId (from event code 1) and then print the command line. I tried `type=inner` but it gave me nothing, which is weird, because the outer query returns results on its own and so does the inner query.

    index="main" sourcetype="WinEventLog:Sysmon" EventCode=10 lsass SourceImage="C:\\Windows\\system32\\rundll32.exe"
    | join left=L right=R type=left where L.SourceProcessId=R.ProcessId [search EventCode=1 lsass "C:\\Windows\\system32\\rundll32.exe"]
    | table L.TargetImage, R.ProcessId, R.commandLine
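Two things worth checking, plus a join-free sketch. First, the subsearch has no index/sourcetype, so it may not be searching the events you expect; second, Sysmon's extracted field is usually CommandLine (capital C), and field names are case-sensitive. An alternative that avoids join entirely, assuming the standard Sysmon field names SourceProcessId, ProcessId, TargetImage, and CommandLine:

    index="main" sourcetype="WinEventLog:Sysmon" (EventCode=10 SourceImage="C:\\Windows\\system32\\rundll32.exe") OR (EventCode=1 Image="C:\\Windows\\system32\\rundll32.exe")
    | eval pid = if(EventCode==10, SourceProcessId, ProcessId)
    | stats values(TargetImage) as TargetImage, values(CommandLine) as CommandLine, dc(EventCode) as codes by pid
    | where codes > 1

The final where keeps only PIDs seen in both event codes, which is the inner-join behaviour you were after.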
Hi Community, please help me out. I am trying to monitor a path on the Splunk search head in a Splunk Enterprise environment. What would be the best practice to implement this? Would it be advisable to install a UF on the search head server? If not, what are the other ways to monitor a path on the search head server? Thanks,
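Since a search head is a full Splunk Enterprise instance, one option that avoids running a second agent is a monitor input defined directly on the search head itself, with its outputs forwarding to the indexers as usual. A minimal sketch, where the app name, path, index, and sourcetype are all placeholders:

    # $SPLUNK_HOME/etc/apps/my_monitor_app/local/inputs.conf on the search head
    [monitor:///var/log/myapp]
    index = myapp_logs
    sourcetype = myapp:log
    disabled = false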
We need to update the threshold of a KPI that is used by 100+ services, some of which have thresholding unlinked from the Service Template. Is there a macro or saved search we can use to do a bulk update of the KPI threshold settings? This is for the services whose thresholding is already unlinked from the Service Template, to avoid manually opening each service to edit the KPI thresholds. TIA.
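Not a full answer, but a possible starting point: ITSI exposes service definitions (including each service's kpis array with its threshold settings) through its REST interface, which can be scripted for bulk edits. Treat the exact endpoint and payload shape as assumptions to verify against the ITSI REST API reference for your version; this sketch only lists service titles and keys as a first step:

    curl -k -u admin:changeme \
      "https://localhost:8089/servicesNS/nobody/SA-ITOA/itoa_interface/service?fields=title,_key"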
Hey all, tech stack: Next.js 13 (pages router). I've been following the guide https://docs.appdynamics.com/display/GOVAPM234/Add+Custom+User+Data+to+a+Page+Browser+Snapshot to set custom attributes. On the initial page I load the AppDynamics script below:

    window['adrum-start-time'] = new Date().getTime()
    ;((config) => {
      config.appKey = 'XXX'
      config.adrumExtUrlHttp = 'http://cdn.appdynamics.com'
      config.adrumExtUrlHttps = 'https://cdn.appdynamics.com'
      config.beaconUrlHttp = 'http://syd-col.eum-appdynamics.com'
      config.beaconUrlHttps = 'https://syd-col.eum-appdynamics.com'
      config.useHTTPSAlways = true
      config.xd = { enable: true }
      config.resTiming = { bufSize: 200, clearResTimingOnBeaconSend: true }
      config.maxUrlLength = 512
      config.userEventInfo = {
        PageView: getAppDynamicsUserInfo(),
        VPageView: getAppDynamicsUserInfo(),
      }
    })(window['adrum-config'] || (window['adrum-config'] = {}))

getAppDynamicsUserInfo is a function attached to window that always returns the attribute sessionId and, if available, another attribute called customerId. On the initial page load, the sessionId is sent and viewable in the AppDynamics Analyze view. When I get to the page where the customerId is available, it is not sent to AppDynamics. If I inspect window["adrum-config"] or use ADRUM.conf.userConf, I can see both sessionId and customerId. In the script above I've tried setting just PageView and just VPageView.

In terms of loading the script, I've used the Next.js Script component and tried the following:
- loading the above as an external script file on different pages (different React components)
- loading the above in different versions of the same script file (different names) on different pages
- adding the above script into a React component and loading the component on different pages

I've also tried the AJAX method to intercept HTTP calls; it intercepts the call but does not result in the user data being sent to AppDynamics. In addition to setting it via config.userEventInfo as above, I've tried the following options as well:

    (function (info) {
      info.PageView = getAppDynamicsUserInfo
      info.VPageView = getAppDynamicsUserInfo
    })(config.userEventInfo || (config.userEventInfo = {}))

    (function (info) {
      info.PageView = getAppDynamicsUserInfo()
      info.VPageView = getAppDynamicsUserInfo()
    })(config.userEventInfo || (config.userEventInfo = {}))

Any help is appreciated, thank you.
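For comparison, the linked guide's userEventInfo shape expects each key to map to a function that is invoked per event and returns an object whose custom attributes sit under userData (with userDataLong / userDataBoolean / userDataDouble for typed values), rather than a flat attribute map. A sketch under that assumption, with getAppDynamicsUserInfo assumed to return a plain map of strings like { sessionId, customerId? }:

    config.userEventInfo = {
      PageView: function (context) {
        // Re-evaluated for every PageView beacon, so customerId is picked up
        // once it becomes available, not only at initial config time.
        return { userData: window.getAppDynamicsUserInfo() || {} }
      },
      VPageView: function (context) {
        return { userData: window.getAppDynamicsUserInfo() || {} }
      },
    }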
I ran the query below for 02:00 to 03:00 and posted the output, then ran the same query for 03:00 to 04:00 and posted that output. I want a single query that compares the previous hour (02:00 to 03:00 data) with the current hour (03:00 to 04:00 data) and calculates the percentage difference.

    | mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) span=1h by application

Output for 02:00 to 03:00:

    _time             application  Trans
    2022-01-22 02:00  app1         3456.000000
    2022-01-22 02:00  app2         5632.000000
    2022-01-22 02:00  app3         5643.000000
    2022-01-22 02:00  app4         16543.00000

Output for 03:00 to 04:00:

    _time             application  Trans
    2022-01-22 03:00  app1         8753.000000
    2022-01-22 03:00  app2         342.000000
    2022-01-22 03:00  app3         87653.000000
    2022-01-22 03:00  app4         8912.00000
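One sketch: run both hours in a single mstats over a two-hour window, label each bucket as previous or current, pivot with chart, and compute the percentage change. The earliest/latest bounds assume you want the two most recent whole hours:

    | mstats sum(transaction) as Trans where index=host-metrics service=login application IN(app1, app2, app3, app4) earliest=-2h@h latest=@h span=1h by application
    | eval hour = if(_time >= relative_time(now(), "-1h@h"), "curr", "prev")
    | chart values(Trans) as Trans over application by hour
    | eval pct_change = round((curr - prev) / prev * 100, 2)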
Below are sample logs; I am not sure how to write the props.conf line breaker for them. Can anyone help with this?

    A0C0A0H8~~AB~ABCg.C~AB~Wed Jan 11 19:11:17 IST 2021~C~0.00~0.00~0.01~Z~1HTYYY
    B0C0A0K8~~AB~ABCUHg.C~AB~Mon Jan 10 20:11:17 IST 2021~C~0.00~0.00~0.01~Z~1HTYYY1245
    D0C01010~~CD~SDRg.D~HH~Thu Jan 20 11:11:17 IST 2021~C~0.00~0.00~0.01~Z~1140AU
    A0C01212~~AB~ABCg.C~AB~Wed Jan 11 19:11:17 IST 2021~C~0.00~0.00~0.01~Z~1HTYYY
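Since each event appears to be a single line, a props.conf sketch along these lines may be enough (the stanza name is a placeholder; the timestamp is the sixth ~-delimited field, hence a TIME_PREFIX that skips five fields):

    [your:sourcetype]
    SHOULD_LINEMERGE = false
    LINE_BREAKER = ([\r\n]+)
    TIME_PREFIX = ^(?:[^~]*~){5}
    TIME_FORMAT = %a %b %d %H:%M:%S %Z %Y
    MAX_TIMESTAMP_LOOKAHEAD = 30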
Are you ready to level up your Splunk game? Then, let’s get you certified live at .conf24 – our annual user conference where attendees learn about Splunk by doing, hearing, and watching!

What is Splunk Certification?
Splunk Certification isn't just a badge—it's your ticket to higher earnings and more opportunities in the tech world. Whether you're a rookie or a seasoned pro, there's a certification for you. And here's the icing on the cake: Splunk certified practitioners earn almost 31% more than those without the stamp of approval. Part of the Splunk Education family, the Splunk Certification program currently offers a dozen certifications designed to validate different areas of expertise, from observability to security, from users to administrators. And, at .conf24, you can take any of our Splunk certification exams on-site with PearsonVUE for just $25—that's a steal compared to the usual $130 registration fee.

Registration and Dates
So, register today to schedule and take any exam onsite in Las Vegas. The Splunk Certification testing center will be up and running from Tuesday, June 11 to Friday, June 14, 2024. Register with PearsonVUE to save your seat in Vegas.

Exclusive Bonus Certification Available On-site Only
Psst...if you’re on the fence about attending live, maybe this will help you decide. We're offering a *free* beta exam: the Splunk Cybersecurity Defense Certified Engineer certification, exclusively available during .conf24 this year. It's the next big thing in our Cybersecurity Defense track and only available to those who have already earned their Splunk Cybersecurity Defense Certified Analyst certification. Registration is not open yet for the free beta exam, but be sure to register for the Splunk Cybersecurity Defense Certified Analyst certification so you’ll be ready to take the Splunk Cybersecurity Defense Certified Engineer certification when you arrive.

PRO Tip: Don't get caught in the registration hustle. Create your splunk.com account and connect it to PearsonVUE to generate your Splunk ID before you roll into Vegas. It's a breeze to register, but give us 24-48 hours to establish your accounts and then you’re good to go. Please follow steps 1 through 4 of the Exam Registration Tutorial to create your Splunk ID. Already have a Splunk ID? You’re good to go to schedule your exam at .conf24!

Get ready to collect a new badge, show off your Splunk skills, and take your career to even higher heights. Register today and we’ll see you at .conf24! Oh, and please join us at the Bragging Rights Spotlight networking event!

See you there!
-Callie Skokos on behalf of the Splunk Education Crew
We have a small satellite deployment of 40+ servers, with a dedicated HF doubling as a Deployment Server running on Linux and an equal mix of Windows and Linux hosts. 24 hours ago we discovered that a few of the Windows servers were reporting that they no longer had the Windows_TA installed and were instead running the Linux_TA. Checking the UF hosts directly, they in fact were running the Windows_TA even though the DS reported they were running the Linux_TA. After a day of trying to figure it out (validated filters, tested, removed and re-added all Server Classes and Apps), the problem continued. Throughout the day a few more hosts started reporting this mix-up, and again those reported as running the Linux_TA were actually running the Windows_TA. As a final drastic measure, we removed Splunk from the host (the HF/DS, not the UFs), reinstalled from scratch, and created the environment anew, making sure the UFs were not running any of the distributed apps/TAs. We built new Apps and Server Classes, the UFs started phoning home, and once again the Windows servers were reported as running the Linux_TA while actually running the Windows_TA.
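In case it helps with diagnosis: the DS's view can be cross-checked against what the clients actually report at phone-home via the deployment server's REST endpoint. A sketch to run on the DS itself (utsname should distinguish the windows clients from the linux ones; field names beyond hostname/ip/utsname may vary by version):

    | rest /services/deployment/server/clients splunk_server=local
    | table hostname, ip, utsname, lastPhoneHomeTime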
I have an on-prem Splunk Enterprise installation, consisting exclusively of Universal Forwarders and a single Indexer. We now have a cloud-hosted environment that is restricted, as it is hosted by an external company that does not allow us to install any software but their own on the servers. Is there any way to get data into my Indexer without a forwarder? And without a forwarder, am I able to apply allow/deny lists to events?
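One forwarder-free option is the HTTP Event Collector (HEC): enable it on the indexer and have the hosted environment POST events over HTTPS. A minimal sketch, where the token, hostname, index, and sourcetype are placeholders:

    curl -k https://your-indexer:8088/services/collector/event \
      -H "Authorization: Splunk <your-hec-token>" \
      -d '{"event": "hello from the restricted host", "sourcetype": "myapp:log", "index": "main"}'

Syslog to a network input is another agentless route, and event filtering can still be applied on the indexer with props/transforms rather than on a forwarder.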
We are pleased to announce the public preview of Splunk’s latest data management capability - Ingest Processor - a fully-hosted offering managed by Splunk that provides customers with pre-processing and data transformation capabilities. Sign up to preview now!

Accelerate the value of your data using Splunk Cloud’s new data processing features! Introducing the Splunk Data Management Experience Ingest Processor (formerly known as Data/Cloud Processor), which brings data processing capabilities to the Splunk Cloud ingest pipeline, enabling customers to preprocess data before indexing, convert logs to metrics, and route them to Splunk Observability Cloud. In addition, this capability can perform data actions such as filtering, masking, enriching, and routing, among other data transformations. Ingest Processor is powered by our new search processing language, SPL2, which enables you to author, test, and validate data pipelines. You can get started with Splunk Data Management Ingest Processor today with zero infrastructure requirements. Watch the demo video and sign up here if you are interested.

About the preview program: We are in the early development process for this product and are actively recruiting customers to preview it and offer early feedback. Currently, public preview is available in the US-East-1 and US-West-2 regions. Email us if you want to participate in one of the regions coming soon. Upcoming regions for preview:
- eu-west-1 (Dublin) - Mar 29, 2024
- eu-central-1 (Frankfurt) - Apr 3, 2024
- ap-southeast-2 (Sydney) - Apr 10, 2024

What does the preview cover? Customers enrolled in the program will get a 1:1 demonstration of Ingest Processor led by the product team, then be able to onboard it for a hands-on experience. It is a 4-week preview program with weekly calls scheduled.

What are the requirements for participating in the preview program? Splunk Cloud NOAH stacks in the US-East-1 region, or US-West-2 EC stacks, upgraded to 9.1.2308.20x.

Ready to join the preview program? Sign up here!

Additional Resources: Please reach out to ingestprocessor@splunk.com with any questions.
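For readers new to SPL2 pipelines: an Ingest Processor pipeline is written as a statement that reads from a source, applies transformations, and writes to a destination. A minimal illustrative sketch — the $source/$destination parameters follow the pipeline template convention, and the exact syntax should be confirmed against the Ingest Processor documentation:

    $pipeline = | from $source
                | where sourcetype == "syslog"
                | into $destination;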
CISCO LEARNING NETWORK | Cisco Secure Application

Learn about Cisco Secure Application's security features and how they can be easily implemented to improve and strengthen the security posture of your applications with minimal time and effort.

Maximizing the Benefits of Cisco Secure Application
Register for the Live Session
AMER | Thursday, April 25, 9am PST, 12pm EST

About the Session
Join Observability Security Specialist Senthil Arunagirinathan as he demonstrates how Secure Application enables organizations to secure their applications using real-time vulnerability analytics and business risk observability. By adding business context to security findings, Cisco Secure Application helps organizations quickly assess risk, prioritize actions, and remediate security issues based on their potential business impact. During this session, learn how Cisco Secure Application helps you:
• Pinpoint and address vulnerabilities by rapidly mapping threats to business transactions within common application workflows.
• Expose critical vulnerabilities found within applications and gain visibility into security threats by leveraging various telemetry sources.
• Prioritize threat remediation by correlating business risk scoring to business impacts.

Additional Resources
Learn more about Application Security Monitoring
When the event has passed, you can always watch the recap on our YouTube Channel.
I want to mask some data coming from my web server logs, but only from one particular server out of all of them. Can I apply my masking rule to just that one web server's source or host, instead of all the web servers sending to the same sourcetype? And if I applied the rule to all web server logs, would it cause high resource usage on my indexer? Thanks
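Yes — props.conf stanzas can be scoped by host:: or source:: as well as by sourcetype, so a mask can target a single server. A minimal sketch using SEDCMD, where the hostname and pattern are hypothetical (here masking card-number-like digit groups):

    # props.conf on the indexer (or heavy forwarder) that parses the data
    [host::webserver01]
    SEDCMD-mask_digits = s/\d{4}-\d{4}-\d{4}-\d{4}/XXXX-XXXX-XXXX-XXXX/g

Scoping it to one host also keeps the regex cost limited to that host's events rather than the whole sourcetype.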
Hi all, I'm analysing event counts for a specific search criterion and want to know how the count of values changes over time. The search below isn't good enough to see what's going on, as some usernames have huge numbers of events while others with small numbers are barely noticeable (I'm interested in the rate of change, not the count itself):
```
index=test_index "search string"
| timechart span=10m count(field1) by username
```
So I want to see the rate of change of the count, rather than the simple count, by the username field. How can we achieve this?
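A sketch using streamstats to compute the per-username delta between consecutive 10-minute buckets — note count(field1) only counts events where field1 is present, so swap in plain count if that's not intended:
```
index=test_index "search string"
| bin _time span=10m
| stats count(field1) as event_count by _time, username
| streamstats current=f window=1 last(event_count) as prev_count by username
| eval rate_of_change = event_count - prev_count
| xyseries _time username rate_of_change
```
If you want a relative rate instead, divide the delta by prev_count and multiply by 100.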
I have 2 servers (hosts) and I need to create an alert that fires when the difference in a value (such as load) between the 2 hosts is greater than 50 percent.
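A sketch for the alert search, assuming a placeholder index name, a numeric field named load, and that "difference greater than 50 percent" is measured relative to the larger host's value; the alert would be configured to trigger when the number of results is greater than 0:

    index=your_index host IN (host1, host2)
    | stats avg(load) as load by host
    | stats max(load) as max_load, min(load) as min_load
    | eval pct_diff = round((max_load - min_load) / max_load * 100, 2)
    | where pct_diff > 50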