All Topics


I am trying to write a rex command that extracts the field "registrar" from the four event examples below. The company names (e.g. ABC Holdings, Inc.) are what I want as the value for "registrar". I am using the following regex to extract the field and values, but I seem to be capturing the \r\n after those values as well. How can I modify my regex to capture just the company name leading up to \r\n Registrar IANA?

Current regex being used:

Registrar:\s(?<registrar>.*?) Registrar IANA

Example events:

Expiry Date: 2026-12-09T15:18:58Z\r\n Registrar: ABC Holdings, Inc.\r\n Registrar IANA ID: 972
Expiry Date: 2026-12-09T15:18:58Z\r\n Registrar: Gamer.com, LLC\r\n Registrar IANA ID: 837
Expiry Date: 2026-12-09T15:18:59Z\r\n Registrar: NoCo MFR Ltd.\r\n Registrar IANA ID: 756
Expiry Date: 2026-12-09T15:18:59Z\r\n Registrar: Onetrust Group, INC\r\n Registrar IANA ID: 478
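One possible fix, sketched against the sample events above: stop the capture before the \r\n instead of letting the lazy .*? run all the way to "Registrar IANA". If the \r\n are real carriage-return/line-feed characters, exclude them with a character class; if they are the literal two-character sequences shown in the raw text, match them explicitly (backslashes doubled once more for SPL string quoting):

| rex field=_raw "Registrar:\s+(?<registrar>[^\r\n]+)"

| rex field=_raw "Registrar:\s+(?<registrar>.+?)\\\\r\\\\n"

Both variants are untested sketches; which one applies depends on whether your raw events contain control characters or the literal backslash-r backslash-n text.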
Data Ingest and Search are core Splunk Cloud Platform capabilities that customers rely on. However, customers, primarily in the financial, public, and healthcare sectors, are increasingly concerned about their data workloads traversing the public Internet. To support these requirements, we introduced AWS PrivateLink support on the Splunk Cloud Platform. Private Connectivity helps insulate data exchange channels between a customer's AWS cloud environment and Splunk Cloud Platform from the public Internet. Since October 2022, compliance customers with an AWS presence can send their ingest data (Forwarder and HEC traffic) to their Splunk Cloud Platform stack over private endpoints, without exposing it over the public Internet.

Expanding on this foundational capability, I am excited to announce that starting today, customers with compliance subscriptions such as PCI, HIPAA, IRAP, and FedRAMP Moderate can route their core search traffic and API access through the search endpoints via AWS PrivateLink.

Powered by the Splunk Cloud Platform's Admin Config Service APIs, onboarding private connectivity (both for search and data ingest) on your stack is completely self-serviceable. You can learn more about the functionality and evaluate whether it is the right choice for you by reviewing the private connectivity Overview and the Getting Started guide.

*Customers are responsible for AWS data transfer costs associated with their VPC. For more info, refer to AWS PrivateLink pricing. AWS is a trademark of Amazon.com, Inc. or its affiliates.
The eighth leaderboard update (10.26-11.8) for The Great Resilience Quest is out >> Shoutout to all the brave questers who have made it to the leaderboard. Your dedication to conquering challenges and expanding your knowledge is the true spirit of resilience. As we embark on the exciting beginnings of our quest's second phase, remember that the leaderboard is within everyone's reach. The race for a spot is still on, and the thrill of the quest continues! We are currently in the midst of selecting the 20 victors of the Champion's Tribute from the first phase (7.17-10.20), so keep an eye out for that announcement via email and our community post.

Best regards,
Customer Success Marketing
Hi Folks,

I'm looking for a document that will help me understand my options for ensuring the integrity of data inbound to Splunk from monitored devices, and any security options I may have there. I know TLS is an option for inter-Splunk traffic. Unfortunately, I'm not having any luck finding options to ensure the integrity and security of data when it's first received into Splunk. Surely there's a way for me to secure that; what am I missing here?
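A minimal sketch of what TLS on the receiving side can look like, assuming a universal forwarder sending to an indexer over splunktcp and that certificates already exist at the (placeholder) paths shown:

inputs.conf on the indexer:

[splunktcp-ssl:9997]
disabled = 0

[SSL]
serverCert = /opt/splunk/etc/auth/mycerts/indexer_cert.pem
sslPassword = <certificate key password>
requireClientCert = true

outputs.conf on the forwarder:

[tcpout:ssl_indexers]
server = indexer.example.com:9997
clientCert = /opt/splunkforwarder/etc/auth/mycerts/forwarder_cert.pem
sslPassword = <certificate key password>
sslVerifyServerCert = true

requireClientCert and sslVerifyServerCert add mutual authentication on top of encryption in transit; for syslog or HEC sources the equivalent is a TLS-enabled syslog receiver or HEC over HTTPS.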
I am trying to write a regex to extract a field called "registrar" from data like the sample below. Can you please help me write this regex for use in a rex command? Below are three example events:

Registry Date: 2025-10-08T15:18:58Z   Registrar: ABC Holdings, Inc.   Registrar ID: 291  Server Name: AD12
Registry Date: 2025-11-08T15:11:58Z   Registrar: OneTeam, Inc.   Registrar ID: 235  Server Name: AD17
Registry Date: 2025-12-08T15:10:58Z   Registrar: appit.com, LLC   Registrar ID: 257  Server Name: AD14

I need the regex to extract the field called "registrar", which in the above examples would have the following three value matches:

ABC Holdings, Inc.
OneTeam, Inc.
appit.com, LLC
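A sketch that works for the three sample events above, assuming the value always sits between "Registrar:" and "Registrar ID:" on the same line:

| rex "Registrar:\s+(?<registrar>.+?)\s+Registrar ID:"

The lazy .+? together with the literal "Registrar ID:" anchor keeps the trailing spaces out of the capture; adjust the anchor if your real events use a different label.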
Hello,  Currently, I am using the append command to combine two queries and tabulate the results, but I see only 4999 transactions. Is there any way I can get full results?  Thanks in advance!
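For reference, a sketch of two things worth trying (index and sourcetype names below are placeholders, not your actual searches): append accepts maxout and maxtime arguments that raise its subsearch caps, and multisearch avoids the subsearch result limit entirely when both searches are streaming:

index=main sourcetype=first_source
| append maxout=100000 maxtime=300 [search index=main sourcetype=second_source]

| multisearch [search index=main sourcetype=first_source] [search index=main sourcetype=second_source]

The hard ceilings can also come from limits.conf on your stack (for example maxresultrows), so if raising maxout does not help, that is the next place to look.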
When a payment process times out, customers may not complete their purchases. How can line-of-business application owners quantify the revenue risk? Video length: 2 min 28 seconds

CONTENTS | Introduction | Video | Resources | About the presenter

In this demo, Matt Schuetze uses AppDynamics Business iQ Analytics to quantify the revenue impact of payment processing timeouts across a geographical market area. This analysis helps identify the areas where the largest revenue losses occur.

Additional Resources
Learn more about Using Analytics Data in the documentation.
Check out Six Business iQ use cases that drive application and business performance on the website.

About the presenter
Matt Schuetze, Field Architect
Matt Schuetze is a Field Architect at Cisco on the AppDynamics product. He confers with customers and engineers to assess application tooling choices and helps clients resolve application performance problems. Matt runs the Detroit Java User Group and the AppDynamics Great Lakes User Group. His career includes 10+ years of speaking periodically at user groups and industry trade shows. He has a Master's degree in Nuclear Engineering from MIT and a Bachelor's degree in Engineering Physics from the University of Michigan.
Does anyone know a pattern for detecting half-duplex connections from server/laptop sources to server destinations? Not switches, not routers. I am on Splunk Cloud version 9.0.2305.101.
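If "half-duplex" here means conversations with traffic flowing in only one direction (rather than NIC duplex settings, which would need interface/SNMP data instead), a rough sketch against flow or firewall data is below. The index, sourcetype, and field names are placeholders; map them to whatever your data actually uses:

index=network sourcetype=firewall_traffic
| stats sum(bytes_out) as bytes_out sum(bytes_in) as bytes_in by src_ip dest_ip dest_port
| where bytes_out > 0 AND bytes_in = 0

Filtering the src/dest lists against an asset lookup of servers and laptops would narrow it to the host types you care about.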
Hi,

My main goal is to find the user ID.

index=A sourcetype=signlogs outcome=failure

The above search has a field called processId but it doesn't have the userId I need.

index=A sourcetype=accesslogs

This search has a SignatureProcessId (which is the same as processId in the first search) and it also has userId. So I need to join these two queries on the common field processId/SignatureProcessId. I tried the query below, but it returns 0 events:

index=A sourcetype=signlogs outcome=failure
| dedup processId
| rename processId as SignatureProcessId
| join type=inner SignatureProcessId [search index=A sourcetype=accesslogs | dedup SignatureProcessId]
| table _time, SignatureProcessId, userId

Can someone please help me fix this query?
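A join-free sketch that often avoids subsearch limits; it assumes both sourcetypes live in index A as described and simply aligns the two ID fields before aggregating:

index=A ((sourcetype=signlogs outcome=failure) OR sourcetype=accesslogs)
| eval SignatureProcessId=coalesce(SignatureProcessId, processId)
| stats values(userId) as userId count(eval(outcome="failure")) as failures latest(_time) as _time by SignatureProcessId
| where failures > 0
| table _time, SignatureProcessId, userId

If the join route is preferred instead, check that processId and SignatureProcessId values really match character for character (case and padding included), since an inner join drops everything that does not.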
Hello! I have just created a trial account to try the OpenTelemetry integration. When I go to the OTel tab to generate a key and press the button, nothing happens; the access key does not appear, but the button becomes active again. So is the trial account not enabled for OTel integration? Thanks!
Hi Team,

I have set an alert for the query below:

index="abc" "ebnc event did not balanced for filename" sourcetype=600000304_gg_abs_dev source!="/var/log/messages"
| rex "-\s+(?<Exception>.*)"
| table Exception source host sourcetype _time

I have also configured an incident for it with the SAHARA forwarder, but I am getting only 1 incident even though the statistics showed 6; 6 incidents should be created. The incidents are also arriving very late: if the event triggered at 8:20, the incident only comes in at 9:16. Can someone guide me on this?
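If the goal is one incident per result row, the usual knob (assuming a standard saved-search alert) is the trigger mode: "Once" fires a single alert action per run, while "For each result" fires one per row. In savedsearches.conf this corresponds roughly to the sketch below (stanza name is illustrative):

[ebnc_event_not_balanced_alert]
alert.digest_mode = 0
counttype = number of events
relation = greater than
quantity = 0

alert.digest_mode = 0 makes the alert action run once per result instead of once per batch. The delay is more likely a scheduling question (cron schedule, schedule skew, or the downstream incident system's polling interval) than an alert setting.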
I have a SH cluster with 3 servers, but I'm getting a lot of replication errors because the datamodel searches fill up the dispatch directory. How are jobs released from dispatch? Are files cleaned automatically? There are also many "bad alloc" errors. Thanks.
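For reference, a sketch of the two usual levers, assuming mostly default settings: dispatch artifacts are reaped automatically when their TTL expires (set by whatever created the job, e.g. dispatch.ttl in savedsearches.conf), and stale artifacts can be moved out manually with the built-in clean-dispatch command:

# Move dispatch artifacts older than 7 days into a holding directory (run on each search head)
$SPLUNK_HOME/bin/splunk clean-dispatch /opt/splunk/var/old-dispatch -7d@d

# savedsearches.conf: keep a scheduled search's artifacts for only 2 scheduled periods
dispatch.ttl = 2p

The paths, age argument, and TTL value above are examples only; the recurring "bad alloc" errors usually mean splunkd is running out of memory, which is worth investigating separately.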
We are excited to announce the first round of Adventurer’s Bounty victors in the Great Resilience Quest! A hearty congratulations to the first 25 Security Saga Challengers and the 25 Observability Chronicle Warriors who have successfully conquered Chapter 1 between 7/17/23 to 10/20/23! To our winners, your prizes have been dispatched as digital gift cards to your email addresses. Please check your inboxes and claim your reward. If your gift card has not arrived, please reach out to me  via a community message before 11/17. Let's keep the momentum going - more use-case knowledge awaits to help you build greater digital resilience, and more opportunities to claim your rewards are on the way. Onward to the next stage! Best regards, Customer Success Marketing
I'm trying to troubleshoot some Windows Event Log events coming into Splunk. The events are stream processed and come in as JSON. Here is a sample (obfuscated).

{"Version":"0","Level":"0","Task":"12345","Opcode":"0","Keywords":"0x8020000000000000","Correlation_ActivityID":"{99999999-9999-9999-9999-999999999999}","Channel":"Security","Guid":"99999999-9999-9999-9999-999999999999","Name":"Microsoft-Windows-Security-Auditing","ProcessID":"123","ThreadID":"12345","RecordID":"999999","TargetUserSid":"AD\\user","TargetLogonId":"0xXXXXXXXXX"}

There are a number of indexed fields as well, including "Computer" and "EventID". What's interesting is that signature_id seems to be created, but when I search on it, it fails. In this event, signature_id is shown under "Interesting Fields" with the value 4647, but if I put signature_id=4647 on the search line, it comes back with no results. If I put EventID=4647, it comes back with the result. I'm using Smart Mode. This led me to dig into the field configurations (aliases, calculated fields, etc.), but I couldn't figure out how signature_id was created in the Windows TA. Can anyone provide any insight?

Thank you!
Ed
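One possible way to trace where signature_id comes from, assuming the Splunk Add-on for Microsoft Windows (or a similar TA) is installed on the search head, is to dump the effective configuration with btool and grep for the field:

$SPLUNK_HOME/bin/splunk btool props list --debug | grep -i signature_id
$SPLUNK_HOME/bin/splunk btool fields list --debug | grep -i signature_id

The --debug flag shows which app and .conf file each line comes from. If the field turns out to be a search-time alias or EVAL, it is derived per event at search time, so its behaviour can differ from the indexed EventID field depending on how the originating field is extracted for your JSON sourcetype.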
Good Day Ladies and Gentlemen!

It's my first Dashboard Studio experience, and one (1) space boggles me. I have a datasource that works:

"ds_teamList": {
    "type": "ds.search",
    "options": {
        "query": "host=\"splunk.cxm\" index=\"jira\" sourcetype=\"csv\" \"Project name\"=\"_9010 RD\" \n| rename \"Custom field __Team\" as TEAM\n| table TEAM\n| dedup TEAM \n| sort TEAM"
    },
    "name": "teamList"
}

A multiselect input that lists the correct data, with one (1) team name containing a space:

"input_TEAM": {
    "options": {
        "items": [{}],
        "token": "ms_Team",
        "clearDefaultOnSelection": true,
        "selectFirstSearchResult": true
    },
    "title": "TEAM",
    "type": "input.multiselect",
    "dataSources": {
        "primary": "ds_teamList"
    }
}

A chain search that uses the ms_Team token:

| search TEAM IN ($ms_Team$)
| search CLIENT IN ($dd_Client$)
| search Priority IN ($ms_priority$)
| chart count over Status by TEAM

The result returns all the expected data, except for the team that has a space in its name. I know that if I could add double quotes around the team name containing the space, it would work, but I cannot find a solution for this seemingly minor issue:

| search TEAM IN (Detector,Electronic,Mechanical,Software,"Thin film")

Either this is a bug, or it's not the way I'm supposed to use Dashboard Studio. I searched and tried many solutions about strings in tokens and searches, and now I'm here for the first time. Any simple solution possible?

Thank you!
Sylvain
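One workaround sketch that avoids touching the input configuration: quote each value inside the datasource itself, so the ms_Team token already carries quoted values. This is just the teamList query above with one added eval (illustrative only):

host="splunk.cxm" index="jira" sourcetype="csv" "Project name"="_9010 RD"
| rename "Custom field __Team" as TEAM
| table TEAM
| dedup TEAM
| sort TEAM
| eval TEAM="\"".TEAM."\""

The chained | search TEAM IN ($ms_Team$) then receives values such as "Thin film" already wrapped in quotes. The trade-off is that the quotes also show up in the multiselect labels unless you expose a separate, unquoted label field to the input.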
Hi All,

My requirement is that the source data records need to be encrypted. What process do I need to follow? Is this possible with props.conf? Please help me with the process.

Regards,
Vij
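Splunk does not encrypt individual records via props.conf, but if the underlying requirement is to hide sensitive values at index time, a common sketch is SEDCMD-based masking in props.conf on the indexer or heavy forwarder. The sourcetype and pattern below are placeholders:

[your_sourcetype]
SEDCMD-mask_card = s/(\d{12})\d{4}/\1XXXX/g

This replaces the last four digits of a 16-digit number with XXXX before the event is written to the index; true encryption of data at rest is handled at the storage/volume level instead.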
I have a field called environment which has values like dev, prod, uat, and sit. Now I want to create a new field containing all the values of the environment field.

Example (4 field values):

environment
dev
prod
uat
sit

After the query (1 field value, separated by any string):

merge_environment = dev | prod | uat | sit

How can I achieve this?
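A sketch, assuming you want a single row with all distinct values joined by a separator (replace the base search with your own):

index=your_index
| stats values(environment) as merge_environment
| eval merge_environment=mvjoin(merge_environment, " | ")

stats values() collects the distinct environment values into a multivalue field, and mvjoin flattens it into one string with the separator of your choice. Use eventstats instead of stats if you need to keep the original events alongside the merged field.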
Hi,

I am trying to upload the dSYM files automatically in the pipeline by hitting the AppDynamics REST APIs. I would like to know how I can do this using API tokens.

1. I want to generate the token using the AppDynamics REST API. The token generation API requires both an authentication header with username and password as well as the OAuth request body to successfully request a token. We use only SAML login. Do I need to create a local account for this purpose? And how long can the API token live?

2. API Clients (appdynamics.com): when I generate the token via the Admin UI, it shows the maximum is 30 days; then it needs to be regenerated.

Any comments on this? I appreciate your inputs.

Thanks,
Viji
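For what it's worth, a sketch of the API Client flow (no local user account needed): create an API Client in the Controller admin UI, then have the pipeline exchange its name and secret for a short-lived bearer token. The controller host, client name, account name, and secret below are placeholders:

curl -s -X POST "https://<controller-host>/controller/api/oauth/access_token" \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials&client_id=<apiClientName>@<accountName>&client_secret=<clientSecret>"

The response contains an access_token that you pass as an Authorization: Bearer header on subsequent REST calls. Since the client secret can be long-lived while the issued tokens expire quickly, the pipeline can request a fresh token on every run rather than storing a 30-day token.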
After installing the latest UF 9.1.1 on a Linux host, I tried to connect it to the deployment server:

./splunk set deploy-poll <host name or ip address>:<management port>

I get an error about allowRemoteLogin, and deploymentclient.conf is not created. After I added the following entry to server.conf, the command successfully added the string to connect to the deployment server:

allowRemoteLogin = always

Is anyone experiencing the same issue?
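For reference, a minimal sketch of where that setting lives on the forwarder (the default value is more restrictive, which appears to be what blocks the CLI call):

# $SPLUNK_HOME/etc/system/local/server.conf
[general]
allowRemoteLogin = always

allowRemoteLogin sits in the [general] stanza of server.conf; "always" is the most permissive value, so it is worth checking whether setting a non-default admin password (which satisfies the stricter default) works before loosening this setting.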
Hey Everyone,

I currently have a dashboard that has two maps utilizing "| geom geo_us_states featureIdField=State" and one map utilizing cities, for which I have the longitude and latitude. For the cities map, I currently have it in markers mode. Is there a way that when I hover over the cities, or click on them, it can display the count and the field name associated with the count? For example, A=10, B=12, and C=9.

For the states map, is there a way to group certain states together to form a region? For example, California, Nevada, and Oregon are the western region and are colored a certain way. Or is there an app I can download that can help me achieve this? I appreciate all the help!
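For the region idea, a sketch of grouping states and driving the choropleth colour from a per-region value; the state lists, region names, and codes below are only examples:

... | eval Region=case(State IN ("California","Nevada","Oregon"), "West",
                        State IN ("New York","New Jersey","Connecticut"), "Northeast",
                        true(), "Other")
| eval region_code=case(Region="West", 1, Region="Northeast", 2, true(), 0)
| stats max(region_code) as region_code by State
| geom geo_us_states featureIdField=State

Every state in the same region gets the same region_code, so the choropleth shades them alike (tune the colour bins to the codes). For the cities marker map, the values shown on click generally come from the fields left in the final results, so keep the count and its label field in the table alongside lat/lon.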