All Posts

And you can even convert back to single-site: https://docs.splunk.com/Documentation/Splunk/9.1.2/Indexer/Converttosinglesite
Hi, one reason for that is bad timestamp handling, or data from different time periods going into one index. There should be some previous posts about it on the community. r. Ismo
Thank you!
Agent configuration and maintenance don't have to be complex. With our new Smart Agent, managing upgrades and installations is as easy as a few clicks.  See for yourself! Check out this click-through demo to see it in action: 
SEDCMD settings must contain either an s or y command, not just a regex. To properly extract a timestamp, the props stanza should contain TIME_PREFIX, TIME_FORMAT, and MAX_TIMESTAMP_LOOKAHEAD settings.

[sourcetype_name]
disabled = false
SHOULD_LINEMERGE = false
MAX_TIMESTAMP_LOOKAHEAD = 80
TIME_FORMAT = %m/%d/%Y-%H:%M:%S
TIME_PREFIX = \d!
LINE_BREAKER = ([\r\n]+)library!
SEDCMD-null = s/\<Header>[\s\S]*?\<\/Header>//g

You may have a problem with time zones, depending on the time zone of the Splunk server and the one in the data. Ideally, the time zone should be specified as part of the timestamp rather than as a separate element, and it should be a recognized abbreviation such as "CST" or an offset such as "-0600". BTW, Central Daylight Time is not in effect in November.
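If the zone cannot be embedded in the timestamp itself, a minimal workaround sketch is to pin the sourcetype to a zone in props.conf. The zone below is an assumption based on the "Central" reference in the log header; adjust it to wherever the report server actually runs.

# assumption: the ReportServerService logs are written in US Central local time
[sourcetype_name]
TZ = America/Chicago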
Hi Splunk Community,

I am trying to create a props.conf for the sample log file below. My goals are to:
* Delete the Header block and keep that data from being ingested.
* Break the individual events starting with ( "library!WindowsService_98!..." OR "processing!ReportServer_0-127!" )
* Extract the timestamp, such as ( "!11/26/2023-00:21:18::" )

Here's the props.conf that I have so far, but it is not working.
---------
[sourcetype_name]
disabled = false
SHOULD_LINEMERGE = false
MAX_TIMESTAMP_LOOKAHEAD = 80
TIME_FORMAT = %m/%d/%Y-%H:%M:%S
LINE_BREAKER = ([\r\n]+)library!
SEDCMD-null = (<Header>([\s\S]*?)<\/Header>)   disabled
------------------- sample log file -------------------------
<Header>
<Product>Microsoft SQL Server Reporting Services Version 2007.0100.6000.029 ((Random_value).18802-2848 )</Product>
<Locale>English (United States)</Locale>
<TimeZone>Central Daylight Time</TimeZone>
<Path>C:\Program Files\Microsoft SQL Server\MSRS10.MSSQLSERVER\Reporting Services\Logfiles\ReportServerService__11_26_2023_00_00_01.log</Path>
<SystemName>hostName01</SystemName>
<OSName>Microsoft Windows NT 6.2.9200</OSName>
<OSVersion>6.2.9200</OSVersion>
<ProcessID>3088</ProcessID>
</Header>library!WindowsService_98!1234!11/26/2023-00:00:01:: i INFO: Call to CleanBatch()
library!WindowsService_98!1234!11/26/2023-00:00:01:: i INFO: Cleaned 0 batch records, 0 policies, 0 sessions, 0 cache entries, 0 snapshots, 0 chunks, 0 running jobs, 0 persisted streams, 0 segments, 0 segment mappings.
library!WindowsService_98!1234!11/26/2023-00:00:01:: i INFO: Call to CleanBatch() ends
library!WindowsService_98!1218!11/26/2023-00:10:01:: i INFO: Call to CleanBatch()
library!WindowsService_98!1218!11/26/2023-00:10:01:: i INFO: Cleaned 0 batch records, 0 policies, 1 sessions, 0 cache entries, 1 snapshots, 14 chunks, 0 running jobs, 0 persisted streams, 9 segments, 9 segment mappings.
library!WindowsService_98!1218!11/26/2023-00:10:01:: i INFO: Call to CleanBatch() ends
library!WindowsService_98!d00!11/26/2023-00:20:01:: i INFO: Call to CleanBatch()
library!WindowsService_98!d00!11/26/2023-00:20:01:: i INFO: Cleaned 0 batch records, 0 policies, 0 sessions, 0 cache entries, 0 snapshots, 0 chunks, 0 running jobs, 0 persisted streams, 0 segments, 0 segment mappings.
library!WindowsService_98!d00!11/26/2023-00:20:01:: i INFO: Call to CleanBatch() ends
library!ReportServer_0-127!2558!11/26/2023-00:21:18:: i INFO: RenderForNewSession('/Hampton.Common.Reports/BOL')
processing!ReportServer_0-127!2558!11/26/2023-00:21:18:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 19., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 19.
processing!ReportServer_0-127!2558!11/26/2023-00:21:18:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 54., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 54.
processing!ReportServer_0-127!2558!11/26/2023-00:21:18:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 61., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 61.
processing!ReportServer_0-127!2558!11/26/2023-00:21:18:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 62., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 62.
processing!ReportServer_0-127!2558!11/26/2023-00:21:19:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 1., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 1.
processing!ReportServer_0-127!2558!11/26/2023-00:21:19:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 2., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 2.
processing!ReportServer_0-127!2558!11/26/2023-00:21:19:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 1., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 1.
processing!ReportServer_0-127!2558!11/26/2023-00:21:19:: e ERROR: Throwing Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 2., ; Info: Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: There is no data for the field at position 2.
library!WindowsService_98!1234!11/26/2023-00:30:01:: i INFO: Call to CleanBatch()
------------------- sample log file end -------------------------
If this only happens upon a restart of the server, you will need to ensure that the variable is declared in your startup script. Splunk will be configured to boot-start in a best-practice scenario. However, if the boot-start script does not declare that variable, it will not exist when Splunk is started by init or systemd. Your shell RC file does not execute when the server boots or when init/systemd starts services; it only runs upon interactive login. Keep it there, though: it matters when Splunk is restarted by a human running a shell, or by automation for that matter. Solution: declare the variable. For init, you can declare it just like you already do in your shell RC, inside the start) block of the init script. For systemd, you declare it like this in the [Service] stanza:     Environment="REQUESTS_CA_BUNDLE=/full/path/to/your/internal/ca/pem/file/with/no/variables"
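As a minimal sketch for the systemd case (the unit name Splunkd.service, the drop-in path, and the PEM path are assumptions; match them to your own boot-start setup), a drop-in file keeps the change out of the generated unit:

# hypothetical drop-in: /etc/systemd/system/Splunkd.service.d/ca-bundle.conf
[Service]
Environment="REQUESTS_CA_BUNDLE=/opt/splunk/etc/auth/internal-ca.pem"

Then reload systemd and restart Splunk so the variable is picked up:

systemctl daemon-reload
systemctl restart Splunkd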
Announcement and Additional Resources here! Hey there, Community! We're excited to share Cisco AppDynamics Smart Agent, newly released in v23.11.0. It simplifies agent operations, enabling installation, upgrade, and rollback right from the Controller UI. Plus, you can view Smart Agent inventory details along with other installed agents. There's also a CLI option for advanced configuration. Getting started with Smart Agent is simple; just install it on each desired host.

Find Smart Agent Info
We've put together a Smart Agent resource collection here in Community. For an overview, we recommend starting with Smart Agent: agent lifecycle management reimagined. You'll also find the first pair of several real-life example how-to articles:
* Exploring an APM agent upgrade scenario with Smart Agent
* Exploring an APM agent installation scenario with Smart Agent
We've also put together a collection of Frequently Asked Questions for Smart Agent here. Don't miss @Aaron.Schifman's clickable demo: Smart Agent: How easy is it? To learn more, see the complete Smart Agent documentation.

Tell us what you think! Our whole team is eager to see how you use Smart Agent. Join the conversation here. We're open for questions!

Best,
Claudia Landivar, Community Manager & Editor
Ryan Paredez, Community Manager
I was doing regular health checks in my Splunk deployment and found that indexing health is critical, mainly due to the small bucket count. maxbucketSize has been set to auto, and I'm not sure what else might be the cause. I'm new to the org and have little to no idea about the underlying architecture and implementation.
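While you dig into the architecture, a rough sketch of a search that surfaces bucket counts and sizes per index might help narrow things down (the wildcard index filter and the field choices are assumptions; run it from a search head that can reach the indexers):

| dbinspect index=*
| stats count AS buckets avg(sizeOnDiskMB) AS avg_size_mb max(sizeOnDiskMB) AS max_size_mb BY index state
| sort - buckets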
AFAIK, Splunk Cloud does not provide sample data.  You should be able to use a Universal Forwarder to index data from your local workstation.
Hi, in Splunk Cloud, the app list on the left side has an app called Universal Forwarder (or something similar). Just click it and it gives you instructions on how to install the forwarder and collect data from your local node into Splunk Cloud. r. Ismo
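For reference, on the local node the flow usually boils down to something like the following (the credentials package name splunkclouduf.spl, the install path, and the credentials are assumptions; the app's own instructions are authoritative):

# install the forwarder credentials package downloaded from the Universal Forwarder app
/opt/splunkforwarder/bin/splunk install app /tmp/splunkclouduf.spl -auth admin:yourpassword
/opt/splunkforwarder/bin/splunk restart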
Thanks Rich! I am brand new and I have no data that I am prepared to upload to SPLUNK.  I am just exploring and I understood that in the trial mode, SPLUNK provided sample data could be used.  But I don't know how to get that.  It seems to be expecting a real stream from me.   
Have you tried Settings->Add Data? You can also set up a universal forwarder to submit files, either by monitoring them or by using splunk add oneshot.
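For example (the path, index, and sourcetype below are made-up placeholders):

# continuously monitor a directory
splunk add monitor /var/log/myapp/ -index main -sourcetype myapp
# index a single file once
splunk add oneshot /var/log/myapp/archive.log -index main -sourcetype myapp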
https://www.splunk.com/en_us/download/previous-releases-universal-forwarder.html
Run the query one pipe at a time (start with the first pipe and iteratively add the next) until you find the one that returns no results.  Share that command and we'll see if we can figure out what's happening.
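For example, with a made-up search like the one below, you would run each stage on its own and stop at the first one that comes back empty:

index=foo sourcetype=bar
index=foo sourcetype=bar | stats count BY host
index=foo sourcetype=bar | stats count BY host | where count > 100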
How do I add sample data to a cloud-based trial instance?
The scheduled PDF will use the time range defined in the dashboard.  The dashboard cannot use a time picker or other user input.
I failed too... I ended up setting up syslog-ng with TLS.
Hello, I unfortunately lost my web UI password for the admin account. What is the way to reset it? I know that on Splunk Enterprise there is user-seed, the REST API, and even a CLI command, but I can't find any clue about Splunk UBA. Thanks for the help.
Hi, my apologies for not being more specific. The query I provided was modified for security reasons. I am indeed replacing the placeholders like "foo" with the correct indexes, plus the rest of the required data (host names, etc.). Thank you.