All Posts

Hi, how can I rewrite the following search using tstats and the Network_Traffic datamodel?

index=*pan* sourcetype="pan:threat" severity IN ("high", "critical")

So far I have tested the following:

| tstats count from datamodel=Network_Traffic by All_Traffic.src_ip

but given that "severity" is not a field included in the datamodel, only in the index, how can I add the condition severity IN ("high", "critical")? Thank you!
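One possible direction, sketched under the assumption that the pan:threat data is also mapped to the CIM Intrusion_Detection datamodel (which does include a severity field); the dataset and field names below come from that model and are not confirmed anywhere in this thread:

| tstats count from datamodel=Intrusion_Detection where IDS_Attacks.severity IN ("high", "critical") by IDS_Attacks.src
| rename IDS_Attacks.src AS src_ip

If the data cannot be mapped to a model that carries severity, tstats against Network_Traffic alone cannot filter on it, because tstats only sees fields that exist in the datamodel.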
Is there any playbook for setting up Heavy Forwarders?
Hi, have you read https://docs.splunk.com/Documentation/Splunk/9.2.1/Data/AboutHECIDXAck? And have you implemented ack response handling on your HEC client? Are you using a separate channel value on every HEC client instance? How many HEC receivers do you have, and are they behind a load balancer? r. Ismo
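For reference, a minimal sketch of what client-side ack handling looks like, assuming indexer acknowledgment is enabled on the token; the host name, token, and channel GUID below are placeholders:

# Send an event on a dedicated channel (placeholder URL, token, and channel GUID)
curl -k https://splunk.example.com:8088/services/collector/event \
  -H "Authorization: Splunk 11111111-1111-1111-1111-111111111111" \
  -H "X-Splunk-Request-Channel: 22222222-2222-2222-2222-222222222222" \
  -d '{"event": "test event"}'
# The response carries an ackId, e.g. {"text":"Success","code":0,"ackId":0}

# Poll the ack endpoint on the same channel to confirm the event was indexed
curl -k https://splunk.example.com:8088/services/collector/ack \
  -H "Authorization: Splunk 11111111-1111-1111-1111-111111111111" \
  -H "X-Splunk-Request-Channel: 22222222-2222-2222-2222-222222222222" \
  -d '{"acks": [0]}'

The client should only treat an event as delivered once its ackId comes back true from the ack endpoint.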
Hi, Splunk is designed to archive data from buckets, not at the collection phase. If you are running on AWS then you could try to archive data before it is indexed with ingest actions, but I think you are running it on premises? The best option is to use an archive script in indexes.conf for archiving buckets. If this is not an option for you, then you have two options: 1) set up props.conf + transforms.conf to duplicate that data, e.g. to a syslog server, and use that to store and archive it. But as Splunk uses UDP to send the syslog feed, you will lose some events from time to time. 2) Use some other tool to collect and archive those events and have that tool also send them to Splunk. r. Ismo
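A minimal sketch of the two configuration points mentioned above; the index name, script path, sourcetype, and syslog destination are placeholders rather than anything taken from this thread:

# indexes.conf -- run an archive script against each bucket as it is frozen
[my_index]
coldToFrozenScript = "$SPLUNK_HOME/bin/python" "$SPLUNK_HOME/bin/copy_bucket_to_archive.py"

# props.conf -- route a copy of one sourcetype to a syslog output group
[my:sourcetype]
TRANSFORMS-archive_copy = send_to_archive_syslog

# transforms.conf
[send_to_archive_syslog]
REGEX = .
DEST_KEY = _SYSLOG_ROUTING
FORMAT = archive_syslog_group

# outputs.conf -- the syslog destination referenced by the transform
[syslog:archive_syslog_group]
server = archive.example.com:514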
Thanks for the side message.

cms.apache-access
| eval request_time_num = tonumber(request_time)
| eval category = case(
    request_time_num < 100000, "<1s",
    request_time_num < 200000, "<2s",
    request_time_num < 300000, "<3s"
  )
| stats count by category
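Since the original question asks for counts under 3s, under 2s, and under 1s (which overlap), a cumulative variant may be closer to the intent; this is only a sketch, assuming request_time is in microseconds as the thresholds above imply:

cms.apache-access
| eval request_time_num = tonumber(request_time)
| stats count(eval(request_time_num < 100000)) AS "under_1s"
        count(eval(request_time_num < 200000)) AS "under_2s"
        count(eval(request_time_num < 300000)) AS "under_3s"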
Hi, you could try something like

LINE_BREAKER = ([\n\r]+)\d{8} \d\d:\d\d:
TIME_FORMAT = %Y%m%d %H:%M:%S.%3Q
TIME_PREFIX = ^

r. Ismo
Hello, I'm new to Dashboard Studio. I'm looking for a way to show/hide certain visualizations based on a user's selection in a dropdown, i.e. based on a token value. As I understand, this is pretty easy to achieve in the older (XML-based) version of dashboards using the "depends" attribute. Is there an equivalent of this in Dashboard Studio? I wasn't able to find any good info on this.
I'm trying to figure out how to query all of the events from an Apache log and produce a report with counts of the number of events with request_time less than 3s, less than 2s, and less than 1s.
Running index="firewall" works successfully, and adding sourcetype="firewall" lets me search through the logs, but for some reason it will only let me filter and look for the fields below. I can't look for destination IP addresses?
I would check that the saved search populating the forwarder table in MC is finding the results as expected. Maybe the logs aren't making it from those forwarders that are missing? If you followed the setup instructions below, I don't think you're missing anything glaringly obvious. Saved search and setup information: https://docs.splunk.com/Documentation/Splunk/9.2.1/DMC/Configureforwardermonitoring
I have a log that needs a props.conf setup, but the year, month, and day are combined into one with no spaces or separators. How can I handle this with a regex in the LINE_BREAKER or TIME_FORMAT? This is an example of the start of each event:

20240507 10:47:38.467 [DEBUG] 12672
Thanks! I will give this a shot and see if it works.
This could be a number of things as to why you're not getting any results. With Splunk you should be able to see the fields in the fields sidebar, provided you have access to the index (permissions) and the data has been onboarded correctly with fields extracted. Run index="firewall" and see if you get data, then find the sourcetype associated with the data you want to search.

Example:

index="firewall" sourcetype=<Add your sourcetype here>
| table host, src_addr, dest_addr

Note: The fields you're interested in may be different based on your data, so look at the fields sidebar on the left. If you can't get anything, it may be that you don't have permissions to see that firewall index/data, or the data has not been onboarded correctly.
The first thing about dashboards is that you should draw out a design: what data, what fields, and what kind of layout (table, chart, timechart, forms, etc.). Then create a prototype dashboard based on that and refine it until you have the results you want. Why not try to create the dashboards yourself? Have a look here, there are several examples: https://docs.splunk.com/Documentation/SplunkCloud/latest/SearchTutorial/Createnewdashboard Even better, if you run through this tutorial, by the end of the week you should be able to create some of your own dashboards: https://docs.splunk.com/Documentation/SplunkCloud/9.1.2312/SearchTutorial/WelcometotheSearchTutorial

These are very simple examples of different ways to present your data and put it into a dashboard.

This shows just a table of the fields of interest to you:

index="wireless_retail" source="CPS.cpsLog"
| fields ID, Date, Level, Logger, Message
| table Date, ID, Level, Message

This shows how many events there are by the Level field:

index="wireless_retail" source="CPS.cpsLog"
| fields ID, Date, Level, Logger, Message
| stats count by Level, ID, Message

This shows counts over time for Level by ID (timechart only accepts a single split-by field, so the two are combined first):

index="wireless_retail" source="CPS.cpsLog"
| fields ID, _time, Date, Level, Logger, Message
| eval Level_ID=Level.":".ID
| timechart span=1h count by Level_ID

You can also download this app and use the many great examples here: https://splunkbase.splunk.com/app/1603
The below are two different dropdown lists, as we have different hosts and indexes. Based on the index selection I set/unset tokens to show/hide panels. Can we make this a single query with a single dropdown list, based on the value of whichever dropdown is visible?

1) index=aaa (source="/log/test.log" $host1$ ) | rex field=name "(?<DB>[^\.]*)" | rename DB AS "Database" | table Database | dedup Database
2) index=bbb (source="/log/test.log" $host2$ ) | rex field=name "(?<DB>[^\.]*)" | rename DB AS "Database" | table Database | dedup Database

If ddl1 is visible, fetch its value and pass it to $host1$ in the query; if ddl2 is visible, fetch its value and pass it to $host2$. Or, based on the dropdown value selected, can we set a token and pass it to a single query, to avoid multiple queries that differ only in host/index?
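One possible way to collapse this into a single query, sketched for a Simple XML dashboard; the token names, choice values, and hosts below are made up for illustration. A single dropdown packs both index and host into its value, and a change handler splits them into two tokens:

<input type="dropdown" token="selection">
  <label>Environment</label>
  <!-- each choice value packs "index|host" into one string (placeholder values) -->
  <choice value="aaa|host_a*">Environment A</choice>
  <choice value="bbb|host_b*">Environment B</choice>
  <change>
    <eval token="sel_index">mvindex(split($value$,"|"),0)</eval>
    <eval token="sel_host">mvindex(split($value$,"|"),1)</eval>
  </change>
</input>

The panel then needs only one search:

index=$sel_index$ (source="/log/test.log" host=$sel_host$) | rex field=name "(?<DB>[^\.]*)" | rename DB AS "Database" | table Database | dedup Database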
I am looking to write a simple search that tells me if a host or hosts are reaching out to a specific IP address. So far I have:

index="firewall" host=hostname src_addr=x.x.x.x dest_addr=x.x.x.x

When I run this it doesn't come back with anything. Should I be searching under my domain instead? I would like the output lined up like below:

Hostname | source IP | destination IP
Our Splunk instance is currently set up as a deployment server. All our clients have the Universal Forwarder installed and are set up as deployment clients, phoning home to the server to get their necessary apps. Under the "Forwarder Management" page of the Distributed Environment settings, we can see all 20 of our clients with their respective host names and IP addresses actively talking with the server, phoning home and getting apps deployed... However, when I go to the Monitoring Console's "Forwarders: Deployment" page, only 6 of the 20 Universal Forwarders are showing as installed and active? I'm sure we're messing up one of the many different config files somewhere, but I'm not sure which one...
Hello, is there any way to add a custom logo in place of the Splunk logo in an exported PDF? I am on Splunk Cloud, not Enterprise, so I'm not sure if I can access a static folder to use with Server settings > Email settings > PDF report settings. Alternatively, I've tried just adding an image to the dashboard, but even after getting it to appear (using embedded base64), the image does not appear when exported to PDF. Any guidance or alternatives would be appreciated.
Hi, I have raw data as below, with the fields ID, Date, Level, Logger, and Message, which need to be displayed in a dashboard.

index="wireless_retail" source="CPS.cpsLog" Level="ERROR" Logger="Utils.Helpers.LogHelper"

Can someone help me with the dashboard creation for this?

ID="39090", Date="2024-05-07 14:12:53.313", Thread="4", Level="ERROR", Logger=".Utils.Helpers.LogHelper", Message="UserName: abc Location:  Sales Channel: GW_STORE Error in Path: /pap/getcpsinput Raw Url: /pap/getcpsinput User Name: Error: System.Data.Entity.Core.EntityException: An error occurred while starting a transaction on the provider connection. See the inner exception for details. ---> System.Data.SqlClient.SqlException: Execution Timeout Expired. The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception: The wait operation timed out --- End of inner exception stack trace --- at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose) at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error) at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync() at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket() at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer() at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value) at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)