All Posts



Hi @victorcorrea, I usually use the monitor input for this requirement. I like the batch input, which removes files after reading, but I experienced an issue with big files, so, to my knowledge, the best solution is monitor. I have never used MonitorNoHandle. About permissions: you have to grant read access to the splunk user or to its group. Usually the files to read have 644 permissions. Ciao. Giuseppe
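For reference, a minimal monitor stanza might look like this (the path, index, and sourcetype here are illustrative, not from the original thread):

```
# inputs.conf -- minimal monitor input; path, index, and sourcetype are examples
[monitor:///var/log/myapp/app.log]
index = main
sourcetype = myapp:log
disabled = false
```

The user Splunk runs as (or its group) needs read access to the monitored file, e.g. 644 permissions with the file owned or group-readable by that user.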
Hi @hazem, Splunk Stream is a packet capture app; to my knowledge it isn't the best solution for DNS logs. I usually use the Splunk_TA_Windows add-on. Ciao. Giuseppe
Hi @chandrag, on Splunk Cloud you have few options for direct action, so open a case with Splunk Cloud Support. It seems that KV Store is disabled on one component of your architecture (which is a best practice for indexers and heavy forwarders), but the add-on you're using requires KV Store. Ciao. Giuseppe
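For anyone checking where KV Store is enabled, the setting lives in server.conf; this is a sketch only (on Splunk Cloud you generally cannot change this yourself and need to go through Support):

```
# server.conf -- KV Store state on a given instance
[kvstore]
disabled = true   # common on indexers and heavy forwarders;
                  # the add-on's data collection needs an instance where this is false
```

On an instance you control, you can verify the current state with `splunk show kvstore-status`.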
In Splunk Cloud, in one of my client environments, I'm seeing the message below: "TA-pps_ondemand Error: KV Store is disabled. Please enable it to start the data collection." Please help me with a suitable solution.
Hi Bhumi, For example, with that instruction I am extracting data from the msg_old field where "full" appears inside another word, like "she works successfully for us" ("successfully" contains "full"), but what I need is just the data that contains the word "full" itself. For instance: "The house is full." Hope this clarifies. Thanks.
Hi ITWhisperer, For example, with that instruction I am extracting data from the msg_old field where "full" appears inside another word, like "she works successfully for us" ("successfully" contains "full"), but what I need is just the data that contains the word "full" itself. For instance: "The house is full." Hope this clarifies. Thanks.
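A sketch of matching the literal word rather than any substring, assuming the field is msg_old as above; \b is a word-boundary anchor, so "successfully" no longer matches:

```
... | regex msg_old="\bfull\b"
```

An equivalent eval-based filter would be `| where match(msg_old, "\bfull\b")`.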
Thank you very much. This is working for me and feels a bit faster than before.
Hey all, 2-part question here... I'm using the MLTK Smart Forecasting tool to model hard drive free space (time vs. hard drive space). Currently the y-axis (i.e. free space % / free space MB) automatically adjusts its range to the graph produced by the forecast. I want it to instead run from 0-100 (in the case of Free Space %) and be something reasonable for the Free Space MB line. By extension, how would I get the graph to tell or show me where the x-intercept would be from the prediction and/or confidence interval (i.e. tell me the date the hard drive would theoretically run out of space, where Free Space %/MB = 0)? Attached is an image of the current output visualization, should this help.
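On the second part, one hedged sketch is to run the forecast yourself and search the output for the first point where the predicted value crosses zero. The field names here (FreeSpacePct, prediction) are assumptions; the Smart Forecasting assistant may name its output fields differently:

```
| predict FreeSpacePct as prediction future_timespan=90
| where prediction <= 0
| head 1
| eval run_out_date = strftime(_time, "%Y-%m-%d")
| table run_out_date
```

For the y-axis range, a classic (Simple XML) chart lets you pin it with the options charting.axisY.minimumNumber=0 and charting.axisY.maximumNumber=100; the MLTK assistant's built-in preview may not expose those options, so you may need to save the forecast search into a dashboard of your own.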
Hi There, I have an issue: Drill-down and Next Steps are not displayed in Incident Review. I created a Splunk lab for research and development by myself. I just installed Splunk Enterprise and Enterprise Security (no other external apps) and I ingest DVWA logs into my Splunk. As you know, DVWA has various vulnerabilities, and I want to use it as a log source that I then manage in Splunk. Therefore, I made a rule about uploading inappropriate files. The query is like this:

index=lab_web sourcetype="apache:access"
| rex field=_raw "\[(?<Time>[^\]]+)\] \"(?<Method>\w+) (?<Path>/DVWA/vulnerabilities/upload/[^/]+\.\w+) HTTP/1.1\" (?<Status>\d{3}) \d+ \"(?<Referer>[^\"]+)\" \"(?<UserAgent>[^\"]+)\""
| eval FileName = mvindex(split(Path, "/"), -1)
| eval FullPath = "http://localhost" . Path
| where match(FileName, "\.(?!jpeg$|png$)[a-zA-Z0-9]+$")
| table Time, FileName, FullPath, Status

In that correlation search, I added a notable with the drill-down and the next steps filled in. But when I open Incident Review, the drill-down and next steps that I created are not shown. Maybe there is an application that I haven't installed, or something else? I will attach my full correlation search settings, including the notable, drill-down, and Next Steps. Splunk Enterprise Version: 9.3.1 Enterprise Security Version: 7.3.2
Yes, the HEC input stanza will honor all routing fields: _TCP_ROUTING, _SYSLOG_ROUTING, and _INDEX_AND_FORWARD_ROUTING, as well as other fields per inputs.conf.spec. outputgroup internally maps to the _TCP_ROUTING value. But _TCP_ROUTING is a multi-value field, so you can set multiple output groups.
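A sketch of a HEC token stanza routing to two output groups; the token value and group names here are made up for illustration:

```
# inputs.conf
[http://my_hec_token]
token = 11111111-2222-3333-4444-555555555555
_TCP_ROUTING = group_a, group_b

# outputs.conf
[tcpout:group_a]
server = idx1.example.com:9997

[tcpout:group_b]
server = idx2.example.com:9997
```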
"In addition, I don't want to overwrite the hostnames.csv file." You have no choice about this. A CSV file is just a file. You can append new rows to the file - which your use case does not call for - or you can rewrite the file.
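As a sketch of the rewrite option, using the hostnames.csv from this thread (the subsearch feeding new hosts is a placeholder):

```
| inputlookup hostnames.csv
| append
    [ search index=main | stats count by host | fields host ]
| dedup host
| outputlookup hostnames.csv
```

If plain appending were acceptable, `... | table host | outputlookup append=true hostnames.csv` adds rows while keeping the existing ones.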
I'm still here with no idea about this issue.
In the past I've used outputgroup = <string> in the inputs.conf [http] stanzas. It sounds like the versions mentioned (and newer versions) now support _TCP_ROUTING, _meta, and a few other settings; is that correct? It is nice to have the product match its spec file documentation. Thanks
I'm trying to format timestamps in a table in Dashboard Studio. The original times are values such as 2024-10-29T10:13:35.16763423-04:00 - that is the value I see if I don't add a specific format. If I add a format to the column, "YYYY-MM-DD HH:mm:ss.SSS Z", it is formatted as 2024-10-29 10:13:35.000 -04:00. Why are the millisecond values zero? Here is the section of the source code for reference:

"visualizations": {
    "viz_mfPU11Bg": {
        "type": "splunk.table",
        "dataSources": {
            "primary": "ds_xfeyRsjD"
        },
        "options": {
            "count": 8,
            "columnFormat": {
                "Start": {
                    "data": "> table | seriesByName(\"Start\") | formatByType(StartColumnFormatEditorConfig)"
                }
            }
        },
        "context": {
            "StartColumnFormatEditorConfig": {
                "time": {
                    "format": "YYYY-MM-DD HH:mm:ss.SSS Z"
                }
            }
        }
    }
},

Any ideas what I'm doing wrong? Thanks, Andrew
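One workaround, if the table's formatter cannot parse the 8-digit fractional seconds, is to normalize the value in SPL before it reaches the table and drop the column format entirely. The field name Start and the %N/%:z format tokens are assumptions about the data and about Splunk's strptime support, so treat this as a sketch:

```
| eval Start_epoch = strptime(Start, "%Y-%m-%dT%H:%M:%S.%N%:z")
| eval Start = strftime(Start_epoch, "%Y-%m-%d %H:%M:%S.%3N %z")
```

With the value pre-formatted as a plain string, the table can display it as-is instead of re-parsing it.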
Thank you. That's what I thought too. However, 30 05 8-14 * 2 is a valid cron expression, and Splunk should consider fixing this.
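For readers following along, the fields of that expression break down as follows; note that standard cron treats restricted day-of-month and day-of-week fields as an OR:

```
# 30 05 8-14 * 2
#  |  |  |   | +-- day-of-week: 2 (Tuesday)
#  |  |  |   +---- month: any
#  |  |  +-------- day-of-month: 8th through 14th
#  |  +----------- hour: 05
#  +-------------- minute: 30
# Standard cron fires at 05:30 on days 8-14 of every month and also on
# every Tuesday (OR semantics when both day fields are restricted).
```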
Now it matches what the document says:

# GENERAL SETTINGS:
# The following settings are valid for all input types (except file system
# change monitor, which is described in a separate section in this file).
These two sections of inputs.conf (whichever is applicable for monitor/splunktcpin/tcpin, etc.):

############################################################################
# GENERAL SETTINGS:
# The following settings are valid for all input types (except file system
# change monitor, which is described in a separate section in this file).
# You must first enter a stanza header in square brackets, specifying the input
# type. See later in this file for examples. Then, use any of the
# following settings.
#
# To specify global settings for Windows Event Log inputs, place them in
# the [WinEventLog] global stanza as well as the [default] stanza.
############################################################################

############################################################################
# This section contains options for routing data using inputs.conf rather than
# outputs.conf.
#
# NOTE: Concerning routing via inputs.conf:
# This is a simplified set of routing options you can use as data comes in.
# For more flexible options or details on configuring required or optional
# settings, see outputs.conf.spec.
############################################################################
The Stream app can save pcaps using Configure Packet Stream. I was able to get packets saved using just IPs. Now I want to search for content based on some Snort rules. For ASCII content, I am trying to create a new target using the content/contains field by just putting in an ASCII word. For hex values, there are no instructions. Do I use escape characters (\x01\x02...), |01 02 ...|, or a regular expression? Is there an example?
1. 9.0.5 is a fairly old version. Unless there are severe known bugs, it's recommended to use the latest version available for the platform in question.
2. What do you mean by "stops"? Does the process still exist but stop sending data, or is the process killed? Did you check the logs (both the UF's logs and the general system logs)?
3. I'm assuming the server wasn't restarted lately, right?
https://docs.splunk.com/Documentation/Splunk/9.3.1/Updating/Upgradepre-9.2deploymentservers