| bin _time span=1h
| stats count as volume by _time component
| bin _time span=1mon
| chart max(volume) as volume by component _time
| addtotals
| eval Average=Total/3
Hi @phanikumarcs , sorry if I'm repeating: if you don't want a full-text search on _raw, you have to declare the field to associate with each input (every kind of them). But pay attention if some events don't have one of the fields, because the default (e.g. event_id=*) will exclude the events without this field. Ciao. Giuseppe
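As an illustrative sketch of what Giuseppe describes (the token, index, and field names here are hypothetical), a text input bound to a named field might look like this in Simple XML:

```xml
<form version="1.1">
  <label>example</label>
  <fieldset submitButton="false">
    <input type="text" token="tok_event_id">
      <label>Event ID</label>
      <default>*</default>
    </input>
  </fieldset>
  <row>
    <panel>
      <table>
        <search>
          <!-- the token is applied to a named field, not to _raw;
               note that with the default *, events that lack an
               event_id field are excluded from the results -->
          <query>index=main event_id=$tok_event_id$ | stats count by event_id</query>
          <earliest>-24h</earliest>
          <latest>now</latest>
        </search>
      </table>
    </panel>
  </row>
</form>
```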
@gcusello @ITWhisperer To clarify, my understanding is that if any fields are included, the search will only look at those fields rather than '_raw', and this applies to all input methods (text, dropdown, multi-select, and others). Is that correct? In that case, what is the solution for custom fields, like the "Severity" field in my query with values Critical, Warning, and Information?
Hi @calvinmcelroy, Splunk can read LZW, gzip, bzip2, or any other compression format that supports streaming via stdin/stdout if properly configured, so I'm surprised you had problems with logrotate. Is your configuration outside the norm? If you're running oneshot from a host with Splunk Enterprise installed, i.e. a heavy forwarder, then yes, you should have Palo Alto Networks Add-on for Splunk installed on the server.
Hi @aavyu20, I recommend contacting TrackMe support directly. Their contact information is available on the TrackMe Splunkbase page.
Hi @PReynoldsBitsIO, URL options are specified in $SPLUNK_HOME/etc/apps/Trellix_Splunk/appserver/static/js/build/globalConfig.json:

...
{
    "field": "url",
    "label": "URL",
    "help": "Select a unique URL for this account. Refer to https://docs.trellix.com/ to get specific FQDN and Region for your account",
    "required": true,
    "type": "singleSelect",
    "options": {
        "disableSearch": true,
        "autoCompleteFields": [
            { "value": "https://arevents.manage.trellix.com", "label": "Global" },
            { "value": "https://areventsfrk.manage.trellix.com", "label": "Frankfort" },
            { "value": "https://areventsind.manage.trellix.com", "label": "India" },
            { "value": "https://areventssgp.manage.trellix.com", "label": "Singapore" },
            { "value": "https://areventssyd.manage.trellix.com", "label": "Sydney" }
        ]
    }
},
...

You may be able to add custom endpoints to this file following the pattern shown, but I recommend contacting the app developer directly to confirm. You can find their email address on the contact tab of other apps they've developed: https://splunkbase.splunk.com/apps?author=lgodoy
Hi @lumi, Although your command should work, you might try:

$SplunkInstallationDir = "C:\Program Files\SplunkUniversalForwarder"
& "$($SplunkInstallationDir)\bin\splunk.exe" start --accept-license --answer-yes --no-prompt

# or

$SplunkExe = "C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe"
& $SplunkExe start --accept-license --answer-yes --no-prompt

To run "splunk start," the account should have Full Control permission on C:\Program Files\SplunkUniversalForwarder and all subdirectories and files. Ideally, the command should be executed by the service account, assuming the forwarder is also configured to run as a service.
Hi @mnj1809, The choice element value attribute is a literal string. The string is compared against the current token value to determine which radio button is selected. For example:

<input type="radio" token="tokradio">
  <choice value="1">One</choice>
  <choice value="2">Two</choice>
  <choice value="3">Three</choice>
  <default>1</default>
</input>

The default value of $tokradio$ is 1, and choice One is selected. If either a user interaction or dashboard code sets the value of $tokradio$ to 2 or 3, choice Two or Three is selected, respectively. If $tokradio$ is set to a value other than the value attributes defined in the choice list, e.g. 4, no choice is selected. If your goal is to use a radio input to select field names and a text input to enter field values, you can define and update a separate token when either token changes:

<form version="1.1" theme="light">
  <label>mnj1809_radio</label>
  <init>
    <set token="tokradiotext">$tokradio$="$toktext$"</set>
  </init>
  <fieldset submitButton="false">
    <input type="radio" token="tokradio">
      <label>Field</label>
      <choice value="category">Group</choice>
      <choice value="severity">Severity</choice>
      <default>category</default>
      <change>
        <set token="tokradiotext">$value$="$toktext$"</set>
      </change>
    </input>
    <input type="text" token="toktext">
      <label>Value</label>
      <default>*</default>
      <change>
        <set token="tokradiotext">$tokradio$="$value$"</set>
      </change>
    </input>
  </fieldset>
  <row>
    <panel>
      <event>
        <title>tokradiotext=$tokradiotext$</title>
        <search>
          <query>| makeresults</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
      </event>
    </panel>
  </row>
</form>
Hi @sandfly_dev, Can you add an option to your configuration to allow customers to provide a list of trusted certificates in PEM or some other format? This could be a single self-signed certificate, a list of concatenated certificates in a certificate chain, etc., depending on what's supported by your code.
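To illustrate the suggestion, here is a minimal sketch assuming the code is Python-based (the function and parameter names are hypothetical): a customer-supplied PEM bundle is trusted instead of the system store when one is provided.

```python
import ssl

def make_ssl_context(ca_bundle_path=None):
    """Build a TLS context for verifying the server.

    If the customer supplied a PEM bundle (a single self-signed
    certificate or a concatenated certificate chain), trust that
    bundle instead of the system store; otherwise fall back to
    the system trust store.
    """
    if ca_bundle_path:
        # cafile may contain one or more concatenated PEM certificates
        return ssl.create_default_context(cafile=ca_bundle_path)
    return ssl.create_default_context()
```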
Hi @Intidev, I've achieved something similar in the past using a column chart with overlays. Hours are represented by two series, one for the hours before start of day (slack hours) and another for working hours. Events are represented by further series, with no more than one event per series; the overlays will render single events as points rather than lines. Building on @Richfez's example with mock data:

| makeresults format=csv data="day_of_week,slack_hours,work_hours,event01,event02,event03,event04,event05,event06,event07,event08,event09,event10,event11,event12,event13
2024-03-04,14,11,23,,,,,,,,,,,,
2024-03-05,0,0,,21,22,23,,,,,,,,,
2024-03-06,0,11,,,,,3,11,,,,,,,
2024-03-07,9,11,,,,,,,13,15,,,,,
2024-03-08,9,11,,,,,,,,,14,15,16,,
2024-03-09,0,0,,,,,,,,,,,,,
2024-03-10,9,11,,,,,,,,,,,,16,17"
| eval _time=strptime(day_of_week, "%F")
| chart values(slack_hours) as slack_hours values(work_hours) as work_hours values(event*) as event* over _time

we can save a column chart into a classic dashboard with the following configuration:

<dashboard version="1.1" theme="light">
  <label>intidev_chart</label>
  <row>
    <panel>
      <html>
        <style>
          #columnChart1 .ui-resizable { width: 500px !important; }
          #columnChart1 .highcharts-series.highcharts-series-1.highcharts-column-series { opacity: 0 !important; }
        </style>
      </html>
      <chart id="columnChart1">
        <search>
          <query>| makeresults format=csv data="day_of_week,slack_hours,work_hours,event01,event02,event03,event04,event05,event06,event07,event08,event09,event10,event11,event12,event13
2024-03-04,14,11,23,,,,,,,,,,,,
2024-03-05,0,0,,21,22,23,,,,,,,,,
2024-03-06,0,11,,,,,3,11,,,,,,,
2024-03-07,9,11,,,,,,,13,15,,,,,
2024-03-08,9,11,,,,,,,,,14,15,16,,
2024-03-09,0,0,,,,,,,,,,,,,
2024-03-10,9,11,,,,,,,,,,,,16,17"
| eval _time=strptime(day_of_week, "%F")
| chart values(work_hours) as work_hours values(slack_hours) as slack_hours values(event*) as event* over _time</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
          <sampleRatio>1</sampleRatio>
        </search>
        <option name="charting.axisLabelsY.majorTickVisibility">show</option>
        <option name="charting.axisLabelsY.majorUnit">1</option>
        <option name="charting.axisLabelsY.minorTickVisibility">hide</option>
        <option name="charting.axisTitleX.visibility">collapsed</option>
        <option name="charting.axisTitleY.text">Hour</option>
        <option name="charting.axisTitleY.visibility">visible</option>
        <option name="charting.axisX.abbreviation">none</option>
        <option name="charting.axisX.scale">linear</option>
        <option name="charting.axisY.abbreviation">none</option>
        <option name="charting.axisY.includeZero">1</option>
        <option name="charting.axisY.maximumNumber">24</option>
        <option name="charting.axisY.minimumNumber">0</option>
        <option name="charting.axisY.scale">linear</option>
        <option name="charting.chart">column</option>
        <option name="charting.chart.markerSize">16</option>
        <option name="charting.chart.nullValueMode">gaps</option>
        <option name="charting.chart.overlayFields">event01,event02,event03,event04,event05,event06,event07,event08,event09,event10,event11,event12,event13</option>
        <option name="charting.chart.stackMode">stacked</option>
        <option name="charting.drilldown">none</option>
        <option name="charting.layout.splitSeries">0</option>
        <option name="charting.layout.splitSeries.allowIndependentYRanges">0</option>
        <option name="charting.legend.placement">none</option>
        <option name="charting.fieldColors">{"work_hours": 0xc6e0b4}</option>
        <option name="height">500</option>
      </chart>
    </panel>
  </row>
</dashboard>

This gives us a dashboard with the desired layout. We can further manipulate the layout and colors with CSS and JavaScript (if available to us) and creative use of dashboard tokens.
The question is indeed a bit vaguely worded. But in general, you would first want to search only for authentication attempts (hard to say how, since we don't know what data you have). It would be best if you had data normalized to CIM; then you could just search from the data model. Then you just do stats over your desired splitting fields. That should do the trick.
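As a hedged sketch only, assuming the events are mapped to the CIM Authentication data model (adjust the action filter and split-by fields to match your data):

```
| tstats count from datamodel=Authentication
    where Authentication.action="failure"
    by Authentication.src Authentication.user
| rename Authentication.* as *
| sort - count
```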
Count is not the same as volume. Unless you have a synthetic field added during ingestion (or use summary indexing), you have to calculate it manually (unfortunately you cannot use tstats for that, so it's gonna be costly, since every single matching event has to be read and "measured"):

index=whatever <your other conditions>
| eval eventlength=len(_raw)

Now you can do some summarizing:

| bin _time span=1h
| stats sum(eventlength) as volume by source component whatever

This will give you one-hour volumes. Now you can do with it whatever you want, like the stats @ITWhisperer already posted.
Hi @shocko, The typical approach discards lines at an intermediate heavy forwarder or indexer by sending them to nullQueue:

# props.conf
[my_sourcetype]
# add line and event-breaking and timestamp extraction here
TRANSFORMS-my_sourcetype_send_to_nullqueue = my_sourcetype_send_to_nullqueue

# transforms.conf
[my_sourcetype_send_to_nullqueue]
# replace foo with a string or expression matching "keep" events
REGEX = ^(?!foo).
DEST_KEY = queue
FORMAT = nullQueue

Like @PickleRick, I've not seen a common use case for force_local_processing. I often say I don't want my application servers turning into Splunk servers, so I prioritize a lightweight forwarder configuration over data transfer. If CPU cores (fast-growing files) and memory (large numbers of files) cost you less than network I/O, you may prefer the force_local_processing option; you won't save on disk I/O either way. If you need a refresher on the functions performed by the utf8, linebreaker, aggregator, and regexreplacement processors, see https://community.splunk.com/t5/Getting-Data-In/Diagrams-of-how-indexing-works-in-the-Splunk-platform-the-Masa/m-p/590781/highlight/true#M103485.
The depends and rejects options control visibility of the element.  That is the only function of the options.  To use a token in an element, just invoke the token name within the element.  There is no need to "declare" the token as one might in a programming language.
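For example (a minimal sketch with hypothetical index, field, and token names): the row below is hidden until $abc$ is set, but any panel anywhere in the dashboard can still reference $abc$ in its query.

```xml
<fieldset submitButton="false">
  <input type="text" token="abc">
    <label>ABC</label>
  </input>
</fieldset>
<row depends="$abc$">
  <panel>
    <table>
      <search>
        <!-- depends only controls visibility; the token is used
             here simply by invoking its name -->
        <query>index=main field_a=$abc$ | stats count</query>
      </search>
    </table>
  </panel>
</row>
```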
There is no "search for specific values in any field" - where you have placed the token, it effectively searches the _raw field, and there doesn't appear to be anything wrong here. You have already got a "token-related condition". Please provide examples where this is not working for you, particularly with events which should have been found for a particular token value, or events which were found which shouldn't have been.
Hey Experts, I am encountering an issue with using filter tokens in a specific row on my dashboard. I have two filters named ABC and DEF; the token for ABC is $abc$ and for DEF is $def$. I want to pass these tokens only to one specific row, while for the others, I want to reject them. For the rows where I need to pass the tokens, I've used the following syntax: <row depends="$abc$ $def$"></row> For the row where I don't want to use the tokens, I've used the following syntax: <row rejects="$abc$ $def$"></row>. However, when I use the rejects condition, the rows are hidden. I want these rows to still be visible. Any help or example queries would be greatly appreciated. Thank you!
You are reading his request backwards.  That git project is for SENDING TO OpenCTI.  He (and I) need to RECEIVE FROM OpenCTI.  I cannot find anything that does this.
Hi @phanikumarcs , you have to declare the field that you want to use for the value in the text input, otherwise it will search in the raw text, and e.g. the host field usually isn't in the raw event, but in metadata. But this adds an additional issue to your dashboard: if the event_id field isn't present in all the events, adding event_id=* will exclude from the results all the events without this field, so beware of how you use this input. Ciao. Giuseppe
@gcusello great, understood. Suppose I want to search for the Server field value (goo1) in the EventID textbox: it should display the results for goo1, and similarly for the other fields as well (Message, Severity).
This is the sample stats command for my log:

index=company app=abc | stats count by component

I don't have a field for volume. We have to calculate the volume from the stats count.