<form version="1.1" theme="light">
  <label>Report</label>
  <search id="Night">
    <query>| inputlookup handover_timeline.csv | dedup Shift Date | search Shift="Night" | appendcols [| makeresults count=24 | streamstats count as Timeline | eval Timeline=if(Timeline&lt;10, "0".Timeline.":00", Timeline.":00") | table Timeline] | streamstats first(Date) as Date, first(Shift) as Shift | tail 6 | sort Timeline | append [| inputlookup handover_timeline.csv | dedup Shift Date | search Shift="Night" | appendcols [| makeresults count=24 | streamstats count as Timeline | eval Timeline=if(Timeline&lt;10, "0".Timeline.":00", Timeline.":00") | table Timeline] | streamstats first(Date) as Date, first(Shift) as Shift | head 6 ] | fields Date Shift Timeline "Hourly details of shift"</query>
    <done>
      <set token="SID1">$job.sid$</set>
    </done>
  </search>
  <search id="Day">
    <query>| inputlookup handover_timeline.csv | dedup Shift Date | search Shift=Day | appendcols [| makeresults count=24 | streamstats count as Timeline | eval Timeline=if(Timeline&lt;10, "0".Timeline.":00", Timeline.":00") | table Timeline] | streamstats first(Date) as Date, first(Shift) as Shift | streamstats count as row_number | eventstats max(row_number) as total_rows | where row_number &gt; 6 AND row_number &lt;= total_rows - 6 | fields - row_number, total_rows</query>
    <done>
      <set token="SID2">$job.sid$</set>
    </done>
  </search>
  <search>
    <query>| makeresults | eval token="$date_tok$" | eval earliest=if(token="today", relative_time(now(), "@d"), strptime(token, "%d/%m/%Y")) | eval latest=if(token="today", now(), earliest + 86400) | table earliest, latest</query>
    <finalized>
      <set token="earliest_tok">$result.earliest$</set>
      <set token="latest_tok">$result.latest$</set>
    </finalized>
    <earliest>-7d@d</earliest>
    <latest>now</latest>
    <refresh>300</refresh>
    <refreshType>delay</refreshType>
  </search>
  <fieldset submitButton="false">
    <input type="dropdown" token="date_tok" searchWhenChanged="true">
      <label>Date:</label>
      <fieldForLabel>Date</fieldForLabel>
      <fieldForValue>Date</fieldForValue>
      <search>
        <query>| makeresults | timechart span=1d count | sort - _time | eval Date=strftime(_time, "%d/%m/%Y"), earliest=relative_time(_time, "@d") | table Date, earliest | tail 7 | sort - earliest</query>
        <earliest>-7d@h</earliest>
        <latest>now</latest>
      </search>
      <choice value="today">Today</choice>
      <initialValue>today</initialValue>
      <default>today</default>
    </input>
    <input type="dropdown" token="shift_tok" searchWhenChanged="true">
      <label>Shift:</label>
      <choice value="Day">Day</choice>
      <choice value="Night">Night</choice>
      <default>Day</default>
      <initialValue>Day</initialValue>
      <change>
        <condition match="$value$ == 'Day'">
          <set token="selected_shift">$SID1$</set>
        </condition>
        <condition match="$value$ == 'Night'">
          <set token="selected_shift">$SID2$</set>
        </condition>
      </change>
    </input>
  </fieldset>
  <row>
    <panel>
      <html>
        NOTES: The data shown corresponds to the start of the shift, which is 6:45 AM for the Day shift and 6:45 PM for the Night shift.
      </html>
    </panel>
  </row>
  <row>
    <panel id="flf">
      <title>FLF</title>
      <single>
        <search>
          <query>| inputlookup daily_ticket_count.csv | eval today = strftime(now(), "%d/%m/%Y") | eval Date = if(Date == today, "today", Date) | search Shift="$shift_tok$" Date="$date_tok$" | where isnotnull(FLF_perc) | head 1 | fields FLF_perc</query>
          <earliest>$earliest_tok$</earliest>
          <latest>$latest_tok$</latest>
        </search>
        <option name="drilldown">none</option>
        <option name="height">75</option>
        <option name="numberPrecision">0.00</option>
        <option name="rangeColors">["0xd93f3c","0x65a637"]</option>
        <option name="rangeValues">[80]</option>
        <option name="refresh.display">none</option>
        <option name="unit">%</option>
        <option name="unitPosition">after</option>
        <option name="useColors">1</option>
      </single>
    </panel>
    <panel>
      <title>Ticket Count</title>
      <table>
        <search>
          <query>| inputlookup daily_ticket_count.csv | eval today = strftime(now(), "%d/%m/%Y") | eval Date = if(Date == today, "today", Date) | search Shift="$shift_tok$" Date="$date_tok$" type IN ("Request", "Incident") | fields - FLF_perc | head 2</query>
          <earliest>$earliest_tok$</earliest>
          <latest>$latest_tok$</latest>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </table>
    </panel>
  </row>
  <row>
    <panel>
      <title>Timeline</title>
      <table>
        <title>$shift_tok$</title>
        <search>
          <query>| loadjob $selected_shift$ | table Date Shift Timeline "Hourly details of shift"</query>
        </search>
        <option name="count">13</option>
        <option name="drilldown">none</option>
      </table>
    </panel>
  </row>
</form>

Now getting this message for the Timeline panel: "Search is waiting for input." Full XML above, if someone can spot any errors.
Try like this:

<input type="dropdown" token="shift_tok" searchWhenChanged="true">
  <label>Shift:</label>
  <choice value="Day">Day</choice>
  <choice value="Night">Night</choice>
  <default>Day</default>
  <initialValue>Day</initialValue>
  <change>
    <condition match="$value$ == 'Day'">
      <set token="selected_shift">$SID1$</set>
    </condition>
    <condition match="$value$ == 'Night'">
      <set token="selected_shift">$SID2$</set>
    </condition>
  </change>
</input>

<row>
  <panel>
    <title>Timeline</title>
    <table>
      <title>$shift_tok$</title>
      <search>
        <query>| loadjob $selected_shift$ | table Date Shift Timeline "Hourly details of shift"</query>
      </search>
      <option name="count">13</option>
      <option name="drilldown">none</option>
    </table>
  </panel>
</row>
Thanks @PickleRick for the detailed explanation! It's very helpful.
Hello, I'm struggling to make a base search that uses a data model with the tstats command. My objective is to make the dashboard fast by running tstats against data models, and to chain a post-process search off that base for each panel. This is my sample:

| tstats summariesonly=true
    values(Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.dest) as dest
    values(Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.hostname) as hostname
    values(Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.os_type) as os_type
    values(Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.exploit_title) as exploit_title
    values(Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.malware_title) as malware_title
    from datamodel=Vulnerabilities_Custom.Vulnerabilities_Non_Remediation
    where nodename IN ("Vulnerabilities_Custom.Vulnerabilities_Non_Remediation", "Vulnerabilities_Custom.High_Or_Critical_Vulnerabilities_Non_Remediation", "Vulnerabilities_Custom.Medium_Vulnerabilities_Non_Remediation", "Vulnerabilities_Custom.Low_Or_Informational_Vulnerabilities_Non_Remediation")
    by Vulnerabilities_Custom.Vulnerabilities_Non_Remediation._time, Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.dest
| table event_time dest hostname os_type exploit_title malware_title

Does anyone have clues about this?
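One thing to note: after tstats, only the aggregated and by-clause fields exist, and the by-clause ones keep the full data-model prefix, so a plain "| table event_time dest ..." comes up empty. A common pattern (a sketch only - the base-search id, panel search, and shortened field list are illustrative, not tested against this data model) is to strip the prefix inside the base search so each panel's chained search can use short field names:

<search id="vuln_base">
  <query>| tstats summariesonly=true count from datamodel=Vulnerabilities_Custom.Vulnerabilities_Non_Remediation by Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.dest, Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.hostname
| rename "Vulnerabilities_Custom.Vulnerabilities_Non_Remediation.*" as *</query>
</search>

<search base="vuln_base">
  <query>| table dest hostname</query>
</search>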
How can I regularly hit an HTTP endpoint on a remote server to collect useful metrics, then import them into Splunk (hourly, for example) and use them for visualisations?
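One low-friction approach is a scripted input on a forwarder: a script fetches the endpoint, prints the result to stdout, and splunkd runs it on an interval and indexes the output. A minimal sketch - the app name, URL, and index are placeholders:

inputs.conf:

[script://$SPLUNK_HOME/etc/apps/my_metrics_app/bin/poll_metrics.py]
interval = 3600
index = main
sourcetype = remote:metrics
disabled = 0

bin/poll_metrics.py:

#!/usr/bin/env python3
# Scripted input: fetch a metrics endpoint once per invocation and emit
# one JSON event on stdout; splunkd indexes whatever the script prints.
import json
import time
import urllib.request

URL = "https://remote.example.com/metrics"  # placeholder endpoint

def main():
    with urllib.request.urlopen(URL, timeout=30) as resp:
        body = resp.read().decode("utf-8", errors="replace")
    print(json.dumps({"time": int(time.time()), "endpoint": URL, "payload": body}))

if __name__ == "__main__":
    main()

From there the events can be charted like any other indexed data; alternatives include a modular input or having the remote side push to a HEC endpoint.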
Thanks for getting back to me. This is what I've done:

- base searches:

<search id="Night">
  <query>...</query>
  <done>
    <set token="SID1">$job.sid$</set>
  </done>
</search>
<search id="Day">
  <query>...</query>
  <done>
    <set token="SID2">$job.sid$</set>
  </done>
</search>

- dropdown input:

<input type="dropdown" token="shift_tok" searchWhenChanged="true">
  <label>Shift:</label>
  <choice value="Day">Day</choice>
  <choice value="Night">Night</choice>
  <default>Day</default>
  <initialValue>Day</initialValue>
  <change>
    <condition match="$value$ == 'Day'">
      <set token="selected_shift">Day</set>
    </condition>
    <condition match="$value$ == 'Night'">
      <set token="selected_shift">Night</set>
    </condition>
  </change>
</input>

- panel:

<row>
  <panel>
    <title>Timeline</title>
    <table>
      <title>$shift_tok$ $selected_shift$</title>
      <search base="$selected_shift$">
        <query>| table Date Shift Timeline "Hourly details of shift"</query>
      </search>
      <option name="count">13</option>
      <option name="drilldown">none</option>
    </table>
  </panel>
</row>

The $selected_shift$ token doesn't seem to be working properly - any idea? Thanks.
Is this on a per-profile basis? A per-cluster basis? And how does this restart back?
Assuming it is not simply a typo and case does matter (Shift_tok is not the same as shift_tok), you could try setting a different token in the done handler of each of your two base searches with the job.sid, then use the change handler of the dropdown to copy the relevant SID token value into a token which you use in your search with the loadjob command.
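In skeleton form (token names here are illustrative):

<search id="Night">
  <query>...</query>
  <done>
    <set token="night_sid">$job.sid$</set>
  </done>
</search>

<change>
  <condition match="$value$ == 'Night'">
    <set token="selected_sid">$night_sid$</set>
  </condition>
</change>

<query>| loadjob $selected_sid$ | table Date Shift Timeline "Hourly details of shift"</query>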
I've got 2 base searches:

<search id="Night">

and

<search id="Day">

And a dropdown input:

<input type="dropdown" token="shift_tok" searchWhenChanged="true">
  <label>Shift:</label>
  <choice value="Day">Day</choice>
  <choice value="Night">Night</choice>
  <default>Day</default>
  <initialValue>Day</initialValue>
</input>

I need to find a way to reference the base searches, depending on the input provided by the user. I was hoping to use a token to reference the base searches, but it doesn't seem to be working:

<row>
  <panel>
    <title>Timeline</title>
    <table>
      <title>$shift_tok$</title>
      <search base="$Shift_tok$">
        <query>| table Date Shift Timeline "Hourly details of shift"</query>
      </search>
      <option name="count">13</option>
      <option name="drilldown">none</option>
    </table>
  </panel>
</row>
</form>
Theoretically, you could set up forwarding logs to a HEC endpoint by defining a proper destination and message template. But.

As Palo themselves write on their docs page - "Log forwarding to an HTTP server is designed for log forwarding at low frequencies and is not recommend for deployments with a high volume of log forwarding. You may experience log loss when forwarding to an HTTP server if your deployment generate a high volume of logs that need to be forwarded."

Which actually makes sense, since it seems Palo will emit a separate HTTP request for each event, which might flood your receiver in case of - for example - traffic logs (and I'm not sure how well it does with keepalives and reusing connections). It doesn't seem to be able to aggregate multiple events into a single request.

So it is indeed a typical approach to send events via syslog to any reasonably modern syslog daemon (either rsyslog or syslog-ng) and handle it there. They can either write them to a file which will be picked up by a UF, or aggregate them (at least rsyslog can; I'm not that good with syslog-ng but I suppose it does as well) into bigger batches and send a much lower number of requests to the destination HEC endpoint (like a single HTTP request for every 100 events to send). Of course you have much more flexibility in processing data in-transit if you use an intermediate syslog daemon.
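For the rsyslog route, a rough sketch using its omhttp output module - the hostname and HEC token are placeholders, and the parameter names should be checked against your rsyslog version:

module(load="omhttp")

# Wrap each syslog message in a minimal HEC event payload.
template(name="hec_event" type="string"
         string="{\"sourcetype\":\"pan:log\",\"event\":\"%msg:::json%\"}")

# Batch up to 100 events per HTTPS request to the HEC endpoint.
action(
    type="omhttp"
    server="splunk.example.com"
    serverport="8088"
    restpath="services/collector/event"
    usehttps="on"
    httpheaderkey="Authorization"
    httpheadervalue="Splunk 00000000-0000-0000-0000-000000000000"
    template="hec_event"
    batch="on"
    batch.format="newline"
    batch.maxsize="100"
)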
Thanks!! Yes, something like x axis = time and y axis = the count column as a bar chart, plus a line chart of processing_time - that looks correct.

However, I think the processing_time calculation is a bit tricky. How can I calculate the processing_time for the "Done Bulk saving messages" line at 21:13:12,825 below? It should be taken from the 2nd previous "All Read threads finished flush" line, which is at 21:13:12,528, giving a time difference of 297 ms. I can't use transaction, since there are also many other "All Read threads finished flush" lines in between two "Done Bulk saving messages" lines.

2024-08-07 21:13:12,007 [39] INFO DistributorCommon.DBHandlerBase [(null)] - Done Bulk saving messages, Count=1, used 113 ms
2024-08-07 21:13:12,007 [15] INFO DistributorCommon.WMQClient [(null)] - No msg in the queue (NoMessageCounter=8), retry in 10 ms.
2024-08-07 21:13:12,054 [39] INFO DistributorCommon.WMQClient [(null)] - Saved messages to DB, Q Manager to Commit (Remove messages from Queue)
2024-08-07 21:13:12,132 [15] INFO DistributorCommon.WMQClient [(null)] - No msg in the queue (NoMessageCounter=9), retry in 10 ms.
2024-08-07 21:13:12,179 [39] INFO DistributorCommon.WMQClient [(null)] - Clear Write Buffer
2024-08-07 21:13:12,257 [39] INFO DistributorCommon.WMQClient [(null)] - All Read threads finished flush the messages, total messages: 0
2024-08-07 21:13:12,398 [39] INFO DistributorCommon.WMQClient [(null)] - All Read threads finished flush the messages, total messages: 0
2024-08-07 21:13:12,528 [39] INFO DistributorCommon.WMQClient [(null)] - All Read threads finished flush the messages, total messages: 0
2024-08-07 21:13:12,778 [33] INFO DistributorCommon.WMQClient [(null)] - Message Read from Queue, Message Length:4668
2024-08-07 21:13:12,809 [39] INFO DistributorCommon.WMQClient [(null)] - All Read threads finished flush the messages, total messages: 1
2024-08-07 21:13:12,809 [39] INFO DistributorCommon.WMQClient [(null)] - Processing messages, Count=1
2024-08-07 21:13:12,809 [39] INFO DistributorCommon.WMQClient [(null)] - Done Processing messages, Count=1, IsBufferedEvent=True
2024-08-07 21:13:12,809 [39] INFO DistributorCommon.DBHandlerBase [(null)] - Bulk saving messages, Count=1
2024-08-07 21:13:12,825 [39] INFO DistributorCommon.DBHandlerBase [(null)] - Done Bulk saving messages, Count=1, used 24 ms
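One way to express "2nd previous flush" without transaction is two streamstats passes - a sketch, assuming the millisecond timestamps are parsed into _time (index and sourcetype are placeholders):

index=your_index sourcetype=your_sourcetype ("Done Bulk saving messages" OR "All Read threads finished flush")
| sort 0 _time
| eval type=if(searchmatch("Done Bulk saving messages"), "done", "flush")
| eval flush_time=if(type="flush", _time, null())
``` for every event, the most recent flush time strictly before it ```
| streamstats current=f last(flush_time) as prev_flush
``` on a flush event, prev_flush is the flush before it; carry that forward ```
| eval prev_prev=if(type="flush", prev_flush, null())
| streamstats current=f last(prev_prev) as second_prev_flush
| where type="done"
| eval processing_time_ms=round((_time - second_prev_flush) * 1000)
| table _time processing_time_ms

For the "done" event at 21:13:12,825 this yields 12,825 - 12,528 = 297 ms, matching the example.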
Hi, I'm unable to launch the Splunk Add-on for AWS page in the admin console; the page shows as Loading but no output at all. No abnormalities seen in splunkd.log, only some checksum mismatch errors. My Splunk was recently upgraded to 9.2.2; the last time I tried on an earlier version it was working. The Splunk Add-on for AWS version is 5.1.0. Can I check if anyone has come across the same issue and managed to resolve it?
I'm not sure you understand the macros correctly. If you define a macro with two parameters, paramA and paramB, it will get substituted in your search with whatever values you specify for them. These are separate layers.
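For example, a hypothetical two-parameter macro in macros.conf:

[filter_by_host(2)]
args = paramA, paramB
definition = index=$paramA$ host=$paramB$

Calling `filter_by_host(_internal, myhost)` in a search then expands to index=_internal host=myhost before the search runs.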
OK. That actually makes sense. I'm no AD expert, but indeed, as far as I remember, you cannot use local accounts on domain controllers - all "local" accounts are indeed domain accounts. If this is not described in the forwarder installation manual, it could be worth posting feedback (there is a feedback form at the bottom of every doc page).
You simply have to use the variable in whatever way is appropriate for the programming/scripting solution you're using - powershell, python, whatever. One important thing though - %USERPROFILE% in case of a scripted input run by splunkd.exe will at best point to the Splunk Forwarder's technical user's profile. Is that what you want? Why not simply use the forwarder's SPLUNK_HOME variable then?
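For instance, in a Python scripted input you could read the forwarder's own variable (the fallback path here is illustrative):

import os

# splunkd exports SPLUNK_HOME into the environment of scripted inputs it launches
splunk_home = os.environ.get("SPLUNK_HOME", r"C:\Program Files\SplunkUniversalForwarder")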
No. It's not about the client certificate. I understand that the FIN/ACK packet comes from your end of the connection. And the message clearly indicates that it's the server's certificate which is not trusted. I asked about on-prem vs. cloud earlier because the additional question with an on-prem installation is whether you are using any TLS-inspection tools in your network, either as an explicit proxy or as a pass-through appliance. Anyway, the first thing I'd try would be to simply openssl s_client to that Cisco service and make sure what the cert looks like before you start digging through the local trusted cert store.
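Something like this (substitute the actual host and port of the Cisco service):

openssl s_client -connect cisco-service.example.com:443 -servername cisco-service.example.com -showcerts

This prints the certificate chain the server actually presents, which you can then compare against what your trust store contains.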
Finally, after 3 weeks, the problem is solved and I was sent a new 6-month 10GB/day developer license.

In the meantime I have:

- tried to apply for renewal on the web site 3 additional times
- sent 3 emails to devinfo@splunk.com
- got in touch with a sales guy

The sales guy (Hubert) finally sent me a temp 45-day 10GB/day license to bridge the gap and promised to find out what happened. In addition he mentioned that possibly developer licenses are stopped and no longer available. But only 2 days later a very kind lady (Rinita) from Splunk sent me an email, with a copy to another license guy, apologizing for the delay. And finally the license was sent last night without any further comment.

So, whatever might happen to you guys when experiencing similar things: never give up :-), finally you will find someone able and willing to assist.

That's great, guys!
Ekkehard
Hi, according to the log, our customer has exactly the same error. I will try to convince them to uninstall the UF completely and install it once again. Their first install ended with:

SetAccountType: Error 0x80004005: Cannot set USE_ADMIN_USER=1 since the local users/groups are not available on Domain Controllers.

Maybe this installation attempt caused the failure later on...
Installation by a domain admin with the current UF fails on the domain controller with error 1603, and there are additional log lines that may be useful:

MSI (s) (F0:34) [08:38:07:344]: Note: 1: 1708
MSI (s) (F0:34) [08:38:07:344]: Note: 1: 2205 2: 3: Error
MSI (s) (F0:34) [08:38:07:344]: Note: 1: 2228 2: 3: Error 4: SELECT `Message` FROM `Error` WHERE `Error` = 1708
MSI (s) (F0:34) [08:38:07:344]: Note: 1: 2205 2: 3: Error
MSI (s) (F0:34) [08:38:07:344]: Note: 1: 2228 2: 3: Error 4: SELECT `Message` FROM `Error` WHERE `Error` = 1709
MSI (s) (F0:34) [08:38:07:344]: Product: UniversalForwarder -- Installation failed.
MSI (s) (F0:34) [08:38:07:344]: The product was installed by Windows Installer. Product name: UniversalForwarder. Product version: 9.3.0.0. Product language: 1033. Manufacturer: Splunk, Inc. Installation success or failure status: 1603.

I thought I would attach the complete log for examination; unfortunately that is not possible here. Up to the lines shown, everything runs fine, with no hint of missing permissions etc. Any help is appreciated - our project depends on it.
Use macro params to pass these tokens. Here is an example:

Name:       non-alphabetic-token(2)
Definition: index=_internal earliest=$earliest_tok$ latest=$latest_tok$
Arguments:  earliest_tok, latest_tok

<form version="1.1" theme="light">
  <label>Non-alphabetic tokens</label>
  <description>https://community.splunk.com/t5/Splunk-Search/How-to-include-arguments-in-search-macros-with-non-alphanumeric/m-p/696333#M236667</description>
  <fieldset submitButton="false">
    <input type="time" token="timepicker" searchWhenChanged="true">
      <label>pick time</label>
      <default>
        <earliest>-15m</earliest>
        <latest>now</latest>
      </default>
    </input>
  </fieldset>
  <row>
    <panel>
      <title>$timepicker.earliest$</title>
      <table>
        <search>
          <query>`non-alphabetic-token($timepicker.earliest$, $timepicker.latest$)`
| addinfo
| stats count by info_min_time info_max_time
| foreach info_* [eval &lt;&lt;FIELD&gt;&gt; = strftime(&lt;&lt;FIELD&gt;&gt;, "%F %T")]</query>
          <earliest>-24h@h</earliest>
          <latest>now</latest>
        </search>
        <option name="drilldown">none</option>
        <option name="refresh.display">progressbar</option>
      </table>
    </panel>
  </row>
</form>

Here are some sample interactions (screenshots omitted).