All Posts



I found out that the message about duplicated configuration comes from ES (Enterprise Security). This check runs every 10 minutes.

apps/SplunkEnterpriseSecuritySuite/bin/configuration_checks/confcheck_es_correlationmigration.py:
MSG_DUPLICATED_STANZA = 'Configuration file settings can be duplicated in multiple applications: stanza="%s" conf_type="%s" apps="%s"'

I tested the above scenario on a clean Splunk Enterprise without ES, and the behavior matches the documentation, but btool does not.

[splunk@siemsearch01 apps]$ cat ATest_app/default/savedsearches.conf
[ss]
search = `super_macro` atest

[splunk@siemsearch01 apps]$ cat Test_app/default/savedsearches.conf
[ss]
search = `super_macro` test
request.ui_dispatch_app = search
disabled = 0
alert.track = 0
cron_schedule = */2 * * * *
dispatch.earliest_time = -4m
dispatch.latest_time = -2m
enableSched = 1

[splunk@siemsearch01 apps]$ cat ZTest_app/default/savedsearches.conf
[ss]
search = `super_macro` ztest

[splunk@siemsearch01 apps]$ /opt/splunk/bin/splunk btool savedsearches list --debug ss | grep "search ="
/opt/splunk/etc/apps/ATest_app/default/savedsearches.conf search = `super_macro` atest

btool returns: search = `super_macro` atest
GUI returns: index=ztest ztest, i.e. `super_macro` ztest
I am using the Splunk Enterprise trial version, and I am able to see live events coming from Palo Alto to Splunk. But when I select the last 30 minutes in Splunk, it shows nothing; when I select All Time, it shows the latest events as well. I am assuming this can be due to a timezone issue. I tried to change the Palo Alto time zone, but that also didn't work. As per your solution using props.conf, can you please help me with what needs to change there? My Palo Alto time zone is US/Pacific. I am completely new to Splunk. @richgalloway
What version of the app are you using?  Does the vulnerability tool report a CVE?  What is it?
Your settings look good to me. The first problem may be with the documentation.  Submit feedback on the docs page telling them that btool doesn't match the documentation and they should update the docs. I'm not sure what can be done about the second problem other than ignoring it.
Splunk Cloud operates in the UTC time zone. Data could come in from any of 23+ other time zones, so trying to get them all to match is futile. The correct approach is to tell Splunk what time zone the data is from and let Splunk adjust it to the system time. Do that using props.conf. The best method depends on the data itself; see the Admin Manual's description of the TZ setting for more information.

The algorithm for determining the time zone for a particular event is as follows:
* If the event has a timezone in its raw text (for example, UTC, -08:00), use that.
* If TZ is set to a valid timezone string, use that.
* If the event was forwarded, and the forwarder-indexer connection uses the version 6.0 and higher forwarding protocol, use the timezone provided by the forwarder.
* Otherwise, use the timezone of the system that is running splunkd.
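A minimal props.conf sketch of that TZ approach. The stanza name pan:traffic is an assumption for illustration; use the sourcetype your data actually has, and place the file on the first full Splunk instance that parses the data (heavy forwarder or indexer):

# props.conf -- declare that this sourcetype's timestamps are US/Pacific
[pan:traffic]
TZ = US/Pacific

This only affects events indexed after the change; already-indexed events keep their original _time.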
In a word: nothing. Ingestion will not stop.  Unless you're grossly over the contracted amount, Splunk is unlikely to contact you. At license renewal time, however, Splunk may recommend a higher limit (at higher cost, no doubt).
Configuration page failed to load: "Something went wrong! Unable to xml-parse the following data: %s"

I have installed the updated Splunk Add-on for Microsoft Cloud Services on a Splunk Enterprise free trial, but I am getting this error during configuration. Your response will help to resolve this issue.
_time is set for the event when it is ingested. If you change the way the event is ingested, then perhaps you could use a different part of the event for the timestamp. However, this would only apply going forward; it would not re-index the existing events. One possibility is to copy the events (using the collect command) to another index after resetting the _time field to the value you want.
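A sketch of that collect approach. The index names and sourcetype here are assumptions; _time is rebuilt from sys_created before the copy:

index=old_index sourcetype="snow:incident"
| eval _time = strptime(sys_created, "%Y-%m-%d %H:%M:%S")
| collect index=new_index

Note that the target index must already exist, and collect writes the copies with sourcetype "stash" by default.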
I am working with ServiceNow logs in Splunk. The tickets data has one field called "sys_created"; this field gives the ticket created time in "%Y-%m-%d %H:%M:%S" format. When I run the query for the last 7 days, tickets that were raised before then also appear because of another field called sys_updated. This sys_updated field stores all updates to the tickets, so if an old ticket is updated within the last 7 days, it appears when I set the time range picker to last 7 days. Is there a way to treat "sys_created" as "_time"?
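A search-time sketch of what the question asks, without re-indexing: filter on sys_created directly instead of on _time (the index and sourcetype are assumptions; run it over a _time range wide enough to include the old tickets):

index=snow sourcetype="snow:incident"
| eval created_epoch = strptime(sys_created, "%Y-%m-%d %H:%M:%S")
| where created_epoch >= relative_time(now(), "-7d@d")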
Hi, I am testing the Security Essentials App 3.8.0 in Splunk 9.0.8, and I found the same issue while trying to activate the following contents:
* Unknown Process Using The Kerberos Protocol
* Windows Steal or Forge Kerberos Tickets Klist
* ServicePrincipalNames Discovery with SetSPN
* Rubeus Command Line Parameters
* Mimikatz PassTheTicket CommandLine Parameters

In all cases above, I get two errors:
1. "Must have data in data model Endpoint.Processes" is in red even though I have installed several add-ons suggested as compatible, such as Splunk Add-on for Microsoft Windows 8.9.0 and Palo Alto Networks Add-on for Splunk 8.1.1.
2. Error in 'SearchParser': The search specifies a macro 'summariesonly_config' that cannot be found.

I searched for that missing macro and indeed it does not exist. Should I create it manually? With which value? Do you have any idea how to fix those two errors? Many thanks
Since Office 365 connectors in Microsoft Teams will be retired, has anyone succeeded in transitioning from Office 365 connectors to Workflows in the Splunk Enterprise solution? Could anyone point me to documentation for doing this, or to a workflow template that works with Splunk Enterprise?
Is this case Splunk Enterprise or Splunk Cloud?
Try this over the last 30 days:

index=* | timechart span=1d count by sourcetype
I used

[WinEventLog://Security]
checkpointInterval = 5
current_only = 0
disabled = 0
blacklist1 = EventCode="4688" Message="New Process Name: (?i)(?:[C-F]:\\Program Files\\Splunk(?:UniversalForwarder)?\\bin\\(?:btool|splunkd|splunk|splunk-(?:MonitorNoHandle|admon|netmon|perfmon|powershell|regmon|winevtlog|winhostinfo|winprintmon|wmi)).exe)"
start_from = oldest

but it stops all 4688 events, i.e. it does not pass through the other 4688 events either.
 
What do you get if you set the timeframe for that search to be last 30 days?
 
Check your index to see when data was last entered:

| metadata type=sourcetypes index=test
| fieldformat recentTime=strftime(recentTime,"%F %T")
| fieldformat firstTime=strftime(firstTime,"%F %T")
| fieldformat lastTime=strftime(lastTime,"%F %T")
I have my reasons. I don't want to impose changes in local. I need to use the original add-on and add my correctly named add-on to it, which would override the search = parameter in the original one.

The original add-on is Splunk_TA_openldap, default/savedsearches.conf:

[Update openldap_user_lookup KV Store collection]
request.ui_dispatch_app = search
disabled = 0
alert.track = 0
cron_schedule = */2 * * * *
dispatch.earliest_time = -4m
dispatch.latest_time = -2m
enableSched = 1
search = sourcetype="openldap:access" operation="BIND" | dedup conn cn | table conn op cn | rename cn as user | lookup openldap_user_lookup conn, op OUTPUTNEW _key AS _key | outputlookup append=t openldap_user_lookup

My addition is A10_aaa_ta_openldap, default/savedsearches.conf:

[Update openldap_user_lookup KV Store collection]
search = `openldap_index` sourcetype="openldap:access" operation="BIND" | dedup conn cn | table conn op cn | rename cn as user | lookup openldap_user_lookup conn, op OUTPUTNEW _key AS _key | outputlookup append=t openldap_user_lookup

I know btool and I am using it. There are more problems. One is that, according to btool, savedsearches.conf precedence does not behave as documented, i.e. app/user context with reverse-lexicographic order. The second is that Splunk reports a problem with duplicate configuration. So far I haven't found anything in the documentation saying that savedsearches.conf should behave differently from, for example, macros, props, etc.
Hi All,

We have created a dashboard to monitor CCTV, and it was working fine. However, data suddenly stopped populating. We have not made any changes.

My findings:
1 - If I select last 30 days, the dashboard works fine.
2 - If I select a time range of last 20 days, the dashboard does not work.
3 - I started troubleshooting the issue and found the following.

The SPL query below works fine when the time range is last 30 days:

index=test sourcetype="stream" NOT upsModel=*1234*
| rename Device AS "UPS Name"
| rename Model AS "UPS Model"
| rename MinRemaining AS "Runtime Remaining"
| replace 3 WITH Utility, 4 WITH Bypass IN "Input Source"
| sort "Runtime Remaining"
| dedup "UPS Name"
| table "UPS Name" "UPS Model" "Runtime Remaining" "Source" "Location"

Note: the same SPL query doesn't work when the time range is last 20 days.

Troubleshooting: Splunk is still receiving data to date; however, I have noticed a few things. When I select last 30 days, I can see these fields in the search: UPS Name, UPS Model, Runtime Remaining, Source. When I select last 20 days, those same fields are missing, and I am not sure why. Because of the missing fields, the dedup and table part of the query above does not pull any data, so the query returns nothing.

Thanks