All Posts



Configuration page failed to load: "Something went wrong! Unable to xml-parse the following data: %s". I have installed the updated Splunk Add-on for Microsoft Cloud Services on a Splunk Enterprise free trial, but I am getting this error during configuration. Your response will help resolve this issue.
_time is set for the event when it is ingested. If you change the way the event is ingested, then perhaps you could use a different part of the event for the timestamp. However, this would only apply going forward; it would not re-index the existing events. One possibility is that you could copy the events (using the collect command) to another index, having reset the _time field to the value you want.
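A rough sketch of that collect approach, assuming a sys_created field in "%Y-%m-%d %H:%M:%S" format; the source index/sourcetype and the destination index tickets_summary are placeholders, and the destination index must already exist:

```spl
index=snow sourcetype="snow:incident"
| eval _time=strptime(sys_created, "%Y-%m-%d %H:%M:%S")
| collect index=tickets_summary
```

After this, searching index=tickets_summary with the time range picker would filter on the ticket creation time rather than the original ingestion time.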
I am working with ServiceNow logs in Splunk. The ticket data has a field called "sys_created" which gives the ticket creation time in "%Y-%m-%d %H:%M:%S" format. When I run the query for the last 7 days, tickets that were raised more than 7 days ago also appear because of another field called sys_updated. The sys_updated field stores all updates to the tickets, so if an old ticket was updated within the last 7 days, it shows up when I set the time range picker to last 7 days. Is there a way to use "sys_created" as "_time"?
Hi, I am testing the Security Essentials App 3.8.0 on Splunk 9.0.8, and I found the same issue while trying to activate the following content:
- Unknown Process Using The Kerberos Protocol
- Windows Steal or Forge Kerberos Tickets Klist
- ServicePrincipalNames Discovery with SetSPN
- Rubeus Command Line Parameters
- Mimikatz PassTheTicket CommandLine Parameters
In all cases above, I get two errors:
1. "Must have data in data model Endpoint.Processes" is in red, even though I have installed several add-ons suggested as compatible, such as Splunk Add-on for Microsoft Windows 8.9.0 and Palo Alto Networks Add-on for Splunk 8.1.1.
2. Error in 'SearchParser': The search specifies a macro 'summariesonly_config' that cannot be found.
I searched for that missing macro and indeed it does not exist. Should I create it manually? With which value? Do you have any idea how to fix those two errors? Many thanks
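If the macro really is missing, one possible workaround (an assumption on my part, not a documented fix; the definition mirrors what similar summariesonly-style macros in Splunk security content typically expand to, using the real tstats flags summariesonly and allow_old_summaries) would be to define it locally, e.g. in $SPLUNK_HOME/etc/apps/search/local/macros.conf:

```ini
[summariesonly_config]
definition = summariesonly=false allow_old_summaries=true
iseval = 0
```

Setting summariesonly=false lets the searches fall back to raw events while the Endpoint.Processes data model acceleration is still being populated.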
Since Office 365 connectors in Microsoft Teams will be retired: has anyone successfully transitioned from Office 365 connectors to Workflows with Splunk Enterprise? Could anyone point me to documentation for doing this, or a workflow template that works with Splunk Enterprise?
In this case, is it Splunk Enterprise or Splunk Cloud?
Try this over the last 30 days:
index=* | timechart span=1d count by sourcetype
I used:
[WinEventLog://Security]
checkpointInterval = 5
current_only = 0
disabled = 0
blacklist1 = EventCode="4688" Message="New Process Name: (?i)(?:[C-F]:\Program Files\Splunk(?:UniversalForwarder)?\bin\(?:btool|splunkd|splunk|splunk-(?:MonitorNoHandle|admon|netmon|perfmon|powershell|regmon|winevtlog|winhostinfo|winprintmon|wmi)).exe)"
start_from = oldest
but it stops all 4688 events, not just the Splunk ones; the other 4688 information is not processed either.
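One thing worth checking (a guess from the snippet, not a confirmed diagnosis): in a regular expression a literal backslash must be escaped as \\, and unescaped sequences like \b mean something else entirely (a word boundary), so the pattern as written may match far more 4688 events than intended. A sketch with the path separators and the .exe dot escaped:

```ini
blacklist1 = EventCode="4688" Message="New Process Name: (?i)(?:[C-F]:\\Program Files\\Splunk(?:UniversalForwarder)?\\bin\\(?:btool|splunkd|splunk|splunk-(?:MonitorNoHandle|admon|netmon|perfmon|powershell|regmon|winevtlog|winhostinfo|winprintmon|wmi))\.exe)"
```

You can verify what the blacklist actually matches by testing the regex against a sample 4688 Message value before deploying it.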
 
What do you get if you set the timeframe for that search to the last 30 days?
 
Check your index to see when data was last entered:
| metadata type=sourcetypes index=test
| fieldformat recentTime=strftime(recentTime,"%F %T")
| fieldformat firstTime=strftime(firstTime,"%F %T")
| fieldformat lastTime=strftime(lastTime,"%F %T")
I have my reasons. I don't want to impose changes in local. I need to use the original add-on and add my correctly named add-on to it, which would override the search= parameter in the original one.

The original add-on is Splunk_TA_openldap, default/savedsearches.conf:
[Update openldap_user_lookup KV Store collection]
request.ui_dispatch_app = search
disabled = 0
alert.track = 0
cron_schedule = */2 * * * *
dispatch.earliest_time = -4m
dispatch.latest_time = -2m
enableSched = 1
search = sourcetype="openldap:access" operation="BIND" | dedup conn cn | table conn op cn | rename cn as user | lookup openldap_user_lookup conn, op OUTPUTNEW _key AS _key | outputlookup append=t openldap_user_lookup

My overriding add-on is A10_aaa_ta_openldap, default/savedsearches.conf:
[Update openldap_user_lookup KV Store collection]
search = `openldap_index` sourcetype="openldap:access" operation="BIND" | dedup conn cn | table conn op cn | rename cn as user | lookup openldap_user_lookup conn, op OUTPUTNEW _key AS _key | outputlookup append=t openldap_user_lookup

I know btool and I am using it. There are more problems. One is that, according to btool, savedsearches.conf precedence does not behave as documented, i.e. app/user context with reverse lexicographic order. The second is that Splunk reports a problem with duplicate configuration. So far I haven't found anything in the documentation saying that savedsearches.conf should behave differently from, for example, macros, props, etc.
Hi All,
We have created a dashboard to monitor CCTV and it was working fine. However, data suddenly stopped populating. We have not made any changes.
My findings:
1 - If I select last 30 days, the dashboard works fine.
2 - If I select a time range of last 20 days, the dashboard does not work.
3 - I started troubleshooting the issue and found the following.
The SPL query below works fine when the time range is last 30 days:
index=test sourcetype="stream" NOT upsModel=*1234*
| rename Device AS "UPS Name"
| rename Model AS "UPS Model"
| rename MinRemaining AS "Runtime Remaining"
| replace 3 WITH Utility, 4 WITH Bypass IN "Input Source"
| sort "Runtime Remaining"
| dedup "UPS Name"
| table "UPS Name" "UPS Model" "Runtime Remaining" "Source" "Location"
The same SPL query does not work when the time range is last 20 days.
Troubleshooting: Splunk is receiving data to date; however, I have noticed a few things. When I select last 30 days, I can see these fields in the search: UPS Name, UPS Model, Runtime Remaining, Source. When I select last 20 days, those same fields are missing, and I am not sure why. Because of the missing fields, the query above returns no data.
Thanks
I have modified the Azure API links, replacing all of them with the Azure China API URLs, but I can only collect part of the data, not all of it.
apiStartTime and apiEndTime are not set when info=completed, but are set when info=granted - try something like this:
index=_audit action=search provenance=* info=granted host IN (...) (apiStartTime="ZERO_TIME" OR apiEndTime="ZERO_TIME")
| table user, apiStartTime, apiEndTime, search_et, search_lt, search
| convert ctime(search_*)
Hi DavidLi   I didn't realise that after a year you would still reply, thank you so much!
Hi Team,
I am using a free trial version of Splunk and forwarding logs from a Palo Alto firewall to Splunk. Sometimes I get logs, sometimes not. It seems to be a time zone issue. My Palo Alto firewall is in the US/Pacific time zone. How can I check the Splunk time zone, and how can I configure it to be the same on both sides? #splunktimeZone
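For what it's worth, Splunk can be told the sender's time zone per sourcetype or host via the TZ attribute in props.conf on the indexer or heavy forwarder. A minimal sketch, assuming the firewall data arrives with the sourcetype pan:traffic (adjust the stanza name to whatever your data actually uses):

```ini
# props.conf - interpret incoming timestamps as US/Pacific
[pan:traffic]
TZ = US/Pacific
```

You can then compare _time against the raw timestamp in the event to confirm events are no longer landing in the future or the past.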
Hi @Taruchit,
First, don't use the search command when you can put all the parameters in the main search. Then, I'd avoid using All Time in a search because you could have too many events; define a useful time range instead.
index=_audit action=search provenance=* info=completed host IN (...) (apiStartTime="ZERO_TIME" OR apiEndTime="ZERO_TIME")
| table user, apiStartTime, apiEndTime, search_et, search_lt, search
| convert ctime(search_*)
About the meaning of the results: they depend on the parameters you defined; probably with apiEndTime="ZERO_TIME" you don't have the apiStartTime field. Analyze your search and modify it to get the best results for you.
Ciao.
Giuseppe
| eventstats values(eval(if(status="Issue","Bad",null()))) as Health
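This snippet uses the eventstats values(eval(...)) pattern: if any event in the result set has status="Issue", every event gets Health="Bad"; if none do, Health stays empty. A self-contained way to try it (the status values here are made up for illustration):

```spl
| makeresults count=3
| streamstats count
| eval status=if(count=2, "Issue", "OK")
| eventstats values(eval(if(status="Issue","Bad",null()))) as Health
```

Because one of the three generated events has status="Issue", all three rows end up with Health="Bad".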