Find Answers
Ask questions. Get answers. Find technical product solutions from passionate members of the Splunk community.

All Topics

How do I distribute the default app to the SHC members if I want to make some changes to it?
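A minimal sketch of the usual approach, assuming the changed app is staged on the search head cluster deployer (the app name, member URI and credentials below are placeholders):

# On the deployer: stage the app, then push the configuration bundle to the SHC members
cp -r my_custom_app $SPLUNK_HOME/etc/shcluster/apps/
$SPLUNK_HOME/bin/splunk apply shcluster-bundle -target https://<any_member>:8089 -auth admin:<password>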
Hello - thank you for assisting in advance. I need to write a query which will pull client/server errors from the event message into a table format as shown below.

_time                 status_category   Error Code   error_count
2022-01-26:17:30:00   server error      503          2
2022-01-26:18:30:00   client error      404          6

Here are examples of the EventTypes and available fields for the index: Error 503       Fields
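A minimal sketch of one way to build that table, assuming the HTTP status code is extracted into a field called status (the index and sourcetype names are placeholders):

index=<your_index> sourcetype=<your_sourcetype>
| eval status_category=case(status>=500, "server error", status>=400, "client error")
| where isnotnull(status_category)
| bin _time span=1h
| stats count as error_count by _time status_category status
| rename status as "Error Code"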
Hi there, is it possible to search for Windows interactive logons from the Authentication data model? E.g. I can do it this way:

index=* source="*WinEventLog:Security" LogonType=2 OR LogonType=10 OR LogonType=11

And I'm looking for an equivalent way using a data model, e.g.:

| tstats summariesonly=true count from datamodel=Authentication by Authentication.action Authentication.app Authentication.dest Authentication.signature Authentication.src Authentication.src_user Authentication.user
| search <SOME LOGIC>

Thank you!
Hi team, I have a question about sending Splunk alert messages to an individual person in Webex chat. If there is an existing process for this, please help me out. Thanks in advance.
Hi, I was wondering, if possible, how to convert a date field into an abbreviated month (Jan, Feb, Mar, Apr). The 2 fields on the left are existing fields and the ones on the right would be the new ones.

Created           Closed             Month_Open   Month_Closed
8/27/2020 3:37    9/2/2020 12:00     Aug          Sep
10/15/2020 3:31   10/21/2020 12:00   Oct          Oct
11/5/2020 3:59    11/10/2020 5:17    Nov          Nov
12/3/2020 3:33    4/13/2022 10:48    Dec          Apr
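A minimal sketch, assuming Created and Closed are strings in month/day/year hour:minute format (adjust the strptime format string if the fields are stored differently, e.g. as epoch times):

| eval Month_Open=strftime(strptime(Created, "%m/%d/%Y %H:%M"), "%b")
| eval Month_Closed=strftime(strptime(Closed, "%m/%d/%Y %H:%M"), "%b")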
Team, I have a query which returns the results below.

_time                  Host   Name   version
3/2/2022 15:22:04 PM   3      car    248
3/1/2022 15:21:04 PM   3      car    246
3/1/2022 15:20:07 PM   2      car    246
3/1/2022 15:20:03 PM   3      bus    600
3/1/2022 15:19:02 PM   2      bus    600
2/1/2022 15:20:03 PM   3      Toy    600
2/1/2022 15:19:02 PM   2      Toy    248
2/1/2022 14:19:02 PM   2      Toy    248

After that I need the final output to look like this:

_time                  Host   Name   version   Final
2/1/2022 15:20:03 PM   3      Toy    600       Not matching
3/1/2022 15:20:03 PM   3      bus    600       Matched
3/1/2022 15:21:04 PM   3      car    246       Matched
3/2/2022 15:22:04 PM   3      car    248       Not matching

I am not sure how to compare values between the rows themselves. Could someone please help me out here? Thanks
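A minimal sketch, assuming "Matched" means a Host 3 event's version equals the most recent version reported by Host 2 for the same Name (field names are taken from the table above; the matching rule is my reading of the example, not something stated in the post):

| eval host2_version=if(Host==2, version, null())
| eventstats latest(host2_version) as host2_version by Name
| where Host==3
| eval Final=if(version==host2_version, "Matched", "Not matching")
| table _time Host Name version Final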
Logs are going to source="WinEventLog:Application" and sourcetype="WinEventLog" instead of source="WinEventLog:Security" sourcetype="WinEventLog:Security". I ran this search: index=*** sourcetype="*wineventlog*" rha***s-wds EventCode=517 signature="The audit log was cleared"
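One thing worth checking (a hedged guess, not a confirmed diagnosis): whether the Security channel input is actually enabled in inputs.conf on the forwarder - if only the Application/System stanzas are enabled, nothing will arrive with source WinEventLog:Security. A minimal inputs.conf sketch:

[WinEventLog://Security]
disabled = 0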
Hello, I just started with Splunk and I need your help. I am not sure why alerts are not working for me. This is an example (looking for a ping event + PowerShell). I set it up to send an email to my inbox (do I need to configure SMTP or something, or will it work without any configuration?). Also, I can't see anything in the Alert tab - just a message: "There are no fired events for this alert." I am not sure what I am doing wrong, please help me if you can! Many thanks. (I have the 60-day free Splunk trial.) Thank you
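Unless the Splunk server can relay mail through localhost, email alert actions do need an SMTP server configured (Settings > Server settings > Email settings in the UI). A minimal sketch of the equivalent alert_actions.conf settings, with a placeholder mail host and account (the password is set through the UI or the auth_password setting):

[email]
mailserver = smtp.example.com:587
use_tls = 1
auth_username = alerts@example.com
from = alerts@example.com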
Hi Team, my universal forwarder certificate package will be expiring soon in my Splunk Cloud environment. As a result, Splunk updated the forwarder package on the stack with updated certificates, to be deployed across any forwarders that connect directly to my Splunk Cloud instance. My action: I should download and install the updated Universal Forwarder certificate package on all forwarders prior to the upcoming maintenance window. Can someone elaborate on the preconditions and the further steps to take care of before my maintenance window? FYI - I have the splunkclouduf.spl package. Thanks, Sabari
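A minimal sketch of the usual install step on each forwarder, assuming a default install path (if the forwarders are managed by a deployment server, the .spl can be distributed that way instead; the path and credentials below are placeholders):

$SPLUNK_HOME/bin/splunk install app /path/to/splunkclouduf.spl -auth admin:<password>
$SPLUNK_HOME/bin/splunk restart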
Hello, as you can see in my search, I transpose time into my header field:

| eval time=strftime(_time,"%H:%M")
| sort time
| fields - _time _span _origtime _events
| fillnull value=0
| transpose 0 header_field=time column_name=KPI include_empty=true
| sort KPI

Most of the time it works well, but it seems that when I have results = 0, the time header field is not displayed: I have row 1 instead of 08:00, row 2 instead of 09:00. You can see the result below. Does anybody have an idea, please?
I read from here that "access to the GlassFish browser interface is limited to local machine access by default" and that "Secure Admin must be enabled to access the DAS remotely." I didn't quite understand how to enable "Secure Admin" - how is that done?
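A minimal sketch of the usual way to turn it on, assuming the DAS is running locally on the default admin port 4848 and an admin password has already been set (GlassFish refuses to enable secure admin while the admin password is blank):

asadmin --host localhost --port 4848 enable-secure-admin
asadmin restart-domain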
Okay, so this is quite theoretical... the nature of this search is to basically count the incoming domains when there are greater than 200 unique emails. Then, I need to count the outgoing domains when they MATCH those domains, do a count, and compare the percentage of the conversation when it is, let's say, >30%-45%...

What are we doing with this? Answer: We are going to count the domains that send IN email which we clearly RESPOND to, without getting into the mixture of RE:'s and FWD's... Out of Office replies... these will and should NOT make up 45% of the conversation... plus we have to be careful NOT to consider any emails which are NEW conversations... 1-2 emails in, compared to 1 reply back, is not 50% in this search... we have to look at a wide berth of time to determine the real senders, but get rid of the apple.card, itunes.com, apple.com and things that "SEND" large quantities of email but do not have a formal conversation to them.

Do you have a search to play with on this? Answer: Kind of... which is why I'm posting here hoping for better logic.

index=email "filter.routeDirection"=inbound
| rex field=envelope.from "\@(?<domainIN>[^ ]*)"
| stats dc(envelope.from) as num_count by domainIN
```Possible combination```
| join type=inner domainIN
    [search index=email sourcetype=pps_messagelog "filter.routeDirection"=outbound
    | rex field=msg.header.to "\@(?<domainIN2>.*)"\>]
| stats dc(msg.header.to) as num_count2 by domainIN2
| where count >200

The data is Proofpoint, using sourcetype=pps_messagelog. Above is the capture regex of the domains seen where routeDirection=inbound. Now I need to compare the same domains seen in domainIN in the opposite direction... I could see where a "keep a count of large senders" lookup could be good, but the end goal of this is simply to make a list of the domains we KNOW we talk to... this will then get consumed in other security stacks as a method to determine "THIS IS A FRIEND", basically. As a use case you could send that list to a Threat Intel platform for "domain watcher" status to catch the look-a-like domains that could pop up, or, if they got compromised, you could KNOW it's a threat to you as well... I hope that makes sense. If not, I can answer questions, and hopefully my brain can help erode the terrible code you see above... because it's not working..!!
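A hedged sketch of one way to restructure it: the join key has to carry the same field name on both sides, and the >200 filter has to refer to a field that still exists at that point in the pipeline. The reply-ratio definition below (outbound messages to a domain divided by unique inbound senders from it) is my assumption, not something stated in the post:

index=email "filter.routeDirection"=inbound
| rex field=envelope.from "@(?<domain>\S+)$"
| stats dc(envelope.from) as inbound_senders by domain
| where inbound_senders > 200
| join type=inner domain
    [ search index=email sourcetype=pps_messagelog "filter.routeDirection"=outbound
      | rex field=msg.header.to "@(?<domain>[^>\s]+)"
      | stats count as outbound_msgs by domain ]
| eval reply_pct=round(outbound_msgs / inbound_senders * 100, 1)
| where reply_pct >= 30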
Hey, I'm very experienced using Splunk as an analyst, but not at all experienced on the admin side of things, but am trying to learn.  I was recently given a JSON file full of Windows Logs to analyze.  Not sure why they gave me the data that way, but they did, and that's how I have to use it.   When I try and upload the file to Splunk, I select "Add Data", I upload the file, and it does not recognize it as JSON.  If I select json_no_timestamp, it seems to recognize it, but doesn't break it up into events.  Every event starts the same way, and I copied the first 12 lines of JSON below (when auto-arranged).  Using Regex101, I found a Regex that matches the beginning of the event, but adding that into Event Breaks Pattern does not break the event.   I've tried the following Event Breaks Patterns because sometimes when you copy the lines, there is whitespace, and sometimes there is no whitespace (Splunk, Atom, and Regex101 show line breaks and whitespace, but when I copied it into this comment... no line breaks!  Unsure if that's b/c of presentation or just copy/paste): \{\s\"sort\"\: {\n\s+\"sort\" \{\r\n\s+\"sort\"\: { "sort":   { "data": [ { "sort": [ 0 ], "_score": null, "_type": "winevtx", "_index": "winevtx", "_id": "==", "_source": { "process_id": 488, "message": "A Kerberos service ticket was requested.", "provider_guid": "{}", "log_name": "Security", "source_name": "Microsoft-Windows-Security-Auditing", "event_data": { "TicketOptions": "0x60810010", "TargetUserName": "JOHN$@LOCAL.LOCAL", "ServiceName": "krbtgt", "IpAddress": "::ffff:10.10.0.1", "TargetDomainName": "LOCAL.LOCAL", "IpPort": "53782", "TicketEncryptionType": "0x12", "LogonGuid": "{}", "TransmittedServices": "-", "Status": "0x0", "ServiceSid": "S-1-5-21-3052363079-1128767895-2942130287-502" }, "beat": { "name": "LOCAL", "version": "5.2.2", "hostname": "LOCAL" }, "thread_id": 1096, "@version": "1", "@metadata": { "index_local_timestamp": "2017-04-20T06:27:21.283576", "hostname": "LOCAL", "index_utc_timestamp": "2017-04-20T06:27:21.283576", "timezone": "UTC+0000" }, "opcode": "Info", "@timestamp": "2017-04-20T06:25:33.801Z", "tags": [ "beats_input_codec_plain_applied" ], "type": "wineventlog", "computer_name": "LOCAL.LOCAL.local", "event_id": 4769, "record_number": "127898", "level": "Information", "keywords": [ "Audit Success" ], "host": "LOCAL", "task": "Kerberos Service Ticket Operations" } } ] }     Every event starts with { "sort": [ 0 ], so I know that's where I want to break it up.  I'm sure I'm missing something simple.  What is it? Appreciate any assistance.
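For a one-off file upload like this, the usual fix is a custom sourcetype with line-breaking and timestamp settings (the "Advanced" options in the Add Data wizard map to the same props.conf settings). A minimal sketch, assuming each event really does start with { "sort": and that @timestamp is the time to use - the sourcetype name and the exact whitespace tolerated by the regex are placeholders to adjust, and the outer { "data": [ wrapper around the array will still need to be stripped or tolerated:

[json_winlogs]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n,]+)\{\s*"sort"\s*:
TIME_PREFIX = "@timestamp"\s*:\s*"
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3NZ
TRUNCATE = 0
KV_MODE = json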
Hello, in my search I use an eval command like the one below in order to identify character strings in a web URL:

| eval Kheo=case(
    like(url,"%SLG%"),"G",
    like(url,"%SLK%"),"G",
    like(url,"%SLY%"),"I",
    like(url,"%SK%"),"T",
    like(url,"%SL%"),"E")
| search Kheo=*

The problem I have is that my eval identifies every URL which contains, for example, the "SLG" letters in lowercase or uppercase. My need is to strictly identify URLs which contain the "SLG" letters in uppercase. I tried with match but it changes nothing:

| eval Kheo=case( match(url,"SLG"),"G",

Could you help please?
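A minimal sketch that forces the comparison to be case-sensitive regardless of how the field was matched before: the (?-i) regex modifier explicitly turns off case-insensitive matching inside match(). This assumes the uppercase letters are actually present in the stored url value and have not been lowercased at index or extraction time:

| eval Kheo=case(
    match(url,"(?-i)SLG"),"G",
    match(url,"(?-i)SLK"),"G",
    match(url,"(?-i)SLY"),"I",
    match(url,"(?-i)SK"),"T",
    match(url,"(?-i)SL"),"E")
| where isnotnull(Kheo)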
Hello everyone, a question: I have the following problem. A query is run against a specific index and sourcetype for a certain time range, and if I run that same query again the next day, the number of events is lower. It is worth mentioning that no corrupted buckets are visible, and the indexers' disk space is not full, either of which could cause a loss of data. Excuse the translation by Google.
Hi, after reviewing most of the posts and not finding a solution, I finally came here to ask for help with my query problem. I have a lookup table that I use in sweeps to check whether logs are missing in any particular index/host. My query was working like a charm for the last two years, but suddenly it started to show false positives. My query:

| inputlookup mylookuptable.csv
| table index sourcetype host
| join index sourcetype host type=left
    [| tstats count where index=* sourcetype=* host=* by _time index sourcetype host
    | stats count by index sourcetype host]
| fillnull value=0
| search count = 0

Can someone please help me understand why it is not displaying results only for those index/sourcetype/host combinations that are missing logs?
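A hedged observation and an alternative sketch: the subsearch splits by _time before re-aggregating, which multiplies the number of rows the join subsearch has to produce, and join subsearches are silently truncated once they hit their row or runtime limits - that truncation would look exactly like intermittent false positives. Dropping the _time split keeps the row count down:

| inputlookup mylookuptable.csv
| table index sourcetype host
| join type=left index sourcetype host
    [| tstats count where index=* by index sourcetype host]
| fillnull value=0 count
| where count=0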
I have a trial version of Splunk Cloud (Classic Experience) and I tried to upload/install a private app, as described here: http://docs.splunk.com/Documentation/SplunkCloud/8.2.2202/Admin/PrivateApps. I went to "App Management" and clicked the "Upload App button". After I entered my splunk.com credentials, checked the T&C box and clicked the Login button, I see a POST call to https://<my-trial-instance>.splunkcloud.com/en-US/splunkd/__raw/services/uploaded-apps/package?output_mode=json but it returned a 404. I also saw an error message: Error logging in. Enter your Splunk.com username and password. Splunk Cloud requires these credentials to complete app validation before installing your app. I'm not sure why I received a 404. It's not even checking my credentials, even though I entered them correctly. Any help would be appreciated. Thanks, Jason  
Hello Community, I would like to add leading zeros in front of a value, but only display 5 characters for the value. In addition, I want to add a prefix of "ABC-". I have no issue with the prefix, but I need assistance with the zeros. I could add 4 zeros in front of the value and then trim the value to display the last 5 characters, but I wanted to see the cleanest way to accomplish this. Examples below.

Value = 876: I would like the new value to be ABC-00876.
Value = 1678: I would like the new value to be ABC-01678.
Value = 5: I would like the new value to be ABC-00005.

Thanks, Joe
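A minimal sketch of both approaches, assuming Value is numeric: printf does the zero-padding in one step (available in newer Splunk versions), and the substr version is the pad-and-trim route described above:

| eval new_value="ABC-".printf("%05d", Value)

| eval new_value="ABC-".substr("0000".Value, -5)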
Dear All, I want to install an external app on our client's Splunk instance. The problem I have is that my access account for this instance does not allow me to install applications, so my question is: which account, or what kind of account, should be used to install external applications? In my case I want to install the Splunk Add-on for Linux monitoring app. In addition, does the installation of this application require a license or an additional cost on top of what has already been purchased for Splunk? Thanks in advance.
Hi, I am trying to automate alert setup for Splunk alerts. I am using the Splunk Terraform provider: https://registry.terraform.io/providers/splunk/splunk/latest/docs/resources/saved_searches#argument-reference

We have a couple of actions when an alert is generated, like email, Slack message & call-out. The call-out uses a custom action called xMatters. This curl command creates the alert well:

curl -ks -u username:password https://<splunkurl>:8089/servicesNS/nobody/digital_dcps_sre_search/saved/searches \
  -d name=No_Memory_Left \
  -d cron_schedule="*/5 * * * *" \
  -d description="This test job is a durable saved search" \
  -d dispatch.earliest_time="-24h@h" \
  -d dispatch.latest_time="now" \
  -d action.digital_slack="1" \
  -d action.digital_slack.param.channel="#dcps-sre-alerts" \
  -d action.digital_slack.param.message="Kong DP Alert. This line indicates that the data plane instance is trying to read a config from the control plane that is bigger than the config cache shared memory location. This means the data plane can no longer receive configuration updates." \
  -d action.digital_slack.param.workspace="rbwm" \
  -d alert.track="true" \
  -d alert_comparator="greater than" \
  -d alert_threshold="1" \
  -d is_scheduled="true" \
  -d alert_type="number of events" \
  -d action.abc_xmatters_alerts="0" \
  -d action.abc_xmatters_alerts.param.key="xxxxxx" \
  -d action.abc_xmatters_alerts.param.severity="MINOR" \
  -d action.abc_xmatters_alerts.param.summary="text=UK TP Splunk Alert. '$name$' alert was triggered. $result.final_gateway_url$. Link: $results_link$\napplication=Technical-Platform-Engineering_UK" \
  -d actions="digital_slack" \
  -d alert.digest_mode="true" \
  -d alert.expires="5d" \
  -d alert.severity="4" \
  -d alert.suppress="true" \
  -d alert.suppress.period="60m" \
  -d description="This line indicates that the data plane instance is trying to read a config from the control plane that is bigger than the config cache shared memory location. This means the data plane can no longer receive configuration updates" \
  -d disabled="false" \
  --data-urlencode search="search index=digital_technical_onprem_kongdp_raw sourcetype=SystemErr \\[clustering\\] unable to update running config: no memory"

But it fails if I try to use the Splunk Terraform provider, because action_param_key etc. are not supported. Is there any way I can set a custom action in an alert using Terraform?

resource "splunk_saved_searches" "No_Memory_Left" {
  cron_schedule          = "*/5 * * * *"
  dispatch_earliest_time = "-24h@h"
  dispatch_latest_time   = "now"
  #action_slack = "1"
  action_slack_param_channel = "#dcps-sre-alerts"
  action_slack_param_message = "Kong DP Alert. This line indicates that the data plane instance is trying to read a config from the control plane that is bigger than the config cache shared memory location. This means the data plane can no longer receive configuration updates. Refer "
  #action_slack_param_workspace = "rbwm"
  alert_track      = "true"
  alert_comparator = "greater than"
  alert_threshold  = "1"
  is_scheduled     = "true"
  alert_type       = "number of events"
  #action_abc_xmatters_alerts = "0"
  #action_param_key = "xxxxxx"
  #action_param_severity = "MINOR"
  #action_param_summary = "text=UK TP Splunk Alert. '$name$' alert was triggered. $result.final_gateway_url$. Link: $results_link$\napplication=Technical-Platform-Engineering_UK"
  actions               = "digital_slack"
  alert_digest_mode     = "true"
  alert_expires         = "5d"
  alert_severity        = "4"
  alert_suppress        = "true"
  alert_suppress_period = "60m"
  description           = "This line indicates that the data plane instance is trying to read a config from the control plane that is bigger than the config cache shared memory location. This means the data plane can no longer receive configuration updates"
  disabled              = "false"
  search                = "search index=digital_raw sourcetype=SystemErr \\[clustering\\] unable to update running config: no memory"
  name                  = "No_Memory_Left"
  acl {
    app     = "digital_dcps_sre_search"
    owner   = "GB-SVC-DSRE-SPL"
    sharing = "app"
  }
}