Activity Feed
- Karma Re: "Failed to bootstrap playbook repos" on clean install for intothelight. Wednesday
- Posted Re: advhunt custom Command on Splunk Enterprise Security. Wednesday
- Karma Re: Splunk TV Replacement for Gattaca2. a month ago
- Karma Re: What happened if Splunk SOAR license expired? for dolezelk. a month ago
- Posted Re: Splunk Add-on Builder Checkpoint issues on All Apps and Add-ons. a month ago
- Posted Re: How to relate multi values in a table on Splunk Enterprise. 01-11-2025 11:15 PM
- Posted Re: Request for Assistance with Opening Links in a New Tab on Feedback. 01-11-2025 11:07 PM
- Got Karma for Re: How to relate multi values in a table. 01-07-2025 11:21 AM
- Got Karma for Re: Application for collection logs. 01-07-2025 03:24 AM
- Got Karma for Re: Application for collection logs. 01-07-2025 03:23 AM
- Got Karma for Re: What happened if Splunk SOAR license expired?. 01-06-2025 01:05 AM
- Posted Re: Splunk Enterprise 9.4.0 Integrity Check warning on Splunk Enterprise. 01-03-2025 02:10 PM
- Karma Re: Disable Splunk Enterprise phoning home? for frusso. 01-03-2025 02:06 PM
- Karma Re: add info about downstream jenkins jobs to upstream event for yuanliu. 01-03-2025 01:15 PM
- Posted Re: Help about keycloak configuration on All Apps and Add-ons. 01-03-2025 01:00 PM
- Posted Re: Splunk to search only latest log file and not any historical data on Splunk Search. 01-03-2025 12:45 PM
- Posted Re: How to relate multi values in a table on Splunk Enterprise. 01-03-2025 12:16 PM
- Posted Re: Splunk Add-On for Google Cloud Platform failing to load inputs page on All Apps and Add-ons. 01-02-2025 11:41 AM
- Posted Re: Session Key Authentication fails sometimes on Splunk Search. 01-02-2025 11:38 AM
- Got Karma for Re: Splunk Enterprise and Forwarders 9.3.2 on Windows TLS Configuration. 12-30-2024 06:58 AM
Topics I've Started
No posts to display.
Wednesday
I think the "admin_all_objects" privilege is needed by the app to access the client secret stored within it, which is used to authenticate the advanced hunting requests. There is another app (https://splunkbase.splunk.com/app/6463) that appears to do the same thing, albeit with a differently named command, "defkqlg". Its Details tab says you can use the "edit_storage_passwords" capability instead of "admin_all_objects" if your Splunk Enterprise version is later than 9.1.0. It might also be possible to use the edit_storage_passwords capability with the MS Defender Advanced Hunting app, but that would need to be tested.
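If you go that route, granting the capability to a role might look like this in authorize.conf (a minimal, untested sketch; the role name advhunt_user is just an example, and you could equally do this from the Roles page in the UI):
# hypothetical role granting the edit_storage_passwords capability on top of the standard user role
[role_advhunt_user]
importRoles = user
edit_storage_passwords = enabled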
a month ago
Can you increment the checkpoint number by one before saving it with the Add-on Builder helper functions? That should prevent the input from fetching the last event multiple times when there are no new events after the last checkpoint.
01-11-2025
11:07 PM
In what format are you adding that link? If it uses <a> tags, or you can otherwise control the attributes of the link, then you can add target="_blank" to make it open in a new tab by default:
<a href="linktowebsite.com" target="_blank">Link</a>
01-03-2025
02:10 PM
I don't have Splunk running on a Windows machine, so I can't comment on whether those files are necessary. However, if you find that your Splunk installation works fine without them and you just want to silence the warning, you can remove the related lines from the manifest file in your Splunk directory to disable integrity checking for those files.
01-03-2025
01:00 PM
According to the README.md file, you configure it by navigating to `/app/KeycloakAPI_nxtp` on your Splunk instance. I would assume this takes place after you install the app. You should then be able to go to https://yoursplunk:8000/<locale>/app/KeycloakAPI_nxtp, where there may be a setup page.
01-03-2025
12:45 PM
Splunk will store the indexed data until the end of the index's retention period. You cannot tell Splunk to store only the latest copy via inputs.conf. You can, however, use searches to return only the latest indexed event. By default, events are returned in reverse chronological order, so if your list of certificates is in a single event, you may be able to filter to only the latest one using "head 1":
index=test_event source=/applications/hs_cert/cert/log/cert_monitor.log
| head 1
| rex field=_raw "(?<Severity>[^\|]+)\|(?<Hostname>[^\|]+)\|(?<CertIssuer>[^\|]+)\|(?<FilePath>[^\|]+)\|(?<Status>[^\|]+)\|(?<ExpiryDate>[^\|]+)"
| multikv forceheader=1
| table Severity Hostname CertIssuer FilePath Status ExpiryDate
If this is not the case, then perhaps you could post a sanitized screenshot of your events to give us a better idea of how they appear in your search interface.
01-03-2025
12:16 PM
1 Karma
Maybe something like this:
index=analise Task.TaskStatus="Concluído" Task.DbrfMaterial{}.SolutionCode="410 TROCA DO MOD/PLACA/PECA" State IN ("*") CustomerName IN ("*") ItemCode IN ("*")
| spath path=Task.DbrfMaterial{} output=DbrfMaterial
| mvexpand DbrfMaterial
| table TaskNo DbrfMaterial
| spath input=DbrfMaterial
| table TaskNo EngineeringCode ItemDescription ItemQty SolutionCode
How exactly would you like your table to look?
01-02-2025
11:41 AM
I've had this issue before with a custom app and tried recreating it from scratch to no avail. However, changing the locale in the URL (e.g. en-US to en-GB) somehow got the inputs page to load. Perhaps it will work for you.
01-02-2025
11:38 AM
Is there any reason why you aren't creating a token from the interface under Settings->Users and Authentication->Tokens, and then using it to call the API? That would be much more reliable than using a single session key.
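For example, a quick sanity check with a token (a sketch only; substitute your own hostname and token, and drop -k if your management port certificate is trusted):
curl -k -H "Authorization: Bearer <your-token>" "https://yoursplunk:8089/services/server/info?output_mode=json"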
12-27-2024
12:51 PM
This looks like it would work. If you're not quite sure and you want to make sure it is correct before the data goes into the index, then you could set up a sandbox index and use crcSalt to stop the logs from being registered as already indexed. In terms of billing, you would be paying for all logs, sandboxed or not, but it would avoid the annoyance of deleting wrongly-indexed data in your production indexes. E.g.:
[monitor://D:\Exchange Server\TransportRoles\Logs\*\ProtocolLog\SmtpReceive]
whitelist=\.log$|\.LOG$
time_before_close = 0
sourcetype=MSExchange:2019:SmtpReceive
queue=parsingQueue
index=sandbox
disabled=false
crcSalt = "testing"
(Then remove or modify the crcSalt when the logs look good in the sandbox and are ready for production.)
12-27-2024
12:22 PM
1 Karma
Indeed, I also cannot find a direct statement in the docs about this. I would assume that SOAR falls back to the community license, but I have never seen a SOAR license expire on a machine. You could submit this question as feedback at the bottom of the docs page for the SOAR license; they may then add this information in a future version.
12-27-2024
12:08 PM
Usually people safelist the SplunkForwarder service and the entire SplunkForwarder directory. You could safelist individual services and apps but you may end up spending lots of time playing whack-a-mole. You may be able to look through the community forums and Splunk Slack usergroup for your particular EDR solution to see if others have specific tips applying to that EDR.
12-26-2024
11:52 AM
1 Karma
Note that the lastTime field contains the timestamp of the latest event seen, while the recentTime field contains the index time of the latest event. If a log was indexed soon after being generated, these times will be close together. If a log was generated on Dec 11 but indexed today, the lastTime and recentTime values will differ. Ref: https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Metadata You should think about how you want your search to handle historical data. Perhaps you want this search to filter to sourcetypes where you expect data to come in every day, and then filter out sourcetypes that contain data indexed recently but timestamped a long time ago. The highlighted sourcetype only has 3 events, which makes me think it does not receive fresh data every day.
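For example, a rough sketch of that kind of check (the index filter and the one-day threshold are placeholders to adjust for your environment):
| metadata type=sourcetypes index=your_index
| eval lastTime_readable=strftime(lastTime,"%F %T"), recentTime_readable=strftime(recentTime,"%F %T")
| where lastTime < relative_time(now(), "-1d@d")
| table sourcetype totalCount lastTime_readable recentTime_readable
Anything returned here has not produced an event timestamped since the start of yesterday, even if something for it was indexed more recently.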
12-26-2024
10:47 AM
1 Karma
What makes you think it may be the wrong algorithm for your use case?
12-23-2024
11:49 AM
2 Karma
You could calculate the current hour at alert execution time, then adjust the threshold at the end:
<mySearch>
| bin _time span=1m
| stats avg(msg.DurationMs) AS AvgDuration by _time, msg.Service
| eval hour = tonumber(strftime(now(),"%H"))
| where (AvgDuration > 1000 AND hour >= 8 AND hour < 17) OR (AvgDuration > 500 AND (hour < 8 OR hour >= 17))
12-23-2024
08:23 AM
Try the API:
| rest "/servicesNS/-/-/saved/searches" splunk_server=*
| regex search="index=\*"
| table title search disabled author eai:acl.app
12-22-2024
01:51 PM
The targeted server/endpoint for integration with this app is the machine that you would like to run commands on. The server/endpoint itself does not need to be integrated into SOAR; rather, SOAR needs credentials/certificates/tickets to authenticate with the WinRM service on the target server/endpoint.
12-22-2024
01:20 PM
This app is listed as "Developer Supported", which means you may have more success contacting the developer directly to ask if they plan to add retrieval of archive events in a future release.
Support email: support@mimecast.com
For technical support, visit https://community.mimecast.com/s/contactsupport
12-22-2024
01:11 PM
Not directly in the web UI, but you can set the number of page entries by changing the URL parameters. Go to the "Events" queue, then set the page size using the dropdown to any value other than the default. You will then see your URL look like:
https://yoursoar.com/browse?page=1&per_page=25&filter=new_events&status=new
You can then change per_page=25 to any number. For 100 entries, set per_page=100. You can do this with Playbooks, Events, and Cases.
12-22-2024
11:52 AM
Can you paste a copy of your original event in a code sample format? Perhaps one of the double-quotes is wrong.
12-22-2024
06:36 AM
1 Karma
Could you log in as the Splunk user on your indexer and run btool for the stanzas relating to the TLS-secured forwarding?
/opt/splunk/bin/splunk btool inputs list SSL
/opt/splunk/bin/splunk btool inputs list splunktcp-ssl
/opt/splunk/bin/splunk btool server list sslConfig
Make sure the settings match the instructions in the article. If any have the wrong values, add --debug to the btool commands to find the file that is setting them. If there are no problems there, do you see specific complaints in the splunkd log of the forwarder, e.g. "Invalid certificate", or does the connection time out? Have you been able to forward logs, even _internal logs, before setting up TLS?
12-22-2024
06:24 AM
1 Karma
When you press the "New Token" button, do you see the "New Token" dialog containing the fields User, Audience, Expiration, and Not Before, plus a "Token" field with the warning that the token will appear there only once upon creation? When you press "Create", the "Token" field will contain the token itself, not the token ID. On the Tokens page you will see the tokens listed, and the first column will be their token ID.
12-22-2024
06:17 AM
1 Karma
It is possible to use props.conf settings on your indexer machines to pre-process the JSON into distinct events for each transaction, but I will assume that you instead have that one JSON object as a single event in Splunk. You can then use the following search:
<Your search for finding the json event>
``` Chop off the opening and closing curly braces ```
| rex field=_raw mode=sed "s/^{//"
| rex field=_raw mode=sed "s/}$//"
``` Add a "SplitHere" keyword to target with a makemv command ```
| rex field=_raw mode=sed "s/},/},SPLITHERE/g" max_match=99
``` Remove the Transaction1 etc. labels for each sub-object ```
| rex field=_raw mode=sed "s/\s*\"Transaction\d*\"\s:\s//g" max_match=99
``` To avoid making _raw a multivalue lets eval it to the "a" field ```
| eval a = _raw
``` Split 'a' into multiple values and table it ```
| makemv a delim=",SPLITHERE"
| mvexpand a
| table a
``` Extract the key values for each json object ```
| spath input=a
``` Filter to desired fields and make it into final table with renaming and rounding ```
| table transaction pct2ResTime
| rename transaction as "Transaction Name"
| eval pct2ResTime = round(pct2ResTime)
12-16-2024
11:15 AM
That should work already. Could you try putting that search filter at the end of your alert search?
<yoursearch>
| search (errorType = "Client" AND count > 8) OR (errorType = "Credentials" AND count > 8) OR (errorType = "Other" AND count > 8)