All Posts

@MohammedKhanIUK I believe that's the requirement. NOTE: To collect the audit logs, the user must have admin access to the organization/enterprise and the read:audit_log scope on the Personal Access Token. https://docs.splunk.com/Documentation/AddOns/released/Github/Configureinputs If this reply helps you, an upvote and "Accept as Solution" is appreciated.
I am currently using the new Dashboard Studio interface, and my dashboards make calls to saved reports in Splunk. Is there a way to have a time range input work for the dashboard and also apply to the reports? The issue we face is that we are able to add the reports to the Studio dashboard, but by default they are stuck as static reports. How can we add a time range input that works with both the dashboard and the reports?
Can the permissions be limited to specific capabilities aside from admin:org for audit events? Or is that a fundamental requirement to pull in audit logs?
Ahhh... So you don't want to just join two indexes. You want to join an index onto itself using an external "bracketing" event. Ugh. With small datasets you can try to stats once, then append the index again and stats another time. Well, you could even try the cursed join command. But the real question is how to do this operation efficiently. I have a rough idea but have to test it first.
So the best-performing solution I could come up with was something like this:

index=ind1 earliest=-1d field1=abc field2 IN ([search index=ind1 earliest=-1d "A" field1=xyz | stats count by field2 | fields field2 | rename field2 as query | format mvsep="" "" "" "" "" "" "" | replace "NOT ()" WITH ""])
| append [search index=ind1 earliest=-1d "A" field1=xyz]

This way, the parent query runs with the additional filtering provided by the subsearch. One thing I was wondering: could the results of "search index=ind1 earliest=-1d "A" field1=xyz" be stored so it doesn't have to run twice? Is that possible?
So it is now flagged to include "z" versions of OpenSSL, meaning that all prior and current versions were and are indeed affected? Could you provide a link to the supporting information?
The curl command works from the command line only if I specify "-k" (skip SSL cert verification). How do I get Splunk to accept the cert?
| makeresults format=csv data="_time, username, computer, printer, source_dir, status
2024-09-24 15:32 , auser, cmp_auser, print01_main1, \\\\cpn-fs.local\data\program\..., Printed
2024-09-24 13:57 , buser, cmp_buser, print01_offic1, c:\program files\documents\..., Printed
2024-09-24 12:13 , cuser, cmp_cuser, print01_offic2, \\\\cpn-fs.local\data\transfer\..., In queue
2024-09-24 09:26, buser, cmp_buser, print01_offic1, F:\transfers\program\..., Printed
2024-09-24 09:26, buser, cmp_buser, print01_front1, \\\\cpn-fs.local\transfer\program\..., Printed
2024-09-24 07:19, auser, cmp_auser, print01_main1, \\\\cpn-fs.local\data\program\...., In queue"
| rex field=source_dir "(?P<FolderPath>(\\\\\\\\[^\\\\]+|\w:)\\\\[^\\\\]+\\\\)"
The problem here may be that Splunk has not released updated libssl and libcrypto libraries up through the 9.3.1 release, now that the vulnerabilities are being flagged to "zk".
Hi @Andrew.Bray, Have you seen this AppD Docs page? https://docs.appdynamics.com/appd/23.x/23.11/en/infrastructure-visibility/machine-agent/install-the-machine-agent
Glad it was helpful. If this reply helps you, an upvote and "Accept as Solution" is appreciated.
It appears to be failing to pull the Docker image. This guide for setting up SC4S suggests using a different value for SC4S_IMAGE: https://splunk.github.io/splunk-connect-for-syslog/main/gettingstarted/podman-systemd-general/#unit-file Is "ghcr.io/splunk/splunk-connect-for-syslog/container3:latest" the SC4S_IMAGE value you tried?
Yes, exactly like this. I was beating around the bush, but this one works perfectly. Thanks a lot @sainag_splunk
Hello Yuanliu, Thanks so much for your suggestion. This is getting close. I did have to change the first "span=1s" to something greater than 1m in order to get any results, most likely because the "Query" total (and other DNS stats) is only logged once every 5 minutes, with the totals for the past five minutes.

As you mentioned, this does not give the connection points in the graph, so I had a thought: what if I use this query to generate a list of sites to feed into my original query? Something like this:

index=metrics host=*
| rex field=host "^(?<host>[\w\d-]+)\."
| lookup dns.csv sd_hostname AS host
| search Site IN (*)
| bin _time span=5m
| stats values(Query) as QPS by Site _time
| bin _time span=5m
| stats avg(QPS) as QPS by Site _time
| streamstats window=2 global=false current=true stdev(QPS) as devF by Site
| sort Site, - _time
| streamstats window=2 global=false current=true stdev(QPS) as devB by Site
| where 4*devF > QPS OR devB*4 > QPS
| table Site
| dedup Site
| mvcombine Site delim=","
| nomv Site

This gives a CSV list of sites to search:

Site
austx.1,snavtx.1

I am using Dashboard Studio and I'm trying to figure out how to chain these results as a variable in my original search: ... | search Site IN ($my-csv-list-from-above$) ... but so far I have not figured that out. Let me know if you have suggestions. Thanks again for your help!
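The adjacent-pair stdev filter in the query above can be sanity-checked outside SPL. The sketch below is my own illustration, not the streamstats implementation: it only approximates window=2 semantics (each sample compared with its neighbor, ignoring the current=true/global=false details), and the site names and QPS values are made up.

```python
from statistics import stdev

# Hypothetical per-site QPS samples at 5-minute intervals (oldest first).
qps = {
    "austx.1":  [100.0, 100.0, 100.0, 20.0],  # sharp drop between samples
    "snavtx.1": [50.0, 50.0, 51.0, 49.0],     # steady traffic
}

def is_anomalous(series, k=4.0):
    """Approximates the `where 4*dev > QPS` test: flag a site when, for
    any adjacent pair of samples, k * stdev(pair) exceeds the current QPS."""
    for prev, cur in zip(series, series[1:]):
        if k * stdev([prev, cur]) > cur:
            return True
    return False

flagged = sorted(site for site, series in qps.items() if is_anomalous(series))
print(",".join(flagged))  # only the site with the sharp drop survives
```

For the drop from 100 to 20, stdev([100, 20]) is about 56.6, so 4*56.6 easily exceeds 20 and the site is flagged; the steady site never trips the threshold, which matches the intent of keeping only sites whose QPS jumped between adjacent 5-minute bins.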
Do you mean like this ? (index=Index1 sourcetype=SourceType1) OR (index=Index2 sourcetype=SourceType2)  
Hi, I have an index called Index1 with a sourcetype called SourceType1, and another index called Index2 with a sourcetype called SourceType2. Some data is in the combination Index1 <-> SourceType1 and some data is in the combination Index2 <-> SourceType2. How can I write a query that targets the correct index and sourcetype?
Are you able to go to one of those 500 Error lines, then post 10-20 of the previous lines? If it contains sensitive data then you should sanitize it before posting.
Thanks. That helped me resolve my issue. Just a small correction: it should be target="_blank"
Did you already try this? Please refer to: https://github.com/signalfx/splunk-otel-collector-chart/tree/main?tab=readme-ov-file

helm install my-splunk-otel-collector --set="splunkPlatform.endpoint=https://127.0.0.1:8088/services/collector,splunkPlatform.token=xxxxxx,splunkPlatform.metricsIndex=k8s-metrics,splunkPlatform.index=main,clusterName=my-cluster" splunk-otel-collector-chart/splunk-otel-collector
Appreciate the help. This is working in part. For the server path, I am getting the proper output. However, for the drive path, I am getting a result of c:\program files\documents\ or F:\transfers\program\ and not c:\program files\ or F:\transfers\. I'm trying to make the output treat the drive letter as the root folder; I should have worded it as the root location. Also, I have reviewed some rex/regex videos online and am still learning to decipher each part of the regular expression and how the parts are broken up to capture each piece of the file path. Can you explain this a bit, or point me to any additional tutorial that can help me understand it more? Much appreciated.
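Since the thread asks how the regex breaks apart, here is a sketch of the same pattern tested outside Splunk. Python's re engine treats this pattern the same way as the PCRE engine used by rex; the sample paths and variable names are my own illustration, with Splunk's quote-escaping undone (inside an SPL quoted string every regex backslash is doubled, which is why the original looks twice as dense).

```python
import re

# The SPL rex pattern with Splunk's string-escaping removed,
# split into its three logical pieces:
pattern = re.compile(
    r"(?P<FolderPath>"
    r"(\\\\[^\\]+"    # either a UNC server prefix:   \\server-name
    r"|\w:)"          # ...or a drive letter:         c:  F:
    r"\\[^\\]+\\"     # then exactly one folder segment: \program files\
    r")"
)

# Sample paths from the thread (the "..." truncation is kept as-is).
samples = [
    r"\\cpn-fs.local\data\program\...",
    r"c:\program files\documents\...",
    r"F:\transfers\program\...",
]

results = [pattern.search(s).group("FolderPath") for s in samples]
for r in results:
    print(r)
```

Against these samples the pattern already stops after the first folder segment (c:\program files\ and F:\transfers\), because [^\\]+ cannot cross a backslash; if you are seeing an extra segment in your output, it may be worth double-checking which version of the regex is actually in the running search.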