All Posts

Hi All, which capability do I assign to a Splunk user to upload an image in Dashboard Studio?
I found the reason for the problem:

MySQL v5.7 uses the system timezone → 2025-04-29T11:42:01.532704+01:00
MySQL v8.0 uses the system timezone → 2025-04-29T11:42:01.532704+02:00

I can't explain the difference, because the timestamps are specified the same way in both versions. Anyway, I tried to fix this by setting the timezone with TZ = in props.conf on the forwarder and the indexer, but with no success.
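For reference, a minimal props.conf sketch of the TZ approach described above (the sourcetype name and timezone are placeholders; note that TZ only takes effect on the component that parses timestamps, i.e. an indexer or heavy forwarder, not a universal forwarder):

```
# props.conf -- on the indexer or heavy forwarder that parses the data
[mysql:error]
# Forces the timezone used when the event itself carries none.
# Caveat: TZ is NOT applied when the raw timestamp already contains
# an explicit offset like +01:00, which may be why it had no effect here.
TZ = Europe/Berlin
```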
How should I understand "report_to_map_through_indexes"? I tried to build a macro but got a server error. Should it become a custom command instead? How would I implement it?
Hi, I'm trying to run a search via CLI from a federated Splunk instance > Splunk Cloud. Everything is configured correctly and, via the web interface, I have access from the federated instance to all indexes on Splunk Cloud. But when I try to check the connection via CLI on the federated search instance:

splunk display app -uri https://<splunk cloud uri>:8089

I get this error:

argument uri is not supported by this handler splunk

Also, while trying to execute a search from federated search:

splunk search "index="some remote index on splunk cloud" | head 10"

I'm getting the following error:

ERROR: Unknown error for indexer: <splunk cloud>. Search results may be incomplete. If this occurs frequently, check on the peer.

Please assist.
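One way to sanity-check connectivity outside the CLI wrapper is the REST API on the management port. A sketch, assuming the hostname, credentials, and index are placeholders and that REST access to port 8089 is enabled for your Splunk Cloud stack (this often requires a support request):

```
# Basic reachability check against the remote management port
curl -k -u 'admin:changeme' "https://<splunk-cloud-uri>:8089/services/server/info?output_mode=json"

# Submit a search over REST instead of the CLI
curl -k -u 'admin:changeme' "https://<splunk-cloud-uri>:8089/services/search/jobs/export" \
  --data-urlencode 'search=search index=<remote_index> | head 10' \
  -d output_mode=json
```

If the REST calls succeed while the CLI fails, the problem is in the CLI invocation rather than the federation setup.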
Good day @livehybrid. Yes, it helped. Some research with the browser dev tools shows that all the relevant endpoints (login to Splunkbase, downloading, login to Splunk) are inside the main domain *.splunk.com, so allowing the splunk.com domain should be OK. Kind regards.
The sourcetype is "cisco:sfw:estreamer" and I am using the default app settings.
Hi @yssplunker. Please could you confirm the sourcetype of your data? Looking in the app, most of the sourcetypes have TRUNCATE=0, which means they shouldn't be truncated, although not all of them! Please let me know which sourcetype you are having trouble with and I'll check that specifically.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing.
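As a minimal sketch, if a given sourcetype turns out not to disable truncation, TRUNCATE can be overridden locally (the sourcetype name is taken from the question above; this belongs on the indexer or heavy forwarder that parses the data):

```
# local/props.conf
[cisco:sfw:estreamer]
# 0 disables line-length truncation entirely
TRUNCATE = 0
```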
Hi @sudha_krish. httpout sends Splunk2Splunk (S2S) data, but over HTTP (HEC) rather than the typical S2S port 9997; is this what you are trying to achieve? It is intended to be used when you are not able to send data to a remote Splunk instance using typical S2S. As @gcusello has said, if you want to send to a non-Splunk system, you should look into using syslog output, which will send the raw data rather than Splunk-parsed S2S data.
Hi @dipali. I'm unable to download the app to check, but it sounds like there could be knowledge objects within the app which are not readable by the User role due to their RBAC/metadata configuration. Please check within metadata/default.meta (and local.meta if you have made changes) to see what the different permissions are; feel free to share the contents here so we can walk through it.
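For illustration, a hedged sketch of what role-readable permissions in the app's metadata might look like (the stanza names are hypothetical; the read list and export setting are what typically gate visibility per role):

```
# metadata/local.meta (overrides default.meta)
[views/seclytics_dashboard]
# allow the built-in user role to read this dashboard
access = read : [ admin, power, user ], write : [ admin ]

[lookups/event_by_days.csv]
access = read : [ * ], write : [ admin ]
export = system
```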
If you want value to contain only those two values, you could modify @bowesmana 's solution like so:

| makeresults
| fields - _time
| eval value=split("ABC","")
| where mvcount(value)=2
| search value=A AND value=C
Hi @livehybrid, thanks for your response. Yes, it is JSON-structured data, but there is no data like data -> vulnerability -> severity. How can I send you the root cause analysis data?

Sample data:

{"timestamp":"2025-04-29T12:44:53.812+0600","rule":{"level":5,"description":"Systemd: Service exited due to a failure.","id":"40704","firedtimes":4,"mail":false,"groups":["local","systemd"],"gpg13":["4.3"],"gdpr":["IV_35.7.d"]},"agent":{"id":"001","name":"debian-pc","ip":"192.168.11.XX"},"manager":{"name":"ubuntu"},"id":"1745909093.11585380","full_log":"Apr 29 06:44:53 proxmox systemd[1]: logstash.service: Main process exited, code=exited, status=1/FAILURE","predecoder":{"program_name":"systemd","timestamp":"Apr 29 06:44:53","hostname":"proxmox"},"decoder":{"name":"systemd"},"location":"journald"}
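As a hedged illustration, the nested fields in a JSON event like the sample above can usually be pulled out with spath (the index and sourcetype names are placeholders):

```
index=<your_index> sourcetype=<your_sourcetype>
| spath input=_raw
| table rule.level rule.description agent.name agent.ip full_log
```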
Yes, you should edit your Entity Search by implementing a new info field like "location", which is populated, e.g. by rex.
Hi @sudha_krish, I'm not sure it's possible to forward logs to a third party using HTTP; the usual way is syslog, as described at https://docs.splunk.com/Documentation/SplunkCloud/9.3.2411/Forwarding/Forwarddatatothird-partysystemsd Anyway, HTTP requires a token: did you create a token on the receiver? Did you enable it? Did you pass it to your output? Ciao. Giuseppe
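A minimal outputs.conf sketch of the syslog route mentioned above (the server address is a placeholder; unlike httpout, this sends the raw events rather than Splunk-wrapped S2S data):

```
# outputs.conf on the forwarder
[syslog]
defaultGroup = third_party_syslog

[syslog:third_party_syslog]
server = thirdparty_server:514
type = udp
```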
I want to forward the logs to a third-party server from a heavy forwarder over HTTP. Here is my outputs.conf:

[httpout]
defaultGroup = otel_hec_group

[httpout:otel_hec_group]
#server = thirdparty_server:8443
uri = http://thirdparty_server:8443
useSSL = false
sourcetype = hf_to_otel
disabled = false
sslVerifyServerCert = false
headers = {"Host": "hf_server", "Content-Type": "application/json"}
timeout = 30

But I don't receive logs on the third-party server and I don't find any error in the splunkd logs either. @SplunkSE
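For comparison, a hedged sketch of an httpout stanza using the token setting documented for this output (the token value and host are placeholders; httpout sends S2S-over-HEC, so the receiver normally needs to accept a HEC-style token, which a non-Splunk endpoint may not):

```
# outputs.conf -- sketch, all values are placeholders
[httpout]
httpEventCollectorToken = 00000000-0000-0000-0000-000000000000
uri = https://thirdparty_server:8443
```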
I think you have your answer in the other posts, but this is a good example of asking the right question: including the "by day" part is also an important point.
Users with an Admin or Power role are able to view the Seclytics dashboard provided by the "Seclytics for Splunk App". However, when users with the "User" role attempt to access the same dashboard, the content does not display. Additionally, we discovered that the lookup file "event_by_days.csv" is missing from the expected directory: /opt/splunk/etc/apps/seclytics-splunk-app/lookups/.

We would like to understand the following:
- Why is the dashboard visible to Admin/Power roles but not to the User role?
- Are there specific role-based permissions required to access this dashboard?
- Or is there a configuration change needed on our end to ensure all roles can access the content correctly?
Hello @aamer, could you please share your experience? Our logs are being parsed wrong at the moment; could you lend me a hand?
You didn't answer @bowesmana 's question about whether your sample is from an index or a lookup table. I will assume that they come from events. In this case, it is unnecessary to extract _time inline. You can use latest as @bowesmana and @ITWhisperer suggested, or you can simply use dedup to get the latest events before further processing:

| eval day = strftime(_time, "%F")
| dedup day Name

Given this dataset

Name  Status  _raw                        _time
ABC   F       ABC,F, 04/25/2025 15:50:00  2025-04-25 15:50:00
ABC   R       ABC,R, 04/25/2025 15:25:00  2025-04-25 15:25:00
ABC   F       ABC,F, 04/24/2025 15:30:03  2025-04-24 15:30:03
ABC   R       ABC,R, 04/24/2025 15:15:01  2025-04-24 15:15:01

the above will give you

Name  Status  _raw                        _time                day
ABC   F       ABC,F, 04/25/2025 15:50:00  2025-04-25 15:50:00  2025-04-25
ABC   F       ABC,F, 04/24/2025 15:30:03  2025-04-24 15:30:03  2025-04-24

Here is a full emulation of your mock data:

| makeresults
| eval _raw="Name,Status,Datestamp
ABC,F, 04/24/2025 15:30:03
ABC,R, 04/24/2025 15:15:01
ABC,F, 04/25/2025 15:50:00
ABC,R, 04/25/2025 15:25:00"
| multikv forceheader=1
| eval _time = strptime(Datestamp, "%m/%d/%Y %T")
| fields - Datestamp linecount
| sort - _time
``` data emulation above ```
| makeresults format=csv data="Name,Status,Timestamp
ABC,F, 04/24/2025 15:30:03
ABC, R, 04/24/2025 15:15:01
ABC, F, 04/25/2025 15:50:00
ABC, R, 04/25/2025 15:25:00"
| eval _time = strptime(Timestamp, "%m/%d/%Y %T")
| bin _time as _day span=1d
| stats latest(*) as * by _day Name
Hi @bsreeram. If you want it split by Name and day, so that you get the latest per Name AND day, then you can use a timechart:

| timechart span=1d latest(*) as *
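If the chart itself should also be split by Name, a hedged variant (timechart only supports a single aggregation when a by clause is used; Status is the field name from the earlier sample data):

```
| timechart span=1d latest(Status) by Name
```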