I believe some developer license requests go to an approver, which might be the cause of the delay. Is your email account associated with a Splunk subscription/license (Cloud or Enterprise)? Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards, Will
I would suggest checking your spam folder, but also note that the setup of your Splunk Cloud Trial stack might take a little time. How long has it been since you requested this? Has it been more than a couple of hours? Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards, Will
It's not possible within Splunk to have a crontab entry for the first Sunday of the month. However, you might be able to run the search every day for the first 7 days of the month (`30 9 1-7 * *`) and add the following to your search:

| where strftime(now(),"%a")=="Sun"

This will stop the search from continuing if it isn't Sunday. Does this help? Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards, Will
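As a minimal sketch of how those two pieces fit together in savedsearches.conf (the stanza name and the `index=your_index` search are illustrative placeholders, not from the original post):

```
# savedsearches.conf -- illustrative sketch only; stanza name and index are placeholders
[first_sunday_alert]
cron_schedule = 30 9 1-7 * *
search = index=your_index | where strftime(now(),"%a")=="Sun" | stats count
```

The cron schedule fires the search daily on the 1st through 7th at 09:30, and the `where` guard produces no results (and so triggers no alert) unless that day is a Sunday.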
Hi, I checked that the fields are correct, but I want to know whether using spath for extraction from JSON and using props and transforms give the same result. I am getting the same message value that was unstructured earlier, which was coming after using the statement below in the conf file:

[securelog_override_raw]
INGEST_EVAL = message := json_extract(_raw, "message")

The value in message is still the same unstructured data with lots of messed-up content, like below. Do I have to separate this myself in this case using search queries?

2025-02-11 20:20:46.192 [com.bootstrapserver.runtim] DEBUG Stability run result : com.cmp.bootstrapserver.runtime.internal.api.RStability@36464gf
2025-02-11 20:20:46 [com.bootstrapserver.runtim] DEBUG Stability run result :com.cmp.bootstrapserver.runtime.interndal.api.RStability@373638cgf

After spath, the same message came from the message field, and now, using the conf file with props and transforms, it's still the same. Will it extract like this only?
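For comparison, a minimal search-time sketch (the sourcetype name is a placeholder, not from the original post): setting `KV_MODE = json` in props.conf makes Splunk auto-extract JSON fields at search time, which should mirror what `| spath` does. Note that if the value of `message` is itself an unstructured log line rather than nested JSON, neither `spath` nor `json_extract` will break it down further; that part would need a rex/regex extraction.

```
# props.conf -- illustrative sketch; "securelog" is a placeholder sourcetype
[securelog]
KV_MODE = json
```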
hi, i activate cloud trial but i don't received an email for activation and the button access instance is disabled my is valid because i receive a mail for resetting password thank you
Hey @darrfang, how about this?

| makeresults
| eval _raw="{
\"key1\": {
\"key2\": {
\"key3\": [
{\"data_value\": {\"aaa\": \"12345\", \"bbb\": \"23456\"}}
]
}
}
}"
| spath input=_raw output=data_value path=key1.key2.key3{}.data_value
| mvexpand data_value
| eval key_value=split(replace(data_value, "[\{\}\"]", ""), ",")
| mvexpand key_value
| rex field=key_value "\s?(?<node>[^:]+):(?<value>.*)"
| table node value

Please let me know how you get on, and consider accepting this answer or adding karma to it if it has helped. Regards, Will
I am trying to call the applicationFlowMapUiService from Postman but getting an error. I can fetch other restui services just fine from the Postman client, but this one gives me the following error:

"HTTP ERROR 500 javax.servlet.ServletException: javax.ejb.EJBException: Identity Entity must be of type "USER" to retrieve user."

The API I am calling looks like:

{{controller_host}}/controller/restui/applicationFlowMapUiService/application/{{application}}?time-range={{timerange}}&mapId=-1&baselineId=-1&forceFetch=false

Any ideas? That's the one you wanted me to use, right?
Thank you! There's a lot of good stuff here I'll try and incorporate into the search. I will say, however, that session_time doesn't actually have duplicates. It contains each unique timestamp for each Session_ID. However, as every event has its own copy of session_time, when I do mvexpand, it leads to multiple duplicates that I need to take out with dedup. Ideally, it'd be nice if each entry of session_time within each event only had the time and session_id for that specific event and not all events. I'm going to see if maybe I can use mvmap to check each event's Session_ID against the session_id part of session_time. Sorry, I know it's convoluted. Still, I'm making progress thanks to your suggestions~
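If each session_time entry embeds its own session id, mvmap could filter each event's multivalue down to just its own entry. This is only a sketch: it assumes a `<session_id>|<time>` entry format, which may not match the actual data.

```
| eval my_session_time=mvmap(session_time,
    if(mvindex(split(session_time, "|"), 0) == Session_ID, session_time, null()))
```

mvmap evaluates the expression once per multivalue entry, and entries where the expression returns null() are dropped, so each event keeps only the entry whose embedded id matches its Session_ID.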
Hi Splunk team, I have a question about how to extract key-value pairs from JSON data. For example, I have two raw events like this:

# raw data 1:
{
"key1": {
"key2": {
"key3": [
{"data_value": {"aaa": "12345", "bbb": "23456"}}
]
}
}
}
# raw data 2:
{
"key1": {
"key2": {
"key3": [
{"data_value": {"ccc": "34567"}}
]
}
}
}

How can I extract the key-value pairs from all the data_value objects into a table like this?

node value
aaa 12345
bbb 23456
ccc 34567

I currently have a Splunk query that does part of it:

```some search...```
| spath output=pairs path=key1.key2.key3{}.data_value
| rex field=hwids "\"(?<node>[^\"]+)\":\"(?<value>[^\"]+)\""
| table node value pairs

But this only gives me the first pair from each event; the result looks like the table below, which ignores "bbb":"23456". Please give me some advice on how to grab all the results, thanks!

node value pairs
aaa 12345 {"aaa": "12345", "bbb": "23456"}
ccc 34567 {"ccc": "34567"}
Hi All, I'm trying to configure an alert that runs only on the first Sunday of every month, specifically at 9:30am. I put this as the cron expression: 30 9 1-7 * 0. If I'm reading the documentation correctly, that should be it. However, the alert appears to run every Sunday of every month instead of just the first Sunday. Am I doing something wrong? I can't figure it out. Thanks!
There is a Proofpoint add-on and we have it installed, but we need some kind of bulk-processing capability, for example, listing all messages from a given sender, IP, etc.
@gcusello @kiran_panchavat thanks for your help, but unfortunately I can't share any files, sorry; I am in an air-gapped environment. I have already run `splunk btool inputs list --debug | grep index`, but I will try it without `grep index` and see if I can find anything weird. I haven't checked props.conf, but I will check it now. As far as I know, I haven't made any changes to props.conf.
Hello, below is a sample of a single message from the Proofpoint log. It looks simple, but I am struggling to write a query to pull the sender (env_from value), recipient(s) (env_rcpt values), and IP address. As far as I understand, x and s have the same values for a given single message in the logs and will change from message to message. Any help will be greatly appreciated.

Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.436109+00:00 host filter_instance1[1394]: rprt s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=mail cmd=env_from value=sender@company.com size= smtputf8= qid=44pnhtdtkf-1 tls= routes= notroutes=tls_fallback host=host123.company.com ip=10.10.10.10
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.438453+00:00 host filter_instance1[1394]: rprt s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=mail cmd=env_rcpt r=1 value=recipient.two@DifferentCompany.net orcpt=recipient.two@DifferentCompany.NET verified= routes= notroutes=RightFax,default_inbound,journal
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.440714+00:00 host filter_instance1[1394]: rprt s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=mail cmd=env_rcpt r=2 value=recipient.one@company.com orcpt=recipient.one@company.com verified= routes=default_inbound notroutes=RightFax,journal
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.446326+00:00 host filter_instance1[1394]: rprt s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=session cmd=data from=sender@company.com suborg=
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.446383+00:00 host filter_instance1[1394]: rprt s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=session cmd=data rcpt=recipient.two@DifferentCompany.net suborg=
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.446405+00:00 host filter_instance1[1394]: rprt s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=session cmd=data rcpt=recipient.one@company.com suborg=
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.446639+00:00 host filter_instance1[1394]: info s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=session cmd=data rcpt_routes= rcpt_notroutes=RightFax,journal data_routes= data_notroutes=
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.450566+00:00 host filter_instance1[1394]: info s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=session cmd=headers hfrom=sender@company.com routes= notroutes=
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.455141+00:00 host filter_instance1[1394]: info s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=mimelint cmd=getlint lint=
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.455182+00:00 host filter_instance1[1394]: info s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=mimelint cmd=getlint mime=1 score=0 threshold=100 duration=0.000
Feb 11 10:04:12 host.company.com 2025-02-11T15:04:12.455201+00:00 host filter_instance1[1394]: info s=44pnhtdtkf m=1 x=44pnhtdtkf-1 mod=mimelint cmd=getlint warn=0
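One hedged way to pull those values together is to extract them with rex and group on the shared `s=` session id. This is only a sketch: the index/sourcetype in the first line and the output field names are placeholders, not from the actual environment.

```
index=proofpoint sourcetype=pps_filter_log
| rex "cmd=env_from value=(?<sender>\S+)"
| rex "cmd=env_rcpt r=\d+ value=(?<recipient>\S+)"
| rex "\bip=(?<src_ip>\d{1,3}(?:\.\d{1,3}){3})"
| rex "\bs=(?<session_id>\S+)"
| stats values(sender) AS sender values(recipient) AS recipients values(src_ip) AS ip BY session_id
```

Since the sender, recipients, and IP arrive in different events of the same message, grouping by `session_id` with `stats values(...)` collapses them into one row per message.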
I finally fixed this. I had to check to make sure the samples folder was in the correct parent folder, "destinations". I then copied all the sample files (pulled from the Splunkbase website) over to the splunk/etc/apps/destinations/samples folder, then copied the eventgen.conf file over to the correct "local" folder in the Destinations folder and restarted Splunk. When I pulled up index="main", it populated data. Hope this helps; it took me a while to get this right.