All Posts


You have to use earliest and latest to get the oldest and most recent event.
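For example (the index, sourcetype, and field names here are made up for illustration), the time-based stats functions earliest() and latest() pick values by each event's _time, unlike first() and last(), which depend on the order in which results arrive:

```spl
index=my_index sourcetype=my_sourcetype
| stats earliest(status) AS oldest_status latest(status) AS newest_status BY host
```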
Hi @susheelpatil1, did you resolve the issue? I'm new to Splunk and have the same error.
Please share the SPL query and some sample data. Has the data format changed in the meantime?
Okay, that's weird. Just for verification, please execute the btool command that nmohammed provided and share the output with us.
To add: the only way back was rebooting the machine; after that it all worked fine.
Hi, I am currently working on ticket reporting. Each ticket has a lastUpdateDate field which gets updated multiple times, leading to duplicates. I only need the first and the latest lastUpdateDate: the first to determine when the ticket entered the pipe, and the latest to see whether changes were made in the specific period range of the report. I tried using | stats first(_raw) as first_entry last(_raw) as last_entry by ticket_id, but it shows me the same lastUpdateDate for both. I have read that I should use min and max, but I don't get results from that either. Thanks in advance for any hints and tips!
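One possible approach, sketched here with an invented index name and an assumed timestamp format (adjust strptime to match your data): parse lastUpdateDate into epoch time, then take min and max per ticket, so the result no longer depends on event order:

```spl
index=tickets
| eval update_epoch=strptime(lastUpdateDate, "%Y-%m-%d %H:%M:%S")
| stats min(update_epoch) AS first_update max(update_epoch) AS last_update BY ticket_id
| eval first_update=strftime(first_update, "%F %T"), last_update=strftime(last_update, "%F %T")
```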
Are you asking whether you can do this on egress in Azure, or are you trying to do the equivalent on ingress in Splunk? You can do filtering on input; if you use ingest-evals, you can even use lookups (but not in Splunk Cloud).
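A rough sketch of ingest-time filtering with INGEST_EVAL (the stanza name, sourcetype, and the "heartbeat" pattern are placeholders; adjust the match to your data): events matching the pattern are routed to nullQueue and discarded before indexing:

```
# props.conf
[my_sourcetype]
TRANSFORMS-drop_noise = drop_noise

# transforms.conf
[drop_noise]
INGEST_EVAL = queue=if(match(_raw, "heartbeat"), "nullQueue", queue)
```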
If you see the option but it's greyed out, you may want to check this link: https://community.splunk.com/t5/Dashboards-Visualizations/Is-there-an-option-to-export-CSV-or-PDF-in-Splunk-Dashboard/m-p/558071/highlight/true
One caveat though - the host field might be parsed out of the raw message during ingestion. In that case you can't use it for specifying the props stanza.
Hi vr2312, thanks for your answer. Yes, I'm a Splunk admin. Regards
Hi all, I am integrating a Splunk form/dashboard with SOAR, where I use "sendtophantom" to create a container on which a playbook needs to run. However, I am noticing that when the container has multiple artifacts, the playbook takes all the artifacts' CEF fields and combines them into one, which causes havoc in my playbooks. I have considered changing the ingest settings to send MV fields as a list instead of creating new artifacts, but this would break too many other playbooks, so it isn't an option right now. My flow is basically as follows:
- A container gets created with information coming from Splunk; the artifact(s) contain subject and sender email information
- The playbook needs to run through each artifact to get the subject and sender info
- The playbook processes these values
Is there a way to specify that a playbook must run against each artifact in a container individually, or another way to alter the datapaths in the VPE to iterate through each artifact?
Hi @hv64, you may need to check whether you have the role/privilege to view that option. Are you by any chance a Splunk admin, or just an end user of Splunk?
We are currently on Splunk ES 7.3.2 (Splunk Enterprise Security), where users who used to be part of the organisation but are now deleted/disabled (in Splunk) still populate the assignee list when I try to assign new investigations to current members of the organisation. For instance: Incident Review -> Notable -> Create Investigation. In the investigation panel, when I try to assign the investigation to other members of the team, I can also see disabled/deleted accounts/users/members as options to assign the investigation to. Is there any way to stop these members from populating, so that the list of investigators reflects the current numbers we have in the team?
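As a starting point for comparing that stale assignee list against the user accounts Splunk itself still knows about, one sketch (assuming you have permission to query the REST endpoint) is:

```spl
| rest /services/authentication/users splunk_server=local
| table title realname roles
```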
Hello, why can't I see the export CSV option in my Dashboard Studio dashboard? Our Splunk version is 9.2.1. Thanks.
Hi @Iris_Pi , yes, as per the documentation: For settings that are specified in multiple categories of matching [<spec>] stanzas, [host::<host>] settings override [<sourcetype>] settings. Additionally, [source::<source>] settings override both [host::<host>] and [<sourcetype>] settings.  
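A small illustration of that precedence (the stanza names and TRUNCATE values are invented): for an event whose sourcetype, host, and source all match, the [source::...] value wins:

```
# props.conf
[my_sourcetype]
TRUNCATE = 10000

[host::web-01]
TRUNCATE = 20000      # overrides the sourcetype stanza

[source::/var/log/app.log]
TRUNCATE = 30000      # overrides both of the above
```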
Hi @salavi , at index time I'm not sure there is any other solution, unless the Splunk Edge Processor becomes available to you. Ciao. Giuseppe
Hi @NK , good for you, see you next time! Let me know if I can help you more, or please accept one answer for the other people of the Community. Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
I don't see a Splunk question here. Rather, this is an HTML question. If you can illustrate your search result of 140 and the raw data the iFrame receives (not what the iFrame renders), and show that the two are different, this can become a Splunk question. Even as an HTML question, you need to illustrate the data so others can tell you possible causes. (But this is not the best place to ask HTML questions.)
I agree with @gcusello that dedup should suffice, because dedup performs on the same principle. But before going into code, you need to define what you are looking for using data illustrations. Without such a definition, we could be talking past each other. So, assuming that you have these raw events:

_raw
1 {"timestamp":"2024-08-20 15:33:00.837000","data_type":"finding_export","domain_id":"my_domain_id","domain_name":"my_domain_name","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"my_user"}
2 {"timestamp":"2024-08-20 15:32:00.837000","data_type":"finding_export","domain_id":"your_domain_id","domain_name":"your_domain_name","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"my_user"}
3 {"timestamp":"2024-08-20 15:31:10.837000","data_type":"finding_export","domain_id":"my_domain_id","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"your_user"}
4 {"timestamp":"2024-08-20 15:31:05.837000","data_type":"finding_export","domain_id":"my_domain_id","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"my_user"}
5 {"timestamp":"2024-08-20 15:31:00.837000","data_type":"finding_export","domain_id":"my_domain_id","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"my_user"}
6 {"timestamp":"2024-08-20 15:30:00.837000","data_type":"finding_export","domain_id":"my_domain_id","domain_name":"my_domain_name","user":"my_user"}
7 {"timestamp":"2024-08-20 15:28:00.837000","data_type":"finding_export","domain_id":"my_domain_id","domain_name":"my_domain_name","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"my_user"}

Of the seven (7) events, 1 and 7 differ only in timestamp; 4 and 5 differ only in timestamp; events 3 through 6 are each missing some field or another. Is it your intention to dedup them down to five (5) events like the following?
_raw
1 {"timestamp":"2024-08-20 15:33:00.837000","data_type":"finding_export","domain_id":"my_domain_id","domain_name":"my_domain_name","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"my_user"}
2 {"timestamp":"2024-08-20 15:32:00.837000","data_type":"finding_export","domain_id":"your_domain_id","domain_name":"your_domain_name","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"my_user"}
3 {"timestamp":"2024-08-20 15:31:10.837000","data_type":"finding_export","domain_id":"my_domain_id","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"your_user"}
4 {"timestamp":"2024-08-20 15:31:05.837000","data_type":"finding_export","domain_id":"my_domain_id","path_id":"T0MarkSensitive","path_title":"My Path Title","user":"my_user"}
5 {"timestamp":"2024-08-20 15:30:00.837000","data_type":"finding_export","domain_id":"my_domain_id","domain_name":"my_domain_name","user":"my_user"}

If this is what you are looking for, there is no need to perform complicated manipulations and no need for a lookup. Just do:

index=my_index data_type=my_sourcetype earliest=-15m latest=now
| fillnull value=UNSPEC
| dedup keepempty=true data_type domain_id domain_name path_id path_title user
``` below simply restores null values, not required for dedup ```
| foreach * [eval <<FIELD>> = if(<<FIELD>> == "UNSPEC", null(), <<FIELD>>)]

Here is an emulation to produce the sample data illustrated above.
You can play with it and compare with real data:

| makeresults format=json data="
[{\"timestamp\": \"2024-08-20 15:33:00.837000\", \"data_type\": \"finding_export\", \"domain_id\": \"my_domain_id\", \"domain_name\": \"my_domain_name\", \"path_id\": \"T0MarkSensitive\", \"path_title\": \"My Path Title\", \"user\": \"my_user\"},
{\"timestamp\": \"2024-08-20 15:32:00.837000\", \"data_type\": \"finding_export\", \"domain_id\": \"your_domain_id\", \"domain_name\": \"your_domain_name\", \"path_id\": \"T0MarkSensitive\", \"path_title\": \"My Path Title\", \"user\": \"my_user\"},
{\"timestamp\": \"2024-08-20 15:31:10.837000\", \"data_type\": \"finding_export\", \"domain_id\": \"my_domain_id\", \"path_id\": \"T0MarkSensitive\", \"path_title\": \"My Path Title\", \"user\": \"your_user\"},
{\"timestamp\": \"2024-08-20 15:31:05.837000\", \"data_type\": \"finding_export\", \"domain_id\": \"my_domain_id\", \"path_id\": \"T0MarkSensitive\", \"path_title\": \"My Path Title\", \"user\": \"my_user\"},
{\"timestamp\": \"2024-08-20 15:31:00.837000\", \"data_type\": \"finding_export\", \"domain_id\": \"my_domain_id\", \"path_id\": \"T0MarkSensitive\", \"path_title\": \"My Path Title\", \"user\": \"my_user\"},
{\"timestamp\": \"2024-08-20 15:30:00.837000\", \"data_type\": \"finding_export\", \"domain_id\": \"my_domain_id\", \"domain_name\": \"my_domain_name\", \"user\": \"my_user\"},
{\"timestamp\": \"2024-08-20 15:28:00.837000\", \"data_type\": \"finding_export\", \"domain_id\": \"my_domain_id\", \"domain_name\": \"my_domain_name\", \"path_id\": \"T0MarkSensitive\", \"path_title\": \"My Path Title\", \"user\": \"my_user\"}
]"
| eval _time = strptime(timestamp, "%F %T.%6N")
``` the above emulates index=my_index data_type=my_sourcetype earliest=-15m latest=now ```
This problem is from a dashboard that comes with the default ITSI package, which I am trying to reverse-engineer and fix. How do I fix this issue? What is the ldapfilter command, and how do I fix the token issue?
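For context, ldapfilter comes from the Splunk Supporting Add-on for Active Directory (SA-ldapsearch); it enriches each search result by running an LDAP query with field values substituted into the filter. A rough sketch (the index, domain, and attribute names are placeholders, and the add-on must be installed and configured for the command to exist):

```spl
index=wineventlog
| ldapfilter domain=default search="(sAMAccountName=$user$)" attrs="mail,displayName"
```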