All Posts

Hi, is there any possibility to convert these info event details into email actions? As a workaround for now, please suggest how we can send the custom event data below to the email action.
Hi, if there is no real reason to add it in the ingest phase, you should use @ITWhisperer 's example. But if you really need it at ingest time, then you can look at e.g. https://community.splunk.com/t5/Getting-Data-In/How-to-apply-source-file-date-using-INGEST-as-Time/m-p/596865 and https://community.splunk.com/t5/Getting-Data-In/How-to-get-props-and-transforms-to-extract-time-from-source/td-p/641795 for how to use e.g. INGEST_EVAL to manipulate events in the ingest phase. r. Ismo
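To make the INGEST_EVAL approach concrete, here is a minimal sketch in the spirit of those threads; the sourcetype name, stanza name, and filename pattern are all illustrative assumptions, not taken from this thread:

```
# props.conf -- "my_sourcetype" is a placeholder
[my_sourcetype]
TRANSFORMS-set-time-from-source = set_time_from_source

# transforms.conf
[set_time_from_source]
# Assumed filename pattern like app_20240131.log: pull the 8-digit date
# out of the source path and use it as the event time at ingest.
INGEST_EVAL = _time=strptime(replace(source, ".*_(\d{8})\.log$", "\1"), "%Y%m%d")
```

This runs on the indexer (or heavy forwarder), so it must be deployed there rather than on the search head.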
Hi @jrodriguezap , could you share your solution please?
For anomaly detection, you should consider using the MLTK. Otherwise, you need to fashion a report which detects anomalies and use that for triggering your alert. In order to do this, you need to be able to define what an anomaly looks like, so that you can instruct Splunk to find them for you.
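As one example of "defining what an anomaly looks like" without MLTK, here is a sketch using a rolling three-sigma rule; the index, span, window, and threshold are all assumptions you would tune for your data:

```
index=my_index
| timechart span=1h count
| streamstats window=24 avg(count) AS avg_count stdev(count) AS stdev_count
| eval is_anomaly=if(abs(count - avg_count) > 3 * stdev_count, 1, 0)
| where is_anomaly=1
```

Saved as an alert that triggers when the number of results is greater than zero, this would fire whenever the hourly event count deviates sharply from the previous day's average.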
Firstly, since log seems to contain JSON, why not use spath with input=log to extract the fields? Secondly, there is no need to search for your fields equal to "*" (which presumably you are doing to remove events with null values for these fields?), as the dedup will do this for you. Thirdly, perhaps you should consider just converting the start to an epoch time with strptime() as you have already done, then use timechart span=1d. Finally, this might have been easier to answer if you had provided some anonymised sample events so we can see what you are working with.
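Putting those points together, a sketch might look like the following; the field names (log, start, id) and the timestamp format are assumptions based on the description, since no sample events were provided:

```
<your_search>
| spath input=log
| eval _time=strptime(start, "%Y-%m-%d %H:%M:%S")
| dedup id
| timechart span=1d count
```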
I am trying to do a box plot using viz, but I can see a "trace 0" data graph in the box plot (I don't have any data called "trace 0"). This is my code:

<row>
  <panel>
    <viz type="splunk_plotly_collection_viz.boxplot">
      <search>
        <query> ..... | eval total_time=case(time<= 8, "8", time<= 9, "8~9", time<= 10, "9~10", time<= 11, "10~11", time<= 15, "11~15", time<= 20, "15~20") | table total_time init_dt </query>
      </search>
      <option name="drilldown">all</option>
      <option name="refresh.display">progressbar</option>
      <option name="trellis.enabled">0</option>
    </viz>
  </panel>
</row>

and this is the current state of my graph. How can I delete "trace 0" from the graph?
Check your events in Splunk - there is a Splunk-provided field called source which holds the name of the file the event came from. Can you use this to extract the data you want?

| eval some_run_id=mvindex(split(source,"/"),5)
Simply look at the source of all your dashboards, reports, alerts, macros, etc. to see if the index is used.
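One way to automate that check is to search the REST endpoints for your knowledge objects; here my_index is a placeholder for the index you are checking, and the same pattern applies to saved searches (which cover reports and alerts):

```
| rest /servicesNS/-/-/data/ui/views
| fields title eai:acl.app eai:data
| search eai:data="*my_index*"

| rest /servicesNS/-/-/saved/searches
| fields title eai:acl.app search
| search search="*my_index*"
```

Note this only finds literal mentions; an index reached indirectly (e.g. through a macro or event type) needs those objects checked too.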
It is not clear how you arrived at your current state, and it might be easier to solve with some sight of your current search and events (as @gcusello has indicated); however, assuming you still want to go forward from where you seem to be, you could try something like this:

| eval row=mvrange(0,mvcount(child_Name))
| mvexpand row
| foreach child_Name direction dv_u_parent_class parent
    [| eval <<FIELD>>=mvindex(<<FIELD>>,row)]
| fields - row
Hi @RSS_STT, probably there are more combinations of your values, not only 4. You have two solutions: use fewer fields as keys in the stats command, but then you'll have some fields with multivalues; otherwise you should identify some rules to filter your results. Anyway, the only way to have only one value in a stats command is to put it in the BY clause. There's also another solution, but in this way you lose some results: instead of values, you could use the first option, taking only one value for each. I don't know (but I don't think) that this is acceptable for you! Ciao. Giuseppe
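As a sketch of that last option (field names taken from elsewhere in this thread), first() keeps only one value per group where values() would keep them all, so some values are discarded:

```
<your-search>
| stats first(fqdn_name) AS fqdn_name first(name) AS name BY child child_Name dv_u_parent_class direction parent
```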
Hi @RSS_STT , good for you, see you next time! Ciao and happy splunking. Giuseppe P.S.: Karma Points are appreciated
Yes, I was trying the first query you shared previously, but the second query you shared is also creating 96 records where I'm expecting only 4 records.
I have a field whose values are either only numbers or a combination of numbers and special characters. I would like to filter for the field values where both numbers and special characters are present. Example:

Log 1 -> field1="238_345$345"
Log 2 -> field1="+739-8883"
Log 3 -> field1="542.789#298"

I have already tried writing a regex query, but I could not find an expression to filter for the combination of digits & special characters (no expression to match all the special characters). How can I filter and display the field values which have a combination of numbers and special characters? Could anyone help me with this?
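One possible approach in SPL, assuming "special character" means anything that is not a letter, digit, or whitespace, is a pair of lookaheads requiring at least one digit and at least one special character:

```
<your_search>
| regex field1="(?=.*\d)(?=.*[^A-Za-z0-9\s])"
```

All three sample values above would match, while a value consisting only of digits would not. If underscore should count as special, this character class already covers it.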
Hi team, Thank you for your support. The problem was solved when I changed the command by typing hostname instead of IP.
Hi @RSS_STT, I suppose that your search is something like this:

<your-search>
| stats values(child_Name) AS child_Name values(dv_u_parent_class) AS dv_u_parent_class values(fqdn_name) AS fqdn_name values(direction) AS direction values(name) AS name values(parent) AS parent BY child

You should try something like this:

<your-search>
| stats values(fqdn_name) AS fqdn_name values(name) AS name BY child child_Name dv_u_parent_class direction parent

I could be more detailed if you can share your search. Ciao. Giuseppe
Trying to expand the multivalue field with a one-to-one mapping as shown in the image. mvexpand creates multiple rows with all matching column values. Actual data with multivalue:

child | child_Name | dv_class | n_name | direction | name | parent
55555 |            |          |        |           |      |
Hi @vishenps, your question is just a little vague. Let me understand: you have many indexes in your company and you want to create an app that grants access to some of them (2 or 3) only to a restricted set of users, is this correct? If this is your requirement, you could create a role for those users that can access only the selected indexes. Then you can create some dashboards for those users or use the other dashboards; either way, they will see only the data contained in the granted indexes. Ciao. Giuseppe
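A minimal sketch of such a role in authorize.conf; the role and index names here are placeholders, not taken from the question:

```
# authorize.conf on the search head -- names are illustrative
[role_restricted_app_users]
importRoles = user
srchIndexesAllowed = index_a;index_b
srchIndexesDefault = index_a;index_b
```

Users assigned this role can search only the listed indexes, regardless of which dashboards they open.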
Hi @kyokei, use the Add Data function, which you can find at [Settings > Add Data], which guides you through the correct sourcetype configuration. For the timestamp, if you want to use the Trigger Time as the timestamp, you could use:

[your_sourcetype]
TIME_PREFIX = \"\Trigger Time\",\"\'
TIME_FORMAT = %y-%m-%d %H:%M:%S.%3N

Ciao. Giuseppe
Hi @toporagno, I could be more detailed if you share the inputs.conf from the forwarder; anyway, you should have something like this:

[monitor://C:\my_directory\my_file.log]
index=my_index
sourcetype=my_sourcetype
disabled=0

As I said, if you have few Universal Forwarders, you can manually modify all the inputs.conf files; if there are many, you should consider using a Deployment Server, but that is another project. For more info about the DS, see https://docs.splunk.com/Documentation/Splunk/9.1.2/Updating/Aboutdeploymentserver Ciao. Giuseppe
I'm trying to find a reference or documentation that shows which fields in Sysmon logs should be mapped to which fields in the Endpoint datamodel. For example, Image & ParentImage: which Endpoint datamodel fields should they map to? Since there are multiple fields for processes and parent processes, it is confusing.