All Posts


I'm trying to create a box plot using a viz, but the box plot shows a series called "trace 0". (I don't have any data called "trace 0".) This is my code:

<row>
  <panel>
    <viz type="splunk_plotly_collection_viz.boxplot">
      <search>
        <query>..... | eval total_time=case(time<= 8, "8", time<= 9, "8~9", time<= 10, "9~10", time<= 11, "10~11", time<= 15, "11~15", time<= 20, "15~20") | table total_time init_dt</query>
      </search>
      <option name="drilldown">all</option>
      <option name="refresh.display">progressbar</option>
      <option name="trellis.enabled">0</option>
    </viz>
  </panel>
</row>

and this is the current state of my graph. How can I remove "trace 0" from the graph?
Check your events in Splunk - there is a Splunk-provided field called source which holds the name of the file the event came from. Can you use this to extract the data you want?

| eval some_run_id=mvindex(split(source,"/"),5)
Simply look at the source of all your dashboards, reports, alerts, macros, etc. to see if the index is used.
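If you want to search programmatically rather than by eye, a rough sketch using the REST endpoint for saved searches might look like this (your_index is a placeholder; this only covers saved searches, and you need permission to read other users' knowledge objects):

```spl
| rest /servicesNS/-/-/saved/searches
| search search="*index=your_index*"
| table eai:acl.app title search
```

You would still need to check dashboard XML and macro definitions separately, since inline dashboard searches are not saved searches.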
It is not clear how you arrived at your current state, and it might be easier to solve with some sight of your current search and events (as @gcusello has indicated). However, assuming you still want to go forward from where you seem to be, you could try something like this:

| eval row=mvrange(0,mvcount(child_Name))
| mvexpand row
| foreach child_Name direction dv_u_parent_class parent
    [| eval <<FIELD>>=mvindex(<<FIELD>>,row)]
| fields - row
Hi @RSS_STT, probably there are more combinations of your values, not only 4. You have two solutions: use fewer fields as keys in the stats command, but then you'll have some fields with multiple values; otherwise you should identify some rules to filter your results. Anyway, the only way to have exactly one value per group in a stats command is to put the field in the BY clause. There's also another solution, but this way you lose some results: instead of values(), you could use first(), taking only one value for each field. I don't know (and I don't think) that this is acceptable for you! Ciao. Giuseppe
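As a minimal sketch of that first() variant (field names taken from the other posts in this thread; your actual search will differ):

```spl
<your-search>
| stats first(child_Name) AS child_Name first(dv_u_parent_class) AS dv_u_parent_class
        first(fqdn_name) AS fqdn_name first(direction) AS direction
        first(name) AS name first(parent) AS parent
        BY child
```

This guarantees one row per child, at the cost of silently discarding all but one value of each multivalued field.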
Hi @RSS_STT, good for you, see you next time! Ciao and happy splunking Giuseppe P.S.: Karma Points are appreciated
Yes, I was trying the first query you shared previously, but the second query you shared is also creating 96 records where I'm expecting only 4 records.
I have a field whose values are either numbers only, or a combination of numbers and special characters. I would like to filter for the field values where both numbers and special characters are present. Example:

Log 1 -> field1="238_345$345"
Log 2 -> field1="+739-8883"
Log 3 -> field1="542.789#298"

I have already tried writing a regex query, but I could not find an expression to match the combination of digits and special characters (no expression to match all the special characters). How can I filter and display the field values which have a combination of numbers and special characters? Could anyone help me with this?
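One possible approach (a sketch, not from the thread; it treats "special character" as anything that is neither a letter nor a digit) is to use two lookaheads, requiring at least one digit and at least one non-alphanumeric character:

```spl
<your-search>
| regex field1="^(?=.*\d)(?=.*[^a-zA-Z0-9]).*$"
| table field1
```

The regex command uses PCRE, so lookaheads are supported; adjust the character class if characters such as underscore should or should not count as "special" for your data.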
Hi team, Thank you for your support. The problem was solved when I changed the command by typing hostname instead of IP.
Hi @RSS_STT, I suppose that your search is something like this:

<your-search>
| stats values(child_Name) AS child_Name values(dv_u_parent_class) AS dv_u_parent_class values(fqdn_name) AS fqdn_name values(direction) AS direction values(name) AS name values(parent) AS parent BY child

You should try something like this:

<your-search>
| stats values(fqdn_name) AS fqdn_name values(name) AS name BY child child_Name dv_u_parent_class direction parent

I could be more detailed if you can share your search. Ciao. Giuseppe
Trying to expand the multivalue fields with one-to-one mapping, as shown in the image. mvexpand creates multiple rows with all combinations of the column values. Actual data with multivalue fields:

child  child_Name  dv_class  n_name  direction  name  parent
55555  (multivalue columns not reproduced here)
Hi @vishenps, your question is just a little vague. Let me check my understanding: you have many indexes in your company and you want to create an app that permits access to only some of them (2 or 3) for a restricted set of users, is this correct? If this is your requirement, you could create a role for those users that can access only the selected indexes. Then you can create some dashboards for those users, or use the other dashboards; either way, they will see only the data contained in the indexes granted to them. Ciao. Giuseppe
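As a minimal sketch (the role and index names are placeholders, not from the thread), such a role could be defined in authorize.conf roughly like this:

```spl
[role_restricted_viewers]
importRoles = user
srchIndexesAllowed = index_a;index_b
srchIndexesDefault = index_a
```

srchIndexesAllowed limits which indexes the role can search at all, and srchIndexesDefault sets what is searched when no index= is specified; the same role can usually also be created through Settings > Roles in Splunk Web.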
Hi @kyokei, use the Add Data function, which you can find at [Settings > Add Data]; it guides you through the correct sourcetype configuration. For the timestamp, if you want to use the Trigger Time as the timestamp, you could use something like:

[your_sourcetype]
TIME_PREFIX = \"Trigger Time\",\"'
TIME_FORMAT = %y-%m-%d %H:%M:%S.%3N

Ciao. Giuseppe
Hi @toporagno, I could be more detailed if you share the inputs.conf from your forwarder. Anyway, you should have something like this:

[monitor://C:\my_directory\my_file.log]
index = my_index
sourcetype = my_sourcetype
disabled = 0

As I said, if you have few Universal Forwarders, you can manually modify all the inputs.conf files; if there are many, you should think about using a Deployment Server, but that is another project. For more info about the DS, see https://docs.splunk.com/Documentation/Splunk/9.1.2/Updating/Aboutdeploymentserver Ciao. Giuseppe
I'm trying to find a reference or documentation that shows which fields in Sysmon logs should be mapped to which fields in the Endpoint datamodel. For example, Image and ParentImage: in which Endpoint datamodel fields should they appear? Since we have multiple fields for processes and parent processes, it is confusing.
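For illustration only (this is a sketch of a commonly used mapping, not an authoritative reference - verify it against the CIM documentation and the Sysmon add-on you have installed): Image typically corresponds to the process path/name fields, and ParentImage to the parent_process equivalents, e.g.:

```spl
<your sysmon search>
| rename Image AS process_path ParentImage AS parent_process_path
| eval process_name=mvindex(split(process_path,"\\"),-1)
| eval parent_process_name=mvindex(split(parent_process_path,"\\"),-1)
```

In practice you would normally rely on the field aliases and extractions shipped with the Splunk Add-on for Sysmon rather than hand-rolling these evals.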
Hi, we have a client looking to ingest application (CareMonitor) logs from an S3 bucket using a web method. We have not fetched logs via a web method before. Could you please let me know the process for using this method, and any best practices? We are currently using Splunk Cloud Platform, and CareMonitor looks to be a cloud-based application.
How do I start the Elasticsearch process? It's not mentioned in the AppDynamics documentation's Events Service installation section.
I have a simple question: how can I check in which apps a particular index has been used?
How do I ingest the following file based on trigger time and elapsed time?

"File name","AUTO_231126_012051_0329.CSV","V2.10"
"Title comment","T1"
"Trigger Time","'23-11-26 01:20:51.500"
"CH","U1-2","Event"
"Mode","Voltage"
"Range","200mV"
"UnitID",""
"Comment",""
"Scaling","ON"
"Ratio","+1.00000E+02"
"Offset","+0.00000E+00"
"Time","U1-2[]","Event"
+0.000000000E+00,+2.90500E+00,0
+1.000000000E-01,+1.45180E+01,0
+2.000000000E-01,+7.93600E+00,0
+3.000000000E-01,+3.60100E+00,0
+4.000000000E-01,+3.19100E+00,0
+5.000000000E-01,+3.17300E+00,0
+6.000000000E-01,+3.17300E+00,0
+7.000000000E-01,+3.18400E+00,0
+8.000000000E-01,+3.19400E+00,0
+9.000000000E-01,+3.16500E+00,0
+1.000000000E+00,+3.16000E+00,0