All Posts


FYI, it looks like the dashboards are now working. Changing the datamodel at every step and adding the index reference fixed the issue. Thanks for the help!
I modified @uagraw01's answer by moving your SPL inside a </> code block in the editor. That way it's much more readable. Can you check that it is still correct and that nothing is missing? In future, could you use the </> block whenever you add SPL, dashboards, etc.? With it we can be sure that what we see is what you actually wrote, not something the editor has changed!
I am facing the same issue. Has anyone solved it?
Hi @ITWhisperer, it's hard for me to share complete events. FYI: the Area, zone, and equipment fields come from the index search, while the other fields come through lookups.
@tej57 Hi Tejas, thanks for the answer, but I don't want ChatGPT or any other AI-prompted answers.
Hi @gcusello

| datamodel Mmm_availability adapto_shuttle_alarm flat
| lookup Alarm_list_adapto_details.csv ERROR_ID as ALARMID OUTPUTNEW DESCRIPTION Max_duration Min_duration OPERATIONAL TECHNICAL
| rename Max_duration as MAX_DURATION Min_duration as MIN_DURATION DESCRIPTION as description ID as id
| eval matchField = BLOCK."".SHUTTLEID
| lookup mordc_topo modified_field as matchField OUTPUTNEW ID "Parent Description" as Name Asas as asas Weight as Weight modified_field as isc_id
| table _time ID ALARMID description OPERATIONAL TECHNICAL Name Weight asas isc_id event_time MAX_DURATION MIN_DURATION state
| append
    [ search index=mess sourcetype="EquipmentEventReport" "EquipmentEventReport.EquipmentEvent.Detail.State" IN("CAME_IN","WENT_OUT")
    | spath input=_raw path=EquipmentEventReport.EquipmentEvent.ID.Location.PhysicalLocation.AreaID output=area
    | spath input=_raw path=EquipmentEventReport.EquipmentEvent.ID.Location.PhysicalLocation.ZoneID output=zone
    | spath input=_raw path=EquipmentEventReport.EquipmentEvent.ID.Location.PhysicalLocation.EquipmentID output=equipment
    | search area=*
    | dedup _raw
    | lookup mordc_site_specific_scada_alarms.csv MsgNr as MsgNr OUTPUTNEW Alarmtext Functiongroup
    | eval zone=if(len(zone)==1,"0".zone,zone), equipment=if(len(equipment)==1,"0".equipment,equipment)
    | eval isc_id=area.".".zone.".".equipment
    | fields _time, isc_id, area, zone, equipment start_time element error error_status description event_time state MsgNr Alarmtext Functiongroup alarm_severity
    | fields - _raw, Alarmtext Functiongroup, MsgNr
    | lookup isc id AS isc_id OUTPUTNEW statistical_subject mark_code
    | eval statistical_subject = trim(statistical_subject)
    | lookup Alarm_list_details_scada_mordc.csv component_type_id AS statistical_subject OUTPUTNEW operational_rate technical_rate maximum_duration minimum_duration alarm_severity
    | search alarm_severity IN ("High", "Medium")
    | lookup mordc_topology.csv modified_field as isc_id OUTPUTNEW ID "Description" as Name Asas as asas Weight as Weight
    | rename operational_rate as OPERATIONAL technical_rate as TECHNICAL maximum_duration as MAX_DURATION minimum_duration as MIN_DURATION
    | table _time ID description OPERATIONAL TECHNICAL Name Weight asas isc_id event_time MAX_DURATION MIN_DURATION state Alarmtext Functiongroup]

This gives the multivalue field results below.
Hello, I had set up a distributed search environment in VirtualBox with a search head, an indexer, and a deployment server. Initially the forwarders were showing up in the deployment server as having phoned home, but after restarting I see that no clients come up in the DS; instead they show up in the search head's Forwarder Management. I checked deploymentclient.conf and the IP points to the deployment server. I tried removing the deployment-apps folder on the search head and restarting, but I think because it's in distributed search mode the folder is automatically recreated.
Thanks @livehybrid. So Splunk should then parse fields correctly for add-ons? Do you mean _raw will be the original event from the source host, sent to the targeted index/sourcetype?
Hi @splunkreal  Are you using Splunk Connect for Kafka? If so, you should be able to set it to use the raw HEC endpoint:

"splunk.hec.raw" : "true",

For more info check out https://help.splunk.com/en/splunk-cloud-platform/get-data-in/splunk-connect-for-kafka/2.2/configure/configuration-examples-for-splunk-connect-for-kafka

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification
Your feedback encourages the volunteers in this community to continue contributing
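For reference, a minimal sink connector configuration with the raw endpoint enabled might look like the sketch below. The connector name, topic, HEC URI, and token are placeholders invented for the example, so adjust them to your environment before use:

```
{
  "name": "splunk-sink-raw",
  "config": {
    "connector.class": "com.splunk.kafka.connect.SplunkSinkConnector",
    "topics": "my-topic",
    "splunk.hec.uri": "https://my-hf.example.com:8088",
    "splunk.hec.token": "<HEC_TOKEN>",
    "splunk.hec.raw": "true"
  }
}
```

With splunk.hec.raw set to true, events are sent to /services/collector/raw, so the HF's props/transforms for the sourcetype (e.g. from an add-on) can parse them as if they arrived as plain data.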
Hi @dinesh001kumar  To remove any uncertainty, it *is* possible to have custom CSS/JS within apps in Splunk Cloud; however, I don't think you can upload the files via the Splunk Cloud UI. Instead you will need to package the files within your custom app in the <app_name>/appserver/static directory. Once packaged and uploaded to Splunk Cloud, this should work the same as you may have previously used on-premises with Splunk Enterprise. For more information check out https://docs.splunk.com/Documentation/SplunkCloud/latest/AdvancedDev/UseCSS  As @bowesmana stated, if you are on a Victoria Experience Splunk Cloud stack then your app should sail through AppInspect without having to be manually inspected.
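As a sketch of the packaging layout described above (the app and file names here are just examples, not anything from your environment):

```
my_custom_app/
├── appserver/
│   └── static/
│       ├── dashboard.css
│       └── dashboard.js
└── default/
    ├── app.conf
    └── data/ui/views/my_dashboard.xml
```

A Simple XML dashboard inside the app can then reference the files via its root element, e.g. <dashboard script="dashboard.js" stylesheet="dashboard.css">, and Splunk serves them from appserver/static automatically.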
Hello, is it possible with Splunk HEC from Kafka to receive raw events on a HF in order to parse fields with add-ons? It seems we can only receive JSON data with an "event" field, and may not be able to extract fields with standard add-ons? The HEC event may also contain the target index and sourcetype. Thanks.
Hi @SN1  I would start by adding a console.log(rowKey); and another one after searchQuery - console.log(searchQuery); - and then validate that these output what you expect. Can you check whether this prints what you are expecting and let us know how you get on, as this might help drill down further?
Hi @Emre  Try the following:

| eval message=json_extract(_raw,"message")
| spath input=message
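To see json_extract and spath working together, here is a self-contained sketch you can paste into a search bar; the sample event and its field names (service, level, code) are invented purely for illustration:

```
| makeresults
| eval _raw="{\"service\":\"api\",\"message\":{\"level\":\"ERROR\",\"code\":42}}"
| eval message=json_extract(_raw,"message")
| spath input=message
| table level code
```

json_extract pulls out the nested "message" object as a JSON string, and spath then extracts its keys (level, code) as ordinary fields.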
As you have used the depends attribute on your input, I'm quite sure that this is not a capability issue, but rather an undefined token. The easiest way to debug tokens is to use the simple_xml_examples app and add this to your dashboard: <form version="1.1" theme="light" script="simple_xml_examples:showtokens.js"> After this addition, you will see all used tokens and their values at the bottom of the page. https://splunkbase.splunk.com/app/1603  There is also a Helsinki UG presentation on how to create and debug SXML dashboards: https://data-findings.com/wp-content/uploads/2024/09/HSUG-20240903-Tiia-Ojares.pdf
Hi @uagraw01  I'm not sure how you ended up with this list, and there may be a better way, but if you really do need to split these back out then the following snippet should work. It combines the fields into a single JSON string so you can use mvexpand once.

| foreach FunctionGroup mode=multivalue
    [ | eval json=mvappend(json, "{".
        "\"FunctionGroup\":\"".mvindex(FunctionGroup,<<ITER>>)."\",".
        "\"MsgNr\":\"".mvindex(MsgNr,<<ITER>>)."\",".
        "\"alarm_severity\":\"".mvindex(alarm_severity,<<ITER>>)."\",".
        "\"area\":\"".mvindex(area,<<ITER>>)."\",".
        "\"equipment\":\"".mvindex(equipment,<<ITER>>)."\"".
        "}" ) ]
| mvexpand json
| eval _raw=json
| fields _raw
| spath

I've tested it as best I can.
You have a depends attribute on this input, perhaps these tokens are not defined for the user?
Perhaps go back a step - how did you get these multivalue fields in the first place? Can you separate into events prior to this? What does your raw data look like?
Hello, this query seems to be working, but the clients field is a multivalue field for some sourcetypes, so its results are spread out. Can you modify it?
Hey @uagraw01, the mvexpand command works with only one field, and if you use it multiple times it will produce n duplicate rows. Here's what I used in one scenario: combine all the multivalue fields using mvzip and a delimiter. In your case, that would be equivalent to:

| eval combined_multivalue_field = mvzip(mvzip(FunctionGroup, MsgNr, "|"), mvzip(alarm_severity, area, "|"), "|")

This will give you one single column with all the field values separated by a pipe (|) delimiter. You can then use mvexpand on combined_multivalue_field. Then, if you want to use the individual values, you can use string functions to separate out each value you need and continue with the SPL query.

Thanks, Tejas.

--- If the above solution helps, an upvote is appreciated.
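The steps above can be put together as a runnable sketch; the sample values created with split() are made up for the demo, and split()/mvindex() are used to recover the individual fields after mvexpand:

```
| makeresults
| eval FunctionGroup=split("FG1,FG2",","), MsgNr=split("100,200",","), alarm_severity=split("High,Medium",","), area=split("A,B",",")
| eval combined_multivalue_field = mvzip(mvzip(FunctionGroup, MsgNr, "|"), mvzip(alarm_severity, area, "|"), "|")
| mvexpand combined_multivalue_field
| eval parts=split(combined_multivalue_field,"|")
| eval FunctionGroup=mvindex(parts,0), MsgNr=mvindex(parts,1), alarm_severity=mvindex(parts,2), area=mvindex(parts,3)
| table FunctionGroup MsgNr alarm_severity area
```

Each row after mvexpand carries one aligned tuple of values (e.g. FG1|100|High|A), which avoids the duplicate-row explosion you get from running mvexpand on each field separately.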
Try like this

<change>
  <condition value="A">
    <set token="T1">myRegex1(X)</set>
    <set token="T2">myRegex2(Y)</set>
  </condition>
</change>