All Posts


Append uses a subsearch, and subsearches have their limits. That's one thing. Most probably, especially since you're doing a lot of funky stuff like sorting, your subsearch simply takes too much time and is silently finalized. Another thing: those searches are probably suboptimal (I don't know your data, but they don't seem right in some places). I'm always cautious if I see dedup and that many sorts. Also, you're listing a bunch of fields:

| fields statistic_id value group_name location

and then using fields that aren't listed (and aren't default fields like _raw and _time):

| eval _virtual_=if(isnull(virtual), "N", "Y"), _cd_=replace(_cd, ".*:", "")

And you must not use field names beginning with an underscore for your own fields; they are reserved for Splunk's internal fields.
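For illustration, a minimal sketch of that fragment with non-reserved names (virtual_flag and cd_suffix are hypothetical names I've picked, not anything from the original search):

| fields statistic_id value group_name location virtual
| eval virtual_flag=if(isnull(virtual), "N", "Y"), cd_suffix=replace(_cd, ".*:", "")

The eval logic is unchanged; only the output field names change, so they no longer collide with Splunk's internal field namespace.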
punct is (if it's generated, because its creation can be disabled) an indexed field like any other, so you can use it. But the question is what you mean by "conditional" extraction.
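For example, since punct is indexed you can filter on it directly at search time (the index and pattern here are hypothetical):

index=web punct="--_*::_*"

and then apply further extractions only to events matching that punctuation shape.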
Your AIX version seems to be supported, so generally it should run. The upgrade procedure is no rocket science - https://help.splunk.com/en/splunk-enterprise/forward-and-process-data/universal-forwarder-manual/9.3/upgrade-or-uninstall-the-universal-forwarder/upgrade-the-universal-forwarder#upgrade-a-single-forwarder-0 One caveat though - you must use GNU tar for extracting the installation archive, not AIX tar.
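Roughly, the steps look like this (a sketch only; the install path and archive name are assumptions - substitute your actual package and location):

# stop the forwarder before overwriting its files
/opt/splunkforwarder/bin/splunk stop
# extract with GNU tar (often installed as gtar on AIX), never the native AIX tar
gtar -xzf splunkforwarder-9.4.3-<build>-AIX-powerpc.tgz -C /opt
# the first start after extraction completes the migration
/opt/splunkforwarder/bin/splunk start --accept-license --answer-yes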
Hi All, I need to upgrade the Splunk Universal Forwarder (UF) on AIX 7.2 from version 8.2.9 to 9.4.3. However, after attempting the upgrade, the Splunk UF crashes immediately upon startup. Could you please provide the proper upgrade steps and let me know if there are any known limitations or compatibility issues with this upgrade? Thanks in advance for your help.
Thanks, and sorry I was not clear enough. I want to do this with props and transforms so that the fields are reusable.
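If it helps, a minimal sketch of the props/transforms approach (the stanza names, sourcetype, and regex below are hypothetical placeholders, not your actual config):

# transforms.conf
[my_field_extraction]
REGEX = status=(?<status>\w+)\s+code=(?<code>\d+)

# props.conf
[my:sourcetype]
REPORT-my_fields = my_field_extraction

A search-time REPORT extraction defined this way applies to every search against that sourcetype, which is what makes the fields reusable.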
Hi @uagraw01, I just skimmed through it. Can you try the appendcols command instead of append? Let us know if that works.
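For reference, a stripped-down sketch of the shape that suggestion takes (the "..." stand in for the two full queries quoted below):

index=si_error source=scada ... | stats count AS Scada_count
| appendcols
    [ search index=internal_statistics_1h ... | stats sum(value) AS dda_count ]
| table Scada_count dda_count

appendcols joins the subsearch's columns onto the main result rows side by side, rather than appending extra rows as append does.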
Awesome, but it shows 0 results.
Hello Splunkers!! I want to combine both queries using append, but it doesn't work; it always gives me only one section of the results. Please help me fix it.

Search 1:

(index=si_error source=scada (error_status=CAME_IN OR error_status=WENT_OUT) (_time=Null OR NOT virtual))
| fields - _raw
| fields + area, zone, equipment, element, isc_id, error, error_status, start_time
| search (area="*"), (zone="*"), (equipment="*"), (isc_id="*")
| eval _time=exact(if(isnull(start_time),'_time',max(start_time,earliest_epoch))), _virtual_=if(isnull(virtual),"N","Y"), _cd_=replace('_cd',".*:","")
| sort 0 -_time _virtual_ -"_indextime" -_cd_
| dedup isc_id error _time
| fields - _virtual_, _cd_
| fillnull value="" element
| sort 0 -_time -"_indextime"
| streamstats window=2 global=false current=true earliest(_time) AS start latest(_time) AS stop, count AS count by area zone equipment element error
| search error_status=CAME_IN
| lookup isc id AS isc_id OUTPUTNEW statistical_subject mark_code
| lookup new_ctcl_21_07.csv JoinedAttempt1 AS statistical_subject, mis_address AS error OUTPUTNEW description, operational_rate, technical_rate, alarm_severity
| fillnull value=0 technical_rate operational_rate
| fillnull value="-" alarm_severity mark_code
| eval description=coalesce(description,("Unknown text for error number " . error)), error_description=((error . "-") . description), location=((mark_code . "-") . isc_id), stop=if((count == 1),null,stop), start=exact(coalesce(start_time,'_time')), start_window=max(start,earliest_epoch), stop_window=min(stop,if((latest_epoch > now()),now(),latest_epoch)), duration=round(exact((stop_window - start_window)),3)
| fields + start, error_description, isc_id, duration, stop, mark_code, technical_rate, operational_rate, alarm_severity, area, zone, equipment
| dedup isc_id error_description start
| sort 0 start isc_id error_description asc
| eval operational_rate=(operational_rate * 100), technical_rate=(technical_rate * 100), "Start time"=strftime(start,"%d-%m-%Y %H:%M:%S"), "Stop time (within window)"=strftime(stop,"%d-%m-%Y %H:%M:%S"), "Duration (within window)"=tostring(duration,"duration")
| dedup "Start time", "Stop time (within window)", isc_id, error_description, mark_code
| search NOT error_description="*Unknown text for error*"
| search technical_rate>* AND operational_rate>* (alarm_severity="*") (mark_code="*")
| rename error_description as "Error ID", isc_id as Location, mark_code as "Mark code", technical_rate as "Technical %", operational_rate as "Operational %", alarm_severity as Severity
| lookup mordc_Av_full_assets.csv Area as area, Zone as zone, Section as equipment output TopoID
| lookup mordc_topo ID as TopoID output Description as Area
| search Area="Depalletizing, Decanting"
| stats count as Scada_count by Area
| table Scada_count

Search 2:

index=internal_statistics_1h
    [| inputlookup internal_statistics
    | where (step="Defoil and decanting" OR step="Defoil and depalletising") AND report="Throughput" AND level="step" AND measurement IN("Case")
    | fields id
    | rename id AS statistic_id]
| eval value=coalesce(value, sum_value)
| fields statistic_id value group_name location
| eval _virtual_=if(isnull(virtual), "N", "Y"), _cd_=replace(_cd, ".*:", "")
| sort 0 -_time _virtual_ -"_indextime" -_cd_
| dedup statistic_id _time group_name
| fields - _virtual_ _cd_
| lookup internal_statistics id AS statistic_id OUTPUTNEW report level step measurement
| stats sum(value) AS dda_count
Thanks for the reply. Yes, I do get counts by time, but how can I get just the VPN data that has signature="WebVPN" and action="failure"?
Hi @stavush

Splunk has not publicly committed to an OIDC GA timeline. You could try contacting Splunk Support or your Splunk account team for roadmap details under NDA. In the meantime, there is an idea already raised for this which is getting traction, so it's worth upvoting: https://ideas.splunk.com/ideas/EID-I-300

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing
Splunk rarely announces future features. We won't know it's coming until it's here. Consider going to https://ideas.splunk.com to request it.
Hi @tlopes

This is a hard limit: there is no resolution or workaround to get past it (the character limit is hardcoded) other than splitting the value into multiple fields and then concatenating the fields together during the search. Check https://splunk.my.site.com/customer/s/article/Understanding-the-Maximum-Allowable-Length-for-Indexed-Fields-in-Metric-Index for more info.

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing
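To illustrate the workaround, a minimal sketch assuming the long attribute was ingested as two hypothetical dimensions, attr_part1 and attr_part2, each under the limit:

| mstats avg(my.metric) AS avg_value WHERE index=my_metrics BY attr_part1 attr_part2
| eval full_attr = attr_part1 . attr_part2

The metric name, index, and dimension names here are placeholders; the point is only that the concatenation happens at search time with eval.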
I deleted the duplicate post for you.
I'm trying to ingest some metrics with very long attribute values and the length of "<dim_name>::<dim_value>" seems to be limited to 2048 characters - anything beyond that gets truncated. Is there a way to increase this limit?
Hello, I would like to know if there are any plans for Splunk to support OIDC (in addition to SAML). If so, is there a roadmap or estimated timeline for this support? Thank you
@livehybrid Thanks for pointing it out. Yes, that applies to classic only.

@Sudhagar As @livehybrid mentioned, you can use row.event_id.value (for a specific field's value), value (for any clicked value), or name (the field name of the clicked value) in Dashboard Studio:

"options": {
  "tokens": [
    {
      "key": "value",
      "token": "eventid"
    }
  ]
}
Hi @hl

If you're getting 0 results from that query but you are getting results in the pivot, then it sounds like one of the fields you are using for the filter is not quite right. I think the correct query should be:

| tstats count from datamodel=Network_Sessions.All_Sessions where nodename=All_Sessions.VPN All_Sessions.action=failure All_Sessions.signature="WebVPN" by _time span=1h

The action and signature fields belong to "All_Sessions".

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing
Hi @fongpen

This is interesting, as the 201 response and the fact that the ticket is actually created suggest that what is being sent to the API is correct; it's just that the API returns something invalid which the TA cannot process, either a non-JSON string or an invalid JSON string. I think at this point it will need to go to the app developers to investigate, unless you want to start tweaking the script yourself to output the exact response from the API.

Since the app is supported by Splunk, I think the best action would be to log a support case and explain the actions you've been through already, providing them the logs that you've shared here; hopefully they can pass it to their dev team to investigate and remediate. You can log a support case via https://www.splunk.com/support

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing
Here is a full working example for you to try:
Here is a full working example for your to try:   { "title": "TokenSet", "description": "", "inputs": { "input_global_trp": { "options": { "defaultValue": "-24h@h,now", "token": "global_time" }, "title": "Global Time Range", "type": "input.timerange" } }, "defaults": { "dataSources": { "ds.search": { "options": { "queryParameters": { "earliest": "$global_time.earliest$", "latest": "$global_time.latest$" } } } }, "visualizations": { "global": { "showProgressBar": true } } }, "visualizations": { "viz_6cm12FqM": { "options": { "markdown": "eventid: $eventid$" }, "type": "splunk.markdown" }, "viz_Fba9zdzF": { "dataSources": { "primary": "ds_60Uo5lG8" }, "eventHandlers": [ { "options": { "tokens": [ { "key": "row.event_id.value", "token": "eventid" } ] }, "type": "drilldown.setToken" } ], "options": {}, "type": "splunk.table" } }, "dataSources": { "ds_60Uo5lG8": { "name": "Search_1", "options": { "query": "| makeresults count=5 \n| streamstats count\n| eval msg=\"Test message\".tostring(count)\n| eval event_id=md5(msg)" }, "type": "ds.search" } }, "layout": { "globalInputs": [ "input_global_trp" ], "layoutDefinitions": { "layout_1": { "options": { "display": "auto", "height": 960, "width": 1440 }, "structure": [ { "item": "viz_6cm12FqM", "position": { "h": 40, "w": 360, "x": 20, "y": 20 }, "type": "block" }, { "item": "viz_Fba9zdzF", "position": { "h": 120, "w": 1360, "x": 20, "y": 60 }, "type": "block" } ], "type": "absolute" } }, "options": {}, "tabs": { "items": [ { "label": "New tab", "layoutId": "layout_1" } ] } } }  Did this answer help you? If so, please consider: Adding karma to show it was useful Marking it as the solution if it resolved your issue Commenting if you need any clarification Your feedback encourages the volunteers in this community to continue contributing
Hi @Sudhagar

The issue here is your "token" field value - it should be either key, value, or row.<fieldName>.value, such as:

"eventHandlers": [
  {
    "options": {
      "tokens": [
        {
          "key": "row.event_id.value",
          "token": "eventid"
        }
      ]
    },
    "type": "drilldown.setToken"
  }
],

NOT just the field name as you have in your example.

@PrewinThomas Regarding $click.value$ - doesn't this only apply to classic XML dashboards?

Did this answer help you? If so, please consider:
- Adding karma to show it was useful
- Marking it as the solution if it resolved your issue
- Commenting if you need any clarification

Your feedback encourages the volunteers in this community to continue contributing