We have many assets with non-compliant names, which I have to fix. I need help with that, because I don't have much experience and I am not sure how to find the correct names.
Hi All, below are the sample logs: can I get props for these sample logs?

-------------------------------------------------------------
Time: 02/12/2021 01:45:05.777
Message: there is a exception error code gg456hhhrgh34567
type: application code
data: system
-------------------------------------------------------------
-------------------------------------------------------------
Time: 24/12/2021 01:45:05.777
Message: there is a exception error code 897fghj56879hgj
type: application code jobs
data: system jobs
-------------------------------------------------------------
-------------------------------------------------------------
Time: 28/12/2021 02:54:15.767
Message: there is a exception error code 89hjyt5643edhjjy656
type: application code error
data: system error
-------------------------------------------------------------
--------------------------------------
Timeline: 12/02/2021 12:44:32.667
Message Details - Application code contains error at 12/02/2021 11:30:00.212
--------------------------------------
--------------------------------------
Timeline: 23/02/2021 10:23:22.124
Message Details - Application code contains error at 12/02/2021 08:20:10.100
--------------------------------------
--------------------------------------
Timeline: 24/02/2021 10:20:12.667
Message Details - Application code contains error at 24/02/2021 07:10:23.112
--------------------------------------
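A props.conf sketch for events like these might look like the following. The sourcetype name `custom:applogs` is a placeholder, the samples appear to use DD/MM/YYYY dates, and it assumes each event begins on a `Time:`/`Timeline:` line after a dashed separator; verify all three assumptions against your data before deploying:

```
[custom:applogs]
SHOULD_LINEMERGE = false
# Break events at a run of dashes that is immediately followed by a Time/Timeline line
LINE_BREAKER = (-{10,}[\r\n]+)(?=Time)
# Timestamp follows "Time: " or "Timeline: ", day-first format with milliseconds
TIME_PREFIX = Time(?:line)?:\s
TIME_FORMAT = %d/%m/%Y %H:%M:%S.%3N
MAX_TIMESTAMP_LOOKAHEAD = 30
TRUNCATE = 10000
```

If the dates are actually MM/DD/YYYY, swap `%d` and `%m` in TIME_FORMAT.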
March 2023 | Check out the latest and greatest

Splunk APM's New Tag Filter Experience
Splunk APM has updated its Tag filter experience! We have introduced a more intuitive filtering experience, with enhanced keyboard navigation and a history of your most recently viewed environments, services, and workflows. Navigate to APM in Splunk Observability to get started.

Subscription usage report for hosts and containers
For Admins of Splunk Observability, the Subscription Usage page for APM now provides detailed reports on Hosts and Container usage for up to 8 days of data. You can click on a peak and download a list of hosts or container names that sent data to Splunk APM in a given minute. Click here to learn more.

Memory Profiling support for Node.js and .NET
Splunk APM AlwaysOn Profiling now supports memory profiling and runtime metrics for Node.js and .NET applications. The new capabilities help you troubleshoot memory-hungry code by visualizing the memory allocation behavior of each component using a flame graph. In addition, you can monitor and alert on key runtime metrics for Node.js and .NET applications, including heap usage, garbage collection, Node.js event loop lag, .NET thread pools, etc. You can visualize and get notified when these metrics go beyond a threshold or when sudden changes occur, even on a specific host or container. Click here to get profiling data into Splunk APM.

ICYMI
Getting Started with Observability: What is OpenTelemetry?

Observability Documentation is now accepting community contributions
We’ve just made it easier to ensure we’re providing the highest quality, most up-to-date Splunk Observability Cloud product documentation. Now anyone can refine and enhance Splunk Observability documents by adding new examples, documenting new settings, or adding new pages of prescriptive guidance. To get started, click Edit this page on any topic in https://docs.splunk.com/Observability. To learn more, see https://github.com/splunk/public-o11y-docs or this blog post.

Splunk Lantern
We are excited to announce new content in the Observability Use Case Explorer as well as a helpful Getting Started article on Log Observer Connect!
* Identifying DNS reliability and latency issues
* Monitoring availability and performance in non-public applications
* Getting started with Log Observer Connect

Mission Possible: Splunk Adoption Challenge
Splunk Customer Success is recruiting new members for the Mission Possible: Splunk Adoption Challenge! The challenge contains 3 games with high-value content as ‘tasks’ that players need to ‘crack’. Players will start as Splunk rookie recruits and earn points and badges to become Double-O Splunk agents. The challenge is dedicated to helping the players learn about Splunk products, consume essential Splunk knowledge, and build digital resilience. Click HERE to enter the games!

New Splunk Learning Platform launching on May 22!
We’re so appreciative of all the curious learners out there who turn to Splunk Education to boost their careers and help their organizations stay resilient. We want to keep you coming back for more, which is why we are launching a new Splunk Learning Platform. This new, feature-rich portal houses all your in-progress eLearning, your in-person enrollments, your completed training, and your course completion certificates. Pro Tip: If you have in-progress coursework, please complete it before May 17 – or you’ll have to start over. And, note that the system will be down between May 17-21. All the change will be worth it come May 22!

Learn and Earn with Splunk Education
With the Splunk Learning Rewards program, customer-learners can earn points for completing at least three paid Splunk Education courses. These points are available to redeem for Splunk swag by going to our Learning Rewards site, which can be accessed with your Splunk.com credentials. Oh, and if you’re the visual type, check out our fun video that shares all the learning rewards excitement.

NEW Community Office Hours: Limited Spots Available - Register Today!
Interested in getting hands-on, live help from a Splunk expert? Community Office Hours is a new program that you don’t want to miss. Here, you can ask questions and get live help from a technical Splunk expert on a different topic every month. You must register to attend a session. But spots are limited - so don’t miss out!

Talk to Splunk Product Design
Our product design team is currently looking for Splunk users to talk to about their experiences with Splunk products. Sign up here to participate in upcoming studies and shape the future of our products and roadmaps!

Join our Customer Advisory Boards to get early access to product releases
Sign up and join our April 2023 Customer Advisory Boards! You’ll get access to previews of new products and capabilities, interact with industry experts and provide feedback to influence the future of Splunk products. Use this link to sign up! Contact us at advisoryprograms@splunk.com with any questions.

Until Next Month, Happy Splunking!
March 2023 | Check out the latest and greatest

Unify Your Security Operations with Splunk Mission Control
The recent release of Splunk Mission Control allows your SOC to detect, investigate, and respond to threats from one modern, unified work surface. Learn more about how to utilize Mission Control to streamline your security workflows with response templates and modernize your security operations with automation in our product news and announcements post.

Splunk SOAR 6.0
On February 22nd, Splunk released SOAR version 6.0.0, and this release marks the start of our integration efforts with Splunk Mission Control to provide a unified security operations solution. This includes a new integrated browsing experience, the ability to enable SOAR playbooks and actions to run against Mission Control data, and lastly a deep integration into the Visual Playbook Editor (VPE) to write playbooks for Mission Control without the need for custom code or configuration. SOAR 6.0.0 also includes exciting new features and comes with a new version of the Automation Broker (AB). Read the SOAR 6.0 Release Notes here, and the release notes for Automation Broker here, to learn more.

Rapid Detection and Incident Scoping with Splunk ES 7.1
Watch our on-demand webinar to learn how the recent release of Splunk Enterprise Security can help you:
* Detect suspicious behavior in real time using cloud-based streaming analytics
* Quickly discover the scope of an incident and respond with accuracy using “threat topology visualization”
* Improve security workflow efficiencies with the new embedded “MITRE ATT&CK visualization” feature

New Detections from the Splunk Threat Research Team
The Splunk Threat Research Team (STRT) has had three releases of security content, which provide you with 45 new detections and 7 new analytic stories. The new security content is available via the ESCU application update process or via Splunk Security Essentials (SSE). The Splunk Threat Research Team has also published the following blog to help you stay ahead of threats: Fantastic IIS Modules and How to Find Them

Splunk at RSA Conference 2023
If you’ll be attending RSA in April, be sure to stop by booth 5770 to say hello, grab a Splunk T-shirt, and pick up a copy of the brand new book we’ll be releasing at the event. Stay and learn about the latest and greatest happening with Splunk Security by getting a demo or watching a booth presentation, and check out one of the three talks being given by Splunk SURGe team members:
* Threat-informed Planning with Macro-level ATT&CK Trending
* Rethinking Recruiting: Effective Hiring Practices to Close the Skills Gap
* Trust Unearned? Evaluating CA Trustworthiness Across 2 Billion Certificates

Suggested Security Reading
Did you know that every month Splunk security experts curate a list of suggested content that they think provides value to security professionals? Take a look at all of our security staff picks here, and happy reading!

Tune in for a must-see Tech Talk! Security Edition
Enhance Security Visibility with Splunk Enterprise Security 7.1 through Threat Topology and MITRE ATT&CK Visualizations
Tuesday, March 28, 2023 | 11am–12pm PT
Register Now

Mission Possible: Splunk Adoption Challenge
Splunk Customer Success is recruiting new members for the Mission Possible: Splunk Adoption Challenge! The challenge contains 3 games with high-value content as ‘tasks’ that players need to ‘crack’. Players will start as Splunk rookie recruits and earn points and badges to become Double-O Splunk agents. The challenge is dedicated to helping the players learn about Splunk products, consume essential Splunk knowledge, and build digital resilience. Click HERE to enter the games!

New Splunk Learning Platform launching on May 22!
We’re so appreciative of all the curious learners out there who turn to Splunk Education to boost their careers and help their organizations stay resilient. We want to keep you coming back for more, which is why we are launching a new Splunk Learning Platform. This new, feature-rich portal houses all your in-progress eLearning, your in-person enrollments, your completed training, and your course completion certificates. Pro Tip: If you have in-progress coursework, please complete it before May 17 – or you’ll have to start over. And, note that the system will be down between May 17-21. All the change will be worth it come May 22!

Learn and Earn with Splunk Education
With the Splunk Learning Rewards program, customer-learners can earn points for completing at least three paid Splunk Education courses. These points are available to redeem for Splunk swag by going to our Learning Rewards site, which can be accessed with your Splunk.com credentials. Oh, and if you’re the visual type, check out our fun video that shares all the learning rewards excitement.

NEW Community Office Hours: Limited Spots Available - Register Today!
Interested in getting hands-on, live help from a Splunk expert? Community Office Hours is a new program that you don’t want to miss. Here, you can ask questions and get live help from a technical Splunk expert on a different topic every month. You must register to attend a session. But spots are limited - so don’t miss out!

Talk to Splunk Product Design
Our product design team is currently looking for Splunk users to talk to about their experiences with Splunk products. Sign up here to participate in upcoming studies and shape the future of our products and roadmaps!

Join our Customer Advisory Boards to get early access to product releases
Sign up and join our April 2023 Customer Advisory Boards! You’ll get access to previews of new products and capabilities, interact with industry experts and provide feedback to influence the future of Splunk products. Use this link to sign up! Contact us at advisoryprograms@splunk.com with any questions.

Until Next Month, Happy Splunking!
Hi, I have a table like this: What I want to do is, when I click on the "Test Case" value of a particular row, it should expand that row (if possible, only that particular cell) and display a table like this: Also, I am using a token (set when clicking on the Test Case) to pass the value to the second table. Any help would be appreciated.
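A minimal Simple XML sketch of the click-to-token pattern described above: the first table sets a token only when the "Test Case" cell is clicked, and a second panel that depends on the token shows the detail table. The index, field, and token names here are placeholders, not taken from the original dashboard:

```
<row>
  <panel>
    <table>
      <search><query>index=main | stats count by "Test Case"</query></search>
      <drilldown>
        <!-- Fire only on clicks in the "Test Case" column -->
        <condition field="Test Case">
          <set token="testcase_tok">$click.value2$</set>
        </condition>
      </drilldown>
    </table>
  </panel>
  <!-- Hidden until the token is set by a click above -->
  <panel depends="$testcase_tok$">
    <table>
      <search><query>index=main "Test Case"=$testcase_tok|s$ | table step, result</query></search>
    </table>
  </panel>
</row>
```

`$click.value2$` carries the value of the clicked cell, and the `|s` token filter quotes it safely for the second search.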
Hi, so I have this dashboard showing the below.

HBSS    ACAS    CMRSACAS    CMRSHBSS
89      92      84          77

My question is: how do I get the dashboard to show ONLY the highest count for the day, since the dashboards are updated daily? Any help will be fantastic. Thanks
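Without seeing the underlying search, one common pattern for "highest value today" is to restrict the time range to the current day and take the max per tool. This is only a sketch; the index, sourcetype, and field names are placeholders to adapt:

```
index=main sourcetype=compliance_scores earliest=@d latest=now
| stats max(score) AS score BY tool
```

`earliest=@d` snaps the search window to midnight today, and `stats max()` keeps one row per tool; a follow-on `xyseries` or `chart` can pivot the tools back into columns if the dashboard expects that layout.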
Hello, I am trying to measure the downtime or SLO breaches of certain customer endpoints over a period of time. For example, success rate and latency are metrics we measure for the endpoints. Currently we query Splunk every 5 minutes and capture these values; values below 97% for success rate are breaches. One issue we have is that within that 5-minute window, the SLO breach could have lasted only a few seconds or minutes, not the entire 5 minutes. If we capture the success-rate data from Splunk every minute instead, that will be too many query hits to Splunk, and we would be storing 1440 values/day instead of 288 values/day, plus the storage cost of keeping the data and parsing it to compute SLO breaches:

1440 mins / 5 mins = 288 values
1440 mins / 1 min = 1440 values

Any ideas how we can query Splunk and get the threshold breaches accurately, down to seconds, so we can report downtime for prod incidents accurately (i.e., how long the customer impact actually lasted), with fewer hits to Splunk and more real-time data provided to the business on impact?
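One way to keep the query count low while getting finer resolution is to let a single scheduled search compute the breach intervals server-side, rather than polling raw values externally. The sketch below assumes the raw metric events are already in Splunk; the index, sourcetype, and field names (`endpoint`, `success_rate`) and the 10-second bucket are all placeholders:

```
index=app_metrics sourcetype=endpoint_stats
| bin _time span=10s
| stats avg(success_rate) AS success_rate BY _time, endpoint
| eval breach=if(success_rate < 97, 1, 0)
| streamstats count(eval(breach=0)) AS session BY endpoint
| where breach=1
| stats min(_time) AS start, max(_time) AS end BY endpoint, session
| eval downtime_secs=(end - start) + 10
```

The `streamstats` counter only advances on non-breach buckets, so consecutive breach buckets share a `session` id; the final `stats`/`eval` then turns each run of breaches into one interval with a duration. Run once per day (or per incident), this returns second-level breach durations with a single query instead of 1440 polls.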
March 2023 | Check out the latest and greatest

Introducing Splunk Edge Processor, simplified data transformation
Now generally available, Splunk Edge Processor provides Splunk Cloud Platform customers new abilities to easily filter, mask, and route data. These data transformations are powered by Splunk’s next-generation Search Processing Language (SPL2), which simplifies data preparation and search. With Edge Processor, customers will have improved visibility into and control over streaming data at the edge - inside their networks and close to the data source.

Dashboard Studio Challenge - learn new tricks, showcase your skills, and win prizes!
This challenge is an opportunity to level up your dashboard skills, showcase your visualizations, and win a $100 gift card to the Splunk Store. We supply the themes and the datasets. You create an impactful, story-telling dashboard in Dashboard Studio. It’s that easy! Register here to enter the challenge.

Explore additional use cases with our new Lantern tool!
We’re excited to announce the launch of the new Use Case Explorer for the Splunk Platform! This tool contains Splunk Enterprise and Splunk Cloud Platform use cases developed for five key industries - Financial Services, Healthcare, Retail, Technology Communications and Media, and Public Sector - as well as use cases designed to help you achieve Security and IT Modernization goals. Check out the Use Case Explorer for the Splunk Platform, plus find all the latest Lantern news in our latest blog update, and please let us know what you think!

Bring More ML to Splunk: Inference Externally Trained Models in Machine Learning Toolkit 5.4
The latest release of the Splunk Machine Learning Toolkit (MLTK) enables users to upload their pre-trained ONNX models in MLTK with a simple UI. Once the model is in Splunk, users can use the model with their Splunk data with no modification to their existing workflows. Read more about how this capability extends the usability of MLTK and ML-SPL beyond models trained using MLTK, unlocking more use cases with externally trained data models.

The start of Splunk’s unified language: SPL2 Profile for Edge Processor
The new Edge Processor uses SPL2, Splunk’s powerful and flexible data language for data at rest and in motion, designed to serve as the single entry point for a wide range of data handling scenarios, and in the future it will be available across multiple products. The SPL2 Profile for Edge Processor contains the specific subset of powerful SPL2 commands and functions that can be used to control and transform data behavior within Edge Processor. Read more about SPL2.

Tune in for a must-see Tech Talk! Platform Edition
Tips & Tricks When Using Ingest Actions
Tuesday, March 28, 2023 | 12pm–1pm PT

Mission Possible: Splunk Adoption Challenge
Splunk Customer Success is recruiting new members for the Mission Possible: Splunk Adoption Challenge! The challenge contains 3 games with high-value content as ‘tasks’ that players need to ‘crack’. Players will start as Splunk rookie recruits and earn points and badges to become Double-O Splunk agents. The challenge is dedicated to helping the players learn about Splunk products, consume essential Splunk knowledge, and build digital resilience. Click HERE to enter the games!

New Splunk Learning Platform launching on May 22!
We’re so appreciative of all the curious learners out there who turn to Splunk Education to boost their careers and help their organizations stay resilient. We want to keep you coming back for more, which is why we are launching a new Splunk Learning Platform. This new, feature-rich portal houses all your in-progress eLearning, your in-person enrollments, your completed training, and your course completion certificates. Pro Tip: If you have in-progress coursework, please complete it before May 17 – or you’ll have to start over. And, note that the system will be down between May 17-21. All the change will be worth it come May 22!

Learn and Earn with Splunk Education
With the Splunk Learning Rewards program, customer-learners can earn points for completing at least three paid Splunk Education courses. These points are available to redeem for Splunk swag by going to our Learning Rewards site, which can be accessed with your Splunk.com credentials. Oh, and if you’re the visual type, check out our fun video that shares all the learning rewards excitement.

NEW Community Office Hours: Limited Spots Available - Register Today!
Interested in getting hands-on, live help from a Splunk expert? Community Office Hours is a new program that you don’t want to miss. Here, you can ask questions and get live help from a technical Splunk expert on a different topic every month. You must register to attend a session. But spots are limited - so don’t miss out!

Talk to Splunk Product Design
Our product design team is currently looking for Splunk users to talk to about their experiences with Splunk products. Sign up here to participate in upcoming studies and shape the future of our products and roadmaps!

Join our Customer Advisory Boards to get early access to product releases
Sign up and join our April 2023 Customer Advisory Boards! You’ll get access to previews of new products and capabilities, interact with industry experts and provide feedback to influence the future of Splunk products. Use this link to sign up! Contact us at advisoryprograms@splunk.com with any questions.

Until Next Month, Happy Splunking!
Looking for the best way to audit all users accessing REST endpoints. I found a way to list users, but is there any way to limit this based on REST calls?

| rest /services/authentication/users splunk_server=*
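If the goal is to see who is calling which REST endpoints (rather than to list users), splunkd's own access log may be a better source: REST requests are logged to the _internal index with sourcetype splunkd_access. A sketch; the extracted field names can vary by version, so verify them against your own events first:

```
index=_internal sourcetype=splunkd_access uri_path=/services/*
| stats count BY user, method, uri_path
| sort - count
```

This gives a per-user, per-endpoint call count that can be narrowed with additional filters (e.g. a specific `uri_path` or `status`).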
Hello, collectd is the mechanism to obtain information about network traffic (octets per second). The search to create a visualization of the data in a dashboard is below.

| mstats rate_avg("octets.*") WHERE index="network" chart=true host="device-*" span=5m by host
| fields - _span*
| rename "rate_avg(octets.rx): *" AS "in * bit/s"
| rename "rate_avg(octets.tx): *" AS "out * bit/s"
| foreach * [eval <<FIELD>>='<<FIELD>>' * 8 ]

The issue I am facing is when trying to graph time frames wider than a few months. There are too many data points and the results are truncated. I have played with charting.chart.resultTruncationLimit, but that only gets so far. Note: the span of 5m cannot be changed or the data is skewed. Is there a way to create graphs maintaining the time span but per day or per month? For example:
* Display a graph of the last 30 days but summarize per day or per week
* Display a graph of the last year but summarize per month or per week
Thanks in advance.
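One pattern that may help is to keep the 5m span for the rate calculation (so the math stays correct) and then re-bucket the results to a coarser span before charting, so the chart only gets one point per day or week. A sketch building on the search above; using avg for the daily roll-up is an assumption, and max may suit capacity views better:

```
| mstats rate_avg("octets.*") WHERE index="network" chart=true host="device-*" span=5m by host
| bin _time span=1d
| stats avg(*) AS * BY _time
| rename "rate_avg(octets.rx): *" AS "in * bit/s"
| rename "rate_avg(octets.tx): *" AS "out * bit/s"
| foreach * [eval <<FIELD>>='<<FIELD>>' * 8 ]
```

For a one-year view, changing `span=1d` to `span=1w` or `span=1mon` in the `bin` keeps the point count manageable without touching the 5m rate window.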
Is it possible to set the token value from another dashboard? For example, can I link from one dashboard to another dashboard like: http://{base-URL}?test-token="value"
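Splunk dashboards accept input tokens on the URL using the `form.` prefix, so a link can pre-set a token on the target dashboard. A sketch of the URL shape; the host, app, dashboard, and token names are placeholders, and the token must correspond to an input defined on the target dashboard:

```
https://<splunk-host>:8000/en-US/app/<app_name>/<dashboard_name>?form.test_token=value
```

In a Simple XML drilldown, the same thing can be done with a `<link>` element whose target URL includes `form.test_token=$row.myfield$`.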
I have a stand-alone SH with 3 peer (non-clustered) indexers. I tried adding a 4th non-clustered indexer as a search peer. Two days later, /opt/splunk was 100% full. Has anyone had this happen? Is the data new data, or old data that was copied to that indexer? I had to remove that indexer as a peer, but now I don't know what the data on that 4th indexer is. Help. New to Splunk, obviously.
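To see what actually landed on the new indexer, a couple of quick checks run from the search head may help (while the indexer is still reachable as a peer). These are sketches; the index filter and the peer name are placeholders, and field names from dbinspect are worth verifying against its output in your version:

```
| tstats count WHERE index=* earliest=-7d BY splunk_server, index

| dbinspect index=*
| search splunk_server="<new-indexer>"
| stats sum(sizeOnDiskMB) AS total_MB, min(startEpoch) AS oldest_event BY index
```

The first shows which peer recently received events per index; the second sizes the buckets on the new peer and shows how old their data is, which helps distinguish freshly indexed data from something copied in.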
Hi Everyone, I was reading through this article that led me to believe it's possible to display external web content in Splunk; however, it doesn't appear to be working for me. Interestingly, it works fine outside of Splunk (i.e., if I save the source as an HTML file locally on my computer), but it doesn't display the iframe if I put it in a Splunk dashboard. Any assistance would be greatly appreciated. Source code below.

<?xml version='1.0' encoding='utf-8'?>
<dashboard version="1.1">
  <label>My iFrame Dashboard</label>
  <row>
    <html>
      <h2>Embedded Web Page!</h2>
      <iframe src="https://myValidDomainName" width="100%" height="300"></iframe>
    </html>
  </row>
</dashboard>
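Recent Splunk versions sanitize the `<html>` panel contents and strip iframes by default, which would explain the behavior above. If that is the cause, a web.conf setting re-enables them; verify the setting name against the web.conf spec for your version, and note the change requires a Splunk Web restart:

```
# $SPLUNK_HOME/etc/system/local/web.conf
[settings]
dashboard_html_allow_iframes = true
```

The external site must also allow being framed (no blocking X-Frame-Options or frame-ancestors CSP header), or the browser will refuse to render it even with the Splunk-side restriction lifted.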
I am looking for a Splunk query that will pull the enabled and disabled ciphers from Windows servers in my environment, ranging from OS 2012 R2 to 2019. As a bonus, if someone has one for Oracle Linux, I would like that too.
Hi, I have a query that is making two different searches and displaying the stats of each. Example:

index="example" TERM(STOP)
| rename message.payload as message1
| stats count by message1
| appendcols
    [search index="example2"
    | rename message.payload as message2
    | stats count by message2]

I want the results of message1 and message2 whose event timestamps are identical to be displayed next to each other in the statistics tab. I would like to have the stats displayed as such:

Message1    Message2    Count
<data>      <data>      23
<data>      <data>      17

Is this possible? I hope this makes sense; I am still somewhat new to writing Splunk queries, and this is so far the most complex one I have needed to write.
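One caveat with appendcols is that it pastes rows together by position, not by timestamp, so rows from the two searches can end up misaligned. An alternative sketch is to search both indexes at once and group by time; the index and field names follow the example above, and the 1-second bucket is an assumption about what "identical timestamps" means:

```
(index="example" TERM(STOP)) OR index="example2"
| bin _time span=1s
| eval Message1=if(index=="example", 'message.payload', null())
| eval Message2=if(index=="example2", 'message.payload', null())
| stats values(Message1) AS Message1, values(Message2) AS Message2, count AS Count BY _time
```

Grouping by `_time` guarantees that only events sharing a timestamp bucket land on the same row, which appendcols cannot promise.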
Hello, I have some log messages like this, where various info is delimited by double colons:

{"@message":"[\"ERROR :: xService :: xService :: function :: user :: 6c548f2b-4c3c-4aab-8fde-c1a8d727af35 :: device1,device2 :: shared :: groupname :: tcp\"]","@timestamp":"2023-03-20T23:34:05.886Z","@fields":{"@origin":"/dir/stuff/things/and/more/goes/here/file.js:2109","@level":"info"}}

I am trying to send a count per day of the 'function' shown above, and the issue is that it might appear at a varying block position when the message is split on ' :: '. So, I am trying to match a regex on the UUID and count 2 blocks backwards from there to get the 'function' as a reliable way to extract it. I have observed the UUID appearing in blocks 5, 6, and 7, so this is an attempt at a case for each, assigning a value to get the function. Am quite new to Splunk queries, but here is my stab at it. Of course, it doesn't quite work:

index=iap source="/dir/stuff/things/xService.log" "ERROR :: xService ::"
| rex field=@message mode=sed "s/(\[\"|\"\])//g"
| eval tmp = split('@message'," :: "), check7 = mvindex(tmp,7), check6 = mvindex(tmp,6), check5 = mvindex(tmp,5)
| eval target=case(match(check7,"\w+\-\w+\-\w+\-\w+\-\w+"),7, match(check6,"\w+\-\w+\-\w+\-\w+\-\w+"),6, match(check5,"\w+\-\w+\-\w+\-\w+\-\w+"),5)
| eval function=case(match(target == 7, 5, target == 6, 6, target == 5, 5)
| timechart span=1d count by function limit=0
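Since `target` already holds the UUID's block index, the final case()/match() isn't needed at all: "2 blocks backwards" is just arithmetic on the mvindex position. A sketch of the last two pipes, untested against real data but consistent with the sample event above (where the UUID sits at index 5 and 'function' at index 3):

```
| eval function=mvindex(tmp, target - 2)
| timechart span=1d count by function limit=0
```

If `target` can be null (no UUID found in blocks 5-7), a `| where isnotnull(function)` before the timechart keeps those events out of the counts.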
We are populating Splunk using an HEC connection with a source type of _json, set to the default character set of UTF-8. However, a field shown in the raw data as: "Character test: 0242 (\\u00f2): >\uC3B2<" is displayed as: Character test: 0242 (\u00f2): >쎲< I would have expected the display to show the character ò, which is the UTF-8 equivalent of hexadecimal C3B2, rather than the displayed Unicode character.
We have a standard configuration for our workstations. Several of the fields are static, but some are dynamic (though these have a fixed length). I want to use a lookup table of all the values and apply it automatically to a sourcetype, but I'm not sure how I would go about matching the fields/values with a Lookup Definition. The standard is:
1 = Device Type - static, 1 char
2 = Building Code - static, 3 chars
3 = Department Code - static, 3 chars
4 = Function - static, 1 char
5 = Asset Tag - dynamic, 7 chars

So a machine may be named LBL1HRSSABC1234, indicating it's a laptop in Building 1 in HR Services that is Shared, with an asset tag of ABC1234. How could I use a lookup with these 4 static and 1 dynamic values to populate said values when a search is done on a particular host name? I should mention that I'm comfortable creating the lookup and applying it, just not how to get it to match on the criteria above. Thanks in advance!
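Since lookups match on whole field values, one approach is to first slice the host name into its positional parts with rex, then run a lookup against each extracted code. A sketch; the lookup names and output fields are placeholders for whatever the actual lookup files define:

```
| rex field=host "^(?<device_code>.)(?<building_code>.{3})(?<dept_code>.{3})(?<function_code>.)(?<asset_tag>.{7})$"
| lookup device_types device_code OUTPUT device_type
| lookup building_codes building_code OUTPUT building
| lookup dept_codes dept_code OUTPUT department
| lookup function_codes function_code OUTPUT function
```

To make this automatic per sourcetype, the same rex pattern can become an EXTRACT in props.conf and each lookup an automatic lookup, so the enriched fields appear without any SPL in the search.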
I have a log set from FWs. These logs have a field called "src." From what I can tell, this field is populated with values such as:
* FQDN (myhost.mydomain.com)
* Console or telnet
* 10.0.0.1

I'm looking to have two fields created from the "src" field: one named "IP" if the value in "src" is an IP, and "src_nt_host" if the value is not an IP address. A small sample from the logged event: From: Console or telnet. From: myhost.mydomain.com. From: 10.0.0.1. Any help / guidance is greatly appreciated.
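One simple way to split "src" is an eval with a regex match on the dotted-quad shape, using the field names from the question above. A sketch (the regex checks format only, not that each octet is 0-255):

```
| eval IP=if(match(src, "^\d{1,3}(\.\d{1,3}){3}$"), src, null())
| eval src_nt_host=if(isnull(IP), src, null())
```

Each event then carries exactly one of the two fields, which also works as calculated fields in props.conf if this should happen automatically at search time.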
Understand your native mobile application, whether iOS, Android, React Native, or Flutter, as your end users actually use it! Check out the capabilities AppDynamics can provide for those looking for deep mobile analytics that truly help improve the customer experience while using mobile applications. More info below!

AppDynamics MRUM documentation: https://docs.appdynamics.com/appd/22.x/22.2/en/end-user-monitoring/mobile-real-user-monitoring
Flutter blog: https://www.appdynamics.com/blog/product/general-availability-of-flutter-agent-for-mobile-real-user-monitoring