In our blog post, How the Edge Enables Groundbreaking AI Applications, we provided a definition for edge Artificial Intelligence (AI). To refresh, ABI Research defines edge AI as “the implementation of AI platforms and solutions on the edge of a network, close to the end user's environment.” In other words, edge AI use cases and applications are processed locally instead of at a remote data center. Edge AI platforms include Internet of Things (IoT) hardware, edge computers, and small, localized data centers.
For enterprises, some of the major benefits of placing AI at the edge include the following:
- Reduced network latency
- Lower bandwidth consumption
- Real-time analytics
- Improved data security
- Lower power usage
- Increased efficiency
- Better business outcomes
ABI Research has seen the edge AI market growing rapidly, with the Asia-Pacific region being fertile ground for innovation. We forecast revenue for edge AI in Asia-Pacific to grow at a Compound Annual Growth Rate (CAGR) of 24% between 2023 and 2028—reaching a market value of US$12.9 billion by the end of the forecast window. The Asia-Pacific market is being fueled primarily by the adoption of on-device AI, Tiny Machine Learning (TinyML), edge AI gateways, and on-premises edge AI servers. These trends create promising opportunities for the entire edge AI value chain—hardware infrastructure providers, System Integrators (SIs), data center/cloud providers, and software vendors.
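As a back-of-the-envelope check on the forecast above, a 24% CAGR ending at US$12.9 billion in 2028 implies a particular 2023 base. The sketch below assumes five full compounding periods (2023 to 2028); ABI Research's exact base-year figure is not stated in this post.

```python
# Back-of-the-envelope check of the forecast figures above.
# Assumption: five compounding periods (2023 -> 2028); the actual
# base-year value is not given in the post.

def implied_base(end_value: float, cagr: float, years: int) -> float:
    """Discount an end-of-window value back by a compound annual growth rate."""
    return end_value / (1 + cagr) ** years

base_2023 = implied_base(12.9, 0.24, 5)  # US$ billions
print(f"Implied 2023 market size: US${base_2023:.1f} billion")
```

Under that assumption, the implied 2023 base works out to roughly US$4.4 billion.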
Data Analytics at the Edge
Edge AI is key to collecting data from devices and contextualizing it in real time. Whether the source is a smart sensor or a tracking device, data management tools are a must for businesses operating in Asia-Pacific. Edge AI makes it easier to use data collected from edge devices to make informed decisions and spot patterns promptly.
Edge AI Case Study #1 in Asia-Pacific: Taipei Veterans General Hospital
Taipei Veterans General Hospital (TVGH) was looking for an AI model that could predict the risk of heart failure for its kidney dialysis patients. Leveraging NVIDIA’s Jetson edge AI platform, TVGH deployed a dashboard for doctors and nursing staff that alerts them to worrisome changes in patient health.
The hospital’s edge AI model lets clinicians better evaluate patient data. Risk factors of cardiovascular disease, such as blood flow rate and artery/vein pressure, are assessed in tandem with dialysis machine data, patient medical records, test results, and medication information.
Ultimately, this edge AI use case allowed TVGH clinicians to achieve an 80% lower deviation rate in their assessments of a patient’s dry weight, making future complications less likely. The neural network model is also reported to reach accuracy as high as 95%.
Keeping Devices Connected with Edge AI
For the many enterprises that use edge/IoT devices in daily operations, all eyes are on edge AI application providers. These solution providers are integral to designing and implementing AI algorithms that run smoothly on edge networks and devices. Health wearables (remote patient monitoring), embedded fleet appliances (vehicle tracking), and drones (image scanning) are all common examples of connected devices at the heart of cornerstone edge AI use cases.
Edge AI Case Study #2 in Asia-Pacific: Auckland Transport
Auckland Transport (AT) operators are tasked with keeping city traffic running smoothly, encouraging people to take public transportation, and minimizing safety and health risks on roads and in public spaces. AT partnered with Hewlett Packard Enterprise (HPE) and leveraged the company’s GreenLake edge-to-cloud platform for deep insights that can be turned into action. For example, when conditions are right, AT operators can notify people how much quicker it would be to commute by public bus instead of by car.
GreenLake also enables AT parking officers to draw on thousands of cameras throughout the city, making real-time monitoring far easier. Finally, switching its Video Management System (VMS) to the GreenLake edge-to-cloud platform has reduced AT’s energy footprint by 37% without negatively impacting the system’s performance.
Edge AI Is Improving Retail Experiences
The benefits of edge AI extend to shopping as well, providing customers with personalized experiences and improved accessibility, while ensuring that manufactured products are flawless before being packaged.
- Personalized Experiences: Whether it's historical purchases or an in-store heatmap, edge AI assesses various customer data to enable retailers to deliver a shopping experience tailored to the individual. Examples include dynamic pricing on the retail floor and personalized promotions at the checkout.
- Improved Accessibility: Edge AI’s voice recognition and translation capabilities make it a prime solution for assisting customers with disabilities (hearing loss, blindness, etc.).
- Better Quality Control: Machine Vision (MV) at the edge can identify product defects, ensuring the manufacturer doesn’t ship any goods that will make the customer unhappy.
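To make the quality-control point above concrete, here is a minimal sketch of the kind of threshold-based defect check an edge machine-vision node might run locally. The template, frames, and threshold are illustrative assumptions, not any vendor's actual pipeline.

```python
# Illustrative sketch: flag a product as defective when its captured image
# deviates too far from a known-good reference. Values are hypothetical.

def defect_score(frame, template):
    """Mean absolute pixel difference between a captured frame and a
    known-good template (both as flat lists of grayscale values 0-255)."""
    return sum(abs(a - b) for a, b in zip(frame, template)) / len(template)

def is_defective(frame, template, threshold=25.0):
    # Flag the item when it deviates too far from the reference image.
    return defect_score(frame, template) > threshold

good = [120, 118, 121, 119]       # reference capture of a good unit
scratched = [120, 40, 121, 200]   # a unit with visible damage
print(is_defective(good, good))        # False
print(is_defective(scratched, good))   # True
```

Running the check on-device means only the pass/fail flag, not the full video stream, needs to leave the factory floor, which is exactly the bandwidth and latency benefit edge AI promises.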
Edge AI Case Study #3 in Asia-Pacific: Carrefour Taiwan
As part of its digital transformation ambitions, Carrefour Taiwan aimed to construct a retail ecosystem encompassing physical and virtual channels. The company used Google Cloud to build edge ML models that turn customer data into actionable insights for marketing campaigns. For example, the model enabled Carrefour Taiwan to pinpoint its high-value customers more easily.
This edge-to-cloud platform allowed Carrefour Taiwan to target only the customers that the edge ML model predicted would spend the most. In the end, online advertising costs per action fell by 40%, achieving a 2.64X higher Return on Ad Spend (ROAS).
Overcoming Edge AI Challenges
Edge AI is still a relatively new technology, and several challenges persist for Asia-Pacific businesses adopting it. Chief among them are integration with existing infrastructure and the high cost of implementation. Edge AI solutions often involve a diverse set of hardware components (Central Processing Units (CPUs), Graphics Processing Units (GPUs), etc.) and microchips, each requiring different architecture, systems, and/or interfaces. This makes interoperability crucial to expanding edge AI use cases and keeping implementation costs to a minimum. Edge AI solutions should be able to work with both legacy and new systems and edge devices, which requires standardizing data formats, communication protocols, and application interfaces.
Moreover, edge AI platforms usually contain distributed data sources, such as sensors, cameras, and other IoT devices. If a business plans to scale, particularly with existing infrastructure, disparate hardware and data sources make this very challenging. To overcome this challenge, edge AI platforms must be capable of handling data collection, aggregation, and processing. The solution must also be able to automatically scale up or down, contingent on workload type and data volume.
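The data-format standardization described above can be sketched as a normalization layer that maps heterogeneous device payloads onto one shared record schema. The device types and field names below are hypothetical, chosen only to illustrate the pattern.

```python
# Sketch of payload normalization across disparate edge devices.
# Device types ("legacy_sensor", "iot_camera") and field names are
# hypothetical examples, not a real vendor protocol.

from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str
    metric: str
    value: float
    unit: str

def normalize(raw: dict) -> Reading:
    """Map vendor-specific payloads onto the shared Reading schema."""
    if raw.get("type") == "legacy_sensor":   # older gear: flat fields
        return Reading(raw["sn"], raw["channel"], float(raw["val"]), raw["u"])
    if raw.get("type") == "iot_camera":      # newer gear: nested payload
        m = raw["measurement"]
        return Reading(raw["id"], m["name"], float(m["value"]), m["unit"])
    raise ValueError(f"unknown device type: {raw.get('type')}")

r = normalize({"type": "legacy_sensor", "sn": "A-7", "channel": "temp",
               "val": "21.5", "u": "C"})
print(r)  # Reading(device_id='A-7', metric='temp', value=21.5, unit='C')
```

Once every device speaks the same schema, aggregation and autoscaling logic can be written once against `Reading` rather than per device family.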
ABI Research expects AI innovation labs to be key in facilitating new edge AI use cases and applications in Asia-Pacific. Meanwhile, infrastructure vendors like Lenovo and Fujitsu are already extending their data center operations to fit into the edge network ecosystem, giving enterprises access to infrastructure platform expertise. Besides improving edge-to-cloud orchestration and innovating edge AI, a huge facilitator of edge AI market growth will be strategic partnerships, such as StarHub and Nokia’s collaboration with Google Distributed Cloud Edge. Partnerships like this drive better coordination between hardware and software providers, which ultimately benefits the end customer in implementing edge AI.
Learn more by downloading ABI Research’s Distributed Intelligence on the Edge: Opportunities and Challenges Shaping Edge AI in APAC presentation. This content is part of the company’s Distributed & Edge Computing Research Service.