Amazon integrates AI with robotics and smart glasses to streamline delivery processes

2025-10-23

Amazon is overhauling its retail and logistics operations with advanced robotics and artificial-intelligence tools. These innovations apply not only within warehouse operations but also extend to delivery drivers through "smart delivery glasses" designed to assist with last-mile deliveries.

At today's annual "Delivering the Future" event in the Bay Area, Amazon unveiled Blue Jay, a new multi-arm robotic system. The company also introduced Project Eluna, an AI-driven platform that helps warehouse managers make smarter, real-time decisions. The smart delivery glasses, still in development, aim to improve drivers' navigation efficiency and let them record proof of delivery without constantly reaching for a smartphone.

Tye Brady, Amazon's Chief Technologist for Robotics, emphasized during the event that while AI is advancing robotics, human workers remain central to the delivery process. "Humans are always at the core," he stated, possibly addressing recent reports suggesting Amazon might replace up to half a million jobs with automation. Brady explained that the company's vision is a world where robots and AI integrate seamlessly into human workflows.

Blue Jay represents years of robotic advancement built on earlier systems such as Vulcan and DeepFleet. It exemplifies Amazon's concept of "physical AI": intelligent systems that interact with the physical world to support human tasks. The high-performance robotic sorting system is designed for installation in Amazon's logistics centers. With its multi-arm configuration, Blue Jay functions more like an automated production line, picking, sorting, and storing hundreds of different items and significantly improving package organization. It consolidates three previously separate robotic systems into a single, more powerful station that can process thousands of items daily at a faster pace.
Amazon reported that AI and "digital twin" technology accelerated Blue Jay's development, cutting the design-to-deployment timeline to just over a year. That is notably faster than earlier systems such as Cardinal, Robin, and Sparrow, each of which took roughly three years to develop. The company said AI enabled engineers to compress years of trial and error into months: using digital-twin simulations, teams iterated through dozens of Blue Jay prototypes before finalizing the design.

Currently being tested in one of Amazon's largest fulfillment centers, in South Carolina, Blue Jay can already organize approximately 75% of the diverse items stored at the site. Over time, the system will roll out to additional facilities, including thousands of "next-day delivery" hubs, to speed package delivery to customers.

Amazon continues to push ahead in physical AI. Earlier this year it introduced Vulcan, which the company describes as the world's first robot equipped with tactile sensing. That capability matters for both operational efficiency and safety: if Vulcan picks up a box and detects that the edges are beginning to crumple, it automatically reduces its grip pressure, and the same tactile feedback lets it detect accidental contact with human workers and stop immediately.

Intelligent Operations Assistant

Blue Jay will be supported by Project Eluna, an AI-powered platform designed to function like a personal assistant for operations managers. Amazon explained that its human workers face immense workloads, often needing to monitor dozens of dashboards tracking logistics operations and respond to issues like technical failures or resource bottlenecks.

Project Eluna aims to reduce that cognitive load by monitoring operations and suggesting real-time actions to operators. It can anticipate bottlenecks before they occur and recommend solutions to keep logistics flowing smoothly. Operators can also interact with Eluna conversationally, asking questions such as "Where should we move staff to avoid bottlenecks?" and receiving immediate recommendations.

“Our latest innovations exemplify how we’re leveraging AI and robotics to enhance experiences for both employees and customers,” said Tye Brady. “The goal is to make technology the most practical and powerful tool available, making work safer, smarter, and more meaningful.”

AI Glasses Designed for Delivery Drivers

While the smart delivery glasses have yet to receive an official name, they are already being tested by hundreds of delivery drivers to refine their design and functionality.

Earlier this year, Amazon revealed it was developing advanced geospatial mapping technology offering detailed insight into building structures and sidewalk obstructions; now it's clear why. That mapping system is being integrated into the smart glasses, which feature an embedded display enabling hands-free operation so drivers can avoid constant smartphone use. The glasses provide turn-by-turn navigation while driving and continue guiding drivers on foot once they park and walk to a customer's door. "Think of these glasses as your reliable companion from the van to the doorstep," said Beryl Tomay, Amazon's Vice President of Transportation, during today's event. The glasses are particularly useful in large apartment complexes, and they can also scan packages and capture images of delivered items at the customer's doorstep as proof of delivery.

Amazon noted that driver feedback during early testing was instrumental in refining the glasses' design and usability. Based on this input, the company added features such as swappable batteries to accommodate long shifts and support for prescription lenses so drivers who wear regular glasses can use them comfortably.

Kaleb M., a delivery contractor with Maddox Logistics in Omaha, said he feels safer using the smart glasses because all the information he needs appears directly in his field of view. "You don't have to look down at your phone—you can keep your eyes forward and glance over the display," he explained. "You're always focused on what's ahead."