References: eWEEK, Gradient Flow
Apple is ramping up its efforts in the artificial intelligence space, focusing on efficiency, privacy, and seamless integration across its hardware and software. The tech giant is reportedly accelerating the development of its first AI-powered smart glasses, with a target release date of late 2026. These glasses, described as similar to Meta's Ray-Ban smart glasses but "better made," will feature built-in cameras, microphones, and speakers, enabling them to analyze the external world and respond to requests via Siri. This move positions Apple to compete directly with Meta, Google, and the emerging OpenAI/Jony Ive partnership in the burgeoning AI device market.
Apple also plans to open its on-device AI models to developers at WWDC 2025, aiming to let developers build AI-driven applications that leverage Apple's hardware while prioritizing user privacy. By giving developers access to its models, Apple hopes to foster a vibrant ecosystem of AI-enhanced experiences across its product line. The strategy reflects a desire to embed sophisticated intelligence deeply into its products without compromising its core values of user privacy and trust, distinguishing Apple from competitors that have rushed high-profile AI models to market.

While Apple pushes forward with smart glasses, it has reportedly shelved plans for an Apple Watch with a built-in camera. The decision suggests a strategic shift, with the company prioritizing AI-powered wearables that fit its vision of seamless integration and user privacy; it may also reflect privacy concerns or the technical challenges of fitting a camera into a smaller wearable. Ultimately, Apple's success in the AI arena will depend on delivering genuinely useful, seamlessly embedded AI experiences.
References: Aminu Abdullahi (eWEEK)
Apple is accelerating its entry into the AI-powered wearable market with plans to launch its first smart glasses by late 2026. These glasses, codenamed "N401," will feature built-in cameras, microphones, and speakers, enabling users to interact with the Siri voice assistant for tasks such as making phone calls, playing music, conducting live translations, and receiving GPS directions. The company aims to compete with Meta's Ray-Ban smart glasses, which have seen significant success, but is initially focusing on simplicity by foregoing full augmented reality (AR) capabilities in the first iteration. Apple hopes that by defining its product vision and investing strategically, it can overcome its late start in the AI race and deliver a superior experience.
The move comes as Apple recognizes the growing importance of AI in wearable technology and seeks to catch up with competitors like Meta, Google, and OpenAI. While Meta is working on higher-end smart glasses with built-in displays, Apple is taking a different approach, prioritizing essential functionality and a sleek design similar to Meta's Ray-Bans. Google has partnered with brands like Samsung, Warby Parker, and Gentle Monster to launch smart glasses built on its Android XR system. The AI devices market is set to become more crowded next year: OpenAI, the company behind the popular AI chatbot ChatGPT, announced earlier this week that it is collaborating with former Apple designer Jony Ive on AI gadgets to be released beginning next year. Apple also faces competition from Amazon in this space.

Amid these developments, Apple has reportedly scrapped plans for a camera-equipped Apple Watch, signaling a shift in its wearables strategy. Sources indicate the company had been working toward releasing camera-equipped Apple Watch and Apple Watch Ultra models by 2027, but that project was recently shut down. Instead, Apple is concentrating its resources on smart glasses, with prototypes expected to be ordered in bulk for testing. Bloomberg reported that Apple has been running multiple focus groups to learn what its employees like about competitors' smart glasses.
References: www.eweek.com
References: www.eweek.com, www.techradar.com
Apple is reportedly speeding up development of its first pair of AI-powered smart glasses, with a targeted release in late 2026. These glasses, internally codenamed "N401" and previously "N50," are designed to compete with Meta’s popular Ray-Ban smart glasses. Insiders describe Apple's glasses as "similar to Meta’s product but better made," and they will feature built-in cameras, microphones, and speakers.
The glasses are expected to "analyze the external world and take requests via the Siri voice assistant," enabling tasks such as making phone calls, playing music, live translation, and GPS navigation. While Apple hasn't officially confirmed the product, sources indicate the company plans to produce prototypes in large quantities by the end of this year, working with overseas suppliers. Apple is focusing on simplicity in the initial design, foregoing full augmented reality (AR) capabilities for now, with the ultimate goal of releasing AR-capable spectacles in the future.

Google is also actively developing smart glasses on its Android XR system and has partnered with brands like Warby Parker and Gentle Monster to improve the design and appeal of its devices. AI assistants such as Gemini are seen as a crucial feature for smart glasses, providing users with real-time information and assistance. Google's focus on fashion and user-friendly design aims to avoid repeating the negative perception that dogged the earlier Google Glass.
References: Hamish Hector (TechRadar)
Google is making a renewed push into the smart glasses market with the upcoming Android XR glasses, leveraging its Gemini AI to enhance user experience. During the I/O 2025 developer conference, Google showcased the capabilities of these glasses, highlighting features such as live language translation. The Android XR glasses can connect to a smartphone to access apps and come equipped with speakers. An optional in-lens display will allow users to view information privately.
Google is partnering with eyewear brands such as Warby Parker and Gentle Monster, focusing on stylish, wearable designs. These collaborations aim to move away from the bulky, tech-heavy aesthetic of earlier smart glasses and suggest Google is taking style far more seriously this time around. Warby Parker is well known as a direct-to-consumer brand that makes trendy glasses accessible at a relatively low price, while Gentle Monster is currently one of the buzziest eyewear brands not owned by EssilorLuxottica. The Korean brand is popular among Gen Z, thanks in part to its edgy silhouettes and its favor among fashion-forward celebrities like Kendrick Lamar, Beyoncé, Rihanna, Gigi Hadid, and Billie Eilish. Partnering with both brands hints that Android XR is aimed at versatile everyday glasses as well as bolder, trendsetting options.

The glasses integrate Google's Gemini AI assistant, letting users interact with their surroundings and access information hands-free. Google is working with Xreal and Qualcomm on Project Aura, an optical see-through XR device that will run Android XR software, and plans to integrate Project Astra so users can converse with Search in real time about what their camera sees. The company has also rolled out AI Mode in Google Search to every U.S. user, answering questions before showing the traditional list of links.
References: www.artificialintelligence-news.com
Apple is doubling down on its custom silicon efforts, developing a new generation of chips destined for future smart glasses, AI-capable servers, and the next iterations of its Mac computers. The company's hardware strategy continues to favor in-house production, aiming to optimize performance and efficiency across its product lines. The initiative includes a custom chip for smart glasses, designed for voice commands, photo capture, and audio playback; it draws on the low-power components of the Apple Watch, with modifications to further reduce energy consumption and support multiple cameras. Production of the smart glasses chip is anticipated to begin in late 2026 or early 2027, potentially bringing the device to market within two years, and Taiwan Semiconductor Manufacturing Co. is expected to handle production, as it does for most Apple chips.
Apple is also exploring integrating cameras into devices like AirPods and the Apple Watch, using chips currently under development codenamed "Nevis" (Apple Watch) and "Glennie" (AirPods), both slated for a potential release around 2027. Beyond hardware, Apple is considering AI-powered search results in its Safari browser, potentially shifting away from reliance on Google Search. Eddy Cue, Apple's SVP of services, confirmed the company has held discussions with AI companies including Anthropic, OpenAI, and Perplexity to ensure it has alternatives available, demonstrating a commitment to staying nimble amid technological shifts.

Apple is planning AR and non-AR glasses under the codename N401, and CEO Tim Cook hopes the company can take the lead in this market segment. Cue went so far as to say that in 10 years you may not need an iPhone, acknowledging that AI is a new technology shift that creates opportunities for new entrants and that Apple needs to stay open to future possibilities.