@www.eweek.com
//
Apple is exploring groundbreaking technology to enable users to control iPhones, iPads, and Vision Pro headsets with their thoughts, marking a significant leap towards hands-free device interaction. The company is partnering with Synchron, a brain-computer interface (BCI) startup, to develop a universal standard for translating neural activity into digital commands. This collaboration aims to empower individuals with conditions such as ALS and severe spinal cord injuries, allowing them to navigate and operate their devices without physical gestures.
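To make "translating neural activity into digital commands" concrete, here is a minimal, purely hypothetical Swift sketch: a decoded intent from a BCI classifier is mapped onto the kind of discrete selection and navigation actions an accessibility-style input interface consumes. Every type and name below is illustrative; neither Apple nor Synchron has published a BCI input API.

```swift
// Hypothetical sketch only: maps a decoded neural intent onto discrete
// accessibility-style input actions. No real Apple or Synchron API is used.
enum DecodedIntent {
    case select                 // user intends a selection
    case moveNext               // advance focus to the next item
    case movePrevious           // move focus back
    case dwell(seconds: Double) // sustained intent, e.g. holding focus
}

enum InputAction {
    case activate       // equivalent to pressing a physical switch
    case focusForward   // advance the scanning cursor
    case focusBackward
    case none
}

func inputAction(for intent: DecodedIntent) -> InputAction {
    switch intent {
    case .select:
        return .activate
    case .moveNext:
        return .focusForward
    case .movePrevious:
        return .focusBackward
    case .dwell(let seconds):
        // Treat a long dwell as a confirmed activation, a short one as noise.
        return seconds > 1.5 ? .activate : .none
    }
}
```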
Apple's initiative involves Synchron's Stentrode, a stent-like implant placed in a vein near the brain's motor cortex. This device picks up neural activity and translates it into commands, enabling users to select icons on a screen or navigate virtual environments. The brain signals work in conjunction with Apple's Switch Control feature, a part of its operating system designed to support alternative input devices. While early users have noted the interface is slower than traditional methods, Apple plans to introduce a dedicated software standard later this year to simplify the development of BCI tools and improve performance.
In addition to BCI technology, Apple is also focusing on enhancing battery life in future iPhones through artificial intelligence. The upcoming iOS 19 is expected to feature an AI-powered battery optimization mode that learns user habits and manages app energy usage accordingly. This feature is particularly relevant for the iPhone 17 Air, where it will help offset the impact of a smaller battery. Furthermore, Apple is reportedly exploring the use of advanced memory technology and innovative screen designs for its 20th-anniversary iPhone in 2027, aiming for faster AI processing and extended battery life. Recommended read:
References :
John-Anthony Disotto@techradar.com
//
Apple is reportedly focusing on AI and design upgrades for its upcoming iPhone lineup. A new Apple Intelligence feature is being developed for iOS 19, set to launch this fall: an AI-powered battery optimization mode designed to extend battery life, particularly on the iPhone 17 Air, which is expected to rely on it to compensate for a smaller battery. The company appears to be heavily invested in AI, viewing it as a crucial element for future device enhancements.
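Apple has not described how the optimization mode works internally, but the behavior described here, learning usage habits and managing per-app energy, can be sketched in plain Swift. The types and thresholds below are assumptions for illustration only, not an iOS API.

```swift
// Hypothetical sketch only: not an iOS API. Learns which apps are expensive
// and rarely used at a given hour, so their background work can be deferred.
struct AppEnergySample {
    let bundleID: String
    let hour: Int            // hour of day (0-23) when the sample was taken
    let energyScore: Double  // relative energy cost observed, 0...1
    let wasForeground: Bool  // whether the user actually had the app open
}

struct AdaptiveBatteryPolicy {
    private var samples: [AppEnergySample] = []

    mutating func record(_ sample: AppEnergySample) {
        samples.append(sample)
    }

    /// Defer background refresh for apps that are energy-hungry but that the
    /// user almost never opens at this hour of the day.
    func shouldDeferBackgroundRefresh(for bundleID: String, at hour: Int) -> Bool {
        let relevant = samples.filter { $0.bundleID == bundleID && $0.hour == hour }
        guard !relevant.isEmpty else { return false }
        let foregroundRate = Double(relevant.filter(\.wasForeground).count) / Double(relevant.count)
        let averageEnergy = relevant.map(\.energyScore).reduce(0, +) / Double(relevant.count)
        return foregroundRate < 0.1 && averageEnergy > 0.5
    }
}
```

In this toy model, only apps that are both costly and rarely foregrounded at the current hour get deferred; a real implementation would weigh many more signals.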
Meanwhile, reports indicate Apple is contemplating a price increase for the next iPhone, possibly the iPhone 17 series. Sources suggest the company aims to avoid attributing the increase to tariffs. Apple has historically relied on Chinese manufacturers, and while past tariffs have been a concern, current trade conditions show a pause on tariffs between the US and China until early August. Apple is considering attributing the price hike to the inclusion of next-generation features and design changes.
In other news, Apple is gearing up for Global Accessibility Awareness Day with a preview of new accessibility features coming to its platforms later this year. This marks the 40th anniversary of Apple's dedicated accessibility office, with continuous development in this area. Upcoming features include App Store Accessibility Nutrition Labels, which will inform users about supported accessibility features like VoiceOver and Reduce Motion. Additionally, the Magnifier app, currently on the iPhone, will be introduced to the Mac, allowing users to manipulate images for better visibility using Continuity Camera or a USB-compatible camera. Recommended read:
References :
@thetechbasic.com
//
References: thetechbasic.com, www.bloomberg.com
Apple is making significant investments in AI and expanding connectivity features across its devices. According to recent reports, the tech giant is developing specialized chips designed for a range of upcoming products, including non-AR glasses aimed at competing with the Meta Ray-Bans, new chips for AirPods and Apple Watches potentially equipped with cameras, a high-end AI server chip, and updated M-series chips for its Mac line. This move highlights Apple's commitment to pushing the boundaries of its hardware capabilities and integrating AI deeper into its ecosystem.
Apple is also working on streamlining the Wi-Fi connectivity experience for users in hotels, trains, and airplanes. The company aims to implement a system where users can log in to Wi-Fi networks once on their iPhone, and all their other Apple devices, such as iPads and Macs, will automatically connect without requiring repeated logins. This feature would significantly reduce the hassle of re-entering credentials on multiple devices, especially beneficial for families and business travelers who rely on seamless connectivity on the go. While the expanded Wi-Fi sharing feature promises convenience, its effectiveness may be limited by the policies of individual establishments. Some hotels, for example, restrict the number of devices that can be connected simultaneously. Apple's new system might simplify device switching within those limits, but it won't bypass the overall restriction. The tech giant's focus on AI integration and enhanced connectivity underscores its dedication to improving user experience across its hardware and software offerings. Recommended read:
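Apple has not said how the cross-device hand-off would be implemented. For context, programmatic Wi-Fi joining already exists on iOS through the NetworkExtension framework; the sketch below shows how an app can join a known network today (the SSID and passphrase are placeholders, and the Hotspot Configuration entitlement is required), while the rumored feature would presumably sync such credentials at the system level.

```swift
import NetworkExtension

// Sketch: join a known Wi-Fi network programmatically on iOS.
// Requires the Hotspot Configuration entitlement; SSID and passphrase are placeholders.
let configuration = NEHotspotConfiguration(ssid: "HotelGuestWiFi",
                                           passphrase: "example-passphrase",
                                           isWEP: false)
configuration.joinOnce = false  // persist so the device can rejoin automatically

NEHotspotConfigurationManager.shared.apply(configuration) { error in
    if let error = error {
        print("Failed to join network: \(error.localizedDescription)")
    } else {
        print("Wi-Fi configuration applied")
    }
}
```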
References :
@www.artificialintelligence-news.com
//
Apple is doubling down on its custom silicon efforts, developing a new generation of chips destined for future smart glasses, AI-capable servers, and the next iterations of its Mac computers. The company's hardware strategy continues to focus on in-house production, aiming to optimize performance and efficiency across its product lines. This initiative includes a custom chip for smart glasses, designed for voice commands, photo capture, and audio playback, drawing inspiration from the low-power components of the Apple Watch but with modifications to reduce energy consumption and support multiple cameras. Production for the smart glasses chip is anticipated to begin in late 2026 or early 2027, potentially bringing the device to market within two years, with Taiwan Semiconductor Manufacturing Co. expected to handle production, as they do for most Apple chips.
Apple is also exploring integrating cameras into devices like AirPods and Apple Watches, utilizing chips currently under development codenamed "Nevis" for Apple Watch and "Glennie" for AirPods, both slated for a potential release around 2027.
In addition to hardware advancements, Apple is considering incorporating AI-powered search results in its Safari browser, potentially shifting away from reliance on Google Search. Eddy Cue, Apple's SVP of services, confirmed the company has engaged in discussions with AI companies like Anthropic, OpenAI, and Perplexity to ensure it has alternative options available, demonstrating a commitment to staying nimble in the face of technological shifts.
Apple is also planning the launch of AR and non-AR glasses under the codename N401, a market segment CEO Tim Cook hopes Apple can lead. Cue has even suggested that in ten years you may not need an iPhone, acknowledging that AI is a new technology shift, one that creates openings for new entrants and requires Apple to stay open to future possibilities. Recommended read:
References :
Mark Gurman@Bloomberg Technology
//
Apple is collaborating with Anthropic to integrate the Claude AI tool into its Xcode development environment. This partnership marks a significant shift for Apple, which typically prefers to develop its own internal tools. The goal is to enhance coding efficiency and help Apple engineers fix errors more quickly. By adding Claude to Xcode, the software used to create iPhone and Mac apps, Apple aims to streamline the development process and improve overall productivity.
The integration of Claude will allow engineers to receive code suggestions, automatically find and fix bugs, and test applications more efficiently. For example, an engineer can simply request "add a dark mode button," and Claude will recommend the appropriate code. This saves time, allowing engineers to focus on developing core concepts. Apple's previous attempt at building its own AI helper, Swift Assist, reportedly suffered from issues such as generating incorrect code and being too slow. This collaboration with Anthropic allows Apple to catch up with other tech giants like Microsoft and Google, which already have robust AI tools. The plan is to combine Claude with Swift Assist and other tools like GitHub Copilot to improve Xcode. If the integration proves successful internally, Apple may release it to third-party developers in the future, potentially at the company's WWDC event in June. This new way of working signals a major change in Apple's approach to AI and software development. Recommended read:
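For a sense of scale, a request like "add a dark mode button" maps to only a few lines of SwiftUI. The snippet below is an ordinary hand-written illustration of the kind of code an assistant might propose, not actual Claude or Xcode output.

```swift
import SwiftUI

struct AppearanceSettingsView: View {
    // Persist the choice so it survives app relaunches.
    @AppStorage("prefersDarkMode") private var prefersDarkMode = false

    var body: some View {
        Toggle("Dark Mode", isOn: $prefersDarkMode)
            .padding()
            // Apply the chosen appearance to this view hierarchy.
            .preferredColorScheme(prefersDarkMode ? .dark : .light)
    }
}
```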
References :
Ashutosh Singh@The Tech Portal
//
Apple is enhancing its AI capabilities, known as Apple Intelligence, by employing synthetic data and differential privacy to prioritize user privacy. The company aims to improve features like Personal Context and Onscreen Awareness, set to debut in the fall, without collecting or copying personal content from iPhones or Macs. By generating synthetic text and images that mimic user behavior, Apple can gather usage data and refine its AI models while adhering to its strict privacy policies.
Apple's approach involves creating artificial data that closely matches real user input to enhance Apple Intelligence features. This method addresses the limitations of training AI models solely on synthetic data, which may not always accurately reflect actual user interactions. When users opt into Apple's Device Analytics program, the AI models will compare these synthetic messages against a small sample of a user’s content stored locally on the device. The device then identifies which of the synthetic messages most closely matches its user sample, and sends information about the selected match back to Apple, with no actual user data leaving the device.
To further protect user privacy, Apple utilizes differential privacy techniques. This involves adding randomized data to broader datasets to prevent individual identification. For example, when analyzing Genmoji prompts, Apple polls participating devices to determine the popularity of specific prompt fragments. Each device responds with a noisy signal, ensuring that only widely-used terms become visible to Apple, and no individual response can be traced back to a user or device.
Apple plans to extend these methods to other Apple Intelligence features, including Image Playground, Image Wand, Memories Creation, and Writing Tools. This technique allows Apple to improve its models for longer-form text generation tasks without collecting real user content. Recommended read:
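The "noisy signal" described here is essentially randomized response: each device sometimes reports its true answer and sometimes a random one, so the aggregate popularity of a prompt fragment can be estimated while no single report is meaningful on its own. The Swift sketch below illustrates that general technique; Apple's actual parameters and encoding are not public.

```swift
// Randomized response: report the truth with probability p, otherwise a random bit.
// A single report reveals almost nothing; aggregated reports still estimate the
// true frequency of a prompt fragment.
func noisyReport(deviceUsesFragment truth: Bool, truthProbability p: Double = 0.75) -> Bool {
    Double.random(in: 0..<1) < p ? truth : Bool.random()
}

// Unbiased estimate of the true usage rate from many noisy reports.
func estimatedUsageRate(reports: [Bool], truthProbability p: Double = 0.75) -> Double {
    let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
    // observed = p * trueRate + (1 - p) * 0.5  =>  solve for trueRate.
    return (observed - (1 - p) * 0.5) / p
}

// Example: 10,000 devices, about 20% of which actually used the fragment.
let reports = (0..<10_000).map { _ in noisyReport(deviceUsesFragment: Double.random(in: 0..<1) < 0.2) }
print(estimatedUsageRate(reports: reports))  // ≈ 0.2
```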
References :
@www.theapplepost.com
//
References: www.applemust.com, The Apple Post
Apple is significantly ramping up its efforts in the field of artificial intelligence, with a dedicated focus on enhancing Siri and the overall Apple Intelligence platform. Teams within Apple have been instructed to prioritize the development of superior AI features for Apple Intelligence, demonstrating the company's commitment to leading in this domain. This push involves improving Siri's capabilities through features like Personal Context, Onscreen Awareness, and deeper app integration, aiming to create a more intuitive and capable virtual assistant.
Apple has also made strides in machine learning research, particularly in the area of multimodal large language models (LLMs). Their research, named MM-Ego, focuses on enabling models to better understand egocentric video. These capabilities could provide users with real-time activity suggestions, automated task management, personalized training programs, and automated summarization of recorded experiences. Moreover, Apple is committed to making on-device model updates available, ensuring that users benefit from the latest AI advancements directly on their devices.
According to reports, Apple is planning to release its delayed Apple Intelligence features this fall. The release will include Personal Context, Onscreen Awareness, and deeper app integration. These enhancements are designed to enable Siri to understand and reference a user's personal information, such as emails, messages, files, and photos, to assist with various tasks. Onscreen Awareness will allow Siri to respond to content displayed on the screen, while Deeper App Integration will empower Siri to perform complex actions across multiple apps without manual input. Recommended read:
References :
@www.theapplepost.com
//
References: Apple Must, The Apple Post
Apple is doubling down on its efforts to deliver top-tier AI capabilities, rallying its teams to "do whatever it takes" to make Apple Intelligence the best it can be. New leaders, including Craig Federighi and Mike Rockwell, have been brought in to revamp Siri and other AI features. The company is reportedly encouraging the use of open-source models, if necessary, signaling a shift in strategy to prioritize performance and innovation over strict adherence to in-house development. This renewed commitment comes after reports of internal conflict and confused decision-making within Apple's AI teams, suggesting a major course correction to meet its ambitious AI goals.
Apple is planning to release its delayed Apple Intelligence features this fall, including Personal Context, Onscreen Awareness, and deeper app integration, according to sources cited by The New York Times. The features were initially announced in March but were later postponed. Personal Context will allow Siri to understand and reference user emails, messages, files, and photos. Onscreen Awareness will enable Siri to respond to what’s currently on the screen, while Deeper App Integration will give Siri the power to perform complex, multi-step actions across apps without manual input.
The push for enhanced AI follows reports of internal strife and shifting priorities within Apple's AI development teams. According to The Information, some potentially exciting projects were shelved in favor of smaller projects. Additionally, the impressive feature demo of contextual intelligence Apple showcased at WWDC "came as a surprise" to some Siri team members. Despite past challenges, Apple is determined to deliver on its AI vision, aiming to integrate advanced intelligence seamlessly into its products and services, potentially with the launch of iOS 19. Recommended read:
References :
@the-decoder.com
//
References: THE DECODER, www.techradar.com
Apple is facing challenges in its efforts to integrate advanced AI capabilities into Siri and the broader Apple Intelligence suite. Despite aiming to catch up with AI models like ChatGPT, the company has encountered technical setbacks, internal power struggles, and a divided management team. Originally slated for a summer 2024 release, key features like notification summarization were quickly disabled due to accuracy issues, and a planned Spring 2025 Siri upgrade was delayed after showing high error rates in internal testing.
In response to these setbacks, Apple has reorganized its AI leadership, with software chief Craig Federighi taking control, supported by Mike Rockwell. The company has also acknowledged delays, stating that it will take longer than initially anticipated to deliver on the promised features. Despite earlier uncertainties, the expectation is that the enhanced Siri capabilities, as part of the broader Apple Intelligence suite, will debut with the release of iOS 19 this fall. The revamped Siri aims to provide deeper app integrations, context understanding, and the ability to take action on behalf of users via voice commands. Apple plans to release a virtual assistant this fall capable of performing actions like editing and sending photos on request. While the ultimate goal is to bring Siri on par with AI chatbots like ChatGPT and Gemini, the initial rollout may focus on select features, with further upgrades expected in the future. Recommended read:
References :
@computerworld.com
//
Apple is facing significant internal challenges in its efforts to revamp Siri and integrate Apple Intelligence features. A new report has revealed epic dysfunction within the company, highlighting conflicts between managerial styles, shifting priorities, and a sense of being "second-class citizens" among Siri engineers. The issues stem, in part, from leadership differences, with some leaders favoring slow, incremental updates while others prefer a more brash and efficient approach. These conflicts have reportedly led to stalled projects and a lack of clear direction within the teams.
Despite these internal struggles, Apple intends to roll out the contextual Siri features it promised at WWDC 2024 this fall, potentially as part of iOS 19. The company has shifted senior leadership to ensure this happens. A key point of contention has been the integration of AI development efforts, with the software team led by Craig Federighi reportedly taking on more AI responsibilities and building within existing systems, which left the original Siri team feeling sidelined and slow to make progress. It remains unclear if the company can resolve these internal conflicts in time to deliver a seamless and improved Siri experience.
Apple's AI teams have been instructed to "do whatever it takes" to build the best artificial intelligence features, even if that means using open-source models instead of Apple's own creations. This decision follows years of focus on the wrong things, internal conflict, and confused decision-making within the teams, according to the report. A spoken user interface for visionOS that never got completed, despite being an exciting-sounding prospect, is just one example of shelved ideas in favor of projects with little impact. Despite the chaos, the "tech bros got to work it out," says Jonny Evans in his column about Apple. Recommended read:
References :
@Latest from Laptop Mag
//
Apple is preparing to launch a significant expansion into digital health with its AI-driven Health+ service. Described by insiders as a "doctor in your iPhone," this service aims to optimize workouts, nutrition, and overall health management. The development represents a considerable investment and years of anticipation in the field of AI-enhanced healthcare.
The Health+ service, reportedly called Project Mulberry, will feature an AI health coach trained on data gathered by Apple’s own physicians. Apple intends to bring in outside specialists in areas like sleep, nutrition, and mental health to create video content. The goal is to empower users to take charge of their health journey and make a fundamental difference to healthcare. Recommended read:
References :
@Latest from Laptop Mag
//
Apple is reportedly developing a new AI-powered health app, codenamed "Project Mulberry," aimed at revolutionizing health tracking on iPhones. Expected to launch alongside iOS 19, the app will use AI algorithms to analyze health metrics from Apple devices and third-party sources. This comprehensive approach will enable the app to provide personalized insights and actionable recommendations to improve overall wellness, potentially transforming how users manage their health.
The app's AI coach will look at data from devices such as Apple Watch and potentially AirPods, which may gain future health-tracking features like heart-rate monitoring and temperature sensing. Apple is collaborating with health experts, including doctors, nutritionists, and therapists, to train the AI agents and create educational videos for the service, which may be called "Health+". The goal is to create an AI coach that can give personalized advice on sleep, exercise, mental well-being and other health issues. Recommended read:
References :
@laptopmag.com
//
Apple is reportedly developing future Apple Watch models with integrated cameras to enhance its artificial intelligence capabilities. According to reports, Apple aims to add small cameras to both the standard Apple Watch and the Ultra version, potentially positioning them on the front screen or where the side button is located. The goal is to provide the watch with visual perception of its environment, making it smarter and more useful for users.
Adding cameras to the Apple Watch would enable new features, similar to object and text scanning currently available on iPhones. For example, the watch could identify objects, translate foreign text, or provide information about food packaging for allergy-conscious individuals. The camera system could also have applications in health monitoring, such as examining skin tone to determine sleep quality or measuring heart rate through wrist readings.
Apple has been exploring various camera designs for the Apple Watch for years, with patents showcasing different approaches. These include hiding a camera in the watch band for health tracking, a pop-up camera for occasional use, and even integrating a camera into the Digital Crown or using a flip-up screen. Despite these advancements, Apple is shuffling its executive team, replacing the head of AI with the VP of Vision Pro, suggesting a strategic shift in its AI efforts, with internal sources labeling delays with new AI rollouts as "ugly." Recommended read:
References :
Jonny Evans@Apple Must
//
Apple is making significant changes to its AI strategy by replacing John Giannandrea with Mike Rockwell as the head of Siri development. CEO Tim Cook reportedly lost confidence in Giannandrea’s execution regarding product development. This decision highlights Apple’s urgent need to enhance Siri’s capabilities and improve its integration across Apple devices and services. The move was described by one insider as a measure to "sort Siri out," reflecting internal pressure to boost the AI assistant's performance.
Rockwell, celebrated for his role in developing the Vision Pro, is expected to bring a fresh perspective to Siri’s development. Two of Rockwell's top lieutenants, Kim Vorrath and Aimee Nugent, have already been moved into the Siri team. While Rockwell will oversee Siri and report to Craig Federighi, Giannandrea will remain at Apple, focusing on broader AI and robotics research and technologies. This reorganization, announced internally, signifies a strategic shift following discussions at Apple's annual leadership summit about the company’s AI future. Recommended read:
References :
Asma Hussain@iThinkDifferent
//
Apple is facing significant difficulties with its AI assistant, Siri, leading to feature delays and internal frustration. According to reports, Apple's top Siri executive admitted in a meeting that the AI upgrade has been problematic, calling the delays "ugly and embarrassing." The executive also acknowledged that promoting features before they were ready worsened the situation. Despite these challenges, Apple aims to improve Siri and make it the "world's greatest" assistant.
The issues with Siri's AI upgrade have resulted in key enhancements being postponed, with some features possibly not appearing until the iOS 19 cycle or later. The premature showcasing of these capabilities at events like WWDC 2024 and in marketing campaigns for the iPhone 16 has added to the pressure. Although Apple is committed to enhancing Siri, quality issues remain a critical hurdle, raising concerns about the company's ability to compete in the rapidly evolving AI landscape. Recommended read:
References :
@Simon Willison's Weblog
//
Apple is facing setbacks in its AI development, leading to delays in key features for Siri and Apple Intelligence. The enhanced version of Siri, promising a more personalized AI assistant experience, won't arrive until 2026, a pushback from its originally slated release with iOS 18. These delayed features are focused on enhancing Siri's context-awareness and capabilities, including scanning emails, messages, files, and photos to perform tasks across multiple apps, and understanding on-screen actions.
Apple's plan to replace Qualcomm modems with its in-house C1 chips is also facing hurdles, postponing the release of the mmWave-compatible C1 modem until 2026, potentially impacting the iPhone 17 lineup. This delay means the iPhone 17 series will continue using Qualcomm modems, as Apple grapples with technical challenges in perfecting mmWave integration. While the first-generation C1 modem offered improved power management, it does not support mmWave 5G. This setback impacts Apple's 5G independence and delays modem upgrades for iPhone users. Recommended read:
References :
Michal Langmajer@Fello AI
//
Apple's plans to revamp Siri with advanced AI capabilities have been delayed, with a truly modernized version not expected until iOS 20 in 2027. This upgrade, known as "LLM Siri," was initially planned for iOS 19 but has encountered roadblocks, raising concerns about Apple's competitiveness in the AI field. According to Bloomberg’s Mark Gurman, Apple is struggling to rebuild Siri for the age of generative AI, and the company might not release "a true modernized, conversational version of Siri” until 2027.
This delay means Apple's AI team no longer expects a fully conversational Siri before 2027. While Apple Intelligence aims to integrate AI across its devices with improved language models and contextual awareness, Siri, the centerpiece of Apple's assistant technology, lags behind competitors. Apple has cautioned investors that upcoming products may not achieve the same level of success as the iPhone, which remains a primary source of income for the company. Recommended read:
References :
nftjedi@chatgptiseatingtheworld.com
//
Apple has announced a massive $500 billion investment in the United States over the next four years, signaling a significant boost to American manufacturing and innovation. This commitment includes the establishment of a new AI server manufacturing facility in Texas, expected to open in 2026. The 250,000-square-foot facility near Houston will produce servers powering Apple Intelligence and Private Cloud Compute, previously manufactured overseas, creating thousands of jobs.
Apple's investment also encompasses the creation of 20,000 new jobs nationwide, with a focus on research and development, silicon engineering, software development, and AI and machine learning. Furthermore, Apple is doubling its U.S. Advanced Manufacturing Fund to $10 billion, supporting innovation and high-skilled manufacturing. The fund includes a multibillion-dollar commitment to advanced silicon chip production at TSMC's Arizona facility and the opening of an "Apple Manufacturing Academy" in Detroit, Michigan. Recommended read:
References :