Jibin Joseph@PCMag Middle East
//
Google is experimenting with replacing its iconic "I'm Feeling Lucky" button with a new "AI Mode" tool. This represents a significant shift in how the search engine operates, moving away from providing a direct link to the first search result and towards offering AI-powered answers directly within the Google search interface. The "I'm Feeling Lucky" button, which has been a staple of Google's homepage since 1998, bypasses the search results page entirely, taking users directly to what Google deems the most relevant website. However, Google believes that most users now prefer browsing a range of links rather than immediately jumping to a single site.
The new AI Mode aims to provide a more interactive and informative search experience. When users ask questions, AI Mode leverages Google's AI chatbot to generate detailed responses instead of simply presenting a list of links. For example, if a user asks "Where can I purchase a camping chair under $100?", AI Mode may display images, prices, and store links directly within the search results. Users can then ask follow-up questions, such as "Is it waterproof?", and receive further details and recommendations. The AI also uses real-time information to display store hours, product availability, and other relevant data.

Testing of the AI Mode button is currently limited to a small percentage of U.S. users enrolled in Google's Search Labs program. Google is exploring different placements for the button, including inside the search bar next to the camera icon, or replacing the "I'm Feeling Lucky" button entirely. Some users have also reported seeing a rainbow-colored glow around the button when they hover over it. While this move aims to align with modern search habits, some users have expressed concern over losing the nostalgic "I'm Feeling Lucky" feature, as well as over the accuracy of AI Mode's answers. References :
Classification:
Matt G.@Search Engine Journal
//
OpenAI is rolling out a series of updates to ChatGPT, aiming to enhance its search capabilities and introduce a new shopping experience. These features are now available to all users, including those with free accounts, across all regions where ChatGPT is offered. The updates build upon real-time search features that were introduced in October and aim to challenge established search engines such as Google. ChatGPT's search function has seen a rapid increase in usage, processing over one billion web searches in the past week.
The most significant addition is the introduction of shopping functionality, allowing users to search for products, compare options, and view visual details like pricing and reviews directly within the chatbot. OpenAI emphasizes that product results are chosen independently and are not advertisements, with recommendations personalized based on current conversations, past chats, and user preferences. The initial focus will be on categories like fashion, beauty, home goods, and electronics. OpenAI also plans to integrate its memory feature with shopping for Pro and Plus users, meaning ChatGPT will reference a user's previous chats to make highly personalized product recommendations.

In addition to the new shopping features, OpenAI has added other improvements to ChatGPT's search capabilities. Users can now access ChatGPT search via WhatsApp. Other improvements include trending searches and autocomplete, which offer suggestions as you type to speed up searches. Furthermore, ChatGPT will provide multiple sources for information and highlight the specific portions of text that correspond to each source, making it easier for users to verify facts across multiple websites. While these new features aim to enhance the user experience, OpenAI is also addressing concerns about ChatGPT's "yes-man" personality through system prompt updates. References :
Classification:
@developers.googleblog.com
//
Google is aggressively advancing AI agent interoperability with its new Agent2Agent (A2A) protocol and development kit. Unveiled at Google Cloud Next '25, the A2A protocol aims to standardize how AI agents communicate, collaborate, and discover each other across different platforms and tasks. This initiative is designed to streamline the exchange of tasks, streaming updates, and sharing of artifacts, fostering a more connected and efficient AI ecosystem. The A2A protocol complements existing efforts by providing a common language for agents, enabling agents built on different frameworks, such as LangChain, AutoGen, and Pydantic, to interoperate seamlessly.
The Agent2Agent protocol introduces the concept of an "Agent Card" (agent.json), which describes an agent's capabilities and how to reach it. Agents communicate through structured messages that indicate task states such as working, input-required, or completed. By establishing this open standard, Google, along with partners like SAP, seeks to enable agents from different vendors to interact, share context, and collaborate effectively. This move represents a significant step beyond simple API integrations, laying the groundwork for interoperability and automation across traditionally disconnected systems.

The development of A2A aligns with Google's broader strategy to solidify its position in the competitive AI landscape, challenging rivals like Microsoft and Amazon. Google is not only introducing new AI chips, such as the Ironwood TPUs, but also updating its Vertex AI platform with Gemini 2.5 models and releasing an agent development kit. This comprehensive approach aims to empower businesses to turn AI potential into real-world impact by facilitating open agent collaboration, model choice, and multimodal intelligence. The collaboration with SAP to enable AI agents to securely interact and collaborate across platforms through A2A exemplifies this commitment to enterprise-ready AI that is open, flexible, and deeply grounded in business context. References :
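To make the Agent Card and task-state ideas concrete, here is a minimal sketch in Python. It assumes only what the summary above states: an agent.json-style card describing capabilities and a reachable URL, and structured messages carrying one of the task states working, input-required, or completed. The specific field names (`skills`, `taskId`, `detail`) and the example agent are illustrative, not the authoritative A2A schema.

```python
import json

# Hypothetical Agent Card, in the spirit of agent.json: it describes what
# the agent can do and how to reach it. Field names are illustrative.
agent_card = {
    "name": "inventory-agent",
    "description": "Checks stock levels and prices across stores",
    "url": "https://agents.example.com/inventory",  # hypothetical endpoint
    "skills": ["stock-lookup", "price-compare"],
}

# The three task states called out in the article.
VALID_STATES = {"working", "input-required", "completed"}

def make_status_message(task_id: str, state: str, detail: str = "") -> str:
    """Build a structured task-status message as a JSON string."""
    if state not in VALID_STATES:
        raise ValueError(f"unknown task state: {state}")
    return json.dumps({"taskId": task_id, "state": state, "detail": detail})

# Example: an agent signals that it needs more input before it can proceed.
msg = make_status_message("task-42", "input-required", "Need a store location")
print(msg)
```

In this sketch, one agent would fetch another's card to discover its skills and endpoint, then exchange status messages like `msg` until the task reaches the completed state.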
Classification: