Apple plans AI reboot with Siri app, new look and ‘Ask Siri’ button in iOS 27

SAN FRANCISCO – Apple is testing a standalone app for its Siri voice assistant alongside a new “Ask Siri” feature that will work across the company’s software, part of a broader artificial intelligence overhaul.

Apple is also modernising Siri by giving it a fresh look and chatbot-like experience, according to people with knowledge of the matter.

The new Siri is slated to be unveiled on June 8 at the iPhone maker’s Worldwide Developers Conference (WWDC) as part of the iOS 27 and macOS 27 operating systems, said the people, who asked not to be identified because the plans are private. 

After struggling to gain traction with Apple Intelligence, the company has been working to rebuild the AI platform around the new version of Siri.

The goal is to transform the technology from a traditional voice assistant into a systemwide AI agent with deep integration across applications.

Apple announced the schedule for WWDC on March 23 and vowed to highlight “AI advancements”, but the company hasn’t publicly discussed what it plans to present. A spokesperson for the Cupertino, California-based tech giant declined to comment.

The updated Siri, code-named Campo, is designed to better control features within iPhones and Macs and tap into personal data – like messages, notes and e-mails – to fulfil requests. It will also be able to complete tasks within apps, access news content, and search the open web using Apple-built interfaces and models.

The most significant change is the ability to interact with users in a conversational, chat-like format – through text and voice. The move represents a clear break from the current Siri experience, which lacks conversational capabilities, and marks a strategic shift for the company.

Software engineering head Craig Federighi, who now oversees AI efforts, told Tom’s Guide in 2025 that Apple didn’t want to send users “off into some chat experience in order to get things done”.

But the rapid adoption of services like ChatGPT has made that stance increasingly difficult to maintain. Even so, Apple is unlikely to characterise its new technology as a chatbot. 

As part of the shift toward this approach, Apple is testing a dedicated Siri app for the iPhone, iPad and Mac, slated for later in 2026. The app would rival outside AI tools while also giving users a central place to access their past interactions.

The app’s main interface will display prior conversations in either a list or a grid of rounded rectangles with text previews. Users can pin favourite chats, save older conversations, search across interactions and start new chats via a prominent plus button.

The conversation view resembles a thread in Apple’s Messages app, with chat bubbles and a text entry field. It also includes a toggle for switching in and out of voice mode and an option to upload attachments – such as documents and photos – for analysis.

These features have already become standard in modern chatbot interfaces.

When starting a new conversation, Siri will offer suggested prompts based on prior usage. The interface adapts to light and dark modes, with a white background and dark text or the inverse.

Users will still be able to trigger Siri via the power button or voice command, but Apple is testing a redesigned interface that replaces the glowing edges effect introduced in iOS 18.

One new design in testing places Siri at the top of the screen within the Dynamic Island, the mini-interface that Apple introduced in 2022. After it’s activated, Siri will prompt the user to “Search or Ask”. 

When processing a request, a pill-shaped indicator labelled “Searching” appears, alongside a glowing Siri icon.

Once results are ready, the interface expands into a larger translucent panel with Apple’s Liquid Glass design. Users can pull the menu down further to begin conversing back and forth.

The final design could change, and Apple’s human interface team typically tests a number of different options. 

Apple is also working to replace its existing on-device search system, Spotlight, with Siri. The new unified interface will let users find local content or submit broader queries in one place.

The search interface will also keep showing “Siri Suggestions” – the apps, upcoming appointments or settings changes suggested by AI. But it will be able to root through more types of user data than the current version of iOS Spotlight Search, leveraging a feature called Personal Context that was delayed from 2025.

The updated Siri will also provide more detailed responses sourced from the web, including summaries, bullet points and images – an attempt to compete more directly with AI-driven search tools like Google Gemini and Perplexity. The software can also generate deeper summaries of daily news using content from Apple News.

Apple is also testing the idea of integrating the assistant more deeply across its operating systems with new entry points.

A systemwide “Ask Siri” toggle will appear in menus across built-in apps, allowing users to send selected content into a new Siri conversation. For example, they could request more information about highlighted text or pull up related e-mails. The toggle is similar to what exists in the ChatGPT iPhone app today. 

A “Write with Siri” option at the top of the keyboard is also in testing. It will surface the Writing Tools menu for generating and editing text. That existing feature, core to the marketing of Apple Intelligence over the past two years, can be difficult to find in the current version of iOS.

The new Siri builds on an overhaul first unveiled at WWDC in 2024 but never released. Apple initially announced that it would debut in spring 2025. The features were then delayed until this month and again until later in 2026.

Many involved in the effort believe the majority of the already-announced changes – including access to personal data and on-screen awareness for answering questions – won’t be ready until fall 2026.

The latest internal versions of iOS 27 being tested by employees include the features.

Another already-announced upgrade, an expansion of the App Intents software, remains in the works. The framework lets Siri more precisely control functions within both Apple applications and third-party apps.

Over time, the company plans to extend this system further, partly by enabling users to navigate app interfaces – such as scrolling through menus – via voice.

Many of the new features are powered by updated versions of the company’s in-house models, known as Apple Foundation Models, developed alongside technology from Google Gemini.

The two partners struck a roughly US$1 billion (S$1.28 billion) arrangement in 2025, Bloomberg News reported. They confirmed the tie-up in January. BLOOMBERG
