Perplexity adds Voice Mode to Perplexity Computer for spoken agent steering
What Perplexity announced
On March 4, 2026, Perplexity said Voice Mode was coming to Perplexity Computer, summarizing the update in a simple message: you can now talk and do things. That sounds small, but it changes how users steer an agent system built for longer, multi-step tasks. Instead of stopping to type corrections, a user can keep the workflow moving by speaking adjustments in real time.
What the changelog adds
Perplexity’s March 6 changelog says Voice Mode brings natural conversation directly into Computer on the web and is powered by the same voice stack used in Comet. The company frames the feature around spoken project control: describe a task, interrupt with feedback mid-run, or redirect the system on the fly without switching back to a keyboard. The examples are telling. Perplexity suggests asking Computer to build a landing page, run a financial analysis, or revise a chart while the project is already in progress.
That matters because Perplexity Computer is not pitched as a voice assistant in the narrow sense. It is positioned as a work system that researches, codes, analyzes, and produces deliverables across longer sessions. Voice Mode therefore acts less like a novelty interface and more like an additional control surface for an agent that is already operating over web tools and connected services.
Why this matters
The broader implication is that spoken interaction is moving from simple question answering into live workflow management. If users can redirect a running agent by voice, the handoff between planning and execution becomes more conversational and less interruptive. For products competing on agent usability, that could matter as much as the underlying model choice, because the real bottleneck is often not capability but how quickly a human can intervene, clarify, and iterate.
Sources: Perplexity X post, Perplexity Changelog