Perplexity adds Voice Mode to Perplexity Computer for spoken agent steering
What Perplexity announced
On March 4, 2026, Perplexity said Voice Mode was coming to Perplexity Computer, summarizing the update with a simple message: you can now talk and do things. That sounds small, but it changes how users steer an agent system designed for longer, multi-step tasks. Instead of stopping to type corrections, a user can keep the workflow moving by speaking adjustments in real time.
What the changelog adds
Perplexity’s March 6 changelog says Voice Mode brings natural conversation directly into Computer on the web and is powered by the same voice stack used in Comet. The company frames the feature around spoken project control: describe a task, interrupt with feedback mid-run, or redirect the system on the fly without switching back to a keyboard. The examples are telling. Perplexity suggests asking Computer to build a landing page, run a financial analysis, or revise a chart while the project is already in progress.
That matters because Perplexity Computer is not pitched as a voice assistant in the narrow sense. It is positioned as a work system that researches, codes, analyzes, and produces deliverables across longer sessions. Voice Mode therefore acts less like a novelty interface and more like an additional control surface for an agent that is already operating over web tools and connected services.
Why this matters
The broader implication is that spoken interaction is moving from simple question answering into live workflow management. If users can redirect a running agent by voice, the handoff between planning and execution becomes more conversational and less interruptive. For products competing on agent usability, that could matter as much as the underlying model choice, because the real bottleneck is often not capability but how quickly a human can intervene, clarify, and iterate.
Sources: Perplexity X post, Perplexity Changelog