Meta ships SAM 3.1 with object multiplexing for 32 FPS video tracking on a single H100

Original: SAM 3.1: Faster and More Accessible Real-Time Video Detection and Tracking With Multiplexing and Global Reasoning

AI · Mar 28, 2026 · By Insights AI · 2 min read

Meta introduced SAM 3.1 on March 27, 2026 as an update to its Segment Anything family focused on real-time video detection and tracking. The company says the new model is a drop-in replacement for SAM 3 and improves throughput primarily through object multiplexing, which lets the system track up to 16 objects in a single forward pass. For videos with a medium number of objects, Meta says that shifts performance from 16 FPS to 32 FPS on a single H100 GPU.

That runtime change matters because multi-object video tracking usually scales poorly as scenes become more crowded. Meta says earlier behavior required a separate pass per object, creating redundant compute and memory overhead. By processing tracked objects together and using global reasoning, SAM 3.1 is meant to cut those bottlenecks while also improving robustness in cluttered scenes. The practical claim is not just a faster model, but more accessible real-time performance on smaller hardware budgets.
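The intuition behind that claim can be sketched with a toy latency model: if each frame costs a fixed backbone pass plus a small per-object head, running a separate full pass per object multiplies the backbone cost by the object count, while multiplexing amortizes it across all tracked objects. Everything below, including the millisecond costs, the backbone/head split, and the 16-object batch cap, is an illustrative assumption, not Meta's published performance profile:

```python
# Toy cost model (assumed numbers, for illustration only) contrasting
# per-object forward passes with multiplexed tracking, where up to
# `max_batch` objects share one backbone pass per frame.

def fps_per_object_passes(n_objects, backbone_ms=20.0, head_ms=5.0):
    """Old behavior: one full forward pass (backbone + head) per object."""
    frame_ms = n_objects * (backbone_ms + head_ms)
    return 1000.0 / frame_ms

def fps_multiplexed(n_objects, backbone_ms=20.0, head_ms=5.0, max_batch=16):
    """Multiplexed: backbone runs once per batch of up to max_batch objects;
    only the lightweight per-object heads scale with the object count."""
    batches = -(-n_objects // max_batch)  # ceiling division
    frame_ms = batches * backbone_ms + n_objects * head_ms
    return 1000.0 / frame_ms

if __name__ == "__main__":
    for n in (1, 4, 8, 16):
        print(f"{n:2d} objects: "
              f"per-object {fps_per_object_passes(n):5.1f} FPS, "
              f"multiplexed {fps_multiplexed(n):5.1f} FPS")
```

With these made-up costs the two designs match at one object and diverge as scenes get crowded (at 4 objects, 10 FPS versus 25 FPS), which mirrors the shape of Meta's claim that the gain shows up in videos with multiple tracked objects.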

What Meta is shipping

Meta is encouraging developers to treat SAM 3.1 as a straightforward upgrade path. The company says the release includes an updated checkpoint, changes to the SAM 3 codebase and research paper, and compatibility with the Segment Anything Playground. That makes the announcement relevant to developers building media tools, robotics and perception stacks, or enterprise video workflows where latency and GPU cost determine whether a feature can be deployed at all.

  • Meta says SAM 3.1 can track up to 16 objects in one forward pass.
  • Throughput for medium-object-count videos rises from 16 FPS to 32 FPS on a single H100, according to the company.
  • The update is designed as a drop-in replacement for SAM 3 rather than a separate product line.

The release also fits Meta’s broader pattern of turning internal research into creator and commerce tools. The company says SAM 3 is being used in products such as Facebook Marketplace’s View in Room and that related creation features are coming to Edits, Vibes, and Meta AI surfaces. SAM 3.1 therefore matters as both a research and runtime improvement and as a sign that open computer-vision models are being optimized for real production loops, not just academic demos.




© 2026 Insights. All rights reserved.