PyTorch Foundation adds Safetensors and Helion to its hosted project stack
Original: Safetensors and Helion have joined PyTorch Foundation as foundation-hosted projects to secure model distribution for trusted agentic solutions and simplify kernel development across the open source AI ecosystem. PyTorch Foundation CTO Matt White to Noah Bovenizer at The Stack: “Having portable formats that work across different frameworks is extremely important to be able to ship and move models around. And then Helion makes things more accessible for folks that want to do custom kernel development.” Safetensors and Helion join PyTorch, @vllm_project, @DeepSpeedAI, and @raydistributed as foundation hosted projects. Read Noah Bovenizer’s coverage at The Stack here: https://www.thestack.technology/hugging-faces-safetensors-metas-helion-join-pytorch-foundation/ #PyTorch #OpenSource #AI #Safetensors #Helion
On April 9, 2026, PyTorch said in an X post that Safetensors and Helion have joined the PyTorch Foundation as foundation-hosted projects. PyTorch framed the move as a way to secure model distribution for trusted agentic solutions and to simplify kernel development across the open-source AI ecosystem. The foundation added that the two projects now sit alongside PyTorch, vLLM, DeepSpeed, and Ray under its hosted-project structure.
The two additions cover different layers of the stack. Safetensors is Hugging Face’s format for serializing tensors with a safer loading model than pickle-based approaches, which has made it relevant wherever teams want to move model weights without taking on arbitrary code-execution risk. Helion is Meta’s domain-specific language for machine-learning kernels, aimed at making custom kernel work more accessible. In The Stack’s coverage, quoted by PyTorch, Foundation CTO Matt White said portable formats that work across frameworks are critical for shipping and moving models around, and that Helion lowers the barrier to custom kernel development.
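The pickle-vs-Safetensors distinction above can be made concrete. The sketch below first shows why loading a pickle-based checkpoint is risky (deserialization can execute arbitrary code), then writes and reads a minimal file in the safetensors layout, which is publicly documented as an 8-byte little-endian header length, a JSON header of tensor metadata, and raw tensor bytes. This is an illustrative stdlib-only sketch of the layout, not the official `safetensors` implementation; the file and tensor names are made up for the demo.

```python
import json
import pickle
import struct

# 1) Why pickle-based checkpoints are risky: any object may define
#    __reduce__, and pickle.loads will invoke the returned callable,
#    so merely loading a blob can run attacker-chosen code.
class Payload:
    def __reduce__(self):
        return (eval, ("6*7",))  # pickle runs eval("6*7") at load time

executed = pickle.loads(pickle.dumps(Payload()))
print(executed)  # → 42, produced just by deserializing

# 2) A safetensors-style file, by contrast, is pure data; loading it
#    is parsing, never code execution. (Sketch of the documented
#    layout: u64 header length, JSON header, raw bytes.)
def write_safetensors_like(path, name, values):
    data = struct.pack(f"<{len(values)}f", *values)
    header = json.dumps({
        name: {"dtype": "F32", "shape": [len(values)],
               "data_offsets": [0, len(data)]}
    }).encode("utf-8")
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(header)))  # header length prefix
        f.write(header)                          # JSON metadata
        f.write(data)                            # raw tensor bytes

def read_safetensors_like(path, name):
    with open(path, "rb") as f:
        (hlen,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(hlen))
        start, end = header[name]["data_offsets"]
        f.seek(8 + hlen + start)
        raw = f.read(end - start)
    n = header[name]["shape"][0]
    return list(struct.unpack(f"<{n}f", raw))

write_safetensors_like("demo.safetensors", "weight", [1.0, 2.0, 3.0])
print(read_safetensors_like("demo.safetensors", "weight"))  # → [1.0, 2.0, 3.0]
```

The design difference is the whole point: the reader of a safetensors file only ever parses lengths, JSON, and byte ranges, so a malicious checkpoint can at worst be malformed, not executable.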
Governance is the real headline
The broader significance is governance, not just project count. By moving core pieces of model transport and kernel tooling into a neutral foundation structure, the PyTorch ecosystem is trying to make important infrastructure look less like vendor-specific add-ons and more like shared public rails. For teams building agent systems, that could matter in two ways: safer defaults for model exchange and a wider contributor base around performance-critical kernels. The announcement is a reminder that open-source AI competition increasingly plays out in formats, runtimes, and governance models as much as in the models themselves.
Related Articles
Netflix’s VOID reached Reddit as an open research release aimed at removing objects from video and repairing the interactions those objects caused in the scene. The notable details are the CogVideoX base, a two-pass pipeline, Gemini+SAM2 mask generation, and a 40GB+ VRAM requirement.
Astral’s April 8, 2026 post became an HN talking point because it turned supply-chain security into concrete CI/CD practice. The key pieces were banning risky GitHub Actions triggers, hash-pinning actions, shrinking permissions, isolating secrets, and using GitHub Apps or Trusted Publishing where Actions defaults fall short.
Anthropic said on March 17, 2026 that open source security is becoming more important as AI grows more capable. In its X post, the company said it is donating to the Linux Foundation to help secure the software foundations AI depends on.