Poolside Releases Laguna XS.2: First Open-Weight Coding Model That Runs on a Single GPU
Poolside's First Open-Weight Release
On April 28, 2026, Poolside AI released Laguna XS.2 and Laguna M.1 simultaneously. XS.2 is the company's first model with open weights, released under Apache 2.0.
Specifications
- Architecture: 33B total / 3B active MoE
- Training: 30T tokens, fully in-house infrastructure
- SWE-bench Verified: 68.2%
- SWE-bench Pro: 44.5%
- SWE-bench Multilingual: 62.4%
- Terminal-Bench 2.0: 30.1%
- Hardware: Single GPU with 36 GB of VRAM (also runs on Apple M-series Macs via Ollama)
- License: Apache 2.0
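The single-GPU claim is plausible on a back-of-envelope basis: weight memory scales with the 33B total parameters, while per-token compute scales with the 3B active ones. The sketch below (the quantization levels are our assumption, not stated in the announcement) shows why the model fits in 36 GB only once quantized.

```python
# Rough weight-memory estimate for a 33B-total-parameter MoE.
# Assumption: memory is dominated by the weights; KV cache and
# activations add overhead on top of these figures.

def weight_memory_gb(total_params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in GB for a given precision."""
    return total_params * bytes_per_param / 1e9

TOTAL = 33e9  # 33B total parameters

print(f"fp16 : {weight_memory_gb(TOTAL, 2.0):.1f} GB")  # ~66 GB, exceeds 36 GB
print(f"8-bit: {weight_memory_gb(TOTAL, 1.0):.1f} GB")  # ~33 GB, borderline
print(f"4-bit: {weight_memory_gb(TOTAL, 0.5):.1f} GB")  # ~16.5 GB, fits comfortably
```

In other words, a 4-bit quantized build leaves ample headroom within 36 GB, which is consistent with the model running on consumer Apple M-series hardware via Ollama.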
Architecture
XS.2 uses sigmoid gating with per-layer rotary scales and a mixed attention layout: sliding-window and global attention layers interleaved in a 3:1 ratio across 40 layers. The model was trained entirely on Poolside's in-house stack.
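The 3:1 layout above can be sketched as a simple layer-assignment rule. Note the function name and the exact interleaving pattern (every fourth layer global) are assumptions for illustration; the release only states the ratio across the 40 layers.

```python
# Hypothetical sketch of the mixed attention layout: three
# sliding-window layers followed by one global layer, repeated
# across the depth of the model.

def attention_layout(num_layers: int = 40, period: int = 4) -> list[str]:
    """Assign each layer an attention type in a 3:1 SWA:global ratio."""
    return [
        "global" if (i + 1) % period == 0 else "sliding_window"
        for i in range(num_layers)
    ]

layout = attention_layout()
# 30 sliding-window layers, 10 global layers out of 40 total
assert layout.count("sliding_window") == 30
assert layout.count("global") == 10
```

Mixing cheap sliding-window layers with occasional global layers is a common way to keep attention cost near-linear in sequence length while still letting information propagate across the full context.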
How to Access
Weights are available on Hugging Face. Hosted inference is offered via Poolside's API and OpenRouter, and local deployment works via Ollama on Mac hardware.
Source: Poolside AI, VentureBeat
Related Articles
Released April 29, 2026 under Modified MIT license, Mistral Medium 3.5 consolidates the company's chat, reasoning, and coding models into one 128B dense open-weight model with 256K context, scoring 77.6% on SWE-bench Verified.
Anthropic released a suite of Claude connectors enabling direct integration with Adobe Creative Cloud, Blender, Autodesk, Ableton, and five other creative tools. The Blender connector is open-source and MCP-based, meaning any LLM can plug in.
Alibaba launched Qwen 3.5 on February 16 under Apache 2.0, featuring 397B parameters with a sparse MoE architecture (17B active), 256K context, and native multimodal capabilities matching leading US proprietary models on key benchmarks.