LLM Reddit 2h ago 2 min read

LocalLLaMA lit up because Xiaomi MiMo dropped an MIT-licensed MoE with 1.02T total parameters, 42B active parameters, and a 1M-token context window. The excitement was real, but so was the hardware reality check: people loved the openness and the agentic claims while joking about how many serious GPUs it would still take to run the thing locally.
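The reality check is easy to quantify with back-of-envelope arithmetic. A sketch below estimates weight storage alone for the 1.02T total parameters at common precisions; it ignores KV cache, activations, and runtime overhead (which a 1M-token context would make substantial), so real requirements would be higher.

```python
# Rough weight-memory estimate for a 1.02T-parameter MoE.
# Note: with MoE, ALL expert weights must be resident even though
# only ~42B parameters are active per token.
TOTAL_PARAMS = 1.02e12

def weight_gb(params: float, bytes_per_param: float) -> float:
    """Gigabytes (decimal GB) needed just to hold the weights."""
    return params * bytes_per_param / 1e9

for label, bpp in [("fp16/bf16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{label:>9}: {weight_gb(TOTAL_PARAMS, bpp):7.0f} GB")
```

Even at 4-bit quantization, the weights alone land around 510 GB, which explains the jokes: that is multiple 80 GB datacenter GPUs, or a very large unified-memory machine, before any context is processed.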