LLM X/Twitter Mar 22, 2026 2 min read

Vercel said on March 19, 2026 that it built Chat SDK to remove the platform-specific plumbing that slowed internal agent rollouts. Vercel’s blog describes an open-source TypeScript library, now in public beta, that lets a single bot implementation target Slack, Teams, Google Chat, Discord, Telegram, GitHub, Linear, and now WhatsApp through per-platform adapters.
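The adapter approach can be sketched as follows. This is an illustrative example of the pattern described in the post, not the actual Chat SDK API; the interface and function names are assumptions.

```typescript
// Hypothetical sketch of the adapter pattern: one platform-agnostic
// bot handler, with a thin adapter per chat platform.
interface ChatAdapter {
  platform: string;
  send(channel: string, text: string): Promise<void>;
}

// The bot logic is written once against the adapter interface...
async function greet(adapter: ChatAdapter, channel: string, user: string): Promise<void> {
  await adapter.send(channel, `Hi ${user}, how can I help?`);
}

// ...and each platform supplies its own transport. This stub just
// records messages; a real Slack adapter would call the Slack Web API.
const sent: string[] = [];
const slackAdapter: ChatAdapter = {
  platform: "slack",
  async send(channel, text) {
    sent.push(`[${this.platform}] ${channel}: ${text}`);
  },
};
```

Swapping in a Teams or Discord adapter would leave `greet` untouched, which is the portability claim the post is making.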

Together AI said on March 19, 2026 that its fine-tuning service now supports tool calling, reasoning, and vision-language model training, with up to 6x higher throughput on MoE architectures. The company says the update also targets very large models, supports datasets up to 100GB, and adds pre-run cost estimates plus live ETAs during training.

Cloudflare said on March 20, 2026 that Kimi K2.5 was available on Workers AI, letting developers build end-to-end agents on Cloudflare’s platform. The launch post says the model brings a 256k context window, multi-turn tool calling, vision inputs, and structured outputs, and that an internal security-review agent processing 7B tokens per day cut costs by 77% after switching.
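A Workers AI call from a Worker goes through the `env.AI.run` binding; the sketch below shows the general shape. The model id string and response field are assumptions based on the launch post, and the `AiBinding` type is a simplified stand-in for the real Workers runtime binding.

```typescript
// Simplified stand-in for the Workers AI binding type.
type AiBinding = {
  run(model: string, input: unknown): Promise<{ response: string }>;
};

// Hedged sketch of a security-review call like the one the post
// describes; "@cf/moonshotai/kimi-k2.5" is an assumed model id.
async function reviewDiff(env: { AI: AiBinding }, diff: string): Promise<string> {
  const result = await env.AI.run("@cf/moonshotai/kimi-k2.5", {
    messages: [
      { role: "system", content: "You are a security review agent." },
      { role: "user", content: diff },
    ],
  });
  return result.response;
}
```

In a real Worker, `env.AI` is configured via an `[ai]` binding in the project config rather than constructed in code.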

OpenAI Developers said on March 21, 2026 that container startup for skills, hosted shell, and code interpreter was about 10x faster thanks to a new container pool in the Responses API. Updated OpenAI shell docs show that hosted shell can create containers automatically, reuse active containers by reference, and keep them alive for up to 20 minutes of inactivity.
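The reuse behavior described (create on demand, address by reference, expire after an idle window) can be sketched as a small pool. This is an illustrative model of the lifecycle, not OpenAI's implementation; all names are hypothetical.

```typescript
// 20 minutes of inactivity, per the updated shell docs.
const IDLE_TTL_MS = 20 * 60 * 1000;

interface PooledContainer {
  id: string;
  lastUsed: number;
}

class ContainerPool {
  private containers = new Map<string, PooledContainer>();
  private nextId = 0;

  // Reuse the container with this id if it is still alive; otherwise
  // create a fresh one (mirroring "create automatically" plus
  // "reuse active containers by reference").
  acquire(id?: string, now: number = Date.now()): PooledContainer {
    this.evictIdle(now);
    if (id !== undefined) {
      const existing = this.containers.get(id);
      if (existing) {
        existing.lastUsed = now;
        return existing;
      }
    }
    const fresh = { id: `ctr_${this.nextId++}`, lastUsed: now };
    this.containers.set(fresh.id, fresh);
    return fresh;
  }

  // Drop containers that have sat idle past the TTL.
  private evictIdle(now: number): void {
    for (const [id, c] of this.containers) {
      if (now - c.lastUsed > IDLE_TTL_MS) this.containers.delete(id);
    }
  }
}
```

Requests arriving within the idle window hit a warm container and skip the startup cost, which is where the claimed ~10x speedup would come from.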

LLM Mar 22, 2026 3 min read

On February 27, 2026, OpenAI and Amazon announced a multi-year deal covering a Stateful Runtime Environment on Amazon Bedrock, AWS-exclusive third-party distribution for OpenAI Frontier, 2 gigawatts of Trainium capacity, and a $50 billion Amazon investment. The announcement matters because it combines enterprise agent infrastructure, cloud distribution, and custom silicon in one agreement.