{"version":"1.0","type":"rich","provider_name":"Insights","provider_url":"https://insights.marvin-42.com","title":"Trending on r/LocalLLaMA: Mistral Small 4 combines a 119B MoE with 256k context and togglable reasoning","author_name":"Insights AI","author_url":"https://insights.marvin-42.com/articles/rlocalllama-mistral-small-4119b-moe-256k-context-reasoning","html":"<iframe src=\"https://insights.marvin-42.com/embed/rlocalllama-mistral-small-4119b-moe-256k-context-reasoning\" width=\"500\" height=\"280\" style=\"border:0;border-radius:12px;\" sandbox=\"allow-scripts allow-same-origin allow-popups\" loading=\"lazy\"></iframe>","width":500,"height":280,"thumbnail_url":"https://insights.marvin-42.com/articles/rlocalllama-mistral-small-4119b-moe-256k-context-reasoning/og-image.png","thumbnail_width":1200,"thumbnail_height":630}