Meta and WRI open-source CHMv2 for high-resolution global canopy mapping
Original: We're announcing Canopy Height Maps v2 (CHMv2), an open source model for high-resolution global forest canopy mapping, developed in partnership with the @WorldResources. CHMv2 leverages our DINOv3 Sat-L vision model, specifically optimized for satellite imagery, to deliver substantial improvements in accuracy, detail, and global consistency. Learn more: https://go.meta.me/70d2e9
On March 12, 2026, AI at Meta announced on X that it is open-sourcing Canopy Height Maps v2 (CHMv2), a model for high-resolution global forest canopy mapping developed with the World Resources Institute (WRI). Meta said the release uses its DINOv3 Sat-L vision model, which it described as optimized for satellite imagery, and that the new version materially improves accuracy, detail, and global consistency.
The linked materials position CHMv2 as both a model release and a world-scale mapping output. Meta's accompanying thread said CHMv2 is already supporting public-sector efforts in the United States, Europe, and other regions. The card linked from the post framed canopy height as important for measuring forest carbon, monitoring restoration and degradation, and understanding habitat structure. That gives the project a clearer operational purpose than a typical benchmark announcement.
- Partners named by Meta: AI at Meta and the World Resources Institute.
- Core technical ingredient: DINOv3 Sat-L for satellite imagery.
- Use cases cited: carbon quantification, restoration tracking, reforestation, and land management.
The release is notable beyond Meta's own research portfolio. By open-sourcing both the model and world-scale maps, Meta is lowering the barrier for researchers, public agencies, and conservation groups that need geospatial AI tools but do not have the resources to build them from scratch. That can accelerate downstream work in climate monitoring, ecosystem analysis, and public decision-making, especially where field measurements are sparse or expensive.
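To make the downstream use concrete, here is a toy sketch of how a researcher might summarize one tile of a canopy height raster once the maps are downloaded. The array values, the 255 nodata sentinel, and the 5 m threshold are illustrative assumptions for this example, not details from Meta's release.

```python
import numpy as np

# Toy stand-in for one tile of a canopy height map: heights in meters,
# with a sentinel value (assumed here) marking pixels outside the valid footprint.
NODATA = 255
tile = np.array([
    [12,  8, NODATA],
    [25, 30,  4],
    [NODATA, 0, 18],
], dtype=np.uint8)

# Mask out nodata pixels before computing summary statistics.
valid = tile[tile != NODATA].astype(float)

mean_height = valid.mean()           # average canopy height (m) over valid pixels
tall_fraction = (valid >= 5).mean()  # share of valid pixels at least 5 m tall

print(f"mean height: {mean_height:.2f} m")
print(f"fraction >= 5 m: {tall_fraction:.2f}")
```

In practice the tile would be read from a georeferenced raster (e.g. with a GDAL-based library) rather than hard-coded, but the masking-then-aggregating pattern is the same.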
Primary sources are AI at Meta's March 12, 2026 X posts and the linked Meta AI materials, including Mapping the World's Forests with Greater Precision: Introducing Canopy Height Maps v2. Meta's wording is careful: the company says CHMv2 can inform carbon offsetting, reforestation, and land management decisions, which is best read as decision support rather than as a substitute for on-the-ground measurement programs.
Related Articles
Meta said on March 26, 2026 that TRIBE v2 is a foundation model for predicting human brain responses to sight, sound, and language. The supporting paper and demo highlight zero-shot generalization, prediction across 70,000 voxels, and public releases of the paper, code, and model weights.
r/MachineLearning found the 1,200-paper list useful, but the thread immediately separated “has a link” from “can reproduce the result.” Comments pointed to missing papers, 404s, and the gap between public code and runnable research.