GoodSeed pitches a lighter ML experiment tracker with local SQLite and Neptune migration
Original: [P] We made GoodSeed, a pleasant ML experiment tracker
Reddit thread: MachineLearning discussion
Website: goodseed.ai
GitHub: Kripner/goodseed
Migration guide: Neptune to GoodSeed
GoodSeed is being pitched to the MachineLearning community as an experiment tracker that stays narrow on purpose. Instead of trying to become a full platform for orchestration, labeling, feature stores and deployment, the project focuses on the day-to-day job many researchers actually care about: record runs, inspect metrics, compare configs, and recover enough system context to understand why one training job behaved differently from another. That positioning is explicit both in the Reddit announcement and in the product site’s tagline, which describes GoodSeed as an experiment tracker rather than an all-purpose ML Ops suite.
The core technical choice is simple local storage. The README says GoodSeed writes metrics and configs into local SQLite files, then serves them through a built-in HTTP server for browser inspection. The Python API looks lightweight: users instantiate goodseed.Run, log scalar configs or metric series, and close the run when training ends. The project supports three storage modes: cloud keeps the local SQLite data and syncs it in the background to a remote API, local stays fully offline, and disabled turns all writes into no-ops. That makes the tool usable both for laptop experiments and for teams that still want hosted access.
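The local-first design is easy to picture. The sketch below is not GoodSeed's actual schema or API (the table layout and function names here are invented for illustration); it only shows the general pattern the README describes: append metric points to a local SQLite file that a small built-in server can later read back for the browser UI.

```python
import sqlite3

# Illustrative only: an invented schema demonstrating the
# "metrics in a local SQLite file" pattern, not GoodSeed's real on-disk format.
def open_store(path):
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS metrics ("
        " run_id TEXT, name TEXT, step INTEGER, value REAL)"
    )
    return conn

def log_metric(conn, run_id, name, step, value):
    # Each logged point is one row; a background sync (the "cloud" mode)
    # could ship these rows to a remote API later.
    conn.execute(
        "INSERT INTO metrics VALUES (?, ?, ?, ?)",
        (run_id, name, step, value),
    )
    conn.commit()

def metric_series(conn, run_id, name):
    # What a chart in the browser UI would query: one ordered series per run.
    rows = conn.execute(
        "SELECT step, value FROM metrics"
        " WHERE run_id = ? AND name = ? ORDER BY step",
        (run_id, name),
    )
    return list(rows)

conn = open_store(":memory:")  # a real tracker would use a file on disk
for step, loss in enumerate([0.9, 0.5, 0.3]):
    log_metric(conn, "run-1", "train/loss", step, loss)
print(metric_series(conn, "run-1", "train/loss"))
```

Because everything is a local file, the "disabled" mode is trivial to support: the same logging calls simply become no-ops instead of inserts.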
Monitoring without a giant stack
GoodSeed’s monitoring scope is broader than a plain loss chart. According to the README and the Reddit post, it can capture stdout, stderr, unhandled tracebacks, CPU and memory usage, NVIDIA and AMD GPU metrics, and Git metadata from the active repository. The Git tracking is especially practical: it records dirty state, diffs, commit information, branch data and remotes, which helps tie a training result back to a concrete source tree rather than a vague memory of “whatever was on the machine that day.”
The migration angle also matters. The Reddit announcement says the authors are already using GoodSeed as a replacement for Neptune, and the project links directly to Neptune’s own transition material plus a migration flow through neptune-exporter. GoodSeed also exposes a Neptune proxy view, which reduces the friction of testing the UI against existing runs before a full switch.
The project is still early. The README labels it beta, and the Reddit post notes that the remote server currently supports only a subset of data types and has limited capacity. Even so, the attention on MachineLearning suggests there is real demand for tools that do less, store data locally first, and treat experiment tracking as a focused developer workflow instead of a sprawling platform commitment.