“Google's 200M parameter model forecasts your time series zero-shot — no training required.”
TimesFM is a 200M parameter decoder-only foundation model from Google Research that forecasts time series data without any training on your specific dataset. Trained on 100 billion real-world time points from Google Trends, Wikipedia pageviews, and synthetic data, it achieves state-of-the-art zero-shot performance on unseen datasets, matching supervised models that require custom training. Released as TimesFM 2.5 in September 2025, it now supports up to 16k context length and continuous quantile forecasts up to a 1k horizon. It is available as open source on GitHub (13k+ stars, Apache-2.0 license) and in BigQuery as an official Google product.
You need to forecast demand, traffic, or metrics for your SaaS product. Traditional approaches require either (a) statistical methods like ARIMA that need manual tuning per time series, or (b) deep learning models like DeepAR that require training on your specific data for hours or days. When you have hundreds or thousands of time series — each product SKU, each geographic region, each customer segment — the overhead becomes prohibitive. TimesFM eliminates the per-dataset training cost by providing a foundation model that works zero-shot across domains.
TimesFM treats time series forecasting like language modeling. It divides your input series into patches of 32 consecutive time points, processes them through a decoder-only transformer (similar to GPT's architecture), and outputs patches of 128 future points. The key innovation: the model learned general temporal patterns from 100B+ diverse time points during pretraining, so when you feed it your web traffic or sales data, it recognizes patterns from similar series it saw during training. Unlike encoder-decoder models, the decoder-only approach generates forecasts autoregressively — each prediction conditions on previous predictions, enabling flexible horizon lengths. The 2.5 release adds a 30M parameter quantile head for probabilistic forecasts and XReg support for covariates (external variables like holidays or promotions).
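The patch-and-decode loop above can be sketched in plain Python. This is a toy illustration, not the real model: the 32-point input patches and 128-point output patches match the numbers in the paragraph, but the transformer is replaced by a stand-in `decoder_stub` that just repeats the last observed value. What it shows is the mechanics that make arbitrary horizons possible, since each generated patch is appended to the context before the next one is produced.

```python
# Toy illustration of TimesFM-style patched autoregressive decoding.
# INPUT_PATCH and OUTPUT_PATCH match the sizes described for TimesFM;
# decoder_stub is a placeholder for the 200M-parameter transformer.

INPUT_PATCH = 32
OUTPUT_PATCH = 128

def to_patches(series):
    """Split the context into non-overlapping 32-point input patches."""
    n = len(series) // INPUT_PATCH * INPUT_PATCH  # drop any ragged tail
    return [series[i:i + INPUT_PATCH] for i in range(0, n, INPUT_PATCH)]

def decoder_stub(patches):
    """Stand-in for the decoder-only transformer: emit one output patch.
    (Here: naively repeat the last observed value 128 times.)"""
    last = patches[-1][-1]
    return [last] * OUTPUT_PATCH

def forecast(series, horizon):
    """Autoregressive loop: each 128-point output patch is appended to
    the context, so later patches condition on earlier predictions."""
    context = list(series)
    out = []
    while len(out) < horizon:
        patch = decoder_stub(to_patches(context))
        out.extend(patch)
        context.extend(patch)   # predictions feed back into the context
    return out[:horizon]        # trim to the exact requested horizon

history = [float(i % 24) for i in range(256)]  # 256 points of context
print(len(forecast(history, horizon=300)))     # 3 patches generated, trimmed to 300
```

Note how a 300-step horizon needs three decoding passes (128 + 128 + 128, trimmed), which is exactly why the decoder-only design supports flexible horizon lengths without retraining. The real model's usage differs; see the TimesFM GitHub README for the actual API.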
Data scientists, ML engineers, and developers who need time series forecasting at scale. If you're forecasting demand for an e-commerce platform with thousands of SKUs, predicting server load across multiple services, or analyzing web traffic across hundreds of pages, and you don't want to train custom models for each, this is for you. Also relevant for PMs and business analysts who use BigQuery.
Yes, if you have time series forecasting needs. The HN thread (317 points, 118 comments) debates whether foundation models make sense for time series, but the benchmark results are real: TimesFM outperforms ARIMA by 15-25% on standard datasets and matches supervised models zero-shot. The open source release with Apache-2.0 license makes it trivial to test on your data. The one thing you'd regret missing: Google put this in BigQuery as an official product — that's a signal about production readiness that academic papers don't provide.
This page gives you the hook. The full Snaplyze digest goes deeper: a fuller breakdown, compared viewpoints, Easy Mode, Pro Mode, and practical next-step playbooks, so you can move from curiosity to decision with less noise.
Install Snaplyze