Google's TimesFM: Zero-Shot Time Series Forecasting Without Training Data
Snaplyze Digest
R&D intermediate 3 min read Apr 3, 2026 Updated Apr 5, 2026

“Google's 200M parameter model forecasts your time series zero-shot — no training required.”

In Short

TimesFM is a 200M parameter decoder-only foundation model from Google Research that forecasts time series data without any training on your specific dataset. Trained on 100 billion real-world time points from Google Trends, Wikipedia pageviews, and synthetic data, it achieves state-of-the-art zero-shot performance on unseen datasets, matching supervised models that require custom training. Released as TimesFM 2.5 in September 2025, it now supports up to 16k context length and continuous quantile forecasts up to a 1k horizon. Available as open source on GitHub (13k+ stars, Apache-2.0 license).

time-series, forecasting, foundation-model, google, zero-shot
Why It Matters
The practical pain point this digest is really about.

You need to forecast demand, traffic, or metrics for your SaaS product. Traditional approaches require either (a) statistical methods like ARIMA that need manual tuning per time series, or (b) deep learning models like DeepAR that require training on your specific data for hours or days. When you have hundreds or thousands of time series — each product SKU, each geographic region, each customer segment — the overhead becomes prohibitive. TimesFM eliminates the per-dataset training cost by providing a foundation model that works zero-shot across domains.

How It Works
The mechanism, architecture, or workflow behind it.

TimesFM treats time series forecasting like language modeling. It divides your input series into patches of 32 consecutive time points, processes them through a decoder-only transformer (similar to GPT's architecture), and outputs patches of 128 future points. The key innovation: the model learned general temporal patterns from 100B+ diverse time points during pretraining, so when you feed it your web traffic or sales data, it recognizes patterns from similar series it saw during training. Unlike encoder-decoder models, the decoder-only approach generates forecasts autoregressively — each prediction conditions on previous predictions, enabling flexible horizon lengths. The 2.5 release adds a 30M parameter quantile head for probabilistic forecasts and XReg support for covariates (external variables like holidays or promotions).
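The patched, autoregressive generation described above can be sketched with a toy stand-in for the transformer. This is purely illustrative: `stub_predict_patch` is a hypothetical placeholder, not the real 200M-parameter model, and the patch sizes are just the figures quoted in this digest.

```python
import numpy as np

INPUT_PATCH = 32    # input patch length described above
OUTPUT_PATCH = 128  # output patch length described above

def stub_predict_patch(context: np.ndarray) -> np.ndarray:
    """Stand-in for the transformer: naively repeats the last observed
    value for OUTPUT_PATCH steps. TimesFM would instead attend over the
    embedded input patches."""
    return np.full(OUTPUT_PATCH, context[-1])

def forecast(series: np.ndarray, horizon: int) -> np.ndarray:
    """Autoregressive generation: each predicted patch is appended to the
    context and conditions the next patch, which is what makes arbitrary
    horizon lengths possible."""
    context = series.astype(float)
    patches = []
    while sum(len(p) for p in patches) < horizon:
        # Trim context to whole input patches, as a patch-based model would.
        usable = len(context) - (len(context) % INPUT_PATCH)
        patch = stub_predict_patch(context[-usable:] if usable else context)
        patches.append(patch)
        context = np.concatenate([context, patch])
    # Horizon need not be a multiple of 128; trim the last patch.
    return np.concatenate(patches)[:horizon]

history = np.sin(np.linspace(0, 12, 256))
pred = forecast(history, horizon=200)
```

The trimming step at the end mirrors why output patches of 128 are efficient: one forward pass yields many future points, and any leftover beyond the requested horizon is simply discarded.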

Key Takeaways
7 fast bullets that make the core value obvious.
  • Zero-shot forecasting — why YOU care: No training required. Load the model, pass your time series, get predictions. Works on data the model never saw during pretraining.
  • 200M parameters with 16k context — why YOU care: Lightweight enough to run on a single GPU, yet handles long historical context. TimesFM 2.5 supports up to 16,384 time points as input.
  • Probabilistic outputs via quantile head — why YOU care: Not just point forecasts — get prediction intervals. The optional 30M quantile head outputs the 10th, 50th, and 90th percentiles.
  • Patch-based architecture — why YOU care: Input patches of 32, output patches of 128 mean efficient generation. The model doesn't predict one step at a time — it predicts in chunks.
  • Apache-2.0 license — why YOU care: Free for commercial use. No API costs, no rate limits. Run it on your infrastructure, keep your data private.
  • BigQuery integration — why YOU care: Native SQL interface for Google Cloud users. `ML.FORECAST` with TimesFM models directly in your data warehouse.
  • Covariate support (XReg) — why YOU care: Added in October 2025. Incorporate external variables like holidays, promotions, or weather that affect your forecasts.
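To make the quantile-head takeaway concrete, here is a minimal sketch of turning 10th/50th/90th-percentile outputs into prediction intervals. The arrays and the 22.5 review threshold are made-up illustration values, not real model output.

```python
import numpy as np

# Hypothetical quantile forecasts for a 5-step horizon, shaped like the
# 10th/50th/90th-percentile outputs a quantile head might return.
q10 = np.array([ 90.0,  92.0,  91.0,  95.0,  97.0])
q50 = np.array([100.0, 103.0, 101.0, 106.0, 108.0])
q90 = np.array([112.0, 115.0, 113.0, 118.0, 121.0])

# The 50th percentile serves as the point forecast; the band between the
# 10th and 90th percentiles is an 80% central prediction interval.
point_forecast = q50
interval_width = q90 - q10

# Flag horizon steps where the model is unusually uncertain, e.g. for
# manual review before committing inventory to the forecast.
uncertain_steps = np.where(interval_width > 22.5)[0]
```

This is the practical payoff of probabilistic outputs: instead of acting on a single number, you can size safety stock or alert thresholds from the interval width at each step.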
Should You Care?
Audience fit, decision signal, and the original source in one place.

Who It Is For

Data scientists, ML engineers, and developers who need time series forecasting at scale. If you're forecasting demand for an e-commerce platform with thousands of SKUs, predicting server load across multiple services, or analyzing web traffic across hundreds of pages, and you don't want to train custom models for each, this is for you. Also relevant for PMs and business analysts who use BigQuery.

Worth Exploring?

Yes, if you have time series forecasting needs. The HN thread (317 points, 118 comments) debates whether foundation models make sense for time series, but the benchmark results are real: TimesFM outperforms ARIMA by 15-25% on standard datasets and matches supervised models zero-shot. The open source release with Apache-2.0 license makes it trivial to test on your data. The one thing you'd regret missing: Google put this in BigQuery as an official product — that's a signal about production readiness that academic papers don't provide.
