Google Announces 200M Parameter AI Forecasting Model TimesFM
Briefly

TimesFM is trained on nearly 100B time points and achieves zero-shot forecasting performance comparable to, or better than, that of supervised-learning models.
TimesFM uses a decoder-only transformer architecture similar to that of large language models (LLMs) such as ChatGPT. In this scheme, short patches of time-series data are treated as tokens for both the model's input and its output.
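The patch-as-token idea can be sketched as follows. This is an illustrative example, not TimesFM's implementation: the helper name `patch_series` and the patch length of 4 are assumptions chosen for readability.

```python
import numpy as np

def patch_series(series, patch_len):
    """Split a 1-D time series into consecutive fixed-length patches.

    Each patch plays the role of a "token": the model consumes a sequence
    of input patches and predicts output patches, instead of operating on
    individual time steps. Trailing points that do not fill a complete
    patch are dropped. (Illustrative sketch; patch lengths and handling of
    leftovers in TimesFM itself may differ.)
    """
    series = np.asarray(series, dtype=float)
    n_patches = len(series) // patch_len
    return series[: n_patches * patch_len].reshape(n_patches, patch_len)

# A 10-point series with patch length 4 yields 2 patch "tokens" of 4 points;
# the last 2 points are discarded as an incomplete patch.
series = np.arange(10)
patches = patch_series(series, patch_len=4)
print(patches.shape)  # (2, 4)
```

Treating patches rather than single points as tokens shortens the sequence the transformer must attend over, which is one reason this framing scales to long forecasting horizons.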
Read at InfoQ