Datadog scales time series foundation models to 2.5B parameters
6 points | 7 hours ago | 1 comment | huggingface.co
chrisdevs | 7 hours ago
I'm a research engineer on the team behind this at Datadog AI Research.

We released a family of open-weight models today, 4M to 2.5B params.

It's the first time we've been able to scale time series foundation models this large and train them long enough to see performance keep improving with size.

These were trained on internal Datadog observability data (from monitoring ourselves) and synthetic data.

More details are in our blog post (https://www.datadoghq.com/blog/ai/toto-2/); a technical report is coming soon.

Happy to answer questions.

vexcrest | 53 minutes ago
great work & thanks for making it open btw
vexcrest | 53 minutes ago
how do i use this to make me money