The Principles of Diffusion Models
148 points | 10 hours ago | 6 comments | arxiv.org

smokel | 6 hours ago
If you're more into videos, be sure to check out Stefano Ermon's CS236 Deep Generative Models [1]. All lectures are available on YouTube [2].

[1] https://deepgenerativemodels.github.io/

[2] https://m.youtube.com/playlist?list=PLoROMvodv4rPOWA-omMM6ST...

storus | 1 hour ago
I wish Stanford still offered CS236, but they haven't run it in two years now :(

JustFinishedBSG | 9 hours ago
CTRL-F: "Fokker-Planck"

> 97 matches

Ok I'll read it :)

joaquincabezas | 5 hours ago
why am I only getting 26 matches? where's the threshold then? :D

tim333 | 3 hours ago
It's all about hyphens vs. en dashes: Fokker-Planck vs Fokker–Planck.
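
A minimal sketch (Python, with a hypothetical snippet of text standing in for the PDF) of how the two counts can diverge when the book typesets the name with an en dash (U+2013) while the search box gets an ASCII hyphen (U+002D):

  import re

  # Hypothetical stand-in for the book's text (presumably the real PDF mixes both dashes).
  text = "the Fokker-Planck equation ... the Fokker\u2013Planck equation"

  hyphen_hits = text.count("Fokker-Planck")        # ASCII hyphen-minus (U+002D) only
  en_dash_hits = text.count("Fokker\u2013Planck")  # en dash (U+2013) only
  either_hits = len(re.findall("Fokker[-\u2013]Planck", text))  # either dash

  print(hyphen_hits, en_dash_hits, either_hits)  # -> 1 1 2

Counting both spellings (or normalizing the dashes before searching) reconciles the totals.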

joaquincabezas | 2 hours ago
AI is definitely related to dashes!!

dvrp | 9 hours ago
HN question: how is this not a dupe of my days-old submission (https://news.ycombinator.com/item?id=45743810)?

borski | 8 hours ago
It is, but dupes are allowed in some cases:

“Are reposts ok?

If a story has not had significant attention in the last year or so, a small number of reposts is ok. Otherwise we bury reposts as duplicates.”

https://news.ycombinator.com/newsfaq.html

Also, from the guidelines: “Please don't post on HN to ask or tell us something. Send it to hn@ycombinator.com.”

scatedbymath | 50 minutes ago
I'm scared by the maths

leptons | 4 hours ago
Reading this reinforces that a lot of what makes up current "AI" is brute-forcing, not anything actually intelligent or thoughtful. Although I suppose our meat-minds could also be brute-forcing everything throughout our entire lives, and consciousness is like a chat prompt sitting on top of the machinery of the mind. But artificial intelligence will always be just as soulless and unfulfilling as artificial flavors.

dhampi | 4 hours ago
Guessing you’re a physicist based on the name. You don’t think automatically running renormalization-group (RG) flow in reverse has beauty to it?

There’s a lot of “force” in statistics, but that force relies on pretty deep structures and choices.

theptip | 2 hours ago
Intelligence is the manifold that these brute-force algorithms learn.

Of course we don’t brute-force this in our lifetime. Evolution encoded the coarse structure of the manifold over billions of years. And then encoded a hyper-compressed meta-learning algorithm into primates across millions of years.

tim333 | 3 hours ago
Always is a long time. It may get better.

mlmonkey | 5 hours ago
470 pages?!?!?!? FML! :-D