Think of a river - the water carves the banks, the banks channel the water. Which is the real river?
A model is like the river banks: it is not intelligent in itself. Neither is activity by itself. Intelligence comes from their co-constitutive relation.
You can see the structure+flow coupling in many places. The model weights (banks) vs the training set (water). The contextual tokens (banks) vs the probability mass over the next token (water). The agent's environment (banks) vs the agent's activity (water).
The trick here is that neither side can explain the process on its own, and neither is more fundamental than the other. It is a recursive process rewriting its own rules.
And we know from Gödel, Turing, and Chaitin that recursive processes exhibit incompleteness, undecidability, and incompressibility. There is no way to predict them from outside; they can only be known through unrolling. You have to be inside it, or be it, to know it.
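A toy illustration of "known only through unrolling": an elementary cellular automaton such as Rule 110 (which Matthew Cook proved Turing-complete). In general, the only way to learn its state after n rewrite steps is to actually run the n steps — this is just a minimal sketch, not a claim about any particular model:

```python
def step(cells, rule=110):
    """One rewrite pass: each cell's next value is looked up from its
    3-cell neighborhood in the bits of the rule number (wraparound edges)."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
            for i in range(n)]

# Start from a single live cell and unroll 20 steps.
state = [0] * 30 + [1] + [0] * 30
for _ in range(20):
    state = step(state)
```

The structure (the rule) and the flow (the evolving state) are both simple, but their coupling is irreducible: no shortcut formula tells you the pattern without simulating it.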
Thanks for your great contributions, especially your dedication to metaphor!
Fantastic metaphor.
I also can't help but draw a parallel (or even an overlap) with how the process works for humans. Once you put aside intrinsic "model" (brain) differences (nature vs. nurture), humans develop their intelligence the same way: relentless exploration, and decision making to guide that exploration. We don't just funnel new info into a child's brain; they're allowed to explore threads around it and take different paths, which then shape how they process the next bit of info.
If we're looking at building anything like human intelligence, the only advanced general intelligence we know and understand to a degree, exploration will be critical.
I suspect that the amount of freely available text, video, and audio will decrease once entities like Disney start winning lawsuits. Alternatively, anyone who doesn't want their work to be crawled and trained on will keep it behind logins/paywalls going forward.
I think it’s entirely fair that there should be a robots.txt type of mechanism for sites to opt out of being crawled for training, and we might see places like the EU come up with something of that sort.
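Something like this already exists informally via ordinary robots.txt, since several AI-training crawlers publish user-agent tokens they claim to honor (GPTBot is OpenAI's, Google-Extended is Google's training opt-out token, CCBot is Common Crawl's). A sketch of an opt-out, with the caveat that compliance is entirely voluntary:

```text
# Opt out of known AI-training crawlers, leave regular search alone
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```

A regulation like the one you describe would essentially make honoring such directives mandatory rather than a courtesy.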
Throwing around scary words means very little. You're using a highly unorthodox interpretation of that concept. Even the anti-AI lawsuits are typically focused on specific model outputs or some other technicality.
>entities like Disney
The same company that's now publishing papers on generative AI? See the HN front page right now.
>keep their work behind login/paywalls moving forward
At what cost? Raging against a new technology like this is sure to backfire. Not only are you crippling your regular search engine visibility, but there's now a non-negligible amount of traffic coming from AI search, and the average person will have even less reason to interact with you. Logins and paywalls are a nuisance, especially if AI is producing results competitive with whatever you have to offer.
what pattern are human color/light sensors arranged in? maybe we should replicate that pattern? an organic arrangement discovered via simulated annealing?
all this bitmap x/y display stuff is very pre-AI-ish. old tech. victorian era clockwork mechanism. built to make it easy to reason about for humans, before the advent of neuron-soup. maybe we can do better?
The very earliest color CRTs used triangular arrangements of subpixels for each color, for example.
https://commons.wikimedia.org/wiki/File:ConeMosaics.jpg
Maybe we should put the electronics in a layer in front of the display, make them transparent, and use them to filter the light. The vertebrate retina does something like that, putting nerves in front of light sensors. Imitating evolved solutions isn't always a good idea. Nature's design process is like: start with the wrong parts for the job. Lay them out back to front. Wire them together the long way round. Adapt this design slightly and latch on to any convenient side effects. Iterate ten billion times. Now it works pretty good! Or it doesn't, but the species survives anyway, which is also fine since there is no goal.