So plugging a RasPi into a 5090 is "just" swapping the horse for one 10,000x bigger (someone correct my ratio of the RasPi5 GPU to the RTX5090)
This is a general-purpose processor that includes 16-way SIMD instructions able to access data in a 64-by-64-byte register file as either rows or columns (and as 8-, 16-, or 32-bit data).
It also has superscalar instructions that access a separate set of 32-bit registers but are tightly integrated with the SIMD instructions (as in ARM Neon cores or x86 AVX).
This is what boots up originally.
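To make the row/column idea concrete, here's a minimal sketch in plain C. The register file is modeled as an ordinary 64x64 byte array, and the helper names are made up for illustration; the real hardware does each of these accesses as a single 16-way SIMD instruction.

    #include <stdint.h>

    static uint8_t vrf[64][64];  /* model of the 64x64-byte register file */

    /* Read 16 bytes along a row (a horizontal vector). */
    static void read_row(uint8_t out[16], int row, int col) {
        for (int i = 0; i < 16; i++)
            out[i] = vrf[row][col + i];
    }

    /* Read 16 bytes down a column (a vertical vector) -- the same data
       viewed the other way, which is what makes 2D work like
       transposes and DCTs cheap. */
    static void read_col(uint8_t out[16], int row, int col) {
        for (int i = 0; i < 16; i++)
            out[i] = vrf[row + i][col];
    }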
Videocore was designed to be good at the actions needed for video codecs (e.g. motion estimation and DCTs).
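For anyone who hasn't written codec code: the workhorse of motion estimation is the sum of absolute differences (SAD) between a block in the current frame and candidate blocks in the reference frame. A rough sketch in plain C (not the actual Videocore code) -- a 16-way SIMD unit can chew through a whole row of this per instruction:

    #include <stdint.h>
    #include <stdlib.h>

    /* SAD between a 16x16 block in the current frame and a candidate
       block in the reference frame. A motion search evaluates this for
       many candidate offsets and keeps the smallest result. */
    static unsigned sad_16x16(const uint8_t *cur, const uint8_t *ref,
                              int stride) {
        unsigned sad = 0;
        for (int y = 0; y < 16; y++)
            for (int x = 0; x < 16; x++)
                sad += abs(cur[y * stride + x] - ref[y * stride + x]);
        return sad;
    }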
I did write a 3d library that could render textured triangles using the SIMD instructions on this processor. This was enough to render simple graphics and I wrote a demo that rendered Tomb Raider levels, but only for a small frame resolution.
The main application was video codecs, so for the original Apple Video iPod I wrote the MPEG4 and h264 decoding software using the Videocore processor, which could run at around QVGA resolution.
However, in later versions of the chip we wanted more video and graphics performance. I designed the hardware to accelerate video, while another team (including Eben) wrote the hardware to accelerate 3d graphics.
So in Raspberry Pis, there is both a Videocore processor (which boots up and handles some tasks), and a separate GPU (which handles 3d graphics, but not booting up).
It is possible to write code that runs on the Videocore processor - on older Pis I accelerated some software video codecs by using both the GPU and the Videocore to offload bits of the transform, deblocking, and motion compensation, but on later Pis there is dedicated video decode hardware to do this instead.
Note that the ARMs on the later Pis are much faster and more capable than before, while the Videocore processor has not been developed further, so there is not really much use for the Videocore anymore. However, the separate GPU has been developed more and is quite capable.
Thank you, I've used your work quite a number of times now.
It's a quirk of the Broadcom chips that the RPi family uses; the GPU is the first bit of silicon to power up and do things. The GPU specifically is a bit unusual, but the general idea of "smaller thing does initial bring-up, then powers up $main_cpu" is not unusual once $main_cpu is ~ powerful enough to run Linux.
It's both funny and sad to me that we're at the point where someone would (perhaps even reasonably) describe using the GPU only for the "G" in its name as not "much of anything".
I'd also love to see the same done on the Zero 2, where the CPU is far less beefy and the trade-off might go a different way. It's an older generation of GPU though so the same code won't work.
That and the prices never really came back down to earth after the chip shortage hikes.
No. There are a bunch of alternatives with some to full pin compatibility, some being many times faster [1]. No new projects should use a new Raspberry Pi.
Maybe this is the new narrative, but it wasn't how the Pi was initially developed and marketed.
It's just a touch too expensive for the use cases many hobbyists have.
- high current 5V USB power supply you probably don't have
- HDMI micro port you have like 1 cable for
- PCIe through very fragile ribbon cable + hodgepodge of adapters
- more adapters needed for SSD
- no case, but needs ample airflow
- power input is on the side and sticks out
GPIO is the killer feature, but I'll be honest, 99% of the hardware hacking I do is with microcontrollers much cheaper than a Pi that provide a serial port over USB anyways (and the commonly-confused-for-a-full-pi Pi Pico is pretty great for this)
We had a problem trying to bring up a couple of Pi 5s, hoping they'd represent something reproducible we could deploy on multiple sites as an isolation stage for remote firmware programming. Everything looked great, until we brought one somewhere and untethered it from ethernet, and we started getting bizarre hangs. Turned out the wifi was close enough to the PCIe ribbon cable that bursts of wifi broadcasts were enough to disrupt the signal to the SSD, and essentially unmount it (taking root with it). Luckily we were able to find better shielded cables, but it's not something we were expecting to have to deal with.
It's not super powerful but my young kids use it to surf the net, play Minecraft, do art projects, etc. (we are yet to play with the gpio).
I don't get on with the keyboard but otherwise would make a decent development machine for me, considering my development starts with me ssh'ing into some remote VM and running vim.
The whole lot is tiny and extremely portable; we pack it away in a drawer when not in use.
All in it felt like good value for money for something that took about 3 minutes to get up and running.
Processor comparison too
https://www.cpu-monkey.com/en/compare_cpu-raspberry_pi_5_b_b...
The only case I can think of is very heavy compute that relies on low latency GPIO related to that compute?
Along with that, the GPIO is there and ample, so it's extremely easy to just start using it.
I do argue an ESP8266 or ESP32 is better as a development microcontroller, but you have to muck with cabling it up before you can even load a program on it, which is a few extra steps more than a Pi.
I really hope there's some kind of battery oriented low wattage high efficiency version planned someday, because we're up to requiring a 5A power supply and it's getting absurd.
This blog post shows a $2000 GPU attached to a slow SBC that costs less than 1/10th of the GPU.
It’s interesting. It’s entertaining. It’s a fun read. But it’s not a serious setup that anyone considers optimal.
Doom The Dark Ages is a single player game, so I’m not sure who you’d be cheating against, aside from maybe some real Buzz Killingtons saying you’re “cheating Microsoft by pirating it”.
I know that sounds a little pedantic; but typically DRM involves an identity layer (who is allowed to access what?). Denuvo doesn’t care about that; it’s even theoretically possible to make a Denuvo protected binary anyone could use.
This is a lot better than my memories of forcing a Pentium MMX 200 MHz PC with 32 MB SDRAM and an ATI All-in-Wonder Pro to run games from the early 2000s.
Single-digit FPS can _absolutely_ be playable if you're a desperate enough ten-year-old...
This would have been on some kind of Pentium 4 with integrated graphics. Not my earliest PC, but the first one I played any games on more advanced than the Microsoft Entertainment Packs.
I had to look at the ground and get the camera as close as possible to cross between the AH and the bank in IF. Otherwise I’d get about 0.1 fps and had to close the game, which meant waiting in line to get back. Those were the days.
> So with the right UI layout made from addons I could still be a pretty effective healer.
I got pretty good with the timings and could almost play without looking at the screen. But I was DD and it was vanilla so nobody cared if I sucked as long as I got far away with the bombs.
> I don't even remember what the dungeons looked like, just a giant grid of health bars, buttons and threat-meter graphs.
I was talking a couple of weeks ago with a mate who was MT at the time and told me he knew the feet and legs of all the bosses but never saw the animations or the faces before coming back with an alt a couple of years later. I was happy as a warlock, enjoying the scenery. With a refresh rate that gave me ample time to admire it before the next frame :D
Absolutely, sweet memories of playing at less than 10 fps using ZSNES on a 486 DX2 back in 1999...
Wasn't AoE1 released for PPC Mac natively? AoE2 was probably the best Mac game ever.
I have fond memories of playing Diablo II at 16 fps on an old (even at the time) PowerMac. I am not sure I could do it now.
And somehow, it was more mesmerizing than playing games feels now. To be a kid again.
Countless kids played Morrowind below par spec on family computers all across America.
The DGX Spark and Mac Studio are currently the two best Arm-based platforms for running that game; it seems to like a lot of CPU to feed a decent GPU.
It was a bit faster than software rendering (but hey, I suppose if you weren't doing any transparency that makes it easier lol).
I remember what a huge difference it was having a dedicated 3D card capable of fast 2D and 3D vs the software rasterizer. Yes, NovaLogic games ran better. Yes, you can play Doom at decent FPS. Yes, SpecOps ran at full monitor resolution. They had a LOT to brag about.
As a developer, I'm sure Glide was great.
But as a kid that really wanted a 3dfx Voodoo card for Christmas so I could play all the sweet 3D games that only supported Glide, I was upset when my dad got me a Rendition Verite 2200. But I didn't want to seem ungrateful, so my frustration was directed at 3dfx for releasing a proprietary API.
I was glad that Direct3D and OpenGL quickly surpassed Glide's popularity.
But yeah, then 3dfx failed to innovate. IIRC, they lagged behind in 32-bit color rendering support as well as letting themselves get caught with their pants down when NVIDIA released the GeForce and introduced hardware transform which allowed the GPU to be more than just a texturing engine. I think that was the nail in 3dfx's coffin.
Thanks for the laugh about your disappointment with your dad. I had a similar thing happen with mine when I asked for Doom and him being a Mac guy, he came back with Bungie’s Marathon. I was upset until I played Marathon… I then realized how wise my father was.
For example, if you pair an N150 mini PC with a cheap AMD eGPU (one of the laptop SKUs), you’ve made yourself the equivalent of a gaming laptop in clamshell (with better cooling) on the cheap. A price-vs-FPS curve, switching GPUs but keeping the mini PC as a constant, would be super interesting.
Managed to complete the games with decent graphics and framerate at the time. It wasn't an ideal setup, but I didn't care. In fact, I thought it was a cool hack to play games at the time without forking out a lot of money to build a gaming PC.
There are probably better options now for gaming than attaching a dedicated GPU to whatever hardware you already have, but I can verify that external GPUs are really cool and useful (though a 5090 is definitely not needed). You also don't have to care about cooling the GPU, since it's "atmosphere" cooled (though headphones and/or ANC are a must).
I never managed to figure out the issue. The BSOD was something about a gpu timeout. It worked perfectly at home but shat the bed at the dorm. I assume there was some nasty interference there.
I tried a lot of things, including a full Windows reinstall, driver rollback, cleaning out the dust, etc. The crash reason is listed as "other" Nvidia driver error code.
On Bazzite using Proton it works flawlessly. God of War, KCD2, and others. I guess it will be Linux gaming for me from now on.
I am still puzzled why this situation even can be. If you have ideas, be my guest.
Pi 4: 20 FPS; same when using ffmpeg to stream to Twitch. 5 W
Pi 5: 40 FPS; same as above. 10 W
RK3588: 300+ FPS and rock-solid 60 FPS streaming to Twitch. 15 W
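Doing the arithmetic on those numbers: both Pis come out around 4 FPS/W (20/5 and 40/10), while the 3588 lands around 20 FPS/W (300/15) - a 5x perf-per-watt gap, not just raw speed.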
So the 5090 is not even interesting for gameplay. More polygons and larger textures do not make games more fun to play.
AAA has peaked and C++ does not even deliver interesting games any more. C#/Java are way better alternatives for modding.
Also, it doesn't seem like it would be all that much more expensive for these high-end GPUs to start getting x86-64 SoCs with midrange specs baked in, and these AIO GPUs could be tailor-made for standalone AI and gaming applications. If it's the equivalent of a $10 bit of gear in terms of cost, they could charge an additional $100 for the feature, with an SoC optimized for the specs of the GPU - get rid of the need for an eGPU altogether and stream from the onboard host?
I think the sweet spot for the Pi 5 is 4GB (cost vs functionality you can use it for). But if you're like me, you don't care about value quite as much as fun/exploration. And for that, the more RAM, the merrier...
Interesting
Nothing. It’s just fun.
> It would have been more meaningful if the author tried the GPU card with an old machine, rather than a Raspberry Pi
But then it would have been lame. Who cares? If your old machine is an x86 less than 10 years old it’s most likely faster than the Pi. But that’s not the point. The point is to pair a cheap fun computer with a humongous and expensive card and see if it works. Because it’s fun.
To your point about 'meaningful', though, indeed the ole College Try to run Crysis on a Samsung NC-10 would be far more glorious! But I assure you this was very fun for me.