16x16 sounds really shit to me, who still has perfect vision, but I bet it's life-changing to be able to identify the presence or absence of stuff around you and such! Yay for technology!
By now, we have smartphones with camera systems that beat human eyes, and SoCs powerful enough to perform whatever image processing you want them to, in real time.
But our best neural interfaces have throughput close to that of a dial-up modem, and questionable longevity. Other technologies have advanced in leaps and bounds, but the SOTA in BCI today is not that far from where it was 20 years ago. Because medicine is where innovation goes to die.
It's why I'm excited for the new generation of BCIs like Neuralink. For now, they're mostly replicating the old capabilities, but with better fundamentals. But once the fundamentals - interface longevity, ease of installation, ease of adaptation - are there? We might actually get more capable, more scalable BCIs.
Fixed the typo for you.
Inaction has a price, you know.
BCI == Brain-computer interface
AI is the final failure of "intermittent" wipers, which, like my latest car's, are irrevocably enabled to smear the road grime and imperceptible "rain" into a goo, blocking my ability to see
That's what's happening with VR: we reached a point where increasing DPI for a laptop or phone seemed to make no sense; but that was also the point where VR started to be reachable, and there a 300 DPI screen is crude and we'd really want 3x that pixel density.
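(Rough arithmetic behind that "3x", with assumed but typical numbers: ~100 degrees of horizontal FOV and ~2000 px of horizontal panel resolution per eye, and the usual ~60 pixels-per-degree rule of thumb for 20/20 acuity:)

    #include <cstdio>

    int main() {
        // Back-of-the-envelope for the "3x" claim above. All inputs are
        // assumptions: ~100 deg horizontal FOV and ~2000 px of horizontal
        // panel resolution per eye (roughly current-gen hardware), and
        // ~60 px/deg as the common 20/20-acuity rule of thumb.
        const double fov_deg  = 100.0;
        const double panel_px = 2000.0;
        const double eye_ppd  = 60.0;

        const double headset_ppd = panel_px / fov_deg;  // ~20 px/deg today
        printf("headset: %.0f px/deg, eye: %.0f px/deg -> want ~%.0fx the density\n",
               headset_ppd, eye_ppd, eye_ppd / headset_ppd);
        return 0;
    }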
I wonder why so many shades of grey? Fancy!
(Yeah, the U.K. spelling of "grey" looks more "gray" to these American eyes.)
Hilarious too that this article is on Petapixel. (Centipixel?)
https://old.reddit.com/r/electronics/comments/1olyu7r/i_made...
The video on Reddit: https://www.reddit.com/r/3Dprinting/comments/1olyzn6/i_made_...
> Do you mean the refresh rate should be higher? There's two things limiting that:
> - The sensor isn't optimized for actually reading out images, normally it just does internal processing and spits out motion data (which is at high speed). You can only read images at about 90Hz
> - Writing to the screen is slow because it doesn't support super high clock speeds. Drawing a 3x scale image (90x90 pixels) plus reading from the sensor, I can get about 20Hz, and a 1x scale image (30x30 pixels) I can get 50Hz.
I figured there would be limitations around the latter, but I was hoping the former wasn't such a big limit.
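For anyone curious what reading images out of a mouse sensor actually involves: many of these parts expose a diagnostic frame-capture mode over SPI. Here's a minimal Arduino-style sketch, assuming an ADNS-9800-class sensor (whose 30x30-pixel frame matches the numbers above); the register addresses and the two "magic" handshake writes are illustrative, based on that family's conventions, so check your part's datasheet before trusting any of them:

    #include <SPI.h>

    const int PIN_NCS = 10;                  // chip-select pin (wiring assumption)
    const uint8_t REG_FRAME_CAPTURE = 0x12;  // illustrative register addresses;
    const uint8_t REG_PIXEL_BURST   = 0x64;  // consult your sensor's datasheet
    const int FRAME_PIXELS = 30 * 30;        // 900 bytes, one 8-bit value per pixel

    uint8_t frame[FRAME_PIXELS];

    void writeReg(uint8_t reg, uint8_t val) {
      digitalWrite(PIN_NCS, LOW);
      SPI.transfer(reg | 0x80);              // MSB set = write in this family
      SPI.transfer(val);
      digitalWrite(PIN_NCS, HIGH);
      delayMicroseconds(120);                // conservative inter-command gap
    }

    void captureFrame() {
      // Assumed handshake: two magic writes put the sensor in frame-capture
      // mode, then every pixel is clocked out in one long burst read.
      writeReg(REG_FRAME_CAPTURE, 0x93);
      writeReg(REG_FRAME_CAPTURE, 0xC5);
      delay(20);                             // let the sensor fill its frame buffer

      digitalWrite(PIN_NCS, LOW);
      SPI.transfer(REG_PIXEL_BURST);         // MSB clear = read
      delayMicroseconds(100);
      for (int i = 0; i < FRAME_PIXELS; i++) {
        frame[i] = SPI.transfer(0x00);       // one pixel per byte
      }
      digitalWrite(PIN_NCS, HIGH);
    }

    void setup() {
      Serial.begin(115200);
      pinMode(PIN_NCS, OUTPUT);
      digitalWrite(PIN_NCS, HIGH);
      SPI.begin();
      SPI.beginTransaction(SPISettings(2000000, MSBFIRST, SPI_MODE3));
    }

    void loop() {
      captureFrame();
      Serial.write(frame, FRAME_PIXELS);     // stream raw frames to the host
    }

The mode-switch delays and the byte-at-a-time burst read are presumably where that ~90Hz ceiling comes from: the readout path simply wasn't designed for video.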
Sincerely, thanks a lot.
https://www.youtube.com/watch?v=EE9AETSoPHw&t=44
https://www.instructables.com/Single-Pixel-Camera-Using-an-L...
(Okay, not the same guy, but I wanted to share this somewhat related "extreme" camera project)
> Optical computer mice work by detecting movement with a photoelectric cell (or sensor) and a light. The light is emitted downward, striking a desk or mousepad, and then reflecting to the sensor. The sensor has a lens to help direct the reflected light, enabling the mouse to convert precise physical movement into an input for the computer’s on-screen cursor. The way the reflected changes in response to movement is translated into cursor movement values.
I can't tell if this grammatical error is the result of nonchalant editing and a lack of proofreading, or of a person touching up LLM content.
> It’s a clever solution for a fundamental computer problem: how to control the cursor. For most computer users, that’s fine, and they can happily use their mouse and go about their day. But when Dycus came across a PCB from an old optical mouse, which they had saved because they knew it was possible to read images from an optical mouse sensor, the itch to build a mouse-based camera was too much to ignore.
Ah, it's an LLM. Dogshit grifter article. Honestly, the HN link should be changed to the reddit post.
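For the record, the step the quoted article mangles ("the way the reflected changes in response to movement is translated...") is conceptually simple: the sensor grabs tiny frames at kHz rates and finds the shift that best aligns consecutive ones. Here's a toy sketch of that displacement estimate, using sum-of-absolute-differences block matching; the real silicon is far more clever, and vendors don't publish the exact algorithm:

    #include <cstdint>
    #include <climits>

    const int N = 30;         // frame size, matching the 30x30 sensor frames above
    const int MAX_SHIFT = 4;  // search window, in pixels

    // Sum of absolute differences between `cur` shifted by (dx, dy) and `prev`,
    // over the region where both frames stay in bounds.
    long sad(const uint8_t prev[N][N], const uint8_t cur[N][N], int dx, int dy) {
      long total = 0;
      for (int y = MAX_SHIFT; y < N - MAX_SHIFT; y++)
        for (int x = MAX_SHIFT; x < N - MAX_SHIFT; x++) {
          int d = (int)prev[y][x] - (int)cur[y + dy][x + dx];
          total += (d < 0) ? -d : d;
        }
      return total;
    }

    // Brute-force search: the (dx, dy) with the lowest SAD is the motion
    // estimate, i.e. the delta that gets reported as cursor movement.
    void estimateMotion(const uint8_t prev[N][N], const uint8_t cur[N][N],
                        int &bestDx, int &bestDy) {
      long best = LONG_MAX;
      for (int dy = -MAX_SHIFT; dy <= MAX_SHIFT; dy++)
        for (int dx = -MAX_SHIFT; dx <= MAX_SHIFT; dx++) {
          long s = sad(prev, cur, dx, dy);
          if (s < best) { best = s; bestDx = dx; bestDy = dy; }
        }
    }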