Zero-Copy GPU Inference from WebAssembly on Apple Silicon
37 points | 4 hours ago | 3 comments | abacusnoir.com | HN
saagarjha
1 hour ago
I'm curious what this offers over just building the host-side code to be native?
trueno
2 hours ago
> on Apple Silicon, a WebAssembly module's linear memory can be shared directly with the GPU: no copies, no serialization, no intermediate buffers

enhance

> no copies, no serialization, no intermediate buffers

Would it kill people to write their own stuff? Why are we doing this? Out of all the things people immediately cede to AI, they cede their human ability to communicate and convey/share ideas. This timeline is bonkers.

Aurornis
1 hour ago
I’ve become overly sensitive to it as well because it’s such a reliable indicator that there are other problems in the work.

I’ve wasted so much time this year looking at interesting repos before discovering that one of the main claims was a hallucination, or that when I got to the relevant part of the codebase it just had a big note from the LLM that it’s a placeholder until it can figure out how to do the requested thing.

The people who have AI write their articles don’t care if it works or if it’s correct. They’re trying to get jobs and want something quick and interesting that will appeal to a lazy hiring manager. We’re just taking the bait too.

notepad0x90
24 minutes ago
I don't know, to me your sentiment sounds a lot like how back in the day they used to say "you can't just use a calculator all the time, use your brain and show the work on pen and paper".

Humans have been using tools to communicate since prehistory. Language itself is a tool of communication, invented to supersede body language, grunts, and noises. The thought and idea are theirs, and it was communicated. Would it be that much different if they used a spellchecker extensively to edit their work?

I get why you're annoyed, but is it really such a big deal? Random people aren't to blame for whatever other annoyances "AI slop" has created.

rvz
2 hours ago
This sort of obvious pattern is an instant AI giveaway that I keep seeing in hundreds of blogs and code posted on this site:

   "Here is X - it makes Y"

   "That's not X, it's Y."

   "...no this, no that, no X, no Y."
Another way of telling via code is to deduce the author's experience: they apparently became an expert in a different language since... yesterday.

There will come a time when this is a problem for those who over-rely on AI and then struggle in on-site interviews with whiteboard tests.

bensyverson
2 hours ago
I think the days of on-site interviews with whiteboard tests may be drawing to a close faster than you suspect.
JSR_FDED
1 hour ago
Huh, I’m 100% going to interview this way the next time I have to hire an engineer. I can’t think of a better way to get a sense of how a candidate reasons about things, and of their values: do they have a sense of responsibility, conscientiousness, team fit?

All other things that could be LLM-mediated no longer carry any signal.

andsoitis
5 minutes ago
> I can’t think of a better way to get a sense of how a candidate reasons about things

Some ideas to help you: ask the candidate something underspecified and watch what they do first. Do they ask clarifying questions and make their assumptions explicit? After they answer, ask what would change their mind and where their approach breaks down. Pick a topic they know and ask them to explain it to a smart non-engineer. Make them estimate something they can’t look up (this forces them to decompose, bound, and calibrate). Once they’ve proposed a solution, change the constraints to see whether they can adapt or get stuck.

What you want to evaluate is dynamic reasoning and adaptability.

z0r
37 minutes ago
Is this implying that you don't believe people will hire programmers anymore?
m00dy
2 hours ago
I also think we will never go back to the good old days.
wmf
2 hours ago
This works in wasmtime, not in browsers.
thrill
2 hours ago
Why would it not work in a browser?
m00dy
2 hours ago
It would be hard to share the same memory location with the GPU, right?
junon
2 hours ago
If the browser supported it, it could expose it via a buffer view or something, but that'd be quite the security surface area, one would think.