Show HN: App.build, an open-source AI agent that builds full-stack apps
89 points | 1 day ago | 3 comments | app.build
davidgomes
1 day ago
OP here, everything is available on GitHub:

- https://github.com/appdotbuild/agent

- https://github.com/appdotbuild/platform

And we also blogged[1] about how the whole thing works. We're very excited to get this out, but we still have a ton of improvements we'd like to make. Please let us know if you have any questions!

[1]: https://www.app.build/blog/app-build-open-source-ai-agent

ivape
1 day ago
Probably not the question you want to hear, but: what template or CSS lib is that for the landing page? I'm really in love with it.
zihotki
1 day ago
An important part of the context is missing or was cut off: this is for building apps on top of the Neon platform (an open-source PostgreSQL SaaS).
gavmor
1 day ago
I.e., inextricably coupled to their services? Or is it a matter of swapping out a few "provider" modules?
igrekun
1 day ago
Completely agnostic. If you run it locally, we provide a docker compose file; if you have other deployment preferences, pointing to your DB is a matter of changing an env var: https://github.com/appdotbuild/agent/blob/main/agent/trpc_ag...

We include baseline Cursor rules in case you want to hack on this manually: https://github.com/appdotbuild/agent/tree/main/agent/trpc_ag...

Where we are tied is the LLM provider: you will need to supply your own keys for Anthropic / Gemini.

We did a couple of runs on top of Ollama + Gemma, so expect support for local LLMs. Can't swear on the timeline, but one of our core contributors recently built a water-cooled rig with a bunch of 3090s, so my guess is "pretty soon".
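The env-var swap described above can be sketched like this (a minimal sketch; the variable name `DATABASE_URL` and the default value are assumptions on my part, since the actual name is in the truncated link):

```python
import os

# Hypothetical default matching a bundled docker-compose Postgres;
# the real variable name and default live in the repo's config.
DEFAULT_URL = "postgresql://postgres:postgres@localhost:5432/app"

def database_url() -> str:
    """Prefer an operator-supplied DATABASE_URL, otherwise fall back
    to the local docker-compose database."""
    return os.environ.get("DATABASE_URL", DEFAULT_URL)
```

Under that assumption, pointing the agent at your own database would just be `DATABASE_URL=postgres://... docker compose up`, with no code changes.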

ah27182
1 day ago
The CLI for this feels extremely buggy. I'm attempting to build the application, but the screen is flickering like crazy: https://streamable.com/d2jrvt
davidgomes
1 day ago
Yeah, we have a PR in the works for this (https://github.com/appdotbuild/platform/issues/166); it should be fixed tomorrow!
ah27182
1 day ago
Alright, sounds good. Question: what LLM model does this use out of the box? Is it using the models provided by GitHub (after I give it access)?
igrekun
1 day ago
If you run it locally, you can mix and match any Anthropic / Gemini models. As long as it satisfies this protocol https://github.com/appdotbuild/agent/blob/4e0d4b5ac03cee0548... you can plug in anything.
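To illustrate the plug-anything point, here is a rough sketch of what such a protocol can look like in Python (the names `LLM` and `complete` are made up for illustration; the real interface is in the linked file):

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class LLM(Protocol):
    """Illustrative stand-in for a model interface; the real method
    names live in the repo file linked above."""
    def complete(self, prompt: str) -> str: ...

class EchoModel:
    """Toy implementation: anything with a matching complete() plugs in,
    whether it wraps Anthropic, Gemini, or a local server."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

def run_codegen(model: LLM, prompt: str) -> str:
    # The caller depends only on the protocol, not a concrete provider.
    return model.complete(prompt)
```

The design point is duck typing: the agent never imports a concrete provider, so swapping in a local-LLM wrapper is just another class with the same method.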

We have a similar wrapper for local LLMs on the roadmap.

If you use the CLI only, we run Claude 4 + Gemini on the backend, with Gemini serving most of the vision tasks (frontend validation) and Claude doing core codegen.

davidgomes
1 day ago
We use both Claude 4 and Gemini by default (for different tasks). But the idea is you can self-host this and use other models (and even BYOM - bring your own models).
csomar
1 day ago
Average experience for AI-made/related products.
ecb_penguin
1 day ago
Exactly. Non-AI projects have always been easy to build without issues. That's why we have so many build systems. We perfected it the first try and then made lots of new versions based on that perfect Makefile.