davidgomes 2 days ago

OP here, everything is available on GitHub:

- https://github.com/appdotbuild/agent

- https://github.com/appdotbuild/platform

And we also blogged[1] about how the whole thing works. We're very excited to get this out, but we still have a ton of improvements we'd like to make. Please let us know if you have any questions!

[1]: https://www.app.build/blog/app-build-open-source-ai-agent

zihotki 2 days ago

An important part of the context is missing or was cut off: it's for building apps on top of the Neon platform (an open-source PostgreSQL SaaS).

  • gavmor 2 days ago

    I.e., is it inextricably coupled to their services? Or is it a matter of swapping out a few "provider" modules?

    • igrekun a day ago

      Completely agnostic. If you run it locally, we provide a docker compose setup, and if you have other deployment preferences, pointing to your own DB is a matter of changing an env var: https://github.com/appdotbuild/agent/blob/main/agent/trpc_ag...
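
      Just for illustration, a minimal sketch of what pointing at your own Postgres could look like; the env var name and the docker-compose default below are assumptions on my part, the real one is in the linked config:

        # Hypothetical: override the database target via an environment variable.
        # The variable name and the docker-compose default here are assumptions.
        import os
        import psycopg  # psycopg 3

        database_url = os.environ.get(
            "DATABASE_URL",
            "postgresql://postgres:postgres@localhost:5432/postgres",
        )

        # Sanity check that the target database is reachable.
        with psycopg.connect(database_url) as conn:
            with conn.cursor() as cur:
                cur.execute("SELECT version()")
                print(cur.fetchone()[0])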

      We also include baseline Cursor rules in case you want to hack on this manually: https://github.com/appdotbuild/agent/tree/main/agent/trpc_ag...

      The one place we are tied is the LLM provider: you will need to supply your own keys for Anthropic / Gemini.
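
      For reference, supplying those keys through the public SDKs looks roughly like this; the env var names and model ids are examples, and how the agent wires the clients internally isn't shown here:

        # Sketch using the public Anthropic and Google Generative AI SDKs.
        # Env var names and model ids below are examples, not the agent's config.
        import os
        import anthropic
        import google.generativeai as genai

        claude = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
        genai.configure(api_key=os.environ["GEMINI_API_KEY"])

        # Quick smoke test against each provider.
        msg = claude.messages.create(
            model="claude-sonnet-4-20250514",
            max_tokens=64,
            messages=[{"role": "user", "content": "ping"}],
        )
        print(msg.content[0].text)

        gemini = genai.GenerativeModel("gemini-2.0-flash")
        print(gemini.generate_content("ping").text)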

      We did a couple of runs on top of Ollama + Gemma, so expect support for local LLMs. I can't swear to the timeline, but one of our core contributors recently built a water-cooled rig with a bunch of 3090s, so my guess is "pretty soon".
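
      To be clear, that's not wired into the agent yet; but for the curious, a local call through the Ollama Python client looks roughly like this (model name is just an example, and it assumes `ollama serve` is running with the model pulled):

        # Not wired into the agent yet; plain Ollama Python client usage.
        import ollama

        response = ollama.chat(
            model="gemma2",  # example model name
            messages=[{"role": "user", "content": "Generate a tRPC healthcheck route."}],
        )
        print(response["message"]["content"])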

ah27182 2 days ago

The CLI for this feels extremely buggy. I'm attempting to build the application, but the screen is flickering like crazy: https://streamable.com/d2jrvt

  • davidgomes 2 days ago

    Yeah, we have a PR in the works for this (https://github.com/appdotbuild/platform/issues/166); it should be fixed tomorrow!

    • ah27182 2 days ago

      Alright, sounds good. Question: what LLM model does this use out of the box? Is it using the models provided by GitHub (after I give it access)?

      • igrekun a day ago

        If you run locally, you can mix and match any Anthropic / Gemini models. As long as it satisfies this protocol, you can plug in anything: https://github.com/appdotbuild/agent/blob/4e0d4b5ac03cee0548...
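
        Purely as a hypothetical sketch (the real interface is in the linked file, and the names and signatures below are assumptions), the general shape is a small protocol your model client implements:

          # Hypothetical sketch only; the actual protocol in the repo may differ.
          from typing import Protocol

          class LLMClient(Protocol):
              async def complete(self, system: str, messages: list[dict]) -> str:
                  """Return the model's text completion for a conversation."""
                  ...

          class MyLocalModel:
              """Example stand-in that satisfies the protocol above."""

              async def complete(self, system: str, messages: list[dict]) -> str:
                  # Call whatever backend you like here (vLLM, llama.cpp, ...).
                  return "stub response"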

        We have a similar wrapper for local LLMs on the roadmap.

        If you use the CLI only, we run Claude 4 + Gemini on the backend, with Gemini serving most of the vision tasks (frontend validation) and Claude doing core codegen.
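
        As a toy illustration of that split (the agent's real routing lives in the repo; the task names and model ids here are made up):

          # Illustrative dispatcher only; task names and model ids are examples.
          def pick_model(task: str) -> str:
              vision_tasks = {"screenshot_review", "frontend_validation"}
              if task in vision_tasks:
                  return "gemini-2.0-flash"
              return "claude-sonnet-4-20250514"

          print(pick_model("frontend_validation"))  # vision -> Gemini
          print(pick_model("codegen"))              # codegen -> Claude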

      • davidgomes 2 days ago

        We use both Claude 4 and Gemini by default (for different tasks). But the idea is you can self-host this and use other models (and even BYOM - bring your own models).

  • csomar a day ago

    Average experience for AI-made/related products.

    • ecb_penguin a day ago

      Exactly. Non-AI projects have always been easy to build without issues. That's why we have so many build systems. We perfected it the first try and then made lots of new versions based on that perfect Makefile.