Well done!
I've been meaning to add "intelligence" to my Telegram monitoring bot, so this couldn't come at a better time!
Thank you for building it!
Does it run in FaaS/serverless environments out of the box? Lambdas, Cloudflare Workers, Vercel functions, and the like? Deno? The README says "Isomorphic - works everywhere", but it might be nice to make this more explicit.
- Production-ready and used by enterprises.
- Fully local and NOT a proxy. You can deploy it anywhere.
- Comes with batching, retries, caching, callbacks, and OpenTelemetry support.
- Supports custom plugins for caching, logging, HTTP client, and more. You can use it like LEGOs and make it work with your infrastructure.
- Supports plug-and-play providers. You can run fully custom providers and still leverage all the benefits of Adaline Gateway.
Features

- Strongly typed in TypeScript
- Isomorphic - works everywhere
- 100% local and private and NOT a proxy
- Tool calling support across all compatible LLMs
- Batching for all requests with custom queue support
- Automatic retries with exponential backoff
- Caching with custom cache plug-in support
- Callbacks for full custom instrumentation and hooks
- OpenTelemetry to plug tracing into your existing infrastructure
- Plug-and-play custom providers for local and custom models
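For anyone curious what "automatic retries with exponential backoff" means in practice, here is a minimal generic sketch of the technique in TypeScript. To be clear, this is not Adaline Gateway's actual API; the function name and parameters are hypothetical, and the library's own implementation may differ.

```typescript
// Hypothetical helper illustrating retry with exponential backoff.
// Not Adaline Gateway's API; names and signature are assumptions.
async function retryWithBackoff<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Double the wait each attempt: 100ms, 200ms, 400ms, ...
      const delayMs = baseDelayMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  // All attempts failed; surface the last error to the caller.
  throw lastError;
}
```

The idea is that transient failures (rate limits, brief network blips) usually resolve on their own, so spacing retries exponentially avoids hammering the provider while it recovers.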
The LLM itself is still in the cloud, right? That is, if you're using one from Anthropic or OpenAI.