Show HN: Tokenaru – commodity market for LLM tokens
I have been reading HN for over a decade, but this is the first time I have something to submit!

Six months ago, I started tracking my OpenAI usage, and the numbers scared me. Like many of you, I hit subscription limits and watched costs spiral. I tried cutting corners: explored cheaper models (the quality is not there yet), ran local models through Ollama, and did a lot of optimization to use fewer tokens.

But I finally hit the wall with my OpenClaw/Molt agent. Usage spiked through the roof, and although I see a lot of value in having my personal Jarvis, the ROI is not there yet.

At some point I realized that the real issue is not price itself, it's access + predictability. Some people and companies have more capacity than they need at times, while others pay retail while trying to scale.

So I'm building a commodity market for LLM tokens.

Sellers can:

* Offer OpenAI capacity on their terms
* Set a floor price (% of retail)
* Control timing and volume

Buyers can:

* Bid below retail
* Get OpenRouter-compatible API access
* Pay only for what they use
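"OpenRouter-compatible" means the standard OpenAI chat-completions wire format, so existing clients should work by swapping the base URL. A minimal sketch of what such a request could look like; the base URL and key format here are my assumptions for illustration, not the documented endpoint:

```python
import json

# Hypothetical endpoint -- the real base URL will be in the onboarding docs.
TOKENARU_BASE = "https://api.tokenaru.com/v1"

def chat_request(api_key: str, prompt: str, model: str = "gpt-4o-mini"):
    """Build the URL, headers, and JSON body for a standard
    OpenAI-style /chat/completions call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return f"{TOKENARU_BASE}/chat/completions", headers, json.dumps(body)

url, headers, payload = chat_request("sk-test", "Hello!")
print(url)  # → https://api.tokenaru.com/v1/chat/completions
```

Because the shape matches OpenAI's API, any OpenAI or OpenRouter SDK that accepts a custom base URL should be usable unchanged.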

Under the hood:

* Real-time bid/ask order-book matching (like a stock exchange)
* <10 ms added latency
* API keys encrypted
* Metadata for every transaction is logged (never content)
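For those unfamiliar with exchange mechanics: a trade executes whenever the best bid (the highest % of retail a buyer will pay) meets or crosses the best ask (the lowest % a seller will accept). A toy price-time-priority matcher sketching the idea; this is my own simplified illustration, not Tokenaru's actual engine, and it always fills at the ask price:

```python
import heapq

class OrderBook:
    """Toy price-time-priority order book. Prices are % of retail;
    quantities are tokens. Earlier orders at the same price fill first."""

    def __init__(self):
        self._bids = []  # max-heap via negated price: (-price, seq, qty)
        self._asks = []  # min-heap: (price, seq, qty)
        self._seq = 0    # tie-breaker for time priority

    def _match(self):
        trades = []
        # Cross while the best bid meets or exceeds the best ask.
        while self._bids and self._asks and -self._bids[0][0] >= self._asks[0][0]:
            bid = heapq.heappop(self._bids)
            ask = heapq.heappop(self._asks)
            qty = min(bid[2], ask[2])
            trades.append((ask[0], qty))  # simplification: fill at ask price
            if bid[2] > qty:  # re-queue any unfilled remainder
                heapq.heappush(self._bids, (bid[0], bid[1], bid[2] - qty))
            if ask[2] > qty:
                heapq.heappush(self._asks, (ask[0], ask[1], ask[2] - qty))
        return trades

    def bid(self, pct_of_retail, tokens):
        self._seq += 1
        heapq.heappush(self._bids, (-pct_of_retail, self._seq, tokens))
        return self._match()

    def ask(self, pct_of_retail, tokens):
        self._seq += 1
        heapq.heappush(self._asks, (pct_of_retail, self._seq, tokens))
        return self._match()

book = OrderBook()
book.ask(70, 1_000_000)          # seller: floor at 70% of retail, 1M tokens
print(book.bid(75, 400_000))     # → [(70, 400000)] -- buyer crosses, fills at 70
```

A real matching engine adds partial-fill accounting, cancellation, and per-key quota checks, but the crossing condition above is the core of it.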

Why I’m posting:

I need 20 people to test this MVP:

* 10 sellers (you have OpenAI spend / capacity you can make available)
* 10 buyers (you want cheaper, reliable access)

Current limits (for now):

* OpenAI only
* Manual onboarding
* Very basic web UI

---

If you’re interested:

1. See https://tokenaru.com
2. Use the form to submit your request (or email hello@tokenaru.com)
3. Tell me about your use case + rough monthly spend

What I want to learn:

1. What discount makes selling worthwhile?
2. What safeguards would make you comfortable sharing your keys?
3. What is your strategy for keeping up with AI costs tomorrow?

Thoughts? I'll be around all day to answer any questions.
