Local-first software (2019)
698 points
19 hours ago
| 53 comments
| inkandswitch.com
DataDaoDe
17 hours ago
[-]
Yes a thousand percent! I'm working on this too. I'm sick of everyone trying to come up with a use case to get all my data into everyone's cloud so I have to pay a subscription fee just to make things work. I'm working on a fitness tracking app right now that will use the Sublime model - just buy it, get updates for X years, sync with all your devices, and use it forever. If you want updates after X years, buy the newest version again. If it's good enough as is - and that's the goal - just keep using it forever.

This is the model I want from 90% of the software out there: just give me a reasonable price to buy it, make the product good, and don't marry it to the cloud so much that it's unusable without it.

There are also a lot of added benefits to this model in general beyond the data privacy (most are mentioned in the article), but not all the problems are solved here. This is a big space that still needs a lot of tooling to make things really easygoing, but the tech to do it is there.

Finally, the best part (IMHO) about local-first software is that it brings back a much healthier incentive structure - you're not monetizing via ads or tracking users or maxing "engagement" - you're just building a product and getting paid for how good it is. To me it feels like it's software that actually serves the user.

reply
patmorgan23
12 hours ago
[-]
Obsidian, the note-taking app, is a great model to follow as well. The client is completely free and they sell an optional syncing service. The notes are all in markdown files, so the client itself is completely optional.
reply
crossroadsguy
2 hours ago
[-]
This is the reason I have always refused to use the Bear note-taking app, irrespective of how good and snappy that app is: they keep their notes in a SQLite db now, and even though that file can be backed up and handled locally, my notes are not easily accessible to me. I can't easily edit my notes in other editors (which I often like to do on my mac), and I can't do version-controlled backup and sync of those files the way I want outside of iCloud (which is what Bear uses).

What is sad is that they used to be a local-files-first note app, and then they moved to SQLite, citing some sync and performance issues.

reply
agos
1 hour ago
[-]
I didn't know they made this change, which means it's time to think about migrating away from Bear. Which is a pity, because the software itself is rock solid.
reply
wim
3 hours ago
[-]
A backend can be part of the functionality though, such as for real-time collaboration and syncing. But you can have ownership and longevity guarantees for both the data and the service as long as you can eject [1] from the cloud and switch to self-hosting, or back, at any time, which is what we do for our notes/tasks IDE.

[1] https://thymer.com/local-first-ejectable

reply
maxhille
14 hours ago
[-]
How do you plan to do the syncing without some sort of cloud infrastructure?
reply
CGamesPlay
12 hours ago
[-]
There are a lot of valid answers to this! One is to use your platform's provided one, like OneDrive or iCloud. Another is to integrate with some other sync platform; Dropbox is a popular target for this. Peer-to-peer is another, although that obviously also comes with limitations. Finally, bring-your-own-sync is a popular choice amongst open-source apps, where you provide a self-hostable sync server.
reply
jmb99
8 hours ago
[-]
The benefit of local-first is that you're not incentivized to sell your cloud offering, so you can just give options. Sync with iCloud, Google Drive, OneDrive, Dropbox, Mega, SMB, SFTP, FTP, whatever you feel like adding support for. And since local-first usually means having some kind of sane file format, you can let “advanced” users manage their own files and synchronization like people have been doing for the last 50 years.
reply
Hard_Space
1 hour ago
[-]
For Joplin I use WebDAV on the 10 GB of free file storage that comes with Fastmail. So I have easy sync across multiple platforms and form factors, and even substantial notes make little dent in the allowance.
reply
WD-42
10 hours ago
[-]
Check out Aardvark (renamed to Reflection); it's a collaborative note-taking app from the GNOME folks. I think the idea isn't to completely remove cloud infrastructure, but to at least make it optional and/or provide alternatives. For example, this note app works via P2P. blogs.gnome.org/tbernard/2025/06/30/aardvark-summer-2025-update/
reply
piperswe
14 hours ago
[-]
Something like Syncthing, perhaps?
reply
dsp_person
12 hours ago
[-]
Anyone know of any mobile apps that have done this and bundled their own fork of Syncthing under the hood for syncing?
reply
FallCheeta7373
7 hours ago
[-]
Practically speaking, it's not really needed for a person going out of their way to set up Syncthing - you can just sync the underlying folder. I do this with Logseq: their syncing subscription is paid, so I just sync the underlying Logseq graph and markdown files. It's seamless and rarely disturbs me, and it works well in the background, although Android seemingly doesn't respect my background preferences and clears it out of my RAM when I inevitably hit the clear button. But that's solvable by simply rebooting once in a while.
reply
pvh
6 hours ago
[-]
Ideally, you would use existing commodity infrastructure but we have found none of it is really super fit for our purposes. Failing that, we have been developing an approach to low-maintenance reusable infrastructure. For now, I would advise running your own but positioning yourself to take advantage of commodity systems as they emerge.
reply
j45
9 hours ago
[-]
Syncthing
reply
MajesticHobo2
9 hours ago
[-]
You can use FTP and SVN.
reply
cortesoft
6 hours ago
[-]
Both of those require a server
reply
rschiavone
11 hours ago
[-]
There's a git plugin.
reply
DataDaoDe
14 hours ago
[-]
Right now it's in WebRTC.
reply
nerdyadventurer
7 hours ago
[-]
> get updates for X years, sync with all your devices and use it forever. If you want updates after X years buy the newest version again. If its good enough as is - and that's the goal - just keep using it forever.

While this sounds like a good deal, with this approach:

- You have to charge the total cost of the subscription at once (1y or 2y).

- You still have to keep servers running for syncing, and you also have to think about cases where a user syncs 1y of data in a single day.

- You have to keep people on the payroll for future development.

(Here I'm thinking only from the developer's perspective.)

reply
tarpit_idea
14 hours ago
[-]
Totally agree. If you don't mind - what tech stack are you using for your fitness tracking app? I'm particularly curious about how you handle cross-device sync :)
reply
charcircuit
16 hours ago
[-]
>you're not monetizing via ads

Yes, you are. You can find tons of purely local apps that monetize themselves with ads.

reply
DataDaoDe
16 hours ago
[-]
Sure, you could. I'm not - I don't think it's in the spirit of local-first. And I wouldn't pay money for that, but if you or someone else wants to build that kind of software - it's a free world :)
reply
criddell
15 hours ago
[-]
It’s easy to say you wouldn’t do that, but if it gets to the point where you have an employee helping you out and in a downturn you have to choose between laying them off or pushing an ad to keep paying them one more quarter, you might reconsider.
reply
nofunsir
14 hours ago
[-]
No, ads aren't the solution for everything - and in my opinion, for anything.
reply
earthnail
3 hours ago
[-]
You will reconsider this argument when you start publishing your own ads to make people aware of your software.

There are different kinds of ads, but let's be clear that even a Show HN is a form of ad. Some forms of ads are just more appreciated than others.

reply
thaumasiotes
15 hours ago
[-]
> You can find tons of purely local apps tha[t] monetize themselves with a[d]s.

How do they do that without hitting the internet?

reply
kid64
15 hours ago
[-]
It's "local first", not "local only".
reply
thaumasiotes
14 hours ago
[-]
Sorry, a "purely local app" isn't "local only"?
reply
senko
3 hours ago
[-]
Not OP, but no.

IMHO, a fully local app is an app that can run locally with all the functionality, not that it's isolated from everything else.

Browser, email client (running locally on your device such as Mail.app, mutt, Outlook,...), Zed (text editor, runs locally but can check for updates... as can many other modern apps)...

reply
satvikpendem
12 hours ago
[-]
You can hardcode ads into each build that don't need Internet access.
reply
kid64
10 hours ago
[-]
Well if you're gonna get all accurate on me...
reply
free_bip
13 hours ago
[-]
I could be wrong, but I think they're referring to the WinRAR model, where there are occasional "annoyances" that you can either ignore or pay to get rid of.
reply
charcircuit
14 hours ago
[-]
Point 3 from the article is

>3. The network is optional

Ad SDKs usually allow caching ads for a period of time so that ads can still be shown while the device is temporarily offline.

reply
fud101
5 hours ago
[-]
Bro who wants your pointless fitness data? Not even you care that much for that. Just use a notepad ffs.
reply
echelon
14 hours ago
[-]
> I'm sick of everyone trying to come up with a use case to get all my data in everyone's cloud so I have to pay a subscription fee to just make things work.

AI photo and video generation is impractical to run locally.

ComfyUI and Flux exist, but they serve a tiny sliver of the market with very expensive gamer GPUs. And if you wanted to cater to that market, you'd have to support dozens of different SKUs and deal with Python dependency hell. And even then, proficient ComfyUI users are spending hours experimenting and waiting for renders - it's really only a tool for niche artists with extreme patience, such as the ones who build shows for the Las Vegas Sphere. Not your average graphic designers and filmmakers.

I've been wanting local apps and local compute for a long time, but AI at the edge is just so immature and underpowered that we might see the next category of apps only being available via the cloud. And I suspect that these apps will start taking over and dominating much of software, especially if they save time.

Previously I'd only want to edit photos and videos locally, but the cloud offerings are just too powerful. Local cannot seriously compete.

reply
satvikpendem
12 hours ago
[-]
But who said anything about AI? Lots of local-first apps neither have nor need any AI whatsoever. And by the way, Topaz Labs has good offerings for editing photos and videos with AI that run locally, which work great for many use cases (although they're not fully generative like Veo etc., more like upscaling and denoising, which does use generative AI, but not like the former).
reply
bigfatkitten
10 hours ago
[-]
Most cloud apps have no need for AI either, but companies are pushing it anyway for bullshit marketing reasons, similar to what they did with blockchain a decade ago.
reply
satvikpendem
5 hours ago
[-]
Sure, but that's unrelated to my point; it's a non sequitur.
reply
echelon
12 hours ago
[-]
I suspect that most content will be generated in the future and that generation will dominate the creative fields, white collar work, and most internet usage.

If that's true, it's a substantial upset to the old paradigms of data and computing.

reply
satvikpendem
12 hours ago
[-]
Yes, that is true, but again for apps like a fitness tracker, it is not "content" based. Sure, it might have some AI in the form of chatbots to ask what your diet plan should be based on your current progress, but that's not what you're talking about. In my experience, most local-first apps are like this fitness tracker, utility tools, rather than a means to view content, like TikTok.
reply
echelon
9 hours ago
[-]
The vast majority of apps, or at least data consumption, will not fit the shape of "fitness tracker". Budgeting, emails [1], workout routines - those will fall into a non-generative bucket of applications.

I still maintain that in the future, most applications and screen time will fall into a generative AI bucket: creating media, writing code, watching videos, playing games, searching for information, etc. I wouldn't even be surprised if our personal images and videos get somehow subsumed and "enriched" with AI.

[1] Well, email might fall into a non-generative bucket. There are already tools that purport to both read and write your emails for you. I'm not quite sure what to make of those.

reply
satvikpendem
5 hours ago
[-]
> or at least data consumption

Good thing I'm not talking about data consumption apps then, as I mentioned in my comment above. Local-first apps specifically are not amenable to data consumption purposes, so while you are right on the generative AI part, it's unrelated to the topic of this post.

reply
flkenosad
13 hours ago
[-]
> AI photo and video generation is impractical to run locally.

You think it always will be? What can the new iPhone chips do locally?

reply
bananaboy
11 hours ago
[-]
Regardless of what hardware capabilities exist, the previous post makes it sound like every application needs AI, which is just not true.
reply
echelon
13 hours ago
[-]
> You think it always will be? What can the new iPhone chips do locally?

I suspect we're a decade off from being able to generate Veo 3, Seedance, or Kling 2.1 videos directly on our phones.

This is going to require both new compute paradigms and massively more capable hardware. And by that time who knows what we'll be doing in the data center.

Perhaps the demands of generating real time fully explorable worlds will push more investment into local compute for consumers. Robotics will demand tremendous low latency edge compute, and NVidia has already highlighted it as a major growth and investment opportunity.

reply
samwillis
16 hours ago
[-]
There is now a great annual Local-first Software conference in Berlin (https://www.localfirstconf.com/) organised by Ink and Switch, and it has spawned a spin-out, Sync Conf, this November in SF (https://syncconf.dev/).

There was a great panel discussion this year from a number of the co-authors of the paper linked, discussing what local-first software is in the context of dev tools and what they have learnt since the original paper. It's very much worth watching: https://youtu.be/86NmEerklTs?si=Kodd7kD39337CTbf

The community is very much settling on "sync" being a component of local-first, but applicable much more widely - along with local-first software being a characteristic of end-user software, and dev tools (such as sync engines) being enabling tools but not "local-first" in themselves.

The full set of talks from the last couple of years are online here: https://youtube.com/@localfirstconf?si=uHHi5Tsy60ewhQTQ

It's an exciting time for the local-first / sync engine community. We've been working on tools that enable realtime collaborative and async collaborative experiences, and now with the onset of AI the market for this is exploding. Every AI app is inherently multi-user collaborative, with the agents as actors within the system. This requires the tech that the sync engine community has been working on.

reply
trinsic2
3 hours ago
[-]
Thanks for the info. Didn't know there were sync conferences.
reply
pcollins123
10 minutes ago
[-]
100% agree! I built Paisley (because it is the opposite of Plaid) to host your personal finances locally, and it is 100% open source. Paisley pulls data from your financial institutions by scraping balances and importing CSV exports, storing everything locally in a simple SQLite database.

https://github.com/patrickcollins12/paisley

reply
Jtsummers
19 hours ago
[-]
Worth a read, and it's had some very active discussions in the past:

https://news.ycombinator.com/item?id=19804478 - May 2019, 191 comments

https://news.ycombinator.com/item?id=21581444 - Nov 2019, 241 comments

https://news.ycombinator.com/item?id=23985816 - Jul 2020, 9 comments

https://news.ycombinator.com/item?id=24027663 - Aug 2020, 134 comments

https://news.ycombinator.com/item?id=26266881 - Feb 2021, 90 comments

https://news.ycombinator.com/item?id=31594613 - Jun 2022, 30 comments

https://news.ycombinator.com/item?id=37743517 - Oct 2023, 50 comments

reply
the_snooze
18 hours ago
[-]
Anything with online dependencies will necessarily require ongoing upkeep and ongoing costs. If a system is not local-first (or ideally local-only), it’s not designed for long-term dependability.

Connected appliances and cars have got to be the stupidest bit of engineering from a practical standpoint.

reply
api
18 hours ago
[-]
The entire thing is because of subscription revenue.

It's self-reinforcing, because companies that get subscription revenue have both more revenue and higher valuations, enabling more fundraising and causing them to beat out companies that do not follow this model. This is why local-first software died.

reply
tikhonj
17 hours ago
[-]
I remember seeing somebody summarize this as "SaaS is a pricing model" or "SaaS is financialization" and it totally rings true. Compared to normal software pricing, a subscription gives you predictable recurring revenue and a natural sort of price discrimination (people who use your system more, pay more). It's also a psychological thing: folks got anchored on really low up-front prices for software, so paying $2000 for something up-front sounds crazy even if you use it daily for years, but paying $25/month feels reasonable. (See also how much people complain about paying $60 for video games which they play for thousands of hours!)

It's sad because the dynamics and incentives around clear, up-front prices seem generally better than SaaS (more user control, less lock-in), but almost all commercial software morphs into SaaS thanks to a mix of psychology, culture and market dynamics.

There are other advantages to having your software and data managed by somebody else, but they are far less determinative than structural and pricing factors. In a slightly different world, it's not hard to imagine relatively expensive software up-front that comes with a smaller, optional (perhaps even third-party!) subscription service for data storage and syncing. It's a shame that we do not live in that world.

reply
danjl
15 hours ago
[-]
Correct. SaaS is a business model, not a technical concept. But the real problem is that there is no equivalent business model for selling local-first software. Traditional desktop apps were single-purchase items. Local-first software is not, because you just navigate to a website in your browser and, blammo, you get the software. What we need is a way to make money off of local-first software.
reply
trinsic2
3 hours ago
[-]
I'm not understanding why we have to have a model that replicates SaaS pricing for local-first software.

Obsidian is doing a pretty good job selling sync functionality for their free client, because they have a really good markdown editor implementation, IMHO, with community plug-in support that beats every PKM cloud tool out there that competes with them.

reply
Timwi
2 hours ago
[-]
> What we need is a way to make money off of local first software.

No, what we need is a way for people to not starve so that they don't have to make money at all and can focus instead on their passion project(s). Cough UBI cough

reply
gffrd
13 hours ago
[-]
> there is no equivalent business model for selling local first software.

Sure there is: “$500 upfront or $21/mo for 24 months *”

* if you don't complete your 24 payments, we freeze your license.

reply
HappMacDonald
10 hours ago
[-]
So Local-first DRM then?
reply
flomo
14 hours ago
[-]
It's the missing middle. A manager can just expense $25/mo, while $2000 requires an approval process, which requires outside sales, which means it really costs at least $20,000.
reply
3eb7988a1663
13 hours ago
[-]
Ha! If only that were true. I gave up on my effort to buy a one year license for $25 after filling out too many TPS reports. Which is probably part of the design of the system.
reply
api
15 hours ago
[-]
SaaS is a business model. Cloud is DRM. If you run the software in the cloud it can't be pirated and there is perfect lock-in. Double if the data can't be exported.

Related: I've been incubating an idea for a while that open source, as it presently stands, is largely an ecosystem that exists in support of cloud SaaS. This is quite paradoxical because cloud SaaS is by far the least free model for software -- far, far less free than closed source commercial local software.

reply
seec
14 hours ago
[-]
Yes, this is the main reason for doing "cloud", I believe. Otherwise it would make no sense for someone like Adobe to adopt this model, since the software still largely has to run locally for technical reasons.

It's the same thing as subscriptions for movies like Netflix, except at least in the latter case we can fight back by various means (and it's not a necessity).

The SaaS model is basically a perfect racketeering setup; I think it should be outlawed, at least philosophically. There is no way business is not going to abuse that power, and they have already shown as much...

I agree with your sentiment on open source. I think, like many of these types of things, it lives in contradictions. In any case, Linux as it is today couldn't exist without the big commercial players paying quite a bit to get it going.

reply
bboygravity
17 hours ago
[-]
The root cause of the problem is that it's easier to make personalized stuff with a server/backend (cloud?) than without, maybe?

Example: I made a firefox extension that automatically fills forms using LLM. It's fully offline (except OPTIONALLY) the LLM part, optionally because it also supports Ollama locally.

Now the issue is that it's way too hard for most people to use: find the LLM to run, acquire it somehow (pay to run it online or download it to run in Ollama), configure your API URL, enter an API key, and save all of your details for form filling locally in text files, which you then have to back up and synchronize to other devices yourself.
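
For concreteness, the local path boils down to something like the sketch below, written against Ollama's standard REST endpoint (the function and prompt are illustrative, not the extension's actual code):

    // Sketch: ask a locally running Ollama instance to fill a form field.
    // Endpoint and model are whatever the user configured; no cloud involved.
    async function fillFormField(fieldLabel: string, profile: string): Promise<string> {
      const res = await fetch("http://localhost:11434/api/generate", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          model: "llama3.2", // the user must have pulled this model beforehand
          prompt: `Profile:\n${profile}\n\nValue for the form field "${fieldLabel}" (reply with the value only):`,
          stream: false, // one JSON object back instead of a token stream
        }),
      });
      const data = await res.json();
      return data.response.trim(); // Ollama returns the completion in "response"
    }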

The alternative would be: create account, give money, enter details, and everything is synced and backed up automatically across devices, online LLM pre-selected and configured. Ready to go. No messing around with Ollama or OpenRouter, just go.

I don't know how to solve it in a local way that would be as user friendly as the subscription way would be.

Now things like cars and washing machines are a different story :p

reply
tshaddox
16 hours ago
[-]
> The root cause of the problem is that it's easier to make personalized stuff with server/backend (?cloud?) than without maybe?

That, and also there are real benefits to the end user of having everything persisted in the cloud by default.

reply
goopypoop
10 hours ago
[-]
I don't think having to manually sync preferences (or set up an unnecessary LLM) is really "the root cause" of "why local first software died".
reply
okr
17 hours ago
[-]
Can the LLM not help with setting up the local part? (Sorry, that was just the first thought I had.)
reply
seec
14 hours ago
[-]
Pretty much greed being a universally destructive force in the world as usual.

When Apple joined the madness, all hopes were lost (that was a long time ago now, sigh).

reply
ashdev
16 hours ago
[-]
This was refreshing to read! More apps should be local-first. If the user does not want to sync their data to the cloud, they should have that option.

I’ve been building the offline-first (or local-first) app Brisqi[0] for a while now, it was designed from the ground up with the offline-first philosophy.

In my view, a local-first app is designed to function completely offline for an indefinite period. The local experience is the foundation, not a fallback, and cloud syncing should be a secondary enhancement, not a requirement.

I also don't consider apps that rely on a temporary cache to be offline-first. A true offline-first app should use a local database to persist data. Many apps labeled as “offline-first” are actually just offline-tolerant: they offer limited offline functionality but ultimately depend on reconnecting to the internet.

Building an offline-first app is certainly more challenging than creating an online-only web app. The syncing mechanism must be reliable enough to handle transitions between offline and online states, ensuring that data syncs to the cloud consistently and without loss. I’ve written more about how I approached this in my blog post[1].

[0] https://brisqi.com

[1] https://blog.brisqi.com/posts/how-i-designed-an-offline-firs...

reply
AstroBen
9 hours ago
[-]
How has it been going? I've been thinking of trying this model but a bit worried about how much harder it would be to make it sustainable as a business
reply
monkeyelite
11 hours ago
[-]
There is no reason for every application to have its own sync platform. I suspect this framing came out of mobile apps where there is no composability or modularity between programs.

If you really embrace "local first" just use the file system, and the user can choose from many solutions like git, box, etc.

I hate signing up for your sync just as much as for any other SaaS, but it's even more opaque and likely to break.

reply
swsieber
10 hours ago
[-]
I agree that not every app needs its own sync engine, but I disagree with your framing that the file system is the universal way to embrace local-first. I have two reasons.

First is that yeah, local-first, but I also want concurrency. If it's just local-first, you're right, any old sync will do. But I want more than that. I want to not have to think (a la Dropbox, being slick). I want my wife and me to be able to make separate edits on our phones when we're in a dead zone.

Second is that sync works a lot better when it has deep knowledge of the data structure and its semantics. Git and Box both have significant shortcomings, both exacerbated by the desire for concurrency.

reply
monkeyelite
10 hours ago
[-]
But this problem isn't going to be solved by every app making its own sync system. Even if there is a magic library you can adopt that does a pretty good job, you still end up with everyone having their own completely independent hosting solution and sync schedule.

If files are insufficient, what data-structure would make modular sync possible for multiple applications in an OS?

And I'm not suggesting one doesn't exist; I'm challenging you to present a comprehensive solution, one that probably involves operating systems.

> I want my wife and I to be able to make separate edits on our phones when we're in a dead zone.

Files do this.

reply
necovek
3 hours ago
[-]
I agree that files solve some rudimentary cases, but they do not even allow simple conflict resolution. E.g. with compressed files, including container formats like OpenOffice documents (text files in a ZIP archive IIRC), it might be simple to apply changes from two sides if they touch distant parts, but syncing full files simply barfs.

Note that this does not even need two users: I hit this problem with a desktop and laptop and self-hosted NextCloud myself.

In general, what would help is a filesystem that stored both the raw data (to fall back to) and a per-format event log, and maybe even app-specific events (imagine a PNG changes: we could have the change recorded as raw bytes, as a generic bitmap operation like "modify pixels at x,y to ...", and as an app-specific log entry like "gimp: apply sharpen filter on polygon area ...").

This would allow the other side to attempt the smartest sync it can (if it has a compatible version of GIMP, it could decide to apply the filter; otherwise it could fall back to raw pixel changes if there are no conflicts, and then fall back to full file content reconciliation).

Just like MIME handlers get registered, if filesystems provided such change logs, some sync systems could get very advanced with this kind of support from the filesystem.
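
A sketch of what one record in such a layered change log might look like (all names are hypothetical; TypeScript is just for illustration):

    // Hypothetical record in a layered, per-format change log as described
    // above. Every change is stored at several levels of semantic richness;
    // a syncing peer applies the richest layer it understands.
    type ChangeRecord = {
      file: string;                                  // path the change applies to
      layers: {
        raw?: { offset: number; bytes: Uint8Array }; // byte-level fallback, always recordable
        generic?: { op: string };                    // format-aware op, e.g. "modify pixels at x,y to ..."
        app?: { name: string; op: string };          // app-specific, e.g. gimp: "apply sharpen filter ..."
      };
    };

    // Fallback strategy from the comment above: richest layer first.
    function apply(change: ChangeRecord, canRun: (app: string) => boolean): void {
      if (change.layers.app && canRun(change.layers.app.name)) {
        // replay the high-level operation in a compatible app version
      } else if (change.layers.generic) {
        // apply the generic, format-aware operation
      } else {
        // last resort: splice raw bytes, or reconcile full file contents
      }
    }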

reply
swsieber
9 hours ago
[-]
Files come with certain restrictions, which don't matter for certain types of applications. But for others they do.

I think it boils down to provenance and concurrency. If we edit the same line of a file, that's a merge conflict, when it really should be simple and something I shouldn't have to bother with. And when we do make the same line edit, I'd love to have provenance on that data.

Granted, those aren't exactly local-first things, but I think there will be apps that want all of that.

reply
hidelooktropic
11 hours ago
[-]
I mostly agree with this, but sometimes it's not that simple in practice. I created an app that did exactly this and it resulted in inevitable file conflicts because I couldn't negotiate between the clients when a file should be allowed for editing.
reply
GMoromisato
17 hours ago
[-]
Personally, I disagree with this approach. This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).

The problems with closed-source software (lack of control, lack of reliability) were solved with a new business model: open source development, which came with new licenses and new ways of getting revenue (maintenance contracts instead of license fees).

In the same way, we need a business model solution to cloud-vendor ills.

Imagine we create standard contracts/licenses that define rights so that users can be confident of their relationship with cloud-vendors. Over time, maybe users would only deal with vendors that had these licenses. The rights would be something like:

* End-of-life contracts: cloud-vendors should contractually spell out what happens if they can't afford to keep the servers running.

* Data portability guarantees: Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.

* Data privacy transparency: Vendors must track/audit all data access and report to the user who/what read their data and when.

I'm sure you can think of a dozen other clauses.

The tricky part is, of course, adoption. What's in it for the cloud-vendors? Why would they adopt this? The major fear of cloud-vendors is, I think, churn. If you're paying lots of money to get people to try your service, you have to make sure they don't churn out, or you'll lose money. Maybe these contracts come only with annual subscription terms. Or maybe the appeal of these contracts is enough for vendors to charge more.

reply
AnthonyMouse
15 hours ago
[-]
> This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).

Whenever it's possible to solve a business problem or political problem with a technical solution, that's usually a strong approach, because those problems are caused by an adversarial entity and the technical solution is to eliminate the adversarial entity's ability to defect.

Encryption is a great example of this if you are going to use a cloud service. Trying to protect your data with privacy policies and bureaucratic rules is a fool's errand, because there are too many perverse incentives. The data is valuable, neither the customer nor the government can easily tell if the company is selling it behind their backs, it's also hard to tell if the provider has cheaped out on security until it's too late, etc.

But if it's encrypted on the client device and you can prove with math that the server has no access to the plaintext, you don't have to worry about any of that.

The trouble is sometimes you want the server to process the data and not just store it, and then the technical solution becomes, use your own servers.
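
For illustration, the client-side encryption step can be as small as the following sketch using the browser's standard Web Crypto API (key management is deliberately elided):

    // Sketch: encrypt on the client with AES-GCM via the Web Crypto API,
    // so the server only ever stores ciphertext it cannot read.
    // Key derivation, wrapping, and rotation are elided here.
    async function encryptForUpload(plaintext: Uint8Array, key: CryptoKey) {
      const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per message
      const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plaintext);
      return { iv, ciphertext: new Uint8Array(ciphertext) }; // server stores both, learns nothing
    }

    async function decryptAfterDownload(iv: Uint8Array, ciphertext: Uint8Array, key: CryptoKey) {
      return new Uint8Array(await crypto.subtle.decrypt({ name: "AES-GCM", iv }, key, ciphertext));
    }

    // The key is generated on-device and never leaves it:
    // const key = await crypto.subtle.generateKey(
    //   { name: "AES-GCM", length: 256 }, false, ["encrypt", "decrypt"]);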

reply
GMoromisato
14 hours ago
[-]
I 100% agree, actually. If there were a technical solution, then that's usually a better approach.

For something like data portability--being able to take my data to a different provider--that probably requires a technical solution.

But other problems, like enshittification, can't be solved technically. How do you technically prevent a cloud vendor from changing their pricing?

And you're right that the solution space is constrained by technical limits. If you want to share data with another user, you either need to trust a central authority or use a distributed protocol like blockchain. The former means you need to trust the central provider; the latter means you have to do your own key-management (how much money has been lost by people forgetting the keys to their wallet?)

There is no technical solution that gets you all the benefits of central plus all the benefits of local-first. There will always be trade-offs.

reply
solidsnack9000
8 hours ago
[-]
This would make cloud vendors kind of like banks. The cloud vendor is holding a kind of property for the user in the user's account. The user would have clearly defined rights to that property, and the legal ability to call this property back to themselves from the account.

This calling back might amount to taking delivery. In a banking context, that is where the user takes delivery of whatever money and other property is in the account. In the cloud vendor case, this would be the user receiving a big Zip file with all the contents of the account.

Taking delivery is not always practical and is also not always desirable. Another option in a financial context is transferring accounts from one vendor to another: this can take the form of wiring money or sometimes involves a specialized transfer process. Transferring the account is probably way more useful for many cloud services.

This leads us to a hard thing about these services, though: portability. Say we delineate a clear property interest for users in their cloud accounts and we delineate all of their rights. We have some good interests and some good rights; but what does it mean to take delivery of your Facebook friends? What does it mean to transfer your Facebook account from one place to another?

reply
al_borland
16 hours ago
[-]
Does this really solve the problem? Let's say I'm using a cloud provider for some service I enjoy. They have documents that spell out that if they have to close their doors they will give X months of notice and allow for a data export. Ok, great. Now they decide to shut their doors and honor those agreements. What am I left with? A giant JSON file that is effectively useless unless I decide to write my own app, or some nice stranger does? The thought is there, it's better than nothing, but it's not as good as having a local app that will keep running, potentially for years or decades, after the company shuts their doors or drops support.
reply
GMoromisato
14 hours ago
[-]
Data portability is, I think, useful even before the service shuts down. If I'm using some Google cloud-service and I can easily move all my data to a competing service, then there will be competition for my business.

What if cloud platforms were more like brokerage firms? I can move my stocks from UBS to Fidelity by filling out a few forms and everything moves (somewhat) seamlessly.

My data should be the same way. I should be able to move all my data out of Google and move it to Microsoft with a few clicks without losing any documents or even my folder hierarchy. [Disclaimer: Maybe this is possible already and I'm just out of the loop. If so, though, extend to all SaaS vendors and all data.]

reply
al_borland
13 hours ago
[-]
This mainly just requires the ability to export, and standard formats. For generic file storage, emails, contacts, calendars, etc, this is largely possible already. Though there are minor incompatibilities based on various implementations or customizations on top of the standard.

The big problem comes into play for new, or more custom types of applications. It takes a while for something to become ubiquitous enough that standard formats are developed to support them.

reply
hodgesrm
17 hours ago
[-]
> * Data portability guarantees: Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.

This is not practical for data of any size. Prod migrations to a new database take months or even years if you want things to go smoothly. In a crisis you can do it in weeks, but it can be really ugly. That applies even when moving between the same version of an open source database, because there's a lot of variation between the cloud services themselves.

The best solution is to have the data in your own environment to begin with and just unplug. It's possible with bring-your-own-cloud management combined with open source.

My company operates a BYOC data product which means I have an economic interest in this approach. On the other hand I've seen it work, so I know it's possible.

reply
GMoromisato
17 hours ago
[-]
I'd love to know more about BYOC. Does that apply to the raw data (e.g., the database lives inside the enterprise) or the entire application stack (e.g., the enterprise is effectively self-hosting the cloud).

It seems like you'd need the latter to truly be immune to cloud-vendor problems. [But I may not understand how it works.]

reply
WarOnPrivacy
16 hours ago
[-]
> End-of-life contracts: cloud-vendors should contractually spell out what happens if they can't afford to keep the servers running.

I'm trying to imagine how this would be enforced when a company shutters and its principals walk away.

reply
necovek
2 hours ago
[-]
Putting stuff in escrow is usually the way to go: escrow service is paid upfront (say, always for the next 3 months), and that's the time you've got to pull out your data.

My company does that with a few small vendors we've got for the source code we depend on.

reply
GMoromisato
16 hours ago
[-]
It's a good question--I am not a lawyer.

But that's the point of contracts, right? When a company shuts down, the contracts become part of the liabilities. E.g., if the contract says "you must pay each customer $1000 if we shut down" then the customers become creditors in a bankruptcy proceeding. It doesn't guarantee that they get all (or any) money, but their interests are negotiated by the bankruptcy judge.

Similarly, I can imagine a contract that says, "if the company shuts down, all our software becomes open source." Again, this would be managed by a bankruptcy judge who would mandate a release instead of allowing the creditors to gain the IP.

Another possibility is for the company to create a legal trust that is funded to keep the servers running (at a minimal level) for some specified amount of time.

reply
bigfatkitten
10 hours ago
[-]
No, not at all.

The entire point of Chapter 11 (and similar bankruptcy legislation internationally) is to allow companies to get out of contracts, so that they can restructure the business to hopefully continue on as a going concern.

reply
WarOnPrivacy
16 hours ago
[-]
> When a company shuts down, the contracts become part of the liabilities.

The asset in the contract is their customer's data; it is becoming stale by the minute. It could be residing in debtor-owned hardware and/or in data centers that are no longer getting their bills paid.

It takes time to get a trustee assigned and I think we need an immediate response - like same day. (NAL but prep'd 7s & 13s)

reply
WarOnPrivacy
16 hours ago
[-]
(cont. thinking...) One possibility. A 3rd party manages a continually updating data escrow. It'd add some expense and complexity to the going concern.
reply
prmoustache
16 hours ago
[-]
> Personally, I disagree with this approach. This is trying to solve a business problem (I can't trust cloud-providers)

It is not only a business problem. I stay away from cloud-based services not only because of the subscription model, but also because I want my data to be safe.

When you send data to a cloud service, and that data is not encrypted locally before being sent to the cloud (a rare feature), it is not a question of if but when that data will be pwned.

reply
HappMacDonald
10 hours ago
[-]
"Trust about whether or not another company will maintain confidentiality" still sounds like a business problem to me (or at least one valid way of perceiving the problem)

And the biggest advantage I see of this perspective over the "technical problem" perspective is that assigning responsibility completely covers the problem space, while "hope that some clever math formula can magic the problem away" does not.

reply
necovek
2 hours ago
[-]
Here at HN, I think most people see it differently (me included): having clear math proof of "confidentiality" is usually seen as both cheaper and more trustworthy.

Yes, there might be a breakthrough or a bug in encryption, and unless you've been targeted, you can respond. But we've seen and experienced too many breakdowns in human character (employees spying on customers, stealing data...), government policies and company behaviour to trust the complexity and cost (lawyers) of enforcing accountability through policy.

In general, you do need both, but if you've got to pick one, the technical solution is usually more appealing to engineers.

reply
bigfatkitten
10 hours ago
[-]
I have spent the last decade or so working in digital forensics and incident response for a series of well-known SaaS companies.

The experience has made me a big fan of self hosting.

reply
samwillis
16 hours ago
[-]
> This is trying to solve a business problem (I can't trust cloud-providers) with a technical trade-off (avoid centralized architecture).

I don't think that's quite correct. I think the authors fully acknowledge that the business case for local-first is not completely solved and is a closely related problem. These issues need both a business and a technical solution, and the paper proposes a set of characteristics of what a solution could look like.

It's also incorrect to suggest that local-first is an argument for decentralisation - Martin Kleppmann has explicitly stated that he doesn't think decentralised tech solves these issues in a way that could become mass market. He is a proponent of centralised standardised sync engines that enable the ideals of local-first. See his talk from Local-first conf last year: https://youtu.be/NMq0vncHJvU?si=ilsQqIAncq0sBW95

reply
GMoromisato
15 hours ago
[-]
I'm sure I'm missing a lot, but the paper is proposing CRDTs (Conflict-free Replicated Data Types) as the way to get all seven checkmarks. That is fundamentally a distributed solution, not a centralized one (since you don't need CRDTs if you have a central server).

And while they spend a lot of time on CRDTs as a technical solution, I didn't see any suggestions for business model solutions.

In fact, if we had a business model solution--particularly one where your data is not tied to a specific cloud-vendor--then decentralization would not be needed.

I get that they are trying to solve multiple problems with CRDTs (such as latency and offline support), but in my experience (we did this with Groove in the early 2000s) the trade-offs are too big for average users.

Tech has improved since then, of course, so maybe it will work this time.

reply
satvikpendem
12 hours ago
[-]
> This is trying to solve a business problem (I can't trust cloud-providers)

Not necessarily. I like local-first due to robust syncing via CRDTs, not because I somehow want to avoid cloud providers.

reply
maccard
16 hours ago
[-]
> Vendors must spell out how data gets migrated out, and all formats must be either open or (at minimum) fully documented.

Anecdotally, I've never worked anywhere where the data formats are documented in any way other than a schema in code.

reply
mumbisChungo
16 hours ago
[-]
A good contract can help you to seek some restitution if wrongdoing is done and you become aware of it and you can prove it. It won't mechanically prevent the wrongdoing from happening.
reply
HappMacDonald
10 hours ago
[-]
It can also help to align the incentives of multiple parties to actually care about the same goals.

"Mechanically preventing wrongdoing from happening" can be a bit of a Shangri-La. What Tech can mechanically do is increase the cost of wrongdoing, or temporarily deflect attempts towards easier targets. But that by definition cannot "solve the problem for everyone" as there will always be a lowest hanging fruit remaining somewhere.

What contracts can do is help to reduce the demand for wrongdoing.

reply
Habgdnv
17 hours ago
[-]
Currently there are laws but not for hosting. Look at the contract of Steam for example or Ubisoft, or anything else - Q: What happens to your game collection if we shut down our servers? A: You own nothing and lose everything, GG!

It is like how we decided we must protect users' privacy from greedy websites, so we made the bad ones spell out that they use cookies to spy on users - and the result is what we have now with the banners.

reply
GMoromisato
17 hours ago
[-]
I agree with you! And your point about cookie banners underlines that we can't just rely on regulation (because companies are so good at subverting regulations, or outright lobbying their way out of them).

Just as with the open source movement, there needs to be a business model (and don't forget that OSS is a business model, not a technology) that competes with the old way of doing things.

Getting that new business model to work is the hard part, but we did it once with open source and I think we can do it again with cloud infrastructure. But I don't think local-first is the answer--that's just a dead end because normal users will never go with it.

reply
sirjaz
10 hours ago
[-]
I've found people want local software and access. This is a major reason why people like mobile more now than desktops, outside of the obvious benefit of having it in their pocket. A mobile app gives you more of a private feel than going to a website and entering your info. In addition, to an extent it is kept local-first, due to sync issues.
reply
__MatrixMan__
10 hours ago
[-]
Is it trying to solve a business problem? I think it's trying to solve a more general problem which has nothing to do with business.

It's ok to just solve the problem and let the businesses fail. Predation is healthy for the herd. Capitalism finds a way, we don't have to protect it.

reply
montereynack
18 hours ago
[-]
Cool to see principles behind this, although I think it's definitely geared towards the consumer space. Shameless self plug, but related: we're doing this for industrial assets/industrial data currently (www.sentineldevices.com), where the entire training, analysis and decision-making process happens on customer equipment. We don't even have any servers they can send data to; our model is explicitly geared around everything happening on-device (so I found the network principle the article discussed really interesting). This is to support use cases in SCADA/industrial automation where you just can't bring data to the outside world. There's imo a huge customer base and set of use cases that are just casually ignored by data/AI companies, because actually providing a service where the customer/user is, is too hard, and they'd prefer to have the data come to them while keeping vendor lock-in. The funny part is, in discussions with customers we actually have to lean in and be very clear on the "no, this is local, there's no external connectivity" piece, because they really don't hear that anywhere, and sometimes we have to walk them through it step by step to help them understand that everything is happening locally. It also tends to break the brains of software vendors. I hope local-first software starts taking hold more in the consumer space so we can see people start getting used to it in the industrial space.
reply
spauldo
15 hours ago
[-]
It doesn't help that all the SCADA vendors are jumping on the cloud wagon and trying to push us all in that direction. "Run your factory from your smartphone!" Great, now I'm one zero-day away from some script kiddie playing around with my pumps.
reply
codybontecou
18 hours ago
[-]
An exciting space and I'm glad you and your team are working in it.

I looked over your careers page and see all of your positions are non-remote. Is this because the limitations of working on local-first software require you to be in person? Or is this primarily a management issue?

reply
ciju
2 hours ago
[-]
We have been building a local-first browser app (PWA) for personal finance, based on double-entry accounting. https://finbodhi.com/

We do use online services like Firebase for auth, and some service to fetch commodity prices etc., but the rest of the data is stored in browser storage (SQLite) and backed up to local disk (and soon Dropbox). We also sync data across devices, always encrypting data in transit.

I think it's the way to go, for most personal data applications.

reply
davepeck
18 hours ago
[-]
In theory, I love the local-first mode of building. It aligns well with “small tech” philosophy where privacy and data ownership are fundamental.

In practice, it’s hard! You’re effectively responsible for building a sync engine, handling conflict resolution, managing schema migration, etc.

This said, tools for local-first software development seem to have improved in the past couple years. I keep my eye on jazz.tools, electric-sql, and Rocicorp’s Zero. Are there others?

reply
rzzzt
18 hours ago
[-]
CouchDB on the server and PouchDB on the client was an attempt at making such an environment:

- https://couchdb.apache.org/

- https://pouchdb.com/
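
The core of that environment is only a few lines; a minimal sketch assuming the standard PouchDB sync API (the remote URL is hypothetical):

    // A full database on-device (IndexedDB), with live bidirectional
    // replication to a CouchDB server whenever the network is available.
    import PouchDB from "pouchdb";

    const local = new PouchDB("notes");
    const remote = new PouchDB("https://couch.example.com/notes"); // hypothetical server

    // Reads and writes always hit the local DB; replication catches up
    // in the background and resumes automatically after going offline.
    local.sync(remote, { live: true, retry: true })
      .on("change", (info) => console.log("replicated", info.direction))
      .on("error", (err) => console.error("sync error", err));

    local.put({ _id: new Date().toISOString(), text: "works offline" }); // local write, instant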

Also some more pondering on local-first application development from a "few" (~10) years back can be found here: https://unhosted.org/

reply
jkestner
7 hours ago
[-]
Using Couch/Pouch on our current app for this reason. Great to work with. Though we're not supporting offline-first right away (it depends on external services), it's going to help with resilience and provide a future escape valve.
reply
refset
12 hours ago
[-]
Lotus Notes always deserves a mention in these threads too, as 1989's answer to local-first development. CouchDB was heavily inspired by Notes.
reply
sroussey
16 hours ago
[-]
reply
ofrzeta
18 hours ago
[-]
Do you know that website? https://www.localfirst.fm

EDIT: actually I wanted to point to the "landscape" link (in the top menu) but that URL is quite unergonomic.

reply
davepeck
17 hours ago
[-]
No, I didn't know about it -- thank you! (EDIT: and the landscape page has lots of libraries I hadn't run across before. Neat.)
reply
jessmartin
13 hours ago
[-]
One of the authors of the Landscape here. Glad you found it helpful!
reply
swsieber
10 hours ago
[-]
I've been using instantdb in anger for the past month or so for a side project of mine. I'm building a personal budget app.

I should probably write a blog post, but I will say that I investigated PowerSync, ElectricSQL, and Livestore beforehand. I briefly looked at jazz.tools but wanted something a bit more structured.

I'm pretty impressed thus far. I've actually been writing it with Vue and a community library. Permissions were a bit tricky, but once I figured them out it was simple. And I like their magic email login. And I like their dashboard/REPL, but there are a few big changes I would make there to make it less fiddly.

I love that it's open source, and that if I want to, I could self host it.

As for the other options:

- jazz wasn't structured enough

- livestore came off as too fiddly with the event store, but it was appealing. That the dev tools are paywalled was disappointing, but understandable

- electricSQL really only provided half a solution (the read path, not the write model)

- couchDB / pouchDB wasn't structured enough for me, and I wanted better cross-document support than was obvious / baked in.

- did not investigate zero really

reply
zdragnar
18 hours ago
[-]
I think I saw someone point out automerge not long ago:

https://automerge.org/

Rust and JavaScript implementations, a handful of network strategies. It doesn't come with the free or paid offering that jazz.tools does, but it's pretty nice.
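
A minimal sketch of the idea, assuming the current @automerge/automerge JS API (details vary by version): two replicas diverge offline, then merge deterministically.

    // Two replicas edit independently and merge without a server.
    import * as Automerge from "@automerge/automerge";

    type Doc = { notes: string[] };

    let laptop = Automerge.from<Doc>({ notes: [] });
    let phone = Automerge.clone(laptop); // second device starts from the same state

    laptop = Automerge.change(laptop, (d) => d.notes.push("written offline on the laptop"));
    phone = Automerge.change(phone, (d) => d.notes.push("written offline on the phone"));

    // Merging is order-independent: both sides converge to the same doc.
    const merged = Automerge.merge(laptop, phone);
    console.log(merged.notes.length); // 2 - both edits survive, no conflict dialog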

reply
satvikpendem
12 hours ago
[-]
I like https://loro.dev personally, also in Rust and JS. Many such CRDTs are being built in Rust these days.
reply
samwillis
16 hours ago
[-]
Along with the others mentioned, it's worth highlighting Yjs. It's an incredible CRDT toolkit that enables many of the realtime and async collaborative editing experiences you want from local-first software.

https://yjs.dev/
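
As a tiny illustration of the model, two in-memory docs can be wired directly together (this relay pattern mirrors the one in Yjs's own docs; real apps use a provider such as y-websocket or y-webrtc):

    // Updates are compact binary diffs that can travel over any transport.
    import * as Y from "yjs";

    const doc1 = new Y.Doc();
    const doc2 = new Y.Doc();

    // Relay updates both ways; a provider normally does this wiring.
    doc1.on("update", (update: Uint8Array) => Y.applyUpdate(doc2, update));
    doc2.on("update", (update: Uint8Array) => Y.applyUpdate(doc1, update));

    doc1.getText("note").insert(0, "edits on one device ");
    doc2.getText("note").insert(0, "appear on the other: ");

    // Both replicas converge to the same string.
    console.log(doc1.getText("note").toString() === doc2.getText("note").toString()); // true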

reply
thorum
16 hours ago
[-]
I’ve built several apps on yjs and highly recommend it. My only complaint is that storing user data as a CRDT isn’t great for being able to inspect or query the user data server-side (or outside the application). You have to load all the user’s data into memory via the yjs library before you can work with any part of it. There are major benefits to CRDTs but I don’t think this trade-off is worth it for all projects.
reply
3036e4
17 hours ago
[-]
I use local software and sync files using git or sometimes fossil (both work fine on Android with termux, for instance, for stuff I want to access on my phone). I don't host servers or use any special software that requires syncing data in special ways.
reply
jonotime
12 hours ago
[-]
There are a bunch and quite a breadth of different solutions/takes on the problem.

Here is a good recap of the current players. https://www.localfirst.fm/landscape

reply
sgt
16 hours ago
[-]
There's also PowerSync: https://www.powersync.com/

It's also open source and has bindings for Dart, JS, Swift, C#, Kotlin, etc

reply
ochiba
17 hours ago
[-]
This site also has a directory of devtools: https://lofi.so/
reply
ibizaman
18 hours ago
[-]
That’s essentially what I’m trying to make widely available through my projects https://github.com/ibizaman/selfhostblocks and https://github.com/ibizaman/skarabox. Their shared goal is to make self-hosting more approachable to the masses.

It’s based on NixOS to provide as much as possible out of the box and declaratively: https, SSO, LDAP, backups, ZFS w/ snapshots, etc.

It’s a competitor to cloud hosting because it packages Vaultwarden and Nextcloud to store most of your data. It does provide more services than that though, home assistant for example.

It’s a competitor to YUNoHost but IMO better (or aims to be) because you can use the building blocks provided by SelfHostBlocks to self-host any packages you want. It’s more of a library than a framework.

It’s a competitor to NAS but better because everything is open source.

It still requires the user to be technical, but I'm working on removing that caveat. One of my goals is to make it installable on your hardware without needing Nix or touching the command line.

reply
pastaheld
15 hours ago
[-]
Love it! I've been thinking about this a lot lately. It's crazy how many great FOSS alternatives are out there to everything – and while they might be relatively easy to install for tech-people ("docker compose up"), they are still out of reach for non-tech people.

Also, so many of these selfhostable apps are web applications with a db, server and frontend, but for a lot of use cases (at least for me personally) you just use it on one machine and don't even need a "hosted" version or any kind of sync to another device. A completely local desktop program would suffice. For example I do personal accounting once a month on my computer - no need to have a web app running 24/7 somewhere else. I want to turn on the program, do my work, and then turn it off. While I can achieve that easily as a developer, most people can't. There seems to be a huge misalignment (for lack of a better word) between the number of high-quality selfhostable FOSS alternatives and the number of people who can actually use them. I think we need more projects like yours, where the goal is to close that gap.

I will definitely try to use selfhostblocks for a few things and try to contribute, keep it up!

reply
ibizaman
12 hours ago
[-]
My guess as to why most apps are now a web UI on top of a DB is that it's easy to "install". SelfHostBlocks is admittedly geared towards a central server serving web apps - or at least apps with a desktop or mobile component but geared towards syncing to a central server.

Feel free to give it a try though, I'd love that! Also feel free to join the Matrix channel if you have any questions or just to get some updates.

reply
pastaheld
2 hours ago
[-]
> My guess as to why most apps are now a web UI on top of a DB is because it’s easy to “install”.

That plus web dev is trendy and everybody is learning it. I wouldn't know how to code a proper desktop app right now, I've not done it in years. I don't want to criticize that or the centralization aspect – there will still be ways to put these centralized things on a PC for example.

reply
virgoerns
15 hours ago
[-]
I love that you include hledger! It's an amazing piece of software, even if a little obscure for people unfamiliar with plain-text accounting!
reply
ibizaman
12 hours ago
[-]
I love that application. I plan to make some improvements to the web UI. I’d love to have multiple tabs with saved reports. That would allow my spouse to use it quite easily. I’ll be adding that at some point.
reply
voat
17 hours ago
[-]
Looks really neat! Thanks for building this
reply
ibizaman
12 hours ago
[-]
Thank you for the kind words :)
reply
kristianp
54 minutes ago
[-]
People's personal computers, and even their tablets and phones, are so powerful that they can fulfill most use cases (except AI), especially if the application is reasonably efficient.
reply
kristianc
14 hours ago
[-]
The old model (a one-time purchase, local install, full user control) worked because devs could sell boxed software at scale. Now, that model collapses unless someone's willing to either undervalue their own labour or treat the software like a public good, absorbing the long tail of maintenance with no recurring income.

The article presents it as though subscription software is something which has been sneaked in on us. But users today expect things like instant updates, sync across devices, collaboration, and constant bug fixes and patches - none of which come easily if you're only willing to pay for the system once.

reply
OjotCewIo
14 hours ago
[-]
> as though subscription software is something which has been sneaked in on us

Oh but it has (IMO).

> users today expect things like instant updates [...] constant bug fixes and patches

Nah, this is in reverse. With boxed software, the developer had to deliver an essentially bug-free product. Now, with easy updates technically possible, the developers have gone complacent, and deliver shit. That is why users expect bugfixes instantly. (And any enlightened user abhors unrequested features, as there are no features without regressions, and who wants regressions in any serious application?) The only tolerable online updates are security fixes.

> sync across devices, collaboration

This is a valid expectation, but its execution has been a train-wreck. Research, design and implementation should start with end-to-end encryption; the network architecture should be peer-to-peer (mesh, not centralized). What do we get instead? More centralization of control than ever, and less privacy and ownership than ever.

reply
kristianc
13 hours ago
[-]
Generally that's not how I remember it - third-party software on the Mac, at least, got some kind of beachhead because Windows software was full of bugs, crashes, corrupted files, drivers that never worked, and patch CDs mailed to enterprise customers like they were firmware apologies. "Own your own software", taken to its logical endpoint, was a shareware nightmare.
reply
threetonesun
12 hours ago
[-]
The old model of boxed updates is still in use by some companies today; JetBrains comes to mind. In either case, you tuck major new features into a new major version or rolling yearly release and sell the customer a license that gets a year of updates. In a similar vein, many apps I use on my Mac have monthly subscriptions, but cancelling them limits their use to essentially one device; it doesn't remove the application or my access to the data.
reply
flkenosad
13 hours ago
[-]
> treat the software like a public good, absorbing the long tail of maintenance with no recurring income.

Good point. Governments would do this if they really worked "for the people".

reply
mazzystar
7 hours ago
[-]
This reminds me of my own painful story: I once made a local photo search app called Queryable that ported OpenAI's CLIP model to iPhone, letting you search your photos with queries like "a black cat sitting on a sofa."

Since it needed to access users' local photo libraries, I didn't want the app to connect to the internet under any circumstances. So I made it a paid app instead of the usual free+in-app purchases model, since the latter requires calling StoreKit which goes online. But because the app had to run the CLIP model, it would crash on lower-performance phones like the iPhone X. Users who paid for it couldn't use it and felt scammed, leading to tons of one-star reviews and angry complaints about their photos being stolen. Eventually I decided to open-source the app, though it never brought me much revenue anyway.

Two years later, Apple started announcing they'd be integrating this exact feature into Apple Intelligence : )

reply
replwoacause
6 hours ago
[-]
Couldn’t you have just restricted the app to being installable on only certain iPhone models?
reply
mazzystar
10 minutes ago
[-]
Apple doesn't allow developers to target specific device models, presumably to prevent discrimination. However, you have two options: 1. Set a minimum iOS version requirement, or 2. Restrict to devices with A12 chips or later. But neither approach can exclude certain problematic device models.
reply
tombert
13 hours ago
[-]
I recently started using Typst instead of Pandoc->LaTeX.

I held off on playing with Typst for years because I was under the (incorrect) impression that the only way to use it was with their web editor. I'm sure that their editor is completely fine, but I am pretty entrenched in Neovim and Pandoc had been serving me well.

Once I found out that Typst has a command line version that I can use directly, it became more appealing, because I'm pretty sick of cloud shit.

reply
bhauer
18 hours ago
[-]
I've been wanting a computing model I call PAO [1] for a long time. PAO would run personal application "servers" and connect dynamic clients across all devices. PAO is centralized, but centralized per user, and operating at their discretion. It avoids synchronization, complex concurrent data structures, and many other problems associated with alternatives. Its weakness is a need for always-on networks, but that complication seems ever easier to accept as omnipresent networks become realistic.

[1] https://tiamat.tsotech.com/pao (2012)

reply
2color
15 hours ago
[-]
It's a very exciting moment for this movement. A lot of the research and tech for local-first is nearing the point that it's mature, efficient, and packaged into well designed APIs.

Moreover, local-first, at least in theory, enables less infrastructure, which could reignite new indie open source software with less vendor lock-in.

However, despite all my excitement about embracing these ideas in the pursuit of better software, there's one hurdle preventing more widespread adoption amongst developers, and that is the Web platform.

The Web platform lacks building blocks for distributing hashed and/or signed software that isn't tied to origins. In other words, it's hard to decouple web apps from the same-origin model, which requires you to set up a domain and serve requests dynamically.

Service Workers and PWAs do help a bit in terms of building offline experiences, but if you want users to download once and upgrade when they want (and the internet is available), you can't use the Web. So you end up breaking out of the browser and using Web technologies with better OS integration, like Electron, React Native, Tauri et al (the https://userandagents.com/ community is doing some cool experiments in this space).
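
For the offline half, at least, the primitives are there. A minimal cache-first Service Worker sketch (the cache name and asset list are illustrative, not from any particular project):

  // sw.js: cache the app shell at install time, then serve cache-first.
  const CACHE = 'app-v1';
  const ASSETS = ['/', '/index.html', '/app.js', '/app.css'];

  self.addEventListener('install', (event) => {
    // "Download once": fetch and cache every asset when the worker installs.
    event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(ASSETS)));
  });

  self.addEventListener('fetch', (event) => {
    // Answer from the cache; hit the network only on a miss.
    event.respondWith(
      caches.match(event.request).then((hit) => hit || fetch(event.request))
    );
  });

The part the platform still doesn't give you is user-controlled upgrades: the browser, not the user, decides when the worker and its assets get refetched.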

reply
sirjaz
10 hours ago
[-]
We need to get back to apps rather than webapps. The hardware compatibility issues of the past are basically gone, and there are three major OS types, two of which can run each other's apps.
reply
HappMacDonald
10 hours ago
[-]
Perhaps, but then how will they be authored? In what language and with what GUI toolkit?

I view everyone flocking around Electron as proof of a failure on this front.

reply
owebmaster
7 hours ago
[-]
Pretty much the opposite. Local-first makes web apps feel just like apps, without the security risks of native apps.
reply
dtkav
16 hours ago
[-]
We need a term for a viable business model to pair with local-first tech.

I've been working on Relay [0] (realtime multiplayer for Obsidian) and we're trying to follow tailscale's approach by separating out the compute/document sync from our auth control plane.

This means that users still subscribe to our service (and help fund development) and do authn/authz through our service, but we can keep their data entirely private (we can't access it).

[0] https://relay.md

reply
jessmartin
13 hours ago
[-]
Relay user here! It’s great. Quite reliable for an early product.
reply
dtkav
13 hours ago
[-]
Thanks for the kind words
reply
trinsic2
12 hours ago
[-]
Are you requiring a Google account for file/folder-based auth on a per-user basis for a vault? Not too keen on using a 3rd party for this kind of thing.
reply
dtkav
6 hours ago
[-]
For our free/individual plan we do use OAuth2 providers (currently only Google is enabled, but considering others), and can support other methods for larger teams (like oidc).

Originally the idea was to keep everything within the Obsidian UI so things like username/password didn't make sense (no password managers in Obsidian).

We initiate the OAuth2 login flow from within Obsidian. I guess we could add an extra click that takes you to our website first and then support more auth methods from there. I don't really want it to feel like a web app though.

I'd love to hear your take. Which login method do you think is both simple and could be coherently used within Obsidian on all platforms?

reply
alun
2 hours ago
[-]
I love this idea of local-first software, but from a business point of view there's unfortunately no current incentive to adopt it, since it's nowhere near as profitable as SaaS. That, in my opinion, is the biggest bottleneck to worldwide adoption right now.
reply
cheema33
17 hours ago
[-]
The primary challenge with building local first software is the sync layer. The current 3rd party offerings are not mature. And people have been working on these for a few years. Electric SQL comes to mind.
reply
owebmaster
7 hours ago
[-]
As a local-first developer, I'd say the biggest challenge is p2p. Or more specifically, NAT traversal and the need for a TURN server.
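
That dependency is visible right in the WebRTC API: STUN is enough when the NATs cooperate, but strict/symmetric NATs force traffic through a TURN relay that somebody has to run and pay for. A sketch (server URLs and credentials are placeholders):

  const pc = new RTCPeerConnection({
    iceServers: [
      // STUN: lets each peer discover its public address.
      { urls: 'stun:stun.example.com:3478' },
      // TURN: relays traffic when direct traversal fails.
      {
        urls: 'turn:turn.example.com:3478',
        username: 'demo-user',
        credential: 'demo-secret',
      },
    ],
  });
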
reply
chrisweekly
15 hours ago
[-]
> "we have gone further than other projects down the path towards production-ready local-first applications based on CRDTs"

This seems like a bold claim, but IMHO Ink & Switch have earned their solid reputation and it wouldn't surprise me if it's true. I agree w/ their analysis and am philosophically aligned w/ their user-centric worldview. So who's going to build "Firebase for CRDTs"?

reply
packetlost
14 hours ago
[-]
> Firebase for CRDTs

Do you actually need anything special for CRDTs over a normal database? My understanding is that the actual CRDT part is done "client side".
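
That reading suggests the server can stay completely dumb: an append-only log of opaque update blobs, with all merging done client-side. A minimal sketch of such a backend, assuming Express (route shapes and names are illustrative, not from any specific product):

  import express from 'express';

  // An append-only log of opaque CRDT updates; the server never parses,
  // merges, or understands the payloads. In-memory for illustration only.
  type Row = { docId: string; seq: number; bytes: Buffer };
  const log: Row[] = [];
  let seq = 0;

  const app = express();
  app.use(express.raw({ type: 'application/octet-stream' }));

  // Clients push binary updates.
  app.post('/docs/:docId/updates', (req, res) => {
    log.push({ docId: req.params.docId, seq: ++seq, bytes: req.body });
    res.json({ seq });
  });

  // Clients pull everything after the last sequence number they saw.
  app.get('/docs/:docId/updates', (req, res) => {
    const after = Number(req.query.after ?? 0);
    const rows = log.filter((r) => r.docId === req.params.docId && r.seq > after);
    res.json(rows.map((r) => ({ seq: r.seq, bytes: r.bytes.toString('base64') })));
  });

  app.listen(3000);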

reply
chrisweekly
14 hours ago
[-]
I was just referring to the posted article's assertion that "Firebase for CRDTs" is a huge opportunity. I think I agree w the authors that a well-architected CRDT solution for local-first apps requires capabilities not currently provided by Firebase or any other vendor. But I'm no expert.
reply
Incipient
18 hours ago
[-]
The data part aside, and specifically on the platform/functionality side: these cloud/large products unfortunately do offer more powerful/advanced features, or convenience. Be it cloud multi-device functionality that makes moving around and collaborating seamless, or enterprise products like Snowflake and Fabric that offer all sorts of capabilities over a standard MSSQL DB.

I'm personally very against vendor lock in, but there is some value to them.

reply
hemant6488
15 hours ago
[-]
I've been building exactly this with SoundLeaf [0] - an iOS client for the excellent open-source Audiobookshelf server. No data collection, no third-party servers, just your audiobooks syncing directly with your own instance.

The user-friendliness challenge is real though. Setting up Audiobookshelf [1] is more work than "just sign up," but once you have it running, the local-first client becomes much cleaner to build. No user accounts, no subscription billing, no scaling concerns. Simple pricing too: buy once, own forever. No monthly fees to access your own audiobooks.

[0] https://soundleafapp.com

[1] https://github.com/advplyr/audiobookshelf

reply
outlore
17 hours ago
[-]
What are the top web local-first frameworks worth checking out these days? I've heard of LiveStore, TanStack DB with Electric, and Zero. Any others that are easy to use and flexible? Use case is multiplayer apps and maybe games. Thanks!
reply
goodthink
8 hours ago
[-]
multisynq.io (formerly croquet.io), an Alan Kay team project. Dead simple. Synchronized execution; synchronized data comes along for free.
reply
arendtio
13 hours ago
[-]
Regarding the no-spinners goal: I think it is the wrong approach to argue that just because you have the data locally, you don't need any spinners.

Whether you need a spinner or not should be decided by the User Experience (e.g., when the user has to wait for more than 100ms, show a spinner), and not by the location of the data. I am a big fan of local-first apps and enjoy building them myself. However, sometimes your app takes a moment to load. With local-first, you eliminate the network as a source of delays, but there are other factors as well, such as large data sets or complex algorithms.

For example, when you have project planning software and want to plan 100 work packages with multiple resource combinations in an optimal way, this can take some time depending on the algorithm. In that case, a spinner or a progress bar is a good thing.
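
One way to encode that rule: arm a timer and only reveal the spinner if the work outlasts the threshold, so fast local operations never flash one. A minimal sketch (the #spinner element and the helper name are assumptions, not from the article):

  // Assumes an element like <div id="spinner" hidden>...</div> in the page.
  const spinner = document.getElementById('spinner')!;

  async function withSpinner<T>(work: Promise<T>, thresholdMs = 100): Promise<T> {
    const timer = setTimeout(() => { spinner.hidden = false; }, thresholdMs);
    try {
      return await work;
    } finally {
      clearTimeout(timer);   // never fires if the work finished in time
      spinner.hidden = true;
    }
  }

  // Usage: a 10 ms local query shows nothing; a long scheduling run
  // gets its progress indicator after 100 ms.
  // const plan = await withSpinner(computePlan(workPackages));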

reply
d1sxeyes
12 hours ago
[-]
Agreed. No loading spinners is a good goal, but processing spinners might be unavoidable.
reply
samtho
12 hours ago
[-]
I didn’t get the impression that the author is advocating for removing spinners as a UI concept; rather, it’s just being used as shorthand for “you should not need to send and load the data to and from elsewhere while you are working.”
reply
arendtio
3 hours ago
[-]
Agreed, my comment was meant to provoke exactly that conclusion ;-)
reply
JFingleton
12 hours ago
[-]
A properly designed app would use multithreading to move any long-running jobs into the background, allowing the user to carry on with other tasks.

Spinners should not exist in a local-first app.
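
A sketch of that pattern: push the heavy computation into a Web Worker and keep the main thread free (planner.js, optimize, renderPlan, and workPackages are all hypothetical names):

  // main.ts: hand the long-running job to a Worker so the UI stays live.
  declare function renderPlan(plan: unknown): void;   // hypothetical UI hook
  declare const workPackages: unknown[];              // hypothetical input data

  const worker = new Worker('planner.js');
  worker.onmessage = (e) => renderPlan(e.data);       // update UI when done
  worker.postMessage({ packages: workPackages });     // kick off the job

  // planner.js (separate file, runs off the main thread):
  //   self.onmessage = (e) => {
  //     const plan = optimize(e.data.packages);  // the expensive part
  //     self.postMessage(plan);
  //   };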

reply
arendtio
3 hours ago
[-]
You are aware that 'local-first' does not mean 'no-network'. Having a sync mechanism that runs in the background without user notification can be quite disconcerting.

I mean, I did it, I built an app with a transparent background sync. Then I added a special page, 'sync center'.

In reality, mobile devices don't always have perfect network connections. Therefore, when the user is unsure whether the device is in sync or if the sync is in progress but encounters an obstacle, they might perceive the app as unreliable.

Banning spinners is dogmatic, not user-centric.

reply
drpixie
9 hours ago
[-]
Remember when the justification for cloud was "Your work is not trapped on one device". Well, turns out your cloud data is trapped on one device, AND it's not under your control.
reply
coffeecoders
15 hours ago
[-]
Lately, I have been following this approach and moving towards local-first software. I like simple software with barebones features.

- Password manager: KeePassXC

- Notes: Logseq

- Analytics: Plausible

- Media: Jellyfin

- Uptime monitoring: Uptime Kuma

- Finance tracker: Actual Budget and the like are too heavy, so I built this: https://github.com/neberej/freemycash/

- Search: Whoogle is kinda dead. Need an alternative.

reply
sunshine-o
14 hours ago
[-]
Most of that stuff has been very much over-engineered over the last two decades.

The backend for my personal notes, tasks, bookmarks, calendar and feeds is just files in directories, synced with Syncthing across devices.

I ended up there after going from one app to another and getting tired of all this.

It is self-hosted with no server backend (beyond an optional Syncthing instance on a NAS or VPS). It is very reliable and works without an Internet connection.

I could have put everything in sqlite too and sync it one way or another, but it seemed already too complicated for my requirements.

I can't share it beyond my close relatives but I had the same problem with people using Google or Microsoft before.

reply
gerdesj
10 hours ago
[-]
Nextcloud with a few add-ons - all open source - gets you feature parity with all of that lot.

NC itself gets you file sync, WebDAV, etc. An add-on gets you the webby version of LibreOffice. You can bolt on AI add-ons to classify and tag your images/photos and, with a bit more effort, your docs too.

It's properly local first.

reply
danjl
16 hours ago
[-]
Goal #2 ("your data is not trapped on a single device") is the hard bit, especially combined with goal #3 ("the network is optional"). For #2 to be true, the network is *not* optional for the developer; it is required. Thus you take on the entire complexity of building a distributed app, especially one without a centralized server, which is particularly difficult even with modern local-first database tools. That greatly increases the complexity of this type of software compared to either traditional desktop apps or cloud apps.
reply
goodthink
8 hours ago
[-]
Synchronize execution, not data: https://multisynq.io. Synchronization of the data is implicit. No centralized anything.
reply
Existenceblinks
15 hours ago
[-]
Tried to adopt this last month at work; it failed. E.g. the mentioned Automerge has poor docs https://automerge.org/docs/reference/library_initialization/... which leave a lot of questions open: it seems backend-agnostic, but you have to figure out yourself how to store and how to broadcast.
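
For what it's worth, once you accept that split, the byte-level API is workable. As I understand it (hedged to my reading of @automerge/automerge), "store" means persisting the bytes from save() anywhere, and "broadcast" means shipping changes over any transport you like:

  import * as Automerge from '@automerge/automerge';

  // Storage: a document serializes to plain bytes that can live anywhere
  // (a file, a SQLite blob, object storage, ...).
  let doc = Automerge.change(Automerge.init<any>(), (d) => {
    d.items = ['first'];
  });
  const bytes = Automerge.save(doc);        // persist this Uint8Array
  const restored = Automerge.load<any>(bytes);

  // Broadcast: extract changes, send them over any channel, merge on arrival.
  const changes = Automerge.getAllChanges(doc);
  let remote = Automerge.init<any>();
  remote = Automerge.applyChanges(remote, changes)[0];
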
reply
jes5199
13 hours ago
[-]
yeah, I tried to build a project on Automerge but ended up switching to Yjs; it seems more mature.
reply
jonotime
12 hours ago
[-]
Awesome to see this getting more coverage. I am very interested in local-first and am working on several progressive web apps based around it. One app depends on file sync, not database sync, and the best I have found is remoteStorage.js. It's not perfect, but it's very much the missing piece I was often looking for.
reply
thenthenthen
15 hours ago
[-]
Didn't this already happen? The internet died 20 years ago. Now it is just ‘somewhat’ interconnected intranets with their own local legislation?
reply
sygned
16 hours ago
[-]
I've made a local-first, end-to-end encrypted, auto-syncing bookmark extension that doesn't milk your data in any way. It's 100% private; I don't even use Google Analytics on my website. Some of the reasons why I've put work into this:

  - because I could not find something similar that doesn't milk and own my data
  - to never lose a bookmark again
  - to have my bookmark data encrypted in the cloud
  - to have private history
  - to have some extra time saving features in the extension that are for unknown reason rare to find
  - more learning and experience (it's actually quite complex to build this)
After about 4 years of using it daily on every PC I own, I found that it's a pain for me and my family when it is not installed on a browser. I thought: if it's useful for us, it might be useful for others too! So I decided to make it available by subscription for a small fee to cover the server and other costs. I'm not really into marketing, so almost no one knows it exists. You can find it at markbook.io.
reply
neilv
10 hours ago
[-]
Skimming the article, it seems to touch on a lot of the right points, but the motivating first paragraph seems weak:

> Cloud apps like Google Docs and Trello are popular because they enable real-time collaboration with colleagues, and they make it easy for us to access our work from all of our devices. However, by centralizing data storage on servers, cloud apps also take away ownership and agency from users. If a service shuts down, the software stops functioning, and data created with that software is lost.

"Apple pie might be tasty and nutritious and exactly what you want, but, theoretically, apple pie could burst into flames someday, and take your favorite pie-eating bib with it."

reply
miladyincontrol
15 hours ago
[-]
In a world of owning nothing and paying subscriptions for everything, owning your data and using software that is either yours or libre is 'rebellion' to many a service provider.

It's not that local-first is some sort of cloud diet trend; it should simply be the norm.

reply
OjotCewIo
14 hours ago
[-]
Right. I don't even understand why this article had to be this verbose. It's not like we need to be "convinced" that local is better. Everybody who values privacy and independence knows already. But this stuff is unimplementable -- we suffer from the cloud disease because it's immensely profitable for the cloud providers and cloud-based app providers to enslave us, and to bleed us out. Their whole point is locking us in.

"Sharing models" are totally irrelevant until the concept of self-determination is tolerated by the powerful (and they will never tolerate it). Accessing my data from multiple devices is totally secondary; I don't trust mobile endpoints to access my remote data in the first place.

reply
neon_me
17 hours ago
[-]
100%! Not only local-first. But also private, zero/minimal dependency, open source and environment agnostic!

If there is anyone interested in working on such projects - let's talk! We can't leave our future to greedy surveillance zealots.

reply
velyan
4 hours ago
[-]
That’s why I’m working on https://collate.one - offline AI workspace
reply
bilsbie
9 hours ago
[-]
One compromise could be to host the software but also offer the option of self-hosting.
reply
owebmaster
7 hours ago
[-]
Local-first apps should not need hosting.
reply
alganet
14 hours ago
[-]
Offline-first, now with CRDTs, and a brand new name!
reply
rossant
13 hours ago
[-]
That was published 6 years ago. What's the state of the art of local-first software technology in 2025?
reply
jumploops
18 hours ago
[-]
One thing I’m personally excited about is the democratization of software via LLMs.

Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.

Getting a user to install a local DB and a service to run their app (god forbid, updating said service) is a challenge that's complex even for developers (hence the prevalence of containers).

It will take some time (i.e. pre-training runs), but this is a future I believe is worth fighting for.

reply
satvikpendem
12 hours ago
[-]
> Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points the unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.

Not sure where your experience is coming from but when I asked an LLM, Claude to be more precise, it referred me to local options first, such as SQLite. It didn't consider cloud platforms at all until I had asked, presumably because it can understand local code and data (it can query it directly and get back results) but cannot understand the context of what's in the cloud unless you configure it properly and give it the env variables to query said data.

reply
jumploops
12 hours ago
[-]
What was your prompt?

In my experience it’s great at utilizing local storage and SQLite, if you ask it to.

I just asked the ChatGPT web client (4o, as that’s what most non-developers might default to):

> Can you build me a website for my photos

And it immediately started suggesting Wordpress, Wix, Squarespace, etc.

Specifically, this was section 4 of the various questions it asked me:

> 4. Tech Preference (optional)

> - Do you want this as a static HTML site, WordPress, or built with something like React, Next.js, or Wix/Squarespace?

> - Do you need help hosting it (e.g., using Netlify, Vercel, or shared hosting)?

As a non-programmer, I likely wouldn’t understand half those words, and the section is marked optional.

If I follow the “default path” I’m quickly forking over a credit card and uploading my pictures of dogs/family/rocks to the cloud.

reply
moffkalast
18 hours ago
[-]
Local LLMs are even more amazing in concept, all of the world's knowledge and someone to guide you through learning it without needing anything but electricity (and a hilariously expensive inference rig) to run it.

I would be surprised if in a decade we won't have local models that are an order of magnitude better than current cloud offerings while being smaller and faster, and affordable ASICs to run them. That'll be the first real challenger to the internet's current position as "the" place for everything. The more the web gets enshittified and commercialized and ad-ridden, the more people will flock to this sort of option.

reply
ChiMan
9 hours ago
[-]
The speed alone is sufficient reason for a local-first approach. The latency of any cloud software I've ever used is like constant sand in the gears of thinking. Although taking supplements that slow my thinking (essentially natural downers) does improve my experience with such software, the improved experience comes at the expense of IQ. Basically, you need to be a little slow and dumb for the software to work as intended.

This is nuts. Computers are supposed to enhance and enable thinking, not make you stupid. In this sense, cloud software is possibly the biggest fraud ever perpetrated on the paying, computer-using public.

For the love of God, please bring back my late 1990s and early 2000s brain-boosting computer experience.

reply
nixpulvis
12 hours ago
[-]
AIs like GPT being non-local is one of my biggest issues with them.
reply
hkt
17 hours ago
[-]
Self hosting (which is often adjacent to local-first software) is fine. I've done it for years.

But it is a nightmare when it goes wrong: the conclusion I've reached is that it is out of reach for regular people who don't want the Byzantine support load that can accompany something going wrong. They want turnkey. They want simple. They aren't interested in operating services; they're interested in using them.

The FLOSS model of self-hosting doesn't really offer a reliable way of getting this: most businesses operating this way are undercapitalised and have little hope of ever being any other way. Many are just hobbies. There are a few exceptions, but they're rare, and fundamentally the possibility of needing support still exists.

What is needed, imo, is to leverage the power of centralised, professional operations and development, but to govern it democratically. This means cooperatives where users are active participants in governance alongside employees.

I've done a little work towards this myself, in the form of a not-yet-seen-the-light-of-day project.

What I'd love to see is a set of developers and operators actually getting paid for their work and users getting a better deal in terms of cost, service, and privacy, on their own (aggregate) terms. Honestly, I'd love to be one of them.

Does anyone think this has legs to the same extent as local-first or self hosting? Curious to know people's responses.

reply
ibizaman
14 hours ago
[-]
This is the business model I want to have: I work on a stack of fully open source software and package them in a turn-key server that you own. You can use it on your own for free if you’re knowledgeable and I offer a subscription where I’m the sysadmin of the box you own and that I built for you. I do the maintenance, the updates, etc. There’s no lock-in because you can stop the subscription anytime or even just pick another sysadmin that would know the stack. The only reason you’d keep me around would be that the service I offer is pretty damn good. Would something like that appeal to you?
reply
mxuribe
17 hours ago
[-]
I was about to suggest that a better, more open, and fairer form of capitalism would need to be used as a tool... but then, re-reading your comment - "...leverage the power of centralised, professional operations and development, but to govern it democratically..." - I think you encapsulate better what I meant to convey. :-)

That being said, yes, I do believe *in the near/upcoming future* local-first, self-hosting, and (I will add) more fair open source vendors will work! Well, at least, I hope so! I say that because Europe's recent desire to pivot away from the big U.S. tech companies, and towards more digital sovereignty, lays - in my opinion - the foundation for an ecosystem that could sustain self-hosting, etc. The more that Europe is able to pivot away from big tech, the more possibility exists for more and varied non-big-tech vendors to emerge... and the more that Europe adopts open source, the more the usage and expertise of self-hosting grows... plus, for those who do not know how to, or simply do not wish to, manage services themselves, I think Europe will in time have fostered a vast array of vendors who can provide such open source digital services and get paid a fair cost for providing fair value. ...and, by the way, I say all this as someone biased in favor of open source AS WELL AS being an American. :-)

reply
OjotCewIo
14 hours ago
[-]
> What is needed, imo, is to leverage the power of centralised, professional operations and development, but to govern it democratically. This means cooperatives where users are active participants in governance alongside employees.

Utopia. Unattainable. Self-determination of the individual has been consistently persecuted under all societal arrangements; communism and capitalism equally hate a citizen who wants to remain independent and self-sufficient.

reply
didgetmaster
18 hours ago
[-]
Databases like Postgres can be run locally or as part of some kind of managed service in the cloud. Anyone know of recent stats that show the percentage of databases that are managed locally vs by some cloud service?
reply
lutusp
17 hours ago
[-]
Complete agreement. Here's a brief, practical action plan for Windows users:

  * Download all your data from Microsoft's "OneDrive" cloud storage, which, if not disabled, is the default storage method in a new Windows install.
  * Verify that all your files are now stored locally.
  * Click the gear icon, go to "Settings" -> "Account" -> "Unlink this PC", right-click, "Unlink account".
  * Remove Microsoft's OneDrive app from your system -- full removal is the only way to prevent perpetual harassment and reactivation. Go to "Apps" -> "Apps & features" (or "Installed apps" on Windows 11) -> "Microsoft OneDrive", right-click, "Uninstall."
  * Optional extra step: cancel your Microsoft 365 subscription and install LibreOffice (free, open-source).
Remember this -- cloud storage only has advantages for Microsoft and law enforcement (both of which have a number of easy ways to gain access to your documents, compared to local storage). For a Windows user, cloud storage is the ultimate Dark Pattern.
reply
curtisblaine
14 hours ago
[-]
With CRDT implementations like Yjs, writing your own synchronization engine is trivial: https://greenvitriol.com/posts/sync-engine-for-everyone
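
The gist, as I read the linked post: a Yjs doc emits compact binary updates, and a "sync engine" is just whatever moves those bytes between peers and applies them. A minimal two-doc sketch (the "transport" here is a direct function call; swap in a WebSocket, WebRTC channel, or file):

  import * as Y from 'yjs';

  const docA = new Y.Doc();
  const docB = new Y.Doc();

  // Forward each doc's updates to the other, tagging the transaction
  // origin so an update is never echoed back to its sender.
  docA.on('update', (u: Uint8Array, origin: unknown) => {
    if (origin !== 'B') Y.applyUpdate(docB, u, 'A');
  });
  docB.on('update', (u: Uint8Array, origin: unknown) => {
    if (origin !== 'A') Y.applyUpdate(docA, u, 'B');
  });

  docA.getText('note').insert(0, 'hello');
  console.log(docB.getText('note').toString()); // "hello"
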
reply
3cats-in-a-coat
16 hours ago
[-]
How about redundancy in general? Not local-first, not cloud-first, but "anything can be first and last". That's how the "cloud" works in the first place: redundancy. Mesh networks as well.
reply
captainregex
12 hours ago
[-]
yes but think of all those poor shareholders with unmaximized value you heartless man!
reply
cyanydeez
19 hours ago
[-]
Local-first almost equates to both privacy protection and public software good.

Essentially antithetical to capitalism, especially America's toxic late-stage subscription-based enshittification.

Which means it's typically a labor of love, or something a government org does when it has a long-term understanding of Software as Infrastructure (as opposed to SaaS).

reply
bigyabai
16 hours ago
[-]
"Local first" is neither equivalent to privacy protection nor to public software good. Many businesses sell local-first software that still contains remote backdoors[0] you cannot control. And it most certainly doesn't ensure "public software good" when there is zero obligation to improve the upstream or empower users to seek alternatives.

I would sooner trust a GPL-licensed remote software program than store a kilobyte of personally identifying information in a proprietary "local first" system.

[0] https://www.macrumors.com/2023/12/06/apple-governments-surve...

reply
Nevermark
18 hours ago
[-]
I think you mean antithetical to corrupted conflict-of-interest capitalism.

Conflict-of-interest transactions have hidden or coercive impact, lined up in favor of the party with stronger leverage. Examples include un-asked and unwanted surveillance of data or activity, coercive use of information, vendor lock in, unwanted artificial product/service dependencies, insertion of unwanted interaction (ads), ...

None of that is inherent to capitalism. They clearly violate the spirit of capitalism, free trade, etc.

It is providers taking advantage of customers' lack of leverage and knowledge to extract value that does not reflect the plain transaction the customers actually wanted. Done legally, but often with surreptitious disclosure or dark-pattern permissions; borderline legally, where customers would incur great costs to identify and protest it; or plain old illegally, but in a hidden manner, with a massive legal budget as a moat against accountability.

It is tragic that the current generation of Silicon Valley and VC firms has embraced conflict-of-interest business models, due to the amounts of money that scaling "small" conflicts can make, and despite the great damage we now know scaling them up can do.

That was not always the case.

reply
nicoburns
18 hours ago
[-]
The problem with our current system of capitalism is that it causes capital to accumulate. This leads to less competition and fewer checks and balances, and undermines the whole "wisdom of the crowd" mechanism that capitalism is premised on.

If we want a functioning market based system then we need to explicitly correct for this by aggressively taxing the wealthiest entities (individuals and companies) in our society to bring things closer to a level playing field.

reply
immibis
12 hours ago
[-]
"corrupted conflict-of-interest capitalism" is just capitalism.

Free trade is antithetical to capitalism. Free trade means everyone is on a level playing field, but capitalism means those with more capital are above the rest. These are obviously not compatible.

reply
ndr
18 hours ago
[-]
It might be antithetical to rent seeking at best, but capitalism?
reply