But sadly the competitors are as bad, just in different ways. Why has nobody yet managed to build a good IM client? It does not seem like we have come far from what we had back in the Pidgin days.
This is par for the course with chat backups, though.
Messenger - Bad - No way to save the messages of the people you have talked to. This means you only ever have one side of a conversation, making it meaningless.
Twitter DMs - Bad - See Messenger.
Jami - Ehhhh - Saves a local git repository of messages. The only problem is that message syncing is effing abysmal.
Dino (XMPP) - Bad - Does not allow backing anything up; this is "intentional". Depending on which protocol you use, as soon as you move to another device all the messages you _had_ are retroactively converted to Cannot Decrypt. They're my effing messages!
Discord - Good - Discord History Tracker (tedious to use, but it slurps everything up into a sqlite3 database, which is itself an official archival format)
WhatsApp - Good - Dumps a text record + files/images/etc. onto the phone's filesystem. This is reasonably easy to archive.
Signal - Mediocre - Have an old Signal backup from 2018? The kind you could only transfer off your phone by deleting old messages? lmao you're effed. Load up a version from ten years ago, gradually update it, and then maybe, MAYBE you can extract the sqlite3 archive? These days you have a .signalbackup or whatever, which is an encrypted archive, and I assume that there's a tool to decrypt it, but uhhhhhh. Last time I tried to use it, it required way more RAM than I had available.
WhatsApp mentions don't work (they just show the name of the mentioned user to the other users), and polls and albums don't work either.
Messenger disconnects every couple of days at this point.
Pasted links won't always expand.
Attachments are always hit or miss.
So many other small things. Still love it.
What "proprietary blobs" does Signal have?
I'll also just add: it's probably not a good idea to use any modifications to an E2EE messenger unless you are comfortable with those privacy/security guarantees possibly being violated by the 3rd party code.
The only exception to this would be if I really trusted the goals of the 3rd party, like Graphene.
As they say in the GitHub readme, FCM and Google Maps.
FCM doesn't technically require a blob — it's just that Google wants you to think it does. I reverse engineered their library and it turned out to be a criminally over-engineered wrapper around two broadcast receivers. So, the Mastodon app is proudly the first app ever to both support FCM push notifications, and be 100% open-source.
Thanks, I didn't notice that. Reading this, I'm kind of surprised that Signal doesn't offer an OpenStreetMap build, as it seems like it'd be more in line with their philosophy.
Here's how you request a push token:
// Ask Play Services (com.google.android.gms) for an FCM token
Intent intent = new Intent("com.google.iid.TOKEN_REQUEST");
intent.setPackage("com.google.android.gms");
// The PendingIntent identifies your app to Play Services
intent.putExtra("app", PendingIntent.getBroadcast(context, 0, new Intent(), PendingIntent.FLAG_IMMUTABLE));
intent.putExtra("sender", FCM_SENDER_ID);
intent.putExtra("subtype", FCM_SENDER_ID);
intent.putExtra("scope", "*");
intent.putExtra("kid", "|ID|1|");
context.sendBroadcast(intent);
Here are the two receivers:
<receiver android:name=".PushNotificationReceiver" android:exported="true" android:permission="com.google.android.c2dm.permission.SEND">
<intent-filter>
<action android:name="com.google.android.c2dm.intent.RECEIVE" />
</intent-filter>
</receiver>
<receiver android:name=".api.PushSubscriptionManager$RegistrationReceiver" android:exported="true" android:permission="com.google.android.c2dm.permission.SEND">
<intent-filter>
<action android:name="com.google.android.c2dm.intent.REGISTRATION"/>
</intent-filter>
</receiver>
The first one is where you get notifications. The parameters you send from the server will simply be your intent extras. The second one is where you get push tokens. There will be a "registration_id" extra string, which is your token. It may start with "|ID|1|" (the "kid" parameter from the request; not quite sure what it does), in which case you need to remove that part.
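To make the token handling concrete, here's a minimal sketch of stripping that optional "|ID|1|" prefix from the "registration_id" extra. The class and method names (`FcmTokenParser`, `cleanToken`) are my own, purely illustrative:

```java
public class FcmTokenParser {
    // Matches the "kid" extra sent in the token request; tokens sometimes
    // come back with this prefix attached (purpose unclear).
    private static final String KID_PREFIX = "|ID|1|";

    // Strips the optional "|ID|1|" prefix from the "registration_id" extra,
    // returning the bare push token.
    public static String cleanToken(String registrationId) {
        if (registrationId != null && registrationId.startsWith(KID_PREFIX)) {
            return registrationId.substring(KID_PREFIX.length());
        }
        return registrationId;
    }
}
```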
You want to refresh your push token every time your app gets updated and also just periodically if you haven't done it in a while. I do it every 30 days.
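The refresh policy described above boils down to a timestamp-and-version check. A sketch, with illustrative names (`TokenRefreshPolicy` is not from any real app):

```java
import java.time.Duration;
import java.time.Instant;

public class TokenRefreshPolicy {
    // Refresh if more than 30 days have passed since the last token request.
    private static final Duration MAX_TOKEN_AGE = Duration.ofDays(30);

    // Returns true if the push token should be re-requested: either the app
    // was updated since the token was issued, or the token is too old.
    public static boolean needsRefresh(Instant lastRefresh, long lastVersionCode,
                                       Instant now, long currentVersionCode) {
        if (currentVersionCode != lastVersionCode) {
            return true; // app was updated
        }
        return Duration.between(lastRefresh, now).compareTo(MAX_TOKEN_AGE) > 0;
    }
}
```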
It's Apache 2.
The protocol itself was easy, but my problem was that Google Play Services has a special permission to exempt itself from power management and, more importantly, to grant that permission temporarily to individual apps when they have a notification. I don't think I ever found out how to work around this.
I miss the times when IM software respected, or at least didn't fight hard to defeat, the end user's freedom to compute on their own device, which includes viewing and sending messages through whatever interface they see fit, including indirectly as part of a script or automation. But that was all before the E2EE era; hell, before mobile dominance.
> might as well use Whatsapp.
- still scrapes metadata
- run by a company whose entire objective is to profile you
Stop being so ridiculous. You can criticize Signal (and there's plenty to critique), but that's just silly. What, should we also just use Telegram, where E2EE is off by default? You know Signal is open source, right? That's why Molly exists. They can run their own servers too.
Now I wish you could do both: talk on both Signal and the decentralized Molly servers. I wish Signal had a mesh-like feature, since it's way harder to snoop on conversations if you have to be physically nearby. I even wish Signal made the Signal sticker site accessible from inside the app. There are tons of things they should do, but let's not pretend that just because they're not perfect we should use apps from a company whose motto might as well be "be evil".
There are plenty of others, all with their pros and cons.
Ultimately, the network effect is usually the hardest parameter to overcome.
Ironically, the only person in my network circle who mentioned wanting to use Signal instead of WhatsApp is my 71-year-old mother.
edit: lol, i assumed you were the OP. ignore me
There are actually two builds of Molly: Molly and Molly-FOSS. IIRC Molly uses regular Firebase, which can be faster and more reliable but comes with the above tradeoffs, while Molly-FOSS uses UnifiedPush.
Your point about exercising caution with forks of encrypted messaging apps is a great rule of thumb, and in general, social proof should NOT substitute for competent software security specialists reading and evaluating source code. But given you seem to trust GrapheneOS, it's worth noting that they've formally endorsed Molly: https://xcancel.com/GrapheneOS/status/1769277147569443309
Also a great point :) And thank you for the reference.
Happy user for many years now, thanks for the support!
AFAIK Signal only blocks old versions due to security patches, which happens on a much longer timeframe than every few weeks.
Most of the time there is zero explanation for the update. They are just training their users to auto-accept updates without thinking about why, which is itself a security risk.
If Signal really is pushing these updates for "security", then it must be one of the most insecure apps ever built. I legitimately can't think of another app or program that updates more frequently... maybe youtube-dl?
Frequent updates have the downside of more frequent breakage and of course extra bandwidth usage. Let users make the trade off between those downsides and the risk of zero days.
You're putting everyone you've talked to at risk. I don't know about you, but I prefer not having to worry about whether I'm communicating with someone whose installation can easily be pwned by any halfway competent attacker.
Compare with Signal, where there is only one allowed server entity, and hardly anyone verifies identities, making man-in-the-middle attacks trivial.
Can someone explain, is this different from adding (up to 5) devices to your Signal account? Are these devices all "primary" or something?
Ideally I would want no single device being the master with the rest linked to it (e.g. Element can do that for Matrix), but that might be too big a change. And as far as I know, Molly does not try to solve that.
I used both desktop and android with no issues.
I have multiple Android devices and I can only log into Signal on one. I can have as many linked desktop clients as I want, tho.
The fact that this "improved" version does not show a single screenshot of the UI on their own website, signals to me (pun intended) that this app will address none of my wishes.
It really is weird not to show a single screenshot when the 4th listed feature is design ("Material You | Extra theme that follows your device palette").
I don't use any of the enhancements, but it does receive notifications over the websocket it keeps open in the background, vs. only waking up on an FCM push notification like the regular app.
I wonder if the supply-chain risk of having a second entity (that signs the APKs!) involved is really worth it to anyone... hope Signal can be published on Accrescent or similar someday :p
FWIW you can actually do the FOSS version of this now with UnifiedPush support (rolled out in Molly a while back).
It's a massive saver on battery life but it does require that you have a server set up to forward notifications to your unifiedpush distributor.
What database?
This page is clearly written for developers that are already familiar with it.
From this I can already predict this project is going nowhere.
The local database used by Signal to organize every message, every contact, every profile photo, every attachment, every group, basically every dynamic piece of data you interact with in the app.
Signal is basically a UI layer over a database. The in-transit encryption is genuinely good enough to be textbook study material for cryptographers, but the at-rest encryption became a joke the moment they stopped using your PIN to encrypt the local DB and requiring it to open the app.
As someone who's been enthusiastic about Signal since it was TextSecure and RedPhone, the changes made over the years to broaden the userbase have been really exciting from an adoption perspective, and really depressing from a security perspective.
TL;DR of Molly is that it fixes/improves several of those security regressions (and adds new security features, like wiping RAM on db lock) while maintaining transparent compatibility with the official servers, and accordingly, other people using the regular Signal client.
But if my phone gets taken and an exploit is used to get root access on it, I don't want the messages to be readable, and there's nothing I can do about it. It's not like I can just use a different storage backend.
It's also a very simple solution - just let me set an encryption password. It's not an open-ended problem like protecting from malware running on the device when you're using it.
Which is to say, this is an incoherent security boundary: you're not encrypting your phone's storage in a meaningful way, but you're planning to rely on entering a PIN every time you launch Signal to secure it? (Which in turn is also not secure, because a PIN is not secure without hardware able to enforce lockouts and tamper resistance... which in this scenario you just indicated have been bypassed.)
A passphrase can be long, not just a short numeric PIN. It can be different from the phone unlock one. It could even be different for different chats.
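For the curious, turning a long passphrase into a database encryption key is standard key-derivation territory. This is just an illustrative PBKDF2 sketch using the JDK's built-in crypto classes, not Molly's or Signal's actual scheme (the class name `PassphraseKey` and the parameters are assumptions):

```java
import java.security.SecureRandom;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class PassphraseKey {
    // Derives a 256-bit key from a user passphrase with PBKDF2-HMAC-SHA256.
    // A long passphrase plus a high iteration count makes offline brute force
    // far more expensive than guessing a short numeric PIN.
    public static byte[] derive(char[] passphrase, byte[] salt, int iterations)
            throws Exception {
        PBEKeySpec spec = new PBEKeySpec(passphrase, salt, iterations, 256);
        SecretKeyFactory factory =
                SecretKeyFactory.getInstance("PBKDF2WithHmacSHA256");
        return factory.generateSecret(spec).getEncoded();
    }

    // Generates a fresh random salt to store alongside the encrypted DB.
    public static byte[] newSalt() {
        byte[] salt = new byte[16];
        new SecureRandom().nextBytes(salt);
        return salt;
    }
}
```

The derived key would then feed whatever at-rest cipher protects the database; the point is only that the key never needs to exist on disk unwrapped.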
Their justification here https://source.android.com/docs/security/features/encryption is that
> Upon boot, the user must provide their credentials before any part of the disk is accessible.
> While this is great for security, it means that most of the core functionality of the phone is not immediately available when users reboot their device. Because access to their data is protected behind their single user credential, features like alarms could not operate, accessibility services were unavailable, and phones could not receive calls.
I'm sure they could have found a better approach than file-based encryption, but it must have been nice to reduce engineering overhead while at the same time giving three-letter agencies something that simplifies their work.
This is less true for fully patched GrapheneOS devices than it is for fully patched iOS and other Android devices, but this space is basically a constantly evolving cat and mouse game. We don't get a press release when GrayKey or Cellebrite develop a new zero day, so defense in depth can be helpful even for hardened platforms like GOS.
As always, it depends on your threat model.
I use Signal because I value my privacy and don't trust Facebook, not because I'm an activist. So I'm in the target group for Signal's new behavior, and I welcome it (especially since, to use it to share personal information that I don't want Facebook or advertisers to get, I need my parents and in-laws to use it as well, so it must be user-friendly enough).
I hope they continue moving in that direction, by the way, and allow shared pictures to be stored directly in the phone's main storage (or at least add an opt-in setting for that), because the security I get from them not being stored there is zero, and the usability suffers significantly.
I'm a really big fan of the airport bathroom analogy. When you use the restroom in the airport, you close the stall door behind you.
You're not doing anything wrong, you have nothing to hide, and everyone knows what you're doing. But you take actions to preserve your privacy anyway, and that's good.
Everyone deserves privacy, and the psychological comfort that comes with it. Dance like nobody's watching, encrypt like everyone is :)
This is a much better way of saying what I wanted, thank you.
That said, Molly definitely isn't designed for the average person's threat model, that's totally true, but it's also worth noting that just because someone isn't aware of a certain risk in their threat model, that doesn't mean they will never benefit from taking steps to proactively protect themselves from that risk.
IMO, security and privacy are best conceptualized not as binary properties where you either have it or you don't, but rather as journeys, where every step in the right direction is a good one.
I'd always encourage everyone to question their own assumptions about security and never stop learning, it's good for your brain even if you ultimately decide that you don't want to accept the tradeoffs of an approach like the one Molly takes towards at-rest encryption.
I find that unconvincing. If your phone is hacked, your phone is hacked. I think it's bad to assume that an attacker can compromise your phone but not log keystrokes. I'm not super familiar with the state of the art of phone malware and countermeasures, but I think anything trying to be secure in the face of a compromised platform is like trying to get toothpaste back in the tube.
> it's also worth noting that just because someone isn't aware of a certain risk in their threat model, that doesn't mean they will never benefit from taking steps to proactively protect themselves from that risk.
Threat models are just as much about ensuring you have all your bases covered as ensuring you don't spend effort in counterproductive ways.
> IMO, security and privacy are best conceptualized not as binary properties where you either have it or you don't
I agree. I think security is relative to the threat you are trying to defend against. There are no absolutes.
> but rather as journeys, where every step in the right direction is a good one.
Here is where I disagree. Just because you take a step does not mean you are walking forward.
A poorly thought out security measure can have negative impacts on overall system security.
But not sure if even the upstream Signal client has this.
I'm pretty unconvinced that this is a sane or useful thing to do.
And given that Molly could fix it, it cannot have been that hard to fix.
Edit: Found it! "Careless Whisper: Exploiting Silent Delivery Receipts to Monitor Users on Mobile Instant Messengers" https://arxiv.org/abs/2411.11194
Signal still requires a phone number for registration.