Hi HN – I built an open-source, self-hosted Model Context Protocol (MCP) server for WhatsApp: https://github.com/lharries/whatsapp-mcp
It connects to your personal WhatsApp account via the WhatsApp Web multi-device API (using whatsmeow from the Beeper team), and doesn't rely on third-party APIs. All messages are stored locally in SQLite. Nothing is sent to the cloud unless you explicitly allow your LLM to access the data via tools – so you maintain full control and privacy.
The MCP server can:
- Search your messages, contacts, and groups
- Send WhatsApp messages to individuals or groups
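For anyone curious what "allow your LLM to access the data via tools" looks like in practice, here's a rough client-side sketch using the MCP Python SDK over stdio. The tool name, arguments, and launch command are assumptions for illustration, not necessarily the exact interface this repo exposes:

    # Hypothetical MCP client calling a tool on this server over stdio.
    import asyncio
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    async def main() -> None:
        # Launch command is illustrative; adjust to how you run the server.
        params = StdioServerParameters(command="uv", args=["run", "main.py"])
        async with stdio_client(params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                tools = await session.list_tools()
                print([t.name for t in tools.tools])
                # Assumed tool name and arguments, for illustration only.
                result = await session.call_tool(
                    "send_message",
                    {"recipient": "123456789", "message": "Hello from MCP!"},
                )
                print(result)

    asyncio.run(main())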
Why build this?
99% of your life is stored in WhatsApp; by connecting an LLM to WhatsApp you get all this context. And your AI agent can execute tasks on your behalf by sending messages.
▲Is there any reason why you chose to build a Python wrapper around a Go binary?
I've been playing around with mcp-go[0] and it seems pretty good.
[0] - https://github.com/mark3labs/mcp-go
reply▲Mainly that I'm more familiar with Python. You're right, however, that mcp-go would be a good way to go if I do a bigger refactor.
reply▲BTW, is there a good open-source "WhatsApp API SaaS" server? There's Waha (https://github.com/devlikeapro/waha), which is nice, but unfortunately they don't offer "safety features" in the open-source version to lock down access.
I guess it's solvable with Tailscale, though.
reply▲I don't know if whatsmeow was intended for this use case: imagine an LLM performing multiple calls in a short period of time. Couldn't you risk having your WhatsApp account blocked by Meta?
reply▲I've been running and using a puppeting bridge [1] for my Matrix instance for more than 4 years with 3 different numbers and no problem at all.
The controls they have in place are probably based on behavior, rather than on access.
[1]: https://github.com/mautrix/whatsapp
reply▲WhatsApp will ban you if it detects any kind of automation. If you use this or another wrapper you will very likely get banned, even without making calls. They want you to use their paid Business API.
reply▲Good to know WhatsApp is at least somewhat proactive against scammers.
reply▲It saves the messages in a local SQLite database on load, so querying shouldn't be an issue as it doesn't contact the WhatsApp servers. Sending does use the API, however, and needs more caution.
reply▲The why is because we can, but damn am I finding the tools being built with, or having tacked on, AI depressing. Is this a small glimpse of the future we're building for ourselves? Communication is valuable because thought and effort went into it; lowering the bar on producing content doesn't mean more choice, it means lower quality. Already I see a reaction against this amongst some peers when they find out something they were asked to review was AI generated: why should they put effort in if the other person didn't?
reply▲If we find the "thought and effort" part of communication valuable, we'll keep it. If not, we won't.
reply▲That would be a fine posture to take, very naturally-selective, but I find it discomforting because I've seen so many different ways that humans act that don't benefit them (individually or on the whole). It isn't always out of self-destruction or lack of self-preservation. More often than not, the choice was based on what's easiest -- a tendency towards the path of least resistance. This technology looks a lot like trading off intention, and attention, for quick and good-enough(?) results. Enough so that I can understand GP's concern for our communication skills as a society.
I think we could find ourselves losing the "thought and effort" in spite of it being more valuable, because many people find it easier. Or that even those who continue on with it, despite it not being easier, are broadly labeled as a bot because their writings failed the vibe check.
I have confidence that there will always be small communities that continue holding this as valuable, but that maintaining it in a community setting will become more strained as the general zeitgeist drifts in the direction of regarding output higher than effort.
reply▲Looks interesting. It would be nice to see something like this for all the communication methods in Beeper unified together. That way all communication is covered rather than just 1 app of many.
reply▲Beeper is actually built on top of an open source protocol called Matrix, and they also open sourced all of the bridges, so you can host Beeper yourself and connect the LLM to it. And even better: since the protocol is open source, you can also connect to the official Beeper server! You can even use alternative Matrix clients to connect to Beeper; I do this on my company laptop since Beeper is blocked.
https://matrix.org/ecosystem/clients/
reply▲I think this is like the 7th or 8th WhatsApp MCP implementation.
I really have zero understanding why people think this is something crazy. It’s not. It’s importing the official MCP packages and wrapping basic API methods with an MCP tool decorator.
You can even ask Claude or ChatGPT to make your MCP tools for you and they will write this same code in 1 minute.
I can’t wait until the community realizes that MCP servers are literally just regular methods with a one line decorator and these posts just get downvoted for being incredibly low effort.
It’s basically the same as upvoting someone saying “hey guys, I wrote a method which connects to the WhatsApp API”, that’s it, really.
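For what it's worth, that really is about the size of it with the official MCP Python SDK (FastMCP). A minimal sketch, with a made-up stand-in for the actual WhatsApp bridge call:

    # Minimal MCP server sketch using the official Python SDK (FastMCP).
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("whatsapp")

    def bridge_send(recipient: str, message: str) -> None:
        # Stand-in for whatever actually talks to the WhatsApp bridge.
        print(f"[bridge] -> {recipient}: {message}")

    @mcp.tool()
    def send_message(recipient: str, message: str) -> str:
        """Send a WhatsApp message to a phone number or group JID."""
        bridge_send(recipient, message)
        return f"Sent to {recipient}"

    if __name__ == "__main__":
        mcp.run()

The decorator registers the function's signature and docstring as the tool schema; the rest is whatever plumbing sits behind it.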
reply▲I've been struggling to understand the hype as well.
I think the excitement is more around seeing how MCP is being embraced, and the impact it can have. Rather than me having to wire up a bunch of tools by hand, I can now point to some configuration files and MCP servers.
I think it's fair for people to be excited, but ultimately, yes, your prediction is correct, and it won't be as exciting after MCP becomes a standard. This is similar to any new technology. e.g. years ago, Go programs and literature were popular on this site, but not so much any more.
reply▲Yeah. I've written infrastructure for LLMs to use Twitter, Discord, WhatsApp and more since '22 before GPT 3.5 was even out. You can imagine my contempt for all of this right now.
That said, MCP might be here to stay for a while as a stopgap for reducing duplication of engineering work.
reply▲You know why: MCPs are still a hot topic, and WhatsApp is ubiquitous (unfortunately).
reply▲If this thing is connected to someone's WA and I ask "Who is your girlfriend?", the AI theoretically could list all the chats, read some messages in each chat, determine who could be the partner and return the name to me :))
I can go ahead and ask "What was the last fight about?"
reply▲Cool work, but I'm more fascinated by your claim "99% of your life is stored in WhatsApp...".
Not even remotely true for me, even if it would encompass all the messaging apps I use. I guess I'm just an old introvert, but it makes me wonder what life looks like for those for whom it is true.
reply▲WhatsApp's path to absolute dominance with the Normies hasn't been sufficiently analyzed, I reckon.
Somehow, WhatsApp managed to become extremely popular and heavily used by people who have trouble switching on a desktop pc. Even senior citizens have no trouble using it.
Is it because they have a high motivation to use it? The UI / UX of WhatsApp surely isn't great, I'd even say it's quite bad. Where am I wrong? What am I not seeing?
reply▲I don't use WhatsApp, I helped most of my family and many friends move away from it, and I feel like we (you and I) see WA in a similar way. IMO the main issue here is the network effect and vendor lock-in (for the lack of a better term -- I'm writing this from my phone in a rush).
> Where am I wrong? What am I not seeing?
English is my second language, so maybe I'm missing some context here, but every time I hear techies calling non-techies "Normies", it's used in a derogatory / condescending fashion. That's reductive (non-techies are not a homogenous group) and somewhat intellectually lazy.
To give you an example, WA users in the US and, say Portugal or Poland ended up using it for slightly different reasons and in a different technical context. WA used to be the best video chat app for quite some time in the UK, AT or PL (imo), and I know of many people who started using it for that specific reason (esp. important for large migrant communities). FaceTime wasn't that popular because Android market share in the EU was bigger than in the US.
reply▲WhatsApp acts as the phone's entry point for many.
Contacts aren't stored on the phone; chats act as contacts. Phone and video calls also happen on WhatsApp. Photos are shared via WhatsApp, so that's where the gallery is. It even functions as a calendar of sorts – events are organised in WhatsApp groups, so there you have directions and dates and who brings what. More and more businesses use WhatsApp to communicate with customers.
Why people like this, I do not know. But this is what I observe. Maybe software in general is too shit to use so people prefer to take a sub-optimal WhatsApp-based life over fighting their phones at every step. And I can't blame them.
reply▲> Where am I wrong? What am I not seeing?
The killer application is group chats. Normies can't make an email group, or remember to reply to all, or even make a Yahoo! Group [dead] or Google Group [dead?].
It's very easy to set up a group in WhatsApp and keep the member list updated.
For bonus points, it's very difficult to Ctrl-C the info out of WhatsApp; it's easier to press the arrow and forward the message to another WhatsApp group.
Once you have all your groups in WhatsApp, it's easier to use it for everything.
PS: Also, a few eons ago in many countries SMS had a cost and WhatsApp was free, so that was another good reason to use it.
reply▲> PS: Also, a few eons ago in many countries SMS had a cost, and WhatsApp was free, it was so another good point to use it.
That was the starting point for sure in many European countries.
reply▲The worst part is that it wasn't even free, you were still paying for the data unless you were on wifi. Standard text messaging was so expensive (and so terrible) that data rates obliterated standard phone services in both quality and price.
I remember paying 23 cents per SMS. Still, carriers were somehow surprised when people moved away from them.
Funnily enough, now that RCS is slowly making its way back, people seem to forget that free unlimited messages are hardly guaranteed with these services. Can't wait for the backlash when the first iPhone users start getting charged for RCS messages with their Android contacts.
reply▲WhatsApp nailed the onboarding experience. At a time when other services asked you to create an account with an email and password, which is enough of a hurdle already, WhatsApp looked up your phone number and said "I'm sending you an SMS, enter the number you received to check we got it right". And then it never asked for anything ever again.
reply▲WhatsApp was the first messaging app I remember that didn't require a user account.
People already had the phone numbers of people they knew so they just had to install the app and could immediately chat with any of their contacts.
Compared to chat apps with usernames where you started with an empty contact list, the barrier to start using it was very low. It was basically a drop-in replacement for SMS. Group chat and free messages were the reason to switch.
reply▲You're wrong that it's bad. Clearly it cannot be that bad if even your 80 yo grandma can get the gist of how to use it properly.
Can it be better? Probably. Does it need to be better? Clearly not.
reply▲The UX is pretty good. At least as good as any competitor. There are no ads. It's cross-platform. It's secure. Group chats work.
WhatsApp gained dominance when the alternatives were still SMS and BBM! You don't have to resort to ego-boosting put-downs to explain why it is so popular.
reply▲WhatsApp has absolutely taken over communication in the world (aside from the US, for some reason). Funerals, trips, fund-raisers, even interacting with businesses is now done on WhatsApp in my country. When I briefly decided to try to de-Meta my life, I found that it was one service I absolutely cannot do without and still function as a member of my family and community.
reply▲Network effects are network effects. I really feel physical disgust and try to keep it off my devices, but while that's feasible when living a normal, uh, "no-lifer" life, it's unfortunately basically impossible when I travel or even just participate in local communities (sports, hobbies, etc). In the latter case, it's simply rude and disorganizing to refuse to join the group chat; in the former case it's often the only way to contact the hotel/taxi/whoever you urgently need to reach in some not so touristy spot. (And, BTW, it's designed to be practically unusable if you don't allow it to access your contact book, so no dirty tricks like that will fly.)
As to "how do they even manage to use it", well, the mere notion of a great UI is vastly overhyped in the first place (by designers themselves, most of all). People get used to just about anything if they are taught to use it. And learning a 5-step sequence to use any app like this — anybody can do that, even an 85-year-old (even if they swear they can't).
reply▲It's really a demographic issue. I've found that in the US relatively few people use WA. Same for SEA. But in western Europe almost everything is done on WA.
From "neighborhoodwatch" via "the school information" to "colleagues". Hell, even governments and public transportation have "feel unsafe? message us on whatsapp 06..." In the Netherlands it's truly omnipresent, and without WA you will be left out socially.
I'm wary of WA, because I'm steering clear of Meta as much as possible, but I do use it to keep in touch with at least 26 social groups (friends, family, colleagues, co-workers, business) and to keep in touch with my date/love, my mother, and my father, who lives across the world.
Granted, there's a big move towards Signal lately, finally. But I'm not sure how big this really is, and how sticky it will prove. Nor if this is just my bubble or truly across the whole country.
reply▲Even within the Netherlands there are pockets of Signal users, Telegram users, and on rare occasions even people using SMS or iMessage. Facebook Messenger also has a foothold in some circles.
That said, you'd be an outlier if you have a mobile phone but no WhatsApp. Everyone has WhatsApp, but fortunately not everyone is using it for everything. Unfortunately, that usually means using an even less secure and trustworthy messenger service.
reply▲Some people I know seem to be more split between Snapchat, Instagram and Facebook Messenger. I don't see anyone being 99% in one app, and certainly not WhatsApp. I think I know like three people who use WhatsApp privately.
I do know people working in sales and purchasing and I wouldn't be surprised if they had 25% or more of their professional lives in WhatsApp. Previously they used Skype heavily. So this could be interesting for them.
reply▲In South Africa, without WhatsApp you are literally cut off from communication, so for us it is more like 99.99%.
reply▲Certainly not the case for me, but I reckon I could do plenty via WhatsApp:
- Group Chat for the family (Immediate, Extended Mother Side, Extended Father Side)
- Chat with a hole-in-the-wall restaurant to order something to eat after work
- 2FA for local shop
- Customer Support for local bank
- Work Chat (Informally assigned tasks and general info)
- Chat with Taxi
- Delivery Tracking for local courier
- LLM wrapper
- Invoice/Receipt Sending
- Appointments for many services (Like barbers)
I could more or less do anything, from work-related task management to requesting a new credit card, from inside WhatsApp.
reply▲I have 99% of my life in Telegram; I like the choice between end-to-end encryption or not (for unimportant stuff), so I create private groups for everything and just toss everything in there. I have a (custom) LLM bot which does stuff depending on what the group is about and what is posted. Telegram is very fast and convenient (yes, because it's not all encrypted, just the stuff you want to be).
reply▲For most people, their life isn't stored in a neat Notion database.
Instead, if you have calendar + email + their main messaging app (e.g. WhatsApp), you cover the majority of it. It's messy and unstructured, but luckily LLMs are great at that.
reply▲WhatsApp is a mandatory app in India
reply▲Thank you! I live in Latin America and here WhatsApp group chats are through the roof. I easily receive several hundred messages across various group chats. It's not possible nor healthy to read every single message. It would be great if the tool could summarize all the unread messages from a given group chat. I played around a bit with the MCP server and it was having problems even getting the correct group chat. The group chat name could be stored together with the ID in SQLite and, if matched, it should immediately query for messages from that group chat, skipping the need to list chats and try to find the correct group chat and, worst of all, fail.
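A sketch of that suggestion, assuming a hypothetical schema (the actual table and column names in the bridge's SQLite store may differ): resolve the group name to its JID once, then pull its recent messages directly for summarization.

    import sqlite3

    def recent_group_messages(db_path: str, group_name: str, limit: int = 200):
        conn = sqlite3.connect(db_path)
        try:
            # Resolve the group name to its JID instead of listing all chats.
            row = conn.execute(
                "SELECT jid FROM chats WHERE name LIKE ? LIMIT 1",
                (f"%{group_name}%",),
            ).fetchone()
            if row is None:
                return []
            (jid,) = row
            # Newest messages first; hand these to the LLM to summarize.
            return conn.execute(
                "SELECT sender, content, timestamp FROM messages "
                "WHERE chat_jid = ? ORDER BY timestamp DESC LIMIT ?",
                (jid, limit),
            ).fetchall()
        finally:
            conn.close()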
reply▲I am not sure how other WhatsApp MCP servers are built, but I like the design here: a Go server that integrates with WhatsApp (scans the QR code, etc.) and acts like a bridge, with SQLite to store some data,
and an MCP server to interact with the data, the app and the LLM.
reply▲Really cool project, the privacy-first angle and self-hosted design are a huge plus. Curious: have you run into any rate limits or session issues with the whatsmeow API, especially when used continuously by agents?
reply▲Why was Go necessary here? Couldn't it just be a whole-Python project?
reply▲whatsmeow is in Go. Potentially I could have used gopy instead and done it all in Python (or just done it all in Go).
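For illustration, one common way to bridge the two, which may or may not match how this repo wires it up: keep the Go/whatsmeow binary running as a small local HTTP service and call it from the Python side. The endpoint, port, and payload shape below are assumptions.

    import json
    import urllib.request

    def send_via_bridge(recipient: str, message: str) -> dict:
        # Assumed local endpoint exposed by the long-running Go bridge.
        payload = json.dumps({"recipient": recipient, "message": message}).encode()
        req = urllib.request.Request(
            "http://localhost:8080/api/send",
            data=payload,
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.loads(resp.read())

Reads can then go straight to the shared SQLite database, so only actions that must go out through WhatsApp touch the bridge.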
reply▲By the way, I was just searching for a way not to have the WhatsApp app on my phone while limiting the risk of third-party software and/or having my number banned.
Is there a way to login on WhatsApp Web on a server and then proxy or scrape the messages to send them to my phone?
reply▲What I'm most shocked about is that people choose WhatsApp and the like for "end-to-end encryption" and then open a window to small companies like OpenAI and Anthropic.
reply▲So lovely. Thanks. Now I can answer all my friends and my spouse immediately instead of ghosting them for days.
reply▲Is it possible to augment WhatsApp's SQLite installation with an extension?
reply▲> 99% of your life is stored in WhatsApp
That's quite a presumption to make.
reply▲That's a low estimate if you live in India or South America.
reply▲As others have already said, think about what you're doing when you use this.
If you connect a non-self-hosted LLM to this, you're effectively uploading chat messages with other people to a third-party server. The people you chat with have an expectation of privacy so this would probably be illegal in many jurisdictions.
reply▲Except basically all of Europe is one-party consent, and things like tech support call centres have been doing variants of this for years.
reply▲One-party-consent only means you can legally record something, it doesn't necessarily mean that you're allowed to share it with (non-government) third parties later.
It could be legal to record and use as evidence in court later, but that doesn't mean you're allowed to share it with some AI company.
reply▲Their TOS cover utilisation of the data under 'Quality and Training purposes', with implied consent through engagement with the service in question; the breadth and application of which has never had a test case, to my knowledge.
reply▲Your information is gone the moment you utter words. I can also copy and paste the messages people send me.
reply▲> I can also copy and paste the messages people send me.
Sure you can, but the people can sue you if you paste it into something public. I don't know if you're making some deep philosophical comment but this is something people have been sued and lost for before.
reply▲I would argue that there is no expectation of privacy for messaging apps without end-to-end encryption. There is always the man in the middle listening.
reply▲Legally, there absolutely is. Because by law, the messaging app operator also can't just publish the stuff you write in a chat.
Even some disclaimer in the terms of service probably wouldn't work if people would generally assume the chat to be private.
And it also doesn't even matter because WhatsApp claims to be E2E-encrypted.
reply▲WhatsApp has end-to-end encryption
reply▲Yes, but it also has a back door so it is of no use.
reply▲Meta claims WhatsApp is end-to-end encrypted.
It's up to you to trust Meta or not, but people who trust them do have an expectation of privacy.
reply▲That's irrelevant here because the OP is running the LLM on one of the ends, so it's decrypted the same as when you're reading the chat convo yourself.
It also misses the mark because you're talking about an eavesdropper intercepting messages and the OP is the receiver sharing the messages with a third party themself.
reply▲> The people you chat with have an expectation of privacy so this would probably be illegal in many jurisdictions.
Name one
reply▲Germany.
You have an "allgemeines Persönlichkeitsrecht" (general right of personality) that prevents other people from publishing information that's supposed to be private.
Here's a case where someone published a Facebook DM, for example:
https://openjur.de/u/636287.html
reply▲So here's the deal with German law on this topic - there's actually a big difference between sharing someone's DM and running LLM tools on social media conversations. The OLG Hamburg case from 2013 (case number 7 W 5/13) establishes that publishing private messages without permission violates your personality rights ("allgemeines Persönlichkeitsrecht"). While we don't have specific LLM court rulings yet, German data protection authorities have been addressing AI technologies under GDPR principles. The Bavarian Data Protection Authority (BayLDA) and the Hamburg Commissioner for Data Protection have both issued opinions that automated AI processing of personal communications requires explicit legal basis under Article 6 GDPR, unlike simple sharing which falls under personality rights law.
The German Federal Commissioner for Data Protection (BfDI) has indicated that LLM processing would likely be evaluated based on purpose limitation, data minimization, and transparency requirements. In practice, this means LLM tools could legally process conversations if they implement proper anonymization techniques, provide clear user notices, and follow purpose limitations - conditions not required for the simpler act of sharing a message.
The German courts distinguish between publishing content (governed by personality rights) and processing data (governed by data protection law), creating different standards for each activity. While the BGH (Federal Court) hasn't ruled specifically on LLMs, their decisions on automated data processing indicate they would likely allow such processing with appropriate safeguards, whereas unauthorized DM sharing remains almost always prohibited under personality rights jurisprudence regardless of technical implementation.
reply▲It sounds like you agree with me that the posted tool would not be legal to use in Germany then? Or am I misreading this comment?
Your initial "name one" comment sounded like you didn't believe there would be a jurisdiction where it is illegal.
reply▲The so-called expectation of privacy is irrelevant in this context
reply▲But it would still be illegal to use? Does the exact mechanism matter?
reply▲How would this stand up to the "I didn't do it, I probably got hacked!" defense? It's one thing to publish personal conversation, and another to have your conversations aggregated by some LLM (and if they leak plain-text, the "hacked" defense is even more plausible).
reply▲That’s a separate issue. You might not be able to prove it as the victim, but that doesn’t make it legal.
reply▲I would say it's a gray area at best/worst. I think the goal of the law is that you shouldn't e.g. take a screenshot of a message someone sent you in confidence/in private, and use it to make fun of, or shame them on a public forum (or whatever else - but a "targeted action").
This scenario however is "I take my personal data and run it through tools to make my life easier" (heck, even backup could fit the bill here). If I'm allowed to do that... am I allowed to do that only with tools that are perfectly secure? Can I send data to the cloud? (subcases: I own the cloud service & hardware/it's a nextcloud instance; I own it, but it's very poorly secured; Proton owns it and their terms of use promise to not disclose it; OpenAI owns it and their terms of use say they can make use of my data)
reply▲As a non-lawyer:
> am I allowed to do that only with tools that are perfectly secure?
No, actual security doesn't matter at all, but you have to believe that they are reasonably secure.
> Can I send data to the cloud?
Yes, if you can expect the data to stay private
> (subcases: I own the cloud service & hardware/it's a nextcloud instance;
Yes
> I own it, but it's very poorly secured;
No
> Proton owns it and their terms of use promise to not disclose it;
Yes, if Proton is generally considered trustworthy.
> OpenAI owns it and their terms of use say they can make use of my data)
No
reply▲Your thesis implies that before using my data I am compelled by law to know very well the terms of use; I think the opposite has happened in practice, especially in Europe the trend is to say that lengthy TOS don't mean that companies can do whatever they want/ just because the end-user clicked "I agree" doesn't automatically make them liable, in the eyes of the law, to know and understand all implications of the TOS. That's undue burden.
I guess you can argue that "I should've known that OpenAI will use my conversations if I send them to ChatGPT", but I'm not convinced it'd be crystal clear in court that I'm liable. Like I said, I think until actually litigated, this is very much a gray area.
P.S. The distinction you make between "properly secured" and "improperly secured" Nextcloud instances would, again, be a legal nightmare. I guess there could be an example of "criminal negligence" in extreme cases, but given that companies get hacked all the time (more often than not with relatively minor consequences), and even Troy Hunt was hacked (https://www.troyhunt.com/a-sneaky-phish-just-grabbed-my-mail...), I have a hard time believing the average Joe would face legal consequences for failing to secure their own Nextcloud instance.
reply▲That case describes publishing this to the public internet. I don't believe the same would apply when using a tool like this.
My family members all back up our conversations to Google Drive, I doubt WhatsApp would provide that feature if it were illegal.
reply▲Well it would depend on which LLM you use and what their terms are.
But if they use your input as training data, that would probably be enough.
reply▲We'll have to see. Tools like these are already common on platforms like LinkedIn, so if it's legally questionable I expect the courts to cover it soon enough.
My German isn't good enough to read the original text about this case, but if the sentiment behind https://natlawreview.com/article/data-mining-ai-systems-trai... is correct, I wouldn't be surprised if this would also fall under some kind of legal exception.
The biggest problem, of course, is that regardless of legality, this software will probably be used (and probably already is being used) because it's almost impossible to prove or disprove its use as a remote party.
reply▲> My German isn't good enough to read the original text about this case, but if the sentiment behind
https://natlawreview.com/article/data-mining-ai-systems-trai... is correct, I wouldn't be surprised if this would also fall under some kind of legal exception.
That's something completely different. One is about copyright of stuff that was shared publicly, while the other is about sharing private communications, violating their personal rights (not copyright).
But of course, we'll have to see, I'm not a lawyer either.
reply▲I believe echoangle's concern is about the security and privacy of the LLM using the data, not the MCP server itself.
reply▲ah right. my bad.
reply▲Sorry, I should have added my second thought. Your original comment about isolating MCP servers is also good!
These are tools where the AI may tell you it’s doing one thing and then accidentally do another (I had an LLM tell me it would make a directory using mkdir but then called the shell command kdir (thankfully didn’t exist)). Sandboxing MCP servers is also important!
reply▲Could I use this to create group chat summaries?
reply▲It is crazy how out of touch people on this platform can be. I live in Europe but have been to Asia and South America. People use WhatsApp everywhere. Just because you live in North America where everyone uses SMS/iMessage/whatever doesn't mean everyone does. I can remember my parents scolding me because they got charged for me receiving some SMS. WhatsApp was a gamechanger. You could send messages or pictures without having to think about the price of it (While being connected to a WLAN...). So at some point no one used SMS anymore. iMessage was out of the question also, because only a very small number of people had iPhones. And everyone was scared of sending a message because you wouldn't know if the message would cost you or not. But everyone had WhatsApp.
For some people it is a requirement to have a social life. It is not your choice to use it or not. Network effects are taking care of that. If you think Signal or whatever is a better choice, good on you. But if you don't want to cut ties with some of your friends, prepare to use multiple apps. Including WhatsApp.
reply▲I live in Europe and my social life is spread out between Telegram, WhatsApp, Signal, and Discord. For professional life you could even include LinkedIn and Slack.
A single MCP server won't ever cut it for me, and adding 6 for 6 different tools will confuse any LLM and make it send the message to the wrong person.
Completely ignoring the fact that I would not use this anyway... like, people want to talk to ME, not an LLM. WhatsApp has a chat with Llama built in now anyway, if that's who they want to talk to.
reply▲> WhatsApp was a gamechanger. You could send messages or pictures without having to think about the price of it (While being connected to a WLAN...)
Back when data plans were around 1GB or less, some network providers in Europe didn't charge you for using WhatsApp on specific plans. There were also WhatsApp-branded SIM cards, but I haven't seen them for a long time.
reply▲If I find out someone pipes my chat messages into an LLM, I will not converse with that person anymore.
reply▲Since Meta is a big AI investor, I suggest you skip WhatsApp altogether.
reply▲You think they (plan to) decrypt messages and then upload them again in plain text to a server?
Since on-device processing would be neither as objectionable, nor could it be very large.
I don't use WhatsApp myself because of who runs it and there are plenty of better options out there, so I certainly agree with the sentiment of steering clear, but this claim does seem pretty far out there
reply▲They don't plan it because they have no use for it. They only care about the metadata: when you talked to this person or to your wife, at what time of day, was it at night, how long the message is, was there a product mentioned in the message, was the message about sports, etc.
reply▲They don't plan it, because so far, they don't have the keys to do so.
We do need to trust Meta that they really don't, to some extent, but people way smarter than me have researched the WA implementation of the Signal protocol and it seems solid. I.e., Meta appears to simply be unable to read what you chat and send (but to be clear: they do see with whom and when you do this, just not the contents).
reply▲What prevents them from simply pushing an update that quietly uploads private keys or unencrypted messages to their servers?
Presumably they use proper HTTPS, so all the data is essentially encrypted twice. If they just concatenated some packets with keys, it would be extremely difficult to detect, as you'd need to decrypt HTTPS (which is possible if you can install your own certificates on a device), then dig through random message data to find a random value you don't even know.
reply▲At least on Android it's possible to dissect an app. You won't get the original Java code, but static analysis is possible. And indeed, it's possible to capture its network traffic and even often decrypt that traffic (with root access to the device). Now, I, or you, may not research at this level, but someone looking into whether they may use WhatsApp to discuss attack plans on, say, Yemen, might find such weaknesses.
People find exploits in proprietary code, or even SaaS (where researchers cannot even access the software) every day.
People at Meta might leak this information too.
"Information wants to be free"
My point is: the risk of this becoming known is real.
reply▲> What prevents them from simply pushing an update that quietly uploads private keys or unencrypted messages to their servers
Reputation
Or what's the translation of "bank run" but generic for any service? "Leegloop" in Dutch; the translator gives only nonsense. Going for the descriptive route: many people would leave because of the tarnished reputation.
The trick is to have Facebook continue to believe that this reputation/trust is more valuable than reading the messages of those who stay behind, which can partially be done by having realistic alternatives for people to switch to so that there is no reason to stay when trust is broken. Which kinda means pre-emptively switching (at least to build up a decent network effect elsewhere), which is what I've chosen to do and encourage anyone else to do as well. But I'm not a conspiracy theorist who thinks that, at the present time, they'll try to roll out such an update in secret, at least not to everyone at once (intelligence agencies might send NSLs with specific targets).
reply▲They don't have the keys, but they probably can get them.
reply▲That's a strong accusation.
The only way I can think of, is by pushing an update that grabs all your keys and pushes them to their servers.
Otherwise, it's a pretty decent setup (if I am to believe Moxie, which I do).
reply▲That's a completely reasonable boundary. Privacy and consent are critical, especially when sharing personal messages or conversations. It's fair to expect that your interactions remain private unless you've explicitly agreed otherwise. If you'd like, you can communicate your stance clearly to others in advance, ensuring they're aware of your boundaries regarding the use of your messages with AI tools or other external resources.
reply▲I understand why one would think it's funny to feed the parent comment into an LLM, but please at least label it when you echo such output on the site.
reply▲I don't think their main concern was the privacy aspect.
reply▲What do you think their concern was? I can't see any other issues someone might have.
reply▲Energy usage is another. What would happen to world power consumption if 1% of WhatsApp chats were fed to ChatGPT?
A third reason besides privacy would be the purpose. Is the purpose generating automatic replies? Or automatic summaries because the recipient can't be bothered to read what I wrote? That would be a dick move and a good reason to object as well, in my opinion
reply▲> What would happen to world power consumption if 1% of WhatsApp chats would be fed to ChatGPT?
The same thing that happens now, when 100% of power consumption is fed to other purposes. What's the problem with that?
reply▲Huh? It's additional power draw in the midst of an energy transition. It's not currently being used differently. What do you mean what's the problem with that?
Also don't forget it's just one of three aspects I can think of off the top of my head. This isn't the only issue with LLMs...
Edit: while typing this reply, I remembered a fourth: I've seen many people object morally/ethically to the training method in terms of taking other people's work for free and replicating it. I don't know how I stand on that one myself yet (it is awfully similar to a human learning and replicating creatively, but clearly on an inhuman scale, so idk) but that's yet another possible reason not to want this
reply▲If people need additional power, they pay for it. If they want to pay for extra power, why would we gatekeep whether their need is legitimate or not?
reply▲Because of the aforementioned shortage. Paying for more power means coal and gas gets spun up since there aren't enough renewables, and the externalities aren't being paid for by those people
I'm also happy to have them pay for the full cleanup cost rather than discourage useless consumption, but somehow people don't seem to think crazy energy prices are a great idea either
Also you're still very focused on this one specific issue rather than looking at the bigger picture. Not sure if the conversation is going anywhere like this
reply▲What's the bigger picture? You said "power usage", "to what purpose?" (you kind of don't get a say in whether I use an LLM to reply to you, though you're free to stop talking to me), and "objections to the training method", which doesn't really seem relevant to the use case, but more of a general objection to LLMs.
reply▲Where do you draw the line? LLMs for searching, BM25 for searching, exact match only, no processes at all (forbid whatsapp search, make them scroll)?
reply▲Funny that people freak out about a local LLM while using Facebook products. They're probably the same types who use it to do their work.
reply▲> They're probably the same types who use it to do their work.
Citation needed.
It's a local LLM with access to an extraordinary amount of personal data. In the EU at least, that personal data is supposed to be handled with care. I don't see people freaking out, but simply pointing out the leap of handing it over to ANOTHER company.
reply▲Not all Meta products are alike. WA has E2E encryption, has had it for a long time. It's the same protocol as Signal: in fact, it was built for/in WA by Moxie/signal a while ago.
That doesn't make the metadata private. Meta can use that as they want. But not the contents, nor the images, not even in group chats (as opposed to Telegram, where group-chats aren't (weren't?) E2E encrypted).
What you say or send on WA is private. Meta cannot see that. Nor governments nor your ISP or your router. Only you and the person or people you sent it to can read that.
It's a d*ck move if they then publicize this. And, as others pointed out, illegal even in many jurisdictions: AFAIK, it is in my country.
reply▲Do you think it'd be okay if they used a local LLM, via Ollama, and this MCP server?
reply▲Personally, I would say that still reeks of being manipulative. I've received messages from a friend which were definitely LLM-generated; it made me like that person considerably less.
reply▲If they use the LLM to search ("when did X tell me about that party somewhere around Y's neighborhood") then I don't think there's any problem.
If they configure it to indicate a prefix, for instance when answering questions like "when are you free to hang out" and it responding "[AI] according to X's calendar and work schedule, they may be available on the following days" I might also consider that somewhat useful (I just wouldn't take it as something they actually said).
If they're using LLMs to reword themselves or because they're not really interested in conversing, that's a definite ick.
I would personally use such a system in a receive-only mode for adding things to calendars or searching. But I'd also stick to local LLMs, so the context window would probably be too small to get much out of it anyway.
reply▲This is actually something I am curious about: if, for example, I use this and I am streaming all my contacts' information and messages externally, surely I'm breaking privacy laws in some US states and certainly in the EU.
This seems sketchy to me.
reply▲It very much depends on the specifics around use cases, parties, and jurisdictions. In plenty of them, you're allowed to record and keep track of conversations you're taking part in, as is the other party, but publishing those on the internet would be illegal.
Processing them (like compressing them to mp3 files or storing them in cloud storage) is probably legal in most cases.
The potential problem with LLMs is that they use your input to train themselves.
As of right now, the legal status of AI is very much up in the air. It's looking like AI training will be exempt from things like copyright laws (because how else would you train an LLM without performing the biggest book piracy operation in history?), and if that happens things like personal rights may also fall to the AI training overlords.
I personally don't think using this is illegal. I'm pretty sure 100% of LinkedIn messages are being passed through AI already, as are all WhatsApp business accounts and any similar service. I suppose we'll have to wait for someone to get caught using these tools and the problem making it to a high enough court to form jurisprudence.
reply▲I hope you never contacted anyone with a business account then.
reply▲You should just assume that every single thing that you type into an electronic device made after the 90s gets piped into an LLM anyway.
reply▲Zuck is already piping it into much worse
reply▲I mean, the technology is not the issue. Someone can read your past conversations today and take diligent notes to unearth the same insights an LLM might, if they were so inclined.
This might actually be helpful for people with poor memory or neurodivergent minds, to help surface relevant context to continue their conversation.
Or sales people to help with their customer relationship management.
reply▲If 99% of your life is stored at one of the biggest advertising companies in the world, you already gave up on privacy anyway...
reply▲All messages are e2e encrypted though right?
reply▲Messages are. I do not believe that metadata about the messages is, however. So they know who you're speaking to, at what frequency, and from where.
reply▲Fair enough. That's not a problem to me but I can see it being an issue for people requiring complete anonymity. Are there any alternatives to WhatsApp that would fix this problem?
reply▲Ricochet Refresh, but it is missing crucial features.
Try Briar, I think it does not store metadata either?
reply▲Signal. But not because they cannot read this metadata (they can) but because they promise they don't.
reply▲My mom showed me her phone the other day: she had updated her WhatsApp app, and now the search bar has changed from "search your chats" to "search chats or ask Meta AI anything".
I've googled a bit but did not find an option to disable Meta AI, and also found no definitive answer on what Meta AI actually does. If I search for a chat, does it use this chat as context to provide answers? Does it run locally (I highly doubt that)? Is it only another interface to chat with an LLM?
It sure seems that this might be a stepping stone to piping WhatsApp messages into Meta AI and thus bypassing E2EE. Not sure if it's done already, but the line is getting quite thin.
reply▲WhatsApp has no Meta AI built in.
reply▲I think they are rolling it out; I don't have it, but a couple of friends do. I guess they are testing.
reply▲I have WhatsApp open on my phone right now. There is a blue-pink button on the home screen in the lower right corner. If you press it, it switches to a chat that says "Ask Meta AI anything".
reply▲What guarantee do I have that what moves between the soft keyboard and the message window is not intercepted? And the same goes for displayed messages.
It's a closed-source client. End-to-end encryption means nothing.
reply▲So this is a special case where two wrongs DO make a right when directed at a single victim?