For context, I'm currently working on a HIPAA-compliant app that uses AI to collect medical background info, then connects the user with an actual human doctor. To comply with HIPAA, the app code, infrastructure, and LLMs all need to be covered, using enterprise accounts with signed BAAs (Business Associate Agreements) that isolate PII and medical data. This prevents the medical data from being used as training data for the LLM.
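To make that isolation concrete, here's a minimal sketch assuming an Azure OpenAI deployment (one of the common HIPAA-eligible services where the vendor will sign a BAA); the endpoint, deployment name, and prompts are all hypothetical:

    import os
    from openai import AzureOpenAI  # pip install openai

    # Hypothetical: an Azure OpenAI resource covered by a signed BAA.
    # Prompts and completions sent to it aren't used to train the base models.
    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",
    )

    response = client.chat.completions.create(
        model="intake-gpt-4o",  # hypothetical deployment inside the BAA-covered resource
        messages=[
            {"role": "system", "content": "Collect the patient's medical background."},
            {"role": "user", "content": "I've had a persistent cough for two weeks."},
        ],
    )
    print(response.choices[0].message.content)

The isolation lives in the account and contract, not the code: the same API call against a consumer account would have none of those protections.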
HIPAA is not a foolproof system, but it's a crucial piece of the trust puzzle. I wouldn't trust an AI medical app without HIPAA compliance. Without it, the chance of data leaking out through the LLM or through a breach is too high.
Self-identified woman submits her dick pic.
AI: Massive tumor detected. Seek immediate medical help.
Delano should have said they’re going to provide that escape hatch with fast response times.
I wouldn't trust even a state-of-the-art "machine learning" classifier here, and the app described in the article certainly isn't state of the art.
But an image analysis application like this is exactly what the tech is good at.
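For what it's worth, the off-the-shelf version of that is only a few lines. A minimal sketch, assuming torchvision's pretrained ResNet-50 (a generic ImageNet classifier, nothing trained or validated on medical imagery) and a hypothetical input file:

    import torch
    from PIL import Image
    from torchvision.models import resnet50, ResNet50_Weights

    # Generic pretrained classifier; a real medical tool would need to be
    # fine-tuned and clinically validated on labeled medical imagery.
    weights = ResNet50_Weights.DEFAULT
    model = resnet50(weights=weights)
    model.eval()
    preprocess = weights.transforms()  # resize/crop/normalize to match training

    img = Image.open("photo.jpg").convert("RGB")  # hypothetical input image
    batch = preprocess(img).unsqueeze(0)

    with torch.no_grad():
        probs = model(batch).softmax(dim=1)

    top_prob, top_idx = probs.topk(1)
    print(weights.meta["categories"][top_idx.item()], float(top_prob))

The hard part isn't the pipeline, it's the training data and the validation, which is exactly where these companies deserve scrutiny.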
I also strongly agree with the skepticism regarding the companies selling these products. I would want this used alongside a doctor's diagnosis, as an additional tool.
But since visual analysis doesn't work for almost any STI, and the app is aimed at uneducated partners who believe a visual inspection will protect them, then no.