I gotta say I am continuously amazed how much Musk is allowed to get away with. I know he can get some things done and he is, apparently, a skilled manager, fundraiser, and BS'er of epic proportions, but I have a hard time understanding how all this hasn't caught up with him yet.
If you can’t walk into a pizzeria on a whim (thinking about sama this week), you no longer have your freedom.
The myth of the tortured life of the rich and famous is a joke.
Sure - but it’s a risk that people of the unknown variety don’t have.
I would much rather have lived in the world where he could walk into a pizza place without security and fear. But I also would much rather have lived in the world where people had healthcare and they didn't have to fight companies like UHC tooth and nail to avoid getting their claims denied at every turn.
And the stability of this all comes from trust in Government regulation. Which he gutted, when he gutted the programs that were targeting him for committing fraud.
I’m not sure I would.
I think I’m okay with there being some remaining consequences for things you do to other people,
even if what you’ve done is legal.
> I think I’m okay with there being some remaining consequences for things you do to other people,
I would rather live in the world where they hadn't done those things to other people. Not that I would rather people forgive them and let them into a pizza shop. It's a pizza shop, who cares, they have a million other ways to get pizza if they want it. Ostracism is the last response we have for the wealthy doing unacceptable things.
(Having the money would be nice, of course.)
so you feel like less of a pariah?
you’re still a social outcast,
no matter how much expensive Truman Show you set up around you.
but I like to serve humans, not serve humans.
The elder generation of politicians he manipulated were educated in their day in historical allegory and gospel
Leadership is ignorant and clueless. They have no idea how to "check the work," so to speak. It didn't matter to them: trickle-down made them rich. They ultimately began to encourage it.
Congress is predominantly nihilists who pretend to believe in American norms to secure power
I don't like it either, but it's not capitalism. Nothing Musk has done would have been possible without active support from governments putting their fingers on the scale.
Class actions in the Netherlands mostly favor lawyers.
I think this is a calculation to determine whether an upgrade from HW3 to HW4 actually solves the problem, or whether HW3 must instead be upgraded to HW5.
One upgrade is more economical than two, but I would be annoyed for sure as well.
HW5 is unlikely to solve FSD.
It's run by the person mentioned in the article, and unsurprisingly the domain is Dutch, but seems the same thing will apply in lots of countries if FSD rolls out there too, not just Netherlands.
https://law.justia.com/cases/federal/district-courts/califor...
On page 16, the judge states that the defendants, Tesla, argued: “Defendants also assert that several Safety Statements are corporate puffery. For example, statements that safety is “paramount” (FAC ¶ 325), Tesla cars are “absurdly safe” (id.), autopilot is “superhuman” (FAC ¶ 337), and “we want to get to as close to perfection as possible” (FAC ¶ 363). Mot. at 19.”
In case you forgot you replied to "It worked for fox news [claiming opinion]" with "And MSNBC in court."
Since you clearly misread or purposely misconstrued my statement, let me rephrase:
"Thank Fox for paving the way for inserting ones opinion in news. Did you not know that Fox had to do it in court first?"
After paying the full cost, they're stuck on old software in a car that was promised to have the hardware required for it.
People don't talk about these cars driving themselves enough imho
Then, on the way home it drove me home on the wrong side of the street and I had to take over. Such a silly mistake.
Similar to what you said; from there on out, it was more trouble than it's worth because you can't let your guard down.
FWIW: My 2026 Hyundai's driver assistance is better than my old 2018 Tesla Model 3's enhanced autopilot.
I find it less cognitive load to drive it myself. It's easier to predict what other vehicles will do than what my own will do. Boo.
I have sympathy with the challenges as I worked in the field.
It always had the feeling of being outside with your toddler by the pool. I can look away, but if I do it for too long I have 50/50 odds of a dead toddler.
Adaptive cruise control, lane keeping, blind spot detection and emergency braking are all the modern automation I want in a personal vehicle at this point. Other drivers are unpredictable, I want to choose how I respond to their various forms of idiocy and not delegate to a black box.
Their role is to stop the train in an emergency and adjust to speed etc. to track/driving conditions.
Automating their job probably wouldn't even need the complex ML used for self-driving, because the context is significantly simpler and relatively well defined. Maybe a tram in a city might need such a model, but it would still be a significantly simpler task than driving a car.
I was doing laundry in my basement, and I tripped over a metal bar that wasn't there the moment before. I looked down: "Rail? WTF?" and then I saw concrete sleepers underneath and heard the rumbling. Deafening railroad horn. I dumped my wife's pants, unfolded, and dove behind the water heater. It was a double-stacked Z train, headed east towards the fast single track of the BNSF Emporia Sub (Flint Hills). Majestic as hell: 75 mph, 6 units, distributed power: 4 ES44DC's pulling, and 2 Dash-9's pushing, all in run 8. Whole house smelled like diesel for a couple of hours!
Fact is, there is no way to discern which path a train will take, so you really have to be watchful. If only there were some way of knowing the routes trains travel; maybe some sort of marks on the ground, like twin iron bars running along the paths trains take. You could look for trains when you encounter the iron bars on the ground, and avoid these sorts of collisions. But such a measure would be extremely expensive. And how would one enforce a rule keeping the trains on those paths?
A big hole in homeland security is railway engineer screening and hijacking prevention. There is nothing to stop a rogue engineer, or an ISIS terrorist, from driving a train into the Pentagon, the White House or the Statue of Liberty, and our government has done fuck-all to prevent it.
Triggering psychosis is not difficult and the LLM is easily capable of doing that. For a person they soon get freaked out and are likely to summon help. "Johnny started acting crazy and I'm not sure what to do, please come". But the LLM isn't a person, Johnny needs to know more about the CIA's programme to cross breed Venusians with Hollywood stars? Here's an itinerary with the address of a real hotel in LA and an entirely hallucinated CIA officer's schedule.
Next thing you know, Johnny is shot dead by officers responding to a maniac with a fire axe who broke into an LA hotel and was screaming about space aliens.
I’m pretty sure LAPD is too used to this sort of thing to get spooked by it?
Compared to either one of them, FSD is way less stressful and is a vastly better driver.
I do not want to be the 'manager of my car'. That'd be a downgrade from being an actual driver.
Lane assist, auto stop-start, and cruise control are enough for me; they've mostly been available for decades and require a similar amount of attention.
FSD is a busted flush and I can't believe those who got conned by it aren't more vocal.
In 2024, Donald Trump received 77,302,580 votes.
Do you have a point?
If you’re driving, your brain can automatically prioritize the importance of things that you see. But since a computer fails in different ways than a human, you lose all automatic prioritization
A "self-driving" tesla is an adversary you need to supervise to make sure it doesn't take actions you wouldn't expect of a normal car.
As other posters have pointed out, it's like running an LLM with `--dangerously-skip-permissions`: I wouldn't `rm -rf /` my computer (or in the case of tesla, my life), but an AI might.
One such study is "Performance consequences of automation-induced 'complacency'" (Parasuraman, Molloy & Singh, 1993) https://www.pacdeff.com/pdfs/Automation%20Induced%20Complace...
Previous studies had found that a human and a computer performed markedly better than either a human alone or a computer alone - but in those studies failures were quite common, so they didn't give the humans time to get bored or distracted.
When researchers got test subjects to perform a simulated flying task, monitoring a system with 99%+ reliability, they found the humans were proportionally much worse at stepping in than they were on less reliable systems.
Swimming pool lifeguards will often change posts every 15-20 minutes and get a 10-15 minute break every hour, to keep things interesting enough that they can pay attention. Good luck getting drivers to do that.
Funny, I was going to mention exactly that. I'm a private pilot with a modern autopilot and flying is exhausting. Partly because the piston engine is rattling your brain the entire time but also because you're on high alert the entire time. You're always making sure the autopilot is keeping the plane on the blue (or green) line and is being predictable. And my smartwatch shows my heart rate is usually more elevated on autopilot than not.
Therefore, you have to be 100% ready at all times to react in case anything that's possible happens.
Sounds way more tiring than just driving yourself and only having to account for the known, relatively easy to model human failure modes.
Have you ever been a passenger of an unpredictable driver? Was that stressful? Now, add not just the capacity but the responsibility to fix their mistakes.
https://www.faistgroup.com/site/assets/files/1657/j3016-leve...
While FSD's manipulation of controls is impressive -- it is missing a very critical component that is required for self driving: the ability to guarantee whether or not it can make a safe decision. Tesla's FSD still offloads this task to the human driver. Once they can do this more than zero percent of the time, they will have achieved level 3.
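The J3016 distinction being invoked here, namely who is responsible for the fallback when the system can't handle a situation, can be summarized in a short sketch (my paraphrase of the standard's levels, not its normative text; the helper function is hypothetical):

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Paraphrase of the SAE J3016 driving-automation levels."""
    NO_AUTOMATION = 0      # human does everything
    DRIVER_ASSISTANCE = 1  # steering OR speed assist; human supervises
    PARTIAL = 2            # steering AND speed assist; human supervises constantly
    CONDITIONAL = 3        # system drives within its domain; human is the
                           # fallback only when the system requests takeover
    HIGH = 4               # system handles its own fallback within its domain
    FULL = 5               # system drives everywhere, no human fallback

def human_must_supervise(level: SAELevel) -> bool:
    # At levels 0-2 the human must monitor at all times; from level 3 up,
    # the system itself decides when to hand control back.
    return level <= SAELevel.PARTIAL

print(human_must_supervise(SAELevel.PARTIAL))      # True
print(human_must_supervise(SAELevel.CONDITIONAL))  # False
```

On this framing, FSD's requirement that the driver catch its mistakes at any moment is exactly what keeps it at level 2.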
It’s because driving on the freeway isn’t FSD, it’s a better version of cruise control, and other companies also offer similar capabilities. Within a city, the thing is a shitshow. It does random things all the time and it’s almost a larger cognitive burden on me to constantly be on the lookout for it to make mistake where I have to take over vs me just driving the car myself. For me specifically, it’s just impossible to drive because it fails to recognize curved streets and a couple of other irregularities just within blocks of where I live.
On a freeway it’s only kind of usable. It switches lanes far too aggressively and for no reason, to the point that it makes the ride uncomfortable.
What I really want is auto steer with lane switching when I signal, which for some reason I could never get working in any mode. It either doesn’t change lanes at all, or changes them arbitrarily of its own volition. And if I change lanes manually it turns off autosteer, which is too irritating to use in practice.
Tesla self driving, in any mode, is a bad product. And I say this as a Tesla fan.
FSD is amazing. Any notion it takes more effort to use it than driving is made up.
On the other hand, when I got FSD trials in the model 3 in the last year or so, it never managed to get more than ~a mile without me having to disengage.
Other brands have had self driving features for years now. Some even operate at a higher level of automation.
And that was actual hands-free, while Teslas at the time required you to keep putting torque on the wheel to lie to the system.
Even then, my 2017 Hyundai did practically everything but steer. Get it on the highway, turn on ACC, and it'll handle the traffic; you just keep it in the lane. It even handled all the stop-and-go traffic.
Robotaxi either has significant differences from the vehicles they sell or they are operating against highway safety standards. (Which could be the case because NHTSA is currently investigating this)
1. https://www.tesla.com/customer-stories/cross-country-trip-fu...
That’s not at all true. Very few people take the time to drive thousands of miles in a short period of time and document it while never intervening for any reason (not even accidentally bumping the wheel).
Anyways, I remember you. You claimed that Tesla would never remove safety drivers from their robotaxis.[1] I tried to get us to bet on this prediction but you never replied.
Well, Tesla has removed safety drivers from some of their robotaxis, meaning your prediction that their technology is “never going to be reliable enough” was falsified within a few months.
"Tesla has removed safety drivers from some of their robotaxis"
Why doesn't Tesla have as many empty Robotaxis as Waymo has cars? Because Robotaxis aren't good enough. The entire Robotaxi rollout is a very carefully choreographed smoke-and-mirrors show to inflate Tesla's share price. Tesla doesn't even have the permits to operate Robotaxis in California, and probably never will, because the rules there are actually stricter than in Texas.
I'd like to note that in just a few months, the goalposts have moved from "Tesla will never get rid of their safety drivers in their Robotaxis" to "Tesla will never operate Robotaxis in California."
As before, would you like to bet on your prediction? I'm willing to wager you any amount up to $1,000 that before the end of 2028, Tesla will have Robotaxis in California, available to the public, without a safety driver. Note that this may not require a permit for deployment, just driverless testing, as that is how Waymo currently operates.[1]
December 31st, 2028 may seem like a long ways off, but it's much sooner than never.
1. https://www.dmv.ca.gov/portal/vehicle-industry-services/auto...
They didn't though, they just moved them to a chase car. Because Elon Musk is an incredibly successful con artist. Why isn't every robotaxi driverless and open to the public like Waymo is? Tesla is so far behind Waymo it is laughable.
https://www.reuters.com/business/autos-transportation/musk-t...
Tesla touts California robotaxis but does nothing to get permits
Documenting driverless testing miles is critical for a series of permits
Tesla logged zero miles last year in California for the sixth straight year
Waymo documented 13 million miles over a decade before securing driverless ride-hailing permit
Will you take the bet or not?
1. https://www.reddit.com/r/SelfDrivingCars/comments/1qqlpgg/un...
2. https://waymo.com/blog/2025/12/autonomously-navigating-the-r...
Luckily that's not a requirement here. If your 1 example, your 1 data point, isn't a fluke, then someone driving 100 miles should have 1/10th the interventions of someone driving 1000 miles.
If you have to intervene 0 times in 1000 miles, and that isn't an outlier, then we should expect to see 0 interventions required in all 100-mile drives, all 10-mile drives, and all 1-mile drives. Unless it's a fluke.
Statistics > 1 anecdotal post about a car on the car company CEO's website.
You can actually do the math to find out the sample size you would need to derive a statistically significant conclusion. No need to be incredulous at my answer to your question: it's basic statistics.
As an aside: in general, if you find yourself going to a single person for data and expertise, you might be falling victim to a personality cult.
Nowhere in this thread am I claiming that FSD is safe enough to be used without human oversight. I'm also not claiming that Tesla has delivered on their promises (they obviously haven't). I am comparing the capabilities of autonomous systems using evidence that is publicly available. Since we don't have comparable statistics across manufacturers or some sort of standardized road test, I think "miles between human interventions" is a useful measure of autonomous capabilities. That's what has been used to demonstrate safety of other autonomous systems, and I see no good reason why such a metric should be ignored in this case.
We don't have data on the intervention rate of Teslas, either. So far all you've presented is a single, non-statistically-significant anecdote.
>I think "miles between human interventions" is a useful measure of autonomous capabilities.
Exactly. What is Tesla's average miles between human interventions? The anecdote you presented is maybe 1/1000th of the minimum data you would need (from randomly selected participants) to answer that question. 1,000 is a bold claim; it requires statistically significant evidence.
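The "basic statistics" being argued about here can be sketched with the rule of three: observing zero interventions in n miles only lets you bound the per-mile intervention rate from above, at roughly 3/n for 95% confidence. A minimal worked example (hypothetical mileage, not real Tesla data):

```python
def zero_event_upper_bound(n_trials: int, confidence: float = 0.95) -> float:
    """Exact upper confidence bound on the per-trial event rate when 0
    events are observed in n_trials independent trials.
    Solves (1 - p)^n = 1 - confidence for p."""
    return 1.0 - (1.0 - confidence) ** (1.0 / n_trials)

# One anecdotal road trip: 0 interventions in ~3,000 miles.
p_hi = zero_event_upper_bound(3000)
print(f"95% upper bound: {p_hi:.6f} interventions per mile")
print(f"rule-of-three approximation: {3 / 3000:.6f}")
```

Even taking the anecdote at face value, this only rules out intervention rates worse than about one per thousand miles, and says nothing about the average across drivers, routes, and conditions, which is the statistic the thread is actually arguing over.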
1. https://www.thedrive.com/opinion/40604/five-things-my-roomba...
2. https://www.thedrive.com/news/a-tesla-actually-drove-itself-...
Totally fully self driving even though you need not one, not two, but three autonomous driving experts with you. And be sure to have a second car with you when your first autonomous vehicle strands you. Sure sounds like a reliable system ready for the masses to use on public roadways!
I’m not making any claims about FSD’s safety or how ready it is for mass usage on public roads. I am trying to figure out what information would convince you that someone has used FSD for thousands of miles without intervening. Does this count or not? If not, why?
I never doubted it, I just said I don't trust things Tesla states on their website (they're objectively known to lie, especially when it comes to videos about their self driving) and I don't trust randos on Twitter.
I will say though, the people in the article have a vested interest in pushing a pro-AV agenda. But in the end, sure, I guess they probably did have that trip they say they did.
It doesn't surprise me people managed to go thousands of miles without disengaging, especially since it sounds like this isn't their first time trying (flip a coin enough and you'll get heads several times in a row, after all), and that's nearly all highway miles. I've personally driven many stints on a non-Tesla, well over 150 miles hands-free without any disengagements, on a system that attempts less than what Tesla does. The only disengagements on most of those drives were to exit the highway to charge. You pick a route with easy-to-reach chargers, you don't venture off the highways much, and sure, it sounds possible to me. In the end, though, I don't personally see it as that radical a difference on a road trip. On a nearly 300-mile drive, I probably directly operated the car for maybe 5 of those miles total. Is risking people's lives on the surface-street parts with beta software worth that last little bit?
Note, that's several thousand miles of no disengagements on a long, pre-planned cross country drive. Not 10,000 miles of driving around in a city and having all the other randomness of life peppered in. So what are we really measuring here? I'm sure we could get it to 500,000mi or more on a closed course if we wanted to. Although, after saying that, they still haven't on the Las Vegas Loop, so...maybe not?
And people act like this is delivering what Elon promised about cross-country autonomous driving. But it's not. They still needed the driver there in the car, paying attention the whole way. They still needed to charge it themselves. So we're a decade late and we still don't actually have what was promised.
Regarding the rest of your comment: Again, nowhere in this thread am I making any claims about FSD’s safety or how ready it is for mass usage on public roads. I am not saying it lives up to the promises or that it has been delivered on schedule. You are making arguments against beliefs I do not hold. That is a waste of time for both of us.
The point I am making is that other brands have zero examples of consumers using them on public roads without intervention for thousands of consecutive miles, so claiming they are equivalent or better in capability to FSD is not accurate.
I called you crazy for citing a website known for publishing lies.
> only to be gaslit about the need to provide it
I never asked for you to go digging. Your own personal drive to stan for Tesla did that.
> claiming they are equivalent or better in capability to FSD is not accurate.
I'm not saying they're 100% exactly the same. I'm just saying spending 5 miles actually operating the car on a several hundred mile road trip is pretty much the same as not touching the wheel at all to me if I'm still ultimately responsible for the operation of the car. In the end I had to completely pay attention the whole way. Until it's actually (Unsupervised) it's really mostly the same to me. Especially since I can't really trust the not-full not-self driving to not get into an accident or kill someone with me as the one responsible.
In the end they're both level 2 systems.
Have Tesla and its fanboys overstated FSD’s capabilities? Absolutely. But I’m not saying that FSD is currently good enough that one should expect to have thousands of miles between interventions. I’m trying to convince someone that it has been done. The reason I’m trying to do this is because that same cannot be said for any other self-driving technology available in a consumer vehicle today, so claiming that FSD is no better than competing offerings is not accurate. FSD overhyped? Sure. Late? Extremely. Fraudulent, bordering on criminal? I could see that. But it’s still in a league of its own in terms of what it can do.
I heard the same thing in 2019, HW3 solved all the issues, it finally just works as advertised. That was after HW2 was guaranteed to ship with all the hardware needed for FSD a decade ago, for real this time.
I'll probably wait for HW5; then you'll tell me it's really there. This time it won't even run people over, and it'll actually stop at stop signs more than just 98% of the time.
Personally I try and avoid systems that drive people in front of trains. https://www.youtube.com/watch?v=vMqTmOTtft4
You realize that a cross-country trip makes that achievement weaker, not stronger, right? That's just a bunch of highway driving, which is the easiest to automate and will have you racking up a lot of miles quickly.
City driving is the real test, not driving a milion miles in a straight line.
I will give car makers the benefit of the doubt: it is difficult to simulate real traffic. You can't do real life tests with teenagers on bicycles.
But like, actually the reason is that the cars consistently make dangerous decisions when they're not being used as "glorified cruise control", and the sensors mean they only even get to do that during "perfect driving conditions"
If I can't go to sleep lying down on the seat as a sole occupant, it's not yet self driving.
What's worse is this is all going to end up happening again when HW5 comes out and all of the HW4 cars start getting a trimmed down version of the FSD software from HW5, like HW3 is currently receiving.
I thought the driver was supposed to keep their hands on the wheel in case something goes wrong.
That's why Tesla fans buy those weighted gizmos to fool the computer into thinking they're still holding the steering wheel.
Also, the EU adopted laws restricting self-driving behavior, making FSD far less capable there. For example, the software cannot exert a lateral acceleration of more than 3 m/s². It must also cancel a lane change if it isn't completed within 5 seconds of engaging the turn signal. Tesla gimped their self-driving features in the EU & Australia because of this.[1]
It’s only the latest version of FSD (which only runs on HW4) that lacks these restrictions and has been approved for use in the Netherlands. Even then, it requires you to pay attention to the road, so it's not what he paid for.
1. https://electrek.co/2019/05/17/tesla-nerfs-autopilot-europe-...
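A supervisory check enforcing the two limits cited above (3 m/s² lateral acceleration, lane change abandoned 5 seconds after signal start) could look roughly like this. This is a minimal illustrative sketch with hypothetical names and structure, not Tesla's or any regulator's actual code:

```python
from dataclasses import dataclass

MAX_LATERAL_ACCEL = 3.0    # m/s^2, per the EU restriction cited above
LANE_CHANGE_TIMEOUT = 5.0  # seconds after the turn signal engages

@dataclass
class ManeuverState:
    lateral_accel: float    # commanded lateral acceleration, m/s^2
    signal_elapsed: float   # seconds since the turn signal was engaged
    lane_change_done: bool  # whether the lane change has completed

def maneuver_permitted(s: ManeuverState) -> bool:
    """Return False if the commanded maneuver violates either limit."""
    if abs(s.lateral_accel) > MAX_LATERAL_ACCEL:
        return False
    if not s.lane_change_done and s.signal_elapsed > LANE_CHANGE_TIMEOUT:
        return False  # lane change timed out: must be cancelled
    return True

print(maneuver_permitted(ManeuverState(2.0, 3.0, False)))  # True
print(maneuver_permitted(ManeuverState(2.0, 6.0, False)))  # False: timed out
```

The point is that these are hard caps the software must obey regardless of what the planner wants, which is why the EU builds felt so "nerfed" compared to the US ones.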
The math doesn't work out. It should be €20,000,000 in FSD purchases, no?
The fact that FSD in Europe has been massively delayed is mostly the fault of regulators.
Sad but true.
You can use FSD with HW3 in other countries like Canada.