How Equifax Was Breached in 2017
257 points | 1 year ago | 19 comments | blog.0x7d0.dev | HN
hn_throwaway_99
1 year ago
[-]
I really appreciate detailed breach reports like this. This was the money quote for me:

> The attackers continued their search and eventually discovered a mounted NFS share on the web server. This file share contained notes and configuration files used by Equifax engineers, in which they found many database credentials.

Seriously, WTF? I get paranoid all the time worrying about my application security - it often feels like there is always some potential issue around the corner you don't know about.

But then I read about how lots of these kinds of breaches occur (storing prod DB credentials in plaintext on an NFS share, reusing passwords and not using 2FA, leaving your server password as "solarwinds123", etc.) and I think maybe I'm not so bad after all.

reply
keyle
1 year ago
[-]
I'm totally with you on this one; but remember, if you have 25 developers in groups of 5, it only takes 1 muppet in any of the 5 groups to have low standards, and voila.

I've seen it, in pretty much every large business I've worked in.

This goes back to the saying: "you should never hire someone less good than yourself".

Sadly when the people hiring literally come from sales or airline customer service, your company is boned. It's only a matter of time.

reply
hn_throwaway_99
1 year ago
[-]
I agree with all that, with the one small caveat that more than anything else I think what is most important about security is a strong security culture at a company. All the checklists and compliance frameworks in the world are doomed in the face of a poor security culture. On the flip side, a strong and constantly reinforced security culture can help protect against the occasional muppet.

One example: years ago I started work at a tech company (a fintech no less), and shortly after starting I asked the head of customer service how I could get an account to access an internal admin portal (I was an engineer and needed to understand some of the ops processes). "Oh, you just log in with my account, and the password is <CompanyName><Year> - all the reps just use that shared account." I got an immediate, sinking feeling of despair.

reply
foota
1 year ago
[-]
Better than culture is enforced guarantees: nobody can store the database password on an NFS share if it's never made available to them.
reply
hn_throwaway_99
1 year ago
[-]
I honestly believe that enforced guarantees come about through security culture though.

Meaning, a strong security culture means you do appropriate secrets management and, importantly, everyone understands how secrets management should be done. That way, if you have the occasional breach in your automated enforced guarantees (e.g. the article talks about how Equifax missed one of their vulnerable systems when patching), people who see a problem will speak up.

That is, I agree with enforcing guarantees as much as possible, but any engineer on that team who came across an NFS file with DB credentials should have spoken up loudly about "Why TF are these DB credentials present on a network drive?"
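To make the "no creds on disk" idea concrete, here's a minimal sketch: credentials get injected into the process environment by the deploy tooling (e.g. a secrets manager), and the app fails fast if they're missing. The variable names are hypothetical, not Equifax's:

```python
import os

def load_db_credentials():
    """Fail fast if DB credentials weren't injected at runtime.

    Assumes deploy tooling (e.g. a secrets manager) populates these
    hypothetical environment variables; nothing is read from disk,
    so there's nothing to leave lying around on an NFS share.
    """
    creds = {}
    for key in ("DB_USER", "DB_PASSWORD"):
        value = os.environ.get(key)
        if not value:
            raise RuntimeError(f"missing required secret: {key}")
        creds[key] = value
    return creds
```

The fail-fast part is the point: a misconfigured deploy dies loudly at startup instead of tempting someone to paste a password into a config file.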

reply
mcny
1 year ago
[-]
I think you will love the way Microsoft handles this. Basically, there are automated flags that ding managers (M1-level, I believe) so they'll make sure the people on their teams handle these.

> any engineer on that team who came across an NFS file with DB credentials should have spoken up loudly about "Why TF are these DB credentials present on a network drive?"

This requires empowering your employees, and lowercase-a agile with its cross-functional teams, which most managers hate.

reply
eli
1 year ago
[-]
If you take away NFS shares without providing a better way to store and manage access controls, engineers will eventually just come up with an even worse solution.

I'm skeptical you can even fix this without a culture change, but you definitely can't do it just by taking things away.

reply
foota
1 year ago
[-]
Yes, I agree. I realize this isn't feasible everywhere, but having access tied to a user account (and then auditing and limiting that access) can serve as a replacement. E.g., want to select a single row? Fine, but if they're dumping the db, something is phishy.

Ironically, user accounts are in one sense more secure (than a system account with a shared password) because they can use 2fa (and there's no inherent need to distribute the password).
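A toy version of that "dumping the db is phishy" check, assuming a hypothetical in-memory audit log of (user, rows_returned) pairs; a real pipeline would stream these from the database's audit facility:

```python
from collections import Counter

def flag_bulk_readers(audit_log, row_threshold=10_000):
    """Return the set of users whose total rows read exceed a threshold.

    A single-row lookup is normal; a full-table dump is not. This is
    only a toy anomaly check over an in-memory log of
    (user, rows_returned) tuples, not a real auditing system.
    """
    totals = Counter()
    for user, rows in audit_log:
        totals[user] += rows
    return {user for user, total in totals.items() if total > row_threshold}
```

Per-user attribution is exactly what a shared system account loses: with one shared login, the anomaly is invisible in the aggregate.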

reply
mysterydip
1 year ago
[-]
All it takes is outsourcing one portion to the lowest bidder, and you get what you pay for.
reply
rawgabbit
1 year ago
[-]
What is egregious is that Equifax isn't some random company. Equifax is a credit reporting agency and is a goldmine for identity thieves.

After the Equifax breach, I just assume now that if an identity thief gives a half-assed effort, he can pull up the PII of any American resident. How Experian/Equifax/TransUnion can honestly say they have accurate data without physically verifying driver licenses, identity cards, or passports is beyond me.

reply
coldpie
1 year ago
[-]
> This goes back to the saying: "you should never hire someone less good than yourself".

It's a good idea, but now your product is twice as expensive as the other guy's, you're not going to win any bids that way, and you're out of business.

reply
MilStdJunkie
1 year ago
[-]
It feels like this wouldn't be all that hard to keep out of deployment with a githook and a more-or-less simple regex.

I did something very similar for a publishing system where, long story short, the equivalent of local variables[1] was verboten in the production repo. On the other hand, the writers were constantly asking me to override the precommits, so, well, there's the muppet argument.

[1] https://docs.asciidoctor.org/asciidoc/latest/attributes/cust... Basically, declaring an attribute in an include target. Variable scope is one of those things getting debated in the Asciidoc world these days. Ha ha ha welcome to the 1997 SGML technical steering group, suckas. You're discovering why HTML doesn't have transclusion.
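For what it's worth, the hook itself really is only a few lines. A toy sketch (the patterns and function names are mine; real scanners like gitleaks or detect-secrets ship far more thorough rule sets):

```python
import re

# Toy patterns for common credential shapes. Deliberately simplistic:
# a real rule set covers far more formats and entropy checks.
SECRET_PATTERNS = [
    re.compile(r"(?i)(password|passwd|secret)\s*[:=]\s*['\"][^'\"]+['\"]"),
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key ID shape
]

def scan_text(text):
    """Return 1-based line numbers that look like hardcoded secrets."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        if any(pattern.search(line) for pattern in SECRET_PATTERNS):
            hits.append(lineno)
    return hits

def scan_files(paths):
    """Return {path: [line numbers]} for files with findings.

    Wire this into .git/hooks/pre-commit and reject the commit
    whenever the result is non-empty.
    """
    findings = {}
    for path in paths:
        with open(path, encoding="utf-8", errors="ignore") as f:
            hits = scan_text(f.read())
        if hits:
            findings[path] = hits
    return findings
```

The regex approach has false negatives, of course, which is the muppet argument again: it catches the careless, not the determined.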

reply
totallywrong
1 year ago
[-]
I work mostly on Infra / DevOps. I'm constantly amazed by the absurdly low standards most devs have when it comes to security. The path of least resistance is chosen 99% of time. You're definitely the exception.
reply
mejutoco
1 year ago
[-]
IMO often this just reflects the priorities of the organization.

While it is the responsibility of devs, some system needs to be put in place to actually enforce it. Like, do not have an nfs shared volume, or incentivise anybody to report these, and give incentives for it. Otherwise "just be very careful" advice slows development to a halt.

reply
totallywrong
1 year ago
[-]
I only half agree. Of course the company needs to have processes in place. But shared storage can serve a multitude of legit purposes, and I feel like some alarm should go off in a developer's mind when he thinks of just dumping secrets there. Or in the mind of the next one who comes along and uses them.

Or you might not have shared storage, and then they'll just put the creds in a Google spreadsheet like I've seen very recently.

reply
qingcharles
1 year ago
[-]
I rooted a major web hosting provider back in the early 2000s by uploading a web shell as a profile pic (they weren't checking pictures were image files).

Once I was in it didn't take long from rummaging around in the files to first find the database credentials in a config file, then eventually finding the root password to their servers, which in fact was simply "internet" o_O

I was a nice guy so I sent them an email with their passwords and told them they might want to upgrade their forum software.
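The check the host was missing is small. A minimal magic-byte sniff in Python (a hypothetical helper; on its own it won't stop polyglot files, so real systems should also re-encode uploads server-side):

```python
# Minimal magic-byte check: reject "profile pics" that aren't images.
# Magic bytes alone won't stop polyglots, so production systems
# should additionally re-encode the image server-side.
MAGIC = {
    b"\xff\xd8\xff": "jpeg",
    b"\x89PNG\r\n\x1a\n": "png",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}

def sniff_image(data: bytes):
    """Return the detected image type, or None for anything else."""
    for magic, kind in MAGIC.items():
        if data.startswith(magic):
            return kind
    return None
```

Equally important back then: serving the uploads from a path the web server would never execute, which is what actually turned the "profile pic" into a shell.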

reply
ahhfgshando6698
1 year ago
[-]
It's NFS, so I always assume corporate acceptance of Microsoft is the root cause. Old companies like this are often incredibly insecure, due primarily to the wildly irresponsible practice of having everything on Active Directory and network drives in general.

Once you've got a whole company that relies on dumping stuff on a network drive you're pretty much fucked, it's very difficult to get non-technical users to switch to SFTP. It's like pulling teeth.

reply
dwd
1 year ago
[-]
Not mentioned here was that the group that exploited the vulnerability handed over to PLA linked individuals who then conducted the exfiltration.

https://www.justice.gov/opa/pr/chinese-military-personnel-ch...

As far as I am aware the data has never been seen on the open market, so there's a whole other National Security story around whether the information was used to compromise individuals with credit issues for commercial and military espionage purposes. It would seem that this was known very early on and possibly factored into the settlement.

reply
bbarnett
1 year ago
[-]
Also mis-mentioned: I heard that nothing was "missed", but rather that security upgrades were not possible due to the age of the stack.

0-days are one thing. But leaving systems unpatched for months, because your stack is too old, is a common but inexcusable theme.

This is why it is vital to use libraries, frameworks, with a stable, unchanging LTS branch. Failure to do so, means a security update that needs to be applied instantly, cannot be done, without extensive app changes.

New shiny is fine. But it must never, ever override basic security concerns.

Security comes first. Not last. Always.

reply
tivert
1 year ago
[-]
> This is why it is vital to use libraries, frameworks, with a stable, unchanging LTS branch. Failure to do so, means a security update that needs to be applied instantly, cannot be done, without extensive app changes.

It's also another reason why it's important to provide such things.

It's amazing to me how many people seriously argue it's fine to aggressively drop support for old versions and old features to focus on the newest stuff (and that it's totally fine for the table stakes of "having software" to be engineers continuously working to keep up with changing dependencies).

The reality is the cheapest thing for society is to offer very long term support for old versions, even if it's just security patches, or well-tested backwards-compatibly features in newer versions. It's not sexy work, but it's important.

reply
bbarnett
1 year ago
[-]
Quite true.

But such things do exist, you just have to vet things first.

For example, stick to a non-rolling distro, such as Debian stable. Everything there will have around 3 years of support, with all the security updates done for you.

Debian backports almost all security patches, or sticks with an LTS variant of something (like php) for its lifetime.

No surprise API changes, no sudden need for code changes.

So many people use the latest shiny, and literally only because they're told to. Many need nothing from that bleeding edge version.

When it comes to frameworks, some have LTS versions, stick with those.

And things like node? Heh.

reply
rmbyrro
1 year ago
[-]
You're certainly right.

But in this case, no LTS would have covered it, since the system was decades old.

The issue was that they had a poorly maintained service, hugely outdated, which is hard to secure, mingled with their main up-to-date stack.

Lesson: isolate the bad lemons from the good ones.

reply
bbarnett
1 year ago
[-]
What's happening, is companies are doing a risk assessment, and thinking "well, it probably won't be hacked, we don't need to replace this with something that is auditable, and secure".

That needs to 100% end. There are also cases where companies think, "Well, it will take us 3 weeks to update this stack, so we'll leave the old, vulnerable code online for those 3 weeks, plus testing, and plus (of course) push, so 2 months, even though this is a very easily exploitable, high profile CVE".

That too needs to end.

The only way that can end, is if fines are WELL beyond any possible savings, including being 100s of times more than those savings, so that companies will TREMBLE IN FEAR at the very idea of leaving unpatched servers online. Your stack will take 2 months + testing to upgrade?

Because you chose a stack without an easy way to upgrade instantly?

Then you take your stack offline, and tough if it bankrupts you.

Because otherwise, we'll fine the company dry, and its directors, and jail the CTO, and the employees who knew.

reply
dwd
1 year ago
[-]
Been a few years since I read it, but worth a look due to the detail it goes into.

https://www.hsgac.senate.gov/wp-content/uploads/imo/media/do...

reply
hnthrowaway0315
1 year ago
[-]
"They routed traffic through approximately 34 servers located in nearly 20 countries to obfuscate their true location, used encrypted communication channels within Equifax’s network to blend in with normal network activity, and deleted compressed files and wiped log files on a daily basis in an effort to eliminate records of their activity."

I wonder how they managed to figure that out. Did they have to look into each of the servers?

How did they get the names?

reply
bbarnett
1 year ago
[-]
They had months to work at it. Due to Equifax's incompetence.

It should be noted, the "official" report is what investigators have been told, not what really happened behind the scenes. Naturally Equifax and its employees tried to play the poor, innocent, helpless corp, with those dastardly hackers almost mysteriously getting in.

reply
hnthrowaway0315
1 year ago
[-]
Thanks, might be my misunderstanding, but I was trying to figure out how the investigators managed to work it out. It feels like the only sure way is a leak from China, but theoretically they could also track all those servers.

The whole attack/investigation is super fascinating.

reply
g051051
1 year ago
[-]
The initial vulnerability was in the "modern" web application that fronted the legacy mainframe application. It was completely patchable, but was missed as described in the article.
reply
bbarnett
1 year ago
[-]
A little bird told me otherwise.
reply
TheHappyOddish
1 year ago
[-]
What an odd thing to do. I assume the US has no way to enforce this, so it's effectively just... a press release?

I also found this funny:

> “Today, we hold PLA hackers accountable for their criminal actions, and we remind the Chinese government that we have the capability to remove the Internet’s cloak of anonymity and find the hackers that nation repeatedly deploys against us.

And then:

> The details contained in the charging document are allegations.

> The defendants are presumed innocent until proven guilty beyond a reasonable doubt in a court of law.

reply
hnthrowaway0315
1 year ago
[-]
Just curious how can I check whether a data set is on market?
reply
mnw21cam
1 year ago
[-]
One of the best ways is to look at the HaveIBeenPwned service https://haveibeenpwned.com/ - Troy Hunt goes around spending effort finding these things, so whether he has found it is a reasonable indication of whether it is out there. There's a list of the breaches he has included at http://feeds.feedburner.com/HaveIBeenPwnedLatestBreaches

Edit - sorry, a better list of breaches is at https://haveibeenpwned.com/PwnedWebsites
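For passwords specifically, HIBP also exposes the Pwned Passwords range API, which uses k-anonymity: you send only the first 5 hex characters of the SHA-1 and match the suffix locally, so the full hash never leaves your machine. A rough sketch (function names are mine):

```python
import hashlib
import urllib.request

def hash_prefix_suffix(password: str):
    """Split a password's uppercase SHA-1 hex digest into the 5-char
    prefix sent to the API and the 35-char suffix matched locally."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]

def pwned_count(password: str) -> int:
    """Query the Pwned Passwords range endpoint and return how many
    times the password appears in known breaches (0 if not found)."""
    prefix, suffix = hash_prefix_suffix(password)
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        body = resp.read().decode("utf-8")
    # Response lines look like "<35-char suffix>:<count>".
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate.strip() == suffix:
            return int(count)
    return 0
```

Usage would be something like `pwned_count("hunter2")`, which returns a large number for any password that's been around the block.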

reply
hnthrowaway0315
1 year ago
[-]
Thanks a lot, didn't know this guy.
reply
justinclift
1 year ago
[-]
Didn't Equifax receive practically no penalty for it though?

So, what would be the motivation to avoid future things like this happening again?

reply
Mistletoe
1 year ago
[-]
I can’t remember what I got but yes it amounted to nothing. Free credit monitoring that had a million upsells and dark patterns meant to make them more money. That’s my recollection but it’s all fuzzy because we’ve all been breached so many times.
reply
RachelF
1 year ago
[-]
A tiny penalty.

The CIO got a $3M bonus, too. Odd thing is that she had a music degree and little experience in IT, but was an old friend of the board members.

reply
hn_throwaway_99
1 year ago
[-]
Not enough downvotes for this. I'm assuming this is all BS considering you got all the details wrong. It was the CEO who got a $3 million bonus in 2016, not the CIO. Susan Mauldin, who earned a music degree in college, was the Equifax CISO, not their CIO.

The reason I'm so salty about your response is when the breach happened, there were tons of news reports denigrating the CISO because she had a music degree. There may be a ton of reasons she wasn't good at her job (though it's hard to say as CISO is often a "sacrificial lamb" job anyway), and I'm certainly not defending Equifax, but I take major issue with the implication that a music degree makes someone unqualified for a tech job.

First, as she was CISO, she was presumably done with college many, many years ago. Lots of people have college degrees that aren't necessarily directed to the career they end up in. More importantly, though, I've found that there is a direct correlation between highly trained musicians and great software engineers. I don't know if it's a "same part of the brain" thing or whatever, but I'm actually astounded at the sheer number of "best of the best" software engineers I've worked with who are classically trained musicians. It's to the point that when hiring I give "extra points", if you will, to musicians because, in my experience at least, the correlation is so strong.

So, frankly, you can take your "she had a music degree" shade and shove it.

reply
jstarfish
1 year ago
[-]
As a former insider, I agree with all of this. Thanks for being a voice of sanity.

The music degree scrutiny is unnecessarily derogatory and borderline misogynistic. She was a fine executive and predictably the first one thrown under the bus. I can't say she revolutionized anything, but I had no complaints about her competence. (By comparison, the male C-levels in the company I currently work under have relevant degrees from impressive institutions. I see them watching porn, engaging in insider trading and doing God knows what on Tor...while our latest two product launches failed.)

Equifax's fate was sealed by the CEO himself. We had highly-competent security teams that kept up with CVEs, ran CABs, everything a "secure" org should do...but there was always a top-down culture of "I'm not saying don't patch systems, but don't impact production" at every level. This sort of event was inevitable under Smith's leadership.

reply
inhumantsar
1 year ago
[-]
It was also widely reported that she had no apparent security background, so maybe provide some evidence that the shade was unwarranted?

Of course lots of people get into tech from non tech but it's not a reason to go off on the commenter with an angry screed.

Also, she was CSO, not CISO or CIO; not that there's much of a difference between those titles in practice anyway.

reply
hn_throwaway_99
1 year ago
[-]
Show me any reference that says she didn't have a security background in her actual job experience. I couldn't find her roles but she had jobs at First Data, Sun Trust Bank, and Hewlett Packard. This article makes the same point: https://www.thesslstore.com/blog/equifaxs-cso-music-major-co...
reply
jiggawatts
1 year ago
[-]
I've worked with several senior people ("Principal Enterprise Architect", etc...) who were music majors, and as a rule they were terrible at their jobs. They just... didn't care about anything even vaguely related to computers. Without exception they got into their positions through nepotism, ass-kissing, or dirty politics. None got there through talent.

People who like computers do it as a hobby. They learn programming at a young age, they get a CS degree or a hard science degree, and then they spend their spare time on tech forums like HN.

People who don't like computers play music, learn painting, or do something else. They get degrees in the arts or humanities. They spend their spare time playing music at the local pub, or whatever.

PS: One of the worst programmers I had ever met is also one of the best musicians I had ever met.

reply
electrondood
1 year ago
[-]
Sorry, this is utter bullshit. I got into engineering late, and this mindset is just typical engineer snobbery. It's like the toxic "10x engineer" trope that also needs to die, as if taking an unconventional career path, or not living and breathing open source contributions and tech blogs in your spare time, means you aren't a Real Engineer™.
reply
stef25
1 year ago
[-]
> People who like computers do it as a hobby. They learn programming at a young age, they get a CS degree or a hard science degree, and then they spend their spare time on tech forums like HN.

Well that's me and trust me, you don't want me in charge of any IT department. Maybe it's cause I also like music.

reply
coev
1 year ago
[-]
Plenty of us like painting and programming. Your brush is way too broad man.
reply
hn_throwaway_99
1 year ago
[-]
Sorry you had that experience, but that certainly wasn't mine. Don't want to reveal too much but some very high level people in the tech world that you've probably heard of (or at least heard of their companies) have strong musical backgrounds.
reply
jiggawatts
1 year ago
[-]
Very high level people in the tech world are often politicians, climbing the ladder through their social skills.

I've observed an inverse relationship between technical skill and career progression in all technical industries.

It's always the pimply junior contractor tech who is the Global Administrator doing the actual work, and the "very high level people" struggle with copy-paste from one email to another.

reply
dncornholio
1 year ago
[-]
That's pretty discriminatory.
reply
raarts
1 year ago
[-]
> but I take major issue with the implication that a music degree makes someone unqualified for a tech job

OP never said 'a tech job'; the implication is that it would make someone unqualified for a CISO job, though. And as a general rule, I tend to agree.

reply
hn_throwaway_99
1 year ago
[-]
That's total bullshit. She was a music major in college, presumably some 20 or so years before her job at Equifax, and in the interim she had jobs at major banks and tech companies. People act like she was a musician until just before she plopped into Equifax.

Sundar Pichai majored in metallurgical engineering. How much of his college coursework do you think he uses day-to-day?

reply
dwd
1 year ago
[-]
You should read the approved judgments with the various State AGs that outline the measures, government oversight, and reporting Equifax is still required to do to prevent a future occurrence.

Should it happen again, you would very likely hear calls for the government to step in and take direct control of the firm.

reply
caconym_
1 year ago
[-]
> Should it happen again then you would very likely hear for calls for Gov to step in and take direct control of the firm.

We heard calls for similar last time, but I don't think anybody expected the legal/regulatory response to be anything resembling an existential threat to Equifax, and it wasn't. I don't see why the second time would be any different—we are surrounded by examples of how our dogshit government is utterly derelict in its duty to protect workers and consumers, and arguably complicit across the vast scope of corporate abuse of the same.

reply
dwd
1 year ago
[-]
When anyone talks about Equifax's "customers", that means government at all levels, along with every corporate that isn't using a competitor. I would think a takeover similar to what happened with Fannie Mae/Freddie Mac could happen, as much to maintain Equifax as a going concern and protect the credit markets as to impose an actual penalty. Consumers still get screwed.
reply
justinclift
1 year ago
[-]
So, treated the same way as banks after the GFC then, but without needing to give them money as well?
reply
dwd
1 year ago
[-]
Could you imagine how well bailing out Equifax to pay its legal bills would go down?
reply
iinnPP
1 year ago
[-]
Does it matter if the worst happened? It will be irrelevant 24 hours later.
reply
smegsicle
1 year ago
[-]
make sure to become too big to fail before you fail
reply
schemescape
1 year ago
[-]
> Malicious actors had been exfiltrating data for several months and had already collected personal information from 163 million customers.

I don't think "customers" is the right term, considering I never wanted them collecting data about me.

reply
RachelF
1 year ago
[-]
Yes, this is what most people don't understand with data breaches: it's not the company's data, it's data on others. That's why they don't really care about protecting it.
reply
dwd
1 year ago
[-]
That is not correct for a data brokerage as the data is the business. Lose your monopoly on that data and you have no business.

If it is information collected as part of doing business, then yes; they don't care. A good reason to question any Gov attempt to implement centralisation of data like identity or medical records.

reply
forkerenok
1 year ago
[-]
> Lose your monopoly on that data and you have no business.

But do these breaches affect their monopoly? My thinking is:

1. B2B customers won't go on darknet to source illegal data dumps.

2. This data, even if it doesn't quickly become effectively stale, would be considered stale by businesses very quickly if it's not connected to the continuous data ingestion pipeline.

reply
dwd
1 year ago
[-]
1) Customers, probably not. Competitors, I would not be so sure they wouldn't have a look.

2) This is not specific to the data that underlies consumer credit scoring; a broker could be selling products derived from data on historical house prices or car sales, for example. A competitor might use it to compare and validate their own dataset, or simply have a look. Third-party investigators, journalists, etc. could have a field day fact-checking it.

reply
dataflow
1 year ago
[-]
> Lose your monopoly on that data and you have no business.

Could you elaborate on how Equifax would have gone out of business if all their data had been stolen?

reply
yomlica8
1 year ago
[-]
Doesn't track to me. There is no loss to Equifax really from losing all the data besides a fine. I doubt many of their customers are willing or able to purchase their data from dark markets at a discount, and the data would age unless the hack remained in place.
reply
stef25
1 year ago
[-]
Come on man ... no company wants their DB leaked, regardless of what's inside. There are probably zero Western companies in 2023 that "don't care" about PII leaking from their systems.
reply
simonswords82
1 year ago
[-]
People care about events when the outcomes of those events have consequences either for the company or better still those in charge of the company.

The reality is that despite Equifax showing a blatant disregard for the security of the data they have on people, the repercussions of this breach were trivial to them and their senior people.

So yes, I do agree that there is at least one company out there, Equifax, that does not care about PII leaking from its systems.

reply
HeckFeck
1 year ago
[-]
HMRC calls its pillaged subjects 'customers' - which gives me no end of amusement. I can't ever remember asking for their 'custom', nor do I remember them ever going out of their way to win it from me.
reply
baz00
1 year ago
[-]
I did some contract work for another credit agency many moons ago and they pretty much brainwashed all full time staff into referring to data subjects as customers. A fellow contractor made a very snide analogy of suggesting that the Nazis could have called Jews customers to legitimise their actions. None of us renewed. Horrible place.
reply
Falkon1313
1 year ago
[-]
This is a nice list of could've, should've, would've's. But you'd have to dig deeper to get to the core.

Why did these things happen (or not happen)? Insufficient training? Insufficient processes? Were changes being reviewed and accepted by people who didn't really understand the changes, for expediency? Were there alerts but they were lost in the noise of thousands of bogus alerts people had learned to ignore? Was the lack of segmentation a known issue but allowed because it made some things easier? Were the credentials stored on NFS because they simply hadn't set up a more appropriate system yet and that was considered low-priority? Were business priorities getting in the way of technical priorities such that known issues were backlogged?

It's fairly easy to make a bullet list of things that should (or shouldn't) be done. It's a bit more difficult to figure out why, in a specific organization, those things aren't (or are) being done. Even if/when people might know that they should/shouldn't.

The surface level mistakes are interesting. The deeper organizational causes of those mistakes would be interesting. Solving those things at a higher systemic/organizational level can reduce the whack-a-mole nature of individual mistakes.

reply
cookiengineer
1 year ago
[-]
TLDR is:

Equifax had no working firewall / intrusion detection for almost a year, because they let their snakeoil MITM certificate expire and forgot about it.

Remind me again, how did Equifax get SOC 1&2, and ISO27001 certified?

Oh yeah, they probably have a checklist for that, so they must be secure. /s

reply
dilyevsky
1 year ago
[-]
Why "snakeoil"? Sounds like the system actually caught the intrusion once operable! The fact it was silently down for god knows how long is another matter...

> Remind me again, how did Equifax get SOC 1&2, and ISO27001 certified?

You probably already know that these are compliance CYA focused around process not actual measure of how secure the system is (if there could be such a thing).

reply
jszymborski
1 year ago
[-]
Snakeoil is a term for self-signed certs. I think they were lamenting the fact that it was not updated, not that it was self-signed.

https://eleni.blog/2019/04/10/the-snakeoil-ssl-certificate/

reply
tialaramex
1 year ago
[-]
As another person pointed out, it's usual to name this certificate "snakeoil" because it's just ticking a box mechanically and serves little to no functional purpose. The service won't run without a certificate; this is a certificate; good enough. You might prefer to think of it as a placebo, but "snakeoil" seems to be the usual name.

Yes, you're correct ISO standards are very focused on paperwork.

One of the fears some C++ people have is that today ISO 26262 (safety for road vehicles) says they can write car software in C++ because hey, there's an ISO standard for C++ so that's paperwork we can point to - But, wait, why is that enough? C++ is laughably unsuited to this work. Maybe 26262 should be revised to not say that C++ is suitable.

reply
cookiengineer
1 year ago
[-]
Funny you mention ISO 26262, somewhat pointing to the dumpster fire that is MISRA C :D

Ever since ISO 21434 got rolled out, all the Tiers are panicking because they need to introduce modern CI/CD pipelines that work with source verification. Simple things like generating an SBOM become impossible, because even the Tiers that sold you their software don't have the source code themselves, and just redistribute binaries from another Tier down the line.

I am somewhat a strong opponent of using C for these kind of areas because in the automotive industry I learned the hard way that these firmwares are pretty much the definition of unmaintainable.

Sometimes Tiers cannot even compile their own software anymore because they lack licenses for old Vektor DaVinci versions, and they literally have deals with Vektor where they send zip files and an Excel spreadsheet that reflects the dependencies of kernel modules, and a _person_, not a program, sends back the compiled firmware.

reply
cookiengineer
1 year ago
[-]
Well, the process itself cannot be working, because otherwise this whole fiasco would have been found. Technically within 24 hours, if the certifications are to be believed.

Trying to defend a broken process isn't what this is about; my critique was that there was an audit a decade ago, and that the auditors did not verify any of the claims or processes in place. Certifications and audits without any verification of claims are not valid certifications.

SOC 2 and ISO 27001 also include _mandatory_ pentests, which obviously didn't happen that year. Either that, or the pentesting agency wasn't actually doing more than a Metasploit run ;)

reply
bennyelv
1 year ago
[-]
Common misunderstanding about 27001 - it doesn't mandate anything when it comes to security controls.

It defines how you structure and operate a risk-based security management system, that's all. It's perfectly valid to say "I should be doing pen testing but my risk appetite is high enough for me not to care" and still get a 27001 certification.

reply
cookiengineer
1 year ago
[-]
> “I should be doing pen testing but my risk appetite is high enough for me not to care”, and still get a 27001 certification.

I would agree with you if Equifax weren't part of critical infrastructure.

reply
bennyelv
1 year ago
[-]
Agreed - but 27001 doesn't have an opinion on that. It only requires that top management have set the context that the rest of the information security management system hangs off of. It doesn't specify what that context should be for your company.

It's completely unlike SOC in that regard.

reply
Roark66
1 year ago
[-]
Personally, I think the root cause of this was bad documentation practices. If the old system had been properly documented, they would've scanned the right folder.

Likewise with the certificate: if there had been documentation indicating when that cert expired (or monitoring to alert a few weeks in advance), they would have had a functioning IDS and these web shells would have been found immediately.

Unfortunately, of the half dozen Fortune 500 companies I've worked for, perhaps two had documentation practices good enough to prevent this.
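A rough sketch of that kind of expiry monitoring (the 30-day threshold is arbitrary, and in practice you'd wire the output into an alerting system) could parse the notAfter string that `ssl.getpeercert()` returns for each monitored cert:

```python
import ssl
from datetime import datetime, timezone

ALERT_DAYS = 30  # hypothetical threshold: warn a month out

def days_until_expiry(not_after: str) -> int:
    """not_after is a certificate's notAfter field as returned by
    ssl.getpeercert(), e.g. 'Jan  5 09:34:43 2018 GMT'."""
    expires = ssl.cert_time_to_seconds(not_after)
    now = datetime.now(timezone.utc).timestamp()
    return int((expires - now) // 86400)

def check(not_after: str) -> str:
    days = days_until_expiry(not_after)
    if days < 0:
        return "EXPIRED"  # the state Equifax's NIDS cert sat in for months
    if days < ALERT_DAYS:
        return "ALERT"
    return "OK"
```

Run daily from cron against every cert the IDS depends on, and an "ALERT" weeks ahead of expiry is cheap insurance.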

reply
yuliyp
1 year ago
[-]
That feels like the wrong conclusion. Assuming documentation will be followed properly is not a reasonable security strategy. Validation and monitoring are needed. That their NIDS gracefully degraded to "don't monitor the payloads" when it was expected to be monitoring them, and nobody noticed, is a problem. A scan that misses a web server running on a system, without erroring, is a problem.
reply
hn_throwaway_99
1 year ago
[-]
Couldn't agree with this more. While I think it's important to have good documentation, it is nearly always a very bad idea to rely on that documentation being 100% correct. Businesses simply have way too many moving parts to assume the state of the world is always up-to-date in the documentation.

You also highlight a very good point. Things like security software should "break loudly": beyond just sending alerts (which can be ignored), there should be some explicitly "painful" steps that occur if the security system stays in a broken state for too long.
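A minimal sketch of the "break loudly" idea (the names and the one-day threshold are hypothetical): have the IDS write a heartbeat whenever it actually inspects traffic, and have a separate check that hard-fails, rather than logging, when the heartbeat goes stale:

```python
import sys

MAX_SILENCE = 24 * 3600  # hypothetical: one day without payload inspection

def ids_health(last_heartbeat: float, now: float) -> bool:
    """True if the IDS has reported inspecting traffic recently enough."""
    return (now - last_heartbeat) <= MAX_SILENCE

def enforce(last_heartbeat: float, now: float) -> None:
    # The "painful" step: refuse to proceed (e.g. gate deploys on this)
    # rather than silently degrading the way the NIDS in the article did.
    if not ids_health(last_heartbeat, now):
        sys.exit("IDS heartbeat stale: payload inspection may be broken")
```

Gating something people care about (deploys, release sign-off) on the check is what turns an ignorable alert into a loud break.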

reply
bishopsmother
1 year ago
[-]
Doesn't the use of "snakeoil" in the TL;DR run contrary to the narrative in the blog? When the 'snakeoil MITM' certificate was updated, they became aware, as a direct result of the MITM, of a problem that had not previously been known to them.
reply
blisterpeanuts
1 year ago
[-]
I’m taking a cybersecurity course right now and this article is timely and informative. I’m a programmer with a lot of Java and database experience, but not really knowledgeable about security practices.

Maybe security certification should be more of a requirement in hiring software engineers; I don’t recall it ever being mentioned in job listings.

Anyway, it got me wondering: how did devs get away with storing database credentials in a file on an NFS share? That's sheer recklessness. As a regular procedure, an audit should include scanning all files for passwords; for example, run find-grep-dired or similar on every mount, every disk, every cloud instance, etc. And, obviously, require regular password changes.

It should be assumed that the entire system is vulnerable, and hardening should be done regularly and rigorously. A company as big as Equifax (or Target) should have a dedicated team whose job it is to constantly probe and audit. Since, after all, the black hats are constantly probing, too.

reply
tekla
1 year ago
[-]
> an audit should include scanning all files for passwords

Please continue taking the security course. Scanning all files for passwords is madness. How do you differentiate "thisissupersecret" and "123fqfqlfni34235r4" and "git@somegitrepo.com" as passwords? You can't, they're all valid passwords for a majority of services.

At some point, you need to trust developers to do the right thing, which is impossible.

reply
blisterpeanuts
1 year ago
[-]
You're right, but I was thinking of searching for known passwords. For example, if the credentials for an Oracle db accessed by developers are tekla/tekla1234, then scan for the string "tekla1234". It should not exist in any file; if it does, that is obviously a potential leak.
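A minimal sketch of that kind of scan (the path handling is simplistic, and the credential list is hypothetical):

```python
from pathlib import Path

# Hypothetical: the known live credentials, fed in at scan time
KNOWN_SECRETS = ["tekla1234"]

def scan_tree(root: str) -> list[tuple[str, str]]:
    """Return (file, secret) pairs for every known secret found on disk."""
    hits = []
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file, skip it
        for secret in KNOWN_SECRETS:
            if secret in text:
                hits.append((str(path), secret))
    return hits
```

The obvious catch is that the scanner itself has to hold the live credentials, so it needs to run from somewhere at least as protected as wherever they're normally kept.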
reply
JamesSwift
1 year ago
[-]
And where do you intend to store these "bad" passwords in order to scan??
reply
ponyboy123
1 year ago
[-]
Don’t be so mean to the guy.

One approach would be to have passwords of a known format that are rotated frequently, and to verify that you're not finding any strings matching those patterns saved to disk or in log files, etc.
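One sketch of that approach (the prefix and format here are made up, in the spirit of prefixed API tokens): issue passwords with a recognizable pattern, so the scanner can regex for the pattern and never needs to hold the secrets themselves:

```python
import re
import secrets

# Hypothetical format: fixed prefix + 32 hex chars, rotated frequently
PREFIX = "acme_pw_"
PATTERN = re.compile(r"acme_pw_[0-9a-f]{32}")

def new_password() -> str:
    """Generate a credential in the scannable house format."""
    return PREFIX + secrets.token_hex(16)

def leaked(text: str) -> list[str]:
    """Return any substrings of text that look like one of our passwords."""
    return PATTERN.findall(text)
```

This also answers the "how do you know it's a password" objection: you don't classify arbitrary strings, you only look for your own format.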

reply
hnthrowaway0315
1 year ago
[-]
A lot of companies are like that. At my previous company, people shared usernames and passwords through MS Teams, and I'm sure someone stored them in Teams folders too.
reply
blisterpeanuts
1 year ago
[-]
In the early 90s, at a large financial company I worked for, the system username/password for a Sybase db was sa/sa. It was so convenient. Of course, those were the primordial days, but still.
reply
hnthrowaway0315
1 year ago
[-]
Sybase! Ah, my father has a book about that. Yeah, I get it, back in the day many were ignorant about security.
reply
JCM9
1 year ago
[-]
Having a plaintext file with “notes” containing the login credentials for a database containing actual customer data with PII is borderline criminal negligence. What a friggin disgrace.
reply
EGreg
1 year ago
[-]
reply
vb-8448
1 year ago
[-]
Wow, basically everything that could go wrong... went wrong.

I wonder how they realized their NIDS certificate had expired.

reply
e79
1 year ago
[-]
For anyone curious about how the exploit worked:

https://www.aon.com/cyber-solutions/aon_cyber_labs/an-analys...
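In short (per that writeup), CVE-2017-5638 let attackers smuggle an OGNL expression into the Content-Type header, which Struts then evaluated while building the error message for the malformed header. A simple, hypothetical signature check for the telltale `%{...}` syntax might look like:

```python
import re

# OGNL expressions use Struts' "%{...}" evaluation syntax;
# a legitimate Content-Type header never contains one.
OGNL = re.compile(r"%\{.*\}")

def suspicious_content_type(header: str) -> bool:
    """Flag a Content-Type header that embeds an OGNL expression."""
    return bool(OGNL.search(header))
```

Real exploit payloads in the wild started with `%{(#_='multipart/form-data')...`, precisely to pass the multipart parser's initial check before the evaluation kicked in.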

reply
Forgotthepass8
1 year ago
[-]
Bold of them not to spoof the wget User-Agent.
reply
nunez
1 year ago
[-]
> The attackers continued their search and eventually discovered a mounted NFS share on the web server. This file share contained notes and configuration files used by Equifax engineers, in which they found many database credentials.

Crazy that the user that ACIS was running as had enough permissions to access NFS mounts to begin with.

It’s also crazy that the attackers even found ACIS.

This was an insanely dedicated attack.

reply
k1rcher
1 year ago
[-]
Wow, the fact that they remained undetected for so long and used wget for data exfiltration...

Hopefully their security posture has improved since.

reply
x86a
1 year ago
[-]
Using wget here does not make it more embarrassing, as its user agent was almost certainly randomized to look like normal web traffic. Normal traffic downloading lots of 10MB files, though... well, yeah, that's not great.
reply
throwaw1yyy
1 year ago
[-]
I’m planning to sue in small claims court, did anyone also do this and have any tips?
reply
simonswords82
1 year ago
[-]
Nothing irritates me more than two for profit companies (Equifax and Experian) who have a license to print money by collecting my data without my explicit permission. Even with the introduction of GDPR and all the new consumer protection this brought about, I cannot ask them to delete all of my data.

They should not exist, or if they must exist they should be not for profit. It's a total scam.

reply
wordpad25
1 year ago
[-]
Wasn't Equifax's Chief of Security a music major?

That was hilarious to read about...

reply
YeBanKo
1 year ago
[-]
I have met quite a few people with non-tech degrees who made it in computer-related IT fields. That alone is not an issue, but she didn't seem to have much experience in security at all.
reply
saagarjha
1 year ago
[-]
A lot of people in security don’t have a degree at all, let alone in computer science. Judging people’s qualifications is much more complicated than just looking at their degree.
reply
IE6
1 year ago
[-]
You'd be really surprised how irrelevant a major and degree can be. For example, I used to work with an engineer who holds a highly technical PhD and required extensive assistance with even the simplest tasks (often an hour+ just to understand the ask, and then more time engineering a solution), while I also had an art-major intern, who really just wanted to be a dropshipper, who devoured any work I gave them and delivered amazing stuff.
reply
te_chris
1 year ago
[-]
Music major CTO here. Jog on.

God forbid our executives be trained in creativity.

reply
xwolfi
1 year ago
[-]
As long as you're also trained in IT...
reply
orwin
1 year ago
[-]
You mean college-trained? Because GP is a CTO, so I guess he has experience. I don't respect the C-suite that much, but I've never seen a CTO without solid SWE knowledge.
reply
sk0g
1 year ago
[-]
Most CXOs have extensive experience in their relevant fields. Your retort seems needlessly defensive.
reply
H8crilA
1 year ago
[-]
There are different types of creativity. Working in security actually requires a lot of a specific type of imagination/creativity that pretty much isn't used anywhere else.
reply
est
1 year ago
[-]
Ah, the good old Struts 2 exploit.
reply
demizer
1 year ago
[-]
If any company deserves the corporate death penalty, it's Equifax after this fiasco. That company should no longer exist.
reply
dang
1 year ago
[-]
Ok, but please don't post generic comments, and especially not generic-indignant comments, to HN. They lead to significantly less interesting discussion.
reply
hyperdunc
1 year ago
[-]
Yes, the punishment should be immense. But we all know there's no real justice to be had here.

In places like China, there's personal accountability at the highest level of an org for major screw ups - sometimes even capital punishment.

If we put such options on the table here, perhaps corporations would be a little less callous with people's private data, and a little less eager to collect it.

reply