Install: npm install -g @dendronhq/safe-npm
Usage: safe-npm install react@^18 lodash
How it works:
- Queries the npm registry for all versions matching your semver range
- Filters out anything published in the last 90 days
- Installs the newest "aged" version
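Roughly, the selection logic fits in a few lines. A simplified TypeScript sketch (not the literal implementation; it assumes the public registry's `time` metadata and the `semver` package):

  // simplified sketch, not the actual safe-npm code
  import * as semver from "semver";

  const MAX_AGE_MS = 90 * 24 * 60 * 60 * 1000;

  async function pickAgedVersion(pkg: string, range: string): Promise<string | null> {
    // registry metadata includes a `time` map of version -> publish date
    const meta = (await (await fetch(`https://registry.npmjs.org/${pkg}`)).json()) as {
      time: Record<string, string>;
    };
    const cutoff = Date.now() - MAX_AGE_MS;

    const aged = Object.entries(meta.time)
      .filter(([v]) => semver.valid(v) && semver.satisfies(v, range)) // matches your range
      .filter(([, published]) => Date.parse(published) <= cutoff)     // at least 90 days old
      .map(([v]) => v);

    // newest qualifying version, or null if nothing has aged enough
    return aged.sort(semver.rcompare)[0] ?? null;
  }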
Limitations:
- Won't protect against packages malicious from day one
- Doesn't control transitive dependencies (yet - looking into overrides)
- Delays access to legitimate new features
This is meant as an 80/20 measure against recently compromised NPM packages and is not a silver bullet. Please give it a try and let me know if you have feedback.
Pardon me, I couldn’t help myself :D
And as far as cat-and-mouse games in other package managers go, I'd say that pinning dependencies and disabling postinstall scripts is a much better option. Sure, not a foolproof one either, but as good as it gets.
edit: misspelled someotherguyy's user name
Related to that is the proposal for `stabilityDays`, which seems way more practical: https://github.com/npm/cli/issues/8570#issuecomment-33004136.... So rather than merely saying "I only want package versions more than N days old", you'd be adding the requirement that "...and also they should have gone at least N days without a subsequent patch release". e.g. if mylib@6.0.0 is released, only to be quickly followed by 6.0.1 and 6.0.2, you ideally wouldn't want to risk ever installing the probably-broken 6.0.0 or 6.0.1 based on luck of the draw; the better behavior would be to stick with the last 5.x release until 6.0.2 has aged past the threshold.
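In sketch form, the extra check would look something like this (same registry `time` map and `semver` package as in the sketch above; whether "subsequent release" should mean any newer version or only later patches on the same line is one of the open questions, so this just treats any newer version as superseding):

  import * as semver from "semver";

  const STABILITY_MS = 14 * 24 * 60 * 60 * 1000; // N = 14 days, for example

  function isStable(version: string, time: Record<string, string>): boolean {
    const published = Date.parse(time[version]);
    if (Date.now() - published < STABILITY_MS) return false; // hasn't aged yet

    // earliest release that was published after this version, if any
    const next = Object.keys(time)
      .filter(v => semver.valid(v) && semver.gt(v, version))
      .map(v => Date.parse(time[v]))
      .sort((a, b) => a - b)[0];

    // stable only if nothing superseded it within the window
    return next === undefined || next - published >= STABILITY_MS;
  }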
Many quirks come from abilities that were once deemed useful, such as compiling code in other languages after package install.
Sure, today I can disable install scripts if I want, but it doesn't change much when I eventually run code from the package anyway.
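(For reference, disabling them is a one-line setting; as said, it only closes the install-time hole, not the run-time one:)

  # .npmrc
  ignore-scripts=true

  # or per invocation
  npm install --ignore-scripts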
But even restricting access to the file system to the project's root folder would leave many doors open, with or without foreign languages: Node is designed as a general purpose JS runtime, including server-side and build-time usage.
The initial utility of Node.js was to provide a JS API that, unlike the web platform, is not sandboxed. And npm is its default package manager.
This not only allows server-side usage, but is also essential to many early dev scenarios. Back in the day, it might have been SCSS builds using node-gyp (wouldn't recommend). Today it's things like the Go port of TypeScript or SSGs.
So, long story short: as many people before me already said, it's an ecosystem/cultural problem.
One thing against npm in this regard was/is its broken lock-file handling until, I think, version 12 or 16. That led to unintended transitive dependency version changes, breaking any reproducibility.
Same for compiling foreign languages.
These problems are solved today / no different from other package managers and registries, as far as I know.
The culture of taking breaking changes and dependency bloat lightly has not changed as much, I think, although it's improved.
This most important point seems to come down to three reasons, IMO:
- junior developers without experience in library development reaching large audiences
- specs, languages, the runtime, and the package manager itself going through disruptions and evolutions
- rapidly releasing breaking majors, often caused by the above factors
What matters is the combination of these, plus the role of the project lead/team who actually decides which dependencies get used.
There are probably also many projects with unclear roles and many people who can push manifest changes, coupled with habitual access to CI/CD pipelines.
So yeah, ~everyone is using a lockfile with checksums. But even if I think really hard about installing the XYZ@1.2.3 package, and check that the lockfile diff is reasonable, I'm not manually auditing the whole supply chain (I'd get fired for getting nothing done). And a single dependency change that I choose to make can affect a substantial number of transitive deps.
A cooldown time alone isn't actually a sufficient solution. What people really need to stop doing is leaving their versions and checksums unpinned and installing whatever newer version happens to be available. That would still cause problems even if the date line were moved 90 days into the future for all packages. If, however, you only update dependency versions when you consciously choose to, there are far fewer points in time at which versions change, and therefore the chance of catching something is also much lower. Combine that with a cooldown time/minimum age for versions, and you've got an approach.
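Concretely, that conscious-update workflow needs nothing exotic; stock npm already covers it:

  # .npmrc: record exact versions instead of ^ranges
  save-exact=true

  # bump a dependency only when you decide to
  npm install lodash@4.17.21

  # in CI, install exactly what package-lock.json (with its checksums) says
  npm ci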
> Installs the newest "aged" version
You probably want to install a CVE-fixed version instead, i.e. find the CVEs for each package and install the earliest version that has all of them fixed, but nothing newer than that.
Technically someone could fake a CVE to get people to upgrade, but that's a far more involved process.
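A sketch of that selection rule, with a deliberately made-up advisory shape (real feeds, e.g. what `npm audit` consumes, are structured differently):

  import * as semver from "semver";

  interface Advisory {
    vulnerableRange: string; // e.g. "<4.17.12"
  }

  function pickCveFixedVersion(versions: string[], advisories: Advisory[]): string | null {
    const safe = versions.filter(v =>
      advisories.every(a => !semver.satisfies(v, a.vulnerableRange))
    );
    // oldest version that already carries every fix, rather than the newest overall
    return safe.sort(semver.compare)[0] ?? null;
  }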
Scheduled, audited updates are good.
Installing random npm packages as suggested here is also bad. Especially with "--global", although I'm not sure that makes any difference, because Node can of course access your entire file system by default.
https://blog.yossarian.net/2025/11/21/We-should-all-be-using...
Most of the time, you need quick patches because of fairly recent dependency changes, so if you just wait and kind of "debounce" your dependency updates, you can cover a lot of supply chain vulnerabilities etc.
It's the opposite of "keep your software up to date".
The idea of “safe” in terms of risk and security has misled a lot of people into this wrong idea that there’s a binary state of safe and unsafe.
It’s all about risk management. You want to reduce risk as inexpensively as possible. One of many inexpensive approaches is “don’t install dependencies that are new.” Along with “don’t install dependencies that nobody else uses.” You might also apply the rule, “don’t install dependencies that aren’t shipped with the OS.” Or “don’t use dependencies that haven’t been formally proven.” Etc.
Indeed, calling it “Safe-NPM” can be misleading. As if using it achieves some binary state of safety.