PSA: Please be cautious — this is an excellent opportunity for malicious actors to take over packages and inject malware.
Example: https://www.npmjs.com/package/duplexer3 which has 4M monthly downloads just reappeared, published by a fresh npm user. They have published another two versions since then, so it's possible they initially republished the package unchanged but are now messing with the code.
Previously the package belonged to someone else: https://webcache.googleusercontent.com/search?q=cache:oDbrgP...
I'm not saying it's a malicious attempt, but it might be, and it very much looks like one. Be cautious, as you might not notice if some packages your code depends on were republished with malicious code. It might take some time for NPM to sort this out and restore the original packages.
Hi folks, npm COO here. This was an operational issue that we worked to correct. All packages are now restored:
> I was here.
> We made history! Fastest issue to reach 1000 comments, in just 2 hours.
> cheers everyone, nice chatting with you. 17 away from hitting 1000 btw!
> Is GitHub going to die with the volume of comments?
Kind of disappointed the NPM community is turning github into reddit right now.
NPM is extremely vulnerable to typosquatting. Be cautious with what you install: install scripts can execute arbitrary code. The NPM team's response is that they hope malicious actors won't exploit this behaviour. According to my tests, typosquatting 3 popular packages lets you take over around 200 computers in the roughly 2 weeks it takes their moderators to notice.
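One mitigation (not something npm enables by default) is turning off lifecycle scripts, so that even a typosquatted package can't run code at install time. A minimal `.npmrc` sketch:

```ini
; .npmrc — refuse to run preinstall/install/postinstall scripts automatically
ignore-scripts=true
```

The trade-off is that packages with legitimate native build steps then need their scripts run explicitly (e.g. via `npm rebuild`).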
Btw. for those who don't know:
Yarn (which is an alternative to npm) uses a global cache on your machine, which speeds things up but probably also protects you from immediate problems in cases like the one currently in progress (because you would probably have a local copy of e.g. require-from-string available).
Hmm, in the Java world we pretty much always used a local (company-owned) Maven proxy server, which grabbed packages from public repos and cached them locally to make sure builds still worked if public servers were down or slow... or packages disappeared.
This isn't a standard practice in JS world?
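For what it's worth, npm can be pointed at such a proxy too — tools like Nexus and Artifactory speak the npm registry protocol. A sketch of the client side, with a placeholder URL standing in for your own server:

```ini
; .npmrc — route all installs through an internal caching proxy
; (the registry URL here is hypothetical; substitute your proxy's address)
registry=https://nexus.example.internal/repository/npm-proxy/
```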
So they didn't learn anything from left-pad situation from 1.5 year ago?
Packages that are published should be immutable, just like in maven repo case.
I never understood the love for package managers that directly hook and import things into your codebase or repo or even worse servers. I guess the benefit is that "it just works", but the fact that you do not know where a package is coming from can't be worrying just me.
In my company we take the stable version of the library we want to use and we self-host it. We basically have added a cache that we manage and control what goes into it instead of just trusting a manager. Especially for server-side deployment this is mandatory for security. Things like let's say ffmpeg etc - we never get from random packages but we host them ourselves.
We really need to hear from NPM why this happened.
There is currently no way for a user to remove or unpublish their own packages from the public NPM API (a change following the `left-pad` incident).
This leads me to believe this was an internal NPM error. My guess is employee error.
> Update - Most of the deleted packages have been restored and installation of those packages should succeed. Nine packages are still in the process of restoration.
> Jan 6, 20:12 UTC
Gah. Moments like these always give me a bit of panic, since I realize that so much of my software relies on external sources.
Relying on npm, Atlassian/GitHub etc. really hurts when stuff like this happens. Issues always get resolved, but cases such as the GitLab incident should be reason enough to always keep some local copies around.
Could someone please add NPM to the title?
> "Several packages including require-from-string are currently unavailable. We are aware of the issue and are working to restore the affected user and packages. Please do not attempt to republish packages, as this will hinder our progress in restoring them." Posted 4 minutes ago. Jan 06, 2018 - 19:45 UTC
Late to the party, but can't wait for the technical write up on this.
Ergonomically, I currently think it's ahead of many other package managers because of how simple it is to get running. The number of "gotchas" after npm install is nothing to shake a stick at, though.
One of the things you can do to get builds that aren't as susceptible to npm registry issues is to configure an offline mirror.
From the post:
"One of the main advantages of Yarn is that it can install node_modules from files located in the file system. We call it “Offline Mirror” because it mirrors the files downloaded from the registry during the first build and stores them locally for future builds."
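Concretely, enabling that mirror in Yarn is a one-line `.yarnrc` entry (the directory name here is just the conventional choice, not required):

```ini
# .yarnrc — keep downloaded tarballs in the repo so builds work
# even when the registry is down or a package disappears
yarn-offline-mirror "./npm-packages-offline-cache"
```

Commit that directory to source control and future `yarn install` runs can resolve entirely from the mirrored tarballs.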
Can anyone explain why the npm registry still exists if it cannot guarantee that uploaded packages remain available? The current state makes it pretty useless as a reliable source to base software on, because you never know if you'll be able to build it again in the future.
They should take a good hard look at NuGet, which does not allow packages to be deleted so builds are guaranteed to be reliable. Still doesn't hurt to locally cache packages with software such as Klondike.
I said this back in the left pad days
Store all of your dependencies locally.
If something disappears then at least you can continue until you find a replacement.
While it may not be the right time to _start_, incidents like this are an excellent reason to consider an internal read-through proxy package repository. The last couple of organisations I've worked with have used Artifactory: https://jfrog.com/artifactory/
And people think I'm crazy for keeping packages in the SCM repo. NPM gets so much abuse — people depend on them without paying a dime. At least put up a caching proxy you host yourself if you depend so much on npm for your operations.
In my org, we use Artifactory as a cache between us and external sources. They have a free version too. I'd encourage everyone to use it, or something like it. Stop pointing your package managers to the public registry.
How are these going, by the way?
With npm 4 things went south and never came home for us. We use Macs, PCs and Linux machines, and nowadays we fear `npm i` like the plague. I don't care if it's the registry, the executable, the stupid package-lock.json, a node-npm version mismatch or an installer script — the end result is frustration.
This is insane. This is like Google changing their v1 APIs, except worse since ANYONE could come in and put new malicious APIs up in its place. I say this as a firm supporter of Node and the ecosystem - this should NEVER EVER be allowed to occur. This completely erodes the trust model based around "popular packages" even further - the only saving grace is that hopefully most devs are shrinkwrapping their modules.
I wish the NPM community would grow some humility and learn some lessons from how the Debian environment was built. Have sane licensing that allows mirroring, have crypto-hashes of packages, have open governance.
And this is why I avoid "package managers" that follow the wild-west model like the plague.
There are over 700 comments on this issue on GitHub. It is turning into a live chat room.
Be part of the history :)
EDIT: there are now over 1100 comments/memes.
2018 looks interesting, everything seems suddenly broken.
NPM specifically asks people to not try and republish the broken packages while they repair it. Incident status:
The sheer number of software development organizations who cannot function when github or their package repository happens to be unavailable (for whatever reason) is incredibly disheartening.
Was just discussing this elsewhere online. Package management is broken (or incomplete, depending on your viewpoint). What's needed IMO is the following:
1. Allow a single package file, including multiple clauses (or sub-files, whatever) for different languages. Let me manage my Angular front-end and Flask back-end in the same file. A single CLI tool as well - Composer and Bower aren't all that different.
2. Be the trusted broker, with e.g. MD5 checking, virus scanning, some kind of certification/badging/web of trust thing. Let developers know if it's listed, it's been vetted in some way.
3. Allow client-side caching, but also act as a cache/proxy fetch for package retrieval. That way, if Github or source site is down, the Internet doesn't come to a screeching halt. I see the value of Satis, but it's a whole additional tool to solve just one part of this one problem.
4. Server-side dependency solver. Cache the requests and give instant answers for similar requests. All sorts of value-adds in analytics here, made more valuable by crossing language boundaries.
5. Act as an advocate for good semver, as part of the vetting above.
NOTE: These features are not all-or-nothing, I believe there's value from implementing each one on its own. Also note that nothing here should lock people into one provider for these services. There's a market to be made here.
And this is why you vendor your dependencies. Nothing in production should ever require any external service to build and run.
You should also not be cowboy updating things just because there is an update.
Here is the official response from npm:
TL;DR: "no malicious actors were involved in yesterday’s incident, and the security of npm users’ accounts and the integrity of these 106 packages were never jeopardized."
A more detailed report will follow in the next days.
Could we have a better title perhaps, giving at least "[npm]" or something as a hint?
I don't use NPM (or community managed package managers in general), but anyone know why there isn't an LTS feature with packages? So that, when searching packages, if a package is flagged as LTS, you know that it and all its dependencies have long term support and there are contingencies on what happens if the package is abandoned. Obviously, there would need to be a community that reviews and approves packages that aim to be LTS.
Stupid question from non-pro here: everyone's always like "never commit libraries into source control." But, um, this kinda thing?
Sometimes your project becomes bigger than you, and perhaps private ownership isn't the best way to handle that anymore:
Well, Q allows you to choose between “each individual publishes their own stream” and some degree of “centralized publishing” by management teams of groups. So who should publish a stream, the individual or the group?
If the individual - the risk is that the individual may have too much power over others who come to rely on the stream. They may suddenly stop publishing it, or cut off access to everyone, which would hurt many people. (I define hurt in terms of needs or strong expectations of people that form over time.)
If the group - then managers may come and go, but the risk is that if the group is too big, it may be out of touch with the individuals. The bigger risk is that the individuals are forced to go along with the group, which may also create a lot of frustration. For instance, the group may split into three sub-groups. They are deciding where to go, but some people want to go bowling, others want to go to the movies, others want to volunteer in a soup kitchen. Even though everyone belongs to the group. Who should publish these activities?
So I think when it comes to publishing streams that others can join, there should be some combination of groups and individuals. And it should reflect the best practices of what happens in the real world: one person starts a group that may later become bigger than him. Then this group grows, gets managers etc. After a while this person may leave. In the future, other individuals may want to start their own groups and invite some members of the old group to join. They may establish relationships between each other, subscribe to each other’s streams, pay each other money, etc.
Several packages just disappeared; now some are re-appearing, potentially uploaded by different users (!).
See https://github.com/npm/registry/issues/255 for details.
Very annoying, breaks builds all over, also prevents installing react-native.
Why Node.js ships with a client for a for-profit company still baffles me. The NPM team has proven time and time again that they are not competent enough to handle this responsibility, yet they are given a free ride by the Node.js foundation.
Node.js package manager SHOULD BE COMMUNITY OWNED/DRIVEN
The status page at https://status.npmjs.org/incidents/41zfb8qpvrdj says that "We apologize for the temporary unavailability of some packages.".
If this was only a matter of missing packages, this would "only" be a matter of breaking builds.
But it looks like third parties were able to take over the missing packages, see https://github.com/npm/registry/issues/256 - which is a HUGE deal, considering "npm install" blindly executes the scripts in a package's preinstall property (as well as the packaged module itself possibly containing arbitrary backdoors)
This is why you should depend on exact versions whenever possible. But even if you do, your dependencies most likely won't, so you are screwed anyway.
The caret syntax for auto-upgrading to the next minor version is the open door to a world of bullshit.
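For reference, the difference is just the range prefix in package.json — the first entry below is an exact pin, the second (with the caret) accepts anything >=1.2.0 and <2.0.0. Versions here are illustrative:

```json
{
  "dependencies": {
    "duplexer3": "0.1.4",
    "left-pad": "^1.2.0"
  }
}
```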
Does anyone know what happened to the author? It seems he was still on Twitter as of yesterday (Jan. 5), responding to someone about a merge request:
I don't remember the intricacies of NPM or Yarn, but doesn't one or both of them have resource integrity enabled, so that you know the package being installed is the one in your lock file? If not, why isn't this a feature, especially after the clusterfuck of the guy deleting all his packages about two years ago, breaking tons of things including Babel and React?
This wouldn't fix the issue of someone deleting the actual package (this happened here?), but it would prevent some malicious code being installed if someone uses the same package name.
Did NPM not learn from the leftpad incident?
If you have a build system or a production system that relies on npm to be there, you are an idiot. Find your boss and tell him to fire you.
Vendor the dependencies that are needed to build and run your applications. Period.
I just don't understand how this can happen. In Maven Central for example (Java) if you publish a package it is immutable and stays there until nuclear fire immolates the Earth.
How to bring down half the internet - randomly delete your NPM packages, then stand back and watch millions of web developers scream in frustration.
What an ecosystem we have built.
You know, back in the "old days", we used to host packages on these sites called "mirrors", so when one went down, we could get the package from another, and verify authenticity using multiple sources and signed files. There would be hundreds of mirrors for one set of files.
Kind of funny how shitty modern technology is. But I heard a quote recently that kind of explains it: "The more sophisticated something is, the easier it is to break it."
It seems to me that if packages "disappear" from upstream, it shouldn't have any effect other than preventing an update due to the missing dependency.
Well, the packages seem to be all from floatdrop.
Glad we cache ours on ProGet now!
The latest Manifest.fm podcast episode is about typosquatting:
The Maven Central repository for JVM dependencies doesn't share NPM's periodic problem of packages being removed, but Adam Bien has been instructing users for quite a few years to download the source code of their dependencies and compile them into their own repositories.
I wish I'd taken his advice as there are a couple of JAR files that I can no longer update.
I don't understand much about the blockchain, but one thing I have heard is that it's impossible (or very hard) to remove things from it. It is immutable, sort of append only, if I understand it correctly. So my question is, is there anyone working on moving npm to the blockchain? Or doing something like a package manager on the blockchain? If not, why not?
Couldn’t this be solved in the future by npm storing packages with user names? I.e. “MaxLeiter/example” instead of “example”
Could someone explain why dependency-management-systems don't enforce immutable releases? Ie, package owners can publish all the new versions they want, but they are never able to edit/remove/liberate an already-released version. It seems like that would solve so many problems, such as the left-pad fiasco.
I don't get why not just use git repo registry (e.g. github) for package management. If you work in a "strict" environment you can basically fork all your dependencies and use your own git repo registry.
NPM already allows using git repos, but needs some tweaks to allow better support:
* allow versioning via git tags
* store git commit in `package-lock.json`.
* maybe something else...
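npm does already let you pin a git dependency to an exact commit with the `#` fragment; a sketch with a hypothetical repo and SHA:

```json
{
  "dependencies": {
    "some-lib": "git+https://github.com/example-user/some-lib.git#9a1b2c3d4e5f60718293a4b5c6d7e8f901234567"
  }
}
```

Pinning to a commit (rather than a branch or tag, which can be force-pushed or moved) is the part that makes this reproducible.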
Would be interesting to see the load curves on github and npm servers due to people and servers retrying downloads.
As someone unfamiliar with NPM, why does it not lock package names for a certain period of time? Rubygems has a 90 day period, so if a package is completely removed, the name can't be used for that long. That seems like it would help with the security side of these problems.
this issue is avoidable by using shrinkpack:
HN: Shrinkpack – npm dependencies as tarballs, prevents “left-pad” style breakage - item?id=11353908
When will people learn to host their own private mirror for things on the web that they depend on?
I made a half-joking comment on that thread that "'Bout time NPM goes blockchain." Either someone deleted it, or GitHub lost it among all the traffic to that issue.
Wonder if npm, Inc. would view a decentralized registry as a threat to their business model?
Didn't something similar happen last year? I think it was packages with similar names.
Apparently the restore is now complete:
Can't help but wonder if committing node_modules to your repo is now a good idea...
I wrote "The Github Threat" about this possible issue https://carlchenet.com/the-github-threat/
module.exports = typeof Promise === 'function' ? Promise : require('pinkie');
I can't even install webpack-dev-server. Because this package is missing.
EDIT: it's back
Is there a possibility that npm turn package names into "author/package" style, so there would be less confusion on what the users are installing and less chance of name squatting?
Remember to freeze your packages after installing them as a project dependency. You should have the packages in your source tree or your own internal package manager (local nuget, for example).
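If you want npm itself to stop writing caret ranges when you add dependencies, `save-exact` does that — a `.npmrc` sketch:

```ini
; .npmrc — record exact versions instead of ^ranges on `npm install --save`
save-exact=true
```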
Is there any chance of something similar happening for Nuget? I rely on it heavily for project dependencies, and I'd like to know if there's a ticking timebomb there too.
I don't understand why npm even has the facility for a user to remove public packages.
It defeats the entire purpose of using a public repository.
I'd prefer to store dependencies on a permanent storage service like IPFS, so they can work even when npm or GitHub are down.
Wow, they comment faster there than one can read :D
So this is an npmjs issue — I thought this was a broader issue on GitHub, according to the headline.
So that's why my "npm install" on Heroku randomly started failing.
Is pip also vulnerable to this type of exploit? Or is this unique to npm?
I would say we should be signing packages but...
So has npmjs.com been hacked or what?
Is Yarn with its cache affected?
What exactly happened here?
So, funny story: I registered the "nazi" npm package. When you require it, it says "I did nazi that coming." That's it. (Though it would've been a funny name for a linter.)
... Or it did. I received a harshly worded letter from npm saying they axed it. It hit all the talking points about inclusiveness and making sure no one feels even slightly annoyed.
Meh. No point to this story. Just an interesting situation with an inconsistently curated package manager. I was surprised there was an unofficial undocumented banlist.
Don't. Deploy. From. Internet.
And here we go again...
I thought this was about leftpad, but nope we're doing it again!
People get the leaders they deserve. This holds true for nodejs users, too.
This is why I'm obnoxiously cautious about adding external dependencies.