Yarn (which is an alternative to npm) uses a global cache [1] on your machine, which speeds things up but probably also protects you from immediate problems in cases like the one currently in progress (because you would probably have a local copy of e.g. require-from-string available).
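For context, yarn v1 exposes that cache through a handful of subcommands and can even be pointed at a checked-in offline mirror; a rough sketch (the mirror path below is just an illustration):

```shell
# Print the location of yarn's global package cache
yarn cache dir

# In .yarnrc, an offline mirror can be configured (path is only an example):
#   yarn-offline-mirror "./npm-packages-offline-cache"

# With the mirror populated, installs can avoid the registry entirely
yarn install --offline
```

With a setup like this, an unpublished package that's already in your mirror keeps installing as if nothing happened.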
Already counting down the days before yarn is considered old and broken and people are recommending switching to the next hot package manager/bundler...
yarn is one of those things coming out of the JS world that is actually really well made. yarn, typescript, react; say what you want about js fatigue, these are rock-solid, well-tested projects that are really good at what they do.
A major reason for the high tool churn in that ecosystem is how many of those tools are not designed from the ground up, don't quite solve the things they ought to, or solve them in really weird ways (partly due to the low barrier to entry). But that doesn't mean all of it deserves that label.
Can't say anything about react, but yes: yarn and typescript are good.
This is coming from a long time Java programmer who still likes Java and Maven but now might have a new favourite language.
This is made even more impressive by the fact that it is built on the mess that is js. (Again: I'm impressed that it was made in three weeks; I just wish a better language had been standardized.)
It baffles me that technologists commonly complain about new technology. As far as I can tell, your complaint boils down to “people should stop making and switching to new things”. I find it hard to understand why someone with this attitude would be a technologist of any kind, and I find the attitude really obnoxious.
I take it that you've never had to work at a big organization? When you have multiple teams in different offices, it's incredibly difficult to constantly "herd cats" and point everyone to $latest_fad. And when you DO by some miracle get everyone (devs and management) to switch to $latest_fad, it's a huge pain to go back through and bug test/change every process to accommodate the new software.
I don't think "people should stop making and switching to new things" is a fair distillation of the parent comment, as it seemed like they were just expressing frustration at the blistering pace the Javascript community is setting.
Independent teams providing business capabilities through APIs would mostly eliminate the need to keep consistent technologies as long as the interface design follows shared guidelines.
Most companies of any size are allergic to "pick your own toolchain" development strategies. The infrastructure team has to support them. Someone has to be responsible for hiring. Security needs to be able to review the environment. Employees should be able to be moved between teams. And so forth.
Sure, I suppose devops can mitigate the infrastructure support problem, but overall most companies strongly prefer standardization.
No. My complaint is that things never get fixed properly. The complex problems around software distribution (which proper package managers have made a good stab at solving for decades) are ignored in favour of steamrollering over them with naive solutions and declaring that everything "just works", only for the wheels to come off a few years later when we run into a dead end many of us saw coming from miles off.
This is particularly true for package/dependency management, but the attitude is found more broadly.
For what it's worth, the javascript world isn't alone here. Python, with its new Pipfile/pipenv system, is on its, what, fifth? sixth? stab at solving package management "once and for all", and it's all truly dire and not something I depend on when I have the choice.
Nix solves pretty much all of these problems and a few more, but I expect it to be a decade or so before people realize it.
I'm not complaining about new things. These aren't new things. They're about a decade behind the curve.
Because each thing has a constant price in learning effort that is familiarizing yourself with its idiosyncrasies, which you have to pay even if you're experienced in the domain. When tools constantly get replaced instead of improved, you keep paying that price all the time.
> Because each thing has a constant price in learning effort
That's not, in my experience, how it works. Learning your first tool (or language) takes a lot of time. Learning your second is quicker. By the tenth, you're able to learn it by skimming the README and changelog.
It works like this for languages too, at least for me. My first "real" language (aside from QBasic) was C++ and it took me 3-4 years to learn it to an acceptable degree. Last week I learned Groovy in about 4 hours.
It still "adds up", but to a much lower value than you'd think.
But it does, you're just focusing on the other component of learning.
Put another way, for a new tool, learning cost is a sum of a) cost of learning idiosyncrasies of that tool, and b) cost of getting familiar with the concepts used by it.
You're talking about b), which is indeed a shared expense. But a), by definition, isn't. And it's always nonzero. And since new tools are usually made to differ from previous ones on purpose ("being opinionated", it's called), even though they fix some minor things, this cost can be meaningful. And, it adds up with every switch you need to do.
Some of it is a normal part of life as a software developer, but the JS ecosystem has taken it to ridiculous extremes.
My argument is that the a) part's cost is indeed non-zero, but - contrary to what you say - trivial in a vast majority of cases. It's just my personal experience, but it happened every single time I tried to learn something: learning "what" and "why" took (potentially a lot of) time, but learning "how" was a non-issue, especially if a "quick reference" or a "cheat sheet" was available. I also disagree that the a) part is never shared between tools: there are only so many possible ways of doing things, but a seemingly infinite supply of tools for doing them. The idiosyncrasies are bound to get repeated between tools and, in my experience, it happens pretty often.
As an example, imagine you're learning Underscore.js for the first time. It's a mind-blowing experience, which takes a lot of time because you have to learn a bunch of crazy concepts, like currying, partial application, binding, and others. You also have to learn Underscore-specific idiosyncrasies, like the order of arguments passed to the callback functions and the like - mostly because you are not yet aware which things are important to know and which are just idiosyncrasies.
Now, imagine you know Underscore already and have to learn Lo-dash or Ramda.js. As the concepts remain very similar, you only need to learn a few conventions, which are different in Ramda. But! Even then, you don't have to really learn all of them to use the library effectively. It's enough to keep the diff of the Underscore and Ramda conventions in mind: learning that, for example, the order of arguments passed to callbacks differs is enough; you can then check the correct order in the docs whenever you need. You know where to find that piece of information, you know when it matters and, by extension, when it's not a concern. There is no need to waste time trying to learn trivia: not doing something is always going to be the fastest way of doing it. By your second library, you start to recognize trivia and are able to separate it from information that matters. Learning prelude.ls afterward is going to take literally 30 minutes of skimming the docs.
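The kind of convention diff being described can be made concrete in plain JS; no libraries needed - the two `map`s below are hand-rolled stand-ins for the Underscore/Lodash and Ramda calling styles:

```javascript
// Underscore/Lodash style: collection first, callback receives (value, index, list)
const mapU = (list, fn) => list.map((v, i) => fn(v, i, list));

// Ramda style: callback first, curried, callback receives only the value
const mapR = (fn) => (list) => list.map((v) => fn(v));

// Same operation, different calling conventions:
const inc = (x) => x + 1;
console.log(mapU([1, 2, 3], inc));  // [2, 3, 4]
console.log(mapR(inc)([1, 2, 3]));  // [2, 3, 4]

// The idiosyncrasy that bites: extra callback arguments.
// parseInt takes (string, radix), so in the Underscore style the index
// leaks in as the radix:
console.log(mapU(["10", "10", "10"], parseInt)); // [10, NaN, 2]
console.log(mapR(parseInt)(["10", "10", "10"])); // [10, 10, 10]
```

Knowing that this difference exists - and roughly where to look it up - is the whole "diff" you need to carry between the libraries.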
This is just an example, but it worked like that for me in many cases. When I switched from SVN to Bazaar, for example, it took quite a bit of time to grok the whole "distributed" thing. When I later switched from Bazaar to Git it took me literally an hour to get up to speed with it, followed by a couple more hours - spaced throughout a week or two - of reading about the more advanced features. Picking up Mercurial after that was more or less automatic.
I guess all of this hinges upon the notion of the level of familiarity. While I was able to use bzr, git and hg, it only took so little time because I consciously chose to ignore their intricacies, which I knew I wouldn't need (or wouldn't need straight away). On the other hand, you can spend months learning a tool if your goal is total mastery and contributing to its code. But the latter is very rarely something you'd be required to do; most of the time a level of basic proficiency is more than enough. In my experience, the cost of reaching such a level of proficiency becomes smaller as you learn more tools of a particular kind.
That's the reason I disagree with your remark that that cost is "constant". It's not, it's entirely dependent on a person and the knowledge they accumulated so far. Learning Flask may take you a week if you're new to web development in Python, but you could learn it in a single evening if you worked with Bottle already. On a higher level, learning Elixir may take you months, but you could also accomplish it in a week, provided that you already knew Erlang and Scheme well.
So that's it - the cost of learning new tools may be both prohibitive and trivial at the same time, depending on the learner's prior knowledge. The good thing about that "prior knowledge and experience" is that it keeps growing over time. The amount of knowledge you'll have accumulated in 20 years is going to be vast to an extent that's hard to imagine today. At that point, the probability of any tool being genuinely new to you will hit rock bottom and the average cost of switching to another tool should also become negligible.
To summarize: I believe that learning new tools gets easier and easier with time and experience and - while never really reaching 0 - at some point, the cost becomes so low that it doesn't matter anymore (unless you have to switch really often, of course).
I'm not sure. I did it because of Jenkins Pipeline DSL; I learned enough to write ~400 loc of a build script from scratch. I was able to de-sugar the DSL and wrap raw APIs with a DSL of my own design (I'd say that I "wrote a couple of helper functions", but the former sounds way cooler...). I did stumble upon some gotchas - the difference between `def` and simple assignment when the target changes, for example.
EDIT: I wonder, is that level of proficiency enough for you to at least drop the scare quotes around "learn"? I feel that putting the quotes there is rather impolite.
> did you skim some docs and just learn what Groovy should be?
As I elaborate on in the comment below, there are different levels of proficiency and I never claimed mastery - just a basic proficiency allowing me to read all of the language constructs and write, as mentioned, a simple script from scratch, with the help of the docs.
> And did you already know any Java beforehand?
Well, a bit, although I didn't work with it professionally in the last decade. However, knowing Java wouldn't be enough to make learning Groovy that fast - I have another trump card up my sleeve when it comes to learning programming languages. You might be interested in a section of my blog here: https://klibert.pl/articles/programming_langs.html if you want to know what it is. To summarize: I simply did it more than 100 times already.
> the scare quotes around "learn"? I feel that putting the quotes there is rather impolite
When I say I've learned (or learnt) a programming language, I mean more than a 4-hour jump start to basic proficiency level. Perhaps I was letting off some steam over the wild claims many programmers make regarding their PL expertise.
Did you know that Jenkins Pipeline cripples Groovy so that not all of its features are available, specifically the Collections-based methods that form the basis of many DSLs?
> Did you know that Jenkins Pipeline cripples Groovy
Yes. I've run into some limitations; first because of a Pipeline DSL, and when I ditched it in favor of normal scripting I ran into further problems, like Jenkins disallowing the use of isinstance (due to a global configuration of permissions, apparently - I don't have administrative rights there) and many other parts of the language. It was kind of a pain, actually, because I developed my script locally - mostly inside groovysh - where it all worked beautifully and it mysteriously stopped working once uploaded. A frustrating experience, to say the least.
> over the wild claims many programmers make regarding their PL expertise.
I believe I'm a bit of a special case[1] here, wouldn't you agree? Many of the languages on that list I only learned about; many others, however, I actually learned, having written several thousand lines of code (on the low end) in them. It's got to be at least 30, I think? I'd need to count.
Anyway, I argue that such an accumulation causes a qualitative difference in how you learn new languages, allowing for rapid acquisition of further ones. It's like in role-playing games, if you buff your stats high enough you start getting all kinds of bonuses not available otherwise :)
[1] If I'm not and you know of someone with the same hobby, please let me know! I'd be thrilled to talk to such a person!
The problem isn't with that one tool alone. The problem is with the entire ecosystem, in which all the tools get regularly replaced by "better" ones. It all adds up.
To be precise, new tools are continuously created to address the weaknesses of other tools. This happens in other languages, just more slowly due to smaller community sizes.
"new tools are continuously created to address the weaknesses of other tools, instead of fixing those weaknesses in those other tools" - FTFY.
> This happens in other languages, just more slowly due to smaller community sizes.
Yeah, my point is that there is a cost to learning a new tool; the faster new tools replace old ones (instead of someone fixing the old ones), the more often you have to pay that cost.
What ideally should be happening is that existing tools get incrementally upgraded to fix issues and add improvements rather than scrapped and replaced as if they're disposable.
To be completely fair, it isn't exactly drop-in. There are new commands for a bunch of things, mainly around adding new packages locally and globally. I led the yarn switch effort on my direct team and had people coming to me weeks after, asking how to do X because of the different commands.
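The renamed commands mostly cluster around installing and removing packages; a non-exhaustive mapping for yarn v1 (double-check against your versions' docs):

```shell
# npm command                  ->  yarn v1 equivalent
# npm install                  ->  yarn (or: yarn install)
# npm install --save lodash    ->  yarn add lodash
# npm install --save-dev X     ->  yarn add --dev X
# npm install -g X             ->  yarn global add X
# npm uninstall --save X       ->  yarn remove X
# npm update                   ->  yarn upgrade (semantics differ slightly)
```

It's exactly this last pair where the semantics diverge enough to surprise people.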
I suspected that someone would mention this, but the fact of the matter is both systems are mostly interoperable. The switch from npm to yarn would be nothing like migrating from Gulp + Browserify to Webpack.
To switch to yarn, I printed out a one-page cheat sheet and taped it to my wall. I’ve had one blunder in the time I’ve used it (misunderstanding what `yarn upgrade` did x_x), but it was easily reverted.
Even in this relatively close case, it's not a zero-overhead transition. There are some changes. There are some new behaviours. You still need to know which things really work exactly the same and where the differences come from even if those differences are only minor. You always need due diligence about whether a new tool is reliable, future-proof, trustworthy, etc. And that's all after finding out about the new tool and deciding this one is actually worth looking into.
Multiply all of that by the absurd degree of over-dependence and over-engineering in the JS ecosystem, and it's entirely fair to question whether the constant nagging overheads are worthwhile.
It _baffles_ me that _technologists_ (whatever that means) dismiss others' writing without actually reading it. It's not us, the detractors, complaining about using new technology because it's "new". For one, it's not new; it's the n-th undeveloped iteration of a technology 20 years old. We're not complaining about you using technology; we're complaining about you ignoring advances that could buy alcohol in the US by now.
The JS ecosystem is pretty well known for changing very fast compared to other mainstream languages. This is a fair point; NPM could implement the local cache without (hopefully) breaking anything.
From my understanding they’ve always had one, but until npm@5 it wasn’t safe for concurrent access (side note: Maven still isn’t) and was prone to corruption. I think they’re making their way toward true offline caching à la yarn, if they haven’t done so already.
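For the curious, npm@5 did ship cache tooling and offline-leaning install flags; a sketch of the relevant invocations (behaviour varies a bit across npm versions):

```shell
# Check the integrity of the local cache (npm@5+)
npm cache verify

# Prefer cached packages; only hit the network on a cache miss
npm install --prefer-offline

# Fail rather than touch the network at all
npm install --offline
```

It's not quite yarn's checked-in offline mirror, but it covers most of the same "registry is down" scenarios.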
We are talking about tools here. Standards are a different beast.
For example, it is cool to have multiple tools doing the same thing, because you have the choice to use what fits your need (e.g. different web servers).
On the other hand, having multiple competing standards for the same job is just technological cancer and mostly the result of some commercial competition (or the attempt to fix a standard by replacing it).
[1] https://yarnpkg.com/lang/en/docs/cli/cache/