Hacker News Comments on "10 Things I Regret About Node.js - Ryan Dahl - JSConf EU"
JSConf · Youtube · 1001 HN points · 37 HN comments · Ranked #17 all time
Hacker News Stories and Comments
All the comments and stories posted to Hacker News that reference this video. See: "10 Things I Regret About Node.js - Ryan Dahl - JSConf EU" https://m.youtube.com/watch?v=M3BM9TB-8yA
> …I already have taken care of a lot of those Node issues in my own projects, there is not a ton that I see compelling in Deno.
I felt this way too before using Deno for a while. So far I enjoy:
- Breaking from npm/node_modules support on purpose turns out to be a real plus for me. It's refreshing to have dependencies referenced by URL (see the sketch after this list), and it's good to have them cached centrally by default (in $HOME/Library/Caches/deno on macOS and $XDG_CACHE_HOME/deno or $HOME/.cache/deno on Linux, for example).
- Use of web platform APIs (https://deno.land/manual/runtime/web_platform_apis ) and the work to standardise those across platforms (https://wintercg.org/ ) is encouraging.
- `deno lint` and `deno fmt` make adopting and using JS/TS feel more like Rust/Go/other languages with good built-in ceremony-free tooling.
- Fresh is turning into a very nice Next.js/Astro alternative (https://fresh.deno.dev/ ) that I found very easy to learn and deploy, with great performance and developer experience out of the box.
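For anyone curious what that first point looks like in practice, here is a minimal sketch of a URL import; the std module and version below are illustrative, any HTTPS URL behaves the same way:

```ts
// Illustrative std module and version; any HTTPS module URL works the same way.
import { delay } from "https://deno.land/std@0.140.0/async/delay.ts";

await delay(100); // downloaded on the first run, served from the local cache afterwards
console.log("dependency fetched by URL and cached centrally");
```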
Bun is interesting, but I wish it didn't embrace node_modules: perpetuating its use instead of attempting to move the community on by recognising it for the mistake it was feels sad to me. (See Ryan Dahl's "Design Mistakes in Node" PDF or talk for more: https://tinyclouds.org/jsconf2018.pdf and https://www.youtube.com/watch?v=M3BM9TB-8yA )
This explanation resonates with me. I first saw Ryan Dahl's talk about his Node.js regrets[0] during the early days of the pandemic. I thought I'd be eager to try Deno when it became more mature, but I still haven't tried it. I guess I'm not convinced that it's 10x better than Node.js.
i also asked this on twitter so just putting some replies here:
- Doug Engelbart https://en.wikipedia.org/wiki/The_Mother_of_All_Demos
- Twilio https://avc.com/2016/06/best-seed-pitch-ever/
- Stripe - "7 lines of code"
- Netlify - https://www.smashingmagazine.com/2015/11/modern-static-websi...
- Heroku - https://12factor.net/ and git push heroku master
- Cloudflare - https://mixtape.swyx.io/episodes/cloudflare-at-techcrunch-di...
- Node (https://www.youtube.com/watch?v=ztspvPYybIY&feature=youtu.be) and Deno (https://www.youtube.com/watch?v=M3BM9TB-8yA)
- Firebase (reportedly) - https://twitter.com/_davideast/status/1537864335715860482
smaller companies/less impactful pitches that i still like
- Redux https://www.youtube.com/watch?v=xsSnOQynTHs
- Stackblitz https://twitter.com/sulco/status/1537867531511287808?s=20&t=...
- Comm https://www.notion.so/commapp/Comm-4ec7bbc1398442ce9add1d795...
- Mongodb https://twitter.com/mongodb/status/1192530877148008448
- Let's Encrypt https://twitter.com/mbleigh/status/1537866383710511104
- Figwheel-Clojurescript https://www.youtube.com/watch?v=j-kj2qwJa_E&t=598s
⬐ swyx (me again) I've now added this to my tracking list of pitches: will keep this live as stuff comes in!
⬐ creativenolo Great question. Extra points for this comment too!
⬐ johns This is the Twilio demo you should watch https://avc.com/2010/08/how-to-pitch-a-product/
The creator of NodeJS talks about how one of the things he regrets is hard-coupling Node to the NPM registry[1]. I imagine this makes it hard to have curated or trusted third-party registries (although note that it is possible to configure private or third-party registries in Node). This is also one of the problems the creator tries to solve in his new runtime, Deno.
If you want a couple other examples, here's what I've got offhand. Perhaps the original example: Hoare helped popularize null pointers, and then gave the "Null References: The Billion Dollar Mistake" talk https://www.infoq.com/presentations/Null-References-The-Bill...
The creator of nodejs talking about some of its mistakes: https://www.youtube.com/watch?v=M3BM9TB-8yA
Nada Amin was part of the Scala team, and also wrote a wonderful paper about how Scala's type-system is fundamentally unsound: https://namin.seas.harvard.edu/publications/java-and-scalas-...
bradfitz was a core go team member, and wrote a post about what their net.IP type got wrong https://tailscale.com/blog/netaddr-new-ip-type-for-go/
I have no doubt there's many more examples too, but those are the ones I can think of offhand.
Happy birthday, Node.js. Sure, your creator has said some unkind words, many of them true and reasonable [1]; and has left to create another child, a newer and better version of you [2]... But we know you're here to stay, decades into the future - even if as legacy code. Having written numerous applications, many of them running for years now, I'm glad to have known such an easy-going, friendly server-side runtime.
[1] 10 Things I Regret about Node.js - https://www.youtube.com/watch?v=M3BM9TB-8yA
⬐ peanut_worm After using Deno once I don’t want to switch back. It isn’t a massive difference but it sure is nice.
> Just because they are attempting to address it does not mean they will succeed.
I strongly agree with this statement, hence I don't see how switching from "npm" to "raw URLs" will solve anything...
The problem with Node dependencies is bigger than just "npm is not a good package manager"... Honestly, in that case just fork node and replace npm with something else...
Here the problem lies in a mixture of poor built-in APIs, which are buggy, and a lack of vision for the language, which have been core to the language since its origin.
Deno doesn't seem to address those at all...
Again, it just seems to be "npm is bad, and I want to use typescript natively with web apis"...
I just know very well that Ryan is repeating exactly the same mistakes as with Node, with the same obsession he had with "EPOLL"[0] back then, and it will end up in a new fiasco.
[0] https://youtu.be/M3BM9TB-8yA (Can't find the specific part where he mentioned "EPOLL")
⬐ v8dev123 npm may be bloated but I'd say it's better than pip. pip is insane, you have to set up a virtual env (hurts UX), but with node you don't have to.
⬐ e12e > (...) will end up in a new fiasco.
Are you saying nodejs is a fiasco?
⬐ gadrevre [0]: 15:43
⬐ tarruda > I just know very well that Ryan is repeating exactly the same mistakes as with Node, with the same obsession he had with "EPOLL"[0] back then, and it will end up in a new fiasco.
"epoll" is a Linux API for listening on multiple file descriptors. Different platforms have equivalent APIs, and these are normally at the core of any scalable non-blocking I/O network program.
Can you elaborate on why you think he had an obsession with "epoll"? More importantly, can you elaborate on what the fiasco was? epoll is still used under the hood by Node.js (through libuv) and by many other network servers such as Nginx.
Confirmed. https://youtu.be/M3BM9TB-8yA
⬐ barry27 nope, it's den-o. it can't be dee-no. that would be silly. like saying moss-cow.
Ryan Dahl (author of the announcement post) is the creator of Node.js, so I think he's got a right to say these things! Also see "10 things I regret about Node" [0]
⬐ cphoover Ryan Dahl left the leadership of Node.js pretty early in its development. A lot of people can be considered "the creator" of Node.js, to be fair.
⬐ murukesh_s > Ryan Dahl left the leadership of Node.js pretty early in its development. A lot of people can be considered "the creator" of Node.js, to be fair.
I don't think so. I have been using Node.js since 2010 and following the ecosystem. Node.js is the child of Ryan Dahl. Other than Ryan, perhaps Isaac Schlueter is the most influential person, having developed NPM as a separate project that was later merged into Node. Ryan could have done it himself or embedded it in Node.js, but judging from Deno he is not keen on a centralised package repository, so he may have let Isaac build NPM as a separate project.
From what I could observe, the overall API is still pretty much the same as in 2010. What's changed is V8 getting updated to newer versions (and thereby supporting new ES features) and other performance optimizations. But no one can deny or take away Ryan's credit as the creator.
⬐ brookside No. Ryan Dahl was the initial individual creator of Node.
⬐ nrmitchi While I'm not going to argue the history of if/when he left the project, my understanding is that it's fairly agreed upon that he is the creator of Node.js. If you google "node js creator", he's an embedded answer (not a search result). The first line of the Node.js history section on Wikipedia is "Node.js was written initially by Ryan Dahl in 2009".
You can make whatever point you want, but maybe try to do it without rewriting history, or changing the agreed-upon definition of "creator".
⬐ notriddle > If you google "node js creator", he's an embedded answer (not a search result).
Embedded answers are worse than useless.
https://www.google.com/search?hl=en&q=who%20invented%20hands
⬐ XCSme The result is relevant and the information is (presumably) correct: "Introducing hand disinfection standards"
⬐ nrmitchi Finding an exemplar of a bad response does not make the entire category "worse than useless".
Recommended "literature":1. 10 Things I Regret About Node.js - Ryan Dahl - JSConf EU (https://www.youtube.com/watch?v=M3BM9TB-8yA)
2. Brendan Eich: JavaScript, Firefox, Mozilla, and Brave | Lex Fridman Podcast #160 (https://youtu.be/krB0enBeSiE)
⬐ redisman Sure, but these are all known issues to any senior dev. You don’t have to pull shady hairballs from npm for every little feature you can think of. I’ve run node in production for the last 5 years, so at least to me it counts as battle-tested and “boring”.
Deno is a JavaScript runtime much like Node. For the reasoning behind creating Deno I recommend "10 Things I Regret About Node" by Deno's author [1]. Deno differs from Node in several aspects; most notably:
- Deno supports only ES modules, there's no built-in support for CommonJS modules
- Deno's APIs are all promise-based
- Deno does not use NPM, instead it can pull code from any URL, much like browsers do
- Deno has a built-in permission system that by default runs your code in a full sandbox, letting you opt in to breaking out of the sandbox (e.g. to read a file from disk); see the sketch after this list
- As you've mentioned, Deno can run .ts files without an explicit build step
- Deno comes with a full toolchain in a single binary (formatter, linter, test runner, bundler, doc generator)
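To illustrate the permission point, here is a minimal sketch; the file name is hypothetical, and the flags shown are the standard Deno CLI permission flags:

```ts
// read_config.ts - reading a file fails under Deno's default sandbox
// and only works when the process is granted --allow-read.
const text = await Deno.readTextFile("./config.json"); // hypothetical file
console.log(text);

// $ deno run read_config.ts                -> PermissionDenied
// $ deno run --allow-read read_config.ts   -> prints the file
```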
⬐ mpoteat Personally I appreciate being able to choose my linter, compiler, dialect etc. I also tend to prefer distributed solutions. Deno running the entire environment is a negative for me, at least for now. To me, it just shows an approach of ignoring what already exists and reinventing the wheel.
I've tried to get Deno to work before in production, but it had so many compatibility issues last I tried it would take weeks or months to refactor things so it would work.
⬐ bartlomieju ⬐ z3t4 > Personally I appreciate being able to choose my linter, compiler, dialect etc.
I completely agree with that! But on the other hand, with the plethora of tools available it can be quite overwhelming to configure all of them, especially for new users.
> I've tried to get Deno to work before in production, but it had so many compatibility issues last I tried it would take weeks or months to refactor things so it would work.
Work on a compatibility layer with Node is ongoing [1]. With every release more APIs become compatible, but it's far from finished.
⬐ mark_and_sweep > Personally I appreciate being able to choose my linter, compiler, dialect etc. (...) Deno running the entire environment is a negative for me, at least for now.
Nobody forces you to use the deno dev tools. You can still run eslint, prettier and closurescript, if that's your sort of thing.
Personally, I prefer deno lint over eslint and deno fmt over prettier since they are much faster. I'm even using dprint (which is a standalone project for code formatting, https://dprint.dev/) in Node projects.
Similarly, before deno test, I created my own deno testing tool. Now I use deno test instead since it's just better - not because anyone's forcing me to use it.
The integrated dev tools are a convenience feature. I hope that's kinda obvious.
⬐ apatheticonion I did too, but the amount of compiler configuration out there kills me.
If you write a library, you need to support several export types for node packages. In your `package.json` you must include `main`, `module` and `exports` fields and provide two compiled outputs, one CommonJS and one ES modules.
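As a rough illustration of that complaint (the package name and file paths here are hypothetical), the dual CommonJS/ESM setup ends up looking something like this in `package.json`:

```json
{
  "name": "my-lib",
  "main": "./dist/index.cjs",
  "module": "./dist/index.mjs",
  "exports": {
    ".": {
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```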
Then you need to worry about mutating import paths to include the file extension, which can cause trouble when you keep the commonjs and esmodule files in the same folder.
Also, the ES module loader in node doesn't have access to things like `__dirname`, so certain things can break.
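For reference, the usual workaround in an ES module is to derive it from `import.meta.url` - a small sketch using Node's built-in `url` and `path` modules:

```js
// ES modules have no __dirname/__filename; reconstruct them from import.meta.url.
import { fileURLToPath } from 'url';
import { dirname } from 'path';

const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

console.log(__dirname);
```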
not to mention node_modules...
It's like IE support but on the back end.
Then you step into the front end and it's another whole layer of chaos.
Give me opinionated compilers with minimal configuration and let me write code.
⬐ pimterry > If you write a library, you need to support several export types for node packages.
This is a hassle, but it didn't use to be true for node - you could have said all of the above about commonjs: wasn't it nice to have a single opinionated standard?
In the short term, deno avoids this by dropping backwards compat, great! But in the long term, I don't see how it doesn't end up in exactly the same place as soon as the next big change to JS modules comes out, or the next new build environment or wasm integration becomes bigger or...
Unless they have a fundamentally different strategy for the future (either 'we will never evolve' or 'we will evolve with ecosystem-wide breaking changes') they're going to end up in the same state as node today, eventually. I haven't seen any discussion of such a strategy at all. It's just a temporary reset - unlike node, they get to break backward compat and support the One True Format because they're new, that's all.
Node.js's module system (kinda like CommonJS) is what made Node.js popular. ES modules, while taking away features like scoped module support and dynamic import, are very complicated and allow bad practices like include files. Promises are very complicated compared to first-class functions. What makes JS/Node hard to grasp is that it's async. Async is an (often unnecessary) optimization, with tradeoffs.
Loading modules from URLs is a cool concept! ES modules help here, but you could also have a package-list file that lists all dependencies of dependencies as well as download mirrors, or hashes with peer-to-peer distribution.
A permission system is nice, modules should not have system access by default.
Not everyone wants to use TypeScript. It will probably become obsolete once optional type annotations get added to JavaScript.
An opinionated toolchain is nice, but should be optional IMHO.
⬐ wperron It's important to remember that while, yes, CommonJS made Node popular, it did so because it filled a void in the Javascript syntax and specs. There was nothing to formalize the concept of a "package" back then.
That's not true any more, ES Modules have made it into the spec, so that's what Deno is using.
As for package-lists, the current convention in the community, if you have a decently sized library, is to have a `deps.ts` file where you re-export all of your dependencies, making it an equivalent of package.json and helping with upgrading dependencies across a codebase.
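A minimal sketch of that convention (the module URLs and version are illustrative):

```ts
// deps.ts - re-export third-party dependencies from a single file.
export { assertEquals } from "https://deno.land/std@0.140.0/testing/asserts.ts";
export { delay } from "https://deno.land/std@0.140.0/async/delay.ts";

// elsewhere in the codebase:
// import { assertEquals } from "./deps.ts";
```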
TypeScript is already optional in Deno! It will run any .js file just fine, and you even skip the compilation part.
⬐ mark_and_sweep > Not everyone wants to use TypeScript.
You don't have to use TS. Deno runs plain JS, too.
> [TypeScript] will probably become obsolete once optional type annotations gets added to JavaScript.
What makes you think that type annotations will be added to JS? I think it's far more likely that browsers and other runtimes will natively support TS as a separate language rather than JS evolving to become TS.
> An opinionated toolchain is nice, but should be optional IMHO.
It is optional. You don't have to run deno lint, deno fmt, deno test, etc. But at the same time, they are pretty good tools so you might want to try them.
⬐ z3t4 ⬐ richeyryan > What makes you think that type annotations will be added to JS? I think it's far more likely that browsers and other runtimes will natively support TS as a separate language rather than JS evolving to become TS.
It has already been tried with Dart. Dart was made because JS lacked a type system, preventing further optimisations. Support for Dart was added in Chrome.
Another popular JS transpiler is CoffeeScript; most of its syntax is now in JavaScript.
⬐ mark_and_sweep I have no data to prove this, but I feel like TS is far more widely used than CoffeeScript and Dart have ever been.
Support for Dart in Chrome was added before Dart was popular (if you can even consider it popular at all). Since TypeScript is already popular now, I think if Chrome added support for stripping the types and running TS code as JS, most devs would welcome that.
ES Modules have dynamic imports (https://github.com/tc39/proposal-dynamic-import). The proposal is at stage 4 and is available from Typescript 2.4. We're on version 4 now so I'd expect Deno to have it.
I'm not sure what you mean by "bad practices like include files" so I can't comment on that.
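For reference, a dynamic import is just an `import()` expression that resolves to the module's namespace object; the URL and version below are illustrative:

```ts
// Load a module on demand (URL imports like this resolve in Deno and browsers).
const { delay } = await import("https://deno.land/std@0.140.0/async/delay.ts");
await delay(50);
console.log("module loaded dynamically");
```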
I've found most Javascript developers prefer the await syntax with promises to using callbacks. It gives the code the appearance of being synchronous with the ability to do things more asynchronously if you need. I haven't encountered many who actively prefer the callback style. It gets unruly fairly quickly.
I can't see optional type annotations ever being added to Javascript. They would have to be checked at runtime which isn't something I'd imagine browser vendors wanting to implement.
The feeling that I get is that the standards committee is trying to bring Javascript to be the best dynamic language it can be and if people want more comprehensive guarantees then there are excellent tools like Typescript which give that option.
The fact there have to be multiple implementations of the standard in the various JS runtimes makes it a hard sell to evolve Javascript too far.
⬐ twodai Really like async await. The only case I've found for callbacks is the top level function call in a script. Calling ".then()" is useful since top level await is still not a thing, and may never be.
⬐ richeyryan Top level await is at stage 3: https://github.com/tc39/proposal-top-level-await
It's in Typescript 3.8, Node 14.8 and probably in your Babel setup. I probably won't get to use Node 14 in prod for a while, but I get by with a `main` function that has everything in it and gets called at the bottom. (A sketch of that pattern is below.)
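A minimal sketch of that `main`-function pattern (the function names are illustrative):

```js
// Without top-level await: one async entry point, invoked at the bottom.
async function fetchGreeting() {
  // stand-in for real async work
  return new Promise((resolve) => setTimeout(() => resolve("hello"), 100));
}

async function main() {
  const greeting = await fetchGreeting();
  console.log(greeting);
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```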
In his 2018 talk, "10 Things I Regret About Node.js" https://www.youtube.com/watch?v=M3BM9TB-8yA&vl=en he identifies seven (not ten) regrets.
1. Not sticking with Promises: This is changing, slowly. You can `import {readFile} from 'fs/promises'` in Node and it works as you'd expect, including top-level await (see the sketch after this list). (Backwards compatibility means the callback API can never go away.)
2. Security (your linter shouldn't have complete access to your computer and network): Deno hasn't done a great job with this, either. You can restrict the access that a Deno process has, but you can't restrict the access for individual modules. If any module in your server needs to access something, then every module in your server can access it.
I predict that module-level authorizations will be solved some day by browser vendors, and that Node and Deno will adopt that solution. Deno will probably have to throw out their own approach when that happens.
3. Build system (GYP). This has no effect on userland Node developers. You build node with make. Another build system could be adopted, but I think nobody's bothered. Deno has a protobuf FFI to communicate with V8. You can do that with Node if you want. Shrug.
4. require("package") relies on package.json. Deno uses import maps. Node will probably honor import maps someday, too.
5. node_modules: He said it "complicates the module resolution algorithm." Meh. He also points out that node_modules is too large, but that's a Node cultural problem. Deno's community is still small, but it will have that problem, too, except it will have a large shared "cache" instead of a large local node_modules folder.
6. require("module") without the extension ".js": Deno does this, too, using import maps. It's fine.
7. index.js: Again, it "complicated the module loading system." Meh?
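Here's the sketch mentioned in point 1 - the promise-based fs API plus top-level await. It assumes an ES module (a .mjs file or "type": "module") on Node 14.8+, and the file it reads is just an example:

```js
import { readFile } from 'fs/promises';

// Top-level await works because this file is an ES module.
const text = await readFile('./package.json', 'utf8');
console.log(JSON.parse(text).name);
```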
⬐ jswny As a userland Node developer I’ve had tons of problems with GYP and packages I install trying to use it. It absolutely leaks into the developer experience.
⬐ realityking ⬐ davidtranjs Things should be getting better with the increasing adoption of N-API. Though it's still a long journey.
Wait, there is a fs/promises module? I still use Bluebird.promisify in every recent project :facepalm:
⬐ e1g ⬐ 52-6F-62 Node also ships "util.promisify" in its core https://nodejs.org/api/util.html#util_util_promisify_origina...
> You can `import {readFile} from 'fs/promises'` in Node and it works as you'd expect, including top-level await
I can't believe I missed that. I've still been writing promise wrappers like a fool.
⬐ city41 util.promisify is also useful: https://nodejs.org/dist/latest-v8.x/docs/api/util.html#util_...
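For anyone who hasn't used it, a small sketch of wrapping a callback-style API with util.promisify (the file read is just an example):

```js
const util = require('util');
const fs = require('fs');

// fs.readFile takes a callback; promisify turns it into a promise-returning function.
const readFile = util.promisify(fs.readFile);

readFile('./package.json', 'utf8')
  .then((text) => console.log(JSON.parse(text).name))
  .catch(console.error);
```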
It's just so different, lots of people could claim different killer features. A strong contender for one section of the community might be the first class typescript support. Other people might like the way you can run so much from a single executable.
Anyway, node is a pretty mature platform now, if it dies, it'll die like java-the-language is dying. Very very slowly, and with parts of its platform holding out much longer than other parts. Or alternatively it'll evolve and slowly adopt things from other ecosystems.
I don't think we're at the stage where we can say that a bet on deno is definitely right, but it has a lot of interesting ideas and lots of people will be picking it up where they can.
If you're interested in the difference between node and deno, check out the talk '10 things I regret about Node' https://www.youtube.com/watch?time_continue=1&v=M3BM9TB-8yA&...
⬐ silentprog Since when is java slowly dying?
⬐ snazz ⬐ throwaway189262 Java mindshare has been dying for a long time. That said, it's still taught in schools and all sorts of companies still use it (including "cool" ones like Google and Apple).
⬐ The_Colonel ⬐ andai Java mindshare might be slowly shrinking but it is very, very far from dying.
⬐ rektide ⬐ spiritwander I'd ask, what is sustaining java? Where are the replacement techies picking up the torch, where are youngsters seeing java?
I have some answers. It's not a lost cause. Death is not likely, & I do appreciate a lot of java (cdi rocks, microprofile is doing great, performance is good, it has excellent big data tools & many serious pieces of infra are built with it). But how java can retain liveliness, over time, & how the experience & knowledge base continues, is a real challenge. Being on Android helps a lot, but that's a radically different world, immensely unlike the server side, with its own elaborate platform-specific architectures & libraries. There aren't a lot of places where java has a hold on UI/UX-centered systems outside Android, so it risks becoming too invisible.
⬐ pjmlp When SAP rewrites their stack in something other than Java, or Amazon or Alibaba stop contributing to Java, then I'll start worrying.
Microsoft decided that Java's death is so imminent that they are now an OpenJDK contributor, collaborate with Red Hat on VSCode Java support, have bought jClarity, and Java has first-class support on Azure.
⬐ j-pb ⬐ SiVal When big banks stop maintaining their COBOL bases, then I'll start worrying about the death of COBOL.
SAP is a backwater for programmers. They're at least 20 years behind the curve.
⬐ pjmlp ⬐ lenkite Plenty of programmers dream of SAP-level salaries.
Do you know that several JVM optimizations have been contributed by them?
I, for one, would rather work with SAP than deal with the npm ecosystem.
⬐ j-pb The fact that you bring up salaries perfectly represents everything wrong with SAP.
A job is more than a money maker, it's a platform for expressing yourself.
SAP wouldn't understand that though, so they hire the mediocre (to write Java, a language explicitly designed to cater to less capable programmers) to produce joyless, soulless stuff. Of course they pay well: they have big customers with no taste for good work but plenty of money; nobody in their right mind would take an SAP job if the money didn't make up for the grueling tedium.
If you think that it's either SAP or node, oh boy, that's a false dichotomy of hilarious proportions.
And BTW, I've heard that Cobol salaries are quite high too.
⬐ pjmlp Artists express themselves, Software Engineers get the job done and go home.
Yeah, ironically COBOL is more modern than C in language features, yet C seems to still be quite appreciated for a 50-year-old language.
⬐ rektide We are building the noosphere here. We don't all have to wear that mantle, but some of us should have hopes and ambitions in our jobs. They don't even have to be consuming, but simply seeing this as an ongoing continual project seems due. We're coming close to RFC10000. No one goes home. We strive to open possibilities for humanity so we can all each keep exploring & going further.
Edit: oh good to see you again pjmlp, after last week. We fell to different sides then too, but I have continually enjoyed your writing, & again, while I talk & disagree, I would also not say you are wrong. You are more than correct, in the vast amount of cases. But I would suggest that this field in particular must also be heads up looking up & about, that just a job doesn't fully describe us all.
⬐ j-pb The way you get the job done is also an expression of yourself. There is never only one way to do things.
The issue is that if you work for a company like SAP, you most likely don't have enough personal flavour to come to interesting or meaningful solutions. If you don't have passion, be it even for correctness, you'll build mediocre things.
Germany paid SAP millions for their Covid tracking app. Researchers had already worked out a protocol for them, apple and google had already implemented it. All they had to do was write a simple frontend and be done.
But politicians decided that such a big and important job needs to go to big and important companies like SAP and T-SYSTEMS, for millions of euros.
SAP tried to be cool and agile, they even forced their employees to create github profiles (something none of them had done prior).
In the end they delivered a month too late, with critical bugs. And millions in server maintenance fees.
A system that was already 90% implemented, with network requirements that can be satisfied by fiber to the home connections, or a single S3 instance.
Why? Because they used mediocre people, who didn't have enough passion to create github profiles. Because they used the least common denominator, Java, and reinvented stuff that was already implemented. And because everybody just wanted to get the job done and go home, no expression of self involved.
But hey, I heard they pay good.
Also add Apple and Netflix to that list.
⬐ doteka I guess a Roman, asked circa 300 AD how he feels about the decline of the empire, would have replied similarly.
Java’s mindshare just isn’t where the exciting things in this field are happening. It’s more the domain of overengineered “big data” platforms and clunky enterprise software. Yes, SAP employs a ton of java developers I’m sure, but many devs would rather switch careers than work on anything resembling a crufty ERP.
Thus, the ecosystem stagnates, due to the dead sea effect - everyone who could push it forward into new areas has no interest in being anywhere near it. And stagnation is a long, drawn-out death all the same. It still has a ton of momentum, and millions of outsourcing devs who only know Java, so it will be with us for a while - but make no mistake, the decline started quite a while ago.
⬐ pjmlp In my domains only Java or .NET stand a chance of winning RFPs, and if anything .NET is the one having trouble, because many are unsure what Microsoft actually wants to do with it, and are slowly getting pissed off with the multiple rewrites where there is always something left behind.
I only see RFPs for Java-based technology going up.
⬐ yrio ⬐ lenkite Do you think .NET's future is bleak?
⬐ yrio Do you think .NET is currently being badly managed compared to Java? And do you think it will negatively affect .NET's long-term future?
⬐ pjmlp Kind of. Microsoft seems to be trying to fix UWP, while trying to turn .NET into a cross-platform runtime whose ecosystem has been Windows-only for the last 20 years.
So while they are doing lots of nice performance improvements, there are plenty of businesses not so happy that their 20-year investments don't run on .NET Core, and if a rewrite is needed (e.g. WCF vs gRPC) then why not just jump to something else.
Just see the lengthy roadmap, and the considerations that not everyone was happy with "AOT" (packing everything into an exe that unzips on execution).
https://github.com/dotnet/designs/blob/main/accepted/2020/fo...
Also check the repositories from Project Reunion, it is pretty much WIP right now.
Most of the world uses Java, including "happening" companies. Please be highly aware of HN bias. If one only reads HN, one can get a highly distorted view of the reality of software engineering.
⬐ dragonwriter ⬐ rektide > Most of the world uses Java, including "happening" companies.
On the contrary, even boring enterprises often don't actively use Java. Sure, lots of places do, but lots of places have either never adopted Java or dropped it (or at least relegated it to certain legacy systems), especially, on the enterprise side, places that at some point became largely Microsoft shops; even if they've since moved beyond that, they probably didn't go back to Java except maybe for Android work.
⬐ lenkite I guess you are in a different bubble. Netflix's backend is mostly all Java, Apple's backend (apple media products engineering) is Java based. Neither of these companies will ever become a Microsoft shop.
Personally I find the Java engineering to be excellent & not overblown, & I appreciate so many of their patterns. Making Service Provider Interfaces for everything? Epic, awesomely powerful. Maven? Shockingly regular & predictable & clear, although what a lot of the plugins do is wild. I really don't understand most people's grief & complaints about Java; they seem abstract & emotional, but I for one don't mind typing, & don't feel bogged down in Java. There's some good words in the grandparent about who uses Java, that I respect a lot & that hold enormous truth.
I don't know anything about COBOL. But neither do any of my programmer friends. But COBOL is also far from dead, yet it might as well be to the rest of the programming world, I feel like. It has no mutual impact, it's too far removed from the regular happenings, & I'm not sure how or where dialog would open.
So yes, like, I think the Roman example is really good. Communication is getting cut off, people are stuck. Things might be good here, but the world is regressing to a pre-Romanized status, with little overland travel, unable to harvest the breadth & intelligence of the many great citizens, that Java used to be a contributing key part of.
I don't think Java is stagnant or dead, it's not so glum. Micro-profile is being wonderful. DropWizard is a very lovely quite popular scene still. But right now, Java's presence in the AP Computer Science curriculum is doing an enormous amount of heavy lifting for Java, and once that dam breaks- and it doesn't seem like there are many fitting replacements atm, with all the nice neoclassical columns & facades to make the language feel academic/computer-science-y- it's gonna be harder days for Java, & the weakness within, the being more cut off, is going to hurt.
"Where are youngsters seeing java?"Funny you should ask that. If a high school kid takes Advanced Placement (AP) Computer Science in the US (including the kids here in my neighborhood in Silicon Valley), the only language taught is Java. Kids who want my help with their programming projects or classes here in the Valley, even the undergrad CS majors at Stanford, are always going to ask about one of three languages: Python, JavaScript, or Java.
In the shadow of the Apple spaceship, the iPhone is exactly as Apple intends: an appliance for you to communicate with friends and buy things from Apple, certainly not a computer for you to program. Nobody ever asks me about Swift. If you want to program a computer--sorry, "code"--it's always and only Python, JavaScript, or Java.
java is not platform-dependent, it is a platform..
⬐ nzly It is nothing about Java or even mindshare, it is all about money, shareholders, new companies, or even just having to reshape the wheel again. This type of subject is a pseudo-proposition. When many people are using Java, some guys need to create something new (maybe not really new) to gain financial interest. Of course they have to pick out something bad about Java or whatever language and then fill the marketplace with their new stuff, but there's actually nothing really new. C, C++, Java, PHP, Ruby, Python and so on. When everyone is familiar with something or has known it for long, the capital and money stop moving, which is bad for game players such as new companies in Silicon Valley and money makers from Wall Street.
It's been 3 billion devices for twenty years :)
⬐ findjashua The JVM as a platform is doing just fine, but other JVM languages seem to be taking over Java's territory:
1. Kotlin replaced Java for Android development
2. Scala has become the primary language for some of the major tools in the data stack - spark, kafka etc
I’m sure there are other examples as well
⬐ pjmlp ⬐ pjmlp Java is going away from the JVM about as much as UNIX is losing C.
1. Politics due to how Google screwed Sun with Android Java
2. Scala is a tiny dot in the JVM world and it remains to be seen how Scala 3.0 will be taken up, especially if the Python-like syntax is actually adopted
⬐ reader_mode Well, UNIX (or Linux, more importantly) is "losing" C: C++ and more recently Rust are slowly displacing it. It's still probably going to be the ABI everything is based on, but the language itself is not a first choice for many and that number is increasing over time.
Is that what you were implying? It doesn't sound like it from the rest of the comment, because you seem to disagree that Java is losing mindshare on the JVM.
⬐ pjmlp First get a UNIX clone written in something other than C to take over the world.
⬐ bitwize https://redox-os.org
⬐ pjmlp That is not a UNIX. Having some level of POSIX compliance doesn't turn an OS into UNIX, unless you mean Windows is a UNIX as well.
It is an HN thing; in my circles the choice is only Java or .NET and it isn't going to change for years to come.
Java is dying so fast that nowadays even Microsoft contributes to OpenJDK, having bought jClarity in the process.
⬐ nullsense ⬐ Shorn Where I am it's largely dominated by Java and .NET with bits of other languages here and there.
I jumped ship to .NET after Oracle changed their licencing. I don't like them as a vendor and don't want to be tied to them.
Java will still live to a pretty ripe old age but it does feel like it's a little over the hill nowadays.
⬐ lenkite The problem with .NET is that Microsoft have a proven history of seeing the next shiny thing over the hill and dropping support for existing frameworks and libs while the "new and improved" framework is rolled out. See! This way is better! Well, until the next better way comes up within a few years.
I know of teams who have moved to Java simply to avoid the churn. Many folks prefer the boring but stable Java ecosystem.
⬐ pjmlp As if Microsoft were an angel vendor.
.NET Core also doesn't run on all the platforms where there is a Java implementation, and not everyone is happy to rewrite their .NET Framework code for Core, while Microsoft is again in the midst of leaving stuff behind, as the ongoing discussions about CoreRT, .NET Native, Project Reunion, and MAUI vs Blazor vs WPF vs Forms show.
⬐ nullsense Cool. I'm happy enough with it all, so that's me sorted.
You're on HN. If it's not growing explosively - it's dead.
Do I really need a "/s" here?
⬐ alephu5 ⬐ kybernetikos As well as this, I think it's also perceived to be a clunky old language. Our modern languages have much more ergonomic build systems, are much less verbose and have coding conventions tailored to modern tastes.
This is important, so I predict that over the next few years, as the 20-30 year olds of today gain more senior positions, they're going to choose python, JavaScript, kotlin, golang or rust for new projects.
Well, part of my point is that it's very very slow, and because of that, there are all kinds of points I could pick for 'since when'.
Maybe I'd pick when apple removed the default install from mac os, or maybe I'd pick when browsers made it impossible to run applets, or maybe I'd pick when the major java IDEs started pushing other languages like kotlin or xtend, or maybe I'd pick the oracle acquisition (never a good thing), or maybe I'd pick the release of go - a language squarely targeting pretty much the only niche that java has left (enterprise development of servers by mixed ability teams), maybe I'd pick the point when java development felt like it became more like configuring metadata for frameworks than actually coding, or maybe I'd pick when google started showing kotlin as the default language for developing on Android rather than Java.
Here's google trends for java: https://trends.google.com/trends/explore?date=all&geo=US&q=j...
Java-the-language is not going away any time soon, but if you expect it to do anything except decline, I think the future will be disappointing.
⬐ pjmlp Yep, so much decline, with 80% of mobile phones running it (even if it is Android Java) - and on top of what do you think all the Android tools run?
Then do you think the likes of SAP, Adobe, Amazon, Azure, Ricoh, Gemalto, PTC, Azul, ... are going to port their golden eggs to Go now?
⬐ vfistri2 Nope, they'll probably just buy new software that is built on something like go
⬐ sinsterizme This doesn't refute his point; Java the language is separate from the JVM. And those big companies will follow the decreasing Java usage trend due to developer supply and demand. Anecdotally, it seems pretty obvious some of those companies are already doing this by writing new stuff in other, better languages when possible! To me that's a good indicator a language is slowly dying; maybe you use a different definition.
⬐ pjmlp Most JVM implementations are around 80% Java with some C++ for the JIT/AOT and GC; GraalVM and JikesRVM have an even bigger percentage. It is an illusion to use the JVM without touching or understanding how Java works.
⬐ andykx ⬐ andyroid I mean, I know tons of Scala and Clojure devs who don't touch Java at all in their day-to-day work. I don't think the user(s) above were referring to the implementation of the JVM.
⬐ pjmlp Then something blows up in their face and they are completely lost, because they don't know what powers their code or the boilerplate magic they generate to pretend to be Java for the JVM. I have seen a couple of those guys; I get called in to sort out their problems.
⬐ afiori But this is still compatible with Java slowly dying. It is possible that in the future Java's role in the JVM will share some similarities with C's role in CPython.
Pretty much any language running on the JVM will make it dead simple to import and use libraries written in Java - in fact one of the main reasons to choose the JVM is the huge amount of high quality libraries available. So while it's true that Java != JVM, it's not as simple as that either.
Java is still the best choice for many backend systems. C# and Go still don't have nearly the breadth of open source libraries available. And performance between the three is very similar; besides, Java's new GCs are far better.
For a majority of enterprise-type integrations, which I would argue are the majority of software projects, Java and C# are still the only viable choices. And I don't see that changing any time soon. Go and JS haven't really touched that space despite widespread use for webapps.
If you need a big enterprise MMORPG type system with tendrils reaching across the cosmos into your stagnant data ponds, creaky AS400, and React customer facing mirage, Java is the legendary weapon. It's your only choice, your destiny.
⬐ ornornor > React customer facing mirage
Nice
It also reintroduces one of the main problems of node back into deno. https://youtu.be/M3BM9TB-8yA?t=581
⬐ buttercubz not really, since it is built around import maps, something native to deno; the purpose is to make it something familiar, like npm
⬐ 29athrowaway ⬐ tatef npm should not be a model to be imitated. npm modules make security audits impossible.
⬐ buttercubz Trex currently uses deno.land/std, deno.land/x, nest.land and any repository. nest.land provides a blockchain-based service; we just take the way npm is used for our CLI. Trex doesn't try to reproduce the same npm issues, we just take the import maps and create a tool to manipulate them in a friendly way for those who already know the nodejs ecosystem.
In my opinion, Trex is actually working against what npm introduced to Node. Though I don't exactly know what "main problem" you're referring to, I can say that:
1) Trex supports multiple module registries, not just one. (Thus not necessarily centralized)
2) Trex is not associated with one entity in particular. This means that they have no company dedicated to hosting modules in house (another reason that they aren't centralized)
3) Node's package.json included many other things than just imports. Using Trex is completely optional, and if you do use it, an import map does not hinder one's development workflow. In my opinion, it makes dependencies easier to manage.
Well, in all the initial presentations Ryan Dahl gave about Deno, he phrased it as solving the things he'd come to realise he hated about Node. So that's what I take as the raison d'être of Deno - taking the lessons he'd learned from some years mainly coding Go and thinking about what he'd learnt from Node, and fixing what he could. Here's the classic video on the topic: https://www.youtube.com/watch?v=M3BM9TB-8yA
I just hope in 5 years we're not seeing a talk saying that, in retrospect, dropping static types was a huge error... and DoNe is a new project to fix issues like that!
(I'm just referring to this nice talk about the regretful mistakes made in Node: https://youtu.be/M3BM9TB-8yA)
Deno [1] is a new JS runtime by one of the creators of NodeJS, Ryan Dahl.
He had a great talk [2] about the lessons he learned when creating/maintaining NodeJS. Many of these lessons are being applied to Deno.
> for all the hate node_modules gets for being enormous, at least it works reliably!
Funnily enough, node_modules is one of the main regrets of Ryan Dahl, the creator of Node.js: https://www.youtube.com/watch?v=M3BM9TB-8yA&t=755s
⬐ IshKebab Yeah it's not a great design but it does at least work reliably!
I don't think that's correct. From what I remember of watching his talk[0], Ryan is a fan of JavaScript. TypeScript gives optional typing, so you can still write normal JavaScript anyway. I don't think there was ever a plan to enforce types in Deno.
I think he meant Ryan Dahl's talk: Design Mistakes in Node https://www.youtube.com/watch?v=M3BM9TB-8yA
For context: Deno's lead is Ryan Dahl, the creator of Node.
In the past, Dahl's expressed regrets over choices he made early on in Node's development, and on the direction Node has gone since he left the project many years ago.
He bravely presented on this topic at JSConf EU 2018: https://www.youtube.com/watch?v=M3BM9TB-8yA - it's a fantastic talk.
The last 10 minutes are a pitch for what a "better Node" would look like, if he were to start from scratch in 2018. The end result of that train of thought is this project, which we should probably think of as bourne-again node.
Deno's development is (presumably) strongly influenced by his experience and frustrations with Node's shortcomings in both performance and developer UX.
⬐ ignoramous > The last 10 minutes are a pitch for what a "better Node" would look like, if he were to start from scratch in 2018.
Is it fair to say (a managed) deno is what Cloudflare Workers is? If not, what would be the key differences between them?
⬐ steveklabnik ⬐ rosywoozlechan PM on part of Cloudflare Workers, and someone who was in physical attendance for this talk, here.
They're not really directly comparable other than "a JavaScript runtime built on top of V8."
Workers doesn't support TS directly, though you can compile TS to JS and run it, of course. (My team maintains a worker and this is what we do, and it works well)
Deno has its own APIs, as does Workers. Worker's main API is the Service Worker API, Deno's is not.
Workers is focused on V8 isolates as a means of achieving multi-tenancy. I don’t believe Deno does anything specific here.
Deno is mostly implemented in Rust, the Workers runtime is written in C++.
Deno is open source, Workers is not.
Workers is being used at scale in production, Deno just launched its 1.0.
I am very excited to see what happens with Deno. :) Fun history: I had been dreaming about doing "Chakra Core + Tokio" a few years back, but didn't find the time. I'm skeptical of the dependency approach Deno is taking, we'll see what happens!
⬐ mrkurt > Workers is focused on V8 isolates as a means of achieving multi-tenancy. I don’t believe Deno does anything specific here.
Deno implements the web worker API, which launches different isolates. You could implement something kind of like CF Workers in pure TypeScript, but probably not replicate your resource enforcement.
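A minimal sketch of what that looks like in Deno (worker.ts is a hypothetical module; Deno's Worker requires type: "module"):

```ts
// main.ts - spawn another V8 isolate via the Web Worker API.
const worker = new Worker(new URL("./worker.ts", import.meta.url).href, {
  type: "module",
});
worker.onmessage = (e) => console.log("worker replied:", e.data);
worker.postMessage({ task: "ping" });
```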
It's also a pretty good Rust v8 implementation. Before we (fly.io) abandoned our JS runtime we were rebuilding it with that.
Ironically, we also tried Chakra Core + Tokio. It sure would be nice to have a not-google JS engine.
⬐ steveklabnik Thanks for confirming, that’s what I meant by “specific”, I was guessing they implemented the spec. It’s just a very big focus of Workers, and I don’t think it’s a focus of Deno. Not good or bad, just a difference.
Ah interesting! Did you abandon it for reasons other than “deno exists”? Would love to hear more about how it turned out, good or bad.
⬐ mrkurt We tossed it because people needed to do much heavier compute than we expected and we realized running arbitrary executables was more useful. It wasn't technically bad, just wrong for our customers.
Deno’s existence gives me hope we can bring back the JS API; I'd love to have nginx-as-a-TypeScript-API.
> which we should probably think of as bourne-again node.
Or maybe as Perl 6
⬐ lghh And now, he'll make a whole different set of good and bad choices to potentially regret later.
⬐ pulposus Good that it’s called Deno, not Done!
⬐ gen220 such is the curse of having the audacity to develop software :)
⬐ lghh Not necessarily. I prefer trying to improve what we have versus making a new thing every time we have a relatively minor disagreement, even if that disagreement is with our past self.
EDIT: I'd like to add that clearly he is able to spend his time as he wishes.
⬐ uryga ⬐ nkozyra afaik modifying node doesn't make sense here – it'd be so incompatible that it'd effectively be "a new thing" anyway
⬐ gen220 I generally agree. Although some of these are pretty "opinionated" breaking changes (promises all the way down, package manager changes). It would be hard to convince the whole node community that these upgrades are worth the risk of forking in a python 2/3-esque way.
Forcing TS is a change node could adopt in the next major version if everybody agreed, but the node community might be too big and diverse at this point to make such an opinionated switch.
Sure, but there's a tendency to start over when the development gets hard to maintain or support instead of just fixing the mistakes.
This really feels like the fundamental response in the js world and why we see so much churn.
⬐ tengbretson When other people go ahead and build billion dollar companies on top of your development you can't "just fix the mistakes"
⬐ nkozyra ⬐ polishdude20 Why not? Billion dollar companies were built on flash, on asp, on Perl... Software changes, languages change. Billion dollar companies adapt or migrate.
But isn't this an inherent problem in software development that is really super hard to avoid?
Let's say you are deciding how to make node when it was first conceived and how it would work. You've made decisions about how the thing fundamentally works. Then after using it and developing for it many years, and after having millions of critical software projects dependent on it, you slowly start to see the shortcomings of the software that you could only see at this stage. The problem is that these shortcomings come from a false assumption or solution to the fundamental problems you had to solve when making node. Now, the only way to fix node is to change how it fundamentally works. But if you do that, millions of users' code will break. So, do you require everyone now to fix their broken code and potentially piss everyone off? Or do you give people a choice? Stick with node if you aren't noticing any of those fundamental problems you found, or switch to the new thing on your own time? It's a question of how to affect the least number of people in the least negative way.
⬐ nkozyra I think the answer is a break in backwards compatibility on a major version release.
People in JS/frontend world are willing to drop the world for the latest new thing, I see this as less jarring than, say, Python 2 -> Python 3.
⬐ alessioalex > People in JS/frontend world are willing to drop the world for the latest new thing
Except there aren't any breaking changes in JavaScript, are there? Even in Node, if anything is deprecated that is done gradually over many years.
⬐ haack I find it amusing that you use the example of Python 2 -> Python 3, a breaking change in a widely used language, that has famously been very difficult and long for organisations to deal with.
Compare that with javascript which has never had a breaking change. On top of that Typescript is a backwards compatible superset of javascript.
More to the point, Ryan has a humble explanation of what regrets he has about Node.js[1], why they exist and in some cases why there isn't an easy fix.
The point that I assume you're making, that sometimes it is better to spend significant energy to fix something, rather than throwing the baby out with the bath water, is a good one. However I'd suggest this is not one of those cases.
⬐ nkozyra > I find it amusing that you use the example of Python 2 -> Python 3, a breaking change in a widely used language, that has famously been very difficult and long for organisations to deal with.
Why is that amusing? I specifically chose that example for that exact reason. I was highlighting the difference in the audience and use case.
> However I'd suggest this is not one of those cases.
I don't see the argument that supports that, either in the post or your reply.
The thing is, I can see Beepboo 1.0 being announced in 2025 to address the things that went wrong with deno. Because there will be design mistakes. And at what point do you say 'oh too many people rely on this software to fix this, I have to start over'?
Couple this with a very real trend-chasing and resume pushing in frontend dev and I'm starting to understand why people are so cynical about this stuff.
Typescript is something more palatable to me because it wasn't throwing the baby out with the bathwater.
⬐ haack > Why is that amusing? I specifically...
My apologies, I misread.
> ... there's a tendency to start over when the development gets hard to maintain or support instead of just fixing the mistakes.
The thought that Node.js should have been 'fixed' instead of creating Deno is where I disagree. At a glance I can see a few reasons:
- Node.js maintainers + community may not even think there is something to be fixed (see various discussions in this thread about module resolutions)
- Politics, death by committee, inertia
- Effectively a dependency with npm registry (although not technically)
- Lack of backwards compatibility with changes (e.g. module resolution)
> The thing is, I can see Beepboo 1.0 being announced in 2025
Node.js was initially released in 2009 so it's probably fairer to suggest Beepboo 1.0 will be released in 2030. And yes, if it improved on Deno and solved inherent problems that couldn't be solved internally, I would wholeheartedly cheer it along.
I think it's also worth mentioning that Node.js is at a level of stability and maturity that people who plan to and have already built on it, aren't left abandoned.
Powershell has a security model. Also, Ryan Dahl (creator of node.js) is working on Deno as a replacement because he thinks he made a mistake with node.
10 Things I Regret About Node.js - Ryan Dahl https://www.youtube.com/watch?v=M3BM9TB-8yA
⬐ wesleytodd Ryan is one person and not involved with node anymore. Deno also has many mistakes IMO, so fine to agree to disagree on this one.
⬐ okareaman Interesting. You think he is making mistakes again? He complains about "second system syndrome." I've been using Deno for hobby projects and find a lot to like about it.
> ..The node_modules package resolution algorithm is how JS does namespaces.. I wonder whether they got that right through sheer genius, or by accident.
I recall hearing Ryan Dahl's regrets about design decisions in Node.js, a few of which had to do with the module/package resolution algorithm.
10 Things I Regret about Node.js - https://www.youtube.com/watch?v=M3BM9TB-8yA
There are some good aspects of it though. I like how modules/packages are just imported objects (the path resolution is a bit weird and seemingly inefficient, i.e., checking every parent folder's node_modules folder - a rough sketch of that lookup is below). It fits together well that modules are also self-contained function scopes by default (I suppose a kind of anonymous namespace) that export a plain object with assigned properties.
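Here is that rough sketch of the node_modules lookup, heavily simplified (the real algorithm also consults package.json, tries file extensions, index files, and so on; the package name is just an example):

```js
const path = require('path');
const fs = require('fs');

// Walk from the requiring file's directory up to the filesystem root,
// checking each node_modules folder along the way.
function findInNodeModules(pkg, fromDir) {
  let dir = fromDir;
  for (;;) {
    const candidate = path.join(dir, 'node_modules', pkg);
    if (fs.existsSync(candidate)) return candidate;
    const parent = path.dirname(dir);
    if (parent === dir) return null; // reached the root without finding it
    dir = parent;
  }
}

console.log(findInNodeModules('lodash', __dirname));
```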
---
Wow, I just learned about C++20 Modules, looks like a game-changing feature.
⬐ skrebbel Yeah I saw that talk too, and I was baffled. He seems totally ready to throw out the baby with the bathwater. Node's package resolution algorithm might not be perfect, but it's better than nearly every other language's (which is "single big unhierarchical dump of packages, good luck sorting out the dependencies kthxbye").
The talk by Ryan Dahl that introduced Deno to the world: https://www.youtube.com/watch?v=M3BM9TB-8yA
⬐ kadfak Overview of the talk from a comment[0] on that video:
> The goal of Node was event-driven HTTP servers.
>
> 5:04
> 1 Regret: Not sticking with Promises.
> * I added promises to Node in June 2009 but foolishly removed them in February 2010.
> * Promises are the necessary abstraction for async/await.
> * It's possible unified usage of promises in Node would have sped the delivery of the eventual standardization and async/await.
> * Today Node's many async APIs are aging badly due to this.
>
> 6:02
> 2 Regret: Security
> * V8 by itself is a very good security sandbox
> * Had I put more thought into how that could be maintained for certain applications, Node could have had some nice security guarantees not available in any other language.
> * Example: Your linter shouldn't get complete access to your computer and network.
>
> 7:01
> 3 Regret: The Build System (GYP)
> * Build systems are very difficult and very important.
> * V8 (via Chrome) started using GYP and I switched Node over too.
> * Later Chrome dropped GYP for GN. Leaving Node the sole GYP user.
> * GYP is not an ugly internal interface either - it is exposed to anyone who's trying to bind to V8.
> * It's an awful experience for users. It's this non-JSON, Python adaptation of JSON.
> * The continued usage of GYP is probably the largest failure of Node core.
> * Instead of guiding users to write C++ bindings to V8, I should have provided a core foreign function interface (FFI)
> * Many people, early on, suggested moving to an FFI (namely Cantrill) and regrettably I ignored them.
> * (And I am extremely displeased that libuv adopted autotools.)
>
> 9:52
> 4 Regret: package.json
> * Isaac, in NPM, invented package.json (for the most part)
> * But I sanctioned it by allowing Node's require() to inspect package.json files for "main"
> * Ultimately I included NPM in the Node distribution, which pretty much made it the de facto standard.
> * It's unfortunate that there is a centralized (even privately controlled) repository for modules.
> * Allowing package.json gave rise to the concept of a "module" as a directory of files.
> * This is not a strictly necessary abstraction - and one that doesn't exist on the web.
> * package.json now includes all sorts of unnecessary information. License? Repository? Description? It's boilerplate noise.
> * If only relative files and URLs were used when importing, the path defines the version. There is no need to list dependencies.
>
> 12:35
> 5 Regret: node_modules
> * It massively complicates the module resolution algorithm.
> * vendored-by-default has good intentions, but in practice just using $NODE_PATH wouldn't have precluded that.
> * Deviates greatly from browser semantics
> * It's my fault and I'm very sorry.
> * Unfortunately it's impossible to undo now.
>
> 14:00
> 6 Regret: require("module") without the extension ".js"
> * Needlessly less explicit.
> * Not how browser javascript works. You cannot omit the ".js" in a script tag src attribute.
> * The module loader has to query the file system at multiple locations trying to guess what the user intended.
>
> 14:40
> 7 Regret: index.js
> * I thought it was cute, because there was index.html
> * It needlessly complicated the module loading system.
> * It became especially unnecessary after require supported package.json
>
> 15:28 Talks about Deno.
[0]: https://www.youtube.com/watch?v=M3BM9TB-8yA&lc=UgwteTb0N1PFq...
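(A rough sketch of the probing behind regrets 6 and 7, with illustrative paths; this is roughly what the CommonJS loader does, not a quote from the talk:)

    // In /app/main.js, an extension-less relative require forces the loader to
    // check the filesystem at several locations, roughly in this order:
    //   /app/utils                  (exact file)
    //   /app/utils.js
    //   /app/utils.json
    //   /app/utils.node
    //   /app/utils/package.json     ("main" field)
    //   /app/utils/index.js
    //   /app/utils/index.json
    //   /app/utils/index.node
    const utils = require('./utils');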
⬐ alpbThanks. This largely made me realize most decisions behind node.js were made on-the-fly rather arbitrarily without putting too much thought into them, and it helps me compare and contrast it with Go. :)⬐ lowercased⬐ dmix> most decisions behind node.js were made on-the-fly rather arbitrarily without putting too much thought into it
that seemed rather apparent even at the time - what's been more interesting is watching others defend some of these decisions as if there was a lot of thought put into them, and that they're some example of great architecture.
not specifically ragging on nodejs - I see this a lot in various projects - small/minor decisions compound over time, and even if they were not originally planned/intended to have significance, they have it at some point, and often people who weren't involved in the original decisions think there's a lot more 'there' there behind the original decisions, when, usually, there isn't.
>> * Many people, early on, suggested moving to an FFI (namely Cantrill) and regrettably I ignored them.
A bit off-topic, but Dahl referenced Cantrill here, whom I took to mean one of the authors of DTrace, Bryan Cantrill. I then found from his Twitter (https://twitter.com/bcantrill) that just last month he started a new "computer company", which sounds super interesting, especially with his past experience and the passion he seems to have for attempting to solve a tough, bold problem:
Edit: which apparently was posted on HN with 500+ upvotes https://news.ycombinator.com/item?id=21682360, regardless it's a good read worth re-mentioning here
Edit 2: Also thank you for the excellent write up, I wish every tech talk had a text TLDR
⬐ PudgePacketFor more context, Cantrill held a senior role at Joyent, which has Node.js listed as one of its products on Wikipedia and has long been the corporate sponsor of Node.js.
Watch 10 things I regret about node.js - https://youtu.be/M3BM9TB-8yA from the creator of both node and deno to understand his motivations behind the deno project. A very intriguing talk.
⬐ 29athrowaway> Access between V8 (unprivileged) and Rust (privileged) is only done via serialized messages defined in this flatbuffer.Expect to see this in "n things I regret about deno"
⬐ spraak⬐ pvgCan you explain why?⬐ 29athrowaway⬐ kevinkassimoEvery deno API function call goes through flatbuffer serialization + deserialization + more steps. Sounds like a lot of overhead.
Replying to the Flatbuffers concerns: you are right, we will try to get rid of it for some faster serialization mechanisms (after some huge internal refactor lands). See the talk I posted, Ryan mentioned it near the end.
It is very interesting although I didn't really understand the 'security' part. The motivation seems to be twofold - some bad things have happened because of compromised npm packages and v8 happens to have a robust sandbox. This sounds like a solution looking for a vaguely defined problem. The illustrative example he gives is 'malicious linter'. Is malicious linter that important a threat?⬐ ricardobeatIn the example the linter itself is not malicious, but used to deliver a malicious program that can have unrestricted filesystem access. Not vague at all, see recent news on the ‘event-stream’ package being used to steal cryptocurrency wallets.⬐ pvgThe 'vague' part is not that it doesn't happen - see the comment you are replying to.
Meanwhile, Deno (alternative server-side JS built from scratch by the creator of Node) keeps chugging along:
https://github.com/denoland/deno
It will be interesting to see which happens first: developers migrate to Deno or Node finalizes its ES module implementation.
I think both groups are doing excellent work, but the Node team seems to be at a disadvantage having to continue support for some decisions made early on in Node:
⬐ settler4> import { test } from "https://unpkg.com/[email protected]/testing.ts";This feels both very pragmatic and frightening at the same time.
⬐ Ataraxy⬐ pvorbI'm not an "expert" but that feels just as insane as the npm argument people make. I'd love to hear from someone more in the know as to why they aren't the same.⬐ gcb0⬐ strictneinBecause npm's is implicitly in the build pipeline. You won't run 'npm install' on production! You run that on development/certification and then push the validated image to production.
With deno, there is no distinction anymore, kind of. You still can send to production images that have the deno cache. The only thing that changes is the default. Previously you would have to explicitly run 'npm install' on the production host; failing that, the code fails. With deno you still can choose to push to production an image without the caches (same as 'npm install' in prod), but now the default is that untested code in QA will auto install without a hash check!
in summary: absolutely no practical change (i.e. no new feature impossible before) other than production defaulting to installing remote dependencies of proven-not-tested functions.
⬐ mmmeffThey really aren't if you think about it. Going straight to a URL for a version of a dependency is the same as pulling it from a registry, except it's decentralized from a single source (NPM) and removes the extra hops in between the package vendor and the package consumer.On the flip side, that extra hop adds a ton of convenience in the form of name-resolution, security and governance. It's the age old double-edged sword of centralization.
⬐ oscargrouchMaybe if you turn that url into a hash, then just use the hash to check if the package already has a local copy, it won't be so bad. But you will need to add the package version in the URL so that you know you will always have the package you really want in your local cache.
Without SRI or similar this is very frightening.
edit: they're thinking about it:
https://github.com/denoland/deno/issues/200
Security shouldn't be an afterthought.
⬐ christophilusWhat bothers me most about it is the lack of a checksum, which is something Go modules support. I think that’s a mandatory feature to prevent certain attack vectors. Other than that, I have no problem with this approach.⬐ piscisaureusPackage validation (using a checksum or signature) is definitely on our radar. We just haven't gotten around to it yet.
Let's throw everything out of the window and start from scratch. The JavaScript way of doing things is why I don't like to invest much in this ecosystem.⬐ rhlsthrmBeing a daily TS/Node dev for a while now, I'm super excited about Deno. Any idea of when it will be production ready?⬐ bartlomieju⬐ ToucheDisclaimer: I contribute to Deno. IMHO it will take 1-2 months before someone deploys simple production services. We're already working on the standard library, which you can see here: https://github.com/denoland/deno_std⬐ rhlsthrmAwesome, how has the experience been on the project? I'd love to get involved as well, are there any contribution guidelines or starting points?⬐ bartlomiejuExperience has been awesome. This tech is really challenging for me, but at the same time the amount of knowledge I'm gaining is huge. The Deno community hangs around on gitter, here: https://gitter.im/denolife/Lobby
Question about contributions has been raised on gitter several times already and the answer was to look for "good first issue" labels in the GitHub repo. However there are not many of them right now.
Does Deno support native ES modules or does it implement a require() shim and rely on TypeScript transpilation?⬐ bartlomieju⬐ tobrPR for this is here: https://github.com/denoland/deno/issues/975
It's expected that this should land next month.
Thanks for reminding me about Deno. It really feels like some of Node's homegrown solutions to things like modules and async are starting to become a burden.
> It will be interesting to see which happens first: developers migrate to Deno or Node finalizes its ES module implementation.
I guess that depends on what you mean with "happens". Only one of them has a clear point where they're done happening.
⬐ k__Hehe, I want a Deno Lambda runtime :D
⬐ nrb⬐ sickcodebruhYou can do this now with Lambda Layers and the Lambda Runtime API, right?
https://aws.amazon.com/blogs/aws/new-for-aws-lambda-use-any-...
⬐ k__"The runtime bootstrap uses a simple HTTP based interface to get the event payload for a new invocation and return back the response from the function."Interesting. I had the impression it would take more to get your language of choice up and running.
This is my first time seeing it but so much of it looks great. I don't think I'm a fan of how it handles dependencies, though. It feels like they've just reinvented GOPATH.
“Deno caches remote imports in a special directory specified by the $DENO_DIR environmental variable. It defaults to $HOME/.deno if $DENO_DIR is not specified.
...
Production software should always bundle its dependencies. In Deno this is done by checking the $DENO_DIR into your source control system, and specifying that path as the $DENO_DIR environmental variable at runtime.“
https://github.com/denoland/deno/blob/master/Docs.md
The suggestion to import and reexport dependencies in a central `package.ts` file sounds absurd to me.
Clearly, I’m not a user and am just now reading this for the first time. Has there been discussion about this in the community? Are there plans to correct it? Go has been digging out of this hole for years, I can’t imagine why they’d start in the same place.
⬐ bartlomiejuDisclaimer: I contribute to Deno.
Contrary to Go, when using Deno you don't need to invoke any command to install a package. You just put your import statement with a full URL (you can pin the version there, eg. https://unpkg.com/[email protected]/dist/liltest.js) and Deno downloads it on first run and caches it for later use. Changing $DENO_DIR is somewhat equivalent to changing the node_modules location. This eliminates the need for a package manager.
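A minimal sketch of the workflow being described; the import URL and file names below are placeholders, not real packages:

    // main.ts -- the URL pins an exact version; Deno fetches it on the first run
    // and serves it from the cache under $DENO_DIR afterwards.
    import { greet } from "https://example.com/[email protected]/mod.ts";

    console.log(greet("world"));

    // Point the cache wherever you like (e.g. somewhere you check in or bake
    // into a production image):
    //   DENO_DIR=./.deno_cache deno run main.ts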
⬐ mmmeffI feel that the Deno project's goal of getting away from NPM/centralized registries is causing developer experience pains that it feels like you're trying to sweep under the rug.
Maybe it's time to take a look at building a replacement package manager.
The convenience of having a single package manager that handles name resolution, versioning, governance, and security is why NPM is as prolific as it is today.
That being said, the reason NPM even received any traction from the start was because Isaacs bundled it with the Node distribution.
In my opinion, this is the clear chance for Deno to provide its own, better package manager, built on top of IPFS/Dat or some similarly decentralized protocol, and correcting the many mistakes that have been openly accepted by Isaacs and others in the lineage of Node/NPM.
⬐ bartlomieju⬐ stephenI don't have knowledge about early NPM stages, but from my perspective the goal right now is to get rid of the need for a package manager.⬐ mmmeff⬐ austincheneyHow does Deno need a package manager at the moment? I don’t understand your perspective.⬐ k__I think the point here is that Node.js needs one and Deno is created to get away from that.⬐ mmmeffHow exactly does Node.js require a package manager?⬐ k__Well, it doesn't technically require it...⬐ mmmeff⬐ johnhenryExactly.
Deno doesn't need to solve for this, it's already achieved it. An interpreter has nothing to do with a package manager. One problem that Deno is actually solving is the tight coupling of NPM and Node. Commonjs is very much purpose-built for NPM modules.
It's a smart call for the team to tackle this problem as it's one of the biggest pain points of using Node and the biggest threat to its long-term sustainability. But I fail to see how no package manager is a viable solution. As someone who's used Go and Node for the majority of the time they have existed, trust me when I say that a package manager is a good thing.
Node is an incredible ecosystem. We just need a better implementation of its package manager.
const packageManager = require('package-manager'); // QED⬐ mmmeff`const npm = require("npm")` actually works AFAIK
You don't need package management dependency hell just like you don't need frameworks. This pain is self-imposed due to a desire for perceived convenience or the lack of competence to see otherwise.
When your total dependency graph is less than 8 packages you can easily manage this manually. This is especially true since there are bots that can automatically check for version updates.
I'm tempted to apply a Greenspun-ism, that by "not having a package manager", you're going to effectively have a half-implemented, bug-ridden package manager anyway. :-)
More seriously, dealing with the version hell that is inherent in dynamically assembling (your project has its own unique dependency graph) publicly shared, uncontrolled libraries (only Google has a true no-version-numbers-ever monorepo) is nontrivial, and so it seems like surely there has to be something that is "effectively a dependency manager", though perhaps it is just semantics of embedding that in the runtime "because imports are urls in the source files and not a separate JSON file"... not sure what fundamental benefit that brings though?
Worth watching "10 Things I Regret About Node.js": https://www.youtube.com/watch?v=M3BM9TB-8yA
⬐ sephwareAnd the slides: http://tinyclouds.org/jsconf2018.pdf
Maybe deno will implement something like vgo. (If you don't yet know about deno: https://youtu.be/M3BM9TB-8yA )
In a recent video, Ryan Dahl mentioned that he regrets incorporating package.json and NPM into Node:https://youtu.be/M3BM9TB-8yA?t=588
He has a very early attempt at breaking from compatibility with Node called "Deno". Here's the part in the video about his current thoughts on importing external modules under Deno:
https://youtu.be/M3BM9TB-8yA?t=1257
He has some other security proposals as well.
⬐ pedro_habAnd he even mentions linters (in the first Deno slide) as untrusted utilities that should take advantage of the JS sandbox and not have access to network.
It looks like Ryan Dahl predicted it: "It would be nice to download the massive codebase that ESLint is and run it without it taking over my computer - which it could."https://www.youtube.com/watch?v=M3BM9TB-8yA&feature=youtu.be...
⬐ msoadSandboxed by default is the best design decision of deno.
Using URLs for import is another one. npm as a company has too much power and responsibility, which is really unnecessary. We're downloading from a URL anyway. package.json and npm make it look like everything has to be hosted there.
⬐ inglorJust to be clear, the reason I (and others) use the NPM registry is that it saves us a ton of time and helps us be more productive.
There is nothing stopping me or you from putting "regular" JavaScript files in your `node_modules` and `require`ing them, or running node with ESM and a loader that loads URLs (requires Node 10) and having the same semantics as deno with packages.
Some problems I personally see with URLs are versioning (and versioning of dependencies) and the fact that it's harder for multiple packages you're using to share the same dependency without downloading it twice.
⬐ specialist⬐ pitajOne of the few things maven got right was explicitly including the semver in a module's path. eg ./root/leftpad/1.2.3/bundle.zip (IIRC)⬐ eeccproductive... sorry but I'm starting to hate this word. What do you mean by that? Being able to randomly import blobs of code that maybe does something that might help you as you go figure out what you're supposed to implement while at the same time the "customer" is also figuring out what they want?
Argh, please! What a crazy asylum...
⬐ inglorSo a few things - now that this won't get any more 'public' attention.
> What do you mean by that?
Compared to the alternative (of manually importing code) it saves me time. The 'download code from the internet' problem isn't new to NPM or new to package managers or even related - it is orthogonal in my opinion.
> Being able to randomly import blobs of code that maybe does something that might help you
I usually know exactly what the code I'm using does and why I'm importing it.
> you go figure out what you're supposed to implement while at the same time the "customer" is also figuring out what they want?
This raises a bunch of red flags for me. This is nothing like the development process I know. I strongly suggest you consider looking for a place that doesn't treat developers like that (mostly technical companies tend to be better in this regard).
It is a programmers' market and you get to choose a workplace with good culture. The path of feeling like your goals are unclear or unimportant while caring about them leads to burnout. Stay strong :)
( > Argh, please! What a crazy asylum...
FWIW, I think the mental health reference is in bad taste. )
⬐ NoneNoneURL imports are a terrible idea. There's a reason we have actual package managers and repositories instead of installing from GitHub.
Properly made and run package managers are more secure and offer a better developer experience than importing by URL.
⬐ msoad> package managers are more secureHow?
⬐ ricardobeatWhich package managers are properly made, and how are they more secure? As far as I'm aware npm's security model is shared with almost all of them.⬐ eptcykaCargo is far better in this regard, for one.⬐ Arnavion⬐ erik_seabergCargo packages can also execute arbitrary code at compile-time through build scripts, which run with full permissions of the original `cargo` command including filesystem and network access.RPM and .deb packages have GPG signatures and lists of trusted maintainers. NPM doesn't have that; https://medium.com/redpoint/introducing-pkgsign-package-sign... looks like a very early start on a big project that could fix this if it catches on.
I love Swift, but at 4.2 it's still some distance from being usable for me. Swift is supposed to be modern, only to be burdened by API compatibility work, which is non-trivial.
This significantly slows certain important developments like ABI Stability (5.0), Full Generics (5.0), and Concurrency (maybe 6.0?).
A year after the 4.0 release, what ships in the coming few months will be 4.2, not 5.0. That means the timeline for 6.0 gets pushed even further back.
While Swift is modern in areas like optionality, first-class immutable structs and (my favourite feature) enums with associated values, it lacks many other modern features we have come to expect from a modern language, e.g. callbacks are still the way to handle async control flow (one of Ryan Dahl's regrets in his JSConf EU talk https://www.youtube.com/watch?v=M3BM9TB-8yA).
1.0 to 3.0 was spent getting the API right. That is a significant positive investment in the long run, but as someone who has to maintain code, it was not pleasant at all, and I still have code stuck in the 1.0/2.0 eras. I get crashes trying to get conditional conformance working with generics; some code that wasn't crashing on 1.0 or 2.0 crashes on 3.0. Swift is clearly a WIP.
---
At the same time, TypeScript happened. TypeScript turns JavaScript into an optionally typed language. I see JavaScript and Objective-C in a similar light. Since Objective-C started getting some syntactic sugar (generics, nullability), I wonder what would have happened if they had taken the TypeScript approach instead.
TypeScript had no choice but to be pragmatic (probably after seeing how Dart was not adopted by the larger community for going the Swift way).
Apple basically acts like a benevolent dictator: whatever direction they take is more or less the future, and we have to figure out how to work with the new "world" order, which gets updated every June.
The best iOS/Mac developers thrive in this environment and get handsomely rewarded (App Store ranking, recognition from Apple); I tried and failed miserably.
These are the slides to Ryan Dahl's presentation at JSConf EU 2018.A video of the presentation is here: https://www.youtube.com/watch?v=M3BM9TB-8yA
⬐ thosakweThis was one of the more interesting software talks I've listened to recently. I like that it was very real - there are serious, serious problems with Node.js, and the fact that even the creator acknowledges these problems caught my attention.
I'm also a long-time user of Dart, so when he brought that up, and compared TypeScript to its shortcomings, I definitely agreed.
That being said, even with the Deno project, I'm not so sure what can come in terms of performance and security from running JavaScript outside of a browser. The choice of V8 also raises concerns for me about build systems. He mentioned the failing of GYP, but anything using the same build toolchain as Chromium always introduces a wealth of complexity for anyone building said software, as not only do you need Google-specific build tools, but also very specific configuration including Python.
It will be interesting to see what comes in the future.
If it were up to me (which I guess it isn't), I'd probably prioritize portability/less complex builds, built-in FFI, a flat module system, and optimizing calls to C from JS.
⬐ vanderZwan⬐ tnolet> the fact that even the creator acknowledges these problems caught my attention.
Not to take anything away from Ryan Dahl's ability to reflect and be open about what he considers his design mistakes, but it probably also helps a bit that he walked away for quite a while before coming back. A bit of distance helps in these matters.
⬐ sebringj"I'm not sure what can come of performance and security running JavaScript outside of a browser" - well, we already know about performance. Ryan was mentioning and demonstrating that JS is in a sandbox already, and if you modify node to have message passing with dispatchers mirrored on the native and js sides, you in effect have very tight security, as you are strictly allowing only sys calls that go through a single point and can be managed through a single point. Seems like that one is figured out with Deno.⬐ hliyanQuestion: have you written any server side Dart? I'm sold on the client side with Flutter.⬐ mschuetz> performance and security from running JavaScript outside of a browser.
I'm just now building a node app to filter point clouds, so lots of number crunching. In two days I've got something in javascript that's faster than the C++ solution I've been working on for a week. Mostly because javascript/node makes it trivial to parallelize file IO while doing work in the main thread. This app reads 13 million points from 1400 files (~200mb), filters all points within a clip region, and writes the 12 million resulting points in a 300MB file, all in 1.6 seconds. (File reads were cached by OS and/or disk due to repeated testing, but file writes probably not)
My personal conclusion is that javascript can rival or even exceed the performance of C++, not because it's inherently faster, it's obviously not, but because it makes it much easier to write fast code. For the highest possible performance you'll definitely want to use C++, but sometimes you'll have to spend multiple times the work to get there.
⬐ blattimwind> (File reads were cached by OS and/or disk due to repeated testing, but file writes probably not)Unless you flush the pages manually, your dirty pages (written files) live on long after your process died. Depending on system and configuration, minutes or even hours can pass before they are flushed to disk.
⬐ mschuetz⬐ ex3ndrDo you know about best practices to benchmark with uncached data? It's something I've been wondering about for a long time and I've seen many attempts at benchmarking things without regard for disk caching, e.g. benchmarking in-memory databases against out-of-core databases, but because of caching due to repeated runs the results were meritless, since the out-of-core databases had their stuff in memory as well.⬐ zbjornsonTake a look at https://github.com/Feh/nocache. It works effectively for reads. I don't know about writes.
Can you share some sources? I tried to process simple CSV files in a very straightforward (but async) way, and just reading a 200mb CSV and splitting it into columns (with a simple split by comma) takes ~10 seconds.
Also, the simplest HTTP request in express is handled in several ms, and that's A LOT in my opinion.
⬐ mschuetz⬐ brlewisI don't really have any explanatory sources at hand, just the source code here: https://github.com/potree/PotreeServer/blob/redo/src/Regions...
Points of interest:
Schedule files to be loaded. This will usually load around 1000 files: https://github.com/potree/PotreeServer/blob/redo/src/Regions...
Whenever a file is loaded, iterate through the points in it, do the filtering and write the results to the output file: https://github.com/potree/PotreeServer/blob/redo/src/Regions...
Wait until all files were loaded and processed: https://github.com/potree/PotreeServer/blob/redo/src/Regions...
Wait until the output was written to disk: https://github.com/potree/PotreeServer/blob/redo/src/Regions...
What happens is that ~1000 files will be loaded in the background, points are filtered in the main thread and even while some files are still being loaded, we already start writing the results to the output file.
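A rough sketch of that pattern (not the actual PotreeServer code; the names, record size and predicate are placeholders): all reads are started up front, and each buffer is filtered on the main thread as it arrives.

    const fs = require("fs").promises;

    // paths: input files; recordSize: bytes per point; hitTest: filter predicate;
    // out: a writable stream for the results.
    async function filterFiles(paths, recordSize, hitTest, out) {
      const pending = paths.map(async (p) => {
        const buf = await fs.readFile(p);                  // reads overlap in the background
        for (let off = 0; off + recordSize <= buf.length; off += recordSize) {
          if (hitTest(buf, off)) {
            out.write(buf.slice(off, off + recordSize));   // write hits as we go
          }
        }
      });
      await Promise.all(pending);                          // wait for every file
    }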
> I tried to process simple CSV files in a very straight forward (but async) way and got reading 200mb CSV and just splitting it to columns (with simple split by comma) takes ~10 seconds.
CSV may be a much bigger challenge since it's ASCII data. That always tends to take multiple times longer.
⬐ orfThe project is private.⬐ mschuetz⬐ flukusWhoops, my bad. ^^ Guess I wanted to keep it private till it's actually finished.
If file buffering is the bottleneck, have you tried using tools like cat to buffer the input? Buffering input and output is the sort of thing you get for free in unix.
Did you use typed arrays, or did the benefit of async I/O outweigh the cost of using double-precision floats for everything?⬐ mschuetz⬐ chrisco255I'm reading the values directly from the Buffer object that is returned from fs.readFile using buffer.readUInt32LE and buffer[index], then converting the values to doubles to do the hit-test, and if it succeeds, the result is immediately written to the output buffer. The nodejs Buffer object is a subclass of Uint8Array as far as I know.
On writing, the double coordinate values are transformed back into a fixed-precision integer format and stored in the output Buffer object.
I'm not generating an intermediate buffer since that does decrease performance a bit. It's directly from input buffer to output buffer. Output buffer is initially allocated with the same size as the input, and before sending it to the output stream it's cut to the actual size.
One thing I've previously learned and which has shown to be still true is that writing individual bytes to a buffer is faster than writing integers.
So originally I did this:

    outBuffer.writeInt32LE(ux, outOffset + 0);

But this turned out to be significantly faster (~20%):

    // do once
    let tmpBuffer = new ArrayBuffer(4);
    let tmpUint32 = new Uint32Array(tmpBuffer);
    let tmpUint8 = new Uint8Array(tmpBuffer);

    // do many times
    tmpUint32[0] = ux;
    outBuffer[outOffset + 0] = tmpUint8[0];
    outBuffer[outOffset + 1] = tmpUint8[1];
    outBuffer[outOffset + 2] = tmpUint8[2];
    outBuffer[outOffset + 3] = tmpUint8[3];
⬐ zbjornsonJust constructing a Uint32Array view of the buffer and accessing values by index is way(!) faster than using the buffer.read* methods.
Likewise, you don't need to copy bytes in your last code sample. You can copy uint32s.
⬐ mschuetzThe buffer contains not just integers but also bytes, unsigned bytes, shorts, etc. Due to this, the buffer length may not be a multiple of 4, and Uint32Array views can only be constructed on buffers whose length is a multiple of 4.
Edit: Also, the stride from one record to the next can be anything, e.g. 15. A Uint32Array would need a stride of 4 or a multiple of 4 to be useful.
Edit2: I could try to create 4 Uint32Arrays with byteOffsets 0 to 3, and with a view length of a multiple of 4, then use the one that works with the attribute I'm currently processing. Not sure if that's really going to be faster but who knows.
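For what it's worth, a small sketch of the two access styles being compared, assuming an aligned buffer of little-endian uint32 values (names and data are illustrative):

    // A Buffer sharing memory with an ArrayBuffer holding three uint32 values.
    const inBuffer = Buffer.from(new Uint32Array([10, 20, 30]).buffer);

    // Style 1: per-value reads through the Buffer API.
    const a = inBuffer.readUInt32LE(4);   // 20 (on little-endian hosts)

    // Style 2: one typed-array view over the same memory, then plain indexing.
    // Note: the view's byteOffset must be a multiple of 4, which is the
    // alignment/stride constraint mentioned above.
    const u32 = new Uint32Array(inBuffer.buffer, inBuffer.byteOffset, inBuffer.byteLength >>> 2);
    const b = u32[1];                     // 20 (in host byte order)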
Right, V8 works great at optimizing JS and it handles streams great. Productivity is one of the most important factors to think about when building software systems, since human time is much more expensive than CPU cycles. That's why Node.js works great.⬐ maerF0x0This falls for the false dichotomy of speed of execution vs speed of development (which includes fixing bugs). Well-designed languages optimize for a certain weighted preference of the two, and some languages deliver more of _both_ than others.
For example: TypeScript is fast to write, and Golang is reasonably quick to write _and_ execute. Both should have ~15% fewer bugs than javascript, potentially making them faster to develop in (where bugs matter).
⬐ ninjakeyboardwhat is the 15% figure from? I'm suddenly working with a dynamically typed language (elixir) coming from scala and I do find more bugs. Would like some hard evidence to support that so just curious.⬐ pera⬐ chrisco255Have you tried to use Dialyzer or Dialyxir and typespecs?⬐ ninjakeyboard⬐ maerF0x0no. I probably should.
15% is from this paper as I recall it: http://ttendency.cs.ucl.ac.uk/projects/type_study/documents/... https://blog.acolyer.org/2017/09/19/to-type-or-not-to-type-q...
⬐ ninjakeyboardthanks!Once you've paid the upfront cost of learning Golang, I might agree with you. But then again, in particular when building a full stack app (and not when on a team that has back end and front end specialists), it's helpful to use the same context (JS/NPM) when devving. People are bad at context switching.⬐ weberc2Right, but the cost of learning Go is ~1 afternoon :p Anyway, I don't really buy the argument that there are efficiencies from using the same language on the backend and frontend. I think there are efficiencies from using a language you're more familiar with or languages that are less error prone or more ergonomic or which have better tooling/ecosystem/etc, but I've never had a problem I would chalk up to context switching.⬐ chrisco255⬐ maerF0x0Ah, so when you're using two different languages, you're telling me you never have to Google anything for either language? It's all in your head?⬐ weberc2No, I’m saying that when I google, it’s not because I just switched from a different language.That's why I mentioned TypeScript. It can run on both "sides", and has faster speed of development (as I defined it above, considering bugs) ...⬐ notdonspauldingI've heard this argument several times now and it finally hit me what I dislike about it.If you aren't context switching between your backend code and your frontend code (even when both are JS), you're probably incurring technical debt in your architecture to be paid in even greater numbers of dev hours down the road.
When you are writing an all-JS full-stack app, do you really feel like you're only working on a single app, as opposed to two different apps which happen to share the same repo?
⬐ chrisco255It's not even likely they're in the same repo when working with Node (although possible). The context switch between apps is one cost you have to pay (surely your backend will differ from frontend in architecture). However, the costs of switching languages is higher. Golang / JS conventions are very different. It is possible to share some common libs front end / back end (lodash, validation logic, etc) and that helps too.⬐ dozzie> The context switch between apps is one cost you have to pay [...]. However, the costs of switching languages is higher.Is it? I was working with a system where server is written in Erlang and client (and another server) is written in Python. No problems with switching back and forth.
⬐ chrisco255Yes, that's a cost your brain is paying. And a higher cost if you bring on new devs to that project, who then need to learn and understand both Python and Elixir. Debugging is different in both languages, libraries are different, standards, conventions, top level APIs, runtimes, capabilities, all that has to be understood to operate at a high level. That's a non-zero cost and it's pretty significant. That you've learned both so well that you can switch between them is great, but that's equivalent to knowing two musical instruments as well as one.⬐ dozzie>> I was working with a system where server is written in Erlang and client (and another server) is written in Python. No problems with switching back and forth.> Yes, that's a cost your brain is paying.
What cost? I said I haven't noticed any.
> new devs [...] then need to learn and understand both Python and Elixir.
No, they don't need to learn even a speck of Elixir.
--
What you described is a truism that one needs to learn two languages to write in two languages. Yes, this is obviously true. What I'd like to hear is an argument that switching between them used in a single system is costly, because I haven't observed that. This is what I'm arguing with, not with the fact that learning another language has its cost.
⬐ always_goodHow wouldn't there be a cost?There are more things you have to remember. Workflows in both languages. Of course it's more stuff, thus more context. And you have to use both languages constantly to stay fresh in them. The syntax isn't the only problem, just the easiest one.
Is it just as easy to maintain Spanish and English skills than just English?
"I don't notice it" isn't a very strong argument. I bet you don't notice the effects of slight dehydration and your diet and exercise on your output either. But if you were actually experimenting with it, I guarantee you could soon perceive it.
⬐ dozzie> How wouldn't there be a cost?How wouldn't there be a cost of switching between two languages? Normally. You could try, you'd know then. Though the prerequisite is a system that is designed, with clearly designated borders between the parts, not a system that has just emerged.
Proving something's non-existence is a little like proving that you're not a weapons smuggler. How would you expect to even start?
> There are more things you have to remember. Workflows in both languages. Of course it's more stuff, thus more context.
But this is irrelevant to switching between the languages. You have just as much to remember if you write unrelated things, each in its own language.
> Is it just as easy to maintain Spanish and English skills than just English?
"Just as easy than"? Really? In a thread about languages?
You picked the wrong analogy. It is just as easy to write prose with every second paragraph alternating between English and Spanish as it would be with just English. The prerequisite, obviously, is that you know both languages.
> And you have to use both languages constantly to stay fresh in them.
For some value of "constantly". It's not like people forget everything about a language when they don't use it for a week or a month.
> "I don't notice it" isn't a very strong argument.
Well, at least it's some argument. On your side is only "how wouldn't there be a cost?", clearly from a position of somebody who doesn't use many languages.
Having worked with Maven, Gradle, Ruby Gems, Pip and the non-existent Go package management, I must say I actually really like the Node / NPM combo. I guess artists are their own worst critics.
edit: forgot Scala's SBT, admittedly a builder using Maven repos, but still an excellent example of how bad UX in this area can get.
⬐ kenhwang⬐ sebringjRecently, when I use npm, it mostly just works. There's still the occasional node/npm version mix and match to get certain libraries to work, and the accidental sudo; the former might just be the poor quality of the ecosystem, and the latter is almost always just user error.
I'd put it on par with rubygems, ahead of pip, gradle, maven, a little bit behind mix, and far behind cargo. Not a bad spot to be in by any means.
⬐ saurik⬐ innocentoldguy> far behind cargo
For the purposes of this discussion, it is useful to note that cargo was written by Yehuda Katz (wycats), who had previously written Bundler, and so actually had some concept of what mistakes he had made before and experience specifically in this area, in order to apparently (I haven't used it yet, but I have heard lots of good things) finally have built something truly great.
⬐ jergasonHe also helped write yarn, an alternative client to the official npm one.
I've worked with Maven, Ruby's gems, Python's pip, whatever Go's non-existent package management is called, and Node, via npm and yarn. I'd have to say my favorite tooling and package management is found in Elixir's mix utility though. I don't mind the others. They are all decent enough, but I think the Elixir team really nailed it with mix.⬐ CSMastermindnpm is the worst software I use daily.
* The maintainers have pushed several breaking updates by mistake (I'm a teapot recently).
* There have been a few cascading failures due to the ecosystem (leftpad).
* node-gyp (alluded to in the talk) break cryptically on install in different operating system/package combinations. It also obscures the actual package contents.
* The lack of package signatures and things like namespace squatting significantly hurt the overall security of npm.
And let's not forget how terrible things were pre-yarn with the nested folder structure of node_modules and no lock file.
Compare that to NuGet where I've literally never had any of these problems.
⬐ xab9⬐ stickfigureI can't argue with you, but "most of the time" it works. Not that I have a choice (unless I stop being a frontend developer), but the time we spend with node/npm debugging is not critical in the timesheet/log.⬐ pjmlp⬐ darzuThere is a choice, I do native frontend.⬐ xab9What tech are you using? Guess not react native then :)⬐ pjmlpWindows Forms, WPF, UWP, Android, Qt, depending on the project.
Here's one thing Node and npm are great at and NuGet fails at completely: local development of two packages. With npm, I can use "npm link" to redirect a package reference to a local folder. With NuGet, the best you can do is edit the .csproj and change the nuget reference to a project reference (if you can find the original source code). This makes simple step-through debugging across package boundaries a chore every time, whereas a source-based package system doesn't have this issue.⬐ n0us"npm link" only establishes a symlink between the two directories and doesn't respect .npmignore or behave in any comparable way to publishing a package and installing it. Sometimes the only way to debug is to repeatedly publish and re-install the package you are developing.
I've worked with all of these as well, and npm is probably my least favorite. Above all else, I expect my build system to do ONE THING: exactly reproduce a build at a later date.
Part of it is technological (npm didn't have package-lock.json until very recently), part of it is organizational (the npm repository is surprisingly fluid), and part of it is cultural (the JS community likes zillions of tiny constantly-changing libraries). The net result is that I cannot walk away from a JS build for three weeks without something breaking. It breaks all the time. UGH.
⬐ donatj- https://research.swtch.com/vgo - https://github.com/golang/dep⬐ jonny_ehYou'll find much harsher critics of Node/NPM in these parts!I thought Ryan did a great job of explaining his regrets without giving the impression that Node was a "mistake", is "inferior", or anything so drastic.
⬐ allover> You'll find much harsher critics of Node/NPM in these parts!They're ill-informed. GPP is correct that, for example pip is fundamentally inferior to npm [1], and those that insist on throwing shade at npm on HN should be corrected. They're wrong, and insulting a sound, well maintained project, without basis.
⬐ kbenson> those that insist on throwing shade at npm on HN should be corrected.
Preferably by giving them better ammunition, since I do see NPM as substandard in quite a few ways, which is inexcusable when there do exist examples to learn from (whether as a positive or negative influence).
First, it helps to clarify whether we are talking about npm the client or NPM the repository and ecosystem. Client issues are generally easily resolved, just use a different client. For npm, this could be yarn. For cpan, this could be cpanm, or cpanplus, etc.
If it's indeed the repository we are talking about, there are some obvious things that could be done to greatly improve the NPM module ecosystem. For example, how about automating module tests against different versions of Node to determine whether it's in a good running status now for the current and prior interpreter versions, on the platforms it can be run on? [1] How about a prior version, in case you're trying to figure out if the version you're on has a known problem on the platform combo you're running on? [2] Or perhaps you want to know what the documentation and module structure looked like for a module a long time ago, like 20 published versions and over a decade ago, because sometimes you run across old code? [3] Or as an author, the ability to upload a version, even for testing, and getting an automated report a couple days later about how well it runs on that entire version/architecture matrix with any problems you might want to look into?
In case you didn't notice the trend, I'm talking about CPAN here, which has been in existence for over two decades, and many of the features I've noted have been around for at least half that time. All in and for a language that most JS devs probably think isn't in use anymore, and on encountering a professional Perl developer would probably think they just encountered a unicorn or dinosaur.
Sure, NPM isn't all that bad compared to some of the examples that were put forth, but the problem is that those examples are a limited subset of what exists. Given the current popularity of JS and the massive corporate interest and sponsorship, I frankly find the current situation somewhat disgusting. The only thing keeping JS from having an amazing module ecosystem is ambition. Sure, NPM might be a sound, well maintained project (points I think are debatable), but it could be so much more, and that's what we should be talking about, not the almost annual fuckups[4] they seem content with dealing with.
1: http://matrix.cpantesters.org/?dist=DBIx-Class+0.082841
2: http://matrix.cpantesters.org/?dist=DBIx-Class+0.08271
3: https://metacpan.org/pod/release/MSTROUT/DBIx-Class-0.08000/...
4: https://hn.algolia.com/?query=npm&sort=byPopularity&prefix=f...
⬐ allover> Sure, NPM isn't all that bad compared to some of the examples that were put forth, but the problem is that those examples are a limited subset of what exists.That was all I was responding to.
I definitely learned some cool stuff from your comment, and appreciate that, but my point was simply that all the drive-by FUD that npm gets on HN is unwarranted.
> I frankly find the current situation somewhat disgusting.
This feels so hyperbolic though. The things you mention are cool 'nice-to-haves', to say not having them is 'disgusting' is a huge stretch in my opinion.
⬐ kbenson⬐ mncharity> This feels so hyperbolic though. The things you mention are cool 'nice-to-haves'
What I find somewhat disgusting is the massive amount of mistakes they've made over the years, and the time they've had to take to fix them, that could have been mitigated or entirely avoided by surveying best practices from other package management systems that have gone through the same pains.
2018-05-28 - ERR! 418 I'm a teapot (this is not a joke) https://github.com/npm/npm/issues/20791 https://news.ycombinator.com/item?id=17175960
2018-02-21 - Critical Linux filesystem permissions are being changed by latest version https://github.com/npm/npm/issues/19883 https://news.ycombinator.com/item?id=16435305
2017-08-01 - Typosquatting package names https://twitter.com/o_cee/status/892306836199800836 https://news.ycombinator.com/item?id=14905675 (a little obtuse, but moderated package namespaces with trusted maintainers can mitigate this, and spread load from levenshtein distance checks.)
2017-11-03 - Visual Studio Code 1.7 overloaded npmjs.org, release reverted https://news.ycombinator.com/item?id=12860806 (10% increase in NPM load, specifically to 404 pages, causes NPM to fall over due to naive 404 handling and apparently, poor ability to scale. Good thing they caught it at 10% instead of the 200% it would have reached...)
2016-03-29 - changes to npm’s unpublish policy https://blog.npmjs.org/post/141905368000/changes-to-npms-unpublish-policy https://news.ycombinator.com/item?id=11382885
2014-02-28 - npm’s Self-Signed Certificate is No More https://blog.npmjs.org/post/78085451721/npms-self-signed-certificate-is-no-more https://news.ycombinator.com/item?id=7320833
2012-03-08 - npm (Node's package manager) leaks all user password hashes and salts https://gist.github.com/jashkenas/2001456 https://news.ycombinator.com/item?id=3679996
That's just from the first page of the HN search I included previously (link 4), and I doubt it's really exhaustive. Now, to me, that list of problems would be bad enough, but NPM is actually run by a for-profit company, and gates certain features behind paid accounts. So what we have is a business, catering to what is likely the largest current group of developers that exist, for a language with corporate backing by multiple very large companies, providing vital infrastructure support for that language and those users, and getting their asses handed to them in comparison to some others who are manned by people volunteering spare time, skill and equipment.
I mean, I would cut them a little slack if they seemed to have plans for making stuff better and a roadmap and it was just a matter of time, effort and resources they were lacking, but it seems to continuously be a case of them waiting until the shit hits the fan and they're forced to first take a look and see how to fix this new problem they've never envisioned, and then figure out their solution. Sure, it can sound hyperbolic initially, but I think that's just because people haven't really stopped to take stock of what's really going on here, and how it's not really getting better in any useful way. In the midst of emergency fixes is not how you should plan your new features. :/
> there are some obvious things that could be done to greatly improve it [...] it could be so much more
As with much of programming language design and implementation over the last 3+ decades?
And below:
> that could have been mitigated or entirely avoided by surveying best practices
Yes, people who would spend their time working on language designs and implementations, should at least be familiar with the many surveys of best practices. Surveys of repos, type systems, memory layout, parallelism, and so much else. Language choices are intertwined and subtle, and adhocery has enormous downstream costs to the field and to society. The programming language design and implementation wiki exists for that reason. To accessibly distill our collective experience. Not using these resources is negligent - a disregard of our profession's responsibilities to society.
Oh, wait. Our field can't be bothered to create surveys of best practices. Or a wiki. Knowledge is inaccessibly dispersed among balkanized and siloed human communities, assorted academic papers, and scattered code.
Shall we continue to blame pervasive failure on individual language developers and tools? For how many more decades? At what point do we start addressing it as a systemic problem?
:) So I agree with your observation, but suggest the problem extends far beyond package management systems.
What I admire about Ryan is "I shouldn't just complain without giving a solution..." - and he gave a solution, more than once. I have done this on a no-one-cares scale, but it really is better to do it yourself when you can. Also, Ryan has some sharp sarcastic wit which is pretty fun to watch in this talk.⬐ draw_down⬐ iamleppertI thought that was nice too, but I wish we as an industry placed a little more value on simply admitting that something is bad or suboptimal. You're not supposed to "complain" or "be negative", which I think is unfortunate in lots of ways.⬐ elliotec⬐ ahmetkunBut people do complain and are negative without offering solutions by far most of the time. And there is not much value in that anyway.⬐ sebringjYah I get that. I remember at my work, I said something negative about how we were sending out emails, that they looked like they were from 1990. I got a lot of backlash for complaining, but the next day I gave them a modern solution and bam, they were super happy. Maybe it's just human nature, but providing something to fix it rather than just being unhappy alone is more powerful.
You can see the same sarcastic wit in this talk too: https://www.youtube.com/watch?v=jo_B4LTHi3I
It's still my favorite tech talk, very fun to watch.
He talks about not adding features you think would be “cute” because they are always a mistake... Then a few minutes later he says, “I thought being able to specify URLs in import statements would be cute..”
Uhh...Houston, we have a problem with this one.
⬐ dgreensp⬐ arenaninjaHaha! Ryan may not have been the best person to design a novel package manager, and while he has some insight into his mistakes the first time around, he still may not be. Neither am I; call me if you need a compiler or a GUI. Everybody has a different set of areas where they have gone deep and developed good instincts.⬐ bronson⬐ jonny_ehYehuda! Calling Yehuda!⬐ toobadsosadIsaac S. created NPM based on a Yahoo! internal package manager.
I dunno, importing from a url seems really smart and practical to me. It divorces the run-time (Deno) from a package manager (like npm). I wonder how it handles dependencies though.⬐ pknopf⬐ nicky0It sounded as if he just wants everyone to use Parcel. Even then, you would use npm/yarn to get your dependencies before bundling.
It sounds like he doesn't want the runtime to handle dependencies at all.
You can always reference urls with version numbers present in them.
⬐ rmrfrmrfThe main problem is that URLs offer even fewer immutability guarantees than package management servers do, plus they're even easier to 1) break and 2) attack.⬐ iamleppertIs it synchronous? Does it follow redirects? What happens in a 404 situation? Does it obey cache headers? What happens when a timeout occurs or the resource isn’t code? What happens with recursive dependencies and other edge cases as a result of not knowing the dependency tree until runtime? What about error handling and recovering from these failures at runtime or compile time? Should all resources be secure? How does it work when you are developing on one of those dependencies? Do you have to run a local web server? How does versioning work? Where is the cache stored and is it content addressable? How to clear it? Is it global or per user?I could go on but you get the idea. A package manager is not simple and requires A LOT of choices. The best one I’ve seen is old CPAN.
⬐ fredstedDoesn't Go use URLs? How did they solve all these issues?⬐ kristoff_it⬐ iniminoIt doesn't, it uses $GOPATH. The packages are downloaded using `go get` in a directory structure that mimics the full url: ./github.com/user/repository/...
The Web has all of these issues and yet importing JS libraries this way has worked just fine.
Obviously it's not what you should do if you are publishing a library for others to use, but for local use, and the kinds of exploratory scientific computing he was talking about, it sounds perfect.
⬐ pitajIt has _not_ worked "just fine" which is why we have bundlers, minifiers, and other build systems.⬐ iniminoWe have those things as a result of having too much code pulled in from too many sources. This has approximately nothing to do with URLs as identifiers, which is one of the foundations of the Web.
whoosh!⬐ girvoI do believe that’s on purpose...
One takeaway from this is how Microsoft is killing it currently:
* Much-beloved TypeScript
* Much-beloved VSCode
* Much-beloved GitHub
If they hire Ryan to flesh out his vision for deno we'd probably need a new acronym, MAFANG
On a more serious note, I wonder if deno could support a lower level construct like Observables. As much as Promises are perceived to be an improvement over callbacks, they still have major flaws (only one of which is mentioned in this talk); this is something that Observables can address.
⬐ pjmlp⬐ pan69* Much-beloved Ubuntu
Just leaving it here, given Azure Sphere and WSL.
⬐ darzu⬐ davedxNitpick, but Azure Sphere doesn't use Ubuntu.⬐ pjmlpTrue, it runs Microsoft's own distribution highly customized for the security context of Sphere.
My suggestion is just the distribution I would expect Microsoft to acquire, if they were to decide to go shopping instead of pursuing their own.
⬐ the_afDoes anyone remember the joke/hoax, back in the old days, of the alleged Microsoft Linux distro? :D (Of course, this was in the age of the Halloween Documents and the "Linux is cancer" mindset [0], back when Microsoft was a very different business).[0] https://www.theregister.co.uk/2001/06/02/ballmer_linux_is_a_...
TypeScript has async/await built-in, and Deno has top-level await too. Promises won't be necessary most of the time.⬐ jonreem⬐ workintheheadasync/await is powered by promises
I've already seen the acronym expressed as FAAMG, which is probably more accurate to start with; Netflix has great performance but is nowhere near the market cap of the other companies.
I've said it before and I'll say it again: NodeJS is an infrastructure component, not a general purpose application runtime environment.
I totally recognise the IO problem in our connected world, and NodeJS really does solve the problem around "many simultaneously persistent connections", something that would be really hard to do without something like NodeJS. In essence (and in my humble opinion) NodeJS is basically a programmable socket server and its main feature is "websockets".
Writing software applications in NodeJS is the most awkward experience due to its async nature. Business logic is inherently sync, not async. Most of NodeJS's existence has been spent trying to find an elegant solution to make this async behaviour look like it's sync, from callbacks, to promises, and now actual language features added to JavaScript itself (async/await).
The problem with Javascript on the server is not Javascript, but the runtime in which it's executed.
I think it would be interesting to have a sync version of NodeJS that acts more like traditional Ruby and Python next to the async variant that we currently have. Both types could then be used alongside each other, each solving the problem for which it is best suited.
⬐ bschwindHN⬐ mromanuk> NodeJS really does solve the problem around the "many simultaneously persistent connections", something that would be really hard to do without something like NodeJS
What part of it is really hard to do without NodeJS? You can do this easily in Go, Elixir, C++, Rust, etc.
⬐ pjmlp⬐ wruzaJavaScript everywhere mentality?⬐ aphexairlinesThe nice thing about Node.js's design is that (almost) all IO is non-blocking by design, so developers can't block their programs on IO by accident.In other platforms developers need discipline to choose non-blocking APIs over blocking ones, or to wrap their blocking calls in async execution contexts like threads or coroutines.
⬐ bschwindHNI'm just disagreeing that it's "really hard" for languages besides JavaScript. Concurrency in Go is stupid easy, it's built into the language. Pretty sure Elixir is the same story. C++ and Rust are slightly more difficult but still nowhere near "really hard" when it comes to making a socket server that can serve hundreds of thousands.Here's a nice WebSocket server in C++:
https://github.com/uNetworking/uWebSockets/blob/master/READM...
⬐ RapzidAll the "sync" commands are blocking. In my experience developers need discipline to avoid using those and blocking the whole app.Of the languages mentioned I believe Go is actually closest to all io being non blocking by default; with the caveat that you are using multiple go routines.
>Writing software applications in NodeJS is the most awkward experience due to its async nature. Business logic is inherently sync, not async.
The first thing I did for a recent project in Node was a sync =>{done()} queue for ws requests. There is room for fine-tuning, but it is the routine job of determining which parallel cases don't hit the DB's "CI" in ACID much. Not a big deal. The choice of Node was obvious in that it has the most straightforward installation and maintenance on Windows boxes (I'm language and OS agnostic [except a JS allergy], but peers aren't). I think that async is yet another hipster bullet, since our for-profit businesses rarely need much traffic and those that do are barely a fit for 10kloc of JS.
It’s awkward personally not so because it’s async, but because it’s too low level and npm doesn’t feel like a well-documented and thorough module set. It lacks unixlike formality and strictness in phrasing, oriented on beginners, but not on -pedantic needs. It’s hackerish and cool until you meet borders that are required beyond the “move fast” thing — it slows you down. And I didn’t meet a good abstraction over ws and db layer yet. It feels like a bicycle and a shovel when you have to dig a career (pun intended).
⬐ gcommerhttps://github.com/laverdet/node-fibers Is a project that enables writing code this way. It's used, for example, in webdriver.io so that you can write synchronous test code that transparently (ie, without `await`) calls out to async helpers.⬐ pdeuchlerExcept javascript's design is completely antithetical to those use cases. It has no type system, its performance is hard to profile and reason about, single threaded (unless you want to start forking and managing threads in javascript...), weird edge cases, piss poor numeric computation, etc etc.⬐ tree_of_itemThis is a solved problem. async/await makes your async code look sync. This is a strange thing to focus on IMO.⬐ always_goodI'd say your intel is far out of date. Async-everything + async/await makes Node more elegant than the same programs in Ruby/Python.Even little things like "make these two database requests in parallel and wait on them both" or "process these urls but only have 8 requests in flight at a time."
⬐ davedx> "process these urls but only have 8 requests in flight at a time."What's the state of the art solution to this in node.js?
⬐ always_good⬐ tgtweakYou can find a "map with concurrency limit" implementation in Bluebird (shown below) or standalone on NPM.
  const results = await Promise.map(urls, url => process(url), { concurrency: 8 })
For example, off the top of my head this would probably take me 40 lines of ungeneral wait group code in Go.
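For contrast, a dependency-free sketch of the same "map with a concurrency limit" idea, assuming process(url) returns a promise (one possible shape, not the canonical one):
  async function mapWithConcurrency(urls, limit, process) {
    const results = new Array(urls.length);
    let next = 0;
    // `limit` workers each claim the next index; single-threaded JS makes the claim race-free.
    const workers = Array.from({ length: limit }, async () => {
      while (next < urls.length) {
        const i = next++;
        results[i] = await process(urls[i]);
      }
    });
    await Promise.all(workers);
    return results;
  }

  // const results = await mapWithConcurrency(urls, 8, url => fetch(url));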
⬐ cloverichThere are many ways to do this, but at its simplest, because Node is single-threaded, you can have a shared array of URLs and then a pool of workers that .pop() a URL off and stop when there are none left. Each worker's http request runs async, so with the `async` keyword at the start of their signature, and `await fetch(url)` inside, the code reads (and in some ways behaves) like regular synchronous iteration but runs (mostly) concurrently.I'm always impressed at what one thread can do with non-blocking IO. I think node strikes a decent balance on parallel programming without the pitfalls of threading. Go is undoubtedly better at lightweight threading, but it's not as simple to grok (or debug)I was thinking, watching the video "Ryan should fork it, or start over"... until he revealed it: https://github.com/ry/deno :)⬐ sametmax⬐ pvsukale3My head can't wrap itself around:import { test } from "https://unpkg.com/[email protected]/testing.ts"
There are so many issues with that I don't even know where to begin.
⬐ ChicagoDaveI'm not a huge NodeJS developer, but have done enough (and other development) to think this is not a good solution to the packaging/dependency problem.My experience with npm issues was usually that some dependency had its own build process.
There are so many inconsistencies in how JS libraries are packaged and downloaded. I think scripting in general leaves the whole dependency thing out in the wind.
Now. If we could turn packages into testable objects, that would be cool.
⬐ thecrazyone⬐ dylrichWell, packages are testable. You can test the api of a package today. Having people write said tests is the tough part.Did you mean to say something else?
⬐ ChicagoDaveI meant more like a binary compiled object. So many libraries have a highly varied set of folders and files. It’s just messy.His arguments for it seemed reasonable - would love to hear your criticisms⬐ freditup⬐ iniminoNot OP, but a couple guesses off the top of my head:* It becomes a more painful process to upgrade dependencies (have to find/replace across your codebase).
* Many versions of the same library get pulled in. If you depend on [email protected] and a dependency of yours depends on [email protected], that's twice the dependency size as compared to both just using 1.5.4. This matters more in the case of web app bundle size though than running local programs.
⬐ t-sauer⬐ sametmaxYou could have an entry file for your dependencies (let's say dependencies.ts) and re-export everything from there. When you have to update, you only have to upgrade it in the entry file and you can avoid multiple versions.⬐ wereHamsterHm, how about we call that file package.json?⬐ myth_busterHow about we link them to modules downloaded onto the local file system instead, just in case of network issues.⬐ WA9ACEHow about a CLI tool to cache and build those modules/native extensions all at once? We could call that Deno Package Manager.What the others say.And also, golang tried no centralized package management by using git repos. It didn't end well, with people go-getting from moving masters. Of course they did.
And in an era of finally accepting lock files, do you really want to go back to the middle ages? Lock files are not a constraint. They are a godsend. You want lock files. You don't want to have to choose between vague dependencies and pinpointed ones. You need both to stay sane.
And of course somebody will do something dynamic that will open a security issue.
And of course typo squatting is going to be so much easier.
And removing a bad lib? From npm you signal the admin. When it's on its own domain? Good luck.
And then searching for libs is going to be fun. And naming, naming will be amazing.
And having a quick glance at the dependency of a lib ? So much fun.
And wait for the search/replace in the code that will change your entire run time by mistake.
And no possible alternative package manager. Hope their dependency resolver never sucks cause you will be stuck till you can install the next node... if they ever fix it.
Ah, and the git blames to see what dependencies have changed are going to be just peachy.
Oh, and your juniors copy/pasting code from the net is going to get extra crispy.
But wait, running the code on conditional imports means your project may install something at ANY moment in its life cycle. And it could be changed by somebody editing the code by mistake without really asking to change a dependency. You know, like a ctrl + D on "0.1" to replace all those floats quickly.
Also cool URLs don't change. Until they do. Tiny URLS dependencies are going to be hilarious.
I could go on and on and on and on...
I can't understand how you can be intelligent enough to code freaking node JS and not see THAT elephant in the doll room.
⬐ sagichmal> golang tried no centralized package management by using git repo. It didn't end well.What? There's no fundamental problem with it. It will continue. What do you mean?
⬐ allover⬐ JeremyBanksThe Go team recognises there's a fundamental problem with it, otherwise they wouldn't have created dep and vgo [1]⬐ sagichmal⬐ jcranendonk`dep` and `vgo` both continue to use URLs as import paths. They optionally provide other means of fetching code.I like Go as a language, but its lack of package management is the single reason I no longer use it.⬐ sagichmalGo has had primitive package management for 5+ years, and official package management (in some form or another) for a year.You are imagining the worst possible execution of these ideas. Nobody is proposing that you should start using libraries that pull in code from random domains, unless you have some specific need to. Whitelisting sources is such an obvious step, given the security focus, that you should really have applied the https://en.wikipedia.org/wiki/Principle_of_charity in your speculation.⬐ sametmaxNo one proposed to make spaghetti code with "goto".Yet we did. And we replaced it with "if" or "while", to avoid repeating history.
This is a prophecy: this dep managing concept, if kept in this form, will cause something terrible. It will.
Good luck.
But this is exactly the way the Web works. Right? And that platform seems to have done alright for itself.For libraries or professional projects, ship your dependencies or use a build system. But for one-off projects, being able to pull in a utility library without needing any tool but the interpreter you're using or some extra build step seems like a big win.
⬐ sametmax⬐ __sThe entire world moved out of inline dependencies in the HTML page to package managers. We invented bower or npm __because__ of it.⬐ iniminoWe invented bower or NPM because people thought it was a good idea to have 20MB of JavaScript on a page and 19MB of it written by third parties. Generally speaking, this turns out to have been a mistake.What if it said import { test } from "https://npmjs.com/[email protected]/testing.ts" ?⬐ MatekCopatekCare to elaborate?⬐ sametmax⬐ JoeriI did. But I didn't expect to have to. It's quite unsettling.Actually, I can't think of a single issue that npm doesn't also have.Possible issues:
- trust
Not an issue since if you trust an author’s packages on npmjs you should also trust them from unpkg, github, etc... (Aka there is no trust in npm land except trust in the author)
- Versioning
Put the version in the url instead of some json, same difference.
- Lack of ^ for automatic upgrades
That’s a feature. Avoids the need for lock files.
- Repetitive
Have a dependencies.ts to group and re-export external imports, same idea as package.json (a sketch follows this list)
Any other issues?
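A sketch of the dependencies.ts idea from the list above (URL, version, and exported name are all made up):
  // deps.ts — the one place external URLs appear
  export { test } from "https://example.com/[email protected]/testing.ts";

  // everywhere else in the project:
  // import { test } from "./deps.ts";
  // Upgrading a dependency is then an edit to one line of deps.ts, not a find/replace across the codebase.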
⬐ jonny_ehDependencies, especially for dynamic imports?⬐ ricardobeatGoing from what he said about packaging, every program/module is compiled into a single file, including dependencies.⬐ jonny_ehWouldn't that lead to even more bloat than what he regretted in the talk?⬐ pknopfRuntime dependencies are what he is trying to get rid of.You will always have build-time dependencies with npm/yarn.
Can anyone explain why he called Dart a complete failure?https://youtu.be/M3BM9TB-8yA?t=19m55s
ps: I am a junior developer. I am taking up Dart to learn Mobile apps development using Flutter.
⬐ thosakwe⬐ StreamBrightI wouldn't say Dart is a failure at all.That being said, the Dart project has long abandoned the goal of replacing JavaScript in the browser, and instead provides a VM, Flutter, and a to-JS compiler.
Many people do not know this, and have written the language off entirely. Hence a lot of the negative reaction to Flutter.
(Though, if I'm quite honest, this is a bit off-topic from the OP)
⬐ hajile⬐ munificentDon't forget that Dart now has an AOT compiler and is strongly typed. It is close to Java in performance in a lot of stuff (faster in some).Syntactically, Swift is almost identical (Dart being older) aside from GC vs ref counting. I think it has more of a future in the Java managed language area (but with better typing, better syntax, and first-class functions with closures)
I'm on the Dart team (but I don't speak for the entire team here). Dart had two initial goals:1. Get a native Dart VM into Chrome and eventually other browsers.
2. Get a significant number of client-side web developers that were using JavaScript to move to Dart.
It's probably not obvious, but these goals are in tension with each other. In order to motivate adding a giant new VM to a browser, you need to make the language pretty different from JS. Likewise, you need to make your implementation much faster than JS.
Both of those push you down a path where interop with JS is difficult. You don't want your language's semantics too close to JS because that reduces the value proposition of the language. And you don't want JS interop requirements to limit how you implement the VM around things like garbage collection.
But for (2), to get people to move, you need the absolute smoothest migration path you can get. You'll make all sorts of compromises and edge cases in your new language to reduce friction when getting developers to migrate to yours and you'll do anything to make interop seamless to support heterogeneous projects. (For example, TypeScript pokes quite large holes in its type system in order to play nicer with JS idioms.)
The Dart leads prioritized (1) over (2). The idea was that the VM would be so great users would flock to it, giving us (2). That didn't work out, unfortunately. In practice, I think it's very hard to create a language implementation so much better that it trumps the value of existing code. So you really do need to win at (2) at all costs, if you want a successful web-only client-side language.
That's the approach TypeScript has taken, and they did a fantastic job at it. Having one of the world's best language designers doesn't hurt.
In the past couple of years, in response to this and other changes in the landscape, we pivoted Dart. We now aim to be a multi-platform client-side language. In particular, we're the application language of Flutter, a cross-platform mobile framework.
Flutter is a very different platform than the web -- there isn't an existing entrenched corpus of billions of lines of code. Performance and memory usage matters more. You can't JIT on all platforms. Developers coming to Flutter are equally likely to be coming from Android (Java) and iOS (Objective-C, Swift) as they are the web.
Those different constraints play well to Dart's strengths. And, in particular, they align nicely with Dart's move to a full, sound static type system. Dart 2 is more "C# with less boilerplate" than "JS with more types".
Dart is still also a web language, and the better static type system really helps with static compilation to JS, but it's not our only path to success.
⬐ kjeetgill⬐ TheDong> 1. Get a native Dart VM into Chrome and eventually other browsers. > 2. Get a significant number of client-side web developers that were using JavaScript to move to Dart. > The Dart leads prioritized (1) over (2). ... That didn't work out, unfortunately.That's what I remember hearing when Dart was just getting off the ground. So what happened? Script tags can specify text/javascript or text/dart. Did the Chrome team just veto any integration? Did a prototype in Chromium ever even exist? Google put out experimental QUIC stuff before, so why not a new interpreter?
I'm convinced if it was out there we'd have seen some use and some hype! I'm a little disappointed to hear Dart pivoted away. Such high hopes!
⬐ mileycyrusXOXO⬐ darzuThey released an experimental version of Chrome called Dartium that included the Dart VM.⬐ kjeetgillNice. I never heard of it, I'll take a peek. I wonder if it was a lack of marketing that tanked Dart's browser aspirations...Thanks Miley Cyrus!
How would you characterize the positioning of Dart with respect to Go? Dart is for client-side, Go is for server-side? I'm also curious if you see having separate languages for these roles as desirable or just incidental.⬐ munificent⬐ davedx> How would you characterize the positioning of Dart with respect to Go?I think it's easy to over-estimate how much "positioning" Google actually does with projects like this. We are a very big company and different parts work fairly independently of each other.
Your description is how I think of the two languages, but you might get different answers from different people. I like classes and object-oriented programming in general, and I think it's a fantastic fit for UI applications. So I think Dart is an easier fit for that domain.
Meanwhile, Go's concurrency model and nice standard library seem to be a good fit for servers.
Given the breadth of software people write today, I think there's plenty of room in the world for lots of languages.
⬐ isoosDisclaimer: I've been working with Dart for 5+ years now, I'm running several small server-side Dart apps myself, and I also contribute to the Dart app that is behind pub.dartlang.orgThe Dart VM itself is great for server-side, however Google is focusing on the mobile and web tooling and support. While they do develop server-side packages e.g. for AppEngine-, gRPC-, or Memcache-support, connecting to databases like Postgresql is through a community-supported package, and sometimes it is hard to find an actively developed one.
Considering these limits, there are still good server-side frameworks, and there exist a couple of big full-stack Dart applications. I've created a HackerNews-crawler twitter-bot (@DartHype) in a matter of hours in Dart, and it has been running almost unchanged since then. Not that it is a big feat, but it was an easy thing to implement given the ecosystem.
If you have a fresh project, and you can select your database and other parts of your stack, Dart can be a good choice. Depending on the domain, the performance is close to that of Go or the JVM, and it is much easier for a beginner to pick up than other languages, while the tooling provides more safety than JavaScript or TypeScript.
However, if you need to connect to Sybase, it may not be the best choice.
⬐ hliyanI've recently started taking Dart seriously. Could you point me toward a good community / community resources? The official documentation seems a bit sparse...⬐ TheAceOfHeartsThe Dart community is tiny and package selection is incredibly limited. I'd argue that Dart would be a very poor choice for most developers. It's not easier to pick up than something like Rails or even Spring.⬐ isoosYour argument is noted, we just happen to disagree. I've mentored high school students for a couple of weekends to help them build their mobile app. Java (Android): struggle with the bloat. JavaScript (React Native): shoot themselves in the foot a couple of times - lack of tooling. Dart (and Flutter): instant success.The language, its consistent API, the IDE and tooling support with the static analysis is just great for beginners and advanced developers alike.
People like to hate Dart because it threatened to take away their beloved JavaScript. For those who have actually tried in the past few years, I only hear they wish their IT stack could be migrated to Dart. If you start a new project, choose wisely :)
> The Dart leads prioritized (1) over (2). The idea was that the VM would be so great users would flock to it giving us (2).This is actually why I, as a developer, decided not to use Dart: because it would give Chrome a competitive advantage and more control over the other browsers. Chrome has enough advantages as it is, and I like having competition in the browser market.
⬐ munificent⬐ pvsukale3The hope was that other browsers would eventually have Dart VMs too and there would still be fair competition.In many ways, this is similar to the path that asm.js/WASM took. First it's polyfilled to JS in all browsers. Then some browsers get faster native support while still polyfilling the others. Then eventually all browsers support it natively.
The initial asm.js design was a little different because as a subset of JS, the polyfill was a no-op. But the binary form of WASM, I think, required a JS polyfill on browsers that didn't support it natively.
Dart could have taken a similar path, but we didn't get the level of user excitement required to motivate browsers to follow along.
⬐ BrendanEichHi Bob - quick comment on this. The comparable to DartVM is not WebAssembly but PNaCl, which was Chrome-only. Wasm was much easier because asm.js proved the concept source-compatibly and in multiple browsers. I spoke to Anders Hejlsberg, Steve Lucco, and others at MS in fall 2013 and they got on board, even using OdinMonkey code licensed under ASL2 by Mozilla to overcome MS objections to the MPL.Meanwhile Dartium could not land in WebKit pre-Blink, because two GCs impose a cycle collector “super-GC” with inevitable extra cost. See https://lists.webkit.org/pipermail/webkit-dev/2011-December/... (I found this link from my HN post at https://news.ycombinator.com/item?id=12773857).
DartVM like PNaCl was trying for too much in Chrome, exceeding what other browsers could afford to embrace at high direct and opportunity costs in a competitive post-Chrome browser market. A spec would take many years and multiple competing implementations to forge. Code is spec with such big single-company-grown projects.
Thanks for your work on Dart.Does your team plan to release any solution like Flutter for native Desktop GUI apps?
⬐ wstrange⬐ ramses0It's not an official Google project, but there is a lot of activity here: https://github.com/google/flutter-desktop-embeddingPlease reconcile Dart with Haxe. Why should both exist?My previous question in 2016 was "Reconcile Dart and Go", the best answer I got was: """This isn't true; they're both general purpose programming languages with strong static type systems and decent async I/O stories. You can write a web app or a server in either language, though Dart has a better ecosystem for frontend development, and Go has a better ecosystem for backend development.""" https://news.ycombinator.com/item?id=12131397
It would be great if you could give a new comparison (post-Dart-pivot) as to how you differentiate between Dart and Haxe when both (apparently) have similar goals.
⬐ munificent⬐ todd8> Why should both exist?In general, I dislike existential questions about programming languages. No one seems to do that for other product categories. "Why should Monet and Renoir both exist?" "We already have Castlevania. Why do we need Metroid?"
I understand languages are a little different because each requires an ecosystem that is fueled by programmer attention. There is an opportunity cost that time developer X spends writing a library for language Y is time not spent writing a library for language Z.
But, overall, I try not to get too hung up on that. I find thinking of things in zero-sum terms stressful and often not very useful. There are enough programmers to support a wide variety of languages. If we're worried about not having enough total nerd capital to support all of those ecosystems, I think there are plenty of other inefficiencies we could focus on. Eliminating the 7,000+ different Node libraries for doing asynchrony would probably free up a few person-decades of effort.
Thanks for your insights. I’ve always liked the language Dart. It seems to me to be a fine language—a design that could be widely adopted. I was disappointed to see Google fail to achieve goal #1; I’ll be watching Flutter. Good luck and thanks for your work on Dart.In terms of mindshare, ecosystem, and real usage, it's a failure (when compared to popular languages).Regardless of how good a language it is, TypeScript fits in a similar space and is much more popular, has a more vibrant ecosystem, and is easier to migrate to.
⬐ lern_too_spel⬐ ScarbuttOne thing that allowed TypeScript to succeed is that it was never proposed as a native browser language and was a transpiled language from the start. Once it came out that Google wanted to make Dart a native browser language where it would likely displace the warty Ecmascript, Eich immediately railed against it and said Mozilla wouldn't support it, limiting what could be done in Dart without resulting in code bloat during transpilation.⬐ BrendanEichYes, I alone on my Skull Island stopped Dart! HAHAHAHA!!!!Google never sent me kickbacks to provide for foolish and doomed DartVM-in-Firefox work, so I laugh once again from Skull Island at such idiocy.
Flutter is relatively new compared to when Dart was created; Dart's initial goal was to be a language for web programming, and outside of Google, I think almost everyone agrees it failed at that.They have changed their minds many times about the design/future of the language, and the churn is still experienced today, see Dart 2.0 (not released as stable yet), which some would say is a new language from Dart 1.0.
⬐ PedroBatistaInitially Dart was (another) wet dream from Google that would take over the world by overthrowing JavaScript and conquering the Web. It would do what TypeScript does and much more, so by that metric it is a complete failure.They gave it another life with Flutter and it looks like it can be a big deal this time.
⬐ sametmaxThere is only one metric in programming languages: popularity. Not quality. Not elegance. Not features. Not performance.The Dart community is ridiculously small compared to anything from the competition, despite the heavy investment in it.
⬐ goatlover⬐ jenscowIt's kind of sad this is so often the case in tech.⬐ isoosI have got to know a few small teams in my local area, developing in niche languages (including Haxe and Dart), and they do not talk about it on forums or at meetups. They are busy developing their product, and the platform's quality and features help them, the performance is good, and they don't care what the rest of the world thinks with the current hype trends that will fade soon anyway.I've lost track of how many times I've heard my friends cursing JS and even TypeScript, and telling me they wish they could use a sane language instead. Well, you actually can, and I believe should.
FUD. He's a fan of JS, and probably has no experience with Dart.⬐ pjmlpDart was supposed to replace JavaScript and was head-to-head with TypeScript, CoffeeScript and many others in that goal.Obviously it failed at that and was rescued by the Google AdWords team that adopted it.
Now they are trying the 2nd coming of Dart via Flutter; it remains to be seen if it will ever take off, or how much Google is committed to releasing a production version of Fuchsia.
Personally I am not bothering with it, until I see Flutter listed here https://developer.android.com/guide/components/fundamentals
Maybe it is just me but JavaScript is not my favourite dynamic language. Given projects like ReasonML and other languages that have type inference, prototyping is not harder than without types. Is it a valid argument that it slows you down? Not sure.⬐ gameswithgo⬐ z3t4At least some research into types and productivity showed that the highest productivity was when function signatures have to have types but type inference exists inside functions (like auto in C++, var in C#)⬐ k__Seems like he prefers TypeScript because it's a superset of JavaScript.He also mentions he only knew MS made it, but not who specifically, which makes his choice sound more profound.
⬐ sebringjI was not happy having to do TypeScript on a project, thinking of your point about slowing you down, which initially is completely true if you are forced to tslint it to hell. However, I've done a 180 in that TypeScript is very nice when you relax the tslinter and you sprinkle it on as you go, which gives you that extra feeling of being auto-guarded when it's just more practical but without getting in your way, which is exactly what Ryan was saying in his Deno approach. I've been saved from loads of run-time errors by VSCode knowing types ahead of time, for example.⬐ acemarkeCould you summarize what settings you used for the TS compiler and tslint for that "sprinkle it on but keep it out of the way" approach?⬐ machiawelicznyTo have nice prototyping & development I use it more like a typechecker than a compiler.I set "transpileOnly" for ts-loader and set up a separate, stricter command like "npm run typecheck" mapped to "tsc --noEmit ./src/index.ts". I usually add only strict null checks and it's enough for me. I also disable most of tslint as it's way too heavy in a standard CRA app, for example.
This way you get hints from tsc in editor so you see what is wrong but at the same time you can run your app without fixing everything upfront.
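A sketch of that kind of setup, assuming a webpack + ts-loader project; the option names are real TypeScript/ts-loader options, the values are just one way to slice it:
  // tsconfig.json (tsconfig files allow comments)
  {
    "compilerOptions": {
      "allowJs": true,            // mixed JS/TS while migrating
      "strict": false,            // keep most strictness off...
      "strictNullChecks": true,   // ...but opt in to null checks
      "noEmit": true              // tsc only reports errors; the bundler does the emitting
    }
  }

  // package.json scripts: { "typecheck": "tsc --noEmit" }  → run with `npm run typecheck`
  // webpack: ts-loader with { transpileOnly: true } so dev builds don't wait for (or fail on) type errors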
⬐ sebringjBasically you turn off as many settings as possible that make things impractical. It's more about how you work, not a catch-all. I have VSCode and installed a supporting library to work with it. Basically it tells me what's wrong and how to disable it as well. I either inline disable it or go to the specific tslint rule and disable globally. It is most difficult when you have to use another person's boilerplate that was built without TypeScript AND tslint is turned all the way up.For security I make Apparmor profiles for each script. I think the module system and NPM is what made Node.JS popular. And personally I like function-passing style, aka callbacks; Promises and async/await look more terse but are actually more complicated and prone to errors. I also don't like that TypeScript extends the JavaScript language and adds a compilation step to it; it's much better to add doctype-like comments and you would get the best of both worlds, although I don't think type-checking is needed if you already do testing. For me static typing is mainly for performance, like in Dart; you can't simply make JavaScript more performant without it. With TypeScript you have "performance optimization" but without the performance benefit. If your code needs type annotations for others to understand what it does you need to use better names. The type annotations are for the compiler. Auto-complete and parameter hinting can be done via inference. And public parameters and methods should have documentation.⬐ rb808Interesting that he's unsure about Go. Would be nice to hear why.One huge strength about Node (&Deno) is having the same language and tools on the front end and back end. It's a huge benefit to have a team on one language, even if it might not be the optimal choice. I'm not sure if that is the problem he had with Go though.
⬐ faitswulff⬐ breatheoftenThe repo itself says it's "Segfaulty," which is far less likely in safe Rust, for example. Perhaps that's why he mentioned it as a candidate. Regardless, it sounds like he was doing a lot of work in Go, so perhaps it was just the tool that was closest to hand for him when he decided to build the prototype.⬐ curun1rIt might have to do with the overhead of calling into C/C++. Since a project like deno will have to do a lot of interfacing with V8, a language like Rust wouldn't have the overhead of cgo when dealing with the embedded VM. Also, while goroutines are one of the strengths of Go, they don't necessarily play well with concurrency constructs in other languages. Since a project like deno will probably be keeping Node's single-threaded, evented model of concurrency and using libuv (also a C interface), that basically means avoiding goroutines.To your point, though, you'd still use basically the same language on the server and client, since deno runs TypeScript and TypeScript can compile to run in a browser. The Go side would take the place of Node's native modules, which currently have a C++ interface.
⬐ zemothe point of Go is to have one language that's the best of both dynamic and static languages. the point of Rust is to be safe and fast. if you're building an infrastructure where you're externalizing the dynamic language runtime, such that there's always a host and a guest language, Go isn't necessarily optimized to act as a host language in that context. Rust is, at the expense of being arguably more tedious to program.⬐ pknopf⬐ garyclarke27He isn't (wouldn't be) using Rust for a dynamic runtime. It would be strictly for bootstrapping V8 and an FFI. With the FFI, you interop with whatever net stack you want.At the end of his talk, he mentions possibly using Rust or C++ instead of Go, I would guess for maximum possible performance.⬐ TheOtherHobbesI've never understood why this is a benefit when the one language is - let's say - suboptimal in many ways.The interfaces between server- and client-side code should be well-defined and language-independent. You don't want the same people writing both, because it's harder to check that your API is working to spec if it doesn't get fully independent testing.
There's also a lot of useful server-side optimisation and security management that Node - or a high-level replacement - can't handle.
V8 may improve things, but it's going to have to improve performance a lot to be competitive.
https://www.toptal.com/back-end/server-side-io-performance-n...
⬐ megaman22Too bad SOAP was such a buggy, incompatible, painful mess to use in practice.I've liked protobufs the little that I've used them. Maybe that could be a solution, but I don't think it'll happen.
A huge portion of my pain in web development is how amorphous and ill-defined the client-server interface inevitably is.
⬐ rb808> The interfaces between server- and client-side code should be well-defined and language-independent.They don't have to be and that is the point. If you write JS/TS front and back you can make everything much simpler with the associated benefits.
⬐ LargeWu⬐ ex3ndrBut in what specific way does it make it simpler? What are the tangible benefits? I've been working in a Node/React environment for the last 2 years and there is virtually no overlap between the code we use on front- vs backend projects.⬐ __shttps://etg.dek.im https://github.com/serprex/openEtG/tree/master/src Quite a bit of the server & client share code. Card handling logic, deck code handling in etgutil, eventually I'd like to move the game engine to being serverside, user management (easy optimistic protocol handling), SVG rendering was used on both until recently due to Chrome having a buggy SVG renderer
⬐ overcastBecause not everyone is using React type frameworks on the frontend? There is old fashioned javascript still, and because I know it on the front end, I know it on the backend.⬐ LargeWu⬐ SaltyBackendGuyBut I know lots of languages, and just because I'm using JS on the frontend doesn't mean that's the best backend choice.Whether I'm using React or Angular or Ember or Vue or jquery or vanilla JS on the frontend doesn't have any impact on the code I'm writing on the backend. Because one is concerned primarily with handling UI, and one is primarily for data access. There's just naturally not a lot of overlap between those.
⬐ overcastThat's awesome you know lots of languages, well done. It's not necessarily about what's the best backend choice, it's about getting shit done as efficiently as possible. For the vast majority of projects, you're never going to reach any type of limitation in Node performance, that you may or may not be suggesting. I'm not sure why you're so hung up on the front impacting the back, it impacts MY workflow. I don't have to switch gears, and look up documentation on how to do x y or z in this language, when I already know it in this language. Do you use arrays? Variables? Any basic programming structures? There is plenty of overlap.⬐ LargeWuI'm not talking about overlap in simple language features, I'm talking about actual code reuse. It's often touted as the reason to use JS on front and backend, but it's a pipe dream. It almost never works in practice.If I'm trying to get things done as efficiently as possible, I'm not using Node because the ecosystem is pretty terrible compared to other languages like Ruby and Python. Sure there are lots of libraries available in npm, but I've found that a lot of them are janky or hard to use, missing features, poorly documented, incompatible with each other, etc.
⬐ ScarbuttMeh, forget about code reuse, it's about using only one language for a full stack web app and knowing it well.I really don't see how python or ruby is any better than nodejs (it's more a matter of taste and familiarity and you don't sound that familiar with nodejs/js in general), if anything, nodejs because of v8 is light years ahead of the ruby and python official implementations, and all three communities have lots of bad and good libraries.
⬐ overcastI'm not sure where it's touted code reuse is the reason to use the same language on both sides. The reason is because it's the same language, and you're not going back and forth between different ones.⬐ LargeWuHere's the original post I responded to:> > The interfaces between server- and client-side code should be well-defined and language-independent.
> They dont have to be and that is the point. If you write JS/TS front and back you can make everything much simpler with the associated benefits.
That post implies that if you're writing in the same language on the front and back ends, it not only makes writing that language easier, but also somehow makes it easier to marry the two. Since most web services operate through some sort of HTTP interface, which is language agnostic, I can only infer that that poster somehow meant code reuse.
So there's that, plus I've seen countless times before over the years where people explicitly tout code reuse as the primary benefit to using JS on the backend. I'm not buying any of it.
⬐ overcastYou use what you want, I'll use what I want. No sense even arguing about it. In the end, users don't give a shit what made what, as long as they get something useful from it.So much this. Although in theory I can see why this would be appealing, however in practice, our Node backend and Ember frontend have very little code they can share. Maybe there's an argument for devs being able to jump back and forth. But again, in practice, the paradigms are very different so I am not sure there's a huge benefit.It is very useful if you want to make your devs not client or server but both. This works well in many organizations that i have seen: developer is responsible to build feature from ground to top - this speeds up things a lot.⬐ chrisweeklyYou're presupposing an old-school 3-tier architecture. Baking that client:server approach into your org structure is not necessarily optimal.⬐ mrigheleHaving the same language allows to share some code between frontend and backend. This can be used to prerender your SPA on the server, for example, or to share some logic with the client to enable offline usage.⬐ lloeki⬐ overcastBeing able to share code is really not that great because you don't share that much code between front and back in reality. Being able to share the paradigms, structure, mindset and tooling (like linters, formatters, code generators, packagers, whatever..) is what's awesome. You remove a whole lot of context switches and cognitive dissonance, smoothing the train of thought which greatly eases the expression of ideas.Source: used quite a bunch of golang + gopherjs, ruby + opal, js + node
⬐ davedxIt depends. I was working on a websockets based game using node.js on the backend and React, redux & canvas on the front end. As my game grew, more and more code went into the /shared/ folder. It's extremely useful to be able to mirror the game logic that's run on the server for security in the front end for convenience and performance (avoid waiting for round trips for everything).For example, I had a standard RPG style combat system. Being able to share all of the code that does calculations for combat allows you to put those calculations into the UI as much as you want.
I think standard CRUD web apps are different because there's much more asymmetry in the responsibilities of the back end and front end.
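A toy sketch of what tends to live in such a /shared/ folder (file names and formula invented for illustration):
  // shared/combat.ts — imported by both the Node server and the browser client
  export function attackDamage(attack: number, defense: number, roll: number): number {
    // The same formula runs authoritatively on the server and optimistically in the UI.
    return Math.max(1, Math.round(attack * roll - defense));
  }

  // server:  import { attackDamage } from "../shared/combat";  // validate the real outcome
  // client:  import { attackDamage } from "../shared/combat";  // preview it without a round trip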
⬐ kybernetikosI'm working with a system that lets you develop code in the client, then gradually move it further back in the stack as needed. It's incredibly useful for testing.⬐ jonny_eh> you don't share that much code between front and back in realityGood luck rendering React server-side in Go or Ruby.
⬐ lloeki⬐ repsilatTurns out not every application needs, let alone requires, React.⬐ repsilatI've never looked into server-side React, can you explain a bit about the requirements? Does the server keep a lot of session data around to maintain a shadow DOM for each connected client? How far is it in practice on the continuum between a fancy templating language and a full app framework? Does anyone put server-side React on server-side Redux?Whatever the case, if people write clients in React to talk to Ruby backends, can't they do the same to get "server-side React clients" that run in front of their backend app? Or does that go against the programming model somehow? Would it enforce more of a barrier between "the view" and the model than is traditional these days?
⬐ acemarkeWhile I haven't done SSR myself, my understanding of the general process is that it's about rendering that first initial page that's sent to the browser. This is most beneficial for speed of initial load and SEO purposes.An SSR React app that uses Redux would probably create a unique Redux store instance for each new client, dispatch just enough actions to fill in whatever data is needed for the initial render, and then serialize the Redux store state to the host HTML page so that it can be used to re-hydrate the Redux store on the client.
For more info, see the "React Server Rendering" [0] section of my React/Redux links list, and the "Server Rendering" recipe [1] in the Redux docs.
[0] https://github.com/markerikson/react-redux-links/blob/master...
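A condensed sketch of that flow (Express-style handler; rootReducer and App are placeholders, and the escaping trick follows the Redux docs recipe):
  import React from "react";
  import { renderToString } from "react-dom/server";
  import { createStore } from "redux";
  import { Provider } from "react-redux";
  import { rootReducer } from "./reducers"; // placeholder
  import { App } from "./App";              // placeholder

  function handleRender(req, res) {
    const store = createStore(rootReducer); // one store instance per request
    // ...dispatch just enough actions here to populate the data the first render needs...
    const element = React.createElement(Provider, { store }, React.createElement(App));
    const html = renderToString(element);
    // Serialize state for client-side re-hydration; escape "<" so a </script> in the data can't break out.
    const preloaded = JSON.stringify(store.getState()).replace(/</g, "\\u003c");
    res.send(`<div id="root">${html}</div>
  <script>window.__PRELOADED_STATE__ = ${preloaded}</script>
  <script src="/bundle.js"></script>`);
  }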
Depends a great deal on what the application is, I think. If you're doing heavy lifting (or anything a bit complicated) in the client and then need to make similar functionality available to APIs offline it's a godsend.My current hobby project is a kind of REPL/programming language that does all of its compute in the client, and being able to hit a button and export a "serverless" microservice is lovely side-effect of being able to run JS on the server. It isn't exactly "the backend" to my application, more "a headless client exposed to the net," though, I guess.
(As someone new to JS, the frigging Node ".mjs" fiasco almost derailed the whole thing, though... What were they thinking?)
Edit: oh, and "optimistic rendering" is also a big win -- if you can show the result of an action before it has persisted (or preview it before the user has done it) shared code can also come in handy. Though perhaps we're too scared of network round-trip times.
⬐ Scarbuttfiasco? isn't '.mjs' still the path forward for them?⬐ repsilatMaybe it is for them, but `node -r esm` is what I'm going to do for now.I'll probably start webpacking the backend soon instead, but I'm sure as hell not going to rename every single file in my codebase.
You don't see how having to know only one language is easier/faster than having to know two languages? A lot of people are doing both front and backend work. Oftentimes it's a single person running the show. I'll tell you right now I use Node for ALL my web projects, because I only have to focus on a single language. When I wear every hat managing the domain, server, databases, mail, user questions, and everything else involved in running a modern web app, one less thing to learn sounds great.⬐ BigJonoI don't see it. Any old idiot can learn a programming language. The difficulty doesn't come from the language, it comes from the environment and tooling.⬐ overcast⬐ bronsonSo what's your alternative? I don't know of any environment easier than installing Node. I type 'n latest', and yarn add/install. Done.Ruby? Laughable. PHP has a zillion dependencies. Go with all of its system paths. Java VM, no thank you.
⬐ BigJono⬐ iniminoYou completely missed the point. My point isn't that node is better or worse than any of them, my point is that if you're starting from scratch with no back-end knowledge, then the benefit of already knowing the language you're going to use on the back-end is dwarfed so heavily by the other stuff you have to learn that it's inconsequential.Then the environment and tooling sucks. This is a large part of what Ryan Dahl tried to solve with Node, and I think he got a lot of it right, leading to immediate popularity of the project.Most of the "infrastructure" stuff around deployment and, obviously, the JS ecosystem with all of its build tools and dependency explosions is still a mess, but it's important to keep eyes on the goal of getting rid of incidental complexity. Early Node did that well, modern typical Node development is of course another matter.
Once you get rid of as much accidental complexity in the environment as you can, having one less language to use is a huge win.
Half my Rails apps end up with JS code executed server-side by TheRubyRacer. It's odd, but significantly better than having to keep duplicate logic in sync, some written in Ruby and some in JS.⬐ eckzaWhen the only thing you have is a hammer, everything looks like a nail.You’d be surprised at how well screwdrivers work on those funny-looking nails with the threads and the slotted heads.
Snark aside: learning a second language is not really that hard, and will make you a better user of the first language in ways that you couldn’t possibly have predicted.
⬐ overcastI know plenty of languages, it doesn't mean I want to use more than one for a single project. I don't know what the argument is here. It's not like we're discussing why some no name language is being used, Javascript is prolific to say the least. It's backed by two of the biggest companies on the planet. As much as some wish it would, it's not going anywhere.⬐ eckzaThe argument is, "just because some of the biggest companies on the planet have started equipping their carpenters with specialized screw-hammers, doesn't mean that they wouldn't be better off using the right tool for the right job".For my part, I don't want it to go anywhere. Javascript powers some of the most interesting and exciting things in the world of software right now - but that doesn't mean that it's the best solution for every problem.
Why would deno implement “download on first encounter” for its module system ...?Would you ever want this vs an explicit “build” step that downloaded all the required resources ahead of time ...?
Does deno walk the program source and download everything that could be required upfront or do the resource loading on demand as it’s executed ...?
⬐ breatheoften⬐ jaequeryI think it would be a real shame if the module system design made it impossible to statically enumerate all the module's reachable from a program entry-point without executing the program ...much respect to ryan for coming out like this. i used to think that he might have thought node was the bee’s knees and that callbacks and promises were sent from the gods. glad to hear him confirm what a lot of js devs are feeling right now with the wretchedness of the js callbacks, promises, and generators. there is a better way and im glad he is trying to do something about it.⬐ tlrobinson⬐ chriswarboPromises (with async/await) aren't so bad. He explicitly says he regrets taking promises out of early versions of Node.js.⬐ Guillaume86He looked pretty ok with async/await (which uses Promises under the hood), one of the advertised features of his new project is top level await.⬐ spankaleeWhich is unfortunate. Top-level await is a very bad footgun and should almost never be used.⬐ tlrobinsonCan you elaborate? It seems useful for simple scripts and REPLs, at least.⬐ lostctownIt is useful for those things. I implemented a fairly involved scripting system that we now use inside all of our server instances. Each script in the system is an entry file mapping to an alias command and the entry file is invoked from the top level. To get the benefits of async/await, we write the logic of each entry script under main = async () => do_logic(), and call it at the end of the file as main().I just don't see how this workaround is the correct solution. Just saying that's it's generally a bad idea is not enough for me. At some point there will come a time when you want to treat a file as a function, and when that day comes you will want async/await at the top level.
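A side-by-side sketch of that workaround versus what top-level await would allow (fetchSomething is a stand-in helper):
  // Stand-in for whatever async work the script does:
  const fetchSomething = async () => "hello";

  // Without top-level await: wrap the script body in an async main and invoke it at the end of the file.
  const main = async () => {
    const data = await fetchSomething();
    console.log(data);
  };
  main().catch(err => { console.error(err); process.exit(1); });

  // With top-level await (in module contexts that allow it), the wrapper disappears:
  // const data = await fetchSomething();
  // console.log(data);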
It's nice to see some recognition of the idea that interpreters are safe by default (excluding infinite loops/OOM), and we can avoid many security concerns (access to files/network/etc.) by simply not including that functionality in the interpreter (unless opted in via a startup parameter, as described).I'm also a fan of using env vars for configuration and locating dependencies, as mentioned in the talk. Much simpler and easily extensible compared to e.g. vendored directories (requires messing with the contents of the source directory, which requires write access and breaks hashes, etc.), hard-coded system paths like /usr or ~/.some-default-location (causes conflicts when running multiple incompatible versions), etc. A simple env var of paths, e.g. colon-separated, maybe with some sane quoting convention, can be used for all of those if desired, whilst making it super easy to extend-with or restrict-to any other location(s) instead. It's also trivial to "bake in" an env var to an application, just call `PACKAGE_PATH=foo:bar my_app` rather than `my_app` (or make a one-line wrapper script).
I agree with others that importing from URLs seems like a bad idea: network I/O is one of the least reliable actions we can take, which would make importing far more complicated than necessary (what if we're offline? should we follow redirects? should we check for proxy settings? how should we report errors? etc.). All of this complexity and the inescapable problems of network failures are completely avoidable by just downloading things up-front. Package managers/build tools can fetch whatever they like (URLs, git repos, etc.), however they like (with proxies, caches, etc.), to wherever they like (one big cache, project-specific vendor dirs, whatever), and just stick the resulting directory (possibly of symlinks) in the program's environment (e.g. via a wrapper script, as above).
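A small sketch of the env-var convention being described (PACKAGE_PATH is the made-up name from the comment; the actual resolver is left out):
  // Invocation: PACKAGE_PATH=./vendor:/opt/shared-libs node app.js
  const searchPath = (process.env.PACKAGE_PATH || "").split(":").filter(Boolean);
  console.log("resolving dependencies against:", searchPath);
  // A wrapper script "bakes in" a default simply by setting the variable before exec'ing the app.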
⬐ tome⬐ misiti3780> I'm also a fan of using env vars for configuration and locating dependenciesO please just let things be configured by a single JSON value, built from smaller JSON values if necessary.
⬐ chriswarbo⬐ spankaleeSure thing. How do you get such a JSON value into an application though?What I'm saying is to use env vars. You could put your JSON straight into an env var, or if you want a persistent JSON file on disk then put the path to that file in an env var.
⬐ tomeThat's totally fine. I just never want any more than one env var, otherwise the API surface becomes very flaky very quickly.URLs can be relative and absolute paths like HTML supports, they don't have to hit the network. This still leaves room for a package manager, but it has to put packages in known places, or resolve imports at install time.I have always really enjoyed listening to Ryan speak. He seems very humble for all of his accomplishments.⬐ k__⬐ matchagauchoYes. He seems like he doesn't care much about the success of Node.js.It was just a cool project and he moved on to the next cool thing.
⬐ curun1rAgree completely.Although one of the more interesting "talks" I ever heard him give was a discussion on promises. I was lucky enough to be in (the ridiculously long) line behind him waiting for food at NodeConf in 2012 and he and an engineer from Microsoft had a pretty spirited discussion that explored the subject in way more depth than I had previously thought possible. Ironically, given that he now considers the removal of promises to be a mistake, it was the engineer from Microsoft who took the pro-promises side of that argument.
⬐ tlrobinson> it was the engineer from Microsoft who took the pro-promises side of that argumentJavaScript's async/await is essentially taken from C# (with "promises" instead of "tasks"), so that could explain it.
He buried the lede. The talk highlights some regrets, but the most interesting part is a discussion about a new framework he's building based on Go and TypeScript: deno https://github.com/ry/deno⬐ tlrobinsonI'm glad to hear he (and the Node community in general) has come back around to promises.I remember the very early version of Node.js that had them (looks like they were added in v0.1.0 and removed in v0.1.30), although they weren't true chainable promises we have now.
⬐ BenoitEssiambre⬐ tannhaeuserDear god no. Promises are ruining the best thing node had going for it: its beautiful Lisp-like async model.Promises add hidden state, corner cases and incompatible libraries for very little gain.
⬐ tree_of_item⬐ jonny_ehCan you explain why you think Lisp is involved here?⬐ BenoitEssiambreJust how easy JavaScript, like lisp, makes it to pass around pure anonymous functions and make good use of closures (if you are willing to tolerate a few brackets). Promises had to ruin it by wrapping functions into stateful objects.Really the best case I found against Promises comes from, believe it or not, the pro promise chapter of a book:
https://github.com/getify/You-Dont-Know-JS/blob/master/async...
I really don't understand how the author managed to view the things he described as positives.
Just the length of the chapter is a testament to how overly complex promises are. On top of bringing in a ton of jargon such as "thenable", "rejection", "fullfilment", "future value" (I think the author means you are guaranteed a nextTick whooptidoo), "uninversion of control", "revealing constructor", it admits that promises do not solve most of the issues mentioned about callbacks in the previous chapter and often makes things fail in more subtle ways.
It describes how you have to manually call things like Promise.race() to solve some of the problems, hardly making things more automatic than callbacks.
He talks about "Thenable Duck Typing" saying horrifying things such as
"Given that Promises are constructed by the new Promise(..) syntax, you might think that p instanceof Promise would be an acceptable check. But unfortunately, there are a number of reasons that's not totally sufficient.
Mainly, you can receive a Promise value from another browser window (iframe, etc.), which would have its own Promise different from the one in the current window/frame, and that check would fail to identify the Promise instance.
Moreover, a library or framework may choose to vend its own Promises and not use the native ES6 Promise implementation to do so. "
and
"The standards decision to hijack the previously nonreserved -- and completely general-purpose sounding -- then property name means that no value (or any of its delegates), either past, present, or future, can have a then(..) function present, either on purpose or by accident, or that value will be confused for a thenable in Promises systems, which will probably create bugs that are really hard to track down."
He praises immutability but ignores the fact that promises add mutable hidden state between your call and callback.
He mentions that promises can silently swallow errors and calls it the "Pit of Despair". There is much more. Read the chapter.
Pit of despair indeed.
Callbacks are so much simpler and beautiful:
https://medium.com/@b.essiambre/continuation-passing-style-p...
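The "then hijacking" complaint quoted above is easy to demonstrate; a small sketch:
  // An ordinary object that happens to have a method named "then":
  const record = {
    id: 42,
    then(resolve) { resolve("surprise"); }
  };

  // Anywhere the promise machinery touches it, it's treated as a thenable rather than a value:
  Promise.resolve(record).then(v => console.log(v)); // logs "surprise", not the record
  (async () => console.log(await record))();         // same: "surprise"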
⬐ fernandopjI personally prefer promises over callbacks, but kudos to your compelling reasoning. Promises are indeed a beast to tame, they do make debugging very difficult among other things. I think the ecosystem evolved to just agree we should move as fast as possible to using async/await and be done with callbacks AND promises.I find it really bizarre that Node still doesn't have full promisified built-in libraries. It's not like it'd break existing APIs.⬐ tlrobinsonNode 10 has fs, at least:require("fs").promises.readdir(".").then(console.log)
⬐ jonny_eh⬐ dvlsgWhy not just:require("fs").readdir(".").then(console.log)
⬐ tlrobinsonThat's the legacy callback version.In practice you'd probably do:
const fs = require("fs").promises; // ... fs.readdir(".").then(console.log);
⬐ kjeetgill⬐ salehenrahmanAh, that's pretty nice. The previous example was pretty janky but I didn't foresee it being used like this. Eyes opened.It can potentially break things.There is legacy code that assumes that calling `readdir` will yield undefined, and will have just passed that result to a function that alters its behaviour based on whether a parameter is undefined.
⬐ jonny_ehInteresting. Seems like a great opportunity for a major version bump.I think they're testing natively provided promise APIs with the fs module.https://github.com/nodejs/node/pull/18297
It's of course not a full promisification of every built-in, but it looks like they're at least trying it out.
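For anyone who hasn't seen the API being discussed, a rough sketch of the options side by side, assuming Node 10's experimental fs.promises namespace and the util.promisify helper that has shipped since Node 8:

    const fs = require("fs");
    const { promisify } = require("util");

    // Classic callback style: error and result arrive as arguments.
    fs.readdir(".", (err, files) => {
      if (err) throw err;
      console.log(files);
    });

    // Node 10's experimental promise-based API on the same module.
    fs.promises.readdir(".").then(console.log).catch(console.error);

    // Or promisify the callback version yourself on Node 8+.
    const readdirAsync = promisify(fs.readdir);
    readdirAsync(".").then(console.log).catch(console.error);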
Let's not forget node.js wasn't created in a vacuum but was based on CommonJS (module format and std lib) also implemented by TeaJs/v8cgi, Helma, and many others [1]. In fact, server-side JavaScript was a thing as early as 1999 or before (Netscape Server). That it's based on a highly portable language also used on the browser is what made it attractive over alternatives for me back in 2012 or so.[1]: https://en.wikipedia.org/wiki/List_of_server-side_JavaScript...
⬐ pibi⬐ davidwYep, I remember I discovered node (was @v0.1.20 or so) when I was working with Jaxer, 2008 maybe.⬐ mlinksvaThe Netscape Server SSJS thing was originally called LiveWire. I used it in 1996 and it was terribly buggy, but the web says it dates from 1995.Some kind soul want to do a summary? I got through "uh hey, uh so" and remembered why I don't do videos.⬐ d0m⬐ logicalleeTry "right-arrow" (you can keep it pressed). Sounds stupid but it used to be an issue for me and now I can watch lengthy videos in a minute, a bit like how one can go through a book quickly by flipping pages and going back to the interesting bits.⬐ pookehI watched on 1.5x speed.⬐ billbrownHere is his PDF of slides - http://tinyclouds.org/jsconf2018.pdfHere is an amazing quote from a different interview:https://www.mappingthejourney.com/single-post/2017/08/31/epi...
Apparently these days,
"Ryan: Yeah, I think it’s… for a particular class of application, which is like, if you’re building a server, I can’t imagine using anything other than Go"
!!
⬐ iniminoIt's wild to see this after having been around in the early days of Node, and now having also moved most of my attention to Go as well.Ryan has always had, IMHO, excellent taste and good instincts and guts to make things simpler. A lot of the accidental complexity in our industry persists simply because people tolerate it, and it's great to see that he hasn't lost his fire in that area. Looking forward to see more about Deno.
The point about promises was an interesting one for me personally since I was one of the people at the time who argued in whatever small way for taking promises out, in the hopes that the language community would come up with something better.
I have mixed opinions on the topic now, but it's interesting to speculate about what might have been. The Node.js ecosystem was weakened by having different ways of handling the async question, and by a lot of developers not knowing the best practices for using callbacks effectively, which led to the "callback hell" that was totally unnecessary. It's possible that having promises baked into an early Node would have constrained or even fragmented what we ended up with in the language, and that would have been worse. I'm still a little disappointed that we didn't end up with anything more elegant than promises in the language itself.
It's interesting to compare package management in Node and Go. NPM got the early adoption and became the de-facto package manager at a time when JS had no such thing. In Go, the package manager question has been unsettled for a much longer time and there are more chances to experiment. Package management is simply difficult, and it seems impossible to design a good programming language and resolve the package management questions at the same time. It's sad to see some of the criticism against NPM... it's much easier to criticize than to build a better system.
⬐ azylmanIt's interesting to hear him say that npm and node_modules are regrets since lots of complaints about Go packaging from people new to Go ask for something similar...⬐ geocar⬐ partycoderYeah I was surprised about node_modules as well. I think that's actually a killer feature (that we don't need $NODE_PATH or virtualenv or similar)!⬐ jonny_eh⬐ k__I've found it has certainly helped with debugging. I find it nice that all of my project's code is in one directory.To save on disk space, use pnpm, best of both worlds!
I think the package system consists of many parts.Some parts of npm are much better than in most package systems, some really suck.
Maybe if npm weren't included so deeply in Node, something like Yarn would have emerged sooner and replaced npm without much hassle.
⬐ jessaustinyarn just seems like a set of incremental improvements over npm? Which is great, we're all for improvements. However, the vehement complaints I've seen about npm [0] make it seem that nothing short of a complete re-architecture could be tolerated. yarn does not seem to be that.I don't agree with those complaints, but I do agree with you that "some parts" are really good. node really figured out the right search strategy for an unscoped import. (e.g. "require 'foo';" rather than "require './bar/foo';") Just look for a directory with the name "node_modules". If you don't find it, go up one directory and try again. So simple! So predictable! So complete! It works so well, all manner of "left-pad" abominations can be supported. Any other system should think very carefully before using a different import search strategy.
[0] with the exception of those related to path depth: I think those are resolved now? I wouldn't know because I stay on an OS that doesn't go out of its way to frustrate me.
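A rough sketch of that walk-up search strategy, with everything else Node actually does (package.json "main", index files, core modules, caching) left out; findInNodeModules is just an illustrative name, not a real API:

    const fs = require("fs");
    const path = require("path");

    // Look for ./node_modules/<name>; if it isn't there, try the parent
    // directory, and keep going until the filesystem root.
    function findInNodeModules(name, fromDir) {
      let dir = fromDir;
      while (true) {
        const candidate = path.join(dir, "node_modules", name);
        if (fs.existsSync(candidate)) return candidate;
        const parent = path.dirname(dir);
        if (parent === dir) return null; // hit the root without finding it
        dir = parent;
      }
    }

    console.log(findInNodeModules("left-pad", process.cwd()));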
⬐ k__Well, since npm was the de facto default, it had enough time to catch up, I guess.Deno is not a good idea and Ryan Dahl should just let node.js developers moving to Go continue doing just that.Go is a well designed language that is productive, has a very tolerable learning curve, good documentation and does not need a JavaScript facade.
If kids in the 80s without the Internet could learn programming, you can learn Go in 2018.
Expect another video in 10 years: Things I regret about Deno, probably having advice very similar to this.
⬐ danschumannI still remember the first time I saw his node.js release video- changed my life.⬐ NoneNone⬐ dxhdrI don't understand the rationale behind using V8 for server code. Yes V8 is a general purpose JavaScript engine but ultimately all of the performance trade-offs and design decisions are made with browsers as the optimization target.It sounds like Ryan is still interested in making V8 work so I have to ask: why do you want to be writing server code on a client browser engine?
⬐ sametmax⬐ VikingCoderBecause that's the best existing engine (even with Zilla nicely catching up), with dozens of geniuses and millions of dollars dedicated to it since the beginning and for the years to come.Why do you think JS became so much faster compared to the anemic octopus it was before?
And can you imagine the perf of Ruby or Python if a tenth of those resources were allocated to them?
Why msg.proto? Why not json?This seems like an odd dependency to me, and it seems like it adds no value.
⬐ pknopf⬐ qaqPerformance.⬐ VikingCoderThat seems highly unlikely.I'm going off of what I've seen myself, and what I've seen in the xi editor:
https://github.com/google/xi-editor
"The protocol for front-end / back-end communication, as well as between the back-end and plug-ins, is based on simple JSON messages. I considered binary formats, but the actual improvement in performance would be completely in the noise. Using JSON considerably lowers friction for developing plug-ins, as it’s available out of the box for most modern languages, and there are plenty of the libraries available for the other ones."
⬐ dwetterauDoes the xi editor use a JSON transport layer for all syscalls though?Low-friction interfaces for developers are great but I'm not sure I agree with the comparison here.
⬐ VikingCoder⬐ iniminoYes, it does. The frontend and the backend only speak JSON."Performance" in a text editor means latency, and it's not going to matter there when the text editor also has to edit the text, update the display, etc."Performance" in a server also means throughput. There's a reason why protobufs exist and it is all about performance (and type safety which is related) and there's no way JSON serialize/deserialize overhead, in both directions every time, is going to be a good choice here.
Deno looks very promising⬐ ebbvRyan rightly warns about adding “cute” and unnecessary features to projects. Then he goes and adds the “Load module directly from a URL” feature with all its complexity to Deno.Ryan; kill that feature now. It’s not needed. It’s just cute.
⬐ dergachevI first read this as "Things I regret about Node.js by Roald Dahl" and was instantly intrigued!⬐ modzu⬐ snekcan somebody downvoting the comment above kindly explain why?⬐ sdfsdfdsfgs⬐ genjipressHackers don't like fun. They are serious and important people.⬐ bshimminOne-line jokes are generally not well-received on HN unless they are so exceptionally amusing and original that they are beyond reproach. The standard rationale for this - which can make HN seem very dry and humourless at times - is that people want to avoid HN becoming like Reddit.⬐ modzuthanks for answering. but thanks to the parent comment i learned about a new (albeit widely recognized) author. thats useful because we all come to hn to learn things. i suspect most downvoters probably dont even understand the reference and just see a joke qua joke. boo.⬐ hboonI don't know how entertaining they are for adults, but I enjoyed reading Boy and Going Solo when I was younger.⬐ cafard⬐ nsomaruI encountered them probably in my 40s, but did find them enjoyable."Boy", collection of short stories including "Henry Sugar"⬐ whatsstolatInteresting, it's strange to hear that someone hasn't heard of Roald, but that's a generational and nationality bias.⬐ chris_wotWell, don't feel you can't read his children's books. They are incredibly entertaining. Try reading The Twits without laughing... very difficult!"Things I Regret About Java," By Roald Dahl, author of "Charlie And The Class Factory"⬐ jacquesm"Charlie and the Chocolate Factory Factory"⬐ wolfgkeRather: Charlie and the Chocolate Factory Visitor.tl;dr converted ts dev talks about how he screwed over the node.js ecosystem and how everyone should therefore use his shiny new project instead :/⬐ coldtea⬐ ameliusHe didn't just "screw it" -- he also created it.TL;DR, from quickly browsing through it:He mostly regrets the way Promises work (edit: that the async programming APIs don't work with them), security/sandboxing, and the module system.
Then he introduces "Deno", a successor which is under construction and which will fix all these problems. It exposes a language based on TypeScript. Internally, some parts of Deno are written in GoLang.
⬐ k__⬐ bitwizeHe regrets not including promises early on."And JavaScript is a the best dynamic language." [sic]Uhhh, Ryan, there's this thing called Lisp...
⬐ nodesocketWe've missed you Ryan. :-)⬐ emehrkayInteresting that he had promises in node that early. Maybe if he kept them, the community wouldn’t have standardized on two space indents.Edit: so two space indents weren’t a consequence of “callback hell”?
⬐ abiox⬐ krmmaliki could see nested callbacks favoring two spaces, but overall it's not a style unique to javascript.for example, a lot of ruby i've seen (admittedly it's been a while) used two spaces, and i don't recall it being any more prone to deeply nested code than other languages.
that said js code tends to be rendered in a variety of other places like browser dev tools, which may influence this as well.
⬐ jessaustinAfter getting used to 2 spaces in js I try to use them everywhere now. Except python, which obviously must be 4 spaces...Great talk. I have missed Ryan and I'm not even a developer. I just love his mind. He always demonstrates a lot of purity of thought in many ways.By the way, was I the only person to notice that “Deno” is an amalgamation of “Node”?
⬐ quickthrower2⬐ uqimerioniAlso an anagram of "Done"⬐ bitsoda*anagramWTF slide brought me here. In a world where Python and Ruby exist, calling JavaScript the best dynamic language is nothing but heresy.⬐ always_good⬐ rmrfrmrfHow?Javascript has a lot going for it from async-everything to conveniences like destructuring. And it works in the browser and has things like Typescript.
I have to find good reasons to use another dynamically-typed language over Javascript.
⬐ LargeWu⬐ bgormanI would argue async-everything is the biggest hassle with JavaScript. Most of the time, I need things to run synchronously. Do this thing, then do that thing based on the result of the first thing. The need for async is the exception. So what we end up doing is expending extra effort forcing all the async stuff to run synchronously when that should be the default case.⬐ duderificThank you for summarizing this so succinctly. The true use case for async in the browser is rare. Sure, you're not blocking the UI while making server calls, but how often do you need to click around while waiting for something on the server to happen?⬐ emilsedghWith the callback methods it was really a hassle.With async/await it's so nice now.
await step1()
await step2()
That's it.
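Spelled out slightly more, with the second step depending on the result of the first; step1 and step2 here are hypothetical stand-ins, stubbed with timers purely for illustration:

    const { promisify } = require("util");

    // Hypothetical async steps, stubbed only so the sketch runs.
    const step1 = (cb) => setTimeout(() => cb(null, 1), 0);
    const step2 = (a, cb) => setTimeout(() => cb(null, a + 1), 0);

    // Callback style: the "then do that thing" dependency shows up as nesting.
    step1((err, a) => {
      if (err) throw err;
      step2(a, (err2, b) => {
        if (err2) throw err2;
        console.log("callbacks:", b); // 2
      });
    });

    // async/await over promisified versions: the same dependency reads top to bottom.
    const step1p = promisify(step1);
    const step2p = promisify(step2);

    (async () => {
      const a = await step1p();
      const b = await step2p(a);
      console.log("async/await:", b); // 2
    })().catch(console.error);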
⬐ LargeWuI get what you're saying, and yes, async/await is much better. But the entire reason it has to exist (and be used everywhere) is because javascript's default async behavior gets in the way so often. The irony is that when I do want some sort of async behavior, I often have to reach for something like bluebird anyway because I want better control over the Promises. So either I'm circumventing JS's default behavior by using async/await, or using a library that makes promises manageable. Almost never is it preferable to me to use JS's standard plain async behavior.Clojure is quite clearly the best dynamic language ever created.⬐ ScarbuttI think you are taking it too seriously.⬐ NoneNone⬐ mschuetzUnlike Python, javascript did a lot to fix its issues in recent years. Python still can't get scoping right.⬐ NoahTheDuke⬐ rictic> Python still can't get scoping right.Python's scoping is easy, it's just also very shallow. What are your problems with it?
I like all three languages. I spend most of my time with JS because it runs in the browser with no overhead, and it has the highest quality VMs.TBH I felt like a lot of this could have been resolved if he stayed involved with the community and opened issues with the relevant repos.Seems pretty classless to rage quit the community, brag about how much better Go is, then mark your return at a JS conference by shitting on Node and hard-forking the server-side JS community.
⬐ davidy123The first post in this issue has a summary of some performance problems with node.js: https://github.com/ry/deno/issues/162Though, it's posted in a confrontational way so the conversation deteriorates.
⬐ Touche⬐ VeejayRampayNo, that thread is everything that is wrong with open source.⬐ blindwatchmaker⬐ apazzoliniSomething similar happened from Linus Torvalds just recently: https://news.ycombinator.com/item?id=17242103Is being a dick pretty common in open source discussion?
⬐ davidy123I don't think your comment is clear at all. Do you mean the "community" or (arbitrary) toolsets of open source?⬐ NoneNone⬐ ToucheThe community. Specifically that someone thought it was appropriate to post an issue shitting on someone's old project in that person's new project's issue tracker.⬐ davidy123The original post could have been constructive. It's at least a good possibility that a new thing can be developed based on criticism of an old thing, and some of the author's points were just building on what Dahl said, so it could have been a useful discussion. I don't agree with another comment in this thread that it's easy to (succinctly and with depth) point out what's wrong. The person who initially responded to the issue could have included these possibilities rather than playing full-on cop. These systems are still largely organic. But the original poster's replies are hostile and in light of the "radio silence" request mentioned later in the thread, it wasn't ultimately a good post.⬐ DemiurgeIt's also kind of pathological when someone focuses on just one technical aspect of software, like performance, or security, completely unable to reason within a large context of people doing things for other people.All I could think while reading that issue was "you're not wrong, you're just an asshole."⬐ always_goodOP was incredibly immature and the resulting discussion was worthless.> I do not need any help from you thanks. This is aimed towards Ryan
Yikes.
Also, calling everything a horrible miscarriage is not a discussion.
Pointing out the problems with something is easy. Anyone can do it. What's hard is evaluating problems in relation to the trade-offs that were made. Only amateurs think everything is "horrible" as if there are never trade-offs.
I agree with my sibling: it's only a good example of a worthless GitHub issue "conversation".
⬐ davidy123OK, that's fair. I'll edit my comment."And then there were people adding modules. I thought to myself, this project is done now. So wrong."Seriously why are we as a community still entertaining those outlandish statements and attitudes? If you heard that sentence as a non-tech person, you'd swear modules is a computer virus or something, that it's so inherently bad that it's not even worth discussing.
Props to Ryan for his amazing work, but come on let's try to be civil and respectful here.
⬐ abiox⬐ royjacobsi'm not sure what you mean by "outlandish statements and attitudes".when he said "this project is done now" my takeaway was that he didn't anticipate the popularity of node. and "so wrong" was him acknowledging that he was very obviously mistaken.
⬐ sonofaplumThat's not what he meant. He was not saying "People are adding modules, modules are bad, this project is ruined, [modules are] so wrong." He was saying, "People are creating modules, people are building on top of my work, this project is COMPLETE. [I was] so wrong."⬐ bluesrooI'm glad you elaborated on this because I also interpreted it as an overly negative attitude towards modules. He was definitely a little nervous while talking!⬐ VeejayRampayAh my bad, completely misunderstood what he was saying.Thanks for providing context.
I think it's quite interesting to see that originally node.js was presented as a bloat-free alternative to "enterprise languages" like Java, C# or even Python or Ruby. A lot of complexity was subsequently added in an ad-hoc way which has resulted in (for example) a package management system that's wildly out of control.It's very popular of course, so I'm definitely not arguing that metric. However, the stuff that was originally called out as an example of tooling with unneeded bloat and complexity (Maven) has now been reimplemented in JavaScript, but poorly (npm).
⬐ dnomadWatching the JavaScript community poorly reinvent the wheel has been very disappointing. Very simple mistakes like immutable, never changing build releases that Java developers understood 15 years ago are becoming recent front-page news in this community. Ironically, even though all the code is open source, pre-existing knowledge does not get leveraged in the open source world. There's a kind of market failure at work here it seems; the lack of commercial selective pressure results in the flourishing of lots of poorly researched OS solutions.⬐ wolco⬐ cpetersoI like the energy around the javascript everywhere movement. So what if they reinvent the wheel, sometimes you find a better wheel and break the rules along the way.There is something exciting about developers using a language in ways it was never designed. Then having the language change to support the changing ecosystem...
⬐ ppeetteerr⬐ minitechIt's when you build a business on a technology and then have to re-invest to rebuild the product, that's when it becomes an issue. Think of all the start ups that built running businesses on Angular 1.⬐ abiox⬐ ahansenas far as i know, angular 1 still works just fine... :)(as it happens 1.7 was released recently.)
⬐ ppeetteerrIt does but try finding a developer looking to work on Angular 1.So true, at the end of the day people are working like this because it's the way they feel the most passionate about. You can't really blame them considering how disinterested people can get working on that last 10% of even their own projects.⬐ kingluditehaha, why do we do this? Should we judge programmers by how finished their thing is? I think it is the most impressive quality when I see it (which is none of my own stuff haha)> Very simple mistakes like immutable, never changing build releasesCan you expand a bit more? Not sure what this means.
⬐ dnomad⬐ jecxjo[1] https://en.wikipedia.org/wiki/Npm_(software)#Notable_breakag..., [2] https://www.csoonline.com/article/3214624/security/malicious..., [3] https://news.ycombinator.com/item?id=16087024And the list goes on and on IMO. What's disappointing is that these were lessons learned a long time ago and now they're being re-learned.
⬐ staticassertionThis stuff isn't relevant at all to the talk - he never talks about npm or anything to do with package managers but instead how node does imports etc.But anyways, [2] is at least a problem in many other package repositories. [1] would probably be a problem for many - given legal pressure (vendor your shit, that's the solution). [3] was a bug, not a design issue - no package management system is immune to bugs.
The one thing Java has is that it uses namespaces, which may help with [2] (but barely). [2] certainly has been a problem in PyPI.
Certainly all of this could happen to PyPI. We see it happen with js more, I think, because js happens to be extremely popular so there's a ton of packages for it and it's also much younger (especially node) than others.
⬐ tcheard⬐ ProAm> This stuff isn't relevant at all to the talk - he never talks about npm or anything to do with package managers but instead how node does imports etc.He does have it in his slides.
Slide titled: "Regret: package.json", last 2 points:
> Ultimately I included NPM in the Node distribution, which much made it the defacto standard.
> It's unfortunate that there is a centralized (privately controlled even) repository for modules.
⬐ staticassertionYeah, I remembered the package.json bit, but that part still had nothing related to the issues/ mistakes mentioned.⬐ tcheardUnderstood, sorry I wasn't trying to dispute any of your points about said issues/mistakes.Just trying to clarify that he does actually talk about NPM and his regret about it.
⬐ staticassertionYeah, I actually only remembered the bit about 'package.json' and not the other quotes as well lol"The Wheel of Time turns, and Ages come and pass, leaving memories that become legend. Legend fades to myth, and even myth is long forgotten when the Age that gave it birth comes again." - R.Jordan⬐ jjjensen90⬐ fold_leftI'm afraid the current frantic pace of reinvention in JS/web might cause the Breaking of the World and throw us into a Third Age where no one quite remembers the true lessons of the Ages before.More frustratingly, at least for me, is that some of us have been warning about these things for absolutely years without many paying any mind, only for them to keep in happening again. Eg. https://news.ycombinator.com/item?id=16090120⬐ irrational⬐ minitechThe problem is, how do we know that those shouting warnings are not false prophets?⬐ lgas⬐ RetraPresumably by reading and understanding their arguments.⬐ irrationalBut, how do we know which ones we should be reading and understanding their arguments?I've found that some people tend to take a ton of pride in the assumption that they have to make mistakes to learn from them, but you almost always want to learn from other peoples' mistakes first. Probably overlaps quite heavily with those people who desperately want to tread on new paths.Sounds to me like the list has one item: “the npm registry once allowed users to delete packages”. [2] and [3] have nothing to do with immutability. None of them have to do with reinventing the wheel, either, unless you wanted Node to use Maven for package management?⬐ rmrfrmrfAnyone who's acting like this wouldn't happen in Maven is lying. Just look at what happened when CodeHaus went out of business.⬐ OvermindDL1They don't mean immutable as in language form, but immutable packaging system is the general form where you can add but not remove packages to it so as to not break things, which is the common form among most maven/cargo/hunter/etc.../etc... dependency packaging systems. It's generally considered that npm supporting deleting packages was a major mis-design, which became very public when a popular tiny package got deleted, which then broke so many things, so they learned that the hard way instead of learning from the systems that came before (obviously not cargo, but you get the drift ^.^).⬐ minitech> They don't mean immutable as in language form, but immutable packaging systemI know. [2] and [3] have nothing to do with a package repository you can’t delete things from.
This issue is not just a lack of learning from past failures, it's an active issue that is systemic to web development, especially node. Everyone wants to reinvent the wheel rather than supporting a similar, already existing project. I don't know if it's that everyone wants to be the lead on something or if they all lack group skills, but there is no reason we need dozens of similar, partially functional libraries. I can barely, and I mean sooo very barely, get behind the fact that all of these SaaS companies need to create their own versions of frameworks, but it amazes me just how many square wheels there are in the web community. It was one of the major reasons it took me so long to start doing full stack development, just way too many cooks all wanting to make almost the exact same meal, only theirs is superior.⬐ SwizecIt’s because the average dev has less than 5 years of experience according to the StackOverflow survey, and web is the fastest growing field inside software engineering.A large majority of people you’re chiding for not learning from others don’t even realize those other things exist.
⬐ sidllsIgnorance is curable, but requires the cooperation and desire of those who lack to achieve the cure. From my vantage the world of software development seems filled with mediocre individuals who all think of themselves as the John Galt of software.⬐ evilmushroom⬐ jecxjoOr people just enjoy building something that scratches an itch for them?⬐ SwizecOn average most people are average, yes. Most above average people are average in most situations, even.And sometimes it’s just resume building or intellectual curiosity itching.
But it's not even independent green developers, it's everyone. Chai, mocha, jasmine, jest, should, expect, lab...omg do we really need another unit testing library? Sure they are all slightly different but there is no reason they all couldn't be condensed down to one or two libraries. Shall we list all the reactive UI frameworks? Or routing frameworks? Everyone is at fault here.⬐ TeMPOraLI have a feeling that JavaScript, and some other areas of open source, have a popularity contest problem - people building projects not because they're needed or useful, but for that brief moment of Internet fame.⬐ tiben_⬐ SachoI have the same feeling about this. Github "collect the most stars" effect?⬐ pjmlpIt gets worse when instead of your CV, you get hired by startups based on your Internet fame, or waste your private life building a Github (sorry Gitlab) portfolio.Half of those are assertion libraries, not unit testing libraries. What are you comparing this list to? What is the appropriate number of unit testing libraries a language should have? Do you scale that number for community size?⬐ chrisco255Chai is not a testing framework, it's an assertion library compatible with all the other main testing frameworks you mentioned. Yes, we do need experimentation and innovation in testing frameworks. Jest was a real innovation to the space and is particularly awesome for React testing with its snapshots feature. This kind of argument never gets made with anything else. "Why can't we all just stick to the Model T. It's perfect."⬐ jecxjoI am all for experimentation and creating something new. But so many of these projects out there are not forks of current projects, they are complete rewrites. Is this because the new project is vastly different? Nope! That is the issue, they aren't extremely different.The reason this is a problem is because web tech is constantly changing, to the point that so many of these projects end up in the scrap heap far faster than other tech. It causes problems with long term service due to compatibility issues with ever changing dependencies.
⬐ dangerfaceThe Model T is a product, but we are talking about tooling, in that context the same spanner that can fix the Model T can fix the latest Tesla.The car industry probably wouldn't be as big, if you had to learn a new tool for every new car.
That's a good example of the Innovator's Dilemma: the enterprise incumbent is unseated by some "crappy" lightweight solution that is easier to get up to speed and solves enough of the problem. The complexity, accidental and essential, comes later.⬐ weberc2⬐ didibusNevertheless, people who need a lightweight language are always able to find one because there is always a new language at that point in its lifecycle. Further, there are languages like Go which seem to be determined to remain easy to get up and running, and don't seem likely to change anytime soon (for better or worse).⬐ cpeterso⬐ orfIt will be interesting to see if some new language eventually seeks to disrupt Go by "out Go'ing" Go, returning to its approachable roots.(Amusingly, my iPhone autocorrect replaced "roots" with "Russ". Russ Cox is an engineer who works on Go. :)
See: mongodb⬐ mxschumacher⬐ royjacobsnow a publicly listed company with $2.55bn in market cap⬐ orf⬐ StavrosKand that makes it a good database, right?This is pretty much a perfect example. That and dynamic languages, although dynamic languages happened because the happy medium of types is type inference, and previous popular typed languages were too verbose/inflexible.We're definitely reinventing the wheel a lot, though.
Yes, this is exactly what's happening. Existing tools are seen as too complex because people don't seem to be realizing that the complexity is not accidental, but necessary.I'm not trying to say that Java has no accidental complexity of course, I don't want to open that can of worms :)
⬐ jeremyjhWell there is also a problem with people who don't want to say no, and don't want to stop working on their project when it is finished. Adding all the features that appease a different 1% of your user base is what leads to the bloat - and it still is bloat when 90% of your users will never use the functionality that externalizes all kinds of leaky abstractions and other costs onto them. Just because that bloat may happen to your successor as well doesn't make it right in either case.⬐ alirobeWhich complexity is actually necessary? Does it change when you have 400gbit, SSDs, and watchOSes? How about 1TB of core memory? If we aren't wrangling with handling 75 spinning disks connected to a 10mbit network with 13" blurry CRT monitors, perhaps we don't need to discuss the finer points of engineering efficient client/server LOB applications? Perhaps we ought to discuss MDM, RF, ML, and lifestyle impacts.This is all just the process of evolution at play. What seems obvious today wasn't yesterday - this applies to biology, material science, medicine, engineering, art, music, architecture, design, taxi services, marketing, government, politics, and so why shouldn't it be so in computing?
Sidenote: I love the humility of this video. I remember the days when node was first unleashed. I could not have imagined how it has changed the way we all work. It all seemed so obvious from day one, and here we are today. What a brilliant contribution.
I think the truth is even sadder. The new generations hear the complaints of the old. They hear, I hate spring, Java is so bloated, XML hell, etc. So they think damn I don't want to touch Java with a ten-foot pole.That's when they go instead with the newer system, which hasn't existed long enough to have accumulated criticism. Which is backed by enthusiasts still in the honeymoon period.
⬐ grosjona⬐ bartreadYes so true. People should spend more time thinking about the side effects of their actions and speeches before they make them.⬐ PinkMilkshakeNo, people should speak what they believe to be true. The onus is on the listeners to not blindly accept everything they hear as reality.⬐ grosjonaWhen an authority figure says something, listeners are more likely to accept it, even if it is wrong. That's just human nature. So authority figures have an extra duty to think about the effects of what they're saying.They owe their success to these people and so the way that they can pay it back is by using their voice as a tool for improving things.
I think the problem with "bloat-free" is it's a fine ideal until you try to solve any kind of reasonably complex problem, and honestly it starts to creep in even when you're solving something that isn't particularly complex.Here's a concrete example. Your classic node.js or express.js sample app is something fairly simple like a hello world, or an IM server. A more complex sample probably looks something like that venerable nodecellar app from a few years back. In all cases the spiel is, "Hey, look how easy it is to create a web server with node."
Except that I'm looking at my node server source right now - for an honestly fairly simple app containing a handful of pages and a blog - and here's what I have:
- Routing (obviously)
- Cookie and body parsing
- Session management
- MongoDB integration
- Passport.js for authentication with a couple of providers (FB and Twitter)
- File system access
- HTTPS and SPDY/HTTP2 support
- Compression support
- Logging with winston and morgan, including loggly integration
- Referer spam filtering
- Pug templates
- Hexo blog integration
- Path resolution support
- Request validation and redirects
- Static content support
- Stripping headers such as X-Powered-By, and adding other headers such as the all-important X-Clacks-Overhead
- Error handling
There's probably a couple of other items I missed, but you get the idea. It seems like a lot but, as far as I'm concerned, this is express.js app MVP for anything you might want to put into production.
I haven't even mentioned the gulpfile I use to build all this, which targets desktop, mobile, along with embedded versions for a particular mobile app due to launch in the near future, and has become something of a behemoth[1]. Nor have I mentioned that I have Cloudflare to sit in front of this, primarily to deal with the heavy lifting of some of the media files I serve up.
On the face of it, this might feel like "bloat" but it's all necessary to run the application and, like I say, a lot of it is the bare minimum for an MVP web app in node.
[1] Yes, I know I could/should switch to webpack, but gulp works, and switching to webpack "just because" doesn't justify itself with the value it might add.
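For a sense of how the first few items on that list typically wire together, here is a heavily trimmed Express sketch. It assumes the usual middleware packages (express, cookie-parser, compression, morgan) are installed, and it leaves out sessions, auth, MongoDB, templating, and the rest entirely:

    const express = require("express");
    const cookieParser = require("cookie-parser");
    const compression = require("compression");
    const morgan = require("morgan");

    const app = express();

    app.use(morgan("combined"));        // logging
    app.use(compression());             // compression support
    app.use(cookieParser());            // cookie parsing
    app.use(express.json());            // body parsing
    app.disable("x-powered-by");        // strip the X-Powered-By header
    app.use(express.static("public"));  // static content

    // Routing
    app.get("/", (req, res) => res.send("hello"));

    // Error handling
    app.use((err, req, res, next) => res.status(500).send("something broke"));

    app.listen(3000);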
⬐ p2t2p⬐ DanielBMarkhamThis why I love Java so much. Take a look at what Spring Boot does for me:- Routing is done in two lines of code: @Controller @RequestMapping("/myroute")
- Cookie and body parsing - no need to write any code to do that, I just have method parameters and all of the data flies in. Want validation? Only one keyword on a method parameter - @Valid. Custom validators are supported as well.
- Session management. It's just there for me and does the right thing by default. I can replace storages with custom implementations but by default no code is required from me.
- MongoDB integration - Spring Data MongoDB and you only need to define interfaces using naming convention. The code to access the actual database is generated for you.
- Spring Security supports multiple authentication mechanisms and gives you neat DSL to configure it.
- File system access is kind of an obvious thing.
- HTTPS and HTTP2 support provided by Spring MVC as well.
- Compression support - it's just "server.compression.enabled=true" in your config
- Logging - slf4j + logback come with Spring Boot and there are plenty of custom appenders available to put your logs into logstash/splunk/whatever
- Referrer spam filter - not sure about that one but CSRF protection comes OOB and enabled by default.
- Multiple tightly integrated template engines to choose from. Zero configuration code as well.
- Static content comes OOB and enabled by default, just put your stuff into resources/static.
I mean yeah, modern webapp is a complicated thing! So whenever I see somebody trying to do anything "not bloated" it means that I end up writing low level code that has been written multiple times again and again.
The other day I was trying to code a simple thing in Clojure because I love Lisp. Well, it's just embarrassing. I got to a simple page showing stuff from Postgres and the boilerplate/business code ratio was at about 70%. Manually configure the connection pool, manually start it, manually prepare your component systems, manually mention all of the dependencies for components, manually configure the template engine, manually enable static resources support in ring, manually configure and enable session support in ring. Then we come to authentication and don't even try to sell me Friend. EVERYTHING is manual. The only good thing was "environ" which did the right thing but again with "bloated" Spring Boot it comes OOB and I don't need to configure it!
If you don't use something "bloated" it only means that you're writing code yourself, again and again.
⬐ EtienneK⬐ i_made_a_boobooAgreed. Spring Boot is how you do frameworks right.It is opinionated and provides libraries and solutions for almost everything you need to do, BUT it always allows you to use your own if required.
I love Spring Boot and I wish there was something even remotely as good and full featured in other languages.
⬐ i_made_a_boobooI'm switching to Rails after doing years of Spring Boot apps. The bloat of Java is 3x to 5x.⬐ pjmlp⬐ cutlerGood luck running it fast enough.⬐ jrochkind1⬐ cutlerMeh, not really an issue.Spring Boot 2 with Kotlin can be a lot less bloated, especially with the async Reactive Web option which uses Netty.Have you tried Luminus? It comes configured with Buddy for auth though the docs for both Luminus and Buddy could do with some work.⬐ p2t2p⬐ JachI haven't. But I definitely will try to use it, I'm really eager on getting to the same level of productivity in Clojure that I have in Java with Spring. I have a huge hope that Clojure + something like Spring Boot for it could make me even more productive. Some of the stuff that we have in Clojure really is wonderful, hugsql for example.⬐ cutlerYou'll never get anything close to Spring Boot for Clojure. Spring grew out of J2EE and Clojure's culture is diametrically opposite, favouring curated libraries. Clojure's main/(only?) web framework - Luminus - has a very small team behind it, though it does a fine job.What are the barriers to using the Spring stuff through Clojure's interop capabilities?⬐ sidllsNo, Spring Boot is one of the best examples of the worst kind of terrible patterns in the land of Java development. The bloat in that framework is awful, and the gods help you if something goes wrong in the annotations-everywhere code for anything but the most trivial of applications.⬐ p2t2pCould you provide an example of what you consider terrible?⬐ nimchimpskyI agree, but it seems we are in a minority⬐ humbleMouseIf you limit the annotations to only the basics,(controller,config,bean,requestmapping,etc.) - not much can go wrong. It sounds to me like you haven't worked with any large spring boot apps and experienced the stability annotations can provide.⬐ swsieberExcept circular dependencies. I work on an app where a circular dependency failure happens depending on what order spring finds our annotated classes. It made writing a faster bean scanner a little tricky because I had to replicate Springs ordering method.⬐ humbleMouseTo clarify what you said - You wrote a custom "fast bean scanner" and it's not working properly? Or you had to re-write it because spring's bean scanner wasn't working? What version of spring is this?⬐ swsieberAh, the spring bean scanning was working, but startup wasn't. The reason? Our app was apparently very brittle and the mere act of registering beans in the wrong act would cause a circular dependency error during startup.To be more concrete, to the original spring bean scanner, we were passing in a set of package names, which it would scan. Spring registers those bran defininitions in yhe order that it scans the beans. My custom scanner (found and registered all the same beans), broke our app because it wouldn't startup anymore due to a circular dependency error. Once I sorted the bean definitions by the original pack path inputs, that startup error went away.
I think we are 4.x
Extra details: I used the fast -classpath-scanner library. I subclassed the annotation cand date componend scanner class (well, something like that), and rewrote a method to load the resources for the string specified, treating the path a a fqcn, not a package path. Then I could feed that class the output from the fast classpath scanner (which was the list of classes with the annotations). Until I sorted the input by the original package paths, my app wouldn't start. Mind you, the method I overwrote simply created bean definitions. But that ordering difference made all the difference.
I can dig up exact class names if you are curious. The scanner of course didn't replicate all spring bean sear check capabilities - just the ones we were using. But it cut the scan time by 60% (several seconds).
Nevermind the additional complexity of integrating with payment gateways and handling complex authorization requirements and team invitations etc.Even a 'simple' web app is a convoluted mess of shit if you are to run a real world production grade system. I'm so sick of all these 'hello world' toy examples.
⬐ royjacobsIt's interesting that you mention gulp and webpack when those tools too are now considered too complex and set to be usurped by something like Brunch.It's a shame these tools keep being rewritten because there are definitely good ideas in all of them, but for some reason they can't seem to be unified.
⬐ panopticon⬐ mikekchar> It's interesting that you mention gulp and webpack when those tools too are now considered too complex and set to be usurped by something like Brunch.While Webpack is a little dense, it appears to strike the right balance between complexity and customizability (and probably more important for longevity, library buy-in). It doesn't seem like anything on the horizon is going to unseat it anytime soon... certainly not Brunch.
⬐ pknopf⬐ pknopf> While Webpack is a little dense, it appears to strike the right balance between complexityI don't know, I thought the same thing about Browserify.
And now Parcel is here, gaining steam...
> usurped by something like BrunchI've been out of Node for like 6 months, wth happened! I give up!!
⬐ spiralx⬐ yen223Brunch was between Gulp and Webpack, so don't worry! And since Webpack there's only been Rollup and Parcel to consider :)⬐ chrisco255Brunch is not usurping Webpack. But these are tools built on top of Node.js. They're for front end development, and have nothing to do with running a Node.js server on the backend.Somewhere along the line, we as developers have abandoned the Unix philosophy, especially the oft-forgotten second part ("Write programs that do one thing, and one thing well; and write programs that work well together").Without the ability to compose multiple small libraries to form the exact solution that we need, we had little choice but to rely on the One True Framework to solve every problem that we will have.
This means if the One True Framework doesn't serve the exact need you have (and it almost definitely won't, there's a combinatorial number of requirements out there), it's time for a rewrite!
⬐ zzbzq⬐ ArnavionNPM is the closest environment to the Unix philosophy apart from Linux. Lots of small packages instead of a large base class library like all the other language ecosystems.The thing is, even with Linux distros, most of the stuff you want is built-in by the distro. Once you start to add your own stuff, it can get really ugly and you have to be really pro to get anything done. It seems like every time I'm working on updating a Linux image I have to do some really bizarre thing where the package manager doesn't even work right and the instructions or some forum have me doing some mind-blowing workarounds I don't even understand.
So I think you are combining two different topics. I am all-in for libraries over frameworks. But the larger, more heavily curated libraries where you only need minimal customization are just objectively better. Having a large, curated standard library != a framework.
All these tools start off as simple alternatives to the existing bloated tools. Then as they gain more and more features to support real-world situations they end up becoming as bloated as the tools they set out to replace.⬐ royjacobsRight. I'd just hope that at some point that cycle ends and people try to fix existing tools instead of replacing everything wholesale with something new that will eventually fail again.⬐ Moru⬐ mediumdeviationSorry, there is always a new developer thinking "I can do this much better if I just start from scratch, getting rid of all the bloat".Webpack and Gulp were never meant to be simpler alternatives. Gulp used streams instead of serially processing files like Grunt, which made it faster, but obviously streams working in parallel are more complex than just processing the files one by one.Webpack pulls dependencies into one file and deduplicates them. This is obviously even more complex since now you have dependency resolution logic as well as dealing with the various module systems JavaScript has invented.
IMHO, the point of using a no-frills library/framework is that you want to write the rest of it yourself. The advantage of this is that it meets your requirements exactly and is therefore smaller/less complex.When doing a project that takes only a few weeks, I would probably choose a framework that has everything in the box. But if you are building something that is going to be developed over a period of years, the reduction in complexity achieved by building your own can be life saving.
When I look at your list, most of the things fall into the categories of "Pretty easy to implement" or "Don't want at all". However, there is an advantage for not reinventing the wheel if there is no reason to do so. If there is a nice library that gives me what I want and doesn't impose itself too much on the design, I will use it. But the main advantage for not baking it into a big framework is that I can pick and choose what I want.
As an older programmer, I come from an era where libraries and frameworks cost a lot of money. We built stuff by hand because there were not a lot of other choices. These days, though, virtually every library and framework is free software (not only free of charge, but you get source code too!) It's like living in Candy Land, and I'm not about to complain about it :-) However, I think that programmers today reach too quickly for the pre-built and do not understand the long term advantages of bespoke development. Like most things, there is a balance to be maintained.
⬐ menorI don't come from that era but couldn't agree more. Every time one of my coworkers suggests using a library I tell them that is ok as long as they maintain it. I prefer to spend my time coding and like to understand as much of the codebase as I can, instead of having to learn and maintain tens of external libraries. Especially if we only need a couple of functions from that library.I normally lose those debates though, and the thing reaches a point where the complexity of the code makes it impenetrable.
⬐ NoneNone⬐ pknopfThe gist: every dependency you bring into your app should be thought about, really hard.If you're using 10% of the lib, just implement it yourself.
If the lib is critical, like openssl, bring it in. Other people have solved the hard problems for you.
But yeah, it's a balance.
⬐ fnordsenseiUnless you're using something that has competent dead code elimination, like Google Closure Compiler. Then you can go ahead and include 30mb of libs, knowing that only a fraction will actually be shipped in the production version.It's always like this.I'm lucky enough to have been around the industry for a while. I could probably count a dozen or more things that started out "Like X, only without all the BS!" --- only to end up with just as much BS or more than X ever had.
⬐ Moru⬐ spraakYes, this is not only about tech-sector. You see it everywhere there is humans. There is always new people thinking that why do they do everything so stupid? I can do it much better if I do it this way. Sometimes they are right and everyone is happy, sometimes they do it exactly the same way it was done ages ago and found out to be inefficient or fail in some way and they get to hear "There was a reason we did it the other way".I try to explain why we do things the way we do and if they still want to try to change things, I make sure it's easy to go back again in case it fails the usual way.
⬐ eeZah7UxNo, not always. Some communities tend to do their research and web developers have a decades-old track record for not doing that.⬐ aldanor“X for Humans!” ©⬐ DenisMIt's an anthropological effect, not a technological one. A new generation of craftsmen faces a choice between submitting to the rules of the old guard and making up their own rules.Reduced complexity is often a rallying cry, but I think the root of the phenomenon is in trying to find one's own social and professional standing in the situation where all the prominent positions are already taken and what little is left requires years of hard labor (complexity, certification, corporate review system, etc).
If this situation upsets you consider the alternatives, they might be worse.
⬐ DanielBMarkhamAgreed.Tech is ripe for applied group psychology and anthropology. The social, psychological, and anthropological factors are obvious to casual observers -- but completely invisible to the people they affect the most.
There's a reason for it, and perhaps overall it's a good thing....but that still doesn't mean that it can't be accepted and acknowledged as a facet of the community.
⬐ rwallaceThis reminds me of when I first read about fractals: a collection of phenomena I've been staring at all my life, but never really saw until someone pointed out how to see them.Currently popular music mostly sounds like noise to me, but that's not the point. What are the current generation of musicians supposed to do, be silent and spend their lives listening to the great bands of my youth? It's impossible to match Pink Floyd in the style of Pink Floyd. They need a new style.
Facebook is losing traffic to whatever is the latest trend in social media, not really because people are suddenly paranoid about privacy, but because each generation needs a network where the previous generation is not.
And for as long as humans write programs, there will be a need to invent new languages, not because the old languages were technically inadequate, but because each generation of programmers needs a way to escape the shadow of the previous generation, the way acorns need squirrels to carry them away from the shadow of oak trees.
⬐ zippitydoodah68..and this is the problem. The assertions of escape. I don't understand it. I would write prose the same way Jack Vance (did) and Gene Wolfe (does) if I could write. There can be one true way for the expression of logical thought. We can try exceptional dialects but they fail because they cannot encompass every concept we should expect and they cater to the ego and domain.OTOH, the young always have a fresh perspective and they usually have good ideas based on the times. They should be listened to and mentored. Very few active older (1995+) folks left in IT ,after the method management and purposeful purging methods of the last 10 years, to mentor them. Most of us weren't great at teaching anyway. It was a paycheck.
⬐ scnsLove the analogy with the squirrels⬐ DenisMExactly. It's not just music, consider modern and post-modern art. Even earlier art can be seen as a form of protest against the tyranny of the establishment of yore.It's interesting to see which parts of our civilization came down on which side of the divide. The market economy, for example, is a great way for a young enterprising person to find their own footing away from the old (hence startups). Academia OTOH went totally the other way (hence grad school).
I missed that in the presentation. Are you referencing something else when you say that> originally node.js was presented as a bloat-free alternative to "enterprise languages"
?
⬐ nurettinI am surprised that you've only mentioned npm. The current complexity of webpack and the number of frameworks, the language going over multiple ES revisions and type-safe alternatives to the language makes it seem like the eventual complexity is almost unavoidable and simplicity is a marketing fad.⬐ rmrfrmrfMaven is still garbage and 100 times more bloated than npm. It literally has all the same issues npm has, only with less support.⬐ jrochkind1I think this is a story that gets repeated lots of times in our world of open source software dev.1. X is SO bloated and poorly engineered full of bad legacy decisions.
2. We can totally do better let's invent a new thing, Y!
3. Wow, Y is so clean and fast and understandable.
4. But it doesn't do this thing a bunch of people reasonably really need... let's add it.
(repeat 3 and 4 a few hundred times)
5. Y is so bloated and poorly engineered and full of legacy decisions. We can do better! (Go to 1.)
The grass is always greener, but mature complicated software is _usually_ complicated for... reasons.
⬐ manigandhamYes, because the loop never solves the original problem. It's about organization and bloat, not pure speed and leanness.Rebuild from scratch but also recreate all the existing functionality in a much better standard library and finally the chain can be broken. But nobody wants to do that.
⬐ pspeter3⬐ elliusI think Go does this for me for the most part⬐ hshehehjdjdjd⬐ ahansenGo has been out for almost a decade and they’re still working on the package management story.⬐ pjmlpLet alone G* when CLU had them in 1975, but lets not rush.⬐ nojvekI’m super bullish on rust. I feel like it was designed with the right intentions.⬐ algestenI love rust, but the standard libraries are nowhere near the same abstraction level of nodejs.All services I've deployed built on rust pulls in a kitchen sink of deps.
Granted. I get a static binary as my end result, so maybe it's fine.
⬐ jzoch⬐ danpalmerRust is designed that way, to be fair. They expressly did not want to be batteries included like python is. The reasons are what they are and not particularly relevant to the conversation, but pulling in well designed third party crates is the point.⬐ devxpySpeaking from a python user's perspective, the batteries included philosophy works great when you have a neutral implementation. Python does a good enough job, and provides extensibility in a way that I don't need to download a package to do basic things. On the other hand, I have to spend hours trying to find a package in JS that just gets shit done. The third party package way is only required for ui parts because you don't want everything to look generic. But having a good standard library to do non user facing stuff is essential. That's why every node project ends up with a thousand dependencies. Because the language is not batteries included. In JS there is no "one correct and obvious way to do everything" which makes doing basic programming painful.Yep. There are some languages that start out trying to solve fundamental productivity issues in previous languages - some more than others.I think we had a generation of ecosystems with Node, Ruby, Python, that tried to solve the unapproachable systems around the Java/etc ecosystems and make them more open.
They succeeded, but the next generation seems to have been about solving the plethora of tools that came with those languages. Rust, Go, etc, having first-party tools are trying to improve upon that, and yes I think Rust is by far the best implementation I've seen.
I'm interested to see what the next generation is.
I believe this is what Microsoft is trying to do with .NET Core. It's been successful so far, though they aren't at feature parity yet.
⬐ rubber_duckC# is very verbose and tedious compared to more expressive languages - having to deal with CLR types/API at runtime while using a language with very limited expressiveness (C#) is not very productive. It's better than Java if that's what you're aiming at - but the JVM has an incredible ecosystem of stuff that works - much larger than .NET Core, which is not very mature in many areas (we recently had to revert to .NET 4.7 because some encryption method used by a government SOAP service we were talking to wasn't supported).
TypeScript and the JS underneath are actually quite malleable - you can escape static typing at any point and revert to the simple JS object model when things don't map cleanly in the type system - and then still have types at the boundaries - which makes meta-programming trivial in some cases where it would look like a monstrosity in C#.
F# is interesting and has a lot of advantages over C#, but few people seem to be willing to invest the time to pick it up in the .NET community.
So I don't really view .NET Core as a superior alternative. I've worked in JVM land; it's more mature, and while Java sucks, there are other languages on top of it that are decent to use (Kotlin ~ C#, Scala ~ F#).
⬐ manigandham⬐ manigandhamYes, Typescript (also from Microsoft) is fascinating and fantastic at combining the strengths of static typing while still maintaining all the flexibility of dynamic types if necessary; however, it's pretty much the only realistic, non-academic implementation of such a thing, so basically everything else pales in comparison if that's what you're looking for.
Why is C# not expressive? It has the DLR and `dynamic` keyword, which behaves just like JS typing if that's what you want, because it seems like your issue is really with static typing in general. Functional languages are nice, but it seems C# with its slowly and carefully integrated functional extensions is actually more productive for most developers.
⬐ sametmax⬐ QuarrelsomeI'm having a hard time understanding what's fascinating about typescript. I agree it makes JS better. I agree it's a good tool for its purpose.
But "fascinating" ?
It's hardly the most elegant scripting language out there (Ruby, Python, Kotlin and Dart don't have to live with the JS legacy cruft).
It has a very small ecosystem outside of the web.
The syntax is quite verbose for scripting.
It has very few data structures (and an all-in-one one).
Very poor stdlib.
Still inherits important JS warts, like a schizophrenic "this".
Almost no runtime support if you don't transpile it (which means it's hard to debug and needs specific tooling to build).
And it's by no means the only scripting language with good support for typing (e.g. VSCode has great support for Python, including IntelliSense and type checking).
What's so fascinating about it?
What fascinates me is that we are still stuck with a monopoly on JS for the most important platform in the world.
⬐ manigandham⬐ rubber_duckTypescript IS javascript, so of course it inherits all of its problems. The data structures and standard libraries are what you get from JS, nothing more. It's called a programming language but it's more of an extension to JS with a powerful compiler.
The typing system is what is special though, especially in how seamless it is in adding strict types alongside pure dynamic objects, but also allowing you to choose pretty much anything in the middle of that spectrum depending on your definitions.
You can have a few strong-typed properties mixed with others in a generic type that inherits from something else but can only take a few certain shapes. It's unlikely you need all that in most programs but it's the fact that you can do it which makes it great. In fact, the Typescript type system is actually turing complete.
Perhaps this video on Typescript from Build 2018 would help: https://www.youtube.com/watch?v=hDACN-BGvI8
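To make that concrete, here is a minimal TypeScript sketch of the mixing described above (the type names are invented for illustration): a couple of strictly typed properties, an otherwise unchecked object, and a union restricted to a few certain shapes.

    // Hypothetical types, purely for illustration.
    interface KnownFields {
      id: number;   // strictly typed
      name: string;
    }

    // Everything else on the object is allowed but left unchecked.
    type Flexible = KnownFields & { [key: string]: unknown };

    // A union that can only take "a few certain shapes".
    type Shape =
      | { kind: "circle"; radius: number }
      | { kind: "rect"; width: number; height: number };

    const record: Flexible = { id: 1, name: "demo", anythingGoes: true };

    // Narrowing on the tag gives full type safety inside each branch.
    function area(s: Shape): number {
      return s.kind === "circle" ? Math.PI * s.radius ** 2 : s.width * s.height;
    }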
⬐ sametmax> Typescript IS javascript, so of course it inherits all of its problems. The data structures and standard libraries are what you get from JS, nothing more. It's called a programming langauge but its more of an extension to JS with a powerful compiler.That's pretty much my point.
> The typing system is what is special though, especially in how seamless it is in adding strict types alongside pure dynamic objects, but also allowing you to choose pretty much anything in the middle of that spectrum depending on your definitions.
> You can have a few strong-typed properties mixed with others in a generic type that inherits from something else but can only take a few certain shapes. It's unlikely you need all that in most programs but it's the fact that you can do it which makes it great. In fact, the Typescript type system is actually turing complete.
Apparently you haven't read my comment, because I clearly say it's not special. Other languages do it too.
> Perhaps this video on Typescript from Build 2018 would help: https://www.youtube.com/watch?v=hDACN-BGvI8
Perhaps this article would help: https://www.bernat.tech/the-state-of-type-hints-in-python/
Dynamic doesn't behave the same as JS typing - you're still using the CLR object model and typing rules, you're just losing compile-time checks. It gets complicated really fast if you want to do meta programming, even with the DLR, and it's not really ergonomic in C# (casting/boxing primitive types, etc.). Think about AutoMapper and then compare it to a TS solution using the spread operator. How much boilerplate AutoMapper crap do you see in your typical enterprise C# project?
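For reference, the spread-operator style of mapping being contrasted with AutoMapper looks roughly like this in TypeScript (the User/UserDto shapes are made up for the sketch):

    // Hypothetical shapes, purely for illustration.
    interface User { id: number; name: string; passwordHash: string }
    interface UserDto { id: number; name: string; displayName: string }

    // Drop one field, keep the rest, add a computed one -- no mapper library needed.
    const toDto = (u: User): UserDto => {
      const { passwordHash, ...rest } = u;
      return { ...rest, displayName: u.name.toUpperCase() };
    };

    const dto = toDto({ id: 1, name: "ada", passwordHash: "..." });
    // dto is { id: 1, name: "ada", displayName: "ADA" }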
And that's not even touching on functional features - you can't even have top-level functions in C#; it's the "one class per file" dogma plus multiple wasted boilerplate lines and scrolling. I recently rewrote a C# program in F# - I didn't even modify much in terms of semantics (OK, having discriminated unions and pattern matching was a huge win in one case); just by using higher-level operators and grouping stuff, the line count went down to 1/3 and the code was grouped into logical modules. I could read one module as a unit and understand it in its context instead of having to browse 20 definition files with random 5-line type definitions. I could achieve similar improvements by rewriting to TS or Python.
C# adds overhead all over the place; people are just so used to it that they don't even see it as useless overhead but as inherent problems they need to solve - like, how many of the popular enterprise patterns are workarounds for language limitations?
When I bring this up people just assume I'm lazy about writing code - but I don't really care about writing the code out; tools mostly generate the boilerplate anyway. Having to read through that noise is such a productivity drain, because instead of focusing on the issue at hand I'm focusing on filtering out the bloat from the codebase.
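The closest TypeScript analogue to the discriminated unions and pattern matching mentioned above is a tagged union with an exhaustive switch; a small sketch with invented types:

    // Each variant carries a tag, roughly like an F# discriminated union.
    type PaymentResult =
      | { kind: "approved"; transactionId: string }
      | { kind: "declined"; reason: string }
      | { kind: "error"; retryable: boolean };

    // Switching on the tag narrows the type in each branch, similar to a match expression.
    function describe(r: PaymentResult): string {
      switch (r.kind) {
        case "approved": return `ok: ${r.transactionId}`;
        case "declined": return `declined: ${r.reason}`;
        case "error":    return r.retryable ? "retry later" : "give up";
      }
    }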
⬐ ChicagoDaveThis sounds like a personal preference for dynamic vs. strongly typed. I could rewrite your entire comment in reverse about how I find C# highly expressive and readable while dynamic languages or Kotlin (blech) are a mess of inconsistent whack-a-doodle experimentation.
But my opinion is useless.
The value in any platform is productivity and if any given team can be productive, it doesn't matter if it's COBOL, RPG-3, Pascal, BASIC, or a functional language like F# or plain old JavaScript.
⬐ rubber_duckActually, I like static typing - I mentioned I rewrote a project in F# in about 1/3 of the code of the C# solution. It's more that C# is static typing done poorly, IMO - a relatively limited type system that adds overhead compared to dynamic languages or more expressive static languages.
> using a language with _very_ limited expressiveness (C#) is not very productive.
o_0. Think you need to check yourself, mate.
I believe the productiveness of more "expressive" languages tends to be undermined by the loss of productivity that occurs when you're compelled to write blog posts or comment on Hacker News about how amazingly productive and expressive your language is.
⬐ btschaegg...as if we wouldn't be writing comments on HN anyway. ;-)
⬐ rubber_duckI can do like 3-4 hours of productive work a day realistically - after that I lose focus. I can push this in some periods, but that's the amount of time I limit myself to in order to stay functional over the long term. If I need to waste that time sifting through boilerplate then I'm pretty upset, because I get less shit done in that time window.
Chatting on forums is a casual brain teaser and keeping up to date on industry stuff.
⬐ QuarrelsomeI would suggest using more personal language when expressing personal opinion and toning down the force (very).
> [I find that] using a language with limited expressiveness (C#) is not very productive for me.
Like, I'd figure you can be mad productive in any language (even COBOL?) although I'm only completely cosy in a couple. There's no need to be so dismissive of the tools that others use.
⬐ weberc2I don't think "expressiveness" or "boilerplate" are the things that slow me down. I use Go, and I find that it is both expressive and a little verbose, but it's still very simple and there's usually one clear way to do things, so I find that I can move a fair bit faster than I can in C# _or_ Haskell (in the latter case, it might just be that Haskell has a huge learning curve and I'm nowhere near over it).
Yes, arguably the .NET Framework almost did it and is still one of the most productive frameworks available, but .NET Core has definitely improved things substantially. It's fast, well-designed, and full-featured, and I expect usage to pick up greatly.
⬐ zzbzqSo far the package manager hell has been kept in check because they keep re-doing everything in such a way that you don't intermingle it. So when you're on MVC5 you're on MVC5, and when you're on AspNetCore, you're on that. You're not using v5 of library X and v6 of library Y. Likewise the startup and DI stuff has all fully rebooted twice in a few years. But nonetheless, some of that sort of package hell has already seeped in, where you're using different packages that depend on different versions of some underlying thing with breaking changes. I think the choices are either keep rebooting everything or stop making new stuff.
⬐ manigandham⬐ lewisincYes, but it's rapidly getting better with .NET Standard combining all the libraries into a single definition that can be used on any framework implementation. MVC5 was never released though, and the changes have been rather minimal from ASP.NET Core v1 to v2, with straightforward migration guides, so it might look messier than it actually is if you were working through all the previews and release candidates instead.
Nevertheless, Microsoft has a long history of having messy v1.0 with most of the stability coming after v2.0, so you can consider the foundation pretty stable now that it's on v2.1 and more.
Can I use F# with it? Because I would love to learn me some F# some day.
⬐ manigandham⬐ poilcnF# with .NET Core? Yes, it works fine. There are some challenges coming up with design changes to the compiler and C# that might overlap what F# already has, but it'll get sorted out.
⬐ vizzierF# support on .NET Core has been pretty great from the initial stages, from my basic usage. The biggest challenge seemed to be around type providers (F#'s system for generating strongly typed classes from dynamic data such as XML, CSVs, HTTP, etc.), but that's largely resolved. More info at https://github.com/fsprojects/FSharp.TypeProviders.SDK
Great resources for getting started with F# at https://fsharp.org/
My personal preference is generally to install the SDK and use http://ionide.io/ with VS Code, as it seems to work most reliably cross-platform.
⬐ davidgrenierI'd be very much interested in how anyone is using F# on Linux without Mono. I have .NET Core, but the whole thing seems to require Mono and it isn't clear from fsharp.org that you can do without.
⬐ workintheheadMono is not required: https://docs.microsoft.com/en-us/dotnet/fsharp/get-started/g...
What I don't like about Microsoft's frameworks is that they've made lots of things multiple times in slightly different variations, like they always do with all their software (10 variations of each type of program, which were outdated before they were finished). Mostly it exists for historical reasons, but it only underlines the problem of a multi-billion corporation having the design skills of a sophomore. They redo and redo things, bloating their frameworks and increasing their number, and you have to guess which CookieContainer you should use this time. It makes me understand why language designers like the Rust developers insist on a small core library. It's better to have one separate library that does everything regarding cookie management (and you can control its functionality by including additional traits from it) than to have incompatible variations of it in the standard library and in each framework.
⬐ pjmlpAndroid Frameworks will make you love .NET variations.
This was a really funny experience for me as a self-taught guy going in the other direction. I started with Node in my spare time, and when I finally got a professional coding job my first project involved Java and Maven. I was kind of dreading it due to Java's reputation as this big bloaty terrible enterprise language, but once I actually got started I was like, "Man, this type safety thing and opinionated build tool thing etc. etc. are really nice." By no means is it (or any language) perfect, but a lot of the criticism suddenly seemed really overblown.
⬐ hyperpalliumCurrently happening with JSON (instead of XML, instead of CORBA), despite the brutality of ripping out comments to keep it simple. We now have JSON Schema, soon JSLT, JSON namespaces, etc.
To be fair, it's not impossible for some improvement to occur in this process.
⬐ stcredzeroExactly! "Originally X was presented as a bloat-free alternative to 'enterprise languages'."
For a while, "Burn the Diskpacks!" was a battle cry of the Squeak Smalltalk community. That sort of policy fights bloat, but leaves old users in the lurch. I think that we are now at the point where a language/environment can trim bloat while not abandoning old users. If the language has enough annotation, and has the infrastructure for powerful syntactic transformation tools, then basic library revisions can be accompanied by automated source rewriting tools. We were pretty close to it in Smalltalk, without the annotations.
⬐ redavniThis is why it's so important to have good leadership. For example, Linus Torvalds.
⬐ osrecBut we can learn from prior mistakes in each iteration and spring-clean the software logic. I know it's a lot of effort to seemingly reinvent the wheel each time, but I like to think it does yield some benefit in terms of efficiency and cleaner logic.
⬐ daxfohlWhich really isn't so bad. Y eventually goes corporate, and is still presumably better than X, having learned from its mistakes. But for those who hate the new bloat, along comes Z and the cycle repeats itself. Chicka Chicka Boom Boom.
⬐ weberc2Yes, but not all applications need that complexity either, so slim, simple tools are often really useful.
⬐ pknopfHow do you break the loop?
⬐ josho⬐ edgartaorI don't think the loop is necessarily bad; it shows progress. Think about Java: it solved a class of problems that C was unable to address (e.g. unsafe memory, native threads), thus enabling a new class of programs. But the new class of programs created opportunities for new platforms to solve, with the benefit of a clean slate and a fresh design having learned from past successes and failures.
⬐ megaman22⬐ oh-kumudoI'm increasingly skeptical. Maybe we move ahead a few inches each cycle, but it's starting to look distressingly like each generation of programmers has to learn all the lessons that their greybeard predecessors learned the hard way. Then, when they've achieved some level of enlightenment, the next batch of bright-eyed whippersnappers comes along to rinse and repeat.
⬐ j45It's interesting to place today's techs on the Java maturation timeline - each became what they thought they hated, but realized it may have existed for some necessary reasons.
New platforms bring exciting and meaningful evolution, often at the cost of what techs like .NET and Java have a few decades' advantage in. It's also interesting to see what Java devs are innovating with themselves; Scala and Kotlin both have good things happening.
Maybe using one large, inter-syntax friendly world like the JVM will help.
When experience is overlooked for youth, we relearn and reimplement the same libraries repeatedly in every new tech to feed some developers' need to build temples to their greatness.
Still, Fitzgerald's quote comes to mind... "So we beat on, boats against the current, borne back ceaselessly into the past." And technology is held back by reinventing the wheel.
⬐ megaman22The biggest problem I see is the weird hole circa 2006 that aligns with Sun selling to Oracle, which kind of still-birthed Java as the next great language. That hole I can credit with giving C# the advantage in that tight niche, and stalling the development of the JVM platform in general.
By the time the rust on JVM improvements was dusted off, all initiative was lost. Java was playing catch-up to the competition.
⬐ j45As we're seeing with WhatsApp, guardianship and supporting the direction of a project isn't easy. I'm not sure where Java would have ended up if someone else had taken it.
⬐ pjmlp⬐ pjmlpAdditionally, Oracle haters seem to forget Oracle was one of the very first companies to get into bed with Sun regarding Java, with their whole Java-based terminals idea and porting all their Oracle Database GUIs to Swing.
On the other hand, Oracle has probably developed Java much further and kept Maxine around, making it into Graal. IBM gave up on the first counter proposal; Red Hat and Google didn't bother to rescue Sun.
So we might even have been left with either Java 6 or being forced to port our applications.
I don't think we have to. The loop, as it might seem, doesn't mean there is no progress made in between.
⬐ jrochkind1I think you gotta have a good understanding of the domain and use cases you want to hit (which is really hard, especially so when it's a general purpose programming language whose domain is... everything), and design from the start with a vision of hitting those use cases, instead of having to shoe-horn them in later.Of course, use cases will still evolve, and your initial understanding is always flawed, there's no magic bullet, designing general purpose software (or a language or platform!) meant to hit a wide swath of use cases flexibly is _hard_.
And then, yeah, like others have said, you need skilled, experienced, and strong leadership. You need someone (or a small group of people) who can say 'no' or 'not yet' or 'not like this' to features -- but also who can accurately assess what features _do_ need to be there to make the thing effective. And architects/designers who can design the lower levels of abstraction solidly to allow expected use cases to be built on top of them cleanly.
But yeah, no magic bullet, it's just _hard_.
As developer-consumers of software, we have to realize that _maturity_ is something to be valued, and a trade-off for immature but "clean" is not always the right trade-off -- and not to assume that the immature new shiny "clean" thing will _necessarily_ evolve to do everything you want and be able to stay that way. (On the other hand, just cause something is _old_ doesn't _always_ mean it's actually _mature_ or effective. Some old things do need to be put down). But resist "grass is always greener" evaluation that focuses on what you _don't_ like about the original thing (it's the pain points that come to our attention), forgetting to take into account what it is doing for you effectively.
⬐ stcredzeroRefactor and trim the bloat from the basic libraries, but have a policy where bulletproof automated source rewriting tools are provided in those cases. Perhaps this isn't possible with Javascript, but it might be possible with other languages.
⬐ pc86⬐ ppeetteerrIf you think anyone has "bulletproof automated source rewriting tools" I've got a bridge to sell you.
⬐ jrochkind1What's the bridge, how much, and is it what everyone else is using and/or the next big thing?
⬐ stcredzero> If you think anyone has "bulletproof automated source rewriting tools" I've got a bridge to sell you.
I've used an excellent one: the Refactoring Browser parser engine in Smalltalk. I've used it to eliminate 2500 of 5000 lambdas used in an in-house ORM with zero errors -- all in a single change request. (Programmers were putting business logic into those lambdas.) Like any power tool, it's not stupid-proof. However, it gives you the full syntactic descriptive power of the language. So if you can distinguish a code rewrite with 100% confidence according to syntactic rules, then you can automate that rewrite with 100% confidence.
Here's where it can go wrong: if your language is too large and complicated, there's a probability you'll run into a corner case that will trip you up. Also, it will always be possible for a given codebase to create something which is too hard to distinguish, even at runtime. (You can embed arbitrary code in a Refactoring Browser Rewrite transformation, so you can even make runtime determinations.)
"Bulletproof" isn't "invulnerable." A vest with an AR500 plate will stop certain bullets hitting you in certain places. It won't protect you from being stabbed in the eye or stepping on a landmine. Despite that, it is still a useful tool.
You have strong leadership that makes good decisions.
⬐ flukusI think vi -> vim -> neovim shows a pretty good model. Neovim is an effort to modernize and remove cruft from vim, so they get to keep all the good parts and throw out the backward compatibility. If it works out it can eventually replace vim, not too different from what vim did to vi.
I'd like to see similar stuff done to much of the GNU tools. Make, for instance, has to worry about backward compatibility and POSIX compliance, which makes it hard to progress. As of today there have been about 12,000 attempts to replace it with something else and I find all of them inferior for one reason or another; they've all reinvented the wheel poorly. If someone had taken the fork-and-modernize approach we might have something better by now.
It doesn't even have to be a "hostile" fork. The same can be done by the developers of the existing tools.
⬐ mirekrusinA text editor and a programming language are slightly different things; the backward compatibility story is completely different.
⬐ hjekNot when the text editor includes a programming language (Vimscript). And backwards compatibility of plugins is a big issue. Obligatory xkcd reference: https://xkcd.com/927/
⬐ BigJonoNot everything that gets added is stuff that people reasonably need, either. If you cater to everyone's needs, then you'll end up with 10 solutions for the same problem, because every one of your users has their preferred one.
I think Javascript suffers from this quite a bit. ES6 "classes" should never have made it in, for example. Not only did they add an extra level of abstraction for beginners to learn, but the only reason for doing so was "my code doesn't look like it does in other languages"...
⬐ tripzilch> I think Javascript suffers from this quite a bit. ES6 "classes" should never have made it in, for example.
Well, I'm really happy that the class statement got into ES6, though.
Before that, whenever I needed something class-like, I had to search how do you do this in Javascript, and find five different answers, no but really which is the proper one for prototype-based inheritance, waste an hour (or more) and frankly I still don't know, there's just so many ways you can do it, which is the RIGHT one? and Javascript wraps in on itself in so many cool ways, but there were never any definitive answers, just more rabbit holes.
Now, there is the class-statement, and it's one less rabbit hole to get trapped in. I actually get more stuff done now that there is one right way to define a class, or class-like object with a constructor, properties, methods, etc.
Similar thing goes for the function arrow notation. Javascript, and the event-based environments it usually operates in, wants you to use anonymous functions a lot. But the relatively verbose way to define them, still held me back from using them freely as much as I wanted, trying to "optimize" them away if possible.
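For anyone who hasn't used them yet, the two features being described boil down to something like this small sketch (in TypeScript, with a made-up Counter class):

    // The "one right way" to define a class-like object since ES6.
    class Counter {
      count: number;
      constructor(start = 0) {
        this.count = start;
      }
      increment() {
        this.count += 1;
      }
    }

    // Arrow functions keep callbacks terse and preserve the lexical `this`.
    const c = new Counter();
    ["a", "b", "c"].forEach(() => c.increment());
    console.log(c.count); // 3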
⬐ jcadamYea, I don't use the 'class' keyword in ES6 at all (but then I keep to a fairly functional style in my JS). Modules and lambdas are the killer features in ES6 as far as I'm concerned.
⬐ mieseratte> but the only reason for doing so was "my code doesn't look like it does in other languages"...
As much as I beat the FP drum these days at work, I find the class syntax a much nicer way of organizing solutions to certain, pardon the pun, classes of problems.
Whether or not you find this to be semantic diabetes is a matter of taste, I suppose. I'm curious what, specifically, you find to be the major issue that makes you say they should have been left out.
⬐ nojvekES6 classes were such a relief compared to the prototype bloat you had to write. I love syntactic sugar that makes my life easier. ES6+ flavors of JS & Typescript really made me take web programming seriously again.
⬐ noir_lord⬐ shadosAgreed, classes with TypeScript and Vue make sense to me in an SFC approach. Something about components just fits the class model well.
⬐ workintheheadOr, you know, you could actually learn the language you use. Prototypes are not nearly as bloated as classes, and you usually don't even need them. OO is not the one true paradigm of coding.
⬐ wow-botalso agree, would not go back. TypeScript is savage
⬐ elyoboI loved the classes at first, but as I've got more into it I've found that classes and prototypes are redundant. Closures let me maintain all of the state that I need without the extra boilerplate.
There are very few things JS classes can do that ES6 modules/named exports along with closures returning plain objects can't do better, in terms of code organization, isolation, and extensibility. For the 6 times a year where I need an actual class (it does happen!), I can write the prototype code.
The main issue with adding classes is that they're very, very complex if you want to make them useful. The initial version was pretty harmless, but it was also almost pointless. Now they need to backfill all of the missing features (e.g. private fields), which brings in an enormous amount of complexity. Most of the time, if I need private fields, I can just use symbols (not quite private, but close), or I can do
    function () {
      let private = 123;  // captured by the closure, not visible from outside
      return {
        // use `private` in functions here
      };
    }
It adds very little (there ARE things classes are better at) compared to the insane amount of work that has to be put into the language to get it all working. Decorators are in a similar boat, where many decorator usages can be expressed just as easily with a higher order function, so adding the extra syntax is just bloat.
The cost isn't worth the reward.
The biggest thing classes give us that is very difficult to replace in vanilla javascript is a semantic construct that is easy to statically analyze. In the typed flavors (Flow, TypeScript), I can analyze the interface of plain objects, but not in vanilla JS. That's part of why React using ES6 classes can be useful.
That's a great benefit, but I'm not sure it's worth the trouble.
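For reference, the Symbol trick mentioned a few comments up ("not quite private, but close") looks roughly like this sketch; Counter is a made-up example:

    // A symbol key never shows up in Object.keys and can't be guessed as a string,
    // though Object.getOwnPropertySymbols can still reach it -- hence "almost" private.
    const count = Symbol("count");

    class Counter {
      [count] = 0;
      increment() {
        this[count] += 1;
      }
      get value() {
        return this[count];
      }
    }

    const c = new Counter();
    c.increment();
    console.log(c.value);        // 1
    console.log(Object.keys(c)); // [] -- the symbol-keyed field is hidden from enumeration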
⬐ joquarky> In the typed flavors (Flow, TypeScript), I can analyze the interface of plain objects, but not in vanilla JS.
I've found that developing JavaScript in an IDE that reads and validates type information from JSDoc allows me to introduce strong typing while maintaining the flexibility and simplicity of vanilla JavaScript, without getting as bogged down as I get with Typescript.
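A tiny sketch of that workflow: plain JavaScript at runtime, with the types expressed in JSDoc comments that an editor (or tsc with checkJs) can validate. The function itself is made up for the example.

    // @ts-check

    /**
     * @param {number} price
     * @param {number} [taxRate] optional, defaults to 0.2
     * @returns {number}
     */
    function withTax(price, taxRate = 0.2) {
      return price * (1 + taxRate);
    }

    withTax(100);       // fine
    // withTax("100");  // the checker would flag this, yet the file stays vanilla JS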
⬐ shados⬐ styfleFor sure, though my comment referred more to analysing code for things like automatic transformations at scale (like codemods) than during development. Like, figuring out that a stateless function component is a component is hard. You can do almost everything with JSDoc comments in Flow and TS, of course. It's awesome.
> There's very few things JS classes can do that ES6 modules/named exports along with closures returning plain objects can't do better
That's a really good point. I was about to disagree with you, but then I created a thought experiment.
Thought Experiment:
I wonder what the JS landscape would look like if ES6 Modules had been introduced as part of ES5 about 8 years ago. I could definitely see how that would make classes far less appealing if we already had a great module system (sure, CJS existed, but browsers didn't support it).
Looking at the timeline of when these features were implemented in all major browsers:
* ES6 Class[0]: implemented 2.5 years ago
* ES6 Modules[1]: implemented 1 month ago
[0]: https://caniuse.com/#feat=es6-class [1]: https://caniuse.com/#feat=es6-module
⬐ shadosAnd ES6 classes were designed long before they were implemented, around the time people were making class hierarchies with Backbone and AMD modules were the future. The rise of Java and the OOP revolution isn't that far behind us (2 decades seems like a lot in the tech world, but it's still within a single generation of humans).