716

January 12th, 2024 · #nodejs #performance #optimization

JS Perf Wins & New Node.js Features with Yagiz Nizipli

Yagiz Nizipli discusses Node.js performance improvements he has contributed, optimization techniques, complexities around URLs, and factors enabling future TypeScript support.

Topic 0 00:00

Transcript

Announcer

I sure hope you're hungry.

Announcer

Oh, I'm starving.

Announcer

Wash those hands, pull up a chair, and secure that feed bag, because it's time to listen to Scott Tolinski and Wes Bos attempt to use human language to converse with and pick the brains of other developers. I thought there was gonna be food, so buckle up and grab that old handle because this ride is going to get wild.

Announcer

This is the Syntax

Scott Tolinski

Friday supper club. We're gonna be talking to Yagiz Nizipli, all about Node.js performance, all kinds of stuff.

Scott Tolinski

And Yagiz has been doing a ton of really amazing things. He's been a voting member of the Node.js technical steering committee, a voting member of the OpenJS Foundation Cross Project Council, the founder of the Node.js performance team, the author and maintainer of fast-querystring, 150-plus commits on Node.js, and he's the reason why we have dotenv support in Node.js.

Scott Tolinski

So welcome to the show.

Guest 2

How's it going? What's up? How are you doing? Good. Good. Yeah. Thank you for inviting me and hosting me today. It's good. I was just getting used to my new job at Sentry, so, mostly learning new stuff. How's everything with you? Yeah. Going great over here. And I've been seeing on Twitter your

Scott Tolinski

PRs you're submitting. It seems like you hit the ground running over there, doing

Guest 2

some pretty major PRs right out of the gate, so it's pretty incredible. The easiest thing that an engineer could do is to replace or refactor or rewrite something. So that's what I'm basically doing: replacing old tools with new tools so that we have the performance boost and maybe save some money while doing that. So Nice. Nice. Oh, that's awesome. Yeah. We're gonna talk about,

Wes Bos

I think kind of like 2 main things today, and then we'll see where that goes.

Wes Bos

First, I want to talk about, like, performance and how you figure out how things are slow and how you figure out how to make things faster. As well, I wanna talk a lot about just Node.js in general. Like, maybe you can explain to us your involvement in Node.js, but we know you've been a founding member of the perf team, and you're pretty heavily involved. As well, we have some pretty exciting announcements of stuff coming to Node that you're gonna leak to the audience. Right? Yep.

Topic 1 02:33

Yagiz has been involved with Node.js performance

Guest 2

That's true. So what's my involvement with Node.js? Right now, I'm a Node.js technical steering committee member.

Topic 2 02:39

Yagiz focuses on Node.js performance improvements

Guest 2

And on top of that, around last December, a year ago, I founded the Node.js performance team, with the focus of improving Node.js performance in the crucial parts.

Guest 2

Up until September or August 2023, I was the performance strategic initiative champion, and I resigned from that 2, 3 months ago.

Guest 2

But, yeah, I mostly work on performance. And if I have some time and the tasks are noncontroversial, I try to add new features, but I mostly work on, like,

Wes Bos

performance stuff. Awesome. And let's talk about that real quickly. Like, you implemented dotenv support into Node.js, and that was, like, so well received by the community. It was one of those, like, yes, thank you. And it feels like Node lately has been saying, you know what, maybe we should add some of these features that people are doing. Like, I feel like a long time ago, it was like, no, this is part of the community, the tooling. You can install a package to do that type of thing.

Topic 3 03:27

Common Node.js optimizations involve reducing serialization costs between JS and C++

Wes Bos

And recently, I feel like Node has switched to saying, you know what, maybe we should make the DX a little bit better and bake a few things in, like .env. Has there been a shift in Node at all for that? Yeah. Yeah. So

Guest 2

In the past, so, I started contributing to Node a year and a half ago, maybe around 2 years ago.

Guest 2

So I didn't see any specific shift from my side. But what I could say is that, because Node is an open source project, it's maintained by random people around the globe.

Guest 2

And because it runs on democracy, like, people can either approve or reject any changes.

Topic 4 04:43

Node.js is shifting towards including more developer experience features like dotenv

Guest 2

Yeah. I think the general opinion of Node core collaborators has changed, and that's why we're seeing those kinds of features right now.

Guest 2

And on top of that, we have attracted more C++ developers in the past couple of months.

Guest 2

And as a result of that, we have these drastic changes, because of the performance gains and the ease of implementing those kinds of features.

Guest 2

But, yeah, dotenv is one of those features that I developed and got really positive feedback on.

Guest 2

So I initially started contributing to Node because I rewrote the URL parser, which had much more impact than the .env file support. But because it wasn't open and it was not seen

Wes Bos

or easily noticeable by engineers, it didn't get that much attention. The URL parser is the, like, new URL, and then you get everything: the port, the path name, the search params, all of that. And initially, I was like, well, isn't that part of, that's not part of V8. That's a browser API, right, that's been implemented in Node? Yes.

Guest 2

So that's a Chromium implementation.

Guest 2

And because we don't have Chromium and we have V8, we basically implemented it with the help of Daniel Lemire, a professor from Canada, and released a library called Ada, which is named after my daughter.

Topic 5 05:58

The URL parser is a complex 20,000 line state machine implemented in C++

Guest 2

So yeah.

Guest 2

So every time you call new URL, it actually runs 20,000 lines of C++ code.

Guest 2

So it's not as simple as that, and the impact is extremely huge, because, like, if you run node index.js and the file doesn't even have anything in it, it actually creates new URL 5 times.

Guest 2

If you run a fetch, it creates it 5 more times.

Guest 2

If you import any modules in ESM, it initializes more than I can count.

Guest 2

So it's everywhere, and we just don't know it, because there isn't anybody like me to tell you that.

Guest 2

Those kinds of small things, they're extremely impactful for performance. And Holy smokes. That death by a thousand cuts is the correct

Wes Bos

explanation of the whole situation. What kind of, like, sick person writes 20,000 lines of C++ for URLs?

Scott Tolinski

What could it possibly be doing in all of those lines?

Guest 2

Like, what is it doing? Yeah. So there's a group called WHATWG, which is the Web Hypertext Application Technology Working Group. So it's, like, Safari, Chrome, Firefox developers. They come together, and they release this URL specification.

Topic 6 07:17

The URL specification is very complex with thousands of edge cases

Guest 2

It basically is a state machine, but a really glorified and extremely unperformant state machine. And in order to make that performant, we use the most up-to-date technologies and methodologies in order to speed things up. That's why we have this change.

Guest 2

So yeah. So I can give you a quick example. If you have https://google.com, it goes character by character, iterating through the whole string.

Guest 2

When it sees a colon, at h t t p s colon, it says that, okay, the protocol state has ended. So right now, I can have an authority state, a path state, a file path state, or so on and so forth. So in that scenario, it checks for an at character, because if you have username, colon, password, at, localhost, for example, that username is the username and the password is the password. So it checks for the at character, but if it couldn't find it, it goes back to the colon, and then it sees that there's a two-slash character.

Guest 2

This means that right now, we are in the host name state. So it goes through the host name state until it sees a colon, which is the starting point of the port, or a slash, which is the starting point of the path name, or a question mark, which is the starting point of the query, or a hash character, which is the starting point of the fragment. It differentiates and goes through all of those layers. And each of those states has different encoding and decoding parameters.

Guest 2

This means that if you have a space character in a query parameter, it's translated into percent 20.
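Those states map directly onto the properties that new URL exposes. A quick sketch of what's being described (the host, path, and query values here are made up for illustration):

```javascript
// One parse walks all the states: protocol, credentials, host, port,
// path, query, and fragment each come out as a separate property.
const url = new URL('https://user:pass@localhost:8080/docs/page?q=syntax fm#top');

console.log(url.protocol); // 'https:'
console.log(url.username); // 'user'
console.log(url.password); // 'pass'
console.log(url.hostname); // 'localhost'
console.log(url.port);     // '8080'
console.log(url.pathname); // '/docs/page'
console.log(url.search);   // '?q=syntax%20fm'  (the space becomes percent 20)
console.log(url.hash);     // '#top'
```

Note how the space in the query was percent-encoded during parsing, exactly as described above.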

Guest 2

This is because a lot of people on Google Chrome, they go into the toolbar, and instead of writing a URL, they're writing something with a space, like "syntax fm", and that needs to be translated into something that's sanitized.

Wes Bos

Just all kinds of things. I don't think people realize how complicated URLs can be.

Wes Bos

The URL is the original state machine, or it's the original state manager. Right? You can put so much in it, and probably every developer in their life has just said, I'm just gonna split it on the question mark and then split it on the equals and loop over it. And, oh, I've got the query params. And then you realize, oh, there's 7,000,000 edge cases. Is there, like, a test suite for all the different possible URLs, and do you know how many tests are in there? Yeah. So
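The naive split-on-the-question-mark approach Wes describes really does fall over on ordinary URLs. A small sketch (the example URL is hypothetical):

```javascript
// The naive "split on ? and =" parser versus the real one.
function naiveParams(url) {
  const query = url.split('?')[1] ?? '';
  return Object.fromEntries(query.split('&').map((pair) => pair.split('=')));
}

const url = 'https://example.com/?greeting=hello%20world&tags=a&tags=b#frag';

// The naive version leaves values encoded, drops repeated keys,
// and swallows the fragment into the last value:
console.log(naiveParams(url));
// { greeting: 'hello%20world', tags: 'b#frag' }

// URLSearchParams handles decoding, repeats, and the fragment correctly:
const params = new URL(url).searchParams;
console.log(params.get('greeting')); // 'hello world'
console.log(params.getAll('tags'));  // [ 'a', 'b' ]
```

Two of the "7 million edge cases" right there, in a perfectly mundane URL.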

Guest 2

Just yesterday, I was invited to the web-platform-tests organization, which is the organization that maintains the tests across all browsers.

Guest 2

Yeah. So, the URL has around maybe 3,000 or 2,000 tests.

Guest 2

Wow. The Ada URL parser added more tests on top of that, which influence other libraries as well, including Boost. For example, Boost ported all of our Ada tests to their codebase, and they found 6 different bugs in their code by just porting our tests. So Wow. It's an extremely complex and underrated part of programming that we all accept as a cost that we always have to pay. Mhmm. Yes. But it's not. So you mentioned that

Scott Tolinski

those kinds of things might get overlooked.

Topic 7 11:05

Finding performance wins in Node.js by optimizing frequently used fundamental parts

Scott Tolinski

Right? How do you go about finding those types of performance issues, or performance areas where you could even begin to start figuring out, you know, untangling the web of what makes it faster? So if you ask this to,

Guest 2

like, 100 different engineers, 90 or 80% of them will say, I will just run a profiler, and I will just look into the flame graph and see what part takes the most time, and I will just try to optimize that.

Guest 2

But personally, I'm more interested in having much more impact than the flame graph that I'm currently looking at. This means that I like to look at the code that no one dares to look into, the code that they accept because it's working, it's okay. So this is how I started working on URL: because it looked like a trivial task, I also assumed, as others did, that it was really easy.

Guest 2

But it resulted in writing 3 or 4 different implementations.

Guest 2

So, yeah, I basically select what is the most fundamental thing that runs in every part of the Node codebase. That might be URL, or file system operations, or the query string parser, or those kinds of things that we always have to run, and I try to optimize that. And, yeah, that's what I did. And that's typically done in C++. Right? So for anyone

Wes Bos

who doesn't know, Node.js is not written in JavaScript. It's written primarily in C++, and some of the APIs are written in JavaScript. And part of what you do is say, okay, this API is slow because it's been implemented via Node libraries in JavaScript. So you can then take a library that's been implemented in JavaScript and just rewrite it to be native, meaning it runs in C++? Yeah. So the thing is,

Guest 2

most of the non-crucial APIs, not unimportant, but not crucial performance-wise, are written in JavaScript.

Guest 2

And most of the performance-critical tasks are written in C++. And whenever you call a C++ function from JavaScript, there's this cost that you always have to pay, which is the serialization of the data that you are passing.

Guest 2

If it's a string, if it's a really complex data structure, we always pay for this cost. It's like calling JSON stringify, converting it to a string, then passing the string to somewhere else, and calling JSON parse. That cost is a must.
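The JSON round trip he mentions is an analogy, not how Node's bindings actually work, but it makes the cost concrete. A sketch, with a stand-in for the native function:

```javascript
// Illustrative only: Node's real JS<->C++ boundary does not use JSON,
// but the shape of the cost is the same: serialize in, deserialize out.
function crossBoundary(payload, nativeFn) {
  const wire = JSON.stringify(payload); // pay the serialization cost going in
  const result = nativeFn(wire);        // the "native" side works on the string
  return JSON.parse(result);            // pay the deserialization cost coming back
}

const echo = (wire) => wire; // hypothetical stand-in for a C++ function
const out = crossBoundary({ path: '/tmp/file', flags: ['r'] }, echo);
console.log(out); // { path: '/tmp/file', flags: [ 'r' ] }
```

Every call pays both conversions, which is why crossing the boundary less often can matter more than what happens on either side.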

Guest 2

And I basically try to find what is the most optimum way of paying this price, but also making it fast.

Guest 2

So moving them to the C++ side is not always a good idea. But most of the time, most optimizations and nudges are a product of this move.

Guest 2

Like, for example, just recently, we had maybe 30, 40 different PRs from 40 different collaborators in Node, improving the error path of the Node FS modules. So we were previously calling a C++ function from JavaScript, and that C++ function was returning an object. And if that object had an error key, then we were throwing an error.

Guest 2

But if we move that error state to the C++ side, we realized that we could improve the error path, the faulty path of these functions, by 100 or, like, 150%, speed it up. So these kinds of things are what is the most impactful.

Scott Tolinski

Mhmm. So what are you using to measure those changes, to know if you're having an impact or not? So we have the

Guest 2

benchmarks. We basically run thousands of benchmarks in Node.

Guest 2

On top of that, it's important to find the edge cases that you pass.

Guest 2

So invalid states, valid states, and whatever different inputs it takes, we try to enumerate them, and we have a dedicated benchmark CI in the Node.js infrastructure where we run these kinds of things. And because that machine is extremely old, it reflects the worst-case scenario of the optimization, which provides a good baseline for giving a proper reasoning for that particular change.

Wes Bos

And do you have to run those benchmarks on different hardware as well? I remember, like, maybe 10 years ago, one of the Node team members was running a bunch of Raspberry Pis, or, no, it wasn't Arduino, a bunch of Raspberry Pis in their office as part of the test suite, because that was one of the things they targeted at the time: how fast does it run on a Raspberry Pi? Yeah. So

Guest 2

Right now, I think we have maybe 20, 30 different CI machines. We have only 1 benchmark CI. This is mostly because, well, Node.js is an open source organization, and there isn't any company behind it that could give us those kinds of big benchmark machines to run those benchmarks.

Guest 2

But mostly, we don't write hardware-specific code in Node.js, which means that we don't write specific instructions for Neon, which is for ARM, or for SSE2, for Windows machines or x86-64 machines.

Guest 2

For Ada, the URL parser, in particular, we have made those optimizations. That's why Ada is extremely fast.

Guest 2

But that requires having direct access to the machine itself, and that's where Daniel's expertise comes in, because he's a professor and has all of these different machines.

Guest 2

He helped and optimized

Wes Bos

for different architectures. So let's talk about, like, everybody listening to this is a web developer. They're writing JavaScript every day. Like, there's probably very few people listening that write C++ in their day to day.

Wes Bos

I'm just curious about at what point you should start caring about performance.

Wes Bos

So an example I have is I'll often have an array and I need to distill it down. So I'll run a filter on it, and then I'll need to map over it and add some data, and then I might chain a couple of maps and a couple of filters together, because it's much easier for me to reason about it, debug it, and read the actual code, versus I could probably do it in a single reduce, but it's gonna be much more complex. And I'll often opt for the multiple map filter reduces, and people will say, hey, like, what about performance? And my answer to that is, I'm looping over 3 spans on a page. I don't think it matters.
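For anyone following along, the trade-off being described looks something like this: both versions produce the same result, but the chained one allocates an intermediate array per step while staying easier to read.

```javascript
const items = [{ n: 1 }, { n: 2 }, { n: 3 }, { n: 4 }];

// Readable: filter, then map. Two passes, one intermediate array.
const chained = items
  .filter((item) => item.n % 2 === 0)
  .map((item) => item.n * 10);

// One pass with reduce, but more to hold in your head at once.
const reduced = items.reduce((acc, item) => {
  if (item.n % 2 === 0) acc.push(item.n * 10);
  return acc;
}, []);

console.log(chained); // [ 20, 40 ]
console.log(reduced); // [ 20, 40 ]
```

On 3 spans the difference is unmeasurable, which is exactly Wes's point; it only starts to matter when the array or the call frequency grows.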

Wes Bos

Do you have any sort of guidelines as to, like, when you should start caring about these things? So

Topic 8 18:39

Caring about performance is a mindset independent of specific code instances

Guest 2

Performance is a mindset. It's not something that I can turn on or turn off. Yeah. Even though that particular code is called maybe 1 or 2 times in the lifespan of that whole project, I basically care about writing the best code that I could write for that particular portion. But in order to answer that question, let me rephrase the question for a different industry.

Guest 2

When, let's say, we talk about security, when should you care about security? Would you care for security if you get hacked? Yes.

Guest 2

Would you care for security if you don't get hacked at all? No, you won't care about it. But because you don't care about it, because you don't get hacked, you don't look into that particular part of your code, and then you don't realize that you're already hacked.

Guest 2

So this is, yeah, this is like a prisoner's dilemma, and also a chicken and egg problem as well. Because up until the point that you care about performance, you don't know that it's slow. And if you look into that and realize that it's slow, then you change how you write your code and you try to optimize it. So it's not about iterating or calling map or filter multiple times. It's about the principles.

Guest 2

How many times, like, what is the big O notation of this for loop that I'm iterating? Am I iterating O of 2n, or is it growing exponentially, or so on and so forth? So the question is, if you know those kinds of things and if you also make the deliberate choice of doing it, I think that's okay. But, to summarize, it's not always, like, a black and white scenario. But for me, when you learn those kinds of things, you can't turn them off.
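The big O point can be made concrete with a common case: deduplicating an array with includes() inside a loop is O(n²), while a Set does the same job in O(n). A small sketch:

```javascript
// O(n^2): includes() rescans the output array on every iteration.
function dedupeQuadratic(arr) {
  const out = [];
  for (const value of arr) {
    if (!out.includes(value)) out.push(value);
  }
  return out;
}

// O(n): Set membership checks are effectively constant time.
const dedupeLinear = (arr) => [...new Set(arr)];

console.log(dedupeQuadratic([1, 2, 2, 3, 1])); // [ 1, 2, 3 ]
console.log(dedupeLinear([1, 2, 2, 3, 1]));    // [ 1, 2, 3 ]
```

Same result either way; on 5 elements nobody can tell the difference, but the quadratic version's cost grows with the square of the input while the Set version grows linearly.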

Guest 2

That's why I'm here.

Guest 2

Yeah. That's what led me to all of this.

Wes Bos

Yeah. Uh-huh. So if you're writing, like, a loop in JavaScript and you think, should I spread this thing into an array and then slice the items out of it, or should I do it a different way, what would you do in that case? Would you go to, like, JSBench or JSPerf or something like that and write a test case and run it a 100,000 times to see

Guest 2

if there's any difference between the 2 of them? If you iterate and you call slice, this means that you copy the memory twice. So if you have a thousand-item array, then you have another one in memory as well. So if you are doing this a lot of times, then you are basically triggering garbage collection a lot of times, and that becomes a V8 problem.
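The copying he describes is easy to see: slice() allocates a second backing array, which is exactly the extra memory and garbage-collection work being discussed.

```javascript
// slice() copies the backing memory: mutating the copy leaves the
// original untouched, so every slice is a fresh allocation for the GC.
const original = new Array(1000).fill(0);
const copy = original.slice();
copy[0] = 99;

console.log(original[0]); // 0, separate memory
console.log(copy[0]);     // 99
```

Do that once and it's nothing; do it inside a hot loop and each iteration hands the garbage collector another thousand-element array to clean up.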

Guest 2

And because V8 is a JIT, which corresponds to just-in-time compiler, it's extremely hard for V8 to optimize those kinds of things. So what I basically do is I try to write it as fast as I can, using as little memory as possible, and then I benchmark that particular portion. But the question is not about benchmarking that one thing. I benchmark the whole process.

Topic 9 21:30

Avoiding inefficient operations like unnecessary copying in hot code paths

Guest 2

For example, parsing a URL, or getting, or readFileSync. I basically do it so that it's like an integration test. Instead of calling that particular function, you call everything, and then you realize, what is the percentage of the time that I spend on this particular function? That's what flame graphs give us.

Guest 2

And in that portion, if it corresponds to 10%, but you spend 90% of the time on V8 or any other function, you realize that there's limited things that you can do in order to improve this function. And then you give up, and that becomes an educated decision rather than, mhmm,

Guest 2

Any,

Wes Bos

emotional one. Yeah. I think learning how to read a flame graph is a very good skill for anyone to have. So I had some issues with a Cloudflare Worker. They give you 50 milliseconds, and I was using some library, and it kept going over, like, kept going 80 milliseconds. And I distilled it down to: simply just creating an object was taking 80 milliseconds.

Wes Bos

And I was like, this is ridiculous. Right? That's a really good example of, like, your mindset of performance first, because whoever wrote that code that caused that 80-millisecond instantiation probably was not intentionally trying to make it 80 milliseconds, but it just happened by accident, and it wasn't caught.

Wes Bos

And I was posting on the GitHub, and a bunch of people got the flame graphs out. And it was one function that was being called that was taking, like, 99% of the actual compute. And the flame graph was very visual to show it, and I was like, man, I don't know if I would have ever been able to figure out, okay, it's obviously this part, I can comment out the function call. But Yep. You gotta go deeper than that. So the question is,

Guest 2

because I contributed to and read a lot of those Node.js functions, the Node.js internals.

Topic 10 23:53

Knowing Node.js internals helps reason about costs of operations

Guest 2

Right now, I know, not as a numeric value, but I know the overhead of calling a function. And with this knowledge, I tend to think towards saying, like, okay, if I call this function, it will make 2 C++ calls. But if I do this one, it will make 1. And depending on usage, this might be faster or not. And then the question becomes much more clear, and you start to look into the flame graph less, unless it's an extremely specific thing, or you have a really big project and you have no idea what's going on, then, mhmm, you dive into the flame graph. We talked to Jarred Sumner from Bun,

Topic 11 24:38

Yagiz replaced Sentry's webpack with faster Rust-based Rspack

Wes Bos

and, obviously, Bun is very performance minded, and they always push these graphs that are a 1000 times faster than Node.js. And we asked him, like, how is it possible that you can make it so much faster? And his answer to that is that he handles, like, lots of common use cases in the code base. So, like, as I understand it, there will be simply, like, an if statement for a common use case: oh, if it's an array of one thing, we don't need to do this work. Is that a common thing to do in performance, to

Guest 2

just up front check for these perf wins? Yeah. So that's what we call a happy path, which means, like, what's the most positive path that a function execution goes through.

Topic 12 25:23

Happy path optimization targets positive execution path

Guest 2

And, for example, if you call readFileSync, the default value of encoding is UTF-8.

Guest 2

And you know that most people call readFileSync, and they just want to get UTF-8. So if you add a specific if case and say that if encoding is equal to UTF-8, you call a specific function super optimized for UTF-8, then you have a performance win. This is what I did maybe 6 months ago in readFileSync, which improved the read path by 40, 50% or something. So this is what we commonly do as well.
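A sketch of that fast-path pattern. This is an illustration of the idea, not Node's actual readFileSync source; decode is a made-up helper:

```javascript
// Special-case the overwhelmingly common argument before falling
// back to the general machinery.
function decode(buffer, encoding = 'utf8') {
  if (encoding === 'utf8' || encoding === 'utf-8') {
    // hypothetical "super optimized" branch for the common case
    return buffer.toString('utf8');
  }
  // general path: handles every other supported encoding
  return new TextDecoder(encoding).decode(buffer);
}

console.log(decode(Buffer.from('hi')));           // 'hi'
console.log(decode(Buffer.from('hi'), 'latin1')); // 'hi'
```

The win comes from the if check being nearly free while the branch it guards runs for the vast majority of calls.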

Guest 2

You basically look into what users do. Or, let's say, go to webpack, if you want to optimize FS.

Guest 2

Make sure, like, just point out how many times each FS function gets called, and with what arguments.

Guest 2

You try to find the most common arguments, the most common values for those, and try to optimize for that. And then you will have a significant boost in that

Wes Bos

code path. Yep. And another one I hear, and maybe you can explain what this is in the performance world, is the hot path. Yeah. Hot yeah. Hot path is

Guest 2

if that particular path is called a lot of times, that's what we call a hot path. Okay. So happy path is towards, like, the positive outcome of this function.

Guest 2

For example, when you call an FS readFileSync, you don't assume that it will throw an error. Because if you make, like, a probability analysis, you will see that 95% of the time, if you call readFileSync in any application, it will not throw an error, because the developer already checks if it exists or not. But in the remaining cases, it will throw an error. So in that case, you ask yourself, and this is a much more detailed explanation for happy path, should you optimize for the error path? No. You should optimize for the happy path, because that's the common one. And then we have this hot path, which is the UTF-8 one. So yeah.

Guest 2

I should have used this example instead of the happy path.

Scott Tolinski

That's good. I recently saw that you made a PR on the Sentry codebase to replace webpack with Rspack. I've been seeing Rspack pop up a whole bunch. Is that something that you think is ready for prime-time use as a drop-in? To summarize, for those who don't know, I replaced Sentry's webpack build with Rspack,

Guest 2

which comes with SWC, and it's all written in Rust, so it's extremely fast.

Guest 2

The downside is that it has some weird bugs.

Guest 2

And to answer your question, I don't think it's ready to be used by a project with the size of Sentry, because any sort of issue with minimizers, or any sort of issue with CSS or whatever, is going to have huge impacts on the company. So I wouldn't use it for any company with the size of Sentry, but I would use it for my own personal projects. And because it's extremely easy to migrate to from webpack, it's also a really good choice, because you don't need to change a lot about your deployment.

Wes Bos

Oh, here's a question.

Wes Bos

And this is something that I saw on one of your polls: like, what's more important, and how do you decide what's more important, how fast something runs or how big the bundle is? Is it worth adding more code, therefore inflating the bundle size, as a trade-off for a faster code base? And I know the answer is probably, it depends, but I'd be curious to see if you have any thoughts on that. If it's a front end application, then, yeah, bundle size is a lot more

Guest 2

Yeah. I'm going back and forth.

Guest 2

But I think the most recent example that I can give you is that BRAFT has this option to optimize for space or optimize for speed.

Guest 2

And if you optimize for speed, this means that you will have a really large bundle size, because you add all of these large binaries in order to optimize each path.

Guest 2

But if you go for storage, then your application becomes significantly slower, but the bundle size is extremely small, because you're using the most native APIs that the operating system offers.

Guest 2

Mostly, depending on the usage, depending on the API that we are calling, and depending on the impact of that function, I go for speed. I don't go for storage, because you can always buy more SSD. You can always buy, like, more hard disk and more storage. But even if you have the latest computers, like, 32-gig machines with 10, 15 core CPUs,

Guest 2

if the application that you're running is not optimized for running fast, then there's not much that you can do, even with the fastest architecture and the fastest computer.

Guest 2

So, yeah, I would go for speed at the end of the day. And this is what we did with Ada, and I think this is where Node.js is going

Scott Tolinski

as well. Awesome. I just wanted to ask a quick question, because I know you mentioned that you have been committing to Node for, what did you say?

Scott Tolinski

How long was it? Not that long, right? 2 years? Yeah? Yeah. Oh, almost 2 years, I guess. Almost 2 years? What's it like working on a project of that nature?

Scott Tolinski

I get the vibe that working on Node can be kind of, like, thankless at times. Do you get that, or what's it like? So it's,

Guest 2

it's extremely beneficial for personal development, because you basically have access to a lot of really smart and intelligent people that want to improve the code that you've written, because you share a common goal with them. And in terms of impact, it's impactful both personally and worldwide. But you never get any positive feedback from people all around the globe, and most of the time, you even get, crucified is the correct word, I guess, because of what you're doing. And because it's free software, people use it, and because they're using it in their most crucial applications, mhmm, they have this expectancy towards it, and that makes it a really hard job to do, because you are contributing on your own time.

Guest 2

Like, I'm not getting paid to contribute to Node.js, but people don't understand that and assume that Node.js is a company and not just an organization. But it's not. Yeah. So it's extremely lonely, I can say that. Yeah.

Guest 2

Yeah. I get that. But, like, if I look into the past, I can easily say that the changes that I did, or any contribution from any contributor to Node.js, impact the world at a scale that we can't comprehend. This means that because of the performance improvements, the environment is a lot better, because we are releasing less carbon dioxide from the usage of the computers. And because of the impact, because, like, it's literally used everywhere. So that's really good. Yeah. It's crazy the scale

Wes Bos

making a change can have on things like that. I remember years ago, I had a friend. He worked for a company that sent out millions and millions of emails. I can say it was BlackBerry. And every time BlackBerry had to email every single one of their users, they had to design, like, an email template and whatever. And they distilled it down to: every single character would cost them an extra dollar in bandwidth to send it out from their data centers. And it's wild that such a small thing, 1 extra character blown up by 200,000,000 users, or however many users they had, would cost so much. So you think about that in perf, you know. Like, if a CPU is literally running 20% less because you've improved this common use case, like, that's major, both in terms of how much energy it's using as well as

Guest 2

The amount of money people are spending to run run this compute. Yeah. And just recently, I realized that, like, because electron use is Wes.

Guest 2

Even improving Warp URL parsing or fast operations by 5%, you directly have an impact to 1,000,000,000 people because WhatsApp Wes, Discord, Microsoft Teams, they all use Node. Js, and their applications start a little bit faster Because of the change that you did on that particular Node, and that's that's what what keeps Node. Js going.

Guest 2

And because it's not company owned, it's community owned, that puts Node.js on a really crucial and important path.

Wes Bos

Yeah. I always think how funny that is. You know, when you have a piece of hardware, there's a licenses section that shows everything.

Wes Bos

Often I'll go through the licenses of the Instagram app, or the infotainment system in my car, and I'll be like, there are people's names in this piece of hardware that we've had on the podcast.

Wes Bos

And I would just love to one day be able to, like, go up to a friend, or people who work on browsers, and be like, hey, do you see how that text is aligned there? I worked on that. And every single person in the world that is holding an iPhone and seeing left aligned text, or Flexbox, or Grid, or using a library to fetch data... to be like, I wrote that code, and now it's being used by a good chunk of the world. So the funny thing is,

Guest 2

Around three, four months ago, I was applying for a green card.

Guest 2

And because I was under the O-1 visa, which is an extraordinary ability visa program for the US.

Guest 2

I was applying for a green card for extraordinary talent. And one of the requirements is that your licensed software needs to be used by a really well known company.

Guest 2

So I basically spent two, three months googling every day, trying to find my name in one of those licenses so I could show it to the reviewers at the US government. And I recently found out that Oracle is using Fastify. And because Fastify uses fast-querystring, my name is included in one of Oracle's manuals.

Guest 2

And it's, like, extremely funny. We were talking with Matteo a month ago, and I was like, hey, man, your name is in Oracle's manual. And he was like, yeah, so what? He was so used to it that I was like, oh my god, this is really something. Yeah.

Guest 2

But, like, this is the reality. Everybody uses open source libraries, and they publish the licenses. But the weird thing is, no Node.js license is included in those packages.

Wes Bos

Oh, yeah. So

Guest 2

even though a lot of people use Node.js, you will never see a Node.js license in those apps. They include 300 different npm package licenses, but nothing for Node.js, which is kind of weird. But that's the reality.

Wes Bos

I remember years ago, there was, I think, a blog post from Daniel Stenberg, who is the creator and maintainer of curl.

Wes Bos

And just the amount of emails he gets from random people who don't understand what curl is, because they find it on their device or they find a file on their computer somewhere, being like, you hacked my computer.

Guest 2

Yeah. Yeah.

Guest 2

Yeah. That's that's Really the, like, the reality of the situation.

Guest 2

And even my own parents or my friends don't realize the impact of the things that I do, but yeah.

Guest 2

At the end, whenever I put my head on the pillow, I'm extremely satisfied, because I did what I could do for the better of the world, and that's something that no amount of money can pay for or get you. What do you think about TypeScript

Scott Tolinski

Support in Node.

Scott Tolinski

What are your personal thoughts on it? I know, you know, Deno has TypeScript support, Bun has TypeScript support. Is that something you think Node will ever adopt? So,

Guest 2

before answering what I think, what's best is some context: we didn't include it yet because even the TypeScript team doesn't want us to vendor TypeScript. Because TypeScript doesn't follow semantic versioning, breaking changes can come in any version, and that's a really scary situation for Node.js collaborators and maintainers.

Guest 2

On top of that, I think, yeah, the benefits of TypeScript, there's no discussion there.

Guest 2

With JSDoc and IntelliSense, those kinds of benefits, up until a certain point, you also get. And I think there's a new proposal at TC39 for making TypeScript types as comments, so that you don't need to build and transpile your code.

Guest 2

That's solved as well.

Guest 2

I think, without vendoring TypeScript, we are extremely close to supporting TypeScript in Node.js. You just need to install a loader, and you can just pass node --loader equals whatever that package name is, and you can run any TypeScript module right now.

Guest 2

And these changes are done by the, Node. Js modules team, including, several, really smart people.

Guest 2

And, without vendoring TypeScript, we could literally do anything, because of all of those changes that are happening to the ESM loader right now. Yeah. When loaders came out, this was probably 3 years ago, but that's

Wes Bos

immediately what I thought. Like, maybe you can describe what loaders are, because I think it's sort of an unknown feature of Node.js, and it can make things like supporting TypeScript maybe a bit easier than

Topic 13 40:28

Node.js loaders will enable TypeScript support without vendoring TS

Guest 2

people think. So loaders are, whenever you import or require a module, before resolving that particular file, we pass the contents of the file to a hook, and you can tap in and mutate the contents of the file. So if that file is TypeScript, you can dynamically compile the TypeScript file into JavaScript and then return JavaScript from that hook, without building it ahead of time. This is extremely beneficial because this means the contents don't matter.

Guest 2

It can be mutated or updated or changed dynamically by a third party application, which, enables those kinds of improvements.
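A minimal sketch of what such a hook looks like. The `load(url, context, nextLoad)` signature and the `{ format, source, shortCircuit }` return shape follow Node's module customization hooks API; the `.txt` handling, file names, and demo at the bottom are invented for illustration (in a real setup the hook lives in its own file and is activated with `node --loader ./loader.mjs`):

```javascript
// Sketch of a Node.js module "load" hook: intercept imports of .txt
// files and turn their contents into a JavaScript module on the fly.
async function load(url, context, nextLoad) {
  const { readFile } = await import('node:fs/promises');
  if (url.endsWith('.txt')) {
    const text = await readFile(new URL(url), 'utf8');
    return {
      format: 'module',
      // Generated JavaScript source built from the file's contents.
      source: `export default ${JSON.stringify(text)};`,
      shortCircuit: true, // we handled this URL; skip later hooks
    };
  }
  return nextLoad(url, context); // defer to the default loader
}

// Demo: invoke the hook directly with a stub nextLoad.
(async () => {
  const { writeFileSync, mkdtempSync } = await import('node:fs');
  const { tmpdir } = await import('node:os');
  const { join } = await import('node:path');
  const { pathToFileURL } = await import('node:url');

  const file = join(mkdtempSync(join(tmpdir(), 'loader-')), 'hi.txt');
  writeFileSync(file, 'hello');
  const out = await load(pathToFileURL(file).href, {}, () => null);
  console.log(out.source); // prints: export default "hello";
})();
```

The same pattern is how a TypeScript loader works: instead of `JSON.stringify`-ing text, it runs the source through a transpiler and returns the emitted JavaScript.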

Guest 2

So, yeah. One of the

Wes Bos

loaders that, I think... I'm not sure if it's being written or not, but I just found out about CSS modules. I didn't even realize that CSS modules is an ECMA spec.

Wes Bos

And the way that you can import

Guest 2

CSS modules into JavaScript is via a loader. Is that right? I don't know, maybe that's not how it's done. Yeah. But whether it's a CSS file or something else, you can basically do anything. You can load a macOS application. You can load Rust code in Node. So, right now we don't have FFI, but there's an open PR.

Guest 2

And we can basically compile Rust dynamically and enable it as a module by using FFI, with that change. Oh.

Guest 2

So you can basically run whatever language you want.

Wes Bos

That's cool, because Scott just wrote a really cool loader for our own website. Well, not a loader, a Rust file. You wanna talk about that, Scott? Yeah.

Scott Tolinski

Yeah. You know, just trying to get my hands dirty a little bit in the Rust world. I wrote, like, a script that checks to make sure we have the right things set up, the right .env variables, you know, duplicating from an example if the example has been updated but the personal .env hasn't. Basically, just trying to get somebody onboarded into the site, seeding the database, those types of things. Yeah. Nice.

Wes Bos

Oh, yeah. That's pretty cool. But you have to compile that and ship that with the repo, right? Because it's native. Correct. So this FFI

Guest 2

feature might help us. So basically, if the Rust compiler is on npm, you can call it, compile the code, make it an executable, and execute that function through a child process or whatever; you don't even need to call FFI, and then you return the response of that to the caller.

Wes Bos

It's possible. So you could write a loader that requires a compiler that npm installed, and then returns the compiled Rust from that. Yes.

Guest 2

Basically, because it's async, you can do that whenever you require a Rust function. It might compile it, if it's not compiled, then create a child process or a worker thread and execute that.

Guest 2

Like, if it's a child process, it could execute it and return the stdout as the output of the loader, and then you would just get the response when you execute it.

Guest 2

Yeah, regarding this, the limit is yourself. Like, you can literally do anything, and that's what frightens me the most and also excites me the most. Yeah.
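The compile-and-execute idea described here can be sketched as a load hook that shells out to an external tool. To keep the sketch self-contained and runnable, the "compiler" below is just `node -e` (standing in for a real `rustc` invocation), and the `.run` extension and module shape are invented:

```javascript
// Sketch: a "load" hook that runs an external process and exposes its
// stdout as the module's default export (a stand-in for compiling and
// running Rust or any other language from a loader).
async function load(url, context, nextLoad) {
  if (url.endsWith('.run')) {
    const { execFile } = await import('node:child_process');
    const { promisify } = await import('node:util');
    // Stand-in for a real compiler/tool invocation: spawn node itself
    // so this sketch runs anywhere Node is installed.
    const { stdout } = await promisify(execFile)(
      process.execPath, ['-e', 'console.log(21 * 2)']
    );
    return {
      format: 'module',
      source: `export default ${JSON.stringify(stdout.trim())};`,
      shortCircuit: true,
    };
  }
  return nextLoad(url, context); // anything else: default behavior
}

(async () => {
  const out = await load('file:///demo/task.run', {}, () => null);
  console.log(out.source); // prints: export default "42";
})();
```

Swap the `execFile` call for a real toolchain (compile, then run the binary, or call it over FFI) and you have the workflow the guest describes.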

Wes Bos

And Node.js... people are gonna groan when they hear this, but stay with us, because I think it's actually really necessary: Node.js is getting a config file. Is that what you're working on right now? Yes. Right now, I am working on a config file, which basically is a JSON file that you can

Guest 2

pass any Node options to.

Topic 14 44:43

A Node.js config file is being added to configure flags externally

Guest 2

The main reason for this change is that there's a limit to what you can pass as arguments, because of operating system limitations.

Guest 2

Mhmm. So if you have a config file, it's extremely easy. And because I recently added simdjson, which is a super fast library written by Daniel Lemire.

Guest 2

It's really easy to parse, and super, super fast to parse, a

Wes Bos

JSON file. Oh, and people are probably thinking, like, don't we already have a config file, which is the package.

Wes Bos

JSON. Why do we need another config file? And I'll say real quick, the idea with the config file is that you often have different npm scripts, right? You have a dev command and a production command, and maybe another command that runs with a bunch of different flags added to it. And like you said, that gets kind of unruly when you have all of these flags. So with a config file, I could set, alright, I want debug, maybe a port value, anything specific to that type of running instance. Is that right?
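A purely hypothetical sketch of what such a file might look like; the final flag name and format were not settled at the time of this conversation, so treat the shape as illustrative (the keys mirror existing NODE_OPTIONS-style flags):

```json
{
  "max-old-space-size": 4096,
  "experimental-loader": "./loaders/typescript.mjs",
  "inspect-port": 9230
}
```

You would then point Node at it with something like a config-file flag, instead of repeating each flag on the command line for every script.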

Guest 2

Yeah. So the issue is that even though we bundle npm, the Node project and the npm project are extremely different projects, and they have different goals.

Guest 2

So let me just clarify that. Yeah. The other thing is that, because of how the ESM and CJS loaders are implemented, if you execute a node index.js command, it will check for a package.json in your current directory up until your root directory. So if you are under /Users/yagiz/Desktop/coding, blah blah, it will make seven, eight different file system operations trying to find the package.json.

Guest 2

This is mostly done because that package.json contains the type field, module or commonjs, so Node knows whether it should load an ESM module or a CommonJS module. And we tend to put lots of configuration in there. ESLint has support for having a key in package.json. Volta has it. A lot of different packages have it. And if you make this package.json file really big, then it becomes a performance bottleneck, because then you need to parse a huge JSON file while trying to find the values that you care about most, which are name, type, exports, imports, and one more thing that I forgot. And using a config file is extremely optional. I don't want to affect any existing applications in terms of performance.

Guest 2

So we shouldn't encourage people to use package.json as key value storage, because that's not the intention of it. I don't know the backstory of how package.json became package.json, but for the sake of it, it supports that. But what you can do is, if you want to use package.json, you just need to run Node with the config file flag set to package.json.

Guest 2

And if you have those specific key value pairs, then you can use it as a config file as well. But that's not the intention, because we basically check for a JSON file, and package.json is technically also a JSON file, so it can be used, but it's highly unlikely, and I don't recommend it at all. Awesome. And that

Wes Bos

what was it? The JSON lib that you included, what was that called again?

Guest 2

The s-i-m-d JSON. simdjson.

Wes Bos

Simdjson. Yeah.

Wes Bos

I'm just quickly googling. Does that support JSON5, which is not a standard, but is the better way of writing JSON? Because I've always wanted to be able to use trailing commas and put comments in my JSON in Node.js, but I've made issues many times, specifically in the npm repo, and they've closed them saying it would cause too much breakage to the community.

Guest 2

So the thing is, JSON is a specification. JSON5 isn't, as far as I know. No, it's not. And, yeah, simdjson doesn't support it, because it enforces the validity of the JSON file. Mhmm. I think Biome and other implementers support these kinds of things, and they even support having faulty JSON while continuing to parse the remaining values.

Guest 2

But we don't, and I don't think simdjson supports it either. Okay. But simdjson is, I think, the fastest JSON parser in the world right now.

Guest 2

So,

Wes Bos

so yeah. Six gigabytes a second to minify JSON; validate at 13 gigs a second. It's true. Like, you think about podcasts: people who run podcast studios or podcast apps have to download gigs and gigs and gigs of XML.

Wes Bos

There's probably other people that have to download gigs and gigs of JSON as well. Yep. So

Guest 2

simdjson is written by, again, Daniel Lemire, the professor that I worked with on the Ada URL parser. So, yeah, it's an extremely well optimized library. So we are currently using simdjson in the package.json resolver.

Guest 2

So whenever we find the correct one, we parse that package.json and retrieve only the important fields. Mhmm. So, yeah. With that PR, I think ESM and CJS applications just got 5% faster.

Guest 2

So that's the impact of it. I'm glad you like this.

Scott Tolinski

It's amazing to have such a big impact with such a a small change. I I know that's been a kind of a theme throughout this conversation.

Guest 2

But yeah. Wow. It's really, really awesome. Yeah. Well, by the way, that PR was 120,000 added lines of code, because I included simdjson as a dependency, but the actual code change was maybe around 1,000 lines or 800 lines. It's extremely small, and we just need to look into the Node codebase and try to find those kinds of bottlenecks; even small things can have really huge impacts.

Guest 2

But, unfortunately, because companies don't want to put money in, they don't want to sponsor, or whatever their internal agenda is.

Guest 2

We don't get that many of these improvements.

Scott Tolinski

Thank you so much for all of that stuff on Node in general. I mean, I think a lot of it is pretty eye opening to the challenges that, you know, the Node team or any developer faces here. I think now where we wanna go with the show is the part of sick picks and shameless plugs, where you can bring us a sick pick, which is something that you just like right now, and a shameless plug, something you want people to check out. Okay. So for sick picks, I can say that I'm a huge fan of Rust based,

Guest 2

like, linters, including Biome. But there's also a new kid in town, which is OXC, written by Boshen, and it's maybe 3 times faster than Biome. It doesn't have any formatting right now, but the linter is extremely fast. So,

Wes Bos

I really like that. Oh, I'd never heard of this. A JavaScript oxidation compiler.

Scott Tolinski

Let me tell you, Wes, I heard about it this morning while I was looking through Yagiz's Twitter and found a retweet about this, and I was like, I've never heard of this. Yeah.

Guest 2

I'm a sucker for performance tools, so I really like

Wes Bos

Those kinds of things. Yeah. We've been hearing just this week, everybody's talking about Biome, and now I'm hearing about this.

Wes Bos

I think it's really cool, because linters and formatters matter. And the guy behind it is Boshen.

Guest 2

Yeah. Boshen was a contributor to Rome, Biome's predecessor, in the past. Okay.

Guest 2

And so yeah.

Guest 2

But the performance, if you look into the benchmarks, it's good. And what I really like about the project is that there's an extremely good quote saying that performance issues are considered a bug in this project. This is what he said, and that tells you something about the character of a project, and that I really like and respect.

Scott Tolinski

Yeah.

Wes Bos

Yeah. Man, if somebody can figure out how to parse and maybe replicate some of the ESLint

Guest 2

configs, but in Rust. I don't know if that's possible. They did. They did. Rome did it. Did they? Yeah. Rome did it, and OXC. So Boshen is also, like, one by one, moving ESLint rules to OXC.

Guest 2

I think he just did Unicorn. Yes. eslint-plugin-unicorn.

Scott Tolinski

Mhmm.

Guest 2

And there are, like, maybe 500 different rules or whatever, I don't know. The impact is huge. And I think one of the lead engineers of Vue wrote on Twitter saying that running it on the Vue repository takes 50 milliseconds. Oh my gosh. With 200 enabled rules. And it's like, how is that possible?

Wes Bos

No. Jeez. 200 rules on 590.

Wes Bos

Yeah. Evan You: 200 rules on 590 files, finished in 50 milliseconds. That's true, because, like, I'm starting to see sometimes in my editor, specifically in our Svelte project, sometimes I hit save twice in order for the linter to catch up. It'll show me an old error. Yes. And I was like, no, I fixed that, and you have to hit save again. I don't know if that's in the saving or in the parsing of it, but

Guest 2

that's awesome. That's Node.js being initialized. If you run a console.log('Hello, world') in a Node project, it takes 150 milliseconds because of that startup bottleneck.

Guest 2

Yeah. Whereas Rust applications are extremely optimized, so you don't need to do anything extra, and if you're a good engineer, then you can

Wes Bos

basically go until you're I/O bound, which is the limitation of the computer itself, and this is what's happening here. Holy. So it's 3 times faster than a console.log to lint your entire project.

Guest 2

Yep. That's awesome. This is why, even though I really like these JS-based projects, and they really revolutionized the industry and Node.js applications, they are unfortunately doomed to... rot is not the correct word, but to lose that particular position, because of how these super optimized linters are coming. But the bottleneck in that scenario is that because they run natively and they don't have any JavaScript context, all of these projects are not pluggable. I can't add a new rule without contributing to Biome or OXC.

Guest 2

So, hopefully, they will find a solution that makes it worth it. Like, even if we have to write a rule in Rust, I'm okay with it.

Guest 2

But,

Wes Bos

Yeah. We had Nicholas Zakas, who's the author of ESLint, on, and he's like, we can't ever rewrite the whole thing in Rust, because you need to be able to let the community author this stuff in JavaScript.

Wes Bos

But there are many parts of ESLint That can be rewritten in Rust Mhmm. Especially parsing all of the files and,

Guest 2

parsing the JavaScript. So this is what I did with pnpm.

Guest 2

Five months ago, I started rewriting pnpm in Rust. It's called pacquet, p-a-c-q-u-e-t, which is under the pnpm organization right now.

Guest 2

But eventually yeah.

Guest 2

Those kinds of projects need more than one collaborator to finish.

Guest 2

And the question becomes, for the case of Rust: even writing super fast Rust is extremely hard, because if you are comparing single thread to single thread, single-threaded Rust and single-threaded Node.js, Node.js is almost as fast if you write the code really well. Mhmm. But if you write multithreaded code, then the question becomes, how can I write a really performant multithreaded application? And then the real engineering comes in, which you don't have to think about in a Node

Guest 2

Application because it's,

Scott Tolinski

single threaded. Yeah. I was trying to write a video encoding app, doing frame by frame processing of raw video, and, I mean, those are just massive vectors.

Scott Tolinski

And when I went to do it single threaded in Rust, immediate frame rate bottlenecks and all sorts of things. And then, okay, now I have to get into multithreading, which is something I've never had to think about as a web developer in my entire life. And it was shocking to me how much even a small dive into that world was able to unlock a lot more performance for me in my project, and get me to, you know, the 60 FPS I was trying to hit. It was just a nuts experience to actually see that type of world in action. So, there is this particular

Guest 2

gray area that most people don't know about or don't want to get into, which is that before going into multithreading, you could just use SIMD instructions, which is single instruction, multiple data. It means that if you have a for loop over a 100 elements, instead of going 1 by 1, if your computer allows it, you can go, like, 16 by 16 and iterate in less time on a single thread.

Guest 2

So, I think on Rust, SIMD is still experimental, which is an extremely bad situation. But in C++ and those kinds of languages, this is what people use. And for video processing, SIMD is extremely useful, because of all of these really big matrices and arrays that you need to traverse.

Scott Tolinski

Interesting. Yeah. Mine was definitely more of an exploration rather than trying to make anything useful out of it, but, yeah, interesting.

Wes Bos

Cool. Alright. Last thing is a shameless plug. What would you like to plug to the audience? I don't know. My Twitter account, if you care about performance,

Guest 2

I recommend following me.

Guest 2

I'd like to write some blogs on my personal website, which is yagiz.co. You could just

Scott Tolinski

Follow there. You're a great follow on Twitter, by the way.

Scott Tolinski

Yeah.

Guest 2

Yeah. I don't BS. That's why. I only care about posting quantifiable data, so that's fine, I guess. But yeah, thank you. It's good to hear that, because that's where the real engineering comes in, and I want to be known for the engineering, not for the BS. Yeah. Appreciate that. I know it sometimes brings the heat, but

Wes Bos

it's a good follow. Yes, and also for everyone listening, we have a Twitter list of all of our guests.

Wes Bos

So we'll throw you on there as well. Thank you so much for coming on. Appreciate all your time. This was awesome, and we'll catch you later. Thank you. Thank you. Peace.

Scott Tolinski

Head on over to syntax.fm for a full archive of all of our shows, And don't forget to subscribe in your podcast player or drop a review if you like this show.

Share