AJ_O’NEAL:
Well, hello, hello, and welcome back, JavaScript Jabber listeners. Today we are at episode five eighty-three, I think. And I have here with me Jarred Sumner, and we're going to be talking about Bun.js. Did I say your name right?
JARRED_SUMNER:
Yeah, yeah, that's correct.
AJ_O’NEAL:
All right, and I am your...
JARRED_SUMNER:
Thanks for having me.
AJ_O’NEAL:
Yes. I am your host today, AJ O'Neal. Yo, yo, yo, coming at you live. It's always weird when I have to do that, because my catchphrase is "yo, yo, yo, coming at you live" even though it's not. Well, it wasn't live until just recently; for years it wasn't. Anyway, Jarred, it is absolutely awesome to have you. I was the one that reached out to you initially, I think, and I've been very interested in seeing the progress of Bun, and it seems to have been very, very rapid. However, I think that most of our listeners probably don't know what Bun is or why it's exciting, or who you are or why you're exciting. So why don't you tell us all of those things?
JARRED_SUMNER:
Yeah, so Bun is an incredibly fast JavaScript bundler, transpiler, runtime, package manager, and package.json scripts runner. Yeah, that's the list. And basically the idea is we can make the day-to-day of using and running and building JavaScript apps a lot faster by putting it all into one tool. And we can also make it a lot simpler, because when you have control over the whole stack, you can choose the best place to solve various problems. Like, for example, the runtime and the package.json scripts runner both read from .env files automatically, so you don't need an extra script to load your .env files. It detects whether you specified NODE_ENV production versus development and chooses the right one, and it also loads .env.local and so on. It has built-in support for executing TypeScript files in the runtime, so you don't need ts-node or whatnot. It has a built-in --watch mode, so you can automatically reload the runtime whenever there are changes to any of the files that were imported. And all of this is just really fast too.
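To make that feature list concrete, here's a minimal sketch of what running a small TypeScript file under Bun might look like. The file name, variable names, and port value are made up for illustration; the point is that TypeScript runs directly, .env values are picked up automatically, and --watch reloads on change.

```ts
// server.ts  (TypeScript, no ts-node or separate build step)
//
// Run it directly:        bun run server.ts
// Reload on file changes: bun --watch server.ts

// Values from .env (plus .env.local and the NODE_ENV-specific file) are
// loaded automatically; no dotenv package or bootstrap script needed.
const port = Number(process.env.PORT ?? 3000);

interface Greeting {
  message: string;
}

const greeting: Greeting = { message: `Hello from Bun on port ${port}` };
console.log(greeting.message);
```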
AJ_O’NEAL:
Okay. So you've said a ton of things there. Can you narrow that down to like a one-line summary? What is Bun?
JARRED_SUMNER:
Bun is an incredibly fast suite of JavaScript tools.
AJ_O’NEAL:
Okay. And it runs... you said it reads NODE_ENV and does the ts-node stuff. So is it a Node tool? What is it?
JARRED_SUMNER:
It is a Node.js replacement.
AJ_O’NEAL:
replacement.
JARRED_SUMNER:
It's designed as a drop-in replacement for Node.js. We're still not 100% there yet. We're re-implementing basically all of the Node.js APIs from scratch, so that's like a mountain of work. But we've made a lot of progress. Like, node:fs is in there. Node's N-API, the native modules for C++ add-ons, that's all implemented. Yeah, there's a lot there. Node module resolution is in there. An npm client is in there.
AJ_O’NEAL:
Okay. So why would somebody want to use this drop-in replacement for Node? You say it's fast, but everybody says everything's fast. I mean, look at any readme on the internet: you know, it's fast. What does that mean practically? Where would somebody notice the problem that you're solving?
JARRED_SUMNER:
I think basically if you run any script, you should pretty much immediately notice a difference. The main thing there is that Bun starts quite a bit faster than Node. It's something like 20ms to launch anything in Node at minimum, and in Bun, on macOS, it's something like 6ms. And those numbers do scale, so for larger projects it tends to continue to start quite a bit faster. So you will just feel it, especially with things like our built-in test runner that aims to be Jest-compatible. In some cases, because of the TypeScript support and all that other stuff, it can be something like 100 times faster to run tests. And that's because it's combining the bundler, the transpiler, all that stuff; these things kind of stack on top of each other.
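For anyone who hasn't seen the Jest-style runner he's referring to, here's a minimal sketch of a test file. The function and file names are invented; the shape of describe/test/expect is the familiar Jest style.

```ts
// math.test.ts  (run with: bun test)
import { describe, expect, test } from "bun:test";

// The runner understands TypeScript out of the box, so there is no
// separate transpile step before the tests execute.
function add(a: number, b: number): number {
  return a + b;
}

describe("add", () => {
  test("adds two numbers", () => {
    expect(add(2, 3)).toBe(5);
  });

  test("handles negatives", () => {
    expect(add(-2, 2)).toBe(0);
  });
});
```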
AJ_O’NEAL:
Okay. So, all in one tool. So if we look at other languages... I love bringing up Go and Rust. When I think about them, they have a built-in test runner, they have a built-in formatter, they have, basically... when you're using Go or Rust, if you were only using a text editor and compiling the application, you're missing out on like 90% of what it is, because it has all of these things prescribed and ready to use. And so it sounds like that's the kind of experience you're trying to bring. Like...
JARRED_SUMNER:
Yeah.
AJ_O’NEAL:
...the Go slash Rust experience to the JavaScript world.
JARRED_SUMNER:
That's exactly it. And I think also a big part of it is JavaScript on the server. A lot of the reason why people use JavaScript on the server is to build JavaScript front-end apps. So that's part of the other challenge here too: it needs to be a really good experience for running the code on the server, but also for developing front-end apps.
AJ_O’NEAL:
Okay, tell me more.
JARRED_SUMNER:
So one of the things I'm really excited about is we're about to ship a new bundler, and it's built directly into the runtime. It's one function, Bun.build, and it has a built-in plugin system. ESBuild plugins are compatible with Bun.build, and the plugins run around five times faster than they do in ESBuild.
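As a rough sketch of the API being described: the entry point, output directory, and toy plugin below are invented for illustration, and the exact option names should be checked against Bun's docs rather than taken as definitive.

```ts
// build.ts  (run with: bun run build.ts)

// An esbuild-style plugin: a name plus a setup() that registers hooks.
// This toy plugin turns any imported .txt file into a JS string export.
const textFilePlugin = {
  name: "text-file",
  setup(builder: any) {
    builder.onLoad({ filter: /\.txt$/ }, async (args: { path: string }) => {
      const text = await Bun.file(args.path).text();
      return {
        contents: `export default ${JSON.stringify(text)};`,
        loader: "js",
      };
    });
  },
};

const result = await Bun.build({
  entrypoints: ["./src/index.tsx"],
  outdir: "./dist",
  minify: true,
  plugins: [textFilePlugin],
});

console.log(result.success, result.outputs.map((artifact) => artifact.path));
```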
AJ_O’NEAL:
So, okay, there has been so much that's gone on in the last couple of years. So, esbuild is the new Grunt, is that what we're saying? And this is the new esbuild? Well, there was Gulp and Grunt and webpack and esbuild. They all kind of seem like they're in the same vein.
JARRED_SUMNER:
Um, yes... There's a lot of tools.
AJ_O’NEAL:
So this Bun build, it's esbuild-compatible, but it does things like what webpack has done? Or, like, what about Vite? Is Vite the new hotness, or is esbuild the new hotness, or is Vite built with esbuild, or...?
JARRED_SUMNER:
Yeah, so I'll explain a little bit. So esbuild and Bun are JavaScript bundlers, and the idea is you can concatenate a bunch of files together, and you can do a bunch of optimizations to make it so that the code that loads on the front end is the smallest amount of code it can be. Like, it minifies the code, and it also removes dependencies that are unused. That's what the job of a bundler, and also a minifier, is. And then there are other tools that build on top of it. So Vite uses esbuild to pre-bundle node modules. Webpack does a similar thing to esbuild and Bun. And then Gulp and Grunt, those are actually different things. Gulp and Grunt are... task runners. So...
AJ_O’NEAL:
What what?
JARRED_SUMNER:
They're more like task runners. Like...
AJ_O’NEAL:
Oh, Task Runners. Okay.
JARRED_SUMNER:
Like, it's sort of a more advanced version of package.json scripts. Yeah.
AJ_O’NEAL:
Okay. So Bun... the name Bun, I'm assuming, comes from bundler. Was the original goal of the project to be just a better bundler? Because, I mean, I've used Create React App... I do backend code, but I've had the misfortune of having to use Create React App, and it's like, okay, sit here and wait literally 137 seconds.
JARRED_SUMNER:
Yeah, so the very first version of Bun was just a front-end dev server. There wasn't even a runtime. It was one command, bun dev. And it was similar to Create React App in the amount of stuff you could do with it and the built-in experience, except the start time was something like 30 milliseconds, end to end, from pressing enter in your terminal to seeing the page in your browser.
AJ_O’NEAL:
That is literally thousands of times faster. Literally.
JARRED_SUMNER:
Yeah, and it kind of... Bun became like a big yak shave, basically.
AJ_O’NEAL:
You actually... I mean, yeah.
JARRED_SUMNER:
I wanted to get Next.js to work in it. But in order for Next.js to potentially work, you really need some kind of server-side rendering. So then I was like, well, okay, I guess I could add a runtime. And then I tried a bunch of different options for what engine to use for the runtime, and I noticed that JavaScriptCore had a really fast start time compared to V8 and compared to QuickJS and others.
AJ_O’NEAL:
JavaScriptCore is macOS, or is it broader than that?
JARRED_SUMNER:
So, yeah, JavaScriptCore is the engine used by WebKit, which is what Safari uses. But WebKit works on Linux; it also works on Windows. It's just mainly developed by Apple.
AJ_O’NEAL:
All right. So the initial goal was just to make basically the npm run dev process better. And then the yak shaving turned it into...
JARRED_SUMNER:
I was really frustrated with just how slow everything in front end is. The iteration cycle time is just really long, long enough that you check Hacker News while waiting for stuff to rebuild. And when that happens, it just takes you out of flow. It makes you unfocused. It makes it so much harder and so much less fun to write code.
AJ_O’NEAL:
Yeah.
JARRED_SUMNER:
So basically we want to make front-end development like 60 FPS, really more like 120, because that's what screens actually go at now. We want it to be fast enough that you never notice a delay, this experience where you can just see the code as you change it and everything updates live, immediately. There's no waiting for things.
AJ_O’NEAL:
Yeah. So I miss the old F5 days. That has been one of my major complaints. I think that front-end development has just gone off the rails, and we were better back in the jQuery and Backbone days, because at least then...
JARRED_SUMNER:
Heh.
AJ_O’NEAL:
...you could look at a file and understand it. So the front end is just really, really complex, and it seems like the complexity lends itself to slowness, and you're tackling the slowness problem. Do you tackle the complexity problem at all? Is there something that you offer to people that helps them keep it simpler as well? Or is it just... you're re-implementing APIs, and the way you re-implement the APIs ends up being thousands of times faster?
JARRED_SUMNER:
Um, well, it's sort of both, because the way to make it faster is to unify the tools into one tool. A lot of why the front-end development iteration cycle time is so slow is because you have these, like, five different tools that are doing mostly the same work but just don't implement exactly the same features, so they're wasting a bunch of time. So in Bun's case, the TypeScript transpilation step, the JSX transpilation step, the bundling step, even the HTTP server, is all one tool, so we can add a lot of fast paths. And that also means it's a lot simpler, because there's a lot less stuff to install. We're going to make some changes to bun dev pretty soon to make it work better with our new bundler, but in the earlier versions of Bun, with bun dev, for just the most basic development there aren't dependencies other than React, whereas Create React App has at least hundreds of dependencies.
AJ_O’NEAL:
Yeah, and to their credit, because I love to rag on React, but to their credit, the last time I tried the latest version of Create React App, I think they've got it down to dozens of dependencies from hundreds.
JARRED_SUMNER:
Oh, that's great. Yeah, my information must be out of date. And...
AJ_O’NEAL:
Yeah, I know, it's fine.
JARRED_SUMNER:
And also, that's not to rag on Create React App. They made it a lot easier to get started creating React apps.
AJ_O’NEAL:
Being on the back end, I've got some pain. Little tangent here, but you know how React doesn't use HTTP status codes correctly? And so you end up having to break your 404 handler, and, you know, just those sorts of things. I've got heartburn from it.
JARRED_SUMNER:
I'm not actually familiar with that.
AJ_O’NEAL:
Yeah. So because React takes over the routing, you can't use a traditional web server that would handle 404s, because your 404s...
JARRED_SUMNER:
Oh, you're referring to the pattern of front-end apps where you load an HTML file as the fallback. Yeah, you can configure...
AJ_O’NEAL:
Yeah.
JARRED_SUMNER:
Most web servers should be able to handle that; you can just configure arbitrary routes. The web server should just be able to fix that. That's not a React problem.
AJ_O’NEAL:
I won't go down that tangent much further, but...
JARRED_SUMNER:
Hehehe
AJ_O’NEAL:
...we have status codes for a reason, and it's nice that when you reach a route that's unavailable, you get a 404.
JARRED_SUMNER:
Yeah, no, no, that's true. It's true. That's a bad hack, that that was ever a thing.
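Since this single-page-app fallback question comes up a lot, here's a minimal sketch of the pattern being debated, using Bun's built-in server. The paths and routes are placeholders; the point is that unknown API routes still return a real 404 while unknown page routes fall back to index.html for client-side routing.

```ts
Bun.serve({
  port: 3000,
  async fetch(req) {
    const { pathname } = new URL(req.url);

    // Unknown API routes get a real 404 status code.
    if (pathname.startsWith("/api/")) {
      return new Response("Not Found", { status: 404 });
    }

    // Serve a static asset if it exists on disk.
    const asset = Bun.file(`./public${pathname}`);
    if (await asset.exists()) {
      return new Response(asset);
    }

    // Otherwise fall back to the SPA shell so the client router can decide.
    return new Response(Bun.file("./public/index.html"), {
      headers: { "Content-Type": "text/html" },
    });
  },
});
```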
AJ_O’NEAL:
Yeah. But anyway, okay, cool. So, and are you... are you yourself... I mean, this seems strange. So also: Bun is written in Zig, hurrah.
JARRED_SUMNER:
Yeah.
AJ_O’NEAL:
That must be hard, because Zig is changing every weekend, is it not? Or have they gotten more stable?
JARRED_SUMNER:
It still has a lot of breaking changes.
AJ_O’NEAL:
So how often do you have to rewrite? I guess it's probably not too bad, but do you have to go rewrite function signatures for a dozen different functions in your code base, or something, on new releases, or...?
JARRED_SUMNER:
Most of the changes are in the build system, and we just don't use the build system a lot. Usually the changes are in the build system or the standard library. The language itself, they actually haven't made that many breaking changes to it recently.
AJ_O’NEAL:
Okay, cool.
JARRED_SUMNER:
Yeah, for us it's mostly that we just need to keep up to date.
AJ_O’NEAL:
So how do you get to be the type of person who is both programming in Zig and also concerned with faster build times for React apps?
JARRED_SUMNER:
Um, yeah, so immediately before Bun, I was building this kind of voxel game in the browser. The shell of the game was all a React app, but the actual game itself was a WebGL thing. I spent a bunch of time as a front-end engineer, much more time as a front-end engineer than as a systems engineer. I just remember all of that, and that's a lot of what inspired it.
AJ_O’NEAL:
Okay, well, that's actually scary, right? Because when you say, well, I've mostly been a front-end engineer but I'm dabbling as a systems engineer, alarm bells go off in my head. I'm like, oh no, security issues, crashes, like, ah. So...
JARRED_SUMNER:
What?
JARRED_SUMNER:
Oh no, like, I spent... there's definitely... yeah, even before that, I spent most of my time as a front-end engineer, but I also did a few iOS apps, and I did some Ruby on Rails stuff a while ago. But yeah, overall, Bun was actually my first project in Zig, and I learned a lot.
AJ_O’NEAL:
Okay. So help me feel better about this. Because Zig is basically like C, and C is incredibly, incredibly dangerous. And you know, if someone were to tell me that they were an amateur writing C programs, I would not want to use anything that they built. So why should I trust you to be writing a tool that hundreds of thousands of people, if not millions of people, over the next several months to years, are going to be running on their desktop? How do you make me feel better about this?
JARRED_SUMNER:
We spend a lot of time on stability work. The things I talk about the most are new features, but what I actually spend the most time on is making Bun more stable, fixing bugs, testing for crashes, and writing more tests. We run Valgrind a lot. We just spend a lot of time trying to make stuff crash, and then fixing it.
AJ_O’NEAL:
So our listeners, generally speaking, are probably not familiar with Valgrind. What is that?
JARRED_SUMNER:
Valgrind is a tool that checks for a large number of possible memory and security issues. Things like... in JavaScript, when you access an undefined value, undefined is a known thing; it'll say "undefined is not a function" if you try to call an undefined value. But in native code, that's undefined behavior, and it'll most likely segmentation fault. So native projects will often use Valgrind to know for sure whether or not they're accessing undefined memory in the code paths that were executed, as well as for a number of other checks, like bounds checks. Zig in general has a lot more runtime safety features and compile-time safety features than C in many cases. Like, in C you have pointers, and there's no distinction between pointers which are arrays and pointers which point to a single element. In Zig, these are represented in the type system, and you have bounds checking. You usually use slices, similar to slices in Go.
AJ_O’NEAL:
Okay.
JARRED_SUMNER:
And those have bounds checks. Zig will panic on integer overflow, integer underflow. It'll panic on alignment mismatches. It has a lot more safety checks built in than C.
AJ_O’NEAL:
Okay, okay.
JARRED_SUMNER:
Yeah. But this is still something that we're very concerned about, and I just spend so much time on it. Bugs are the thing that we spend the most time on.
AJ_O’NEAL:
Okay. Yeah. Because, you know, between Node, Go, and Rust, they're all memory-safe languages, and my understanding was that Zig is not memory safe. But it sounds like Zig provides you with... I guess in Node the memory safety is enforced by the just-in-time compiler; in the case of Go and Rust, it's enforced by the compiler. And then it sounds like in Zig it's enforced by the tooling, but not the compiler. So you've got the tooling to help you compile a program, and then you've got Valgrind to help you on the other side of it.
JARRED_SUMNER:
Yeah. Zig is not memory safe, but it provides a lot of tooling to make it better.
AJ_O’NEAL:
Okay, cool. Cool. And I'm actually a Zig aficionado. I have not earned any of my lizard stripes yet, but I hope to get there someday. It's, you know, it's just like, you've got to take the time to pick it up. But, okay, so this actually is a really good question then. So, you were predominantly a JavaScript developer; you were predominantly doing React, right? That's what we were saying.
JARRED_SUMNER:
Uh, yeah.
AJ_O’NEAL:
So what was the path that led you to systems? I mean, we're not just talking about back end, we're talking systems. Because there's back-end development, there's front-end development, and then there's systems development, and you were definitely on the systems side of things. So what was the bridge to that? And what was that like? Did you have other languages that you knew beforehand? Tell me about that process.
JARRED_SUMNER:
Yeah, so I had done a little bit of Objective-C before, and Objective-C has a lot of overlap with C, so it wasn't 100% unfamiliar, at least on the C side. Zig is a lot like C. But immediately before, I was working on this voxel game in the browser, and I just had to spend so much time on performance, because we were trying to do this big open-world multiplayer game in the browser, and you would build stuff, and there was a server part and a client part, and it was really complicated even just to get the voxels to render and to send the data to the browser and compact it.
AJ_O’NEAL:
So, another quick tangent: you've said voxel a few times. What does voxel mean?
JARRED_SUMNER:
So a voxel, you can think of like Minecraft, where a voxel is a 3D pixel, sort of. It's really just an XYZ coordinate, but the idea is, in Minecraft you have blocks, and they have different types, they have different textures, they have different properties. So voxel games are where the map is defined by these coordinates with some annotation in them. And what it means is that if you want to make a game where people can walk around, it's a lot of data, because you have a three-dimensional grid of like 16 by 16 by 16, or 32, some dimension. You have to send all that data to the browser, and the browser has to do some processing on it to render it client side. That process is called meshing.
AJ_O’NEAL:
Okay.
JARRED_SUMNER:
So it was really hard to make that work reasonably performantly, and it just kind of made me obsessed with performance. And then I got really frustrated with just how long the iteration cycle time was on a game, and I realized I was spending all my time on the build tooling for the game and not the actual game. So I was like, okay, well, what if I just focus on build tooling? And that led to Bun.
AJ_O’NEAL:
Okay. So I'm imagining you had to make a lot of very small tweaks, testing kind of esoteric things, like maybe switching multiplication to addition, or putting something in a hash map, or doing these little performance hacks. And then every time you want to make this small little one-line change to see, okay, is this going to give me the 10x increase I'm looking for, you're waiting there 30, 60, 90 seconds...
JARRED_SUMNER:
Yep. Yeah.
AJ_O’NEAL:
...before you get your game back up. Okay. And so that was the frustration that led to Bun. Okay. So do you eat? Do you live... you know, you're in a building. How does the money happen? How did it get from "I'm building a voxel game" to "I work on a build tool for my voxel game"? Is the voxel game still happening, or is that in the past now?
JARRED_SUMNER:
The voxel game is long gone. But yeah, so initially... I worked at Stripe for a little bit. And I quit Stripe after working there for not that long, because I was like, okay, I want to go work on ideas, basically. Really, I wanted to work on developer tools, and I felt ready to do a startup. So the voxel game was one of the things I tried. At that time I had a little bit of savings, so I was just living on savings and trying not to spend much money. Last summer we raised $7 million, led by Kleiner Perkins, after Bun's launch. But when we raised money... the same week that the check cleared, I got an email from the bank that was like, your account balance is $0. So it was very close to the edge.
AJ_O’NEAL:
Okay, so this brings up a question. Now, this is something that we could probably have a whole separate episode on, but in brief: you're raising money for a developer tool that's open source. Where is the catch, and what concerns do you have about the pressure that's going to be exerted from them?
JARRED_SUMNER:
Oh, I really did think of Bun from the beginning as, whatever I'm going to do, it's going to be a startup. But the specifics of how we're going to make money, and why it's not just an open-source product: we're going to do really fast edge hosting for JavaScript, a specialized cloud. We're starting to work on this pretty soon. But the basic idea is we're going to do our own cloud for JavaScript. It's not going to use existing clouds. It's going to be our own servers and data centers.
AJ_O’NEAL:
So you're, you're actually going to run your own servers.
JARRED_SUMNER:
Yeah.
AJ_O’NEAL:
God bless you.
JARRED_SUMNER:
And by specializing on JavaScript, we can make it much cheaper to do the hosting. And that means the actual pricing can be much better, and we can make it much faster. We really want to have deploy times that are in the single-digit, maybe low double-digit, milliseconds from push to deploy.
AJ_O’NEAL:
Oof! Wonder...
JARRED_SUMNER:
That's still aspirational at this point, because we haven't done it yet. Just to be clear.
AJ_O’NEAL:
Yeah, but that... I mean, that's one of my frustrations too. It feels like development has become so complicated that, yeah, you know, your npm run dev is 90 seconds, and then you want to deploy something and it's like 15 minutes before you can see the change. So I am actually working on something not entirely similar... that being, you know, super focused on JavaScript and that complete vertical integration. But I'm actually working with some buddies on some private hosting to try to relieve some of the cruft so that the hosting process is simpler and easier. And this is actually part of the question I had about the money, because we're doing privately owned: we're buying cheap old servers, we're hosting out of a privately owned data center where they bought the space long before the real estate went up in value, and so we don't have to pay exorbitant prices, because they already had it before it was worth so much. But with that, we have the freedom to be less expensive, because we don't have any investors that we have to please. We don't plan on offering a free tier, so we're not going to have that gimmick of, you know, try it for this many hours or this many whatever for free. Because if you have free customers, then inevitably the paid customers have to pay for the free customers. And then instead of getting economies of scale, where it's like one banana is $1 and two bananas are $1.90 and 10 bananas are only $5, you get this very perverse pricing where, you know, one banana is free, but two bananas are $5, and three bananas is "call us for enterprise pricing." So this is, you know, outside of the JavaScript realm, but how do you see a path where you can avoid that type of problem? I mean...
JARRED_SUMNER:
I think it's really important to give people a generous free tier. Because that's just how you get started, and it's also just what everybody expects at this point.
AJ_O’NEAL:
But then how do you keep the prices under control? Because if you've got a bunch of free users, and free users are typically your worst users. They're the ones that abuse the system. They're the ones that whine the most and take up the most support tickets. I mean, the free users are your absolute worst users, and they incur the most cost. And then the paying users, rather than just paying their fair share, generally have to pay exorbitantly. I mean, do you... Do you get what I'm saying there? Do you agree with that? Disagree?
JARRED_SUMNER:
I think you have some clear limits for the free tier. But I think the main thing is to make the underlying costs cheap enough on your end so that you can afford to have a free tier and afford to have a paid tier. Or at least, I'm really just saying what I'm thinking; I'm not really trying to give you advice. But if you make it cheap enough to run the service, then it's not a problem. And then you have to be really careful about abuse; the abuse is the hard thing. Like, for example, I think it's really good that... I don't know if Vercel still does this, they might, I don't know, but for a while you couldn't create an account using an email and password; you had to go through GitHub or GitLab login. And that decision is probably really good for handling fraud, because it's much harder to game GitHub than it is to game... like, the challenge with rolling your own auth kind of thing is that these other existing companies have spent an enormous amount of time on moderation and anti-abuse stuff, and so you can kind of lean on that.
AJ_O’NEAL:
You know, it's not as good as you think, because I did the GitHub thing for a service where I thought, oh, if I only allow people to log in with GitHub, GitHub will handle the bot users and it'll be okay. And it turned out that was not the case. I still got bots. I think I just got more sophisticated bots, which may have been worse in a way.
JARRED_SUMNER:
Well, yeah, there's no silver bullet, but...
AJ_O’NEAL:
Yeah.
JARRED_SUMNER:
I feel like... I don't really know yet, because we haven't really gotten that far. But my sense is there are probably just a lot of layers. Like, you see that, oh, there's some traffic going to such-and-such, and... I don't really know, but I feel like it's just a problem to monitor and solve as it comes up.
AJ_O’NEAL:
Yeah, yeah, sure. Okay. So I'll bring us back around to more of the original topic, because this is a tangent. I like the tangents; I hope you do too.
JARRED_SUMNER:
Yeah.
AJ_O’NEAL:
I hear from listeners often that they do like the tangents, but we try not to go too far too often anyway. So I'll reel this back a little bit. What was it? Oh, dang it. I just had something on the tip of my tongue and I lost it. Give me a second. We'll just edit out the awkward pause here. Oh, okay. So there's also the question of maturity. I love the sound of Bun. I love the mission of Bun. I'm highly skeptical and I'm really blunt, so I'm not convinced on the business model yet, but I don't really need to be, because you've got money; you succeed or fail as a business, and the tool still gets developed with that money. So that's great as far as I'm concerned. But Bun... I don't know what the general public perception of it is. When I look at it, I think this has got to be the best thing that's happened to JavaScript in the last 10 years, just on the face of it. However, when I've gone to use it... and I don't know if... I mean, you probably interact with so many people, you probably wouldn't have recognized me specifically. And I haven't done you the favor of getting back to you when you've gotten back to me in a timely fashion on a few things, because I'm bouncing around between projects. But every time I've used Bun, I do the weird stuff, right? So I'm on the backend. I actually do some work with an organization called Dash Incubator, where it's a lot of cryptocurrency stuff and hash algorithms and cryptography and Web Crypto and all this stuff. And so every single time that I've gone to use Bun personally, it's failed. And usually I've opened an issue, and usually within an hour or two you say, hey, that's fixed now, try it. And sometimes I have, and sometimes I haven't, but then immediately I hit up against another wall. Now, I don't think that that's the general person's experience, because it's generally a front-end tool more than a backend tool, although...
JARRED_SUMNER:
Um, that's...
AJ_O’NEAL:
It's good, but go on.
JARRED_SUMNER:
We've kind of gone in both directions. It was originally really more of a front-end tool than a backend tool; now it's sort of in between. But this general thing of what you're saying, yeah, Bun is a big yak shave. And that's basically why we aren't 1.0 yet: there's still a chain of things that need to be fixed, and this chain is a very, very long chain of things. But I think we're making a lot of progress. And basically, I expect that the experience that you've had is going to be fixed in less than six months.
JARRED_SUMNER:
Probably closer to like three months.
AJ_O’NEAL:
And again, I'm doing weird stuff, I am. Well, there was one really normal thing I did. I tried to use the Node HTTP get, I think it was, and there was a problem with the implementation of the prototype or something like that. So that was...
JARRED_SUMNER:
Yeah, I don't know.
AJ_O’NEAL:
...a normal thing, but most of the issues I've encountered have been weird, way-off-in-left-field issues. And to be fair, I have open issues in Node that still aren't fixed after nearly a decade, of just weird edge cases. There was a whole bunch of stuff I opened up in the TLS module in Node, and eventually I think every single one of those got fixed, but I literally couldn't use Node for a project because of it. And I don't know if you've encountered this, but Node created its HTTP layer first, and then it created the TCP layer after the fact and the TLS layer after the fact. And so there's duplicate code and spaghetti code between those three. So if you ever try to use a raw TCP object from the HTTP layer... it may be fixed now, I think they fixed this, but back around Node 10 or Node 12 or so, they were still completely separate, incompatible objects. Right. And I don't know if you've had to deal with any of that in Bun, or...
JARRED_SUMNER:
We actually have somebody who filed an issue that sounds a lot like this, like last week, and we need to fix it. In our case, the internals of these are really weird, because node:http in Bun actually just calls fetch, for the client part. So it's sort of backwards, because node:http is sort of a lower-level library and fetch is very high level. But because fetch is what we had previously implemented in Bun, node:http ends up sitting on top of that already-implemented HTTP client.
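To picture the two layers he's comparing, here's a minimal sketch: the web-standard fetch call that Bun implements natively, and the lower-level node:http client style that, per the discussion, ends up routed through that same client. The URL is a placeholder.

```ts
import http from "node:http";

// High level: the web-standard fetch API.
const res = await fetch("http://example.com/");
console.log(res.status, (await res.text()).length);

// Lower level: the classic node:http client callback style.
http.get("http://example.com/", (incoming) => {
  let body = "";
  incoming.on("data", (chunk) => (body += chunk));
  incoming.on("end", () => console.log(incoming.statusCode, body.length));
});
```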
AJ_O’NEAL:
So do you get fetch for free from JavaScriptCore? Or how does that work...
JARRED_SUMNER:
No.
AJ_O’NEAL:
...with stuff that's already implemented in the browser? How much of that do you get to borrow, or not?
JARRED_SUMNER:
We try to borrow as much as we can, because it means that we don't have to worry about it. This is a good thing for security, because they run fuzz testing, they do a lot of that, and it's stable in browsers, et cetera. So URL is an example of one that's directly from Safari slash WebKit that we've made basically no modifications to. Same with URLSearchParams, a lot of EventTarget and Event. Headers, mostly; I made some optimizations to Headers to make it faster, but that was the only...
AJ_O’NEAL:
Have you had any compatibility issues with those? Because I've noticed that different browsers implement URL differently, and that has kicked me in the butt at least once. Because I remember it.
JARRED_SUMNER:
There's something with, like, the file protocol that's vaguely different.
AJ_O’NEAL:
Yeah, it's like triple slash versus single slash or something like that.
JARRED_SUMNER:
Something like that. And then with Headers, the most annoying thing, and this is not a difference between browsers, is that on the server you really want the Set-Cookie header, and you can have multiple of them. That's allowed; it's the only HTTP header that's allowed to appear multiple times like that, but the browser API for Headers is supposed to merge all the headers together. So we had to make an exception for it, and they actually made a recent change to the spec to allow this; it was very recent. But yeah, we don't actually take everything from the web APIs. Fetch is a hard one, because we don't want to also include all of the WebKit networking stack stuff. So fetch is custom-implemented; even the HTTP client itself is custom. Which has been good for performance, but it also means we have to be very careful about what we do in-house, because it gets harder as we do more.
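A small sketch of the Set-Cookie wrinkle, with made-up cookie values. The classic Headers accessor merges the values into one comma-joined string, which is exactly the problem for cookies, while the newer getSetCookie() method (the recent spec addition he's likely referring to) returns them separately where it's supported.

```ts
const headers = new Headers();
headers.append("Set-Cookie", "session=abc123; HttpOnly");
headers.append("Set-Cookie", "theme=dark; Path=/");

// The traditional accessor merges multiple values with ", ", which is
// unsafe for Set-Cookie because commas can appear inside cookie values.
console.log(headers.get("Set-Cookie"));
// "session=abc123; HttpOnly, theme=dark; Path=/"

// The newer accessor keeps each Set-Cookie header separate.
console.log(headers.getSetCookie());
// ["session=abc123; HttpOnly", "theme=dark; Path=/"]
```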
AJ_O’NEAL:
So how much of it are you implementing down in Zig and how much of it are you implementing up in JavaScript?
JARRED_SUMNER:
Um, there's very little... There's the node:http compatibility, like the Node compatibility stuff. A lot of the Node built-in core modules, those are all JavaScript for the most part, except for Buffer. But pretty much everything else is in Zig, excluding ReadableStream. Yeah, it's mostly Zig. We spend a lot of time on the bindings layer: how do you efficiently, and also more safely, generate the code to interact with JavaScript directly. It's very complicated, like doing type tracking, at least for the native code side of things, because you can have a string, you can have a class that extends String, you can have a String object, you can have a Number object. I could go into a really long tangent about this, but it's all really complicated. And it's also very performance sensitive, because of the specifics, especially for strings, of how you interact with strings. The rest of the world uses UTF-8 for the most part, unless you're Windows or Java, but JavaScript strings can either be Latin-1 or UTF-16, so you have to constantly do these string conversions back and forth. And then JavaScript strings themselves can either be rope strings, which are like you did A plus B as strings, or they can be the joined kind. But there are various cases where, to actually read the string, you have to combine them, and a lot of the time you're not actually doing that much with that string, so you're just cloning the string again. Yeah.
AJ_O’NEAL:
I guess... so I didn't understand a hundred percent, but I think...
JARRED_SUMNER:
That was very tangenty.
AJ_O’NEAL:
No, it's fine. I personally like the lower-level details; I like to understand how things are happening under the hood. So my understanding with JavaScript strings has been that using .join is normally what you should do, and using the plus operator is what you should normally avoid. And then I don't know about the backtick operators. Is that...
JARRED_SUMNER:
The plus operator is totally fine. I would say that the backtick is, too.
AJ_O’NEAL:
I thought that it got to, like, N squared or something like that, because if you use join, that allocates the memory ahead of time, but if you do plus, plus, plus, plus, then it has to allocate the memory between each evaluation, because it evaluates one and then allocates more memory and evaluates the next one, and then allocates and then evaluates again. But that may be from, like, the Internet Explorer 6 days. I mean, some of the knowledge I have is deep knowledge that's just wrong.
JARRED_SUMNER:
Well, yeah, so it really depends. The engine can at any point decide to join the rope string. But the nice thing about the rope string, using the plus operator, is that most engines will continue to track the length without having to join the strings. So in a lot of cases, I think, people will concatenate the strings together, and then they might check the length, but they don't actually care about the full value until they pass it to some other API. And that's the point where we do the conversion.
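A minimal illustration of the behavior he's describing. The rope representation isn't observable from JavaScript, so the comments only note what engines typically do under the hood; the data is made up.

```ts
// Repeated `+` concatenation can build a "rope" internally: the engine may
// keep the pieces separate while still tracking the total length.
let html = "";
for (const item of ["apples", "bananas", "cherries"]) {
  html += `<li>${item}</li>`;
}

// Checking the length typically does not force the pieces to be joined.
console.log(html.length);

// Passing the string to an API that needs the raw bytes (encoding to UTF-8,
// writing to a socket, etc.) is the point where it has to be flattened.
const bytes = new TextEncoder().encode(`<ul>${html}</ul>`);
console.log(bytes.byteLength);
```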
AJ_O’NEAL:
Okay. And that's something that you handle? Like, when you're parsing it, you figure this out? This is not just something you pass off to the JIT optimizer in JavaScriptCore?
JARRED_SUMNER:
Usually it's just the engine itself, but in a couple of cases... In one particular case, one of the patches I added to our WebKit fork is to make it so that when we need to, we can iterate over the rope string directly. An example of where that's useful is in the TextEncoder implementation. If you're converting the text to UTF-8 and you have a long string, it's really wasteful to join the string as a UTF-16 or Latin-1 string and then immediately encode it, because that's copying the contents twice. So one optimization was to iterate over the rope string and do the encoding without the extra duplication.
AJ_O’NEAL:
Okay, so on the note of performance, there were two questions I had that I skipped asking earlier. I'll go with this one first, since we're right here. Where do you think the biggest gains have come from? It sounds like one of them is that you've defined a superset language that encompasses everything that would need to be parsed for JSX and TypeScript, so that you can basically take anybody's JavaScript-ish language and handle it in one pass. Sounds like that was one. What have been the other big heavy-hitter optimizations?
JARRED_SUMNER:
Mostly, I think, a lot of it is just spending a lot of time profiling and being really careful about memory allocations. And this is something that Zig is really good at, because in Zig you explicitly pass a memory allocator whenever you need to allocate memory, so we can see everywhere we're allocating. And then, depending on the use case, we use a bunch of different strategies for how we allocate less memory and how we control exactly when we free the memory. But I would say a lot of it is also just system calls. So, for example, in Bun, in the HTTP server, you can do return new Response(Bun.file(path)), with the web API Response object. Normally when you stream a file to the client, you're reading the file, reading the next, like, 120 kilobytes or whatever, and then writing it back. When you use Bun for this, it actually uses a different system call. The system call is called sendfile, and what it does is move that copying, that read step and that write step, into the kernel. So it avoids the extra round trip through your application in user space, and that makes it much faster at streaming files. It also makes it simpler, because we don't have to allocate that, you know, 64 or 128 KB or whatever, and then send it back. And we do a similar thing when we copy files: on macOS, it'll automatically use this kind of obscure system call, clonefile.
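Here's what that return new Response(Bun.file(path)) pattern looks like in a handler; the file path and content type are placeholders. The sendfile fast path he describes is an internal detail the caller never sees.

```ts
Bun.serve({
  port: 3000,
  fetch() {
    // Handing the runtime a file-backed Response lets it stream the file
    // itself. Per the discussion, on supported platforms it can use the
    // sendfile system call, so the bytes move inside the kernel instead of
    // being read into the app in chunks and written back out.
    return new Response(Bun.file("./public/video.mp4"), {
      headers: { "Content-Type": "video/mp4" },
    });
  },
});
```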
AJ_O’NEAL:
Ugh.
JARRED_SUMNER:
And what's really nice about that is it's zero copy. So it reuses the same blocks on disk. The statistics on the disk will still say you used twice as much disk space or whatever for copying, but at the actual underlying level it's really fast. And it also works with directories too. And so we ended up using that.
AJ_O’NEAL:
Have you had any bugs related to that, specifically on macOS?
JARRED_SUMNER:
Clone file... Basically, for all of these optimizations, you always have a fallback path, where it's like, okay, this is the fast path, we're going to try to use it as much as we can, and we're going to design the API so that by default people use the fast way. But then if that doesn't work, we have to have a fallback, and that fallback is the slow way.
AJ_O’NEAL:
Well, the reason I ask is because on Mac OS, if it uses the clone file and you have both references open and you edit the original and save it, the clone will get updated.
JARRED_SUMNER:
No, that's hard links. That's different. Uh... Hard links...
AJ_O’NEAL:
Not hard links. Well, hard links... I mean, maybe I'm mistaken on this, but hard links, that's the old Unix way of doing things.
JARRED_SUMNER:
Yeah, clonefile is different from hard links. What you're describing happens with hard links: you have a hard link, and you modify one reference to it, and it modifies the other references. If you use clonefile, it doesn't.
AJ_O’NEAL:
So, I mean, this is tangential, but you can do this with screenshots. If you take a screenshot of something, duplicate the screenshot, and edit the original, then the duplicate will be changed. But from that point forward, if you change them, they will change independently. It's just when you first duplicate it. And I don't believe that's a hard link; I believe that's the clone file, because this was introduced in macOS... I think that's when they introduced it, something like that.
JARRED_SUMNER:
Yes.
JARRED_SUMNER:
I'm not sure about the specifics with screenshots, but there's definitely a flag you can pass to cp. I don't remember the name of the flag; it might be dash F, I don't remember. But yeah, clonefile specifically just clones the contents in, you know, copy-on-write fashion. And there's a similar thing you can do on some Linux distros; there are some Linux file systems that support this as well. But what's nice on macOS is that clonefile also supports directory trees. So that was one of the ways we made bun install much faster than npm and Yarn: it's one system call to recursively clone an entire directory tree. In a lot of cases, bun install is 5x, sometimes 10x faster on macOS because of that.
AJ_O’NEAL:
Okay. So do you use... because one of the things that took Node to the next level was when Microsoft infused it with some investment and they created libuv. Do you use any of libuv in Bun? Or are you doing it separately... I mean, it sounds like you're making your own implementation that's libuv-like, but on the most modern operating system calls.
JARRED_SUMNER:
Basically the latter. We aren't directly using it. I do read the libuv code, but we don't use libuv at all. We basically use kqueue and we use epoll and we use the various APIs directly, and we just spend a lot of time reading about system calls.
AJ_O’NEAL:
Cool. So another question I had related to performance: it seems like one of the biggest, most complicated time sinks is Sass, because it always breaks with every Node version. We have a development instance that has four gigs of RAM and runs single core, but I think that's all you can do in Node anyway, and it'll take several minutes to compile the React app, and the majority of that time is spent in Sass. Does Bun do anything for that? Or are you just adopting users that aren't using that, or what? Why is that...
JARRED_SUMNER:
Um...
AJ_O’NEAL:
...not a problem for you?
JARRED_SUMNER:
I would say we're probably not going to help with that very much, just being totally honest. Basically, for Sass in particular, they have the C++ implementation, which, as far as I understand it, is deprecated, and now they have a newer Dart implementation. I'd actually be curious to see what the performance looks like, but I imagine it's not actually going to be very different, because it's running Dart in JSC versus V8. JSC being JavaScriptCore, which is the engine that Bun uses.
AJ_O’NEAL:
Well, the primary version of Sass today is actually a self-bundled Dart application, so it ships with the Dart JIT. And because of the type hinting and stuff, the Dart JIT is faster than a JavaScript JIT, because the type hinting is built all the way through the JIT rather than just, you know, being disregarded at the higher level.
JARRED_SUMNER:
Okay. That makes sense.
AJ_O’NEAL:
But it sounds like you are able to take advantage of some of the type hinting and some of the optimizations you're doing?
JARRED_SUMNER:
No,
AJ_O’NEAL:
No? Oh, okay.
JARRED_SUMNER:
No, unfortunately. This is a thing I think is really interesting, and it would be an interesting direction. But there are really two challenges there. One is that TypeScript's types aren't specific enough to be useful to an engine. Instead of saying just "number," you would need to say, this is a 32-bit integer,
AJ_O’NEAL:
Okay.
JARRED_SUMNER:
or it would need to be, this is a 64-bit float, a double. And it's pretty tedious to write types like that, at least compared to the status quo. So basically TypeScript itself wouldn't be useful for this. And I wouldn't bet there's a high chance this ever happens, but it wouldn't be impossible in the future for JavaScript to have some kind of types mode. Like, you know, there's "use strict"; maybe you'd have "use types" or something, and then you could specify very specific type definitions. But this is not something we're working on or really thinking about. I think it'd be really cool, and it would really help JavaScript be a lot faster, assuming that engines would support it and invest enough time into it. But, yeah.
AJ_O’NEAL:
All right. Well, I'm super interested in this project, and I really want to take more advantage of it. It's just been, like I said, I'm doing weird stuff. In everything I've ever used... I mean, even when I started using Go, I immediately hit up against one of their weird HTTP issues. I'm always doing weird stuff in everything I try; I typically run up against some sort of internal implementation bug that very few people have discovered, because I do weird things. But I am interested to give Bun another try, because as far as I'm aware, every issue that I've opened has since been addressed, and I just need to actually go run one of the projects again and find out, okay, is everything actually working now? Which, like I said, I think at this point it would be. Because I'm very strong on what you're doing, and I hope to see this really overtake Node. Node is... it's absolute garbage, and it's that way because of historical reasons, not because anyone necessarily made dumb choices. I mean, of course we all make dumb choices sometimes, but it's more the history of it than anything. And you're doing this very intentionally, it sounds like, whereas Node was very haphazard, and then money started getting thrown at it, and I don't know whether that helped or hurt initially. It sounds like your plan is much more methodical. So I'm really excited about this. What do you think the stability is at this point? Do you encourage people, like, yeah, start using this? I mean, Node at 0.4 was being used in the enterprise, right? Which is crazy. But where's your level of confidence and invitation for people? Is it, hey, come have at this, this is ready to go, we're just fixing bugs? Or is it, no, no, really, you know, hold off a bit, don't be too adventurous? Where's your feel on your project?
JARRED_SUMNER:
I would say we're somewhere in between. It's definitely not 1.0, but we're getting a lot closer. I would be surprised if we don't have a stable release within a couple of months, but it is still not stable today. However, I think in a lot of cases you can try it, and you could use it in a small project. And there are also companies using it in production, in some cases with a decent amount of traffic. But we're still very much fixing stuff.
AJ_O’NEAL:
All right. And then I think I've maybe got one more question, and we're up against time now, so we'll probably bring things to a close. I'll give you another opportunity to say anything else you want to say. But the one other question that I had from a little bit earlier: you were talking about using specific underlying APIs, and you're also talking about how this is a Node drop-in. And I've looked over the documentation a little bit; I've seen that you have these different APIs. It's like, you can use the Node API, and maybe that's not going to be as fast because it has to do some of this extra ceremony, or you can use the Bun API. But the Bun API in many cases looks to be synchronous, which makes a lot of sense for build tooling, but doesn't make a lot of sense in my brain for the server side of things. So what's the spectrum there, between the native Bun methods, whether they're sync or async, versus the wrappers for the Node methods?
JARRED_SUMNER:
Yeah, I think Node is too async. And what I mean by that is a lot of the time, the overhead from doing some work asynchronously exceeds the time it takes to actually do the work synchronously.
AJ_O’NEAL:
Yeah, I've experienced that.
JARRED_SUMNER:
And so, like in Node, in a lot of cases readFileSync is faster than readFile. There are cases where that's not true, for example if you're dealing with a network file system or if your hard drive is just really slow. But in the majority of cases...
AJ_O’NEAL:
Or if it's a large file versus a small file.
JARRED_SUMNER:
Yeah. If it's several megabytes... The threshold is something like, on Linux, if you have an SSD and it's a fast drive, the threshold is something like a megabyte or so. So in Bun, when you use Bun.file and then .text() or whatever, it actually will look at the file size and then decide, okay, is this going to be faster to do synchronously or asynchronously? And then it will choose which one to do based on that heuristic.
AJ_O’NEAL:
So does it expose the async API and just resolve it on the next tick, then?
JARRED_SUMNER:
Well, in this case, Bun.file().text() returns a promise. So that one is actually async. But in the case when it is run synchronously, it does the equivalent of returning Promise.resolve(value). Yeah.
AJ_O’NEAL:
Okay. So the optimization there is that the operation doesn't go into the thread pool. It just does it right away, and then it puts it in the next tick of the event loop, so from the perspective of a programmer you feel like it's asynchronous, but from the perspective of the way the machinery works, it's just loading that in the next event cycle. Which means that when this returns, that event cycle executes, and there's never actually any polling or waiting in real time.

JARRED_SUMNER:
Exactly.
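To make the heuristic concrete, here is a minimal sketch in TypeScript using Node's fs APIs. This is not Bun's actual implementation, just the shape of the pattern described above; the threshold constant and the function name are illustrative only.

```ts
import { readFileSync, statSync } from "node:fs";
import { readFile } from "node:fs/promises";

// Illustrative cutoff only; the real threshold is platform- and disk-dependent,
// roughly "about a megabyte on Linux with a fast SSD" per the conversation.
const SYNC_THRESHOLD_BYTES = 1024 * 1024;

// Async-shaped API: callers always get a promise, but small files are read
// on the calling thread and handed back as an already-resolved promise,
// so there is no thread-pool round trip or extra waiting.
function readWithHeuristic(path: string): Promise<Buffer> {
  const { size } = statSync(path);
  if (size < SYNC_THRESHOLD_BYTES) {
    return Promise.resolve(readFileSync(path));
  }
  // Large file: fall back to the genuinely asynchronous read.
  return readFile(path);
}

// Usage looks identical either way:
// const contents = await readWithHeuristic("./package.json");
```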
AJ_O’NEAL:
That's awesome. That is absolutely genius. Cool. All right, well, I don't have any more questions right now. What do you want to let people know about this? And you! We didn't really talk much about you, unfortunately.
JARRED_SUMNER:
Well, no, that's fine. Yeah, I think we're gonna ship this new bundler pretty soon and I'm really excited about it.

AJ_O’NEAL:
Well, tell me about it.

JARRED_SUMNER:
Because it's something like, in most cases, somewhere between 30% and 50% faster than esbuild, and esbuild plugins run five times faster. And it's a built-in API, Bun.build. I think it's just gonna make it a lot easier to build front-end apps, really full-stack apps, because it's a bundler directly integrated into the runtime. Instead of having a bunch of different tools, you can just have one.
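At recording time the bundler API hadn't shipped yet, so treat this as a rough sketch of what a programmatic call to the integrated bundler looks like; the entrypoint and output paths are made up for illustration.

```ts
// Bundle a front-end entry point from inside the runtime itself,
// with no separate bundler process or config file.
const result = await Bun.build({
  entrypoints: ["./src/index.tsx"], // illustrative path
  outdir: "./dist",                 // illustrative path
  minify: true,
});

if (!result.success) {
  // Build diagnostics are returned on the result rather than thrown.
  for (const message of result.logs) {
    console.error(message);
  }
}
```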
AJ_O’NEAL:
And so again, this is benefiting from the concept that you're parsing everything once, and then, I guess, handing that AST around to the various tools. How are you pipelining that? Because there's this idea of separation of concerns, right? We want our one parser to do its one thing, each part to do its one thing, and that makes things modular, it makes them easier to test. So...
JARRED_SUMNER:
Um, the way that works internally for the transpiler stuff is, at Bun's compile time we generate, I think, 12 versions of the JavaScript parser. This is one of the nice features of Zig, it has really good compile-time features. And it's a matrix of flags: is TypeScript enabled, is JSX enabled, and then a few others. So it's not two different transpilers, and it's not step by step in that way. Technically it does it in two passes. And then the bundler just directly reads the AST from the transpiler.
AJ_O’NEAL:
And so you're sharing the AST, the abstract syntax tree, around, which is the parsed representation of the text file. At that point, is it pure JavaScript AST, or is it still JSX annotations and TypeScript annotations?
JARRED_SUMNER:
When the bundler sees it, the JSX is gone. It's all in the converted format, like function calls.
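For example, a JSX element is lowered to a plain function call before the bundler ever sees it. The exact helper depends on the configured JSX runtime, so this is a representative transform rather than Bun's literal output, and the Greeting component is made up for illustration.

```tsx
// Hypothetical component, purely for illustration.
import { jsx as _jsx } from "react/jsx-runtime";

function Greeting(props: { name: string }) {
  return <p>Hello, {props.name}</p>;
}

// What you write:
const written = <Greeting name="Ada" />;

// Roughly what reaches the bundler after transpilation (automatic JSX
// runtime shown): the JSX syntax is gone, leaving an ordinary call.
const lowered = _jsx(Greeting, { name: "Ada" });
```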
AJ_O’NEAL:
Okay, cool. All right, anything else?
JARRED_SUMNER:
No.
AJ_O’NEAL:
All right. Well, it has been absolutely wonderful to have you on to discuss this. I'm really grateful to be able to learn so much about the implementation and the roadmap and everything. I hope that our listeners got a lot out of it and will check it out, because it sounds like, as a Node replacement, it's not 100% stable yet, but if you're doing front-end bundling, that sounds like where there's a huge amount of value that's readily accessible right now. Is that correct?
JARRED_SUMNER:
We're about to ship the bundler. So yes, once that's shipped, I think it'll be really good. Another really good way to use Bun right now is bun install. It's a really fast NPM client. You can replace yarn or npm with bun install, and you can expect somewhere between 10 and 50 times faster package installs.
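A rough sketch of the drop-in workflow being described, for readers who haven't tried it; the dependency name is just an example.

```sh
# In an existing project that already has a package.json:
bun install        # installs into node_modules and writes Bun's own lockfile
bun add zod        # add a dependency, similar to `npm install zod`
```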
AJ_O’NEAL:
Okay. And this is compatible with NPM? Because the one reason I don't use yarn is that yarn is only marginally better than NPM, but it causes all sorts of things to break. Is bun something where you use either bun or NPM? I mean, if one person were to use NPM and somebody else were to use bun, would they get the same working results? Or do you need to opt in to bun, like you have to opt in to yarn?
JARRED_SUMNER:
Bun also installs to a node_modules folder. We do have a different lockfile, but we tried really hard to be compatible with NPM. And generally...

AJ_O’NEAL:
So if something's not working, it's a bug, not a feature.

JARRED_SUMNER:
Yes, if something's not working, it's a bug, and please file an issue.

AJ_O’NEAL:
Because in yarn, when it doesn't work, that's a feature. And that, I just... ah.
JARRED_SUMNER:
The big compatibility issue we have with bun install right now, that we're going to address, is we don't run post-install scripts for dependencies. This is a security and a performance feature, but it's also an issue because there are real use cases for post-install. Our plan there is to introduce an optional privileged dependencies key in package.json, where you can allow those packages to run their post-install scripts. So I think that will be a good compromise.
AJ_O’NEAL:
Oh, I like that. Do you think you'll be able to get NPM in on that? Because I would imagine, if I want this to work in both environments, I would have to have a package listed in the dependencies list and then duplicated in the privileged dependencies list, unless NPM adopts it.

JARRED_SUMNER:
Um, the thinking is that the privileged dependencies would be a top-level array, actually. So it would just be the names of the packages that you allow, not a separate object.

AJ_O’NEAL:
Okay, cool. And it wouldn't come into conflict.

JARRED_SUMNER:
Yeah. And since NPM always runs post-install unless you explicitly tell it not to, it shouldn't cause any issues there.
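For concreteness, here is a sketch of what that might look like in package.json. The key name follows the proposal described in this exchange and hadn't shipped at recording time, and the package names are purely illustrative.

```json
{
  "name": "my-app",
  "dependencies": {
    "esbuild": "^0.17.0",
    "sharp": "^0.32.0"
  },
  "privilegedDependencies": ["esbuild", "sharp"]
}
```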
AJ_O’NEAL:
Cool. All right, yeah, that's awesome. So where do people learn more about BunJS? Where do they download it? What should they be following?
JARRED_SUMNER:
You can go to bun.sh to download Bun. We do have a Twitter account, but it's mostly my Twitter really right now. But just go to bun.sh and get Bun.

AJ_O’NEAL:
Okay. And then you on Twitter are...

JARRED_SUMNER:
Jarred Sumner, but I feel weird promoting my own Twitter. We should have it.

AJ_O’NEAL:
No, it's part of what we do. Don't be weird about it. Don't be weird about it. So it's at J-A-R-R-E-D-S-U-M-N-E-R, is that correct?

JARRED_SUMNER:
That's correct. Yeah.
AJ_O’NEAL:
Okay, cool. Yeah, well, that's where I hit you up to get you on here. So, you know, if people are really angry and they need to know who to tweet-storm at, they need to know, you know? No, I don't think that's the case. All right, cool. Well, yeah, it's been great having you. Thanks for coming on. We have one more section, which I don't know if you saw before you came on the show today, but we do this thing called picks, where you can just talk about anything cool that is going on. So Chuck normally picks a board game, Steve always has a couple of dad jokes, and I usually talk about something that I've read or bought or something. You just pick two or three things that you are interested in sharing with people. It doesn't need to be tech related, it totally can be tech related, it's whatever you want. It's just your moment to pick something that you think is neat, that you think people should know about. So I can go first to give you some time to think on that. Or if you want to, you can go ahead and go now, whichever way.
JARRED_SUMNER:
You can go first.
AJ_O’NEAL:
Alright, so what am I going to pick? I actually hadn't thought about this before, but I did finish Skyward by Brandon Sanderson, and I was not disappointed. I was kind of hard on him, like, look, this dude writes way too many books, I'm just going to stick to the Cosmere. But then I signed up for the stupid Year of Sanderson and he tricked me and sent me materials that were not Cosmere stuff. So I got this little model fighter pilot thing, and I'm like, oh no, now I have to go get the book that this thing from the monthly or quarterly package is about. So then I got the Skyward series, book one, which I think is just called Skyward. And it's a young adult series. It's more young adult than, say, Mistborn or some of the other things he has. It's much more suitable for younger kids. I don't think there was any swearing in it at all, if I remember correctly. I could be wrong, there might have been one "hell" or one "damn" somewhere in there, but it was very kid friendly. It was very light. Lighthearted isn't quite the right word, because it is about an alien species invading a planet where post-apocalyptic humans have taken up residence, but it was, I don't know, more on the young adult side than say Mistborn. But it was still completely enjoyable, and I'm looking forward, regrettably, to now being sucked into books two and three and four, because there are a total of four books in the mainline series, and then a couple of novellas, the typical tangent that he does. And the final book, I think, comes out in a couple of months. So for people that are not big on starting series that haven't been finished, this one's pretty safe, because it's in either the final revision process or the pre-publication process right now. So I'll pick Skyward from Brandon Sanderson as my cool thing to check out, as well as what I was talking about before: privately owned cloud hosting, BNNA, at bnna.net. We should have that rolling here shortly. We got our first server up in the data center, and that's just for us to test and play around with. We found out that we ordered the wrong switch, we were off by one letter of a model number, so we are reordering that, and there are a couple of other things. So I was kind of hoping that within a week or two we would start accepting our first customers, but it's probably going to be closer to yet another month out. But that is coming. So if you're interested in privately owned cloud hosting, we do have bnna.net where you can just drop your email.
JARRED_SUMNER:
I think my pick is, this is kind of random, but the Text Power Tools VS Code extension. It's really nice. I use it a lot because, with the multi-cursor setup in VS Code, it lets you insert incrementing numbers. Like for test data, if you want a hundred lines numbered one through a hundred, you press enter twice and then you have a hundred lines that start with incrementing integers. And there's a bunch of other commands, like one to remove duplicate lines, sort lines ascending, sort lines descending. It's just a random thing I use a lot.
AJ_O’NEAL:
That's awesome. What's it called again?
JARRED_SUMNER:
Text power tools.
AJ_O’NEAL:
Text Power Tools. Cool. Do they have it for Vim as well?
JARRED_SUMNER:
Probably. I don't know Vim very well at all, but I feel like Vim probably has something built in. I would be surprised if you need an extension for Vim, but it's probably more complicated.
AJ_O’NEAL:
You wouldn't. Well, to get the incrementing numbers you'd need to do something for that. But if you just wanted a hundred of something, you just type a hundred and then type your letter, and it will print it out a hundred times. And then you yank it, do a hundred, and you can get a hundred lines. But...
JARRED_SUMNER:
Hehehe
AJ_O’NEAL:
yeah, that sounds cool. That sounds cool. Anything else?
JARRED_SUMNER:
Um, I've already talked about bun a lot. So, uh,
AJ_O’NEAL:
Well, you got to plug it again.

JARRED_SUMNER:
But yeah, you should check out Bun. Oh, another thing I didn't talk about is we also have bun run, which is a really fast way to run package.json scripts. It's similar to npm run or yarn run, but it's something like double-digit times faster, because the minimum time for it to run a script is on the order of four or five milliseconds, whereas npm run by itself will take something like 200 milliseconds. And that works with package.json scripts and also with node_modules/.bin, so you don't have to use the runtime or anything to use that.
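A quick before-and-after sketch of what that looks like in practice; the "dev" script name and the eslint binary are just examples from a hypothetical project.

```sh
# Instead of:
npm run dev

# With Bun's script runner:
bun run dev

# Executables installed into node_modules/.bin work the same way:
bun run eslint .
```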
AJ_O’NEAL:
Cool. All right. Well, again, thanks for coming. Listeners, thanks for joining in. I don't know if there is any of our other normal stuff that I forgot, because usually I'm not the host of the show. I'm just the antagonist that comes up with the hard questions when needed. But I think we're good to wrap it up. So we'll catch you later. Adios.
JARRED_SUMNER:
Thanks for having me.
AJ_O’NEAL:
It's great to have you. OK, now I'm going to stop the recording here.