Hey folks, I'm a super busy guy and you probably are too. You probably have a lot going on with kids going back to school, maybe some new projects at work. You've got open source stuff you're doing or a blog or a podcast or who knows what else, right? But you've got stuff going on and if you've got a lot of stuff going on, it's really hard to do the things that you need to do in order to stay healthy. And one of those things, at least for me, is eating healthy. So when I'm in the middle of a project or I just got off a call with a client or something like that, a lot of times I'm running downstairs, seeing what I can find that's easy to make in a minute or two, and then running back upstairs. And so sometimes that turns out to be popcorn or crackers or something little. Or if not that, then something that at least isn't all that healthy for me to eat. Uh, the other issue I have is that I've been eating keto for my diabetes and it really makes a major difference for me as far as my ability to feel good if I'm eating well versus eating stuff that I shouldn't eat. And so I was looking around to try and find something that would work out for me and I found these Factor meals. Now Factor is great because A, they're healthy. They actually had a keto line that I could get for my stuff and that made a major difference for me because all I had to do was pick it up, put it in the microwave for a couple of minutes and it was done. They're fresh and never frozen. They do send it to you in a cold pack. It's awesome. They also have a gourmet plus option that's cooked by chefs and it's got all the good stuff like broccolini, truffle butter, asparagus, so good. And, uh, you know, you can get lunch, you can get dinner. Uh, they have options that are high calorie, low calorie, um, protein plus meals with 30 grams or more of protein. Anyway, they've got all kinds of options. 
So you can round that out, you can get snacks like apple cinnamon pancakes or butter and cheddar egg bites, potato, bacon and egg, breakfast skillet. You know, obviously if I'm eating keto, I don't do all of that stuff. They have smoothies, they have shakes, they have juices. Anyway, they've got all kinds of stuff and it is all healthy and like I said, it's never frozen. So anyway, I ate them, I loved them, tasted great. And like I said, you can get them cooked. It says two minutes on the package. I found that it took it about three minutes for mine to cook, but three minutes is fast and easy and then I can get back to writing code. So if you want to go check out Factor, go check it out at factormeals. Head to factormeals.com slash JSJabber50 and use the code JSJabber50 to get 50% off. That's code JSJabber50 at factormeals.com slash JSJabber50 to get 50% off.
STEVE_EDWARDS: Hola from the Portland, Oregon area.
JC_HIATT: We have AJ O'Neal.
AJ_O’NEAL: Yo, yo, yo, coming at you live from sunny Provo.
JC_HIATT: And today's guest is Paige Niedringhaus. Wanna say hello?
PAIGE_NIEDRINGHAUS: Hey everybody, Paige from Atlanta, Georgia.
JC_HIATT: Tell us a little bit about your background and how you got here.
PAIGE_NIEDRINGHAUS: Sure. I've been a developer now for about three years, full-time. Before that, actually, I was working in digital marketing right after I graduated from college. So I made the switch, joined a coding boot camp, and I was hired right after that to work at Home Depot. And I've been here ever since, first as a software engineer and now as a senior, just in the last couple of months. And it's been an incredible journey. I'm so, so glad that I made that switch.
JC_HIATT: Yeah, nice. And congrats on the promotion.
PAIGE_NIEDRINGHAUS: Thank you.
JC_HIATT: So I know we've got a few kind of bullet points of discussion here, but just as a high level, what are you most excited about from this new release?
PAIGE_NIEDRINGHAUS: Well, at least in the beginning, you won't have to switch everything to import. You may have to change some of your file endings, though, since they're going to have different file extensions. So ES modules will end with a .mjs extension and CommonJS files will be .cjs. With CommonJS, you'll still be able to use require, module.exports, all that syntax that you're used to. You just may need to change some of the file extensions so that Node can correctly pick up which module type it's supposed to be using.
AJ_O’NEAL: But what about, like, when I've got something I'm writing and somebody wants to use it, are they not gonna be able to use it? Cause it's CommonJS and they're writing an .mjs file? I mean, is this gonna be like bifurcation? You're either on one side of the fence or the other?
PAIGE_NIEDRINGHAUS: Maybe for a little bit. I'm not exactly sure how they're going to handle that, but I know that they are, you know, trying to make it work for everything.
AJ_O’NEAL: Sorry, I derailed us. Y'all go back to what you're doing there. I'll just be over here preaching gloom and doom.
PAIGE_NIEDRINGHAUS: So that's one of the things that I'm most excited about with what's coming in Node 12, but there are some great enhancements that they're doing in other areas as well. They've improved the startup time. With Node 11, we got about a 60% improvement in worker thread startup time by using the built-in code cache support, but with Node 12, they've managed to speed it up by another 30%. So applications are going to get to users faster and they're going to have better experiences. It's really pushing the envelope in terms of trying to be as dynamic and quick as possible, which is pretty cool.
AJ_O’NEAL: So does this mean modules are going to load faster or just the node binary is loading faster?
AJ_O’NEAL: What is the code cache? This is something that I'm not familiar with.
JC_HIATT: Yeah, I'm unfamiliar with that too.
PAIGE_NIEDRINGHAUS: Let me see if I can find a good example of how to describe it. I think it's just how they cache the built-in libraries that Node comes prepackaged with, or any of the libraries that you end up requiring in your Node application. I might have to get back to you on exactly what that is, because I didn't do a huge deep dive into it.
AJ_O’NEAL: Okay, I'm just curious, because that's something that I was trying to optimize at one point, and I thought, well, maybe if I can cat things together, it'll be faster, or da da da da da. But what ended up seeming to take the most time back when I was trying to optimize it, which was a long time ago, was the VM compile step. So I would be interested to know. Doesn't it start out interpreted now and then switch to JIT? I thought V8 introduced that at some point.
JC_HIATT: Can we go a little more detailed on the async await stuff? Because, so, are we saying that using async functions is faster than promises themselves?
PAIGE_NIEDRINGHAUS: I am, yeah. This is something that's really cool, and I learned about it, I don't know, a month or two ago. The V8 team actually was able to take the async await syntax and make it two microticks faster. And I can't even tell you how quick a microtick is, but it's actually two microticks faster than promises. I think there was an extra promise that had to be chained on before when you were using async await, but with the newest release, they've actually been able to drop that and improve the performance of it.
JC_HIATT: So me calling await on some sort of fetch request or something like that is going to be a couple of microticks. I don't know what a microtick is either, but a couple of microticks faster than chaining .thens onto a fetch request.
PAIGE_NIEDRINGHAUS: It is, it's actually going to be faster.
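[Editor's note: a small sketch of the equivalence being discussed. Both spellings resolve through the same microtask queue; V8's optimization removed extra intermediate promises from the await path, which is where the "microticks" come from.]

```javascript
// The async/await spelling: the await suspends the function and resumes
// it as a microtask.
async function viaAwait(p) {
  const v = await p;
  return v + 1;
}

// The equivalent promise-chain spelling of the same computation.
function viaThen(p) {
  return p.then((v) => v + 1);
}

(async () => {
  console.log(await viaAwait(Promise.resolve(41))); // 42
  console.log(await viaThen(Promise.resolve(41)));  // 42
})();
```

Same result either way; the release note is only about how many microtask turns the await version burns internally.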
AJ_O’NEAL: Why don't they speed up promises?
PAIGE_NIEDRINGHAUS: Because ES6 is the way of the future, man.
AJ_O’NEAL: Oh, he's holding on.
JC_HIATT: He's clinging.
JC_HIATT: Oh man. Well, no, that kind of throws my brain for a loop, right? Because the way I thought of async await was that it is promises under the hood, and I know that it is, but it kind of throws my mind for a loop a little bit to say that it's faster than the thing that it's using under the hood. I don't know how to reconcile that.
AJ_O’NEAL: So here's the question in that transition. This is something I've seen and this is why I don't use them, but I'm curious to hear your experience. What I've noticed is that people get really confused about async await because, well, we've got this weird time that we live in when it seems like the software world is 90% junior developers, because every year it's growing like 20, 25%. So within five years, we're at nearly a hundred percent junior developers, with like 20 percent being mid-level and like 1% senior developers, just because the whole world has become software developers. And so people struggle with some of the basics because they're just kind of thrown into a boot camp and told, here, copy and paste this, you'll get paid. And with promises, it's super clear, I think, well, maybe not to everybody, but it seems to be much clearer with promises that something is asynchronous, that the code is not working line by line but rather by events. But with async await, people lose that context that events are actually happening, because it looks like it's line by line. And then people start to make choices as if it's actually executing line by line and sometimes get themselves into weird situations, creating something convoluted because they think it's natural and they don't know that it's just events. Aside from not being compatible everywhere, which is becoming less and less of a problem day by day, those are the two reasons I haven't jumped on board: compatibility, and then just developer confusion around what are events versus what's happening in real time. That's not exactly the right word, but whatever. So what are your thoughts on that?
PAIGE_NIEDRINGHAUS: I can definitely see where you're coming from. I, like I said, started with callbacks, moved to promises, held on to promises for quite a while.
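[Editor's note: the confusion AJ describes can be sketched concretely. Await looks line by line, but each await yields to the event loop, so a naive loop serializes work that could run concurrently. The 50 ms delay is an arbitrary stand-in for real async work.]

```javascript
// A promise that resolves with `v` after `ms` milliseconds.
const delay = (ms, v) => new Promise((res) => setTimeout(res, ms, v));

// Reads as line-by-line code, but each await waits ~50ms before the next
// delay even starts: roughly 150ms total for three items.
async function sequential() {
  const results = [];
  for (const id of [1, 2, 3]) {
    results.push(await delay(50, id));
  }
  return results;
}

// Start all three timers immediately, then await them together: ~50ms total.
async function parallel() {
  return Promise.all([1, 2, 3].map((id) => delay(50, id)));
}

(async () => {
  let t = Date.now();
  await sequential();
  const seqMs = Date.now() - t;

  t = Date.now();
  await parallel();
  const parMs = Date.now() - t;

  console.log(seqMs > parMs); // true: the "line by line" version is slower
})();
```

Both return the same array; the difference is purely in when the events fire, which is exactly the thing the syntax hides.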
AJ_O’NEAL: And callbacks were terrible. I think we all agree that that was a disaster, right? Is that?
AJ_O’NEAL: 100% agreed. So, other features that Node 12 has, other than making things, oh man, whenever somebody says the word faster, it's over. Like it's over, whatever it is. If it's faster by one microtick in a benchmark, that's the new hotness. Whatever it was before is completely erased from history.
AJ_O’NEAL: That's going to be interesting. What else is new?
PAIGE_NIEDRINGHAUS: There's better security. How about that? You like more secure applications?
AJ_O’NEAL: I do.
PAIGE_NIEDRINGHAUS: Sometimes, yeah. Sometimes it makes our lives easier. So with Node 12, TLS, which is Transport Layer Security, which is how Node handles encrypted streams of communication, is upgrading to version 1.3. I know that doesn't sound like a major upgrade from 1.2, but the protocol is actually simpler to implement, so it's easier for developers to configure and to write. It's quicker to negotiate sessions between applications, it provides increased end user privacy, and it reduces request time because the actual HTTPS handshake between apps is quicker, so there's less latency for everybody. That should provide some major benefits for everybody, as well as making applications more secure from the get-go.
STEVE_EDWARDS: So in other words, AJ, it's faster
AJ_O’NEAL: Here's the thing with TLS 1.3 and the handshake stuff. I could be slightly off here, but I'm pretty sure this is directly related to HTTP2, because one of the optimizations Google couldn't make was shortening the handshake. Most servers would allow, basically you could skip a step because you knew what the handshake was gonna be. You knew you were gonna send like, hey, I want A and then you're gonna get back. Well, you know, I have B and then you're going to send C. So since you knew what the response B was going to be, you could just rapid fire, send both A and C at the same time. And most would accept that. So Google was able to cut out the handshake and reduce it by like 50%. But there would be a couple of strange cases where those requests would get dropped. So like 99%, which is really, really, really low percentage. That means that one out of every hundred, you know, is going to fail. So like 99% of web servers, which is nowhere near close enough, would work with it, but a whole whopping huge, gigantic 1%, which is a significant portion of the web wouldn't be able to do it. So I think this upgrade with 1.3 is so that, that they can get that handshake down on all servers that support 1.3 and their dreams of HTTP2 will finally be complete when this gets adopted everywhere.
PAIGE_NIEDRINGHAUS: That would be awesome. I did not know about those servers that would actually drop it. So that's fantastic to know that that's part of what they might be addressing.
AJ_O’NEAL: They had to drop part of what they were doing. Like they reversed it. They did one of those things where they tested it out to see, like, how does this work in the web? And it was unfortunately a failure. I'll try to find that article and link it in the show notes. But yeah, so I'm excited about 1.3. If I understand correctly, it gets rid of this tiny little itsy bitsy edge case that caused TLS connections to be way slower than they needed to be.
JC_HIATT: Nice, and this is just baked in?
PAIGE_NIEDRINGHAUS: It is. There is a flag called --max-old-space-size that you can pass in, which is what we would typically use on the Node command line to change the memory size available, but this properly configured default is just baked in with Node 12.
JC_HIATT: What are some, like, or what is a common scenario where, like, you may have encountered an out of memory error in the past?
PAIGE_NIEDRINGHAUS: One that I encountered was when I was doing a little project for myself, parsing through census data from the US government. It was a file that was, I don't know, like three and a half gigs worth of data, and I was trying to parse through it, pull out various pieces of data like donor names, numbers, things like that. And the list that I was constructing in memory was actually too big and caused my IDE to crash when I'd try to run it in the terminal. So something where you're constructing an incredibly large object or array of objects is something that could cause a memory issue in actual production.
JC_HIATT: But now you can just manually specify a much bigger size if you're doing a project like that, right?
PAIGE_NIEDRINGHAUS: Yeah, that was what I ended up doing. I found an NPM package called event-stream, and I just increased the memory size that my local terminal was running with. That let me parse through the really large files with Node.
JC_HIATT: Yeah, I had to do a really similar project earlier this year with voter data from my state, and the data, just like in your case, was I think two or three gigabytes, a whole bunch of CSVs. I was actually working in Ruby, but it's the same concept: I had to split out the CSVs and then bring in a library to kind of feed them to me over time. Apparently this is a problem in lots of languages.
JC_HIATT: Cool. So what else are you excited about with this release?
STEVE_EDWARDS: So one thing that piqued my interest quite a bit is the diagnostic reports and the integrated heap dumps. As a developer, I constantly live in the debugger trying to figure out what's going on. So in particular, I'm looking at the ability to format your diagnostic summaries and stuff like that to make them a little more readable. So can you talk about that?
STEVE_EDWARDS: So then the idea is maybe that, with the native stuff that's built into Node, you may not need some of the external services. They're sort of trying to pick up the slack that the external services have been providing to this point. Is that right?
PAIGE_NIEDRINGHAUS: It sounds like they're trying to, yeah. I wouldn't say that they're necessarily going to replace them, because sometimes having a GUI or a nice dashboard will still be extremely beneficial in identifying bugs and fixes. But being able to deep dive into what was going on inside of your server or inside of the code when the error occurred (usually the hardest part is figuring out exactly what happened that made it freak out in a way you weren't expecting) will help demystify that, hopefully make recreating the errors easier, and then developers will hopefully be able to fix them faster and pinpoint the actual root cause more quickly.
STEVE_EDWARDS: Yeah, that's awesome. Any helping to clarify on errors is always a welcome tool for a dev.
PAIGE_NIEDRINGHAUS: Oh, absolutely.
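[Editor's note: a sketch of the diagnostic report feature being discussed. In Node 12 it sat behind the --experimental-report flag; on later releases process.report is available by default.]

```javascript
// Grab an on-demand diagnostic report: a plain JSON object covering the
// environment, JavaScript and native stacks, heap statistics, loaded
// libraries, and so on.
const report = process.report.getReport();

console.log(report.header.nodejsVersion);
console.log(Object.keys(report).slice(0, 3)); // a few of the report sections
```

Reports can also be written to disk automatically on fatal errors or signals, which is the "post-mortem without an external service" use case.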
STEVE_EDWARDS: So then you've also listed some...I see some changes here to the HTTP parser.
PAIGE_NIEDRINGHAUS: Yes, once again, we're increasing speed. The current HTTP parser that Node has been using since probably close to its inception has not been the easiest thing to maintain or build upon in the past few years. So there's a new up-and-comer called llhttp, which is a port of the original, but it's been ported over to TypeScript. Then it's run through another thing called llparse, which generates the C or the bitcode. It turns out that this transformation to llhttp is actually faster than the original parser by about 156 percent. It's written in fewer lines of code, and all the performance optimizations which had to be handwritten before are now automatically generated in this version.
AJ_O’NEAL: It sounds like what you're saying is you go from TypeScript to WebAssembly JIT style code. Is that what you're saying?
PAIGE_NIEDRINGHAUS: Yeah, llparse is another module that will run all the TypeScript of llhttp through it, and that actually is what generates the C or the bitcode.
AJ_O’NEAL: So C or bitcode, I'm a little confused there. Because bitcode meaning basically WebAssembly for the V8 compiler, C would be very different. So why would it be one or the other? Is it something that happens because of a flag you pass when you build, or how does that work?
PAIGE_NIEDRINGHAUS: Well, from what I understood reading the articles myself, I think you could choose which thing you wanted it to be output to, depending on what your end goal was or where this was going to be running. Let me see. Yeah, so according to the GitHub for llhttp, the library llparse is used to generate the output C and/or bitcode artifacts, which could be compiled and linked with the embedder's program, like Node.js. Does that make more sense at all?
AJ_O’NEAL: To me, that sounds like this is not just intended for Node, but rather this is a parser that's written in TypeScript that can be exported for various projects, and Node might be using either the C version or the bitcode version. Is that, am I making correct connections?
PAIGE_NIEDRINGHAUS: I think that you are, yeah. I'm reading exactly from the llparse GitHub now, and it describes itself as an API to compile to C output and/or LLVM bitcode. So I guess it can do both.
PAIGE_NIEDRINGHAUS: That's cool.
AJ_O’NEAL: A JIT compiler doesn't make decisions at compile time. Like when you write a C program, you have to say, I want to run these certain optimizations, and that's it, it's done. With a JIT, the code runs and the compiler analyzes the running code and then speeds it up even further based on how the code is actually used at the time it's being used, rather than guessing at how the code might be used when it's compiled. So just a little background for anybody that didn't know that.
PAIGE_NIEDRINGHAUS: That's really good to know. That's great
AJ_O’NEAL:. I think it's awesome.
PAIGE_NIEDRINGHAUS: Yeah, I'm really excited. They've had it behind the experimental flag in node 11. So bringing it out to the forefront for node 12 and just putting it out there for everybody to use is really going to put it to the test and we'll see how well it can actually perform for everybody's different needs.
This episode is sponsored by sentry.io. Recently, I came across a great tool for tracking and monitoring problems in my apps. Then I asked them if they wanted to sponsor the show and allow me to share my experience with you. Sentry provides a terrific interface for keeping track of what's going on with my app. It also tracks releases so I can tell if what I deployed makes things better or worse. They give you full stack traces and as much information as possible about the situation when the error occurred to help you track down the errors. Plus, one thing I love, you can customize the context provided by Sentry. So, if you're looking for specific information about the request, you can provide it. It automatically scrubs passwords and secure information, and you can customize the scrubbing as well. Finally, it has a user feedback system built in that you can use to get information from your users. Oh, and I also love that they support open source to the point where they actually open sourced Sentry. If you want to self-host it, use the code devchat at sentry.io to get two months free on Sentry's small plan. That's code devchat at sentry.io.
AJ_O’NEAL: So there's something that was snuck into Node 10. First of all, I don't consider odd versions worth talking about because they're like the beta experimental versions. So technically this was actually implemented in Node 10, and it's been there the whole time through Node 11. But I think, I mean, I could be wrong, I think most people play it safe and only go with even-numbered versions that have some sort of reliability and maintainability guarantee. So many people did not know about it, since it was kind of sneaked in halfway through the 10 life cycle. To me this is super exciting. Node 12 is the first legit maintained release, like, long... or wait, is 12 long-term maintenance or no?
PAIGE_NIEDRINGHAUS: It will be entering long-term maintenance in October, actually. It was just released in April of this year.
AJ_O’NEAL: Okay, yeah. So this is dependable, reliable, will be supported, will be maintained. So this is the first version that starts out with ECC and RSA bindings to OpenSSL being exposed in Node. So you no longer have to shell out to the openssl program to generate your public-private key pairs. And there's a whole ton of projects around getting key pair encryption working in Node that just had various problems, and now you don't need them. That's something I really think is cool about Node 12, even though technically it's been there in 11 the whole time. It was introduced midstream without much fanfare or announcement; it was treated more like a bug fix, that the API didn't exist, than like a new feature. Did you know about that?
PAIGE_NIEDRINGHAUS: Very cool. No, I didn't actually. So that's very cool to hear about.
AJ_O’NEAL: There was a library called URSA that a lot of people used that has since gone defunct because it just fell out of maintenance. And technically you could always generate elliptic curve keys in Node, but it was under an API that, if you were looking for it, you wouldn't have found it. It was kind of happenstantial to the Diffie-Hellman exchange APIs. Anyway, that's probably a little bit too much in the weeds. I'm sorry. I'll shut up now.
PAIGE_NIEDRINGHAUS: No, I think it's super cool. I'm always interested to learn more about Node, you know, actual low-level Node instead of just all the libraries that we run on top of it and the frameworks like Express and Koa.
JC_HIATT: Can you speak a little more about the worker threads? I see you wrote something down about that, and I'm just curious, first off, what exactly worker threads are for, and then if also you can speak to how they're relevant to this release.
JC_HIATT: It sounds like that would be a nice feature for reading a two gigabyte CSV file.
PAIGE_NIEDRINGHAUS: You never know, I guess. I personally haven't had enough experience or really looked into it enough to know how to operate worker threads in a way that makes sense or makes them really efficient. But I agree, it could be great for parsing really large data sets like that.
JC_HIATT: Anyone else have experience with the worker threads?
STEVE_EDWARDS: Nope, not here.
AJ_O’NEAL: Are you referring to the native C++ node modules that are compiled to binaries?
AJ_O’NEAL: Okay. So the N-API. Okay.
PAIGE_NIEDRINGHAUS: Yeah. So the N-API basically takes care of the compiling down and, you know, making those node modules compatible with the different versions of Node that are running out there.
PAIGE_NIEDRINGHAUS: Thank you for that incredibly detailed breakdown of how N-API works. That was much better than I ever could have explained it, and you know much more about it than I do.
PAIGE_NIEDRINGHAUS: Cool, great example. And actually, those problems that you were speaking of and the tool chain are a great segue into one of the last things that I have to talk about with Node, which is that there are some new compiler and minimum platform standards. So if you're on Mac OS, you're probably going to be fine. The GCC that's now required is a minimum of 6, and then glibc 2.17 on platforms besides Mac and Windows, so I guess all Linux platforms. Mac users need at least Xcode 8 and Mac OS 10.10, which is Yosemite, so hopefully most of our developers will be far past that at this point. And for Windows, if you've been running any of the Node 11 builds, you'll already be up to speed; those are the same for Node 11 as for Node 12. For Linux, the binaries that they're supporting are Enterprise Linux 7, Debian 8, and Ubuntu 14.04. And if you have different requirements, you can always go to the Node website and find out more about what exactly you need to install or compile to be able to run the latest versions of Node.
AJ_O’NEAL: Somewhat tangential, do you know anything about Chakra Core? Is that project still active? Is that still something that is going to provide some coolness in the future? Is that kind of dying out?
PAIGE_NIEDRINGHAUS: I am not familiar with it actually. Could you fill me in on what you know so far about it?
AJ_O’NEAL: So Chakra Core was a project to, I mean, Mozilla tried to do something similar with like JägerMonkey and SpiderNode, and then Microsoft took their thing. So V8 is Google, Chakra is Microsoft. And Microsoft was supporting getting Node to run on the Chakra engine. But then Microsoft also just announced that Edge is switching over to V8. From an academic perspective, it was way awesome. There were some practical improvements that I thought Chakra was possibly going to add to Node, to possibly give better performance than V8. But with Microsoft moving from Chakra to V8, and V8, I think, having adopted some of the improvements that Microsoft had researched and proved to be effective, I'm curious if that project is going to continue or if there's new stuff that's going to come out of it. I guess worried isn't quite the right word, maybe sad, that I'm thinking it might just end up dying out because everybody's probably just going to use V8 now.
PAIGE_NIEDRINGHAUS: You're probably right. I didn't come across anything related specifically to Chakra Core when I was doing research on these new improvements to Node. So from what I can gather, it seems like Node is leveraging V8 pretty heavily and it's going to continue to do that instead.
AJ_O’NEAL: I just noticed on the download page, it's still listed down at the bottom. So that's why I was curious.
JC_HIATT: So anything else to add before we go to PIX?
PAIGE_NIEDRINGHAUS: No. Like I said, I'm really excited about these new changes that are coming to Node. And I think it's going to be some really great improvements for everybody who's developing with it. So hopefully they'll keep turning out great features and great improvements.
JC_HIATT: Yeah, and thanks for coming on and kind of filling us in on this.
PAIGE_NIEDRINGHAUS: Absolutely, I'm really happy to.
AJ_O’NEAL: This has been exciting stuff to learn about.
PAIGE_NIEDRINGHAUS: Glad to be a part of the conversation.
JC_HIATT: Cool. So we can go to picks. I can go first. So I have a few picks today. First off, the AWS Amplify framework. If you're not familiar with it, it's been around for a while, but I've really been digging into it lately. I'm doing kind of a consulting gig on it right now and have been learning a lot about Amplify and then, in a broader context, serverless, all that stuff. And it's really, really cool. At minimum, it's kind of magical: you can have authentication with a GraphQL database set up in 10 minutes, and they even give you React components that work out of the box for the auth flow, like login, sign up, forgot password, all that stuff. You get email sign up or social sign up, things like that. It's really, really impressive, and it's all from a really nice CLI. So definitely check that out. Next up is a book. I'm a really slow reader; I've been working on a book for about two months, just a little at a time. But it's a book by a guy named Jordan Peterson called 12 Rules for Life: An Antidote to Chaos, and it's just been a really good book to read in terms of facing adversity and kind of bringing order to your life. And lastly, I'm going to kind of shamelessly pick an upcoming workshop I'm going to be doing. I've got two workshops on React and Gatsby that I'm going to be doing in San Diego at the beginning of November, so keep an eye on my Twitter if you're interested in maybe attending one of those.
STEVE_EDWARDS: Cool, I'll go next. I only have one today, but I think it's pretty monumental in the world of cartoons. Yesterday, I saw a few articles about The Far Side possibly coming back. Gary Larson quit drawing The Far Side quite a number of years ago, and he didn't allow people to distribute his cartoons on the web, so the only way you could really see them was if you had the books. But The Far Side website had a message yesterday about the strip possibly coming back. So people are trying to figure out what that means: whether he's going to be drawing new cartoons, or just going to have an online presence for all of his many previous cartoons. As someone who is a huge fan of that cartoon, I'm really looking forward to seeing what comes out of that, or what its reincarnation looks like, shall we say.
PAIGE_NIEDRINGHAUS: That's awesome. That's very cool. I have a couple of picks that should probably be pretty popular. Neither one of them is very developer-focused. The first one I'd like to plug is the espresso maker I have at home. It's called the DeLonghi Magnifica XS, and it's an automatic espresso and cappuccino maker, and it is amazing. You just need whole beans: you put them in the top, press a button, and you've got some of the best coffee you will ever have. So I would highly recommend it. It's a little bit of an investment, but it's been such a great find for my husband and me, and pretty much everybody who comes to stay with us at one time or another. The other thing I would love to plug is that I will be speaking at Connect.Tech, which is happening in Atlanta in the middle of October. I'll be talking about responsive web design, and specifically how to do responsive design with ReactJS. So it would be awesome if anybody who's attending would come out and see me speak there.
JC_HIATT: Yeah, I'm looking forward to meeting you, Paige. Absolutely. Yeah, that'll be great. Hey, AJ, you have some picks for us?
STEVE_EDWARDS: Well, it was also put out there because George Lucas basically gave away a lot of that technology. He just said, here, you guys can use it. He didn't license it and make people pay for it. And that's what's credited with a lot of the sci-fi explosion is that he invented all this technology and then just gave it to everybody and said, hey, go ahead and use it.
AJ_O’NEAL: When was the last time you watched a movie that didn't say Lucasfilm, Industrial Light and Magic, and Skywalker Sound at the end of the credits, though?
PAIGE_NIEDRINGHAUS: That's just what I was thinking. They're involved in everything nowadays.
AJ_O’NEAL: Pixar, Disney, Marvel. I don't know the last time I saw a movie with special effects, unless it's a pure drama, that didn't have Lucasfilm, Skywalker Sound, and Industrial Light and Magic at the end of the credits. There are like five or six Lucasfilm companies, at least three; I don't know if it's five. They're at the end of all the credits.
STEVE_EDWARDS: So AJ, here are a couple of things about Dune. One, thanks for the spoiler, because I know that book just came out, and I haven't had a chance to read it yet. Second, do you know what the inspiration for Dune was?
AJ_O’NEAL: I do not. I think it was weed, because the dude is obviously stoned.
STEVE_EDWARDS: No, actually, it's a place down on the southern coast of Oregon, which is where I'm from, called the Oregon Dunes. Back in 1957, Frank Herbert came down because he'd read an article about how people were trying to plant grass to keep the dunes from shifting all over the place. That's what sort of kicked off his imagination, and he started writing Dune once he saw that.
AJ_O’NEAL: Well, that is pretty cool.
PAIGE_NIEDRINGHAUS: That is cool.
JC_HIATT: Paige, if anyone wants to connect with you online, where's the best place to do that?
PAIGE_NIEDRINGHAUS: Well, I'm on Twitter at PNiedri, P-N-I-E-D-R-I, and I also write regularly on Medium. You can find me there at paigen11.
JC_HIATT: Cool. Yeah, we'll put links to all this in the show notes as well.
JC_HIATT: Well, thank you so much for coming on and chatting with us today.
PAIGE_NIEDRINGHAUS: Absolutely. Thank you so much for having me. This has been really fun.
JC_HIATT: Yeah. Yeah, likewise. Cool. All right. Well, we'll see everyone later.
AJ_O’NEAL: All right.
PAIGE_NIEDRINGHAUS: See ya.
Bandwidth for this segment is provided by CacheFly, the world's fastest CDN. Deliver your content fast with CacheFly. Visit C-A-C-H-E-F-L-Y dot com to learn more.