AJ_O’NEAL: Welcome back for another wonderful episode of JavaScript Jabber. On today's show we have, first, our guest, Connor Bronston. Oh, I'm sorry, I spelled your name wrong. I spelled it with two N's.
CONOR_BRONSDON: It happens, man.
AJ_O’NEAL: We'll fix it in post.
CONOR_BRONSDON: All good. Not something I worry about too much. Given both of my names, you could assume I'm spelling them a little wrong. People will usually assume double N's for the first name, "Bronson" for the last name. Just don't worry about it.
AJ_O’NEAL: All right, cool. And then you'll give us the 30-second who-you-are and...
CONOR_BRONSDON: Yeah, pardon me here. So I'm the cohost of the Dev Interrupted podcast. We talk with engineering leaders every week, like AJ and Steve and leaders at Slack, et cetera, about topics like DevEx and GenAI. I also work at LinearB, and one thing we've really done is dive into data reports around how software engineering and the software development lifecycle work. So I'm excited to share some insights, early insights actually, from our GenAI report that's out tomorrow, along with talking DevEx and some of the impacts we've seen from things like the DORA study, which we worked on with Google this last year. Should be an incredible conversation, and I'm excited to dive into these insights with you two.
Hey folks, this is Charles Max Wood. I've been talking to a bunch of people that want to update their resume and find a better job. And I figure, well, why not just share my resume? So if you go to topendevs.com slash resume and enter your name and email address, you'll get a copy of the resume that I use, that I've used through freelancing and through most of my career, as I've kind of refined it and tweaked it to get me the jobs that I want. Like I said, topendevs.com slash resume will get you that. And you can just kind of use the formatting. It comes in Word and Pages formats, and you can just fill it in from there.
AJ_O’NEAL: All right, and then we also have trusty, rusty Steve.
STEVE_EDWARDS: Hello, hello, hello. AJ, we've got to work on your hosting skills here. You always do the panelists before the guests. You save the best for last, right? So we've got to work on this.
AJ_O’NEAL: You're kind to me on that one, Steve. We'll take notes.
STEVE_EDWARDS: And then Conor, I had to keep a straight face when you put me and "engineering leader" in the same sentence. My general approach with work and podcasts is I always bring on people smarter than me. I am the leader in good dad jokes, but we'll leave it at that.
AJ_O’NEAL: And that's a crucial contribution.
STEVE_EDWARDS: So that's very crucial. Yes. And I will show leadership at the end of this episode in that area.
AJ_O’NEAL: That's right. Leaders joke last. But no, that's not how it goes. Anyway, with that out of the way, we'll go ahead and introduce today's topic. We're going to be talking about GenAI, which is like Generation Z, except not at all related, it's just generative AI, and then developer experience. And you've got those insights you were talking about. So where should we start, Conor?
CONOR_BRONSDON: Yes. I think a great place to start is the definition phase. So you mentioned generative AI. Let's just make sure we level-set with the audience: when we say GenAI in software development, we're talking about generating code through the use of an AI tool like Copilot, like these other tools out there. And the cool thing LinearB has done is a study following up on the great work the DORA team did this last year. When I met with Nathan Harvey and we partnered with them on their DORA report in 2023, they found that there weren't early efficiency impacts from GenAI showing up in their survey audience. However, what they did see was that teams that were using GenAI, about 50% of the respondents, saw improved happiness on their dev teams. And we correlated that with, oh, maybe we don't have to spend as much time making tests, because we can have the AI help us generate those. Or maybe the annoying parts of the code can get abstracted out, so I can focus on the strategic pieces. And that's where I think there's a huge opportunity initially: hey, how can we remove these blockers? How can we improve here? Now we've followed this up with targeted surveys of over 150 different development teams across the world, and we're adding in quantitative data as well from our LinearB tooling. LinearB, for context, is a platform for software delivery management and software engineering insights. We provide both quantitative and qualitative metrics and automation for improvement. And one of the things we're going to be rolling out shortly is a beta of how you can actually track GenAI pull requests, or partially GenAI-generated pull requests, and then compare them over time to code that is not being affected by GenAI. So you can actually see: are we getting efficiency gains? Is my GenAI initiative at the executive level making an impact, or is this more hype than anything else? And we have some really good answers. Yeah, please.
AJ_O’NEAL: Define GenAI pull request. Does that mean that the AI wrote the description, or that the AI wrote the code?
CONOR_BRONSDON: Code contribution. So we're working on a tool to automatically detect GenAI's contribution there, linking up with Copilot or other GenAI code generation tools to say, hey, this pull request was partially written, or was written in concert with, a GenAI code assistant. And then we can evaluate code quality down the line and say: did we have to refactor it more? Were there issues caused by it? Do we need to maybe add an additional test early on to make sure that everything is working? How does this fit? And we're seeing some fascinating early insights.
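For readers who want to picture what that detection might look like: here is a minimal sketch in TypeScript, assuming a hypothetical commit-trailer convention. LinearB has not published its actual mechanism, so the marker string and field names below are illustrative only.

// Sketch: flag a pull request as AI-assisted by scanning its commits
// for a co-author trailer that an AI assistant might leave behind.
// The marker is a hypothetical convention, not LinearB's real logic.
interface Commit {
  message: string;
}

const AI_MARKERS = ["Co-authored-by: AI Assistant"]; // illustrative marker

function isAiAssisted(commits: Commit[]): boolean {
  return commits.some((commit) =>
    AI_MARKERS.some((marker) => commit.message.includes(marker))
  );
}

// A labeled PR could then be compared against unlabeled PRs on
// rework rate, review depth, and cycle time over its lifecycle.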
STEVE_EDWARDS: Okay, so before we get going too much farther, we've got to define some terms here. As a parent, when I hear Dora, I'm thinking "Swiper, no swiping," you know, Dora the Explorer.
CONOR_BRONSDON: Absolutely.
STEVE_EDWARDS: So who is Dora? What is Dora? Why is Dora important and et cetera?
CONOR_BRONSDON: Great question. I am using a lot of acronyms, and I apologize to anyone who's following along and not getting them. So DORA is the Accelerate State of DevOps team at Google Cloud. They've done more than 10 years now of seminal research in software development and software development metrics to understand how productive engineering teams are, how to make teams happier, and how to build high-performing engineering teams.
STEVE_EDWARDS: Okay, so moving on, we've established DORA. So what were you about to ask, AJ?
AJ_O’NEAL: No, I already forgot. So, Conor, you had just said something about results or insights that you were summarizing, and I was probing for more detail.
CONOR_BRONSDON: Happy to dive into some more. So I'll say a couple of things. One, across our survey set, we saw that 87% of organizations are already planning to invest, or have already invested, in a GenAI tool in 2024. So the idea that this is something that is very broad across the industry is absolutely true. We're seeing that rapidly accelerating in our data set compared to the roughly 50% of survey respondents last year in the DORA survey, and we're seeing this continue to increase. It's a key area of focus. And I think part of that is the efficiency gains people are starting to see, and then the happiness gains I mentioned that are clearly showing up across dev teams that are leveraging this. You know, a great example of a success story on this is Uber. Uber has actually started using GenAI coding tools not only to help generate code internally, but also to generate comments for code. So, what's one of the most annoying things that happens in a software development team's lifecycle? It's like, oh, I have to go review someone else's code, right? I have to pause what I'm doing, maybe jump out of a meeting, and go, hey, what do I have to understand about this PR? What's the context I need? We can start leveraging automations, like the gitStream automation that LinearB has, to provide context. And Uber's actually going a step farther: they're generating suggested comments that people can then evaluate immediately in that PR to help improve the PR's quality. So, huge opportunities there. And I know Uber's developer experience and developer productivity teams have estimated they saved about $10 million in engineers' time last year alone. And that doesn't even consider the potential positive benefits on that happiness piece we talked about earlier: if I'm a dev on a team and these annoying tasks are starting to get automated away, or at least it's getting easier for me to do them, whether that's tests or reviewing code or having better tools enabled by generative AI, this is a huge opportunity to improve retention and improve dev happiness. And we've already seen a million correlations across various studies between happiness and high-performing teams. So it's a really exciting thing for the industry.
AJ_O’NEAL: Okay, so we've got a gamut: there are people who literally do not start coding without some sort of AI integration, and then on the other side, we've got people who vehemently refuse, or are banned from, opening such a tool. So can you tell us anything about that spectrum? Was any of that type of data captured in the report?
CONOR_BRONSDON: Yeah, and this report will be out for everyone to read here in the next few days, so definitely go to linearb.io to check out more information; it should be out 48 hours-ish from now. But the exciting thing here, as you mentioned, AJ, is that there are these opportunities for improvement, but there are risks. And this is one of the key things that we captured in this report: the risks are perceived as very much compliance, quality, IP. I mean, security is a top concern for devs broadly, but particularly with generative AI tools, where they can be kind of a black box. What's this code that's coming out of here? Is it worth, you know, giving access to our code base? What's the risk level here? And what we saw is that perceived risk levels drop across the board as adoption grows. But IP and compliance are major concerns for frontline managers and executives. And interestingly, something we saw is that the frontline managers are actually more concerned about these risks, maybe because they have a more in-depth understanding of what the tools are actually doing, maybe simply because they feel more accountable than the directors and executives, who I think are feeling a lot more pressure to go, oh, hey, I have an initiative I need to do. My board is telling me we have to have an efficiency initiative. We have to do more with less. I'm sure we've all heard that terminology thrown around. And so there's this odd tension between the business side, which is really pushing this and saying we have to improve our software development with AI, and the managers, who are, to your point, AJ, concerned, and I think frontline devs as well in some cases, who are saying, hey, is this the right thing to be doing? Does this actually help me? And AJ, before we started recording, I know you mentioned that in your experiments with generative AI code, you felt like there were times where maybe it just wasn't getting things right and it wasn't as tightly aligned to what you actually needed. And that is definitely something we saw as well. So there's a huge opportunity for improvement where basic tasks, automations, tests, these are things that can really be helped. But as code gets more and more complicated, it's harder to get your GenAI code assistant to actually match the in-depth needs of very particular coding tasks. So there's very much still a need for senior developers to come in and say, hey, I understand this code base, let me help guide this, let me make this work. And one of the things that we're predicting is that that need for senior devs is actually going to really increase, because particularly as junior devs start leveraging GenAI code to just move faster and deliver more code into the system, we're going to need a lot more senior devs who can help parse through that and make sure the quality is there. And there's a huge risk that as we simply increase the speed with which we generate code, there are quality issues in that review cycle, in that actual delivery cycle, because we designed all of our software development systems to have a certain amount of code going through them. And now we're saying maybe we're at 20% more code, maybe 50% more code. Are the systems of review, the systems of tests that we expected to have here, going to work? Or are we going to have these huge blockages, where maybe people then don't spend as much time reviewing?
And this is where there's a need for this next level of automation, like what I mentioned Uber is doing, like what we're doing with our free gitStream tool, to say, okay, how can we help these blockages in the software development lifecycle, these areas where the pipe kind of tightens up, particularly code review, move faster?
AJ_O’NEAL: So how is code review going to be affected by AI? Because to me, like... okay. So my experience with AI: I use both Ollama, with, I think it was CodeUp and WizardCoder, there are two models that I've switched between with Ollama, and then with ChatGPT, I just use the main website. I don't have any Vim integration for codegen, in part because, from what I've seen and experienced thus far, having it pop up bad code half the time, and me having to do the mental context switch of, am I writing code or am I correcting code, back and forth, that's frustrating to me. So I'd rather just take a few extra seconds to type things out, and just type out what I know I need to do, rather than approve or deny and retry on something that's relatively, you know, simple on the codegen side. So for little snippets and stuff, I'm not doing it. But I will ask ChatGPT a question. And again, it's like 50% of the time it's worth it, I guess, because it'll get it right, it'll get it wrong. But I'll ask it a question. And it seems like the closer my question is to Python code to start, the better it's going to be at generating Go or JavaScript.
CONOR_BRONSDON: Yes.
AJ_O’NEAL: But there is some value there, because there are some things that are relatively common boilerplate, you know, it might be 30 lines of boilerplate, and I'll ask it to do something that I know is maybe in that realm, and it can generate it.
CONOR_BRONSDON: It's very good at that.
AJ_O’NEAL: Yeah, it is. It is. I also asked it to write a backup routine and it wrote a function that would delete all files in the folder.
CONOR_BRONSDON: Yeah, that tracks. As I think we've both experienced here, the more complicated your use case, the farther away GenAI is from being ready to really solve it for you without deep fine-tuning. This is where major enterprises have the opportunity to say, hey, we really want to fine-tune this tool internally so that it fits our code base, it fits our needs. But right now it's really good at the basics, helping solve these kinds of pain points. And where we're seeing it be interesting in code review, as you asked, is that we just have more code moving through the system, because the basics are all easier now. People can just deliver code faster. And so now we're seeing this major pain point, which was frankly already a pain point for many software development teams: not only are there more code reviews to be done, there's more code to be reviewed. People are still spending most of their time on generating code. So what we've seen is this already-existing pain point get exacerbated by the increased speed at which code is generated.
AJ_O’NEAL: Yeah, but so what I was going to say is, what I see it do is I'll see it add comments to code. And asking it to summarize code, it can often do a really good job at that, even if it's something pretty niche. Like, there's been cryptographic stuff that I didn't understand, and I was working in some cryptographic library, and I paste in some code and I say, you know, what does MKW stand for here? And it's like, that's the Mill curve. And I'm like, oh, okay, is that a real thing? And then, you know, it turns out it actually is. So it's done a really good job at generating comments, but then most of the time it's pretty dumb, you know, it's "this adds two and four to get six." I'm like, yeah. And so when I'm thinking about it in code review: if it was something for a cryptographic library, I would want somebody who's at a really high level to be handling that, because it's something that's really sensitive, and we don't want to take a chance that it's a hallucination, right? And if it's something that's really basic, then it's just going to spit out this really benign, this banal description of code or something. So where is it helpful? What is it doing? What's the sweet spot for that?
CONOR_BRONSDON: So I actually think you nailed the opportunity here, which is context, and saying, okay, how can we auto-generate the right piece of context? So maybe it's things like PR labeling, which is something we're doing: how long is this PR going to take? We can estimate, based off of a little machine learning on the back end, and say, oh, this PR is X number of lines of code long, 104 lines of code long, this will take five minutes, or this will take 30 minutes to review. And it can also give you, hey, here are a few things where you might not know what this means, to your point, you know, the MKW or whatever, and give you that context so that you can hopefully jump in. And it's a lot easier to get acclimated, because too often, and I'm sure you've experienced this, you get assigned as a reviewer on a PR for someone else and you have to go in and you're like, okay, what are you doing here? Maybe they didn't give you a great comment from the start with, oh, here's the context. And that's where I think GenAI can start to help us. So it's helping us very much in the, hey, let's code through the kind of annoying boilerplate faster, let's code tests faster, great. And then let's give context faster, so that we can make more strategic decisions and spend our time on the higher-leverage opportunities of, okay, does this all work? Let me apply my knowledge.
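As a rough illustration of the labeling heuristic Conor describes, here's a sketch with invented thresholds; LinearB's actual estimate is machine-learned, so treat the numbers and field names as placeholders.

// Sketch: estimate review effort from diff size and emit a PR label.
// Thresholds are made up for illustration; a real model would be
// trained on historical review times.
interface PullRequest {
  linesAdded: number;
  linesDeleted: number;
}

function reviewTimeLabel(pr: PullRequest): string {
  const linesChanged = pr.linesAdded + pr.linesDeleted;
  if (linesChanged <= 50) return "review: ~5 minutes";
  if (linesChanged <= 300) return "review: ~30 minutes";
  return "review: 1 hour+ (consider splitting this PR)";
}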
AJ_O’NEAL: All right, so let's talk about tests for a second. My experience in testing code, when I really feel like I need tests, is usually one of two situations. It's usually either regular expressions, or it's something where I actually am not certain what the result is supposed to be. So I'm writing a test kind of to say, I think the result is supposed to be this, and then if I find a case where it falls outside of those parameters, then I need to manually review and say, okay, is it that the code is not handling an edge case, or is it that there was an edge case in the input that needs to be handled in the test? When I'm reaching for tests, I'm typically not doing the kind of thing that... I mean, I guess the regular expression stuff could be. The test is really there more to explain what the regular expression does; it's more like a comment at that point. But then again, so is, you know, that other thing. So what kind of test generation, again, is where the value is being brought to the table?
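To make AJ's regex-as-documentation idea concrete: a minimal sketch of that kind of test in TypeScript, using Node's built-in test runner. The regex and cases are invented for illustration, not taken from the discussion.

// Sketch: a test that documents what a regex accepts, serving as an
// executable comment. Regex and cases are illustrative only.
import { test } from "node:test";
import assert from "node:assert";

const BARE_SEMVER = /^\d+\.\d+\.\d+$/; // no prerelease or build tags

test("BARE_SEMVER matches plain x.y.z versions", () => {
  assert.ok(BARE_SEMVER.test("1.2.3"));
  assert.ok(!BARE_SEMVER.test("1.2")); // too few segments
  assert.ok(!BARE_SEMVER.test("1.2.3-beta")); // an edge case a human must triage
});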
CONOR_BRONSDON: Yeah, so I would say two things. And I'll admit here that I haven't spent a ton of time generating tests with GenAI code. This is very much something that I'm hearing secondhand from people and that I'm seeing in our data.
AJ_O’NEAL: I've heard it. I just haven't seen it. People totally will say, oh yeah, I'm generating my tests, but I haven't seen the tests that were generated and thought, oh yeah, that brings a lot of value to the table.
CONOR_BRONSDON: So, two examples. One, that regular expression use case you mentioned, totally a great use case. This is something that's very possible. I don't want to call it basic, but it's like, okay, it's the boilerplate we've kind of talked about. It can get this. Let's help move things faster, save me a little time. The edge cases, I think, are still to be figured out, though. Maybe you'll have someone come on next week who's an expert in edge-case testing with GenAI, and they can help lead us there. But the other area here is when you have a large enough database. So I'm going to keep using this Uber example, because it's something that they've looked at, and they're saying, okay, now we have thousands of devs, we have an idea of basically what our code base is, we know how it functions. Let's take all of those tests and make them artifacts that help train this AI. It's particularly saying, okay, let's suggest tests for you right off the bat for some of those edge cases: here are things that we saw as potential edge cases that were issues in previous, similar areas of the code base, let's just suggest some right off the bat for you. And that's an area that I think is a huge opportunity for companies, particularly B2B companies that are working with a variety of clients, to say, hey, can we train a great database to improve testing off of this? I think there's a lot to be done there still, but we're hearing great results from individual devs and small dev teams, and we're seeing some enterprise companies succeed with that. So I expect there to be an explosion of these tools moving forward. Yeah, and I think one really interesting thing we saw in the report: one of the things we asked survey respondents was, hey, do you prefer qualitative metrics or quantitative metrics? I.e., do you want hard data, like "this is 10% better," or do you want, "hey, we saw developer happiness improve"? And what we saw is that larger companies in particular, who are usually farther along in their adoption journey for AI, they just have the resources to do it and to add it more into the software development process, were ready to start saying, hey, we want these hard metrics. We want to see, you know, an 8% efficiency gain, a 7% efficiency gain, as the ROI of our GenAI initiative. Whereas earlier on, the initial signal is all, hey, our devs are happier, they feel like they're moving faster, we see this gain in total lines of code created. But those are all kind of bad measures. Well, I mean, dev happiness isn't; it's an amazing DevEx measure, and we can definitely talk about that. But lines of code, for example, is, I think we all know, not necessarily indicative of great code or, you know, what you actually want. And there have been some terrible layoffs...
AJ_O’NEAL: My favorite PRs are more red lines than green lines.
CONOR_BRONSDON: Yep. And this is where I think companies are looking to say, okay, what's the actual impact of the code that we generate with GenAI? And this is where we're running our beta with LinearB: we're saying, okay, let's label every PR that has been co-generated with generative AI, or fully generated with generative AI, and then track its lifecycle to see its impact on the code base. We're very much just starting to get data there; we're going to be putting some out in a workshop here in a couple of weeks. But I'm excited about the long-term potential, because I think we're going to have this transition over the next couple of months from survey data, which is super useful, to survey data backed by hard quantitative data metrics that can really dive into it. And you also see this in developer experience, which was kind of the early indicator of success here. As I mentioned, in our work with Google Cloud's State of DevOps report last year, we saw these efficiency gains for teams because team members were happier, and it was kind of correlating into this positive feedback cycle for dev teams based off of leveraging GenAI, whether it's for tests or coding those basics. And now I think we're starting to see this wave, and are going to continue to see this wave, of: here's the actual impact on that org long-term, here's where they delivered X number of features faster. And that's what we're really excited about, that next step, and that's what we're building.
AJ_O’NEAL: Okay. And then the developer experience, and I want to allow us to take this conversation outside of just the AI portion, but with the developer experience, what were the areas in which that happiness was improving? And, well, actually, before I go into the question about happiness: what tools are people using? Because I'm familiar with ChatGPT and Ollama, those are the ones that I use the most. And then I know that a lot of people are using VS Code with the Microsoft Copilot plugin. What else is out there that people are using?
CONOR_BRONSDON: Yeah. So obviously Copilot's the 800-pound gorilla here; they've got, I think, 150,000 organizations across the world using it, with ChatGPT being the more broadly available one for every individual dev and everyone else. CodeWhisperer is another that's out there that we saw was quite popular in our survey. Codium.ai and Tabnine were a few others that we saw. But you named most of the key ones there.
Hey, have you heard about our book club? I keep hearing from people who have heard about the book club. I'd love to see you there. This month we are talking about Docker Deep Dive. That's the book we're reading. It's by Nigel Poulton. And we're going to be talking about all things Docker: just how it works, how to set it up on your machine, how to go about taking advantage of all the things it offers, and using it as a dev tool to make sure that what you're running in production mirrors what you're running on your machine. And it's just one of those tools a lot of people use. We're really looking forward to diving into it. That's February and March. April and May, we're going to be doing The Pragmatic Programmer. And we're going to be talking about all of the kinds of things that you should be doing in order to further your career, according to that book. Now, of course, I have my own methodology for these kinds of things, but we're going to be able to dovetail a lot of them, because a lot of the ideas mesh really nicely. So if you're looking for ways to advance your career, you're looking to learn some skills that are going to get you ahead in your career, then definitely come join the book club. You can sign up at topendevs.com slash book club.
AJ_O’NEAL: Okay. And then Tabnine, man. Oh, they were doing a marketing push, and then I think two weeks later, Microsoft announced Copilot. I was so sad for them that day.
CONOR_BRONSDON: Their timing was tough, yeah.
AJ_O’NEAL: Yeah.
CONOR_BRONSDON: It's hard to compete with a company like Microsoft when they decide to roll something out like that.
AJ_O’NEAL: Well, I mean, and you don't know that it's coming, right?
CONOR_BRONSDON: No, yeah.
AJ_O’NEAL: They'd been working on their platform for however long, you know, a few years, finally getting out of beta, and then just overnight... oh, that's so rough. I did use Tabnine, and I kind of had the same, you know, feeling about it. It was cool. It was so good with refactoring, so good in terms of, I need to go change the next 10 lines from, I don't know, grams to kilograms or whatever, or I need to add SQL tags to the comments, whatever. It was so good at that sort of thing. And I imagine Copilot probably is too, although I haven't used that. And then Codium, I think I've heard of this one. Is this any relation to VSCodium?
CONOR_BRONSDON: Uh, yes.
AJ_O’NEAL: Okay.
CONOR_BRONSDON: Codium.ai is its own website, but I do believe it's related. I'm not certain here; maybe Steve's going to correct me.
STEVE_EDWARDS: Well, no, I was just going to say, a co-panelist of mine on another podcast was using Codium, and he really liked it. He didn't get into details, but the way that the AI was just sort of built into everything in the IDE, the way he described it, was very helpful. So I haven't used it myself, but I just remember his description of it.
CONOR_BRONSDON: I know they have both JetBrains and VS Code free versions. So I do believe that's right.
STEVE_EDWARDS: Oh, JetBrains? I use PhpStorm. I'll have to check that out.
AJ_O’NEAL: Well, I'd be happy to pay for their Vim plugin when that comes out. Okay, is this one more privacy-focused or something? Do you know what makes it different?
CONOR_BRONSDON: I'll be honest, I don't. I don't know a ton about what makes them different. I've been pretty focused on looking at Copilot, because the data set's a lot bigger, and on evaluating our metrics across three areas. So, adoption, a.k.a.: are you opening more pull requests? Are you getting more pull requests merged? Are we just seeing more code flowing through? Then the benefits piece: are you merging more frequently? Is coding time going down? Are more stories getting completed? Are you being more accurate when you plan stories? And then we're also starting to look at the risks: are PR sizes bloating, is rework rate up, is review depth down because people are just kind of skimming, are PRs being merged without review, that kind of thing.
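As one concrete example of the "risk" metrics Conor lists, here's a sketch of computing the merged-without-review rate; the record shape is an assumption for illustration, not LinearB's schema.

// Sketch: share of merged PRs that received no review comments.
// Field names are assumed for illustration.
interface PrRecord {
  merged: boolean;
  reviewCommentCount: number;
}

function mergedWithoutReviewRate(prs: PrRecord[]): number {
  const merged = prs.filter((pr) => pr.merged);
  if (merged.length === 0) return 0;
  const unreviewed = merged.filter((pr) => pr.reviewCommentCount === 0);
  return unreviewed.length / merged.length; // e.g. 0.25 = 25% skipped review
}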
AJ_O’NEAL: What about, do you have anything in place for license violations? Because I think it's been shown that in many cases, it's almost a direct copy and paste from the source material. And I haven't experienced this myself, but I've heard people complain that it's generated GPL-licensed code as, like, a whole block. Which is really weird, because Microsoft, for the longest time, would not even allow their employees to look at GPL code. And to think that we've got an AI scanning all of GitHub, where, I don't know, what, 5% of the code, I'm just making up a number, you know, but it's like, MIT and Apache are probably the highest, then BSD, but there's a lot of GPL code out there. Any thoughts or observations on the licensing concerns?
CONOR_BRONSDON: I mean, I think that's a huge concern, to your point. This is one of the areas where we saw significant concerns, this compliance side of things. We don't have a full solve for it yet. I think this is something, you can expect us to have an automation out on our gitStream platform within LinearB here in the next month or two that will hopefully help track that. But it's definitely a risk that I think companies are going to have to solve. And I mean, we're seeing lawsuits over this kind of thing, not just in the code space; look at the New York Times suing OpenAI over the use of their content in models. So the legal side of this is going to be a massive thing to unfold over the coming months. No question.
STEVE_EDWARDS: Yeah, I can remember the big stink when GitHub's Copilot first came out. I remember seeing on Hacker News or other places that there were authors that had private repos, you know, with non-GPL licenses, so they should not have been indexed by Copilot, and they would type in, and here comes their code, just, you know, straight out, the way they wrote it, coming up from Copilot when it wasn't supposed to be. So I can remember that stink, and I haven't really followed it since then as to what actually happened with those instances. But it struck me as odd, in that I would think that, you know, Copilot being a GitHub product, and GitHub having access to all of the repos, how do you not have a flag in there that says, okay, if this is private, don't let it go into the accessible code? It didn't seem real complex if you own the whole back end. But obviously I'm not sure how that works; it was a big issue, though.
CONOR_BRONSDON: Yeah, it's a great question, Steve. And it's really interesting, too, because we've talked about these three use cases for it, right: writing new code, writing tests, assisting in code reviews. And then there's this whole other side of, I'll call it generating documentation. And for some of those, it's totally fine. Like generating documentation, sure, I don't care if I'm copying something; maybe there's some compliance concern. But on that writing-new-code piece, there are absolute compliance concerns. And I know Copilot is working on this, but to your point, it's surprising they're not a little farther along, given that it is a Microsoft-related product.
AJ_O’NEAL: Yeah, and I do wonder... okay, I've seen one of these reports that came out where it generated, like, entire paragraphs of text, or it would generate images that looked almost identical to the original but with basically noise on top of it. And in that case, I think that is truly a copyright concern. But I also wonder how much of this is the boy who cried wolf: how many ways are there to implement the, you know, area-of-a-triangle function? So I do want to put in my own caution there, even though I'm the one that brought this up. I haven't looked at what these people are saying to verify, oh, that's a legitimate copyright problem, as opposed to, well, how many ways in Java can you implement a JWT signing function? You know, maybe it pulled, quote-unquote pulled, from GPL code, but maybe that's because the GPL code is also the same as the MIT code and the Apache code for those three lines that everybody has in common. And I think that was one of the things in that Oracle versus Google lawsuit over Java: if I recall correctly, ultimately they ruled in favor of Google, saying, yeah, for common mathematical functions and, you know, inputs and outputs of an API, you can't copyright or patent that stuff, because it's just too basic. So, you know, there's probably some mix in there of people crying wolf and making a big deal out of nothing, and then some actual cases, like what Steve was saying, where there's some sort of proprietary code particular to a company, and then boom, all of a sudden it's been shared. And actually, that's one question that I had, about how much of this code... because with ChatGPT, the license with ChatGPT, and I wish they would make it easier to get a business account, I'm trying to get a business account and it is really hard to get a business account with OpenAI, but any time you use ChatGPT, anything that you type in or paste in goes into the pool for model learning. But on the business account, if you use the API, it doesn't go into the pool for model learning; you keep it proprietary. Do you know anything about that?
CONOR_BRONSDON: Well, I would just say that I think this is a huge opportunity for OpenAI and all these other companies here. And this is why you're seeing them get these crazy valuations: every business is going to want to secure their own code base and data, and they don't want it extending into a main ChatGPT database. So to your point, AJ, this is a reason for every business to look at that business-style version of it and pay that extra dollar: that compliance use case. And it's a classic B2B one, right? Where it's like, oh, you know, anyone can get our basic tooling, but you have to get our enterprise package in order to add the compliance piece, add, you know, role-based access control, these other things. And I think you're absolutely going to see that. I mean, we already are with all these AI companies.
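For context on the API route AJ mentions: a sketch of calling OpenAI's chat completions endpoint directly from TypeScript. The endpoint and payload shape are real; the model name is a placeholder, and whether data is retained for training is governed by OpenAI's policies, not by anything in the code itself.

// Sketch: hitting the OpenAI API directly instead of the ChatGPT web UI.
// Requires OPENAI_API_KEY in the environment; model name is illustrative.
async function askModel(prompt: string): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4", // placeholder model name
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content; // the assistant's reply text
}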
AJ_O’NEAL: All right. So now let's take the turn. Steve, do you have any more questions on this topic?
STEVE_EDWARDS: No.
AJ_O’NEAL: Let's take the turn and go into the developer experience, the developer happiness topic. So, go ahead. Let's start with the AI. What are the biggest improvements? What are people excited about? What's making them happy?
CONOR_BRONSDON: Yeah, so I mentioned DORA's report from last year; the 2023 Accelerate State of DevOps report is what I mean when I'm referencing that. I interviewed Nathan Harvey, who's the head of DORA's kind of outreach team, on our Dev Interrupted podcast last year, and LinearB was also a partner on the report data. One of the really interesting things we saw in that report is that, among the 3,000-plus organizations they surveyed last year, about 50% of them were using AI tools already, and most were already considering adoption. But among those folks, there was a correlation with improved developer happiness on dev teams. And so this is where we've started to see these great examples of teams that are, you know, saving time on documentation, saving time on tests, saving time on code generation. It's helping devs get rid of these annoying blockers on their time and spend more time on the strategic inputs. So it's a really key piece of how you can potentially make your team both more efficient and happier. And there's plenty of research, whether from Google or many others, showing that happier teams perform better. There's this positive feedback loop of, hey, I feel good, I have more energy, I maybe feel like I'm being more impactful in my work. And this is where I think a lot of companies are starting to wake up to the importance of that developer experience idea. You can look at LinkedIn, which uses a developer experience index, a specialized metric LinkedIn provides to teams: aggregate scores based on a number of different measures, such as local build times, happiness scores on surveys, a variety of qualitative and quantitative measures. And they use that to help calibrate the complexity of developing within their org. There are plenty of other companies that do that, right? Google does this, Peloton does this. Obviously there are different pieces; Peloton, for example, looks at engagement, velocity, quality, and stability. And it's really important that your organization starts to think about this in the context of: am I making my devs perform better because we're giving them the right tools, the right time, the right frameworks? And am I going to be able to recruit and retain devs because of that? So whether you're an individual developer or a major corporation just looking at dollars and cents, DevEx should be something people are thinking about.
STEVE_EDWARDS: I would like to file this under the category of "the sky is blue and the grass is green." I mean, if you have developers and you want them to do your work, it would seem logical that you're going to do best when you give them the tools that they need.
CONOR_BRONSDON: A hundred percent.
STEVE_EDWARDS: This extends to more than developer tools. There was a recent article that came out along the same lines, although it's not specifically code-related: there are whole studies that have been done about the workspace that you give engineers, and you find that engineers need office space instead of being crammed together in cube farms, for various reasons, mostly to do with focus and, you know, not being interrupted and so on. And that's one of those things that, to this day, is a no-brainer. You know, anybody who's been an engineer or has worked in environments like that will tell you, yeah, I can focus better, I can concentrate, I can get more done. And yet you still have people that cram them all into cube farms. And there are various reasons for that in terms of office space, and having only so much space, versus being able to give everybody their own individual office. I can still remember being out at Intel; Intel has a lot of their engineering offices here on the west side in the Portland area. And being out there one time, and holy cow, you literally had to have a map to know where you were going in this gigantic cube farm. I was like, how do people get stuff done in here? So, you know, sorry to go off on a tangent, but what you were saying struck me as... yeah, okay, that goes without saying. It doesn't have to be a dev job, any job: here, go do this job, but I'm not going to give you the tools you need to do the job well? What kind of output are you going to expect from them?
CONOR_BRONSDON: The challenge, I think, is that too often lately, software development and other R&D teams are being beholden to, I'll call them CFO metrics, without thinking about what's actually happening on teams and how we can improve it. And so it's saying, okay, well, we can save money by doing X, we can cut headcount on this team, and that'll make us more efficient on our cap table, and we can return more value to shareholders.
STEVE_EDWARDS: And you want to know the classic, the poster child for this right now, that everybody knows about? Boeing.
CONOR_BRONSDON: Yes.
STEVE_EDWARDS: Okay. Portland, two weeks ago: plane takes off, door panel comes flying off mid-flight, two seats get chucked out, a 12-year-old sitting next to them has his shirt ripped off. The plane comes around and is able to make a landing. It was an Alaska Airlines flight, which happens to be an airline I fly quite a bit. And this has been an ongoing thing with Boeing ever since 1997, especially with the 737 MAX line of planes. I've seen articles written ad nauseam about how Boeing used to be an engineer-driven company, and everything was about quality, didn't matter the cost: let's get a good plane made. And that's how they developed their reputation. And then in '97 they merged with, was it Pratt & Whitney, or who was it that they merged with? There was another giant aeronautics company. McDonnell Douglas, I think it was McDonnell Douglas. And the joke was that McDonnell Douglas bought Boeing with Boeing's own money, because they came in, and pretty soon everything was driven by cost and cost accounting: let's save money, let's not do anything that's going to cost a lot of money. So Boeing decides, hey, we need to keep up with Airbus, but we don't want to take 10 years to develop a whole new product that's going to actually compete, so let's just take our 737 and modify it and tweak it. So right out of the gate, you've got planes crashing because of software that overrode the pilots and forced the plane to dive into the ground. That happened at least two times. And then they started outsourcing production, outsourcing the building of a lot of this, which is where this particular problem was tracked to, plus a lot of the QA issues that happened there, all because quality went out the window, and you've got C-suite people who weren't engineers making decisions solely based on money and not on quality. So, sorry, that's my rant, but it comes right to mind as soon as I hear that.
CONOR_BRONSDON: That's an incredible example, Steve. And I think it really correlates to something we're seeing a lot in software engineering, right? We could name layoffs that have happened in January ad nauseam, whether it's Amazon layoffs, YouTube, Discord, wherever else. And a lot of the reason for them might be that Section 174 of the US tax code got changed, which has now made it so that you cannot expense, write off, R&D costs in the United States on your taxes; you have to instead capitalize them over five or 15 years, making it more expensive to develop. And companies are saying, oh, well, we can just outsource this, let's not actually pay attention to the quality piece. They're making those same decisions that Boeing did, to your point, Steve. And I think you brought up something else that was really interesting, which is the idea of the space that we build for developers. And I think it's important to think not only about, you know, the office space, the meatspace, I'll call it, but also the digital space. Platform engineering teams can significantly enhance developer experience by building a tight information loop that identifies pain points and gathers dev feedback. And if they use the right internal developer portals to do this, you can actively solicit developer input via surveys, and use quantitative measures like local build times and cycle time to understand and track blockers, and broadly improve developer experience. So thinking about both how we're setting up devs in person and how we're setting up devs in digital space is really crucial, and I think you guys are doing a great job highlighting that. Because if we instead spend our time thinking, okay, what's the most cost-effective solution, well, it's probably, you know, pushing our dev team off to some other country and cutting our US team entirely. And there are major risks that come with that, but they're not usually risks in the moment; they're usually long-term risks. And too often, companies are thinking, well, I can cut costs right now, and Wall Street or my investors are really going to like that, you know, my VCs will love seeing our cap table look better here because we're spending less money on our R&D costs. And you have to grapple with whether or not that's going to be good for your company and the product you're building, whether you can actually succeed long-term. And this is where I think a lot of companies are making a major mistake: they aren't tracking developer experience metrics, they aren't tracking their engineering metrics broadly, so they could see, hey, there's an efficiency drop, or the PR quality that is being merged is lower now. And yeah, this example you gave, Steve, of the McDonnell Douglas merger with Boeing, I think is a perfect example of this in an engineering-heavy field. And I'll also say, I was supposed to be flying a 737 MAX 9 the next morning, and that flight got canceled. So I'm right there with you on the Alaska flights getting canceled and the impact there. And they then ended up finding, like, bolts loose on multiple planes, which could happen again. Very scary.
STEVE_EDWARDS: Yep, and then there was another one in Japan. There are four layers, apparently, to the windshield in the cockpit, and one of them cracked as it took off. So they came back around and said, uh, yeah, no. As far as I know, that particular 737 MAX, not all of them, the 737 MAX 9 I think it is, had been grounded. But, anyway.
CONOR_BRONSDON: Yeah, I think they're bringing some back into service once they've been fully checked over. But definitely, it's a super scary thing. And I think we need to have that same risk concern in software engineering. Sure, maybe we're not going to have a plane blow out its window, but maybe the software you're building is impacting, you know, whether that plane will dive into the ground, as you mentioned, Steve, or maybe it's impacting people's bank account information. There are very real-world concerns with both the compliance and security risks that we have to consider, and also the quality of our code base generally. And this is why I think it's so crucial that every software engineering team above a few people needs to have some basic visibility into how the team is performing and how the company is getting value out of this. And it shouldn't be about, hey, is this individual developer delivering X lines of code, because that's a terrible metric of success. It needs to be: are we delivering on our promises to the business? Are we having to refactor a bunch? Or, in fact, are we helping drive revenue by what we're doing? And I think any engineering leader today, any dev leader, needs to be stepping up and saying, hey, we have to help influence this discussion all the way up to the board level. Or else we're going to see more CFOs make that decision that you mentioned, Steve, of saying, hey, let's cut costs, let's just focus on shareholder value and not necessarily customer value. And when you start getting away from customer value, that's when these huge risks come into organizations.
AJ_O’NEAL: All right. So you've kind of touched on my favorite topic there, which is, in the world of SaaS, the customer comes last. Because you're taking money that was almost free, printed from the Federal Reserve, that's given to venture capitalists that are only interested in growth and don't necessarily care about profitability. So what happens on the Black Mirror side of things here? Do you have any thoughts on that? How do we incentivize actually caring about the customer and the developer? Or, you know, what risks are there with AI being... yeah, well, so for example, you know, people for hundreds of years have been saying, oh, this technological improvement X is going to make everybody's quality of life go up without having to work more, and we're going to work fewer hours. And this has been going on since, you know, probably the era of the factory. What actually happens every time? Well... oh, by the way, Steve, do you have a thing? Did you put that thing up on the screen? Can you take that down?
STEVE_EDWARDS: Yes... I don't think... okay. Yes, I did.
AJ_O’NEAL: All right. Oh, sorry. Anyway, I just got distracted, because I was expecting the comments to fade away after 20 seconds or something, and they didn't.
STEVE_EDWARDS: Oh, no, I gotta take it down.
AJ_O’NEAL: Yeah. So, yeah: what might the abuse situation look like, and how do we prevent it or curb it?
CONOR_BRONSDON: Oh man, that's a big one. So I'll say, I think what you're talking about here is that a lot of the value that gets generated from technological innovations, like AI, or improved machinery that helps assembly lines, or anything else that's happened over the last 100, 200 years of industrialization, has accrued to capital, because, you know, shareholders are driving this and they accrue most of the gains, and most of those gains aren't necessarily accruing to the working class. I swear I'm not a Marxist, y'all, but that's just what I'm talking about here.
AJ_O’NEAL: To be fair, the rising tide raises all ships. Like, because Amazon exists, I personally have access to products that I would not be able to get access to otherwise, without having much more specialized knowledge and having to drive much further to get to a specialized store. I also have access to fake crap, and I can't always distinguish between the product I'm trying to get and the fake version, but, like, I have access to more products. So it's not like it's all bad. But in particular, what I was pointing out is that they said, oh, you're going to work less and you're going to have more. I do think that we have more. We're certainly not working less. Do we work differently? Yeah. I don't know if we're happier.
CONOR_BRONSDON: That, I think, has a lot of other impacts, right? I think we work differently is really what it turns out to be, right? Where it's like, okay, maybe now with these AI changes, we're going to spend less time writing tests, we're going to spend less time generating basic code, but we're going to spend a lot more time strategizing on the deeper pieces of code, or how does this all fit together, or understanding and making sure these systems don't break. Maybe, like, we're seeing an increased need for platform engineering and internal developer portals and, you know, DevEx teams at major companies, because we need to make sure all these systems we've built to improve development speed actually work. And the other piece, I think, of what you're saying, and this kind of goes back to our discussion of, hey, dev teams need to have these conversations: dev leaders need to have these conversations with their CFOs, with their partners in finance and their partners in sales and all these other parts of the company. We need to be part of influencing the business discussion and saying, hey, make sure some of these benefits do accrue to dev teams. We're helping provide more efficiency and more value. We need to be able to work on things we're passionate about. We need to ensure we have the DevEx metrics that look at whether those teams are engaged, and speak to the value of that. So I do think you're spot-on there, and it has absolutely allowed us to create these incredible technological improvements. But what I think we need to be really careful about is assuming that will just mean we work less. Usually it just means we work more efficiently, and then we try to increase overall value, overall GDP, through that, and hopefully you produce more value that then gets you the next pay raise, the next step up. And I think as an industry, if we aren't helping shape that discussion by talking about DevEx, by talking about developer productivity, by being part of that discussion, we're going to let, to Steve's point, the accountants be the ones who make the decisions. And that's where we get into major trouble. Shout-out to good accountants everywhere, by the way. I'm not saying all accountants are bad.
AJ_O’NEAL: So one thing that was funny, you're talking about DevEx teams. Originally that was called DevOps. The purpose of DevOps was to make the developer experience better. Then we just rebranded ops to DevOps without changing anything.
CONOR_BRONSDON: I do agree with you.
AJ_O’NEAL: So what's the new team? What's going to happen with this DevEx team stuff?
CONOR_BRONSDON: Well, now it's platform engineering. So, I mean, to your point, it's all a bit of six of one, half a dozen of the other. We kind of use these new terms that speak to new eras of how we think about software, but we typically retain a lot of it, right? Where it's like, hey, the agile movement, that's a huge thing, and then DevOps, that's a huge layer on top of it, and now we're talking developer experience and platform engineering, building on this. I'll say my take is that it's basically: how can we improve on the last standardized way we did this, and how can we keep leveling up? But is the role very similar? Absolutely. Are many of the concepts the exact same? Absolutely.
AJ_O’NEAL: All right. Well, it is about time for us to wrap up here. Is there anything else that you wanted to share? Any topics we didn't discuss?
CONOR_BRONSDON: No, I really enjoyed the conversation. AJ, Steve, this has been super fun. Thanks so much for having me on.
Hey, this is Charles Max Wood. I just wanted to talk really briefly about the Top End Devs membership and let you know what we've got coming up this month. So in February, we have a whole bunch of workshops that we're providing to members. You can go sign up at topendevs.com slash sign up. If you do, you're going to get access to our book club. We're reading Docker Deep Dive, and we're going to be going into Docker and how to use it and things like that. We also have workshops on the following topics, and I'm just going to dive in and talk about what they are real quick. First, it's how to negotiate a raise. I've talked to a lot of people that aren't necessarily keen on leaving their job, but at the same time, they also want to make more money. And so we're going to talk about the different ways that you can approach talking to your boss or HR or whoever about getting that raise that you want and having it support the lifestyle you want. That one's going to be on February 7th. February 9th, we're going to have a career freedom mastermind. Basically, you show up, you talk about what's holding you back, what you dream about doing in your career, all of that kind of stuff. And then we're going to actually brainstorm together, you and whoever else is there and I, all of us are going to brainstorm on how you can get ahead. The next week, on the 14th, we're going to talk about how to grow from junior developer to senior developer, the kinds of things you need to be doing, how to do them, that kind of a thing. On the 16th, we're going to do a Visual Studio, or VS Code, tips and tricks workshop. On the 21st, we're going to talk about how to build a software course. And on the 23rd, we're going to talk about how to go freelance. And then finally, on February 28th, we're going to talk about how to set up a YouTube channel. So those are the meetups that we're going to have, along with the book club, and I hope to see you there. That's going to be at topendevs.com slash sign up.
AJ_O’NEAL: Well, we do picks at the end. Hopefully you were informed. A pick can basically be anything, just stuff that you're interested in. Could be tech-related, not tech-related. I'll go ahead and start to give you some time to think about something, if you hadn't, because I'm guessing if you got the email with the wrong calendar link, you probably didn't get the email with the description. Anyway, there are a couple of things that came to my mind a few weeks ago that I still haven't gone through all of. So let me see if I can scroll to where this was. So one thing that I was going to talk about, and I mentioned it today, is this whole issue where we've got the investors, and they're focused on growth, and that's a lot of what's driving SaaS companies: not profitability, but just growth. And then what happens when you've reached the full market potential of the growth? You know, what happens when everybody who wears shoes has a Bluetooth smart shoe? What happens when everybody who needs a calendar link has Calendly? What happens when everybody who wants to scroll infinitely and starve to death is on Facebook, right? And there's a video that... Let me see what the title is. The thumbnail was something like 98.9% market saturation, now what? Oh, whoa, whoa, whoa. That was really loud. Sorry about that. I need to reconfigure my don't-auto-play plugin. Yeah. Okay. So the title of the video is "Market Saturation = 98.9%. Now What?" I think this is an interesting video. It talks about this issue of these SaaS companies cumulatively all starting to get to the point where growth is no longer possible, and many of them still haven't figured out how to be profitable. So what's going to give when there's no more investment to get and no more people to reach with the product? It seems like it should lead to a collapse. I don't think that it will. I think that something else is going to step in and take its place. I fear it might be something government-related or, you know, I don't know. But this is a question that, I don't know, I guess it's not really important to think about, because nobody can do anything about it. Unless you're on the investor capital side.
CONOR_BRONSDON: We're getting really existential today.
AJ_O’NEAL: You're just going to experience it. But I don't know. It's something that I wish more people were aware of. And I really would like to see us get back to having profitable businesses rather than growth businesses, because profitable businesses are better for the customer by definition. The customer is the one who benefits.
CONOR_BRONSDON: They can often be better for the employee too, because you aren't having these kind of wild growth expectations. Instead, it's: how do we build a strong business? I've been thinking about this lately too, actually, AJ. I've considered, and am kind of in the process of, starting my own business. I've previously been a founder and business owner, and I'm very much looking at the bootstrap side of things and saying I don't want to take VC funding, because the pressure that it puts on me as a founder, on me as an employee, is so substantial, and some of the goals that get set are kind of perverse incentives, as you put out there. And I think there's a big push now for folks to say, hey, I'm going to bootstrap, or I'm going to take very minimal funding, or, and I think this is a new wave for sure, I'll take funding, but you get a share of future profits instead of, hey, you just get a percentage of the company. And I think we're going to see a continued wave of that going forward.
AJ_O’NEAL: Yeah, I hope to see that. I hope that the DHH revolution kicks off strong and hard, and we get back to making products that deliver value to customers rather than deliver a poker face for investors. I mean, it's really a Ponzi scheme, a hot-potato Ponzi scheme between investors right now. And it needs to get back to profitability, because otherwise it's just gonna lead to government bailouts. Anyway, I won't go further down that route. The other thing: a really cool keychain pin tool that I got that just turns out to be useful all the time. Let me pull this out real quick, without showing my keys on video where you could easily copy them. It's a little keychain tool with a little silicone wrapper, shaped like a pencil tip: eject SIM cards, reset a router, hit the reset button on a game controller, you name it. There's all sorts of things that have that little reset button on them, and I just have this on my keychain. And because it's in a silicone sleeve, it's not ripping my jeans or poking into my skin. They're overpriced only in the sense that the cost of shipping is more than the cost of the product. But you can get a pack of three or five of them on Amazon with the silicone sleeves, so I'll put a link to that there. Also, OpenAudible. I don't remember when we were talking about Audible last time, but thank heavens that's not a trademarkable word, for whatever reason. I don't quite know; maybe because it's a dictionary word. I don't think you can trademark dictionary words. But anyway, with OpenAudible you can download your Audible books and then convert them into a format so that you actually own them. So rather than being a rental where, when the license with that author ends, you lose access to your books, you can give your grandchildren your audiobooks. And so I often buy the MP3 CDs when they're available at a good price, within the same range of what I'd pay on Audible, the $10, $15, $20, something like that. But I would not be using Audible, and I would not be listening to as many audiobooks as I do, if it weren't for OpenAudible giving me peace of mind that this stuff isn't just going to be ripped out from underneath me one day, that I do have it preserved in a permanent format.
CONOR_BRONSDON: I'll have to check that out.
AJ_O’NEAL: Yeah, it's 20 bucks or so, but totally worth it to be able to have that peace of mind that the stuff you bought, you actually own, you know, for real. And then the last thing, since we're talking about plane stuff: there's this YouTube channel, Mentour Pilot. He goes over the accident report of every plane incident in all of history, across the globe, and I just love listening to it. It's one of those things that you can also listen to while you're falling asleep, you know, kind of half pay attention if you want to, if you just need to hear something. But it's so interesting to learn about the different plane accidents and the different things that come into play: the different parts of the report, what the people on the ground and in the control center were reporting, what was going on in the black box, what other pilots were seeing on the radar, whatever. It's not like that for every single incident, but for all the incidents that have the external data points, he links it together. He talks about the manufacturing process. He talks about the different grades of lubricant that are used on a particular shaft, and why it failed: they switched from this brand to that brand, and now it should have been on a six-month rotation, but they kept it on the 18-month rotation, any of that stuff. It's not too technical, I don't think. I mean, I'm not a pilot and it's very enjoyable to me. It is technical, but you don't have to be a pilot to appreciate it. He does go into detail on things, and I really like that. So anyway, those are my picks for the day. Steve, I left you a little time for dad jokes.
STEVE_EDWARDS: Oh, I'll take the time whether you left it or not. That's the most important part of the episode. Now, along the line of pilots, I'm working on getting my drone pilot certification from the FAA; it's called Part 107. And I've been taking a class that was recommended to me by somebody who's a fixed-wing pilot as well and runs the drone program for our fire district; it's called Pilot Institute. Really, really quite good. The guy's name, I forget the guy's name, it's Greg something; he's French, but he lives in Prescott, Arizona, and does a lot of teaching and training. Everything I've been seeing from those who have taken his course and then taken the Part 107 FAA test says you just need to know the course; everything you need is in there. And it's been really good. A lot of information to cover, for sure, but pilotinstitute.com is the URL. Really quite good. All right, so now for the jokes of the day. I've got to give a kudos shout-out to the best standup comedian of all time, without dispute. His name is Steven Wright, a deadpan comedian who is actually hysterical. Recently he was on with Stephen Colbert and let loose a couple of jokes there that were fantastic. He told this one that had me laughing out loud, and you have to imagine him telling it in his deadpan style; I'll do my best. He says, my friend Jimmy bought an electric car, and he was very excited about it. Then he bought an electric blanket, and he bought an electric guitar. And then he bought an electric chair, and I haven't heard from him in a while. Oh, that loop got turned on again. Sorry. All right. So one time when I was in school, I went to my teacher and I said, would you punish me for something I did not do? She said, no, of course not. I said, okay, I did not do my homework. And then a simple question: what do skunks say at church? Let us spray.
CONOR_BRONSDON: All right, I got to give you props, Steve. Those are some solid ones there.
STEVE_EDWARDS: Thank you.
CONOR_BRONSDON: I have high standards when it comes to the dad jokes I tell, so I do my best. I don't have a good dad joke to share, but I do have a couple of good carve-outs. So AJ, shout out for the audiobooks, huge fan. Also, in addition to getting your OpenAudible setup going, I recommend public libraries; they have fantastic audiobook selections. I'm in the Seattle area, and the Seattle Public Library has an amazing collection. Very much recommend it. Most local libraries, whether it's a county library or a city library, will be part of a larger collective that has a ton of audiobooks available to you. So if you want to get an audiobook, your public library is a fantastic way to go about it. I'll also shout out Ali Abdaal, one of my favorite YouTubers. I'm currently reading his new book, which is called Feel-Good Productivity: How to Do More of What Matters to You. Very much enjoying it; it talks about a lot of the concepts that we've kind of brought up more broadly around dev experience, and other pieces of, hey, how can I be happier in what I'm doing and get to do more things that make me happy? Really enjoying the book to kind of start off my year. And then I'll close by just saying, hey, if you haven't checked out the Dev Interrupted Substack or our podcast, we would love to hear feedback from everyone on this. It's always great to come on these other shows and really learn from AJ and Steve here. It's been so much fun; I really appreciate you guys having me on. You can check us out at devi, and maybe AJ or Steve will join us for a guest article this year. I don't know if I can talk you guys into it. We'd love to feature something from you guys if it's interesting later in the year.
STEVE_EDWARDS: Sure, I'm up for that. Awesome. Like I said, I'm not exactly what you'd call an engineering leader, but I can do my best.
CONOR_BRONSDON: Well, I'm stoked to continue collaborating with you guys. And thanks again for having me on. It's been a ton of fun.
AJ_O’NEAL: Yeah, thanks for coming. I'm glad that we had you on and got to talk about these topics and get a preview of that report from Linear B.
CONOR_BRONSDON: Yeah, super exciting stuff. It'll be out here, probably by the time this pod goes fully live on all channels, and lots of cool insights are coming. So yeah, linearb.io to check out the report, and then devo to check out our weekly guest articles and podcasts. So yeah, thanks guys, it's been a wonderful conversation.
AJ_O’NEAL: I really, really enjoyed it. All right, well, have a good one. Adios.
STEVE_EDWARDS: Adios.
CONOR_BRONSDON: Yes.