
What Is Edge Computing?

We’re hearing a lot about “the edge” lately — but what does that mean? In this episode, Sunil Pai will teach us what it means to build modern web applications on the edge.

Full Transcript


Captions provided by White Coat Captioning (https://whitecoatcaptioning.com/). Communication Access Realtime Translation (CART) is provided in order to facilitate communication accessibility and may not be a totally verbatim record of the proceedings.

JASON: Hello, everyone, and welcome to another episode of "Learn With Jason." Today on the show, we're bringing back one of my favorite thinkers about code, Sunil Pai. Sunil, how are you doing?

SUNIL: Doing fine, Jason, how are you doing?

JASON: What is this?

SUNIL: This is what the young kids say — yeah, let's fucking go. I'm doing it from my couch. Can I just do it from the couch? Just doing it from home. Yeah, this is my new thing, yeah, baby.

JASON: So, we got a lot of folks in the chat already. People are real talkative, which is really exciting. Hello, everyone. Thank you for joining us. Today's episode is going to be a little bit different from the standard format, because to be completely honest, this is going to be more of like a discussion of the future of web dev rather than a strict demo. And I'm so excited about it. How often do we get a chance to just sit and really talk through what's possible with the tech that's coming through? And kind of what it's going to unlock for us. But before we dive into that, for folks that aren't familiar with your work, do you want to give us a little bit of a background?

SUNIL: Sure. My name is Sunil. I currently work as a tech lead for developer productivity on the Workers team at Cloudflare. Some folks might know me because I spent a year on the React team and flamed out. Previous to that, I did a bunch of CSS and CSS-in-JS stuff, open source things, for a while. Oh, and people might know me from shitposting on Twitter, where — I think that's mostly it. Folks know me from Twitter. Who the fuck is this guy?

JASON: That may be my favorite. Hello, I'm Sunil Pai. You may know me from shitposting on Twitter. I guess we're going to have to put the explicit warning on this episode, but fuck it.

SUNIL: Exactly. Come on. We're all adults here. And I'm not saying anything that their parents or their bosses don't already say. It's fine.

JASON: True enough, true enough. Okay. So, today I want to talk specifically about something that, you know — you're at Cloudflare, I'm at Netlify. I think that our companies are kind of pushing forward a really big bet on the future of the Internet, which is this idea of edge compute. So, Cloudflare is in on this, Deno is another kind of avenue towards this, Netlify is building on top of Deno, Supabase is building on top of Deno. A lot of companies are realizing there might be something here, this might change things. And I'll tell you what, when I started to get my head around what edge computing possibly unlocks, I haven't been this excited about web development in a long time. So, maybe, what's the elevator pitch? What is edge computing?

SUNIL: Okay, so, first thing, I guess, it's an overloaded term. Different companies call it different things. Let's talk about some of the common characteristics, right. The first thing is that servers are as close to the users as they can physically be. Like, literally, physically. And the providers of these servers are actually making deals with telecom networks and so on, so that they have incredibly low latency. But I think the common one is you have servers as close as possible to the users, and the way to do that is to blanket the planet with servers. As many deals as you can make. Even if you make yourself an edge company today, you might start with ten cities, but the goal isn't just to be in 10,000 cities in the future — it's to have a server behind your washing machine. Everyone should have one in their house. How physically close can you get, and start running code off it. I think that's why they say edge — it's the edge of the network. That's where the word came from. But the big idea is: hey, what happens when we cover the planet with hardware and let people use it?

JASON: Yeah, so, that's kind of like — if we do a little bit of historical, evolutionary backtracking — so everybody buckle up, we're going to do, like — what was that thing called, Conjunction Junction, what's your function? Schoolhouse Rock. We're going to do that for the history of delivering content over the Internet. So, back in the early, early days of the Internet, the only way that content would go on the Internet was if you had a physical computer in your house, and you would have that on and connected to the Internet. Somebody would then enter the address of that computer, and you would be able to load a file from it. So, if you were on the other side of the planet, you were limited by the speed of light, the speed of networks. Obviously, this was prohibitively difficult to get into, so we got hosting companies — GoDaddy, all these different companies — and they would run a room full of servers for you. But, again, this is usually one server in a building somewhere on the planet. Then we started seeing AWS, Google Cloud, Azure, IBM Cloud — all these companies started building multiple data centers, and we've got AWS U.S. East 1, U.S. East 2, European data centers, Australia, et cetera. Now you've got copies of the data, or you can choose to run your servers in different places. And then, at the same time, we started putting together CDNs, content delivery networks, where you could take assets — not necessarily code, but images, HTML documents, stuff like that — and put those on lots and lots of different points of presence, PoPs, of which there are maybe dozens or hundreds — or how many does Cloudflare have? Hundreds, right?

SUNIL: Upwards of 10,000 at some point. Shit ton.

JASON: Holy crap. 10,000 different places on the Internet. So, when you make a request to Cloudflare or something like that, your request is only going ten miles instead of 2,000 miles across the ocean. But the limitation of that, of course, is that these were for assets only. So, then we started looking at — servers are a pain to maintain, maybe we could do serverless functions, but they are limited to U.S. East 1, U.S. West 2, the big data centers. So, edge compute, edge functions, or Cloudflare Workers, are this idea of the edge computing layer. What if these CDN PoPs, these 10,000 locations around the world, could run a little bit of code? Now you have a server right next door; you don't have to wait for these requests to travel. We're seeing this move — Cloudflare is doing this, Netlify is doing this, Deno is doing this. And not just the hosts. Cloudflare will give you this distributed edge network, Netlify will give you this distributed edge network — but you still have to go back to U.S. East 1 for your database, so who cares. But now — I don't know, if you're running Supabase, your request to Supabase and your request to the edge network are right next to each other. Everything is almost happening in your house, in your neighborhood at least. That's incredible. PlanetScale now has Portals, distributed edge networks of read-only replicas, to give you near-instantaneous reads anywhere in the world. Sunil, how did I do? Did I miss major points as I was trying to give that history?

SUNIL: Actually, it's pretty bang on. The one thing I think happened in parallel is that alpha engineers in the '80s and '90s believed they needed complete control of these hardware machines to do anything. And when I say they needed complete control, it's a level of gatekeeping as well — how to be a webmaster who understands how to set up services and SSH into the machine and read logs. The level of that kind of control and management actually goes down the closer to the present you come. AWS says, okay, you can use U.S. East 1, U.S. East 2 — you still had to pick which locations, but you don't get to SSH into these machines; you give us a container, et cetera. Then you go to the extreme, which is Cloudflare and, I assume, Netlify as well. You don't get to pick which part of the world it is. In fact, it tries to deploy all across the world, and really all you're given is a JavaScript function. I think that's part of the secret sauce, as well. Giving up this level of control suddenly gives you so much power that a provider like Netlify or Cloudflare can take care of for you, which is great, because it does my favorite thing that technology does, which is it makes incredibly hard things accessible to mere mortals.

JASON: Yes.

SUNIL: I don't want to learn C++ to run a server. That's not my problem. There's a whole army of people at Cloudflare taking care of this. Me, I like JavaScript. Can I write a function that takes a request and returns a response? Fuck yeah. That's all I want to do. That's the other thing.
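
For readers following along, here's a minimal sketch of the "request in, response out" function Sunil is describing, written in the ES-module Cloudflare Workers style (the greeting and paths are just for illustration):

```ts
// A minimal edge handler: take a Request, return a Response. Nothing else to set up.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    return new Response(`Hello from the edge! You asked for ${url.pathname}`, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```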

JASON: I feel like you just tapped into one of my other favorite things, which is that there's a complaint about so many distributed services, or all, you know, you need a specialized service for each thing. But when I think about it, I get really excited, because what I'm seeing is a complete flattening of the learning curve, where if you become really good at HTML, CSS, and JavaScript, it used to be that full stack developers didn't technically exist, because there was so much to learn that you couldn't possibly really be full stack. You could be kind of good over here and a little bit of information on these other things. You can, literally, be full stack now, because you can run JavaScript on the server, in serverless functions, edge compute, on the front end. If you become a JavaScript, HTML, and CSS expert, you are a one developer army. You can build anything, because all of these services have been put together that you can assemble and connect, and it's all API calls, and fetch calls. That's it, right. You're building incredible shit.

SUNIL: Another thing I like about that, because you mentioned it, is that nobody owns the technology. Well, Oracle owns the trademark to JavaScript, I think, but nobody owns the tech. Because it's so-called standards APIs that everyone is implementing, it means that, let's say you get really pissed off with Cloudflare — you can spend, literally, a week and migrate to Netlify. There are some issues — okay, this is how I have to handle environment variables — but your code is pretty much the same thing. It's a request-response function. That means companies are now incentivized to compete on the quality of service and customer experience, which is, hey, how can we keep you happy using this? Sure, we have nine-millisecond latency. If you don't like it, you'd move to someone nicer who treats you better. And I love that. Those are the perfect incentives for capitalism. They don't have anything proprietary, per se. One of the nice things about Deno and Deno Deploy is it was a forcing function for Netlify to announce open sourcing. It was very much a, you know what, now is the right time to do it, because why not. We're not competing on what runs the JavaScript anymore.

JASON: Always, in businesses, you've got a majority of the companies going, hey, let's do the right thing. Everybody is like, yeah, we should — but we don't have time. What I love about competition is that competition raises the need to do the right thing. When the competition is built on this, on experience, on treating people right, that's the type of capitalism I love, because it basically means, oh, all that tech debt, your API is awful to use — you've got to fix that, or else you're going to die as a company, because 15 other companies just shipped an amazing developer experience.

SUNIL: Right, exactly.

JASON: So, I love, as you said, that it's a really good forcing function for the way that companies make decisions when we move to this world where it's not about proprietary lock-in, not about building an experience that only works on your platform. It's about building an experience that's so good that — yeah, I can ship whatever, Next.js on Cloudflare, Netlify, whatever. But which one feels the best to use? And what else can I do when I want to get to this edge stuff — I want to do personalization. Am I using APIs that I like if I want to add extra things, or if I want to ship, I don't know, maybe our marketing site is going to be in Astro or Qwik, right? We're all competing now on making developers' lives easier, as opposed to, ha ha, you're stuck with us, because you paid for an enterprise contract, and you can't go anywhere else.

SUNIL: That's right. And we give you a certification, and that's just it. When a company says you need a certification to use their technology — cough, AWS — that sucks. It's a weird lock-in. I wouldn't call it a cult, but it's weird that you need a certification to do that. I want common code, I want to install libraries from npm that I can use on these things. But there is a difference, right — companies do think to themselves, how can we differentiate? Netlify has a couple of products — what is it called?

JASON: OneGraph.

SUNIL: Beautiful, what a concept. Hey, we know you have all these integrations. Let us make that part simple for you, and it's part of your experience with Netlify. Cloudflare goes this other way. Hey, we have this sci-fi thing called Durable Objects — what can you make with it? Holy shit, I don't know, but it looks super exciting. WebSocket-y multiplayer stuff on that. Which is interesting to me also, because none of these ideas are particularly proprietary in themselves — okay, fine, if somebody else wants to build it, they can build it. But it's not a we-got-you-to-use-one-thing and now your entire application has to be on our stack.

JASON: Right. So — I don't want to spend too much time on this, but I think both Cloudflare and Netlify talk about the Jamstack as being an application architecture that makes a huge difference. And a lot of that is you have taken the web experience and made sure it's not dictated by the business logic in your APIs, back ends, whatever it is. You're just decoupling that UI experience from the APIs or microservices you've built, or third-party systems that you rely on. That data and business logic is accessed through an API, which means you can ship the web experience anywhere. And that's a huge productivity boon to companies, because if they have some super proprietary back end that requires a lot of lock-in, that could potentially be great for your back end, but it's horrible for your users, because that back end is for the back end, not for delivering front-end experiences. And, so, I think the decoupling that we're getting out of this architectural shift, where we're seeing companies move away from monolithic applications and toward this idea of a decoupled web experience, means that we can, as front-end teams, as web teams, start looking at these experiences based on how do I get the best outcome for my use case. And we see this in research at Netlify. When we're talking to developers, people don't get locked into Netlify or Cloudflare. We almost always hear that developers have — oh, well, I deploy this site to Netlify, this to —

SUNIL: Exactly.

JASON: We all have our strengths. Durable Objects? I'm going to Cloudflare. Whatever it is — serverless functions experience, Netlify Forms — we all have our strengths and weaknesses. So, to me, that's really exciting, because, again, the incentive now is how do we make this feel better to use and be more useful, give people a faster on-ramp or better guidance or whatever it is, rather than how do I force all front-end developers to put all of their websites on this, because if they don't, it's broken?

SUNIL: I fucking love that, by the way. And that's also appealing to me as a user, because — well, as a web developer, 9 out of 10 of your ideas are really just side projects that you spin up on the weekend. And any time you start — okay, it's Friday evening, Saturday morning. Friday evening I went out drinking. Saturday morning I want to start up something. What is the shortest path to getting the shit working? I appreciate boilerplates, framework boilerplates, but even with those you sometimes have to learn the tech, how to do it right. That's why Remix is nice, because there's the whole standards thing, which is quite appealing.

JASON: Yeah.

SUNIL: But it's also very much a, hey, can I start with truly writing a single function, request in, response out? Because of this standards-based thing, you can technically run it in the browser itself directly. For a first prototype, I make an index.html and a JavaScript file, I run it there, and I figure out how to put it on the edge afterwards — I'm not ready for that just yet. That is strangely liberating, because I can remember at least two or three years ago — and honestly, part of it still persists today — it used to be dead simple. It also wasn't super capable ten years ago, but dead simple: npm install express, copy five or six lines.

JASON: That was the game. Yeah.

SUNIL: That's it, you're there. Then it became this whole thing, where, okay, fine, you got it working, now where are you going to deploy it? Shit, I need a credit card number, this, and that. And then React started feeling a little like you have to set up a whole thing before you get there, and that started feeling weird. But with the return of streaming APIs and the edge thing, where people are building edge-native frameworks, again, it's starting to crunch back down to where people can focus on: this is the idea I want to execute. Another thing I was thinking of is that the edge also works like you were mentioning, right — there's a back end and API that might belong in a monolithic fashion, but you can bring other things here. Turns out there are ways to bridge that. This was such a good business idea. I don't know if you know Max and Tim, who built something called GraphCDN, and it's the simplest idea executed so well. Hey, you have an API which is expensive as hell to run. You just install this one thing in front of your service and enable GraphCDN, and we give you caching and replication across the world — and they are using Fastly, I think, which is another great service. And now your thing is 10x better as a service, instantly; you can do this in two hours. That's a way, without making changes in your own code, to suddenly reach the edge. I wish I'd come up with that idea. Such a good idea, shit.

JASON: And it's interesting, too, because it really does point to — a lot of times the ideas that work are the simplest ideas. They are the things where, like, nobody wants to go and set up a CDN, you know, nobody wants to go and wrangle a Varnish cache. That's not fun. That's not a useful application of development hours. And to your point about boilerplate, no one wants to configure a domain name. No one wants to set up the domain routing so you have the proxies in place to have your API run from serverless functions, but under your main site's domain. All that stuff should just disappear, and that's the premise that, on the Netlify side at least, we're built on. You deploy your site code and a folder with serverless functions in it and edge functions in it, and we'll wire those up under one domain name, so you can build. I want you to go from an empty folder to building your idea in five minutes. Looking at a running edge function in five minutes. From there you're plugging in ideas, instead of figuring out how to wire things together. So many of my side projects died on the vine because I spent my entire weekend trying to get the stack right. Whatever.

SUNIL: I have a life. That's it. Once you enter your 30s, you're extremely conservative with how much time you spend on building a stack. I love that, by the way. Netlify's folder of functions. Man, that's just it. It's an idea I fought against for years, but now I realize the tradeoffs are so worth it. Next.js does this, folder of React components

JASON: Routes, yeah.

SUNIL: You get routing for it. Netlify does this as functions. As a nerd: what if you want to do anything custom? No, fuck it. This is just a superior way to do this thing. I don't want to set up a router and wire up little /xyz/[id] routes. PHP — this is why PHP won. Everybody thought it sucked, but it still works. Shit.

JASON: Yeah, I think that's actually one of the things that I love. And this is where I think Remix got this really right, where they start with folder-based routes — you've got your app folder, throw a page in there, those are your routes, those will get rendered anyways. You can do the wildcard routes if you want, or you can straight up pull in React Router and do a full SPA as a subsection of your Remix app, if you do have a particularly aggressive thing that you want to build. I think that's an interesting and powerful approach. You're basically inverting it, right. Instead of saying, in order to do anything you have to know everything — by default, this is going to make your life easier, but you can go as far as you want. I've written about this before. What is it the Nielsen Norman Group calls it — progressive disclosure. So I wrote a post about progressive disclosure of complexity, talking about specifically this, right. Whoops, where is it? Progressive disclosure of complexity.

SUNIL: I found it, yeah.

JASON: But, you know, these are the kinds of concepts where I want — by default, if I take every shortcut, every default, I'm going to make something good, performant, easy to grow. As I hit the edges of the defaults, I can opt into more complexity. But that should be driven by business needs, not by, oh, yeah, you've got a pile of Webpack, Babel, and React, and now you're bringing taste in, you're setting up ESLint rules, Prettier config, and now every single project is functionally identical. You're just putting React components on the Internet. But they are completely opaque to anybody who doesn't work in that code base. And that's not because anything functionally changes. It's just because we spent four fucking weeks making this Webpack config so custom, and now nobody uses it but me.

SUNIL: What's even worse is it's sinister, because it's seductive. You feel like you're doing real engineering. Oh, I installed the Webpack plugin, installed this, wired this. Tell me this, does the Netlify CLI work with an empty folder of functions? I suspect it would, right?

JASON: Yeah.

SUNIL: Does it tell you, hey, you have an empty folder, add a single file and it will start working?

JASON: We have a netlify functions:create command that will put a hello world in for you.
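
As a hedged sketch, the hello world that kind of scaffold drops into your functions folder looks roughly like this — the actual generated file may differ, but this is the documented Netlify Functions handler shape:

```ts
// netlify/functions/hello.ts — a minimal serverless function.
import type { Handler } from "@netlify/functions";

export const handler: Handler = async () => {
  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Hello World" }),
  };
};
```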

SUNIL: That's just so nice. Otherwise you're thinking it's nice to be able to run the CLI with no input. Then have the CLI tell you, oh, welcome to this, you can start doing things. That's an idea I need to steal. So, edge compute. Back to edge compute.

JASON: Hold on, before we do that, I just realized we're 30 minutes in. I want to do a quick shout-out. We have live captioning on this show. Let me switch over and actually show this to everyone. So, we have live captioning on the show. We've got Ashly with us today taking down all these words. Thank you so much, Ashly, for writing down our swears. And that is made possible through White Coat Captioning and our sponsors, Netlify, Nx, and Backlight, all kicking in to make this show more accessible to more people. And we are talking to Sunil today, who is on Twitter, shitposting, as he said, as threepointone, which is a joke that took me way too long to get.

SUNIL: Actually, it's — well, did they get it? Did folks in chat get it?

JASON: Do you get it, chat, do you get it?

SUNIL: No? Come on, man! I have to do this

JASON: It's because Sunil's last name is Pai, and pi is 3.14 — threepointone, get it?

SUNIL: It sounded so clever years ago when I started using it, but if you ever have to spell out your email address on the phone — threepointone@gmail.com, no numbers, no dots — it's a fucking pain in the ass. I highly regret doing what I did.

JASON: This is the coaching I needed in school. Don't teach us about, I don't know, all the silly stuff that we have to learn, like woodshop. Teach us how to pick a user name you can explain over the phone.

SUNIL: At least it's not something like oh, God, no, I can't be that shrewd either. It's not something where I was in college and stoned, and something particularly rude that I have to use professionally after that.

JASON: Oh, my God. I definitely had a user name when I was 13 that just had 69 in it, you know, really ridiculous stuff like that. I had to burn down and restart my whole web presence when I was like 17 years old. I was like, oh, wait, I have to tell adults what this is.

SUNIL: That's part of your identity after a point. DoomSlayer69xxx. You don't want to be that guy. No, you don't. You don't. So, edge compute. One of the things that I make a point to do every time I go on this hype train — oh, fuck, everything, it's the future — is to be very clear about the tradeoffs. Because without the tradeoffs, you get a bunch of common questions right at the beginning. The first thing is, none of these edge providers — maybe a couple of them — run Node, which is the standard for server-side JavaScript right now. And this bothers people, because the first thing they try to do is npm install express. That's the nice thing about the Node.js ecosystem — at this point it has 12, 13 years of history and an ecosystem behind it. There are libraries that just work, et cetera. Express wasn't great because of its router; it was great because you had a selection of middleware you could use to set up a website at any point in time. CORS, cookies, session storage, talking to your Redis instance, whatever. Then you come to the edge, and it's, yeah, we don't do those things yet. We're working on it. It disappoints a lot of people.

JASON: But wait, but wait, right — because we do. This is actually what made it click for me when I started thinking about edge computing. Edge functions, or Cloudflare Workers, are effectively middleware for the entire web, regardless of framework, back end, anything like that. If you think about the request-response chain: I'm in my browser, I type in a URL, I go and hit whatever asset store or CDN or wherever it is, and I get a document back, and that's always going to be a collection of HTML, CSS, and JavaScript. That's what's coming back. And Cloudflare Workers and edge functions are going to let me intercept that response, or intercept that request, and do different things. I'm going to check your cookies to make sure you have an authorization header. I'm going to check to see if you've shown interest in a certain thing and swap the order of how products are displayed on this page before I send it back to you. Now we're doing personalization, authorization, all these things you'd do in Express middleware, right, but we don't need that Node server running. We don't need the underlying Express API of setting up all my GET requests. I get to say, whenever we make a request, if the URL matches this one, do this extra logic.
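
Here's a rough sketch of that "middleware for the entire web" idea as a Workers-style handler — the cookie name, paths, and header are hypothetical, and a real setup would vary by provider:

```ts
// Sketch: inspect the request at the edge, then short-circuit or pass through to the origin.
export default {
  async fetch(request: Request): Promise<Response> {
    const cookie = request.headers.get("cookie") ?? "";
    const loggedIn = cookie.includes("session="); // hypothetical cookie name

    // Gate a protected path before it ever reaches the origin server.
    const url = new URL(request.url);
    if (url.pathname.startsWith("/account") && !loggedIn) {
      return Response.redirect(new URL("/login", url).toString(), 302);
    }

    // Otherwise let the request continue, and optionally tweak the response on the way back.
    const response = await fetch(request);
    const personalized = new Response(response.body, response); // copy so headers are mutable
    personalized.headers.set("x-personalized", loggedIn ? "yes" : "no");
    return personalized;
  },
};
```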

SUNIL: Which is the tradeoff, by the way. But that's the tradeoff that you're explicit about, which is: there are things potentially still happening on those origin servers that are very familiar and that you're used to, and in the most common use case, you're either adding power and functionality or you're reducing your costs or maintenance costs on those servers. And if you're not very clear about this tradeoff, people say, oh, shit, you mean I do still need to run AWS somewhere? That's why Netlify's model is nice, right — you actually provide both as a service. Cloudflare doesn't right now. Cloudflare is like, yeah, you sort of have to bring your own. But also, have you considered using Remix and hosting your whole app on this thing? And that is the tension right now, which is we're actually in the middle of this transition phase of how far can we take it to the edge. That's one of the things I like talking about. The other one is the state thing. Oh, man, the state thing — we could have a whole session just on that. Okay, so, the model that I have for edge/serverless/whatever is actually not that it's 10,000 servers on the planet. It's that it's one stretchy, balloon-like server over the planet. This is what I've been doing in meetings internally. Imagine a balloon stretched over the earth, and you get to interact with the parts that are closest to you. The thing about data is that the CAP theorem is immutable. You'll never have all three of consistency, availability, and partition tolerance, because of the speed of light. The idea is, either you have a central server somewhere, where if you write data to it and read data from it, you'll always get the truth — but if it's on the other side of the planet, you're hosed, because it takes that long to talk to. Or you spread your data across the planet, where your write goes to one place and your read comes from another place, and you might be reading stale data. Then how do you have consensus? What happens when two people send writes to the same thing, because they both thought they had a lock on being able to write to it? So, who wins? How do you do consistency amongst these things? And this problem is completely exacerbated if you start spreading — this is a general database problem, right.

JASON: Right.

SUNIL: The way a lot of people solve it is, okay, fine, only one of the data centers is where you send writes, and you'll have read replicas across the planet. And somehow we'll make that work. Anyway, 99% of all applications are read heavy. Okay, fine, for the common use case, maybe that's good. But there is this —

JASON: Yeah, there's definitely tension, and I suspect we'll find new patterns that emerge. Because I think the thing that's really interesting, too, is that you also generally have well, okay, things like Twitter notwithstanding, you generally have more readers than writers to a database.

SUNIL: Right, yes.

JASON: That could lead to having a migrating write point based on who's online, so that it's faster for them, and people who are outside the typical timeline just go a little bit further. And I think the other thing, too, is our expectations shift when we're performing operations. I know that I, personally, expect a form submission to take longer than a URL load. There's a little bit of built-in leniency there. When talking about that tradeoff, you're 100% right. I think that is a huge challenge for us. But the tradeoff is that, like, the reads and the writes are both still happening. So, when I start thinking about it like that, I think I'm okay with that. I think I'm okay with just faster reads. It's a net positive for me. At least in every use case I've found. I'm not building social media, and I'm not doing massively multiplayer online gaming. So, I know I'm not hitting all the edge cases here, but for a document kind of thing — we're updating a dashboard together, something like that — it's enough for me, I'm happy.

SUNIL: There are two things. One, as frontend developers — which I like to pretend to be, even though my day job isn't really that anymore — it means that you can now lean on good user experience. And the thing I'm thinking about is, of course, stuff like optimistic updates. Or just showing nicer spinners, leading the user through a path. Even having a small animation might give you enough cover for the write to feel instant. So, there's that. One is, of course, as user experience developers, we get to make that experience better. The second one is, it turns out over the last year — because of stuff like the edge, but also because running at the edge is cheaper and enables nice patterns for multiplayer — multiplayer data structures, so this is Yjs, Automerge, which is great, I was playing with it, it's so simple and nice, and also libraries like Liveblocks and Replicache — Liveblocks is a popular one, I see a lot of people using it — start becoming the data primitives for doing these sorts of things. And the reason that I'm calling these data structures out specifically is that I think they now realize that they need to be usable by JavaScript developers, who are incredibly entitled and want to get something running without having to learn the algorithms. You have a React component that lets me do this. So, I personally believe that last year is when it started getting more serious. This year is when it's hitting the mainstream, and I suspect by 2023 it will be way more common than not, simply because of the prevalence of edge networks. Because people will be using local caches that they write to, that then sync to a central database, et cetera. I don't know, that's just it. It's not clear to me yet whether this adds new use cases or eats into common things that people already do, but it does strike me as something that's going to become a lot more popular in the next year, yep.
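
As a tiny, hedged illustration of what those "multiplayer data structures" feel like, here's Yjs with two local documents converging after exchanging updates — the sync transport (WebSockets, an edge worker, whatever) is left out, and the field name is made up:

```ts
import * as Y from "yjs";

// Two "clients", each with their own local copy of the document.
const docA = new Y.Doc();
const docB = new Y.Doc();

docA.getText("note").insert(0, "hello from A");
docB.getText("note").insert(0, "hi from B - ");

// Exchange state updates in either order; the CRDT converges without a central lock.
Y.applyUpdate(docB, Y.encodeStateAsUpdate(docA));
Y.applyUpdate(docA, Y.encodeStateAsUpdate(docB));

console.log(docA.getText("note").toString() === docB.getText("note").toString()); // true
```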

JASON: So, actually, I want to dig into that for a minute, because I've seen a couple things that feel like they are unlocking, or potentially net new capabilities as a result of edge compute. And, so, one example is I'm actually going to jump over and show this, because it's easier to explain.

SUNIL: Cool.

JASON: Let me jump over here and get out of the double view. So, let's look really quickly at this no-JS personalization demo. What I find cool about this — this is a demo that I've made that is — let's see, disable the cache, and also disable JavaScript. Disable it. Okay, so, what this is —

SUNIL: Oh, shit, does Chrome DevTools now have a command chooser?

JASON: Command p does that now.

SUNIL: Oh, okay, that's dope. Awesome, good, okay.

JASON: So, check this out. I built this little demo. It's a list of products, and what I wanted to see, with JavaScript disabled, what would happen if I wanted to personalize this using an edge function. So, I want to see, there's two categories, corgis and food. If I show interest in food, and this is a placeholder page, but if I show interest in food and come back here, and then I do that a couple times, my score is going to go up. Did I break this whole demo? I bet I did, didn't I? Come on, y'all.

SUNIL: I think those are changed. Why is it showing the waterfall?

JASON: Are my cookies disabled? What's going on? Something is going wrong. There should be — maybe this is causing all of my cookies to clear. One second, let me try again, make sure this is actually doing what I want. Where are my cookies? Okay, there's cookies. So, I should now have food, right, there's my food stuff, I come back, and it's not updating, which means I've broken something. Okay, well, that demo was embarrassing. What it was doing before, with JavaScript disabled, was shifting the order of these. And then there was another one that I was working on that was, what if we took plain HTML and we did a link tree, and it updated in the background with your latest stats. This is all powered — if we look at the actual source code, this post count, or episode count, that's not part of what's being served here. And it was cached at the edge to make this faster, so it updated in the background. But this is all enhanced on the edge, and there's no JavaScript running on this page. So, I feel like I'm just screwing around here, but I'm imagining full-fledged no-client-side-JavaScript frameworks. Like, what does a Remix look like that does all the dynamic shit in an edge function, so that you never ship a single byte of client-side JavaScript to the browser? It feels like that's an open door here.

SUNIL: So, let me see if I understand this use case right, which is: you have a node server or some server which basically generates this list, but then you intercept it with an edge function and add all this dynamic shit. You're using the edge, and it's just so cheap. What would have otherwise been expensive and hard to integrate into this thing is now just a thing that the edge takes care of. Is that — am I reading this right?

JASON: Exactly. Check this out. I have a suite of serverless functions here, and each one goes off and hits whatever service — and checks my follower count, or whatever it is. And then I send it back with a — what is that — an hour-long TTL. So, once it runs once, it's cached at the edge, as if it was a static asset, a JSON file, for an hour. Request it again, it comes back nice and fast. At the edge, I have one that's grouping all those up in a single stats call, so I make one request. Then I go into this edge function here, and the edge function is hitting — not that — it's hitting the all call for the stats, which is that serverless function, and then it's using the HTML Rewriter, which is something that came out of Cloudflare, to look for this data-enrich="true" HTML attribute. Then it just adds in this thing — just this little span, here's the data. And that's all that it takes. And I'm not shipping any JavaScript. I'm just enhancing the response that's being sent to the browser. That's why this is so exciting to me. We're literally opening a door here, where somebody who doesn't know a bunch of back-end stuff, who doesn't know how to configure Express or deal with all the middleware, anything like that, can use technology like HTML Rewriter, use technology like edge functions, drop them in a folder alongside the front end. If we look at this, my main — where am I actually shipping here? Oh, an HTML file. Literally, an HTML file. So, I shipped an SVG icon and a link. This links to my blog. That's all that's on the page.
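
For reference, here's a hedged sketch of the pattern Jason is describing, using the Workers-style HTMLRewriter API — the attribute name, the stats URL, and the response shape are illustrative, not the demo's actual code:

```ts
// Edge function: fetch the page, fetch the cached stats, then stream-rewrite the HTML.
export default {
  async fetch(request: Request): Promise<Response> {
    // The static HTML coming back from the CDN for this request.
    const page = await fetch(request);

    // Hypothetical cached serverless function that rolls all the stats up into one JSON blob.
    const statsResponse = await fetch("https://example.com/api/stats/all");
    const stats = (await statsResponse.json()) as Record<string, string>;

    // Find every element marked with a (hypothetical) data-enrich attribute and fill it in.
    return new HTMLRewriter()
      .on("[data-enrich]", {
        element(el) {
          const key = el.getAttribute("data-enrich");
          if (key && stats[key]) {
            el.setInnerContent(stats[key]); // drop the live number into otherwise static markup
          }
        },
      })
      .transform(page);
  },
};
```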

SUNIL: Beautiful, I love it. Really nice.

JASON: This is so freaking cool to me. There's so much possibility here. I'm very, very excited about the possibilities of this, in terms of being able to do all the dynamic stuff that we cared about, and not having to ship any client side JavaScript. It's kind of a have your cake and eat it too moment if you think about it from will this work with JavaScript disabled, will this work on a feature phone, is this still performant if somebody is loading on a 1999 Nokia flip phone. Do they have Nokia flip phones in '99? Probably not, right?

SUNIL: I had a 3310 in 2003. So, I don't know. Sure, they had flip phones. I don't know if Nokia had one. I don't remember. What I also like about your code, of course, is at no point did you have to do all the other things that would have been associated with running a dynamic website, right — you didn't have to provision any servers. You don't have to look at server health. You didn't have to write server.listen and so on and so forth. It's very much a, hey, you wrote a function that took in a response object, started rewriting it, and returned it back out.

JASON: Yeah. Here's the whole config. I did a rewrite so that my functions were at an API-like URL, and then I set the edge function on the home page. That's it. That's the whole config. All the proxies, all the coordination, all the DNS — it just works. I don't have to care or know about it. I just know it's happening. And to me, that feels like — you know, this is what we're talking about when we're talking about companies competing on experience. Shit like this makes me want to build complicated things. Because I don't have to figure out how to configure Route 53 and the API gateway and AWS Lambda and whatever other CloudFront, S3 — I don't have to connect that shit together and figure out how it works. Okay, I have a folder full of logic, here's my whole app, edge functions and functions, ship it. All right, I'm done. Right? I had an idea, shipped this — I think I spent maybe two hours on this.

SUNIL: So, there must be a metric that we can pull out of this. I just learned recently that one of Supabase's internal success metrics is what they call time to first query. As soon as a user signs up, how long is the time from them signing up to making their first query? And success is making that as small as possible. That's just such a beautiful number, I was thinking to myself. It means the experience of using Supabase is simple, hopefully. So, what then is the success metric for a framework like this? What is your time to — I want to say first render, but that doesn't make sense, because you could cheat with a boilerplate. Really, I guess the question is — and I hate this phrase, because it sounds like capitalism shitting all over it — your time to business logic. How long did it take you before you started writing the code that mattered and not the code that was otherwise needed?

JASON: Yes.

SUNIL: That's not a very good phrase, but you get what I'm saying. That's the kind of thing stuff like this enables, which is, in your scenario, for example, it literally was: hey, I had to make a project and — I assume you didn't need much more than that when you started — start using HTML Rewriter. Well, you had to know the API for that. I think that's another thing — sorry, go.

JASON: I was just going to say, the thing that I find really, really fascinating about this — HTML Rewriter is a jQuery-like syntax, and the API is pretty straightforward. You know, it's got TypeScript autocomplete, so if you're using VS Code, you can look at the thing and see what's available. The thing that I find really encouraging about this is that for the vast majority of what we're doing here, you can go to MDN, look at what a Request object looks like, and you're good to go. Response object, good to go. The auxiliary stuff is all — I don't know, Workers are going to provide the geo and, you know, extra context, and edge functions do the same, where we get this context object with bonus information, but it's not like you need to care about that at all. It's only relevant if that's what you're trying to do. I want to show where you're requesting from? Great, do that. But as far as I'm concerned, I just used a bunch of web standards to do some really advanced stuff — which actually is a good segue into a question from ekafyi in the chat. How do edge functions differ from regular lambda functions? The clarification is we can serve no-JavaScript SSR pages with serverless functions. In her mind the main benefit is caching in a worldwide server closest to the user. Any other differences in your mind?
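
The "context object with bonus information" looks a little different on each provider; as a hedged sketch, a Netlify-style edge function (which runs on Deno, hence the URL import) is roughly this shape — treat the exact geo fields as an assumption:

```ts
// Standard Request in, standard Response out, plus a provider-specific context with extras.
import type { Context } from "https://edge.netlify.com";

export default async (request: Request, context: Context): Promise<Response> => {
  const city = context.geo?.city ?? "somewhere";
  return new Response(`Hello from the edge node near ${city}`, {
    headers: { "content-type": "text/plain" },
  });
};
```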

SUNIL: The big one, of course, is that lambdas right now are heavy as shit. They use a runtime, which is Node, that was not designed for this use case. So — last I checked, feel free to correct me — I think Lambda's startup time was 150 to 200 milliseconds when you hit what we call a cold start. I think Amazon literally sells you a feature called Provisioned Concurrency that helps you keep them warm, so that cold starts are effectively zero. But if you look at the new breed of runtimes — Cloudflare Workers, Bun (hi, Jarred, that guy is such a beast), Fastly — they literally run a JS runtime built for this use case; it's kind of crazy, I love it. In which case the startup time is under 10 milliseconds, and in most cases it's zero. They do a hack: while the handshake is still happening, they already start warming it up, so it's ready by the time the request comes. It's cheap to run. It's really quick to start up, which means you can use it as a — this is a big deal. Imagine if I told you that every function call cost 200 milliseconds before it even started running. That wouldn't be a good compute primitive. You'd always be thinking, how can I batch this shit, et cetera. No, with these new runtimes, you can conceptually think of it as an architecture diagram of pipes with your code. You can tell yourself, this part I'm going to run on Netlify. And you know what — fuck it, multi-provider — these functions run on Cloudflare, and because the hardware is so close and the runtime has been designed for this use case, it's almost like it's running locally; they are all running together. That's the big deal here. They've been designed from scratch for this use case.

JASON: This is — I think that if you and I are right about this, the logical conclusion of how edge compute works is exactly what you said, where we're literally able to ship little logical containers to every router in the world, because, honestly, that's how things are going to work. You'll just kind of slowly get these warm connections to services that you go to, and they'll be smart enough — hey, I hit this website a lot, let me pull all of their CDN stuff right into my router, so it just executes here. Boom, done.

SUNIL: Exactly.

JASON: And there are so many interesting outgrowths of that. What does it mean if we start looking at this idea of decentralization — crypto notwithstanding — just this idea that you can kind of do data ownership in this sort of way, where it's distributed across the world? That doesn't mean that you have to have a server in your house. I think this is kind of where that conversation breaks down for me. Own your data. Okay, well, what does owning your data mean? It means you have to have a physical device your data is on? I'm never going to do that. Cool, this conversation is dead to me. But if I'm able to use services that duplicate the data, so I am the de facto owner of it in the sense that I've got all the copies, and if one service goes down, who cares, it's there — that's distributed globally in a way that is really close to users. Now I'm kind of interested in this idea. Okay, let's see how we can do this: get redundancy, figure out how to get data close to users, and make sure it's not centralized to one server cluster in one region. It really does kind of get my gears turning about how many things we can do.

SUNIL: The data ownership thing is also, as you can imagine, something we talk about internally at Cloudflare. We have a tech called Durable Objects, which, for the sake of discussion here — and it's very sci-fi — say it's one JavaScript object that can live in the one data center that's closest to you, that you can spin up. And one of the interesting APIs it has is, when you create it, it has a configuration parameter called jurisdiction, which currently only takes one value: EU. And the idea is, if you have GDPR regulations to follow and you pass this jurisdiction of EU, it makes sure the objects are hosted only inside the EU.

JASON: Fascinating.

SUNIL: That is the only value it takes now. It can expand to multiple countries — India, whoever — and we'll expand that. But this data ownership thing: let's say I can buy a stupid rack from Cloudflare and, like I was mentioning, I'll put it right behind my washing machine. Now the jurisdiction is the ID of this server, boom. I kind of have data ownership locally on hardware that I own, and I can write a little Worker on top of it that lets you access it if you pass some auth and some values.
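
For the curious, the jurisdiction parameter Sunil mentions looks roughly like this in a Worker — a minimal sketch assuming the documented Durable Objects API and the @cloudflare/workers-types ambient types; the USER_DATA binding name is made up:

```ts
// Pin a Durable Object to a jurisdiction so its data stays inside the EU.
export default {
  async fetch(request: Request, env: { USER_DATA: DurableObjectNamespace }): Promise<Response> {
    // Ask for an ID whose object will only ever be hosted in the EU (the GDPR case above).
    const id = env.USER_DATA.newUniqueId({ jurisdiction: "eu" });
    const stub = env.USER_DATA.get(id);

    // Talk to the object over fetch, like any other service.
    return stub.fetch(request);
  },
};
```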

JASON: More practically, if we think about a lot of bigger companies, they are doing this stuff with local VPNs and firewalls. I want the jurisdiction to be my VPN.

SUNIL: On prem, this is your answer for on prem.

JASON: Yeah. I watched IBM tie itself into logical knots to explain the concept of hybrid cloud, where it was like half on prem, half on cloud, so they can convince companies they did in fact own all of their data. It ended up being a lot of stuff like that. It was really complicated firewall rules, VPN access control, blah, blah, blah. And this idea, again, if what companies are competing on is the experience of setting this stuff up, if I get to say my jurisdiction is my company, and, you know, EU gives me GDPR rules, my company gives me VPN firewall lockdown, and the only config I had to do is tell the company I'm an enterprise customer and these are, whatever, the IP ranges or something. Done, great. I will hand you my credit card gleefully. I will underhand shovel buckets of money at you if it means I never have to open up a next config myself.

SUNIL: That's exactly right. In fact — is this actually the return of late-'90s, 2000s web engineering, where you'd have a box underneath your desk, so when you're demoing to your CEO you're connecting directly to the machine, and then you click one button and suddenly it deploys to the rest of the world?

JASON: Unbelievable.

SUNIL: I think that's what we should do. Implement FTP on workers. That's a product idea I should take back.

JASON: Yeah, you build that, because I want nothing to do with building it. But, no, this is what I find so exciting about this space. We've been talking about this for barely an hour, and the ideas just pour out, right. And I imagine — chat, what about you? I imagine there are unlimited use cases. What's happening in your brains? What do you want to know about this? We have 30 minutes, you have Sunil Pai unlocked, he's paying attention. What do you want to know? What are the things you're curious about or excited about? What do you see being possible with this edge-based world that kind of unlocks new innovation? And while we're waiting for the chat to think — as we were doing the ten minutes of prep before we went live, you said something that was so funny, but I was in the middle of explaining something else, so I didn't get a chance to properly react to it. You were talking about the tradeoffs of, like, servers and sizes. So, I wanted to make sure we get a chance to talk about this, because it just made me happy.

SUNIL: That's just it. It's actually a pretty important architecture discussion to have. Which is: would you rather fight one horse-sized server or 1,000 duck-sized CPUs? That's the architecture decision for edge compute.

JASON: And I love this so much, because it really does start to — it changes the way that you think about things. Because you might think, oh, well, I definitely don't want 1,000 distributed CPUs, because that sounds like a nightmare to wrangle. But then I'm thinking, hold on, I remember I used to have the horse-sized server, and the major problem that I had with the horse-sized server was that I was always trying to find ways to cache things and distribute that work out to 1,000 duck-sized CPUs.

SUNIL: Exactly.

JASON: So, a lot of my problems actually stemmed from okay, I have a server, this server does all the things in one place, and I love that, but if more than ten people at a time try to access it, the whole thing explodes and catches fire.

SUNIL: Pretty much, exactly.

JASON: How do I distribute this logic in a way that allows an unlimited number of people to access it at the same time without our whole website going down. It's like, oh, well, we make new instances of the server. Oh, microservices. You know what I mean? You start naturally driving towards this idea of lots of distributed compute power.

SUNIL: This was a phenomenon go on, sorry, sorry, sorry.

JASON: The old model, we need to make copies of the big server and put that in a lot of places, but we're basically saying, okay, you can have that benefit by doing lots and lots of CPU around the world, but the server itself doesn't even need to exist. You can put little bits of logic instead of a big monolithic core buddy that does all the things in one place.

SUNIL: So, this was a phenomenon in the 2000s that used to be called getting slashdotted. And it was a very interesting phenomenon, because Slashdot — what is the website? Just Slashdot, anyways — which is another name impossible to say on the phone; I think that's why they chose it. The idea is, as a hacker you would build something interesting, and somebody would share it on Slashdot, and you would get a shit ton of traffic, and that would bring down your server. A similar thing happens nowadays if you get onto Hacker News or Reddit and don't have good caching set up. The phenomenon was particularly interesting to me because it meant the people who would suffer the most were the people who were most clever — the hacker mentality. Slashdot would bring a whole flood of traffic only if you were really good. Which sucked. I think that's where part of the mythos of being able to scale came from among engineers. What happens if you get popular and things come your way? What happens when you get 10,000 people who attend your sale? That became part of the interview process. How do you do vertical scaling — when really that was a phenomenon of an architecture that couldn't stand that kind of traffic. But now in the edge world, I don't give a shit, bro. Take care of it. Why do I care?

JASON: I think about that a lot. I remember when I was younger, earlier in my career, standing up somebody's site getting ready for Black Friday or something. We knew, a bunch of ads, a promo, everybody wants this 50% off deal. We're expecting hundreds of thousands of concurrent viewers. So, I'm standing up simulation software that's going to ramp up the number of concurrent connections. And now if I were to try to do that with my personal site, I've spent zero seconds thinking about how to scale, I would run out of memory on the simulation software before my site would go down. The architecture is so different now, the scale is not even a consideration.

SUNIL: Yep, exactly. So weird, because, sure, you might face this in an edge case, which is, let's say — I don't know, man, John Lennon comes back to life and has a one-night-only concert in London, sure. That database is going to take a pounding, but even then you'll put a queue in front of it and stop it after a while. Having to scale a server used to be such a fundamental part of being a web developer, and now it's like, two days before you go live: have we set up our caches with our provider? Yeah, okay, cool, done. Tick the item. I love it when technology gets commoditized like this. The young smart hackers can build things, and their servers won't go down when they get popular. Okay, I have an idea to pitch to you. This is service as a platform. This is something fundamentally new that edge providers can give us. So, internally at Cloudflare we call it Workers for Platforms. I don't think there's anything fundamentally proprietary about this; I suspect every provider will offer some version of it. Okay. So, there are a number of use cases — companies, organizations, products — that get a lot better if their users are able to upload code and they can run it. The two big ones that come to mind: Shopify lets you upload code to customize a shop. In fact, they just announced this thing two days ago. You can write React components for your shop, upload them, and Shopify runs them for you. Components for e-commerce — I assume a checkout button, things for browsing products, stuff like that. The point is, Shopify can run the code. Similarly — I suppose you do know this — Discord doesn't have a hosted bot platform, because they don't want to spend engineering time and ops, and it's expensive, maintaining a shit ton of servers where potentially the worst kind of users are uploading crypto miners. That's why they don't host bots themselves, which is why you use somebody else's platform — first thing you do. So, that's another use case. There are others that I can think of. For example, Components AI is one I think of. Right now, you can store HTML and CSS, but it sure would be nice to run MDX directly. What if the document that you store in your database was MDX, and that's the thing? By the way, for folks that don't know, MDX is like JSX but looks like Markdown. If you want to do MDX for your blog right now, you have to own all the code, store it all in your git repo, and store your blog in the git repo. But if you want to use a service like Contentful or Sanity, I guess — I don't know what other ones are out there — with these services you can only store JSON or data or strings. You can't really store code there. But what if you could upload code to them, so every time you make a request, you get a response object back? The use case I think of is a product company that shuts down between 10:00 p.m. and 6:00 a.m. based on where you're coming from, because it wants you to get eight hours of sleep. Sorry, we're not selling you stuff right now, go back to sleep.

JASON: A very real example of this is B&H Photo Video, the AV equipment store. They will not sell you anything on Shabbat. That's a super real-world example.

SUNIL: Here's one I would build. Auth0 is a multi-gazillion-dollar company. If I had to build a competitor to it, what if the API was: okay, I'll do the authentication for you, you can choose Google, Facebook, GitHub, whatever, but you give me a function that takes a user as input and returns true or false. That's it. That's my whole fucking thing. You can make a call out to your Active Directory, or hard-code four usernames, which are the people in your company as you're starting your startup. This is a fundamentally new architecture. Hope you don't mind if I do a brain dump on you.
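
A rough sketch of what that hypothetical auth-as-one-callback service could look like. Everything here, the authorize export, the hard-coded usernames, the commented-out platform side, is invented to illustrate the shape of the idea; it is not any real product's API:

```js
// What the customer uploads: one function that takes a user and returns true or false.
// Hard-coding four usernames is exactly the "we're a four-person startup" case.
const founders = ["jason", "sunil", "ada", "grace"];

export function authorize(user) {
  // This could just as easily call out to Active Directory or an allowlist API.
  return founders.includes(user.username);
}

// What the hypothetical platform would do on every login, somewhere on the edge:
//
//   const { authorize } = await import(customerModule);
//   const ok = await authorize(userFromGoogleOrGitHub);
//   return ok ? issueSession(user) : new Response("Forbidden", { status: 403 });
```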

JASON: This is literally what I wanted. This is exactly what I wanted.

SUNIL: So, there's something called the Hollywood principle in computer science. It's how they explain the difference between a library and a framework. In Hollywood, when you're starting out as an actor, they tell you: don't call us, we'll call you. React is a framework, because you provide your components and it runs them whenever it decides to. It decides how to do the scheduling and when effects fire and so on. Something like Underscore is a library, because you take the function and you call it. You actually call it, pass it data, and it does stuff.

JASON: That's a good heuristic.

SUNIL: But the problem is it's not complete. You do call a function when you use React. Secondly, when you do underscore.map, you do pass a callback that it calls. Okay. So, it's a good heuristic, but it's never perfect; there are clearly exceptions, all libraries take callbacks. Anyway, the reason I brought this up is that the difference between libraries and frameworks is the same as the difference between services and platforms. A platform is something like Netlify, whatever, where you upload your code and it runs it in response to requests and sends back responses. Versus a service, say a database, where you actually say, hey, give me this data right now, and it gives it back to you. Now, what if, as a user, you could pass services callbacks that they run internally? What if auth could offer you that? The reason I listed all those use cases, the reason I started with Shopify and Discord, is that until now, in 2022, this has been the domain of only big companies with money to spend on setting it up, and ops teams, and figuring out how to run it efficiently.
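
To make that library-versus-framework (and service-versus-platform) distinction concrete, here's a tiny, invented JavaScript illustration: with a library you call the function and hand it data; with a framework or a platform, you hand over a handler and it decides when to call you:

```js
// Library style: you call it, you pass the data and the callback, you decide when it runs.
const doubled = [1, 2, 3].map((n) => n * 2);
console.log(doubled); // [2, 4, 6]

// Framework / platform style: you register a handler, and the platform calls you
// whenever a request shows up -- the "don't call us, we'll call you" part.
export default {
  async fetch(request) {
    return new Response(`Hello from ${new URL(request.url).pathname}`);
  },
};

// "Service as a platform" is the same inversion applied to a service: instead of only
// asking it for data, you also hand it a function (like the authorize sketch earlier)
// that it runs on its side.
```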

JASON: If you think about just the spend on the team and baseline infrastructure, you probably need a million, minimum half a million, ready to spend every year.

SUNIL: Right. And time. You're going to take months to do it, and who knows if it works. You can only do this when you're a big company. You know who is good at accepting user code, managing it, and running it cheaply for you? Fucking CloudFlare, and edge providers in general. It's technology that's easy for edge providers. So, you reach out to CloudFlare, or anybody else in the future, and say, hey, can we set up a platform? You basically set up a namespace, it gives you an API call for uploading a string of JavaScript, and then you call that code whenever you want. It handles the details.
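
To ground the "it gives you an API call, then you call it whenever you want" bit: with Workers for Platforms the rough shape is that you create a dispatch namespace, upload each customer's script into it over the HTTP API, and then a small dispatch Worker of your own routes requests to the right customer script. The binding name and the URL scheme below are assumptions for the sketch, not anything the product requires:

```js
// Dispatch Worker sketch: routes /customers/<name>/* to that customer's uploaded script.
// Assumes a dispatch-namespace binding named DISPATCHER has been configured.
export default {
  async fetch(request, env) {
    // e.g. /customers/acme/checkout -> customer "acme"
    const [, , customer] = new URL(request.url).pathname.split("/");
    if (!customer) {
      return new Response("No customer specified", { status: 400 });
    }

    try {
      // Look up the customer's script in the namespace and hand the request to it.
      const userWorker = env.DISPATCHER.get(customer);
      return await userWorker.fetch(request);
    } catch (err) {
      // Unknown customer, or their script blew up: fail closed.
      return new Response(`No worker found for "${customer}"`, { status: 404 });
    }
  },
};
```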

JASON: What I love about this, I'm picturing you in the executive pitch meeting, you're in there, just to pull some quotes from our conversation: well, all right, all right, everybody gather around. What do developers love? They fucking love features. Four API calls, bro! I could see you

SUNIL: That's pretty much it. What I like about this, because it's four API calls, is it now means you and I, for our side project on a Saturday morning or evening, can set up a platform. You know what, the use case I've been thinking of recently is Natto.dev. Have a look. This is basically a live coding environment, one of those visual programming editors. You can make little blocks and connect them. See, you can connect them, there are variables. There are a number of tools like this, but this one also runs client side. This is a project by a smart dude named Paul Shen. I assume he can't be arsed to set up a platform for other people to run code on his servers, but imagine being able to say, you know what, you want to talk to CloudFlare, bro? We'll run your shit server side, it's fine. You can do anything, you can have secrets there, it will just run, and then you can expand it so other people can run it as a function. You want to store, sorry, go.

JASON: Anybody building a developer-focused tool has had that moment of: it would be really cool if our users could customize this with code. They're all developers. Let them customize it with code. And then you immediately say, no way, setting up a sandbox environment is way out of our reach, so you walk away from that feature set. To me, that's, yeah, I'm excited. You have my attention.

SUNIL: As a startup, one of the big things is you do fucking sales, and you get on a sales call, and the client is like, I wish it did this. Then you have to think to yourself, how can I roll out this feature for one client without making it part of the whole product offering? Now, no: say the client is XYZ. Make a file called XYZ.js, upload it to this platform, only for this user, done, that's it, fucking done. This is what I mean, what are the patterns that the edge can enable? Sure, the performance thing we were talking about, the developer experience, those are all incredibly valuable today. But what are the new, novel architectures that we can enable for the builders of tomorrow? You know what, you want to set up your own, you want to set up a Vercel for your friends, let's say, for each of your Twitter followers. Some 65,000, something

JASON: 40 something like that.

SUNIL: I was counting in dollars. Anyway, you want to say, hey, I want to do a thing where, if you're a follower of Jason Lengstorf, you get 10 MB of hosting and you can run whatever in an edge function, et cetera. Or for 12 of your closest friends, you want to have some specific feature. Hell, or you want to customize these things. These are conversations that don't end anymore with, fuck, that sounds like a lot of work, I'll do it in the future. You know what I mean?
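
That "make a file called XYZ.js, run it only for this client" pattern from a moment ago is the same idea at a smaller scale. Here's a self-contained, invented sketch of a per-client override hook; in the real version the override would be code the client uploaded to your platform rather than something bundled with your app:

```js
// Sketch of a per-client override: if client "xyz" has custom code, run it;
// everyone else gets the default behaviour. All names here are invented.
const clientOverrides = new Map([
  // In the real version this entry would be code the client uploaded to your
  // platform (their XYZ.js), not a module shipped with your app.
  ["xyz", { adjustOrder: (order) => ({ ...order, freeShipping: true }) }],
]);

export function handleOrder(clientId, order) {
  const override = clientOverrides.get(clientId);
  if (override?.adjustOrder) {
    // Only this client gets the custom behaviour.
    return override.adjustOrder(order);
  }
  // Default path for everyone else.
  return { ...order, freeShipping: false };
}

// handleOrder("xyz", { total: 50 })  -> { total: 50, freeShipping: true }
// handleOrder("abc", { total: 50 })  -> { total: 50, freeShipping: false }
```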

JASON: Every idea, I'm not kidding, every idea that I have these days, especially now given my job is mostly meetings, I get an idea, I get really excited, then I start drawing an architecture diagram, and by the second box I go, this is too fucking hard. I can't do this.

SUNIL: I don't want to do this anymore. This looks like too much work. Now, something that has been the preserve of big companies, with a lot of money and time and effort to spend, has suddenly become commoditized. The phrase I use is service as a platform, where every service becomes a platform. CloudFlare calls it Workers for Platforms. Internally it was function-as-a-service as a service. We used to say FSaaS in meetings, and it would derail the entire meeting every time. Sometimes I think of it as platform development. The point being, not only can you use the edge for compute, you can now create curated experiences for other hackers and builders very simply. Sure, those have their own tradeoffs. Not all your users are going to be programmers. Sure, you can offer it as an additional thing: you can click these options to set it up, or if you need something custom, here's a text area for you to write a function. You want to use npm modules, go for it, whatever features you want. That's the thing I've been thinking about for the last month, two months. Holy shit, this is a kind of tool that builders haven't had cheap access to before. And that's my pitch to you.

JASON: And the other thing that's interesting about this: it's not just cheap. It's free to start.

SUNIL: Free.

JASON: You can build this whole thing, and you don't have to put a credit card in, and just see if people use it. If you scale, if you get adoption, you'll have to pay, but the experimentation phase is 100% no credit card required, and that to me is a legit game changer. I actually saw a comment I love: I describe myself as front of the front end, and when I tell you edge has made me dangerous. Right. The thing I find exciting, the whole reason I joined Netlify, the reason I get excited about edge functions and serverless functions, is that as a developer who would call yourself front of the front end, the skill sets you have, the things you already know, make you a full stack dev, because the learning curve is flattened so much. And that's the thing I find the most exciting. If Stephanie's feeling is, I'm front of the front end, but this edge function stuff feels approachable to me, well, I came in front of the front end too, I was a designer before I was a developer, and I feel like I can reach into these edge functions or serverless functions and build cool stuff. I get how the fetch API works, I know how to make a call to a third-party service, that makes sense, I know what a request looks like. The world really feels flat when you start looking at these technologies. And now it's kind of limited by your imagination, not by your willingness to wade through all the boilerplate and config and connection and plumbing. Or your willingness to put down a credit card and say, hey, do I care about this side project enough to spend $9 a month to have a DigitalOcean server or a Render.com instance running, so I can build and hack on it? No, it's free, it's accessible, it's super approachable for developers who have a front end skill set, and we can just, you know, we're dangerous. I love that.

SUNIL: You used to have to be a Linux nerd to run a website. Don't get me wrong, I like Linux, by the way, but holy shit, that's the first thing that people would fail at. What do you mean I need to install an operating system to host a website? Go fuck yourself, right. Sorry.

JASON: How much damage did we do in that era? I remember, because I wasn't a Linux nerd, I didn't understand the implications of shared hosting, for example. I'm on GoDaddy shared hosting, $5 a month. Probably fine. I had my first few clients on shared hosting, and the big issue on shared hosting is wasted disk space. Oh, okay, I'll go copy/paste this cleanup script to get rid of unused images, and I ran it from the root directory and nuked the whole fucking server. Of course, I wasn't paying for the backups. I'm just, I'm a year into my career at this point. And I had to build three people a new website, because I nuked everything. Lost all their data. Fortunately, it wasn't very much data, but what a nightmare for me, and what a nightmare for my clients, because I got thrown in way over my head managing a Linux server instead of just building the front end, which is the thing I understood.

SUNIL: Just a pain in the ass. When you said Steph, is that Stephanie Eckles? Hi, big fan, follow you on Twitter, hi, Steph. Anyway, I think JavaScript specifically goes through waves like this, right, where you get very excited about some things, and then they normalize and become a stable, well-paying career, pay all the bills. Hell, it brought me to London. Then the rate of innovation not only slows down, it starts feeling humdrum. And then something like React comes along. Oh, shit, we learned a bunch of things. For the last two, three years, I shit you not, I've been feeling that same feeling. React is stable now, exploring the space, doing a bunch of cool things, but then: what the fuck is Durable Objects? I should reach out to CloudFlare. Wait a second, this is fucking awesome. Oh, what do you mean? The first thing that hit me was the nine-millisecond latency; it tells you what the latency is. I was like, what do you mean it's nine milliseconds? That makes no sense to me. Then you're like, wait a second, anybody can fucking write this. My job was to build the CLI. It's such a nice job, because I get to take credit for all the hard core shit happening behind it. Okay, I guess that's it, I'll take credit for the nine-millisecond latency and shit. It's an awesome place to be. That's why I feel so fucking excited. The nice thing about edge computing is that the tradeoffs are clear, the wins are also clear, and for a majority of use cases, not just some use cases, it's usable today, there's value today; all you need to do is kind of spread the gospel. Fuck it, let's get with it. I love it.

JASON: So, the last time that you were on the show, I'm going to pull that up, actually, because it was a great episode. We talked about build, and I'll put it in the chat for anybody that wants to watch it. In that episode, you talked about how things start to feel humdrum. Edge functions are a boring bet, but what I find exciting about boring bets is that they're a guaranteed win. This isn't the thing most of us do, where we get three years into our code base, tech debt is building up, we're getting frustrated, getting bored, and: you know what would be easier? Let's burn it down and rebuild in a new framework. And we've seen people do this. We all went from Angular to React, then from React to Next, and now from Next to Remix. All of those decisions come with benefits, but the really sticky, messy part typically doesn't change with your front end framework. You end up with, okay, we've got somebody in India on a feature phone, they are not going to have a great time loading a Remix site. They are currently having a terrible time loading a site from us-east-1. How do we improve that materially? Edge compute. The framework doesn't matter at that point. This is the thing that I think is really, really, really exciting. Yeah, Nikki is remembering that silver bullets only work on werewolf-shaped problems. I read that quote from you, it's been stuck in my head ever since, and I use it all the time. But it's just very, very interesting to me when you see something that's such an obvious win, at least for a broad swath of use cases. Again, not a silver bullet, but there are a lot of things we've been hacking around, either dealing with servers because we weren't quite ready to go serverless and didn't like cold starts, or dealing with serverless cold starts because we didn't want to deal with servers. Now we get to say, I'd like to have my cake and eat it, too.

SUNIL: Yeah, yeah. It's fine if I can't connect to a database directly from one of these functions. I'll figure that out later.

JASON: We're getting there, right. These sorts of things are coming. In the meantime, you can do what I did, which is I have a serverless function that caches the first call, because the data doesn't change more than every hour or so. So, I just do that. I just use this cached thing and it's basically like loading a JSON file from the edge. Cool, great. Fast enough, I'm happy. I don't really care if the TTL is an hour or a day, whatever. Would I build a stock picker off of it? No. Would I show the latest stats from my analytics? Yeah, I don't care if they're an hour out of date.
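
For what it's worth, here's a minimal sketch of that cache-the-first-call trick, written Workers-style against the Cache API. The upstream URL and the one-hour TTL are placeholders:

```js
// Sketch: serve a cached copy of a slow upstream API, refreshing at most once an hour.
export default {
  async fetch(request, env, ctx) {
    const upstream = "https://api.example.com/stats"; // placeholder upstream
    const cache = caches.default; // the Workers edge cache
    const cacheKey = new Request(upstream);

    // Serve from the edge cache if we fetched it recently.
    const cached = await cache.match(cacheKey);
    if (cached) return cached;

    // Otherwise hit the upstream once and cache the result for an hour.
    const fresh = await fetch(upstream);
    const response = new Response(fresh.body, fresh);
    response.headers.set("Cache-Control", "public, max-age=3600");
    ctx.waitUntil(cache.put(cacheKey, response.clone()));

    return response;
  },
};
```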

SUNIL: Can I tell you something meta, just reminiscing on something?

JASON: 100%.

SUNIL: Steph's quote on being dangerous also and the whole thing on edge functions and this thing. I might have discussed this on your last time I was on, which is I did very badly in college. I actually did really well to get into a very good college, and then I discovered weed, serious girlfriend, got into the college band. I did everything fucking wrong. I did really badly in college. There was a lot of fun, trust me. Anyway, that's a conversation that can't happen

JASON: That's the after hours, yeah. Learn With Jason After Hours.

SUNIL: But I was left in this place where I couldn't get a good job, where all my other friends did so well and got high-paying jobs. And at the time, JavaScript was there to save me. At that point it was a toy language; real engineers didn't take it seriously. The fact that a woman had to teach it to me is something I will never forget, because women weren't considered real engineers either. Some might say a lot of people still say that now. I'm like, go fuck yourself. JavaScript was there to save me. You know what, hey, if you were willing to write JavaScript for, at the time, RIAs, rich internet applications or some shit like that, people would pay money for that. It was there for me then. Then browsers got a good debugging story with Firebug, and that made me slightly more serious. Then browsers started getting frameworks, I remember Prototype.js, jQuery, et cetera, and I became a little more serious. Node came out in 2009, and then JavaScript made me a back end developer, I could write servers. React came out, and it made me a functional fucking UI developer. Then the company I was working at decided to shut down the website and go app only. I was like, what am I going to do? They were like, you're smart, you'll figure it out. I bought a book, didn't open it. This sucks, I have to find another job. I shit you not, two weeks later, React Native was announced, and I became a mobile developer. We were the first company, I don't know about across the world, but definitely in India, to put React Native into production besides Facebook.

JASON: Oh, nice.

SUNIL: Anyway, I became a mobile developer. And now it's making me a serverless full stack developer. I've had really great timing. And I've always been experimental; I feel like that's my vibe. Oh, cool tech, I want to try it out. CSS-in-JS, things like that. Because I was on the React team, that identity stuck with me: he's a React guy. I don't want to be a fucking React guy. That's not my scene, really. It is my scene, because it was interesting, but it's not my identity. My identity is, I want to use technology to enable people. I want it to provide the kind of social mobility and opportunity and friendships, really, that it has brought into my life through other people. If I can make a framework, library, or CLI that makes that easy, that's what I want to do, and that's what I think JavaScript does. To pontificate a little bit, it feels like that's what edge functions do as well. They elevate a whole group of people to achieve very ambitious plans. That's what I love.

JASON: You can do more with less, and that's the core thing. I'm going to make that the moral of the story, because, unfortunately, we ran out of time. I was thinking we might, oh, I don't know, hit 60 minutes and be out of stuff to talk about; I could go another two hours. Sunil, thank you so, so much for taking the time to hang out with us. I'm going to give a shout out to Ashly from White Coat Captioning for doing the live captioning all day today. Thank you so much for being here with us. I'm also going to drop another link to Sunil's Twitter, make sure you go and follow @threepointone on Twitter. And I dropped a link to the schedule in the chat as well. Make sure that you go and check out the schedule. We have so many good episodes coming up, including a couple that are relevant to what we talked about today, such as Auth0: we're going to be talking about their new Actions interface, which lets you run arbitrary code as part of your Auth0 flows. Cool stuff like that. Make sure you check the schedule out. Sunil, any parting words for everybody before we wrap this thing up?

SUNIL: Man, it's been a rough two and a half years, and honestly, sometimes it feels like it's only gotten worse. It feels like this is a time to lean into the fundamentals of life, relationships, staying healthy, finding a group of friends you can trust, and JavaScript will always be there for you. Those are the focus points, I guess.

JASON: I love it. Sunil, thank you so much, as always, absolute pleasure talking with you today.

SUNIL: Likewise, thank you so much for having me.

JASON: We're going to have somebody to raid. Looks like Ben is live, we're going to raid Ben. Thank you all for hanging out. We'll see you all very soon.

Closed captioning and more are made possible by our sponsors: