589

March 17th, 2023 × #Next.js #Serverless #AWS

Supper Club × Next.js on AWS + Serverless with Dax Raad

Wes Bos and Scott Tolinski talk with Dax Raad about modern hosting platforms and serverless technology, including using Next.js with Vercel vs self-hosting on AWS, Cloudflare's edge platform, and optimizing costs with thoughtful architecture.

Topic 0 00:00

Transcript

Guest 1

Welcome to Syntax, the podcast with the tastiest web development treats out there. I've got another banger Supper Club for you today. We have somebody who has, I'm gonna admit it, a cooler name than I have. Dax Raad is on today to talk about, I think, mostly serverless and hosting Next.js anywhere. He's behind a couple of different projects, SST as well as OpenNext, and we're excited to talk all about that. Welcome, Dax. How are you doing? I'm doing great. Thanks for having me on. Oh, you're welcome. You're welcome. We're sponsored today by Sentry, so let's talk about their new APM.

Guest 1

What is APM? Application performance monitoring. What it does is you throw it into your application, and it monitors the things that make your website slow. We had a show a couple of days ago about slow websites and what could possibly cause them. Sentry will detect the common culprits: large render-blocking assets, slow database queries, file I/O on the main thread. That's probably not something you wanna do. You know, in Node you have writeFileSync, which will block up your main thread, and anything else running on that thread won't be able to go until you've actually finished writing that file. That could be a big, big no-no. You're gonna wanna check it out. There's lots in this APM, and I'm sure I'll go into its features over the next couple of ad reads. Check it out at sentry.io and use the coupon code tastytreat for 2 months for free. Also check out their blog post, "The new APM: actionable, affordable, and actually built for developers." Thank you, Sentry, for sponsoring. We'll start this by just saying that 2023, I've said, is my year for AWS.
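To make that main-thread point concrete, here's a tiny sketch (file name and payload are just illustrative) of the blocking call this kind of tooling would flag, next to the non-blocking alternative:

import { writeFileSync } from "node:fs";
import { writeFile } from "node:fs/promises";

// Blocking: nothing else on this thread runs until the file is fully written.
writeFileSync("report.json", JSON.stringify({ ok: true }));

// Non-blocking: the write happens asynchronously, so other requests
// keep getting handled while we wait for it to finish.
await writeFile("report.json", JSON.stringify({ ok: true }));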

Topic 1 02:21

Wes looking at server costs, eyeing AWS but finding it too complicated

Guest 1

So I've been looking at some of my server bills and whatnot, and I say, like, man, this is really expensive to be able to do this stuff. And then I look over at AWS, and I go, that's too hard. And I'm stuck in this position of something that is very expensive but obviously easy, versus something that is, I think, cheaper.

Guest 1

We'll see. But it's just that there's so many pieces to it. So this year we're trying to get a little bit more into AWS and understand all the different parts and whatnot. You're the first person that we're having on to talk about that type of stuff. You wanna give a quick rundown of who you are and what you do? And then I thought we'd get into the OpenNext stuff first, and then we'll get into the SST stuff. Yeah. Sounds good. So, again, my name is Dax. I work entirely on open source. It's a pretty lucky situation I'm in.

Topic 2 02:45

AWS has many attempts at easier deployments but not in their DNA; focused on infrastructure not developers

Guest 2

The primary project that I work on is called SST.

Guest 2

It's a framework that helps people like yourself start to build things on AWS without fully being exposed to the monstrosity that is AWS.

Topic 3 03:19

Dax's SST framework makes AWS approachable by handling complexity for users

Guest 2

Because like you said, there are a lot of great reasons to be building on AWS, especially for things that are a little bit more serious and as you start to get to a certain scale, but it's pretty unapproachable for the average person. So our job is to really go dig through all the AWS docs, read everything, figure out all the little weird quirks, and then expose something that makes a lot more sense for the end user: figure out what kind of rough edges there are in their day-to-day development cycles and make all that smoother.

Guest 1

I'll start this with a question, and I went down this rabbit hole the other day. I put a tweet out where I logged into AWS and looked at some of the profiles I had created. And one of them was 12 years old.

Guest 1

And I tweeted, like, has it gotten any better in those 12 years? And half the people were like, no, and half the people were like, yes. So I was like, I just want to deploy a serverless function raw to AWS.

Topic 4 04:14

Wes tweets asking if AWS improved in 12 years he's used it; gets mixed responses

Guest 1

And I got into it, and there's no build pipeline and there's all these products. Why does AWS not have something like Netlify or Vercel? Like, why don't they have that? And why are there all these businesses built on top of making AWS easier?

Topic 5 04:26

Wes wants to deploy serverless function to AWS but no easy build pipeline

Guest 2

The thing is, AWS does think they have that, and they have, like, 3 or 4 attempts at building something like Netlify or Vercel or Heroku or whatever.

Topic 6 04:51

AWS has many fragmented attempts at simplifying deployments

Guest 2

Yeah, I forget who it was, but someone documented all the different ways to deploy a container, and there's something like 18 different ways you can do it, because they represent all these attempts by AWS to create a higher-level abstraction to make some of these things simpler.

Guest 2

I just don't think it's in their DNA. They are an infrastructure hyperscaler company. They're about deploying, like, a crazy amount of physical hardware across the whole world and having it work.

Guest 2

That is really far removed from what a developer experiences day to day when they're trying to build something. It's just an awkward fit for them culturally. Interesting.

Topic 7 05:26

AWS scope is massive trying to serve every business unlike focused competitors

Guest 3

Yeah. It does feel like that's been a long-running thing, forever. I mean, even Amazon.com itself, they've never had the polish, I think, in the UI department that many other companies have. And that does seem to turn people off from AWS. And like you mentioned, Wes, it opens up the door for all these other businesses that are built on top of AWS just to make it easier on folks. Right? So do you think it's a positive thing that AWS hasn't figured this out, because it enables all these other companies to exist, or,

Guest 2

do you think they'll get there eventually? Yeah. For me, I always find myself in a tricky situation, because I'll be trying to do something on AWS, I'll be really frustrated, and I'll wish that it was better. And I'll be annoyed with the teams behind certain services that are creating these problems, and then I'll realize, oh, well, if they were actually doing all those things, I wouldn't really have a job. So I definitely appreciate the gap that it creates. I think it is a good system.

Guest 2

AWS's scope is just massive. They're trying to serve every possible business out there.

Guest 2

And the reality is, to build good services, you do have to narrow your scope a lot. A lot of these companies that people like, like Vercel or Netlify, their scope is a lot narrower in terms of what they try to offer and who they try to serve than what AWS can do. And that's kinda why they have a good product. I think there's a reality where, at the scope that AWS has, you just can't really build anything that great. Alright. Let's get into the OpenNext stuff. So Next.js,

Topic 8 Next.js hugely popular React framework, tied to Vercel hosting platform

Guest 1

Huge, probably the largest React framework for building a website out there. It encompasses a lot, and it is obviously, well, it's the baby of Vercel. And Vercel is this amazing hosting platform that does images and CDN and caching and deploy previews and serverless and edge functions; they do all the stuff. And the experience of writing a Next.js app and hosting it on Vercel is a very, very nice experience. That's the exact stack and hosting platform we use for the Syntax website.

Topic 9 07:43

Hosting Next.js anywhere but Vercel not always straightforward

Guest 1

But there have often been people saying, well, it's not that easy to host a Next.js site anywhere else; they provide some outlets and whatnot. So do you wanna give us a rundown of what the OpenNext project is?

Guest 2

Yeah. So I think the premise is, like you said, Next.js hosted on Vercel is an incredible experience.

Topic 10 07:59

OpenNext project enables self-hosting Next.js apps on AWS

Guest 2

And trying to host it in other places, you have a variety of different options and a variety of different experiences you can have.

Guest 2

So you can self-host Next.js in a container. It's pretty easy for Vercel to support that because it's just a long-running process; it can literally just do everything that it needs to do.

Guest 2

But for a lot of people, self-hosting Next.js in a container is pretty different from what they get from Vercel, just in terms of costs, since it's always running.

Guest 2

It's a lot more expensive to have, like, global availability, things like that, when you have to deploy containers everywhere.

Guest 2

When most people think, I wanna self-host Next.js, they expect it to be entirely serverless, deploying to functions and the kind of serverless primitives that AWS offers, and the process for doing that isn't actually straightforward. I'm not gonna say Vercel explicitly doesn't let you do that. They actually do put a lot of effort into creating a standard build output so that you can take a Next.js app and potentially get it to be self-hosted in AWS.

Topic 11 08:39

Fully serverless Next.js deployment has challenges vs. Vercel hosting

Guest 2

But the gap between what you get from that build output and actually deploying is pretty massive. You need to, one, be an expert in AWS; you really need to understand the different ways you can do things. You also need to understand pretty detailed internals of Next.js, to understand the different components that Next actually offers.

Topic 12 09:09

Big gap deploying serverless Next.js yourself vs. Vercel's managed service

Guest 2

And there's not just a single way you can deploy to AWS. There are a lot of different ways you can deploy, with different trade-offs. You know, some functionality just cannot work in an entirely serverless environment.

Guest 2

It's really difficult, and it takes a lot of man hours to just figure that out.

Topic 13 09:45

Took lots of work to achieve 1:1 parity self-hosting Next.js

Guest 2

So that's why we started the OpenNext project, because there were a lot of kinda disparate efforts at doing this. There's an older project called Serverless Next.js that a lot of companies were using. In fact, AWS even used to use that open source project internally to power their Next.js offering. Of course, one person cannot maintain that long term. That project kinda died, and there are a few offshoots trying to do this. And there was enough interest in our community of people wanting to run Next.js on AWS. We avoided building this project for a while, we really didn't wanna do it just because it's a lot of work, but we got to a point where there was enough expertise in the community for us to actually go ahead and build it. So how many people are involved in an effort like this? Yeah. So from our core team, we have Frank, who pretty much leads this effort; he spends a lot of time on this. There was a little bit of upfront work to get to a place where there is one-to-one parity with what Vercel offers, and from there, it's more maintenance. And then in terms of helping the community, I think we have, like, 5 or 6 people that are pretty much talking about this all day.

Guest 2

None of us at SST are actually Next.js users, which is kinda funny because we're the ones putting in the highest effort.

Topic 14 10:50

OpenNext team doesn't actually use Next.js themselves, relies on community

Guest 2

So we're not experts in the details of how Next.js should work or its exact features. We really rely on our community to report how things are working and what things aren't exactly right. And they find a lot of really obscure things. Like, they'll figure out, oh, we need to pass this explicit header for this very specific feature to work, things like that. Let's move through the

Guest 1

different parts of Next.js, and we'll sort of map those to the Amazon products. So, at its very core, Next.js splits its routes. So every time you go to a URL, it makes that into a serverless function. Is that right?

Guest 2

So not exactly.

Topic 15 11:35

Next.js build outputs single function bundling all routes, not separate ones

Guest 2

So the build output produces a single function that handles all of your routes.

Guest 2

So that means it's all bundled into, like, one kind of fat Lambda function.
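As a rough sketch of that single "fat Lambda" pattern, here's what a catch-all handler can look like. The requestHandler import stands in for however the bundled Next.js server is exposed in the build output, so treat that part as hypothetical:

import type { APIGatewayProxyEventV2, APIGatewayProxyResultV2 } from "aws-lambda";
// Hypothetical: however the bundled Next.js server ends up exposed in the build output.
import { requestHandler } from "./next-server";

// Every route (/, /about, /api/anything) hits this one function;
// the bundled Next.js server decides what to render.
export const handler = async (
  event: APIGatewayProxyEventV2
): Promise<APIGatewayProxyResultV2> => {
  const response = await requestHandler({
    method: event.requestContext.http.method,
    path: event.rawPath,
    headers: event.headers,
    body: event.body,
  });
  return {
    statusCode: response.statusCode,
    headers: response.headers,
    body: response.body,
  };
};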

Guest 2

And then they do have, you know, their edge function concept, which is a little bit separate. Okay. So,

Guest 1

literally every single route is all bundled into 1 app and run on 1 serverless function? Because I remember when Vercel was, was it Now? And they moved away from Node. Mhmm. And I had an Express app running on Node.

Guest 1

And they're like, yeah, just throw it in a Lambda function, and it's like, that doesn't fit. It's way too large for that type of thing. So is there a hard limit? For some reason, I always thought that they split it up per URL. So they're literally throwing your entire application, back end, API routes, and everything into

Topic 16 12:33

Bundling all routes into one function isn't ideal long-term

Guest 2

1 serverless function? Well, I suspect what Vercel does internally might be a little bit different from the build output that the open source Next.js produces. So, yeah, you're right, bundling everything into a single function is not great long term. If you have 1 dependency on 1 route that is really large, that's something that's hitting the cold starts of, like, all the routes in your application.

Guest 2

So I suspect that Next.js or Vercel, when they deploy onto their platform, they might do some kind of splitting. It's challenging. It's easy with stuff like API routes. API routes split nicely, but it is a little bit trickier with UI routes in Next, and especially with the new app layout stuff. Yeah. Yeah. That always got me, not just in Next.js, but in general,

Topic 17 13:14

At build time don't think about discrete functions, split later

Guest 1

where it's like, yeah, you put everything into its own folder and every single route has its own package.json. And that's annoying to me, because I was like, I don't want a package.json for every single function.

Guest 1

And then other people say, well, okay, now you have this special package, the shared code, and you can put anything you want in there that's shared code. But then I'm like, well, okay, now I have to decide.

Guest 1

Can I just put it where I want it, and it will figure it out from there? And I'm assuming, or maybe not assuming, that's what Vercel does on their end: they do a bunch of tree shaking or whatnot to make the functions as small as possible, because when they're nice and small, they deploy super quick, you get nice cold starts. It's really sweet when that stuff does work. Yeah. And that's a pattern that we actually encourage.

Guest 2

In our main framework, SST, we talk about something similar. At author time, you don't wanna think about it as these completely discrete, separate things. You wanna build, like, 1 big application.

Guest 2

And at deploy time, you want it to be broken down and tree shaken and all that. That's the exact pattern we like, and that's what we encourage, because thinking that granularly when you're building something is just kind of annoying. Mhmm.
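A rough sketch of that deploy-time splitting idea, using esbuild directly rather than SST's actual tooling; the route file paths are made up:

import { build } from "esbuild";

// Hypothetical handler entry points; you author them as one app,
// but each one gets bundled and tree shaken into its own small artifact.
const routes = ["src/routes/home.ts", "src/routes/api/user.ts", "src/routes/api/billing.ts"];

await Promise.all(
  routes.map((entry) =>
    build({
      entryPoints: [entry],
      bundle: true,       // pull in only what this handler actually imports
      treeShaking: true,  // drop unused exports from shared code
      platform: "node",
      target: "node18",
      outdir: `dist/${entry.replace(/^src\//, "").replace(/\.ts$/, "")}`,
    })
  )
);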

Guest 1

And then Next.js has a concept of middleware, and those are deployed to the edge. Can you give us a quick rundown? We've talked about it on the show a few times, but I'm sure people are still not really familiar. What is the edge? Yeah. So this one is really tricky. So

Guest 2

the reason this is tricky for us to implement is that Vercel itself runs primarily on AWS. So when you run a function on Next, oh, sorry, on Vercel.

Topic 18 14:44

Edge functions run on Cloudflare, not AWS, unlike main Next.js app

Guest 2

I keep getting these 2 confused.

Guest 2

On Vercel, it runs in AWS, but a lot of their edge stuff is powered by Cloudflare.

Topic 19 15:03

To self-host edge must run in AWS with different constraints than Cloudflare

Guest 2

So when the middleware runs at the edge, it's not running in an AWS data center. It's running in a Cloudflare data center. And Cloudflare data centers, there are a lot more of them, and they're closer to the end user. They're less capable, but good enough to run things like middleware.
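For reference, a minimal middleware.ts along the lines Next.js documents, using only the web-standard request/response APIs available at the edge; the cookie name and matcher are just for illustration:

import { NextResponse } from "next/server";
import type { NextRequest } from "next/server";

export function middleware(request: NextRequest) {
  // Only web-standard APIs here: no Node built-ins, no long-running work.
  if (!request.cookies.get("session")) {
    return NextResponse.redirect(new URL("/login", request.url));
  }
  return NextResponse.next();
}

export const config = {
  matcher: ["/dashboard/:path*"],
};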

Guest 2

So a big challenge for us is, okay, if you wanna self-host all of Next.js including the middleware, we have to deploy that middleware inside AWS. And that means now there are different constraints that we have to abide by that Vercel does not, because they can pick whatever architecture they want. Yeah. Yeah.

Topic 20 15:34

Vercel uses ideal architecture without constraints of self-hosting

Guest 1

It's actually kind of nice if you go to a vendor and they say, you know what? We'll just figure it out for you. We'll make it fast. So the limitations we're talking about with edge functions, and this is true for running in Deno, in Cloudflare Workers, and a couple other runtimes we've talked about, is that it's not a Node.js environment. It's a JavaScript web environment, and you only have a certain set of APIs available to you. So I converted a tiny little site the other day, and I was like, you know, I'm going to run it on Netlify Edge, which runs in Deno.

Topic 21 16:12

Edge functions have limitations compared to Node.js environments

Guest 1

So you can't use any Node.js APIs unless Deno supports them, or large packages and whatnot. So there's a little bit of a limitation, but the upside is what? It's much faster and quicker to load, and they literally deploy them to,

Topic 22 16:26

Edge functions have fast worldwide locations despite limitations

Guest 2

what? Hundreds of different server locations around the world? Yeah. Exactly. So if you look at traditional Lambda functions on AWS, they're not opinionated about what language you use. You can use a variety of different languages that they support natively, or you can bring your own custom runtime for languages you want, and that constrains their architecture. They have to build a functions platform for running any arbitrary code.

Guest 2

Some of these edge platforms like Cloudflare, again, it's all about narrowing scope. Whenever you narrow scope, you can really up the quality.

Guest 2

They're saying, you're constrained not only to JavaScript, but to a very specific flavor of the runtime that we provide. It's good enough for most things. If you accept these constraints, we can do some really interesting things. And one of those things is basically zero cold starts.
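To show what that constrained, web-standard flavor looks like in practice, here's a sketch of an edge-style handler that sticks to fetch, Response, and Web Crypto. The exact export shape differs per platform (Deno, Netlify Edge, Cloudflare), so take the signature and the URL as assumptions:

// No require("fs") or other Node built-ins here, only web-standard APIs.
export default async function handler(request: Request): Promise<Response> {
  const upstream = await fetch("https://example.com/data.json"); // placeholder URL
  const data = await upstream.json();

  // Web Crypto instead of Node's crypto module.
  const digest = await crypto.subtle.digest(
    "SHA-256",
    new TextEncoder().encode(JSON.stringify(data))
  );
  const etag = Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("")
    .slice(0, 16);

  return new Response(JSON.stringify(data), {
    headers: { "content-type": "application/json", etag },
  });
}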

Topic 23 17:10

Cloudflare billing only for compute time, not waits like network I/O

Guest 2

Their billing, the Cloudflare edge billing, is pretty incredible.

Guest 2

Let me see if I can explain this. You only pay for compute time. That means if your function starts, computes some stuff, then makes a network request and waits, like, 500 milliseconds for the network request, you're not actually being billed for that waiting time. You're only billed when the network request comes back and you're, you know, using the CPU again. It's a very simple concept, but it creates so many opportunities for things you can build that aren't exactly possible on a traditional functions platform. I remember seeing that, because I remember looking at the, like,

Guest 1

10 milliseconds or something like that on Cloudflare Workers. I was like, that's not enough.

Topic 24 17:50

Cloudflare billing seems limited at first but works at scale

Guest 1

And then, like, who's possibly running code in 10 milliseconds and that's it? But I've never hit that. I've never hit the issue.

Guest 1

And somebody explained to me, no, it's not. A big thing that Cloudflare Workers is really good at is proxying web pages. So if you're going to nba.com and downloading all of the HTML and then waiting for it to come back, you're not paying for that time, and that's wild. Like, do they just not charge you for it, or do they literally spin it down,

Topic 25 18:25

Cloudflare billing model allows serving multiple customers per second

Guest 2

while you're waiting for that? Yeah. So, technically, your function is off the CPU at that point. It's just waiting for the network hardware to say, hey, it's done downloading, I have a payload for you. So there still is, like, a memory cost for them, but the CPU cost is what's limited.

Guest 2

So they can have other customers running on that CPU while you're waiting for your request to come back, and that's how they basically sell the same second multiple times to multiple people.
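Here's a minimal sketch of that proxying pattern as a Cloudflare Worker (the origin hostname is made up). Most of the await on fetch is wall-clock network wait, which is exactly the time that isn't billed as CPU time under this model:

export default {
  async fetch(request: Request): Promise<Response> {
    // Rewrite the incoming URL to point at the upstream origin we're proxying.
    const upstream = new URL(request.url);
    upstream.hostname = "example.com"; // placeholder origin

    // The worker is off the CPU while this request is in flight.
    const response = await fetch(upstream.toString(), { headers: request.headers });

    // A sliver of actual CPU work: copy the response so headers are mutable, then tag it.
    const proxied = new Response(response.body, response);
    proxied.headers.set("x-proxied-by", "worker-sketch");
    return proxied;
  },
};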

Guest 1

It's a model that can work really well at scale. Wow. And that's another thing that people don't often understand: if you look at pricing for a lot of these things, they charge you by compute,

Guest 2

and just regular people look at it and go, like, how do I do that? Do you have any tips on how you would calculate that type of thing? Or is it just a "you gotta see"? Pricing is a funny thing in a serverless world. It's pretty much in hindsight. I think people try to come up with ways to figure out cost ahead of time, but the reality is you don't know the cost ahead of time.

Guest 2

It's more about shifting your mindset around what your costs are. We're used to paying for things in a monthly way; if you think about our personal lives, we pay monthly bills.

Topic 26 19:29

Serverless pricing unintuitive, shift mindset from monthly to unit costs

Guest 2

But when it comes to cloud infrastructure and things involving your business, I think it's better to think about it as marginal cost, where you just need to make sure that you are charging someone else more than what your marginal cost is. So if serving 1 request costs you a penny, make sure you're charging 10¢ for that. That way it doesn't matter what your monthly bill is, because if you scale up 100x, your margin still makes sense. Right? You're never gonna go negative as long as you're thinking about, like, the unit economics there.
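As a back-of-the-envelope version of that unit-economics framing, with made-up numbers:

// Illustrative numbers only: a penny to serve a request, a dime charged for it.
const costPerRequestCents = 1;   // your marginal cost
const pricePerRequestCents = 10; // what you charge

function monthlyMarginDollars(requests: number): number {
  // Margin scales linearly with traffic: whether you serve 10k requests
  // or 100x that, you never go negative as long as price > cost per request.
  return (requests * (pricePerRequestCents - costPerRequestCents)) / 100;
}

console.log(monthlyMarginDollars(10_000));    // 900
console.log(monthlyMarginDollars(1_000_000)); // 90000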

Guest 2

But it is tricky. The pricing is unintuitive, and it's not even compute hours now, it's, like, per millisecond. Like, how many milliseconds are you gonna use this month? It's not really something that you can guess. You kind of look at it in hindsight and see what's what.

Topic 27 20:21

Serverless pricing very cheap in practice despite appearing expensive

Guest 2

Can it be, like, incredibly cheap to run stuff on this as well, though? Yeah. I think there's a lot of confusion around pricing. I've heard people instinctively say serverless stuff is really expensive. I've also heard people instinctively say that it's not expensive.

Guest 2

In practice, it's insanely cheap, and the way you compute the numbers is a little bit counterintuitive, which is why people might perceive them as being expensive. But it is, like, unbelievably cheap. I don't think I've had a bill of more than, like, $5 or something, and I run some stuff that gets serious traffic.

Guest 2

And I think the culture, at least with AWS, is they have never raised prices, ever. There are some services that are now 99% cheaper than they were when they launched. They know that the moment they leave room, someone else is gonna come in and offer a cheaper service. So you'll just see prices go down and down and down. I'd say one of my favorite examples of this is Lambda; let me see if I can remember the exact numbers. They used to charge per 100 milliseconds.

Topic 28 20:58

AWS serverless prices tend to go down over time, not up

Guest 2

So if you had a request that took 20 milliseconds,
