November 28th, 2022 · #Serverless #AWS Lambda #Limitations
Serverless Limitations
Wes and Scott discuss limitations of serverless functions and how to work around them.
- Discussing limitations of serverless functions
- Function size limits
- Sponsor - Sentry error monitoring
- Sponsor - Magic Bell notifications
- Serverless use cases and tradeoffs
- Serverless constraints vs traditional servers
- Function code size limits
- Node module sizes add up quickly
- Puppeteer hits size limit
- Chromium is too big for functions now
- Layers can workaround size limits
- Limited Node support in Cloudflare Workers
- Database access not straightforward
- Not all providers support cron jobs
- Local dev environments differ
- Sharing code can be difficult
- Bundling doesn't always optimize
- Environment variable size limits
- Execution timeouts are short
- Additional services add up in cost
- Need to know AWS well
- Account costs even before usage
- Infrastructure as code is crucial
- Cold start issues can happen
- Search offerings not ideal
- Algolia is expensive at scale
Transcript
Announcer
Monday. Monday. Monday.
Announcer
Open wide dev fans, get ready to stuff your face with JavaScript, CSS, node modules, barbecue tips, Git workflows, breakdancing, soft skills, web development, the hastiest, the craziest, the tastiest web development treats coming in hot. Here is Wes Barracuda
Wes Bos
Bos and Scott
Scott Tolinski
Tolinski.
Scott Tolinski
Welcome to Syntax.
Scott Tolinski
On this Monday hasty treat, we're gonna be pushing it to the limit, and I'm talking about the limit of serverless. We're gonna be talking about serverless limitations and some situations, maybe, where serverless has changed how you work.
Discussing limitations of serverless functions
Scott Tolinski
Not things you can't get around, but maybe some things to be aware of. My name is Scott Tolinski. I'm a developer from Denver. With me, as always, is Wes Bos. Hey, everybody. Hey, Wes. How's it going, man? It's going good. I'm excited to talk about
Function size limits
Wes Bos
Serverless limitations.
Scott Tolinski
Yeah. Serverless limitations.
Scott Tolinski
And we're gonna be sponsored by two amazing companies that have absolutely zero limitations. I'm talking about Sentry and Magic Bell. Sentry is the perfect place to log and capture all of your errors and exceptions and keep track of what's going on in your application at any given point, whether that's the UI, the API, the app, anything you could possibly imagine. You can collect all of that information, log it into Sentry, and see how many people it's affecting.
Sponsor - Sentry error monitoring
Scott Tolinski
See, you know, what kind of reach this thing has. Also, see when this was introduced. Maybe this was introduced in a specific version. Maybe it was introduced by a specific person.
Scott Tolinski
You can go ahead and create GitHub issues directly from Sentry. You can see performance metrics for your application to see how fast it's performing. And overall, this is one of those tools that, if you're building anything that people are using in the real world, you're gonna need some kind of monitoring, and Sentry is really the best. So check it out at sentry.io.
Scott Tolinski
Use the coupon code tastytreat, all lowercase and all one word, and you'll get two months for free. Let's talk about our other sponsor, Magic Bell. They are a notification
Wes Bos
inbox for your application. If you wanna add notifications to your application, you need to think about, okay, well, there needs to be something in the app, but there also needs to be, like, a push notification maybe to their phone, and maybe to Slack, maybe to email. It gets kind of complicated. Magic Bell makes all of that super easy. They also rolled out this thing called segments, which is an entire UI for segmenting your entire customer base, kind of like how you would do it in an email newsletter program. And you can say, like, alright, these people are in this segment, these people whose email matches this. Or they can be dynamic: when somebody switches a payment tier, then send them this notification. And it's a whole UI. You don't have to write code for figuring out those complicated things. And quite honestly, whenever you're doing a lot of these complicated segments, like, these are people who have opened the app in the last 10 days and they have an email address at google.com, then send them this, it makes it really easy. Check it out: magicbell.com. Sick. Alright.
Sponsor - Magic Bell notifications
Scott Tolinski
Serverless limitations. Wes, do you wanna give a little bit of background on, like, where this idea came from and what you're thinking here? Yeah. So I was just going through a little bit of
Wes Bos
updating of my personal website, and I've got, I don't know, 4 or 5 different serverless functions for doing things like generating images for Twitter previews. I've got another one that fetches my Instagram feed, another one that fetches my Twitter feed, and a couple other ones in there, and I hit a couple road bumps. And over the years of writing little serverless functions here or there, I've hit a couple little bumps. And I thought, like, yeah, serverless is awesome.
Serverless use cases and tradeoffs
Wes Bos
However, the upside of getting something that is infinitely scalable and cheap and all that is that there's always some sort of constraint. Right? Whenever you work yourself into that... and we could probably extend this not just to serverless functions, but also edge functions.
Wes Bos
And any time that you get something that is better, you generally are giving up something else. And that's just a general rule in life, I guess.
Wes Bos
So I thought, like, let's just, like, rattle through a bunch of sort of limitations.
Wes Bos
These are not necessarily bad things about serverless, but just things where you have to think about your application in a little bit of a different way than maybe you have in the past with a traditional long-running server application. Server app, not server-rendered; that's something totally different.
Function code size limits
Wes Bos
So the first one is a function size limit. There is generally a limitation to how big a function can be.
Wes Bos
Fifty megs is the one for AWS.
Wes Bos
You might think like 50 megs. Like, I'm never going to write 50 megs of JavaScript.
Wes Bos
Take a peek in your node modules. How big is your node_modules folder for everything you want? You can get there with something as simple as, like, a text-to-emoji library.
Wes Bos
You can rack up 60 megs real quick of just libraries, because server libraries don't have to worry about size. Well, they do, because that's why we're talking about it, but historically there hasn't been as much of a concern. Yeah.
Puppeteer hits size limit
Wes Bos
Where I hit it was I was using puppeteer, and puppeteer is a headless Chrome browser.
Wes Bos
And what I use Puppeteer for is I have a page on my website that renders thumbnails out, and I do that so I can have full HTML, full CSS, full JavaScript to make the thumbnails for my Twitter previews, Open Graph previews. Right. Which is great. And I know, because I do it with Cloudinary and, like... Yeah. No shade on Cloudinary because I love them, but it's certainly not HTML and CSS. I wish I was doing it that way. Yeah. It's true. And people always send me, Vercel has, like, an SVG version of it. I want full ass HTML and CSS, and I want it to take a screenshot of it. You know?
Scott Tolinski
Full ass HTML.
Wes Bos
That should be your course.
Wes Bos
Talk about let me. That's such a good domain name.
Wes Bos
So the problem is that Chromium grows every time they add a feature to Chrome. The bundle of Chromium grows, and we've hit an inflection point now where it's 49.7 megs.
Chromium is too big for functions now
Wes Bos
And so, literally, I was running Chromium, 20 lines of code, and some other library to load in Puppeteer for it. And I was going over by 3 megs, so I brought it up with the package author, and he says, yeah, Chrome is just bigger now.
Wes Bos
So you can no longer put Chromium into a serverless function.
Wes Bos
The solution there is, AWS has a thing: anytime you have something larger, like an entire browser, that needs to go on a serverless function, you use something called a layer in AWS Lambda. And the layer will just kind of have it ready for you, and then you only have to ship the code that actually runs it. And it'll be, like, 10 k instead of 50 megs.
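For illustration, here's a rough sketch of what that kind of handler can look like when Chromium comes from a Lambda layer (for example, one built from the chrome-aws-lambda package), so the function itself only ships puppeteer-core and a bit of glue code. This is an assumption about setup, not what Wes actually runs, and the event shape is made up for the example.

```js
// handler.js — sketch only; assumes a layer provides the Chromium binary
// and the chrome-aws-lambda helper, so the deployed bundle stays tiny.
const chromium = require('chrome-aws-lambda');

exports.handler = async (event) => {
  // executablePath points at the Chromium that the layer exposes at runtime
  const browser = await chromium.puppeteer.launch({
    args: chromium.args,
    executablePath: await chromium.executablePath,
    headless: chromium.headless,
  });

  const page = await browser.newPage();
  await page.goto(event.url, { waitUntil: 'networkidle0' }); // hypothetical event.url
  const screenshot = await page.screenshot({ type: 'png' });
  await browser.close();

  return {
    statusCode: 200,
    headers: { 'Content-Type': 'image/png' },
    body: screenshot.toString('base64'),
    isBase64Encoded: true,
  };
};
```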
Layers can workaround size limits
Wes Bos
But Vercel, Netlify, all these companies that make it, like, easy to host stuff, they don't give you access to layers. That's a just-AWS thing.
Wes Bos
Actually, begin.com does.
Wes Bos
So I have to move to something else, or... I talked to the author. He's going to bundle Chrome with some flags turned off. Like, there's stuff in Chrome that I don't need, like GPU stuff and 3D stuff, so maybe they can bundle Chrome without that. But that is the function limit. You'll be surprised how quickly a serverless function can go over 50 megs once you start getting everything in.
Wes Bos
Solutions to that: esbuild. That brought it down about 5 megs for me, switching from webpack to esbuild.
Wes Bos
Tree shaking will help as well. Were you using ESM? Is that why?
Scott Tolinski
Or did you stay in CJS with that and just had better luck? I was in CommonJS for the thing. I believe I tried to switch it over. Because that was going to be my question: like, can ESM help here as well? I don't think so, because
Wes Bos
esbuild and webpack know how to tree shake regardless of which type you're in. And the dependencies are not necessarily all shipped as ESM, so you still have to have a conversion process there.
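For reference, a minimal esbuild build script (an assumed setup, not Wes's actual config) that bundles and tree shakes a handler so unused dependency code doesn't count against the size limit:

```js
// build.js — bundles the handler into a single minified file for deployment
const esbuild = require('esbuild');

esbuild
  .build({
    entryPoints: ['src/handler.js'],
    bundle: true,          // inline dependencies instead of shipping node_modules
    platform: 'node',
    target: 'node18',
    minify: true,
    treeShaking: true,     // drop unused exports; works best when sources are ESM
    external: ['aws-sdk'], // already provided by the Lambda runtime
    outfile: 'dist/handler.js',
  })
  .catch(() => process.exit(1));
```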
Wes Bos
Next, what we have here: Node support.
Wes Bos
This is more of an edge thing, and I believe that it will be going away soon. But when you run a function in, Cloudflare Workers is probably the big one, but also if you use Vercel, Next.js middleware, those things are running in edge functions, and they don't have full-blown Node support in there. It is just a pared-down JavaScript environment. So if you want to run a whole bunch of stuff in that, you don't have access to all of Node. So again, there's this constraint that comes with the ability to run fast.
Limited Node support in Cloudflare Workers
Wes Bos
I believe I saw something the other day that Cloudflare is rolling out full npm support, because that has always been a sticking point for me.
Wes Bos
It's like, yeah, I'm fine with this, I guess, as an edge function. Edge generally sits in between your request and your response and sort of does stuff in the middle, kind of like middleware. That's why they use it in Next.js middleware.
Database access not straightforward
Wes Bos
But, yeah, you don't always have full Node available, and that really limits the packages that you could possibly use inside of that function. Totally.
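To make that constraint concrete, here's a minimal Cloudflare Worker sketch. Note it only uses Web-standard APIs (Request, Response, fetch); Node built-ins like fs or net aren't available in that pared-down runtime.

```js
// worker.js — module-syntax Cloudflare Worker, illustration only
export default {
  async fetch(request) {
    const url = new URL(request.url);

    // Do a little work "on the way" to the site, middleware-style.
    if (url.pathname === '/hello') {
      return new Response(JSON.stringify({ hi: 'from the edge' }), {
        headers: { 'Content-Type': 'application/json' },
      });
    }

    // Pass everything else through to the origin.
    return fetch(request);
  },
};
```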
Wes Bos
Next one: cron jobs.
Wes Bos
I have a tweet that's, like, 3 years old, and people reply to it every couple months: hey, is there a solution to this yet? Not every serverless provider or framework gives you the ability to do cron jobs.
Wes Bos
Arc, Brian LeRoux's begin.com, does have it, but things like Vercel don't have it.
Wes Bos
Does Netlify have it? Let's do a quick Google. Yeah. So Netlify has scheduled functions.
Not all providers support cron jobs
Scott Tolinski
Yeah. Render has something like that as well. Yeah. Yeah. Render's not serverless, though. Oh, I thought they had... they have functions.
Scott Tolinski
They have,
Wes Bos
Oh, cron jobs. You can have a service that is just a straight-up cron job. I guess that's not a simple exception. That's the thing with traditional servers: a cron job is, yeah, the most simple thing ever. You set up a cron job, and it runs when you want it. With serverless functions, they don't always have the ability to do cron jobs, and the solution that everybody always gives is just use this service.
Wes Bos
And it always kills me when part of your infrastructure is $8 a month to run a cron job. That's not a good solution to me.
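For the providers that do support it, a scheduled function is usually just a handler wrapped with a cron expression. Here's a hedged sketch of the Netlify scheduled functions mentioned above; check Netlify's docs for the current API before relying on this.

```js
// netlify/functions/nightly.js — runs on a cron schedule, not on an HTTP request
const { schedule } = require('@netlify/functions');

exports.handler = schedule('0 2 * * *', async () => {
  // e.g. refresh a cached feed, rebuild thumbnails, send a digest...
  console.log('Nightly job ran at', new Date().toISOString());
  return { statusCode: 200 };
});
```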
Wes Bos
Next one: local development isn't one-to-one. So, again, we talk about different environments. Cloudflare has done a really good job at making this local thing called Miniflare.
Wes Bos
It tries to replicate it, but I still run into issues, and not just with Cloudflare alone. This is with all of these things: your local environment does not look like your deployed environment.
Local dev environments differ
Wes Bos
And in the past, people would use Docker to do that. And I guess you still can deploy Docker, but you still hit bumps in the road.
Wes Bos
Database access isn't straightforward.
Sharing code can be difficult
Wes Bos
Oh, yeah. That's actually a concern of mine. Yeah. If you want to use something as simple as, what's the file-based database that is awesome? SQLite. If you want to use SQLite, serverless functions are spread among many servers when they scale up, so there isn't just one server that has it. That also is the case when you have multiple servers running on something like DigitalOcean as well. But same with database access: you need to pool connections, because if you fire up a thousand serverless functions at once, you're going to make a thousand connections to your database, and that's not ideal. So then you have an additional step and additional infrastructure to pool your connections, which is: you have one service that connects to the database, and then all of your serverless functions talk to the pool instead of going directly into the database.
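A rough sketch of that pattern: keep the client outside the handler so warm invocations reuse a single connection, and point it at a pooling service (PgBouncer, RDS Proxy, or similar) rather than straight at the database. The env var name here is made up for the example.

```js
// db-handler.js — illustration only
const { Client } = require('pg');

let client; // survives across warm invocations of the same function instance

exports.handler = async () => {
  if (!client) {
    // DB_POOL_URL is a hypothetical env var pointing at the connection pooler
    client = new Client({ connectionString: process.env.DB_POOL_URL });
    await client.connect();
  }

  const { rows } = await client.query('SELECT now() AS time');
  return { statusCode: 200, body: JSON.stringify(rows[0]) };
};
```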
Wes Bos
Sharing code: not always easy. So say you have six serverless functions and you have a bunch of, like, shared code between them.
Bundling doesn't always optimize
Wes Bos
Sometimes the bundlers don't do a great job at sharing code between them. And this is something I've hit many times over and over again. It's just like, can I just require something from a different folder and you figure it out? It's been a pain in my side for a while.
Wes Bos
Environment variables. This is something I hit with Netlify the other day. So I should explain: Vercel, Netlify, Begin, all of these things are not running their own serverless functions. They sit on top of what is called AWS Lambda, and they make it easier to do this type of thing, and they provide a whole bunch of, like, tooling and infrastructure on top of it.
Environment variable size limits
Wes Bos
So the limit for environment variables on an AWS Lambda is 4 KB total, which is roughly 4,000 characters.
Wes Bos
You might think, oh, that's quite a bit. But sometimes you have these very long generated strings that need to be set as environment variables, and you run over it. In my case, I was on Netlify, and Netlify sends all of your environment variables, including things like production server, dev server. I had a bunch of URLs in Netlify, and I hit that limit. So there I was, renaming my variables to the shortest names possible.
Wes Bos
Oh, my gosh. Yeah. Yeah. And because they have to send it all to the serverless function, my site wouldn't deploy.
Wes Bos
The next morning, they announced that you can scope things to just Netlify or just serverless functions. So that's no longer a concern, but it's something to think about: keep your environment variable names short.
Scott Tolinski
Yeah. Yeah. That's not anything that I would have ever thought about. No, me neither. Until it hit, and I was frustrated.
Wes Bos
Timeouts.
Execution timeouts are short
Wes Bos
Cloudflare has a 10-second, or sorry, 10-millisecond timeout. You must reply. That's the reason Cloudflare is a pared-down environment.
Wes Bos
You don't have 3 seconds to go to an API, fetch something and come back on Cloudflare.
Wes Bos
You have 10 milliseconds to do what you want, because generally, with Cloudflare, you sort of do the work on the way to your website, not as an endpoint. That's the difference between an edge function and a serverless function.
Wes Bos
Most serverless functions tap out at 10 seconds.
Wes Bos
So if you want to do something for a longer amount of time, scheduled functions, which are functions that don't run when somebody hits a URL in the browser but run every 30 minutes or at 2 o'clock every morning, those have a 30-second timeout. So if you need to fetch a bunch of data and you need to wait 10 seconds between the two, then you have to split it over multiple functions, and then you're dealing with databases, because there's no shared memory between them, and that could be a bit of a pain. Yeah. See, like, these are things that you take for granted in the regular old server world that are not easy in the serverless world.
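One common way to live within those timeouts (a sketch, not something from the episode): process the work in chunks, check how much time is left, and hand the cursor to the next invocation, since there's no shared memory between runs. processItem here is a hypothetical helper.

```js
// chunked-worker.js — bails out before the hard timeout and reports progress
exports.handler = async (event, context) => {
  const items = event.items || [];
  let index = event.startAt || 0;

  while (index < items.length) {
    // Leave a safety margin so we can return state before the runtime kills us.
    if (context.getRemainingTimeInMillis() < 2000) break;
    await processItem(items[index]); // hypothetical per-item work
    index += 1;
  }

  // The caller (a queue, scheduler, or follow-up invocation) resumes from here.
  return { done: index >= items.length, startAt: index };
};
```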
Additional services add up in cost
Wes Bos
SaaS is expensive. I had a tweet the other day where everyone's like, serverless is super cheap.
Wes Bos
But here I am signing up for my ninth $9-a-month service to run my website, and you realize, okay, this can get really expensive. And that's a bit of a joke, because AWS straight up is very, very cheap. Their free plan is probably more than I would need for most of my projects.
Need to know AWS well
Wes Bos
But once you start realizing, I need a build pipeline and a test runner and a GUI and all that type of stuff, then you realize, okay, maybe... AWS has a bunch of tooling around it. And quite honestly, I need to familiarize myself with it, because it seems like that's probably the way to go with a lot of this stuff. Yes. Just get good at AWS.
Wes Bos
Yeah. That's why people get good at it, so you're not paying somebody else who knows how it works to sit on top of it. Like, what's the minimum? I don't know if we should talk about pricing now. I'll leave that; we're getting kind of long here. Just look at, like, a lot of these companies that do serverless for you.
Wes Bos
Look at how much it costs just to own an account with them, not to run anything or do any bandwidth or something like that. Often you just have to pay per seat. Yeah. Five developers on your team? You're paying $9 a month for every developer. Cool. That adds up quickly. For a lot of companies, maybe not, but for some people it does, whereas you're used to spending $5 a month for a DigitalOcean droplet and you're good to go. Yeah. Right. I asked on Twitter as well what people thought. Brian LeRoux from begin.com, he said infrastructure as code is crucial. That's a really good one: you can't rely on somebody knowing which buttons to click in the AWS console, because if you have to set it up again, you're not gonna remember that. So your infrastructure has to be a configuration file. It has to be a JSON or YAML file that you can easily redeploy in the future.
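As one illustration of that infrastructure-as-code idea (the episode names the principle, not this specific tool), the function's runtime, memory, and timeout can live in a redeployable file, for example with the AWS CDK:

```js
// stack.js — AWS CDK sketch: the function's config lives in code, not in console clicks
const cdk = require('aws-cdk-lib');
const lambda = require('aws-cdk-lib/aws-lambda');

class ThumbnailStack extends cdk.Stack {
  constructor(scope, id, props) {
    super(scope, id, props);

    new lambda.Function(this, 'GenerateThumbnail', {
      runtime: lambda.Runtime.NODEJS_18_X,
      handler: 'handler.handler',
      code: lambda.Code.fromAsset('dist'), // the bundled output from the build step
      memorySize: 1024,
      timeout: cdk.Duration.seconds(10),
    });
  }
}

const app = new cdk.App();
new ThumbnailStack(app, 'ThumbnailStack');
```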
Account costs even before usage
Wes Bos
A lot of people said cold starts were an issue, which is essentially: when you don't run a serverless function for a while, it will go to sleep. It's not running on any server anywhere, and that's why serverless functions are so nice: you're sharing Amazon's servers with the rest of the world. And if your generate-PDF function that you run once a month for 2 minutes is not going to run for 29 more days after that, then that thing goes to sleep and it's not using any resources. That's the benefit of it.
Cold start issues can happen
Wes Bos
But the downside is that if you need to have that spun up and replying very quickly, there could be a cold start issue there. That's not something I'm super familiar with, but it seems like it's becoming less and less of an issue every single time that I talk about it, so I'm not sure about that. Yeah.
Scott Tolinski
Seems like those functions are going to sleep really easily. Any chance, we could have those functions talk to my kids and say, hey.
Wes Bos
Love it. And then the last when we have your search offerings are not ideal because with search, You need to, like, be constantly indexing your database for things.
Wes Bos
And a serverless function can only run for 30 seconds at a time.
Search offerings not ideal
Wes Bos
You need, like, a server that's constantly doing that type of stuff. So everybody always says... what's the big one out there? What's the search thing? Why am I forgetting it? Yeah.
Wes Bos
Algolia.
Wes Bos
That's the one. Algolia.
Wes Bos
Yeah. Yeah. So the solution, again, to a lot of this is use a service. And I love Algolia, I think it's amazing, but it is very expensive once you get going on large datasets.
Algolia is expensive at scale
Wes Bos
So you've got to be careful there as well.
Wes Bos
That's my thoughts on serverless limitations.
Wes Bos
Just things you need to know about when you are approaching a new project with serverless, and, hopefully, that's some helpful stuff in there. Yeah. I learned a lot. Holy cow. Well, thanks so much, Wes. Alright. No problem. Catch you later. Peace. Peace.
Scott Tolinski
Head on over to syntax.fm for a full archive of all of our shows.
Scott Tolinski
And don't forget to subscribe in your podcast player or drop a review if you like this show.