Getting More from AWS Without Breaking the Bank

Published about 1 month ago • 3 min read

Hi friends,

AWS is sometimes looked at as a money black hole where you’ll burn your cash the second you sign up. In a way, that’s not wrong.

Interestingly, there's also a group of people who dive right in and only realize the expenses when they see their monthly bill.

It doesn’t have to be this way

AWS can be used at no cost, or, if not entirely free, then at minimal expense.

Stating the obvious

When you join, AWS enrolls you in their free tier, allowing you to use certain databases, compute instances, and other services at no cost.

However, this free tier only lasts for 12 months; after that, charges begin.

While a free tier that ends might seem like a lure, it's a common practice.
Yet, with AWS, you might find yourself heavily reliant on these services, making it hard to simply stop using them.

Today, I'll focus on AWS's lesser-known "always free" offerings, which could enable truly cost-free production deployments for various projects.

Push the serverless boundaries

If you start with a serverless approach, you'll likely run mostly for free, and I stand by this. However, I must mention the risks involved.

My strong opinion is that serverless doesn't scale in costs and hidden fees the way many people expect it to. More on that here.

“You get 1 million free requests per month and 400,000 GB-seconds of compute time per month indefinitely.”

1 million free requests per month can support many side projects, and even many initial staging or production environments. Take advantage of this offer.
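To get a feel for how far that free tier stretches, here is a quick back-of-the-envelope calculation; the 128 MB memory size and 100 ms average duration are illustrative assumptions, not a measurement of any real function:

```python
# Back-of-the-envelope math for the Lambda "always free" tier:
# 1,000,000 requests and 400,000 GB-seconds of compute per month.

FREE_REQUESTS = 1_000_000
FREE_GB_SECONDS = 400_000

def gb_seconds_used(invocations: int, memory_mb: int, avg_duration_ms: int) -> float:
    """Compute-time consumption for a month of invocations."""
    return invocations * (memory_mb / 1024) * (avg_duration_ms / 1000)

# An assumed 128 MB function averaging 100 ms per call:
used = gb_seconds_used(FREE_REQUESTS, 128, 100)
print(used)                     # 12500.0 GB-seconds
print(used <= FREE_GB_SECONDS)  # True: the request cap, not compute, is the limit
```

In other words, at a small memory size the 1 million request cap runs out long before the compute allowance does.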

There are numerous excellent frameworks available to help you do so; one that I particularly recommend is SST.

It’s a framework for building serverless applications that takes infrastructure, and the decisions around it, very seriously.

S3 is cheap, but it’s rarely a must

The Free Tier for Amazon S3 is only for the first 12 months.

Everybody loves S3: it’s easy to use, it’s robust, and it has been around forever (literally the first service ever offered by AWS!).

However, while S3 isn’t especially expensive, it can get out of hand, and it’s not free past the first 12 months. AWS offers multiple storage classes, and you should probably use the cheaper ones unless you really need the standard class’s eleven 9’s of durability (99.999999999%).

More often than not, small projects don’t actually need S3.
They use it because it’s there, and it’s easy.

The ultimate discount machine: Spot Instances

Spot Instances are a way to use unused computing power on AWS. They come at a lower price than usual, but they can be taken back by Amazon at any time if someone else needs them or is willing to pay more.

It’s like renting a video game for a cheaper price because the store has too many copies, but you have to return it suddenly if more people want to rent it.

You can get up to 90% off the on-demand price if you’re willing to take the relatively small risk. And, if you know what you’re doing, things like auto scaling groups will revive the workload anyway.
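To put the discount in perspective, here is a tiny calculation; the $0.10/hour on-demand price is a made-up example, not a quote for any real instance type:

```python
# Illustrative spot-vs-on-demand math with an assumed hourly price.
HOURS_PER_MONTH = 730

def monthly_cost(hourly_price: float, hours: int = HOURS_PER_MONTH) -> float:
    return round(hourly_price * hours, 2)

on_demand_hourly = 0.10                       # assumed example price
spot_hourly = on_demand_hourly * (1 - 0.90)   # the up-to-90% discount

print(monthly_cost(on_demand_hourly))  # 73.0
print(monthly_cost(spot_hourly))       # 7.3
```

At that rate, a workload that would cost ~$73/month on-demand drops to ~$7/month on spot.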

But wait, it gets better

ECS (Elastic Container Service) is a service that allows you to run and manage groups of tasks or services using containers.

Combined with Fargate, it lets you scale up and down without having to manage the underlying servers yourself. Essentially, it’s like a serverless container solution.

You can mix the two (Fargate and Spot) and configure ECS to run your tasks on Fargate Spot capacity instead of regular Fargate.

And this creates a perfect combo: enjoying containers, while running them on a serverless platform, all while using spot instances for discounts!
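As a rough sketch, the mix above comes down to an ECS capacity provider strategy, shown here as plain data (this is the `capacityProviderStrategy` value you would hand to ECS, e.g. via boto3's `create_service`; the base/weight split is an example assumption, not a recommendation):

```python
# Sketch of an ECS capacity provider strategy mixing Fargate and
# Fargate Spot.
capacity_provider_strategy = [
    # Keep one task on regular Fargate so a spot interruption never
    # takes the whole service down.
    {"capacityProvider": "FARGATE", "base": 1, "weight": 1},
    # Put the rest of the scaling on Fargate Spot for the discount.
    {"capacityProvider": "FARGATE_SPOT", "base": 0, "weight": 4},
]

print([p["capacityProvider"] for p in capacity_provider_strategy])
```

With a weight ratio of 1:4, roughly four out of every five tasks beyond the base land on the discounted Spot capacity.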

Databases don’t have to break the bank either

If I were deploying a stateful application today I’d probably do one of two things:

1. I’d use an on-disk SQLite DB and make sure it sits on a detachable EBS volume that’s snapshotted and can be recovered

2. I’d use a service with a free tier, like PlanetScale or Neon, which I’ve used in the past for free and loved.
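The first option is simpler than it sounds. Here is a minimal sketch using Python’s built-in sqlite3; the path is a stand-in, and in practice you’d point it at the EBS mount (e.g. /mnt/data/app.db) so snapshots capture the database file:

```python
import os
import sqlite3
import tempfile

# Stand-in path: on a real instance this would live on the EBS
# mount so volume snapshots back up the database.
db_path = os.path.join(tempfile.gettempdir(), "app.db")

conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("ada",))
conn.commit()

name = conn.execute("SELECT name FROM users ORDER BY id LIMIT 1").fetchone()[0]
print(name)  # ada
conn.close()
```

Because the whole database is one file on the volume, recovery is just attaching a snapshot-restored volume to a fresh instance.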

AWS offers a free tier for RDS, but as with other services, you start paying after 12 months.

DynamoDB offers a generous, indefinite free tier: 25 GB of storage, and 25 units each of read and write capacity.

However, remember that Dynamo uses a unique API, making data migration harder than moving Postgres data. For a side project, this might be less concerning, but it's important to consider all factors.

Don’t forget the marketplace

AWS provides a marketplace for no-longer-needed compute resources. For instance, if you're locked into a three-year server contract but only need it for 12 months, you can offer the remaining time to other AWS users via the marketplace.

Over the years, companies started to take advantage of this feature, and AWS recently responded by limiting how much accounts can use it.

But again, for solo users, this can be a big deal and a money saver.

These are the basics, and I'm working on an entirely new section called “Advanced Mode” which I’ll share in the future.

As always, feel free to respond if you have feedback or questions.

Have a great weekend!



