I have a DynamoDB table that is automatically exported to S3 as JSON so I can
compute some reports, and I wanted to automate the table creation and load
steps. Amazon Athena lets you query raw files stored on S3, which makes it a
good fit when reports are only needed a small fraction of the time and running
a full database would be too expensive or simply unnecessary. Athena is billed
by the amount of data scanned, which keeps it relatively cheap for my use case.
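For a sense of what that automation might look like, here's a minimal sketch that uses boto3 to run the CREATE EXTERNAL TABLE DDL against Athena. The bucket names, database, and column layout are placeholders for illustration, not my actual export schema.

```python
# A minimal sketch of automating the Athena table-creation step with boto3.
# The bucket, prefix, database, and columns below are assumptions; a real
# DynamoDB JSON export wraps each record in a typed "Item" structure.
import boto3

athena = boto3.client("athena")

DDL = """
CREATE EXTERNAL TABLE IF NOT EXISTS reports_export (
  Item struct<pk:struct<S:string>,
              sk:struct<S:string>,
              total:struct<N:string>>
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://my-export-bucket/AWSDynamoDB/data/'
"""

def create_table():
    # Athena runs DDL asynchronously; results land in the output location.
    return athena.start_query_execution(
        QueryString=DDL,
        QueryExecutionContext={"Database": "reports"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )

if __name__ == "__main__":
    print(create_table()["QueryExecutionId"])
```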
I went to AWS re:Invent this year, and I wanted my registered events to show
up on my personal calendar. I'm honestly surprised they didn't implement an
iCal export for registered events, so these are the steps I took to get them.
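As a rough illustration of the final step, here's a hypothetical sketch that turns a list of registered sessions into an .ics file a calendar app can import. The session fields, dates, and file name are made up for the example, not my actual export.

```python
# Hypothetical sketch: write registered sessions (however they were exported)
# out as an iCalendar file. The data below is illustrative only.
from datetime import datetime

sessions = [
    {
        "title": "Example keynote",
        "start": datetime(2022, 11, 29, 8, 30),
        "end": datetime(2022, 11, 29, 10, 30),
        "location": "Venetian",
    },
]

def to_ics(events):
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0", "PRODID:-//reinvent-export//EN"]
    for e in events:
        lines += [
            "BEGIN:VEVENT",
            f"SUMMARY:{e['title']}",
            f"DTSTART:{e['start'].strftime('%Y%m%dT%H%M%S')}",
            f"DTEND:{e['end'].strftime('%Y%m%dT%H%M%S')}",
            f"LOCATION:{e['location']}",
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    return "\r\n".join(lines)

with open("reinvent.ics", "w") as f:
    f.write(to_ics(sessions))
```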
I've always wanted to have search on my site, but it's statically hosted, so I
can't run any dynamic content on it. I recently resolved that by creating a
tiny Lambda function on AWS that queries a SQLite database hosted on S3. Here
I'll walk you through how I created the backend for it, and in later posts
I'll go into the details of calculating the cost.
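As a rough outline of the idea (not the exact code from the walkthrough), a handler along these lines pulls the SQLite database from S3 into /tmp and runs a full-text query against it. The bucket, key, and table names are assumptions, and the `pages` table is assumed to be an FTS5 virtual table.

```python
# Minimal sketch of a search Lambda: fetch the SQLite file from S3 on cold
# start, cache it in /tmp, and run a full-text query per request.
import json
import os
import sqlite3

import boto3

BUCKET = os.environ.get("SEARCH_BUCKET", "my-site-search")
KEY = os.environ.get("SEARCH_KEY", "search.db")
LOCAL_PATH = "/tmp/search.db"

s3 = boto3.client("s3")

def handler(event, context):
    # Download only on cold start; /tmp survives across warm invocations.
    if not os.path.exists(LOCAL_PATH):
        s3.download_file(BUCKET, KEY, LOCAL_PATH)

    query = (event.get("queryStringParameters") or {}).get("q", "")
    conn = sqlite3.connect(LOCAL_PATH)
    rows = conn.execute(
        "SELECT title, url FROM pages WHERE pages MATCH ? LIMIT 10", (query,)
    ).fetchall()
    conn.close()

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps([{"title": t, "url": u} for t, u in rows]),
    }
```

Keeping the database read-only on S3 is what makes this cheap: the Lambda only ever downloads and queries it, so there's nothing stateful to manage.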
I've been studying for my Developer Associate certification on Amazon Web
Services, and lately I've been practicing with the various messaging and
streaming services like SQS and Kinesis. I wanted to branch out into using
DynamoDB and capturing changes from it, and to do that I needed to generate a
lot of fake data.
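Here's a minimal sketch of that kind of data generation, assuming a hypothetical `test-orders` table and using the Faker library for plausible values, so the table's stream has plenty of changes to capture.

```python
# Sketch: write a batch of fake items into a DynamoDB table so its stream has
# changes to capture. Table name and attributes are assumptions for the example.
import uuid

import boto3
from faker import Faker  # pip install faker

fake = Faker()
table = boto3.resource("dynamodb").Table("test-orders")

def load_fake_items(count=1000):
    # batch_writer handles batching and retrying unprocessed items for us.
    with table.batch_writer() as batch:
        for _ in range(count):
            batch.put_item(Item={
                "pk": str(uuid.uuid4()),
                "name": fake.name(),
                "email": fake.email(),
                "created_at": fake.iso8601(),
            })

if __name__ == "__main__":
    load_fake_items()
```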