Serverless Relational Databases

Relational Databases have been a cornerstone of software development for the past few decades. With AWS Aurora Databases, you can integrate a Relational DB with your Serverless Application. You can create Postgres or MySQL compatible databases that scale with your application.

Key Differences
Aurora Databases are designed to integrate with Serverless environments by executing SQL over an HTTP API, making them more accessible from a Serverless Function. You can use third party libraries to access the DB from a Serverless Function using both the standard Python DB API and SqlAlchemy.
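This HTTP-based access is what AWS exposes as the RDS Data API, and it is the mechanism the third party libraries used later in this guide build on. As a rough sketch of what a raw call looks like with boto3 (the ARNs and database name below are placeholders, not values from this guide):

import boto3

# Placeholder values: in practice these come from your deployed Cdev resources.
CLUSTER_ARN = "arn:aws:rds:us-east-1:123456789012:cluster:demo-db"
SECRET_ARN = "arn:aws:secretsmanager:us-east-1:123456789012:secret:demo-db-secret"

# The Data API client sends SQL over HTTPS; no persistent database connection is needed.
rds_data = boto3.client("rds-data")

response = rds_data.execute_statement(
    resourceArn=CLUSTER_ARN,
    secretArn=SECRET_ARN,
    database="default_table",
    sql="SELECT 1",
)

print(response["records"])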

Creating a Relational DB

    
from cdev.aws.relational_db import RelationalDB, db_engine

myDB = RelationalDB(
    cdev_name="demo_db",
    engine=db_engine.aurora_postgresql,
    username="username",
    password="password",
    database_name="default_table"
)
Database Variables Tip
If starting from the quick-start template, place the code snippet above in your src/hello_world/resources.py file.



Accessing the DB from the CLI

Once you have created a DB, the fastest way to access it is through an interactive session in your CLI. You can open an interactive session using the following Cdev Command. This shell is designed to be a simple emulation of psql and mysql.

cdev run relationaldb.shell <component_name>.<resource_name>

Create a Table

MySQL

CREATE TABLE users(id INT NOT NULL AUTO_INCREMENT,name VARCHAR(100), PRIMARY KEY ( id ));

Postgres

CREATE TABLE users( id serial PRIMARY KEY, name VARCHAR ( 50 ) );

Add Row

INSERT INTO users(id, name) VALUES (18,'Paul Atreides');

Query the Table

select * from users;

Transactions

Start Transaction

BEGIN

Commit Transaction

COMMIT

Rollback Transaction

ROLLBACK

Quit Interactive Shell

quit


Connecting to a Relational DB with Python

Although there are some underlying architectural decisions that make these DBs optimized for use in a Serverless Computing environment, we can access them in our code using the standard Python Database Api and SqlAlchemy. To simplify the use of these standard interfaces, we will be using two third party packages: aurora-data-api and sqlalchemy-aurora-data-api. These packages handle the hard work of exposing our databases through those interfaces in a way that is optimized for Serverless Development.

pip install aurora-data-api sqlalchemy-aurora-data-api

Python Database Api

Using aurora-data-api to access your DB via the Python Database Api is the fastest way to get started.

    
import os

import aurora_data_api

from cdev.aws.relational_db import RelationalDB, db_engine
from cdev.aws.lambda_function import ServerlessFunction

myDB = RelationalDB(
    cdev_name="demo_db",
    engine=db_engine.aurora_postgresql,
    username="username",
    password="password",
    database_name="default_table"
)

@ServerlessFunction(
    "db_handler",
    environment={
        "CLUSTER_ARN": myDB.output.cluster_arn,
        "SECRET_ARN": myDB.output.secret_arn,
        "DB_NAME": myDB.database_name
    },
    permissions=[
        myDB.available_permissions.DATABASE_ACCESS,
        myDB.available_permissions.SECRET_ACCESS
    ]
)
def connect_to_db(event, context):
    cluster_arn = os.environ.get("CLUSTER_ARN")
    secret_arn = os.environ.get("SECRET_ARN")
    database_name = os.environ.get("DB_NAME")

    with aurora_data_api.connect(aurora_cluster_arn=cluster_arn, secret_arn=secret_arn, database=database_name) as conn:
        with conn.cursor() as cursor:
            # postgres: list the tables in the catalog
            cursor.execute("select * from pg_catalog.pg_tables")

            # mysql: use the following statement instead
            # cursor.execute("SHOW DATABASES;")

            print(cursor.fetchall())

Execute the function

cdev run function.execute hello_world_comp.db_handler

Check the logs

cdev run function.logs hello_world_comp.db_handler
How to use the Python Database API
For more information on how to use the Python Database Api, check out this tutorial.
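The connection returned by aurora_data_api.connect follows the standard Python DB API, so commit() and rollback() should be available, mirroring the BEGIN, COMMIT, and ROLLBACK commands from the interactive shell. A minimal sketch, assuming the same environment variables and permissions as the db_handler function above (the inserted row is just example data):

import os

import aurora_data_api

# In practice this code runs inside a ServerlessFunction configured with the same
# environment variables and permissions as the db_handler function above.
conn = aurora_data_api.connect(
    aurora_cluster_arn=os.environ["CLUSTER_ARN"],
    secret_arn=os.environ["SECRET_ARN"],
    database=os.environ["DB_NAME"],
)

try:
    with conn.cursor() as cursor:
        cursor.execute("INSERT INTO users(id, name) VALUES (2, 'Chani')")
    conn.commit()    # like COMMIT in the interactive shell
except Exception:
    conn.rollback()  # like ROLLBACK in the interactive shell
    raise
finally:
    conn.close()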

SqlAlchemy Api

The SqlAlchemy Api allows you to execute SQL directly or use an Object-Relational Mapper (ORM) to access your DB.

Direct Access

    
import os

import sqlalchemy_aurora_data_api
from sqlalchemy import create_engine, text

from cdev.aws.relational_db import RelationalDB, db_engine
from cdev.aws.lambda_function import ServerlessFunction

myDB = RelationalDB(
    cdev_name="demo_db",
    engine=db_engine.aurora_postgresql,
    username="username",
    password="password",
    database_name="default_table"
)

cluster_arn = os.environ.get("CLUSTER_ARN")
secret_arn = os.environ.get("SECRET_ARN")
database_name = os.environ.get("DB_NAME")

sqlalchemy_aurora_data_api.register_dialects()

engine = create_engine(
    f'postgresql+auroradataapi://:@/{database_name}',
    echo=True,
    connect_args=dict(aurora_cluster_arn=cluster_arn, secret_arn=secret_arn)
)

@ServerlessFunction(
    "db_handler",
    environment={
        "CLUSTER_ARN": myDB.output.cluster_arn,
        "SECRET_ARN": myDB.output.secret_arn,
        "DB_NAME": myDB.database_name
    },
    permissions=[
        myDB.available_permissions.DATABASE_ACCESS,
        myDB.available_permissions.SECRET_ACCESS
    ]
)
def connect_to_db(event, context):
    # postgres: list the tables in the catalog
    sql_stmt = "select * from pg_catalog.pg_tables"

    # mysql: use the following statement instead
    # sql_stmt = "SHOW DATABASES;"

    with engine.connect() as conn:
        for result in conn.execute(text(sql_stmt)):
            print(result)

Execute the function

cdev run function.execute hello_world_comp.db_handler

Check the logs

cdev run function.logs hello_world_comp.db_handler

SqlAlchemy ORM

One of the main benefits of SqlAlchemy is the option to use its powerful ORM. To get the full use of the ORM, we can pair it with the alembic library to automatically generate the migration files when we update our models.

Starting from the quick-start template, add a relational_db to your resources.

cdev init orm-demo --template quick-start

Update your src/hello_world/resources.py file to:

    
# Generated as part of Quick Start project template
from cdev.aws.api import Api
from cdev.aws.lambda_function import ServerlessFunction
from cdev.aws.relational_db import RelationalDB, db_engine

from cdev import Project as cdev_project

myProject = cdev_project.instance()

DemoApi = Api("demoapi")
hello_route = DemoApi.route("/hello_world", "GET")

@ServerlessFunction("hello_world_function", events=[hello_route.event()])
def hello_world(event, context):
    print('Hello from inside your Function!')

    return {
        "status_code": 200,
        "message": "Hello Outside World!"
    }

myDB = RelationalDB(
    cdev_name="demo_db",
    engine=db_engine.aurora_postgresql,
    username="username",
    password="password",
    database_name="default_table"
)

myProject.display_output("Base API URL", DemoApi.output.endpoint)
myProject.display_output("Routes", DemoApi.output.endpoints)

Now we can deploy our resources

cdev deploy


Next, we are going to create our models. Install sqlalchemy_aurora_data_api.

pip install sqlalchemy_aurora_data_api

Then, create a src/hello_world/models.py file and add the following code:

    
from sqlalchemy import Column
from sqlalchemy import Integer
from sqlalchemy import String
from sqlalchemy.orm import declarative_base

import sqlalchemy_aurora_data_api

sqlalchemy_aurora_data_api.register_dialects()

Base = declarative_base()

class User(Base):
    __tablename__ = "users"

    id = Column(Integer, primary_key=True)
    name = Column(String(30))

    def __repr__(self):
        return f"User(id={self.id!r}, name={self.name!r})"

Now we are going to install and configure alembic.

pip install alembic
Alembic Library
Alembic is one of the best-in-class tools for working with SqlAlchemy. You can learn more about the tool from its official documentation.

Initialize the needed files for alembic using the following command. We will need to edit some of the generated files to connect to our DB.

alembic init src/alembic

In the src/alembic/env.py file, import the base declarative model from the models.py file, and set that as the target_metadata variable on line 21.

from src.hello_world.models import Base
target_metadata = Base.metadata

Then, change line 62 to use connect_args and our database engine.

connectable = create_engine(
    postgres_database_engine,
    echo=True,
    connect_args=dict(aurora_cluster_arn=cluster_arn, secret_arn=secret_arn)
)

Above the run_migrations_online function declaration on line 55, add the following lines and uncomment the database engine that matches your DB engine type.

import os
from sqlalchemy import create_engine

cluster_arn = os.environ.get("CLUSTER_ARN")
secret_arn = os.environ.get("SECRET_ARN")
database_name = os.environ.get("DB_NAME")

#postgres_database_engine = f'postgresql+auroradataapi://:@/{database_name}'
#mysql_database_engine = f'mysql+auroradataapi://:@/{database_name}'

Now, to make sure our values are available to the env.py script, we need to set our environment variables.

export SECRET_ARN=$(cdev output --value hello_world_comp.relationaldb.demo_db.secret_arn)
export CLUSTER_ARN=$(cdev output --value hello_world_comp.relationaldb.demo_db.cluster_arn)
export DB_NAME=<db_name>

You should now be able to create automated migrations.

Auto Generation limits
You should familiarize yourself with the limits of alembic auto generation and ALWAYS confirm the changes before applying them.
alembic revision --autogenerate -m "Added users table"
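The generated file will appear under src/alembic/versions/. For the User model above it should contain upgrade and downgrade functions roughly like the following sketch (the revision identifiers and exact column rendering will differ in your run):

# Sketch of an auto-generated revision for the users table; identifiers are placeholders.
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = "xxxxxxxxxxxx"  # placeholder
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    op.create_table(
        "users",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("name", sa.String(length=30), nullable=True),
        sa.PrimaryKeyConstraint("id"),
    )


def downgrade():
    op.drop_table("users")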

Apply the migration with the upgrade command. This will create our Users Table.

alembic upgrade head


Let's now connect to our DB and add a User.

cdev run relationaldb.shell hello_world_comp.demo_db
BEGIN
INSERT INTO users(id, name) VALUES (1,'Paul Atreides');
COMMIT
quit


Now we can update our src/hello_world/resources.py to have our Serverless Function use the SqlAlchemy ORM to access our Database. Replace your src/hello_world/resources.py with the following code.

    
# Generated as part of Quick Start project template
import os

from sqlalchemy import select, create_engine
from sqlalchemy.orm import Session

from cdev.aws.api import Api
from cdev.aws.lambda_function import ServerlessFunction
from cdev.aws.relational_db import RelationalDB, db_engine

from cdev import Project as cdev_project

from .models import User

myProject = cdev_project.instance()

## Routes
DemoApi = Api("demoapi")
hello_route = DemoApi.route("/hello_world", "GET")

## DB
myDB = RelationalDB(
    cdev_name="demo_db",
    engine=db_engine.aurora_postgresql,
    username="username",
    password="password",
    database_name="default_table"
)

## Functions
cluster_arn = os.environ.get("CLUSTER_ARN")
secret_arn = os.environ.get("SECRET_ARN")
database_name = os.environ.get("DB_NAME")

engine = create_engine(
    f'postgresql+auroradataapi://:@/{database_name}',
    connect_args=dict(aurora_cluster_arn=cluster_arn, secret_arn=secret_arn)
)

@ServerlessFunction(
    "hello_world_function",
    events=[hello_route.event()],
    environment={
        "CLUSTER_ARN": myDB.output.cluster_arn,
        "SECRET_ARN": myDB.output.secret_arn,
        "DB_NAME": myDB.database_name
    },
    permissions=[
        myDB.available_permissions.DATABASE_ACCESS,
        myDB.available_permissions.SECRET_ACCESS
    ]
)
def hello_world(event, context):
    print('Hello from inside your Function!')

    session = Session(engine)

    stmt = select(User).where(User.name == 'Paul Atreides')

    for user in session.scalars(stmt):
        print(user)

    return {
        "status_code": 200,
        "message": "Hello Outside World!"
    }

## Output
myProject.display_output("Base API URL", DemoApi.output.endpoint)
myProject.display_output("Routes", DemoApi.output.endpoints)

Then deploy the changes to the function

cdev deploy

Run the deployed function

cdev run function.execute hello_world_comp.hello_world_function

Check the logs from the function

cdev run function.logs hello_world_comp.hello_world_function