In this guide we will walk through serving data with SQLAlchemy from a Postgres database managed by BuildFlow. All the code for this guide can be found on GitHub.

To do this we will use:

  1. BuildFlow to provision the Cloud SQL resources and serve our endpoints
  2. SQLAlchemy to read and write data in the database

To see a full working project example with a UI included, see our SaaS application example.

We currently only support managing Postgres databases hosted on GCP Cloud SQL, but more options are coming soon!

1. Initialize Flow

First we create a new BuildFlow Flow. This will be the main entry point to our application.

app = Flow()

2. Attach a Service

We then attach a service to our flow object. We will later attach endpoints to store and retrieve data from our database.

service = app.service(service_id="serve-from-postgres")

3. Define Database Primitives

Next we define the primitives that manage our database. For GCP we use a Cloud SQL instance and a Cloud SQL database.

cloud_sql_instance = CloudSQLInstance(
    instance_name="launchflow-serve-from-postgres",
    project_id=os.environ["GCP_PROJECT_ID"],
    database_version=CloudSQLDatabaseVersion.POSTGRES_15,
    region="us-central1",
    settings=CloudSQLInstanceSettings(tier="db-custom-1-3840"),
)

cloud_sql_database = CloudSQLDatabase(
    instance=cloud_sql_instance,
    database_name="launchflow-serve-from-postgres-db",
)

app.manage(cloud_sql_instance, cloud_sql_database)

4. Initialize a SQLAlchemy Session Dependency

Now we create a SessionDep to manage SQLAlchemy sessions for our requests. This dependency opens a connection to our database when the server starts up and gives each endpoint a unique Session whenever it is called.

You will need to provide the user and password for authenticating with the database. You can add a user and password to your database following this guide.

DB = SessionDep(
    db_primitive=cloud_sql_database,
    db_user=os.environ["DB_USER"],
    db_password=os.environ["DB_PASSWORD"],
)

5. Initialize Database Model

Now we initialize the database models contained in models.py. This will create all the tables in our database.

We put our models in a different file to avoid them being serialized with our processors.

We wrap this in a try/except because the database will not yet exist when we run buildflow apply.

try:
    models.Base.metadata.create_all(
        bind=engine(
            cloud_sql_database, os.environ["DB_USER"], os.environ["DB_PASSWORD"]
        )
    )
except aiohttp.ClientResponseError as e:
    if e.status != 404:
        raise e
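
The models themselves are not shown in this step; below is a minimal sketch of what models.py could look like, assuming a single Strings table with an id primary key and a string column (the column definitions are inferred from how models.Strings is used in the endpoints below, so treat them as an assumption):

# models.py -- minimal sketch; column definitions are an assumption
# based on how models.Strings is used in the endpoints below.
from sqlalchemy import Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class Strings(Base):
    __tablename__ = "strings"

    # Auto-incrementing primary key, returned as `id` by /retrieve.
    id = Column(Integer, primary_key=True, autoincrement=True)
    # The raw string value stored by /store.
    string = Column(String, nullable=False)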

6. Add Endpoints

Lastly we add two endpoints:

  1. /store - This endpoint takes a string and stores it in our database.
  2. /retrieve - This endpoint retrieves all the strings in our database.

Both of these endpoints take in the DB dependency we defined above, letting them use a shared connection while still getting a unique session.

@service.endpoint("/store", method="POST")
def store_string(string: str, db: DB):
    with db.session as session:
        session.add(models.Strings(string=string))
        session.commit()
    return {"status": "success"}


@service.endpoint("/retrieve", method="GET")
def retrieve_string(db: DB) -> api_schemas.RetrieveResponse:
    with db.session as session:
        strings = session.query(models.Strings).all()
        string_responses = []
        for string in strings:
            string_responses.append(
                api_schemas.StringResponse(id=string.id, string=string.string)
            )
        return api_schemas.RetrieveResponse(strings=string_responses)
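
The response schemas live in api_schemas.py. A minimal sketch, assuming plain Pydantic models (the field names match how the schemas are used above; using Pydantic here is an assumption):

# api_schemas.py -- minimal sketch; using Pydantic is an assumption,
# but the field names match the usage in /retrieve above.
from typing import List

from pydantic import BaseModel


class StringResponse(BaseModel):
    # Mirrors one row of the Strings table.
    id: int
    string: str


class RetrieveResponse(BaseModel):
    # All stored strings returned by /retrieve.
    strings: List[StringResponse]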

7. Run the Code

Before running, make sure you either update the code or set the following environment variables:

You will need to provide the user and password for authenticating with the database. You can add a user and password to your database following this guide.

export GCP_PROJECT_ID=<your-project-id>
export DB_USER=<your-db-user>
export DB_PASSWORD=<your-db-password>

First create all the resources with the VSCode extension or the CLI:

buildflow apply

After you confirm, it will take several minutes to actually provision the database.

Once that completes you can run the code with the VSCode extension or the CLI:

buildflow run

Now you can visit http://localhost:8000/docs to see the Swagger UI and test out the endpoints.
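
You can also exercise the endpoints from the command line, for example with curl. This is only a sketch: whether the string argument is bound as a query parameter or a JSON body depends on how BuildFlow binds endpoint arguments, so adjust the request shape if needed.

# Store a string (shown here as a query parameter; switch to a JSON body
# if your endpoint expects one).
curl -X POST "http://localhost:8000/store?string=hello"

# Retrieve everything that has been stored.
curl "http://localhost:8000/retrieve"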

8. Clean up Resources

Once you're done you can use the VSCode extension or the CLI to clean up all the resources:

buildflow destroy