Quickstart

In this example we'll create a FastAPI app to read and write to a storage bucket managed by LaunchFlow, then build and deploy a Docker image using LaunchFlow's CLI.

Install LaunchFlow

First install the LaunchFlow client library and CLI using pip.

pip install "launchflow[gcp]"

Sign up for an Account

Once installed, you can log in or sign up for a free account by running:

launchflow login

If you don't already have a LaunchFlow account, this will create a free account for you.

Connect your Cloud Provider

When you sign up for an account, LaunchFlow creates a GCP service account that will be used to create projects and environments in your GCP organization. You will need to grant this service account the necessary permissions to create resources there. To do so, run:

launchflow connect --provider=gcp

This will print out instructions like:

Grant `account-2746196448@launchflow-admin-dev.iam.gserviceaccount.com` the following roles on your GCP organization:
- Folder Creator (roles/resourcemanager.folderCreator)
- Organization Viewer (roles/resourcemanager.organizationViewer)
- Billing Account User (roles/billing.user)

These roles will be used to create a unique GCP project for every environment in your account.

Hit enter once complete and we will verify your setup:

Once you've added these roles, hit enter and LaunchFlow will verify that the service account has the necessary permissions to create resources in your GCP organization. After verification succeeds, you're ready to use LaunchFlow!

Sometimes it can take a few minutes for these permissions to propagate. If you see an error message about missing permissions, wait a few minutes and try again.

If you would like to learn more about why we need these permissions, see our How it Works section on GCP.

You can connect multiple cloud providers by running launchflow connect again, or by visiting the Cloud Provider page in the LaunchFlow console.

Create your first Application

We'll start with a simple backend that writes a file to a GCS bucket. We'll create the application by running:

launchflow init

This will set up a new LaunchFlow application in the current directory. launchflow init is only meant to help you get started; if you're adding LaunchFlow to an existing project, you can skip this step.

This will prompt you with several questions to set up your application:

  • Project: The name of the project. We'll create a LaunchFlow project with this name if one doesn't already exist.
  • Cloud Provider: The cloud provider you want to use.
  • Framework: The backend framework you want to use. In the examples below we'll use FastAPI, but you can use any framework you like.
  • Resources: Select any resources you would like to create for your application. In this case we'll select a GCS bucket.

The init command generates an initial project structure for you, including:

  • An app directory for your Python application.
  • A Dockerfile that will be used to deploy your service.
  • A launchflow.yaml file that contains the configuration for your application.

Don't worry too much about the launchflow.yaml file for now; if you want to learn more about it, you can read about it in our reference section.

Once created, you can navigate to the newly created directory:

cd $PROJECT_NAME

Your initial project will only contain two Python files: an app/infra.py file that defines our resources, and an app/main.py file for the FastAPI application. The only difference between GCP, AWS, and Azure is the bucket definition in infra.py; all the other code is the same.

import launchflow as lf

# Docs: https://docs.launchflow.com/reference/gcp-resources/gcs-bucket
bucket = lf.gcp.GCSBucket("my-bucket")
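
For comparison, the AWS version of this file would only swap out the bucket class. Here's a minimal sketch, assuming the AWS resource class is lf.aws.S3Bucket as listed in the AWS reference docs:

import launchflow as lf

# AWS equivalent of the GCS bucket above (class name assumed from the AWS reference docs)
bucket = lf.aws.S3Bucket("my-bucket")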

The app/main.py file looks just like any other FastAPI main file. The only addition is the call to await bucket.connect_async() in the lifespan context manager, which ensures we can connect to the bucket when the application starts.

from contextlib import asynccontextmanager

from app.infra import bucket
from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # This ensures the bucket client is connected / ready
    await bucket.connect_async()
    yield

app = FastAPI(lifespan=lifespan)

@app.get("/")
def root():
    return {"message": "Hello World"}

Now we'll update our main.py file so that it can read and write to our bucket. To do this we'll add two new endpoints, so main.py ends up looking like the following. You can copy and paste this code into your main.py file.

from contextlib import asynccontextmanager

from app.infra import bucket
from fastapi import FastAPI

@asynccontextmanager
async def lifespan(app: FastAPI):
    # This ensures the bucket client is connected / ready
    await bucket.connect_async()
    yield

app = FastAPI(lifespan=lifespan)

@app.post("/{file_name}")
async def write_file(file_name: str, file_contents: str):
    # Once connected, you can start using the bucket client
    bucket.upload_from_string(file_contents, file_name)
    return "OK"


@app.get("/{file_name}")
async def read_file(file_name: str) -> str:
    return bucket.download_file(file_name).decode("utf-8")

NOTE:

Resources also include access to the underlying boto3 and Google Cloud Python clients if you ever need to use advanced features. Visit the Reference Docs to see example usage.
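
For example, if you need something the helpers above don't cover, such as setting a lifecycle rule on the bucket, you can fall back to the google-cloud-storage library directly. The sketch below constructs a standalone client rather than using the accessor LaunchFlow exposes (see the Reference Docs for that), and assumes the bucket name from infra.py:

from google.cloud import storage

# Standalone google-cloud-storage client; LaunchFlow also exposes the
# underlying client on the resource itself (see the Reference Docs).
client = storage.Client()

# "my-bucket" is the name used in infra.py -- substitute your unique name.
gcs_bucket = client.get_bucket("my-bucket")

# Example advanced feature: delete objects older than 30 days.
gcs_bucket.add_lifecycle_delete_rule(age=30)
gcs_bucket.patch()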

Create your Resources

Before creating your resources, be sure to update the bucket in infra.py to have a globally unique name, since GCS bucket names must be unique across all of GCS.
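
For example, your infra.py might end up looking like this (the bucket name below is just a placeholder; pick your own unique name):

import launchflow as lf

# Docs: https://docs.launchflow.com/reference/gcp-resources/gcs-bucket
# Replace with a name that is globally unique across GCS.
bucket = lf.gcp.GCSBucket("my-launchflow-quickstart-bucket-123")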

Now that the code is written, we'll need to create the bucket so we can run and test the application. To do this, run the command below in the root of your project:

launchflow create

This will find any resources in your application that need to be created and, after confirming with you, create them. In this case it will create a GCS bucket with the name you provided.

Run your Application

Install your dependencies:

pip install -r requirements.txt

Now you can run your application just like you normally would. For FastAPI, we'll use uvicorn to run our application:

uvicorn app.main:app
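
While iterating locally, you may also want uvicorn's standard --reload flag, which restarts the server whenever your code changes (this is plain uvicorn behavior, nothing LaunchFlow-specific):

uvicorn app.main:app --reload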

You may need to grant your individual user access to the bucket depending on how your GCP or AWS account is set up.
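
On GCP, one way to do this is with gsutil; the sketch below uses a placeholder email and bucket name, and the role you actually need may differ:

# Grant your user object admin access on the bucket (placeholder email / bucket name).
gsutil iam ch user:you@example.com:roles/storage.objectAdmin gs://my-bucket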

Once running, you can test your application at http://localhost:8000/docs or by using curl.

To upload a file to your bucket, run:

curl -X 'POST' \
  'http://127.0.0.1:8000/hello-world.txt?file_contents=hello%20world' \
  -H 'accept: application/json'

To download the file you just uploaded, run:

curl -X 'GET' \
  'http://127.0.0.1:8000/hello-world.txt' \
  -H 'accept: application/json'
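
If both requests succeed, the download should return the contents you uploaded, in this case "hello world".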

Deploy your Application

Once you're done testing locally, you can deploy your application to Cloud Run on GCP or ECS Fargate on AWS with a single command:

launchflow deploy

What's next?
