Introduction
How it Works
Overview
LaunchFlow provides tools that make it easy to build and deploy Python APIs in your own GCP / AWS account. You can import Postgres and other cloud resources in your Python code, then deploy everything to dedicated Environments that run in your own cloud account.
You can create and connect to cloud Resources in an Environment by simply importing them in your code. Our SDK provides ready-to-use client APIs for every Resource so you can immediately start using them in your application both locally and when deployed.
You can build and deploy Services in the same Environment as your Resources with a single command. LaunchFlow deploys your APIs to Cloud Run on GCP and ECS Fargate on AWS - your network, service accounts, and other configuration are automatically set up for you.
Everything you build with LaunchFlow deploys to your own GCP / AWS account, giving you full control over your infrastructure and maximum cost savings.
TL;DR: LaunchFlow automates your GCP and AWS configuration / deployments so you can focus on writing your application code.
Define Resources
First, you define your resources anywhere in your Python code. We recommend putting your resources in a file called infra.py, but you are free to put them wherever you like.
import launchflow as lf

postgres = lf.gcp.CloudSQLPostgres("my-pg-database")
If you are using the CLI, resources should be defined as global variables in a Python file.
This tells LaunchFlow that you would like a Postgres database named my-pg-database to be created in your GCP account. To trigger this creation, simply run:
launchflow create
This will prompt you to select the project and environment you would like the database to be created in, and LaunchFlow will handle all of the configuration for where the database lives and how to connect to it.
After this command completes, the connection info is stored securely in your cloud account. This ensures sensitive information like database passwords is never stored in your code or in your git history.
To download the connection info, you can call resource.connect() or await resource.connect_async(). This will update your resource with the connection info, allowing your application to connect to the resource. For our database above, we would do:
# Connect synchronously
postgres.connect()
# Connect asynchronously
await postgres.connect_async()
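Once connected, the resource carries everything your application needs to reach the database. As a rough sketch of what you can do with that info, you could assemble a standard Postgres DSN from the connection fields. Note the field names below are illustrative assumptions, not the SDK's actual attributes:

```python
from dataclasses import dataclass


# Hypothetical connection-info shape. The real SDK populates the
# resource object itself after connect(); these fields are assumptions.
@dataclass
class PostgresConnectionInfo:
    host: str
    port: int
    user: str
    password: str
    database: str


def to_dsn(info: PostgresConnectionInfo) -> str:
    """Build a standard libpq-style Postgres URI from connection fields."""
    return (
        f"postgresql://{info.user}:{info.password}"
        f"@{info.host}:{info.port}/{info.database}"
    )


info = PostgresConnectionInfo(
    host="10.0.0.5", port=5432, user="app",
    password="secret", database="my-pg-database",
)
print(to_dsn(info))
```

A DSN like this can then be handed directly to a driver or ORM (e.g. SQLAlchemy's create_engine) without any secret ever appearing in your source tree.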
Deploy Services
To deploy a service, add a service entry to your launchflow.yaml file:
project: my-project
environment: dev
services:
  - name: my-service
    product: gcp_cloud_run
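If you are deploying to AWS instead, the service entry would point at an ECS Fargate product. A sketch of what that might look like; the exact product identifier here is an assumption, so check the LaunchFlow reference for the real value:

```yaml
project: my-project
environment: dev
services:
  - name: my-service
    # NOTE: assumed identifier for the ECS Fargate product
    product: aws_ecs_fargate
```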
This tells LaunchFlow that my-service should be deployed to GCP Cloud Run. When you run:
launchflow deploy
from the directory containing your launchflow.yaml file, LaunchFlow will deploy your service to the environment you specify. This will upload your code to an artifact bucket in your account and trigger a build in your cloud account (powered by Cloud Build on GCP and CodeBuild on AWS). Once the build is complete, LaunchFlow will create a new revision of your service and deploy it for you.
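Under the hood, a deployed service is just an HTTP app listening on the port the platform provides (Cloud Run passes it via the PORT environment variable). A minimal sketch using only the standard library; a real LaunchFlow service would more likely use a framework like FastAPI:

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


class Handler(BaseHTTPRequestHandler):
    """Responds 200 to every GET with a small text body."""

    def do_GET(self):
        body = b"hello from my-service"
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet


def main() -> None:
    # Cloud Run tells the container which port to bind via the PORT env var.
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("", port), Handler).serve_forever()
```

Calling main() would be the container's entrypoint; the build produced by launchflow deploy packages your app so the platform can start it this way.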
GCP Projects and Environments
For each LaunchFlow environment, we create a new GCP project inside your project's dedicated folder. This keeps all of your resources and services organized and isolated from each other, and gives each environment a dedicated network, which helps with SOC 2 compliance.
When you first sign up for LaunchFlow we create a GCP service account that will be used for admin operations. You will need to grant this service account access to your organization in order to use LaunchFlow with GCP. The following permissions are needed:
| Role | Why? |
| --- | --- |
| Folder Creator (roles/resourcemanager.folderCreator) | Allows LaunchFlow to create a dedicated GCP folder per LaunchFlow project that you create. |
| Organization Viewer (roles/resourcemanager.organizationViewer) | Allows LaunchFlow to associate LaunchFlow project folders with your organization. |
| Billing Account User (roles/billing.user) | Allows LaunchFlow to associate environment GCP projects with your billing account. |
AWS Projects and Environments
For each LaunchFlow environment you create in an AWS project, we provision a VPC to contain and connect all resources. This keeps all of your resources and services organized and isolated from each other, and gives each environment a dedicated network, which helps with SOC 2 compliance.