Setting up Tortoise ORM with FastAPI


With so many asynchronous frameworks appearing in the Python space, I decided to use a fully async tech stack for my current project. Most of these frameworks aren't mature yet, so there aren't many built-in integrations.

My first task was getting Tortoise ORM working with a FastAPI service running under hypercorn. Luckily, this is pretty simple, as Tortoise provides a few examples of other framework integrations, which I decided to follow fairly closely.

This was built using Python 3.7, but it should work with 3.6+ without issue (FastAPI requires at least 3.6). In addition, these are the module versions that were used at the time of writing:

  • hypercorn=0.7.2
  • fastapi=0.34.0
  • tortoise-orm=0.12.8

First, we need to wrap Tortoise's init() function in our own register_tortoise() helper.
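Roughly, the skeleton looks like this – treat it as a sketch, since the keyword arguments simply mirror the ones Tortoise.init() itself accepts, plus the app and a generate_schemas flag:

```python
from typing import Optional

from fastapi import FastAPI


def register_tortoise(
    app: FastAPI,
    config: Optional[dict] = None,
    config_file: Optional[str] = None,
    db_url: Optional[str] = None,
    modules: Optional[dict] = None,
    generate_schemas: bool = False,
) -> None:
    # The body wires Tortoise's lifecycle into the app's startup and
    # shutdown events; the local handler functions are filled in below.
    ...
```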

The notable change we make is passing in our web framework's app object. In this case it is typed as app: FastAPI – but you can easily pop in Starlette instead. If you use a framework that doesn't build off of Starlette, however, you will need to make changes to the event hooking code we're about to write.

We'll need a couple of local functions inside register_tortoise to handle startup, schema generation, and graceful closing.
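A rough sketch of those handlers (the names init_orm and close_orm are just placeholders for whatever you call yours):

```python
from typing import Optional

from fastapi import FastAPI
from tortoise import Tortoise


def register_tortoise(
    app: FastAPI,
    config: Optional[dict] = None,
    config_file: Optional[str] = None,
    db_url: Optional[str] = None,
    modules: Optional[dict] = None,
    generate_schemas: bool = False,
) -> None:
    @app.on_event("startup")
    async def init_orm() -> None:
        # Initialize Tortoise with whichever configuration style was supplied.
        await Tortoise.init(
            config=config,
            config_file=config_file,
            db_url=db_url,
            modules=modules,
        )
        if generate_schemas:
            # Safe by default: schemas that already exist are skipped.
            await Tortoise.generate_schemas()

    @app.on_event("shutdown")
    async def close_orm() -> None:
        # Close all open connections gracefully when the server stops.
        await Tortoise.close_connections()
```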

The important parts here are the strings that the @app.on_event() decorators take. 'startup' and 'shutdown' are Starlette-specific events, inherited by FastAPI, that hook into the server starting and shutting down. If you were using another async framework, you would need its specific event hook strings.

All these functions do is handle Tortoise initialization and teardown alongside our server app. In addition, we check whether the generate_schemas parameter passed to register_tortoise is True, and generate the schemas if so. Remember, default Tortoise behavior is to skip generating a schema if it already exists.

Now we can import this function wherever we define our FastAPI application.
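For example (the db import path and config.json filename below are just placeholders for your own project layout):

```python
from fastapi import FastAPI

# Import the helper from wherever you defined it; "db" is a placeholder module.
from db import register_tortoise

app = FastAPI()

register_tortoise(
    app,
    config_file="config.json",
    generate_schemas=True,
)
```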

Note the path we are passing as the config_file parameter – it should point to a .json (or .yaml) file that looks similar to the sketch below.
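The asyncpg engine and credentials here are placeholders – swap in SQLite or your own database settings:

```json
{
    "connections": {
        "default": {
            "engine": "tortoise.backends.asyncpg",
            "credentials": {
                "host": "localhost",
                "port": 5432,
                "user": "postgres",
                "password": "postgres",
                "database": "my_database"
            }
        }
    },
    "apps": {
        "models": {
            "models": ["apps.models.models"],
            "default_connection": "default"
        }
    }
}
```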

The apps.models.models structure should point to the location where you plan to store your Tortoise models.

You can also use either of the other two configuration options (config, or db_url together with modules). The Tortoise documentation covers how those are used if you choose to go that route.
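For instance, a sketch of the same registration using a db_url and modules pair instead of a config file – the connection string and module path are again placeholders:

```python
register_tortoise(
    app,
    db_url="sqlite://db.sqlite3",
    modules={"models": ["apps.models.models"]},
    generate_schemas=True,
)
```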

There we have it – after plugging all this in, you can start defining models and using Tortoise's functions to add or consume data! Major thanks to the Tortoise contributors for providing some easy-to-grasp examples that made getting this working a breeze!
