### Prerequisites

- [Vagrant](../scripts/vagrant/README.md)
- Python 3.9
- Pipenv

### Development environment

```bash
cd openreplay/api

# Make your own copy of the .env file and edit it as you like
cp .env.dev .env

# Create a .venv folder to contain all your dependencies
mkdir .venv

# Install the dependencies (pipenv will detect the .venv folder and use it as a target)
pipenv install -r requirements.txt [--skip-lock]
```

### Building and deploying locally

```bash
cd openreplay-contributions
vagrant ssh
cd openreplay-dev/openreplay/scripts/helmcharts

# For the complete list of options:
# bash local_deploy.sh help
bash local_deploy.sh api
```

### Autogenerated API frontend

The API can autogenerate a frontend that documents its interface and lets you exercise it in a limited way. Make sure the current `.env` contains the following variables:

```
docs_url=/docs
root_path=''
```

If the `.env` in use is based on `env.default`, this is already the case. Start, or restart, the HTTP server, then go to `https://127.0.0.1:8000/docs`. The documentation is autogenerated from the Pydantic schemas, FastAPI routes, and docstrings :wink:. Happy experimenting, and then documenting!

### psycopg3 API

I keep mixing up the psycopg v2 and v3 APIs. For the record, psycopg3's async API looks like the following pseudo code:

```python
async with app.state.postgresql.connection() as cnx:
    async with cnx.transaction():
        cursor = await cnx.execute("SELECT EXISTS(SELECT 1 FROM public.tenants)")
        row = await cursor.fetchone()
        return row["exists"]
```

Mind the following:

- `app.state.postgresql` is the PostgreSQL connection pool;
- Wrap explicit transactions with `async with cnx.transaction(): ...`;
- Most of the time the transaction object itself is not used;
- Await the `execute` calls on `cnx`;
- `await cnx.execute(...)` returns a cursor object;
- Call `await cursor.fetchone()` / `fetchall()` / `fetchmany()` on the cursor object returned by `execute`.
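
For a self-contained picture of how this fits together, here is a minimal sketch that wires a `psycopg_pool.AsyncConnectionPool` into a FastAPI app and uses the pattern above in a route. The DSN, the `app.state.postgresql` attribute name, the lifespan wiring, and the `/tenants/exists` route are assumptions for illustration, not necessarily how the OpenReplay API sets things up:

```python
# Minimal sketch, assuming psycopg >= 3 and psycopg_pool are installed.
# The DSN, attribute name and route below are hypothetical examples.
from contextlib import asynccontextmanager

from fastapi import FastAPI
from psycopg.rows import dict_row
from psycopg_pool import AsyncConnectionPool


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Open the pool on startup; dict_row lets us address columns as row["exists"].
    app.state.postgresql = AsyncConnectionPool(
        "postgresql://postgres:postgres@localhost:5432/postgres",  # assumed DSN
        kwargs={"row_factory": dict_row},
        open=False,
    )
    await app.state.postgresql.open()
    yield
    await app.state.postgresql.close()


app = FastAPI(lifespan=lifespan)


@app.get("/tenants/exists")
async def tenants_exist() -> bool:
    # Borrow a connection from the pool, run one statement in an explicit
    # transaction, and fetch the single row from the returned cursor.
    async with app.state.postgresql.connection() as cnx:
        async with cnx.transaction():
            cursor = await cnx.execute("SELECT EXISTS(SELECT 1 FROM public.tenants)")
            row = await cursor.fetchone()
            return row["exists"]
```

Note that `row["exists"]` only works because the pool passes `row_factory=dict_row` to each connection; with the default row factory, rows come back as tuples.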