Update documentation #102

Merged
mattn merged 1 commit from matt/readme into main 2023-04-27 14:06:16 +12:00
8 changed files with 254 additions and 331 deletions


@ -82,3 +82,6 @@ clean: stop
reset: clean
docker-compose down --volumes --remove-orphans
userguide: README.md docs
docs/build-pdf.sh

README.md

@ -1,31 +1,31 @@
# RightTree
# Right Plant
Right Plant Right Place Right Time implementation using React and Django.
## Initial Setup
Before running the applications, please ensure the following prerequisites have been met.
#### Software
Most applications in this repository are built using Docker, which resolves many dependencies, but you will require a local installation of `git`, `docker` and `docker-compose`.
### Prerequisite Software
```bash
$ sudo apt install git docker-compose
```
Ensure you have the following software tools installed on your local machine before continuing with the setup.
To install `docker`, follow the [official installation documentation](https://docs.docker.com/get-docker/). [Instructions are also available for `docker-compose`](https://docs.docker.com/compose/install/).
* [Docker](https://docs.docker.com/desktop/install/linux-install/)
* [docker-compose](https://docs.docker.com/compose/install/linux/#install-using-the-repository)
* Make
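As a quick sanity check (a sketch; the exact commands depend on how you installed the tools), you can confirm the prerequisites are available:
```bash
# Confirm the prerequisite tooling is on PATH (version output will vary)
git --version
docker --version
docker compose version || docker-compose --version
make --version
```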
In order to receive address data while running in development mode you will need to set an environment variable containing a valid LINZ Data Service API key. Such a key can be retrieved by signing up at https://data.linz.govt.nz/. One way of setting the variable is exporting it in the same terminal window in which you will run the application. To do this, please create a `.env` file in the root directory using `default.env` as an example and fill in values as appropriate.
### LINZ API Key
You may also need to give the `dev` script executable permissions using the following command:
In order to receive address data you will need to supply a LINZ API key. Such a key can be retrieved by signing up at https://data.linz.govt.nz/. Your key should be placed in the `.env` file as `LINZ_API_KEY`.
```
chmod +x ./dev
```
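As an illustration of the LINZ key entry only (the value shown is a placeholder, not a real key), the relevant `.env` line might look like:
```bash
# .env (copied from default.env); replace the placeholder with your own key
LINZ_API_KEY=replace-with-your-linz-data-service-api-key
```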
### Required Layers
### Add shapefiles for database population
Download and extract the following geospatial layers to `backend/right_tree/api/data/resources`:
Please unzip and add the following shapefiles to the `./backend/right_tree/api/data/resources` directory. Each extracted directory should include all the files that make up the shapefile and use the naming conventions shown below:
* [Ecological Districts](https://catalogue.data.govt.nz/dataset/ecological-districts)
* [Greater Christchurch Area](https://catalogue.data.govt.nz/dataset/greater-christchurch-area2)
* [Fundamental Soil Layers New Zealand Soil Classification](https://catalogue.data.govt.nz/dataset/fundamental-soil-layers-new-zealand-soil-classification)
Your filesystem structure should appear similar to the following:
**Ecological Districts Shapefile:**
```
backend/right_tree/api/data/resources/ecological_districts/
- DOC_EcologicalDistricts_2021_08_02.cpg
@ -37,8 +37,6 @@ backend/right_tree/api/data/resources/ecological_districts/
- DOC_EcologicalDistricts_2021_08_02.shp.xml
- DOC_EcologicalDistricts_2021_08_02.shx
```
**Fundamental Soil Layers Shapefile:**
```
backend/right_tree/api/data/resources/fundamental_soil_layers/
- fundamental-soil-layers-new-zealand-soil-classification.cpg
@ -48,132 +46,217 @@ backend/right_tree/api/data/resources/fundamental_soil_layers/
- fundamental-soil-layers-new-zealand-soil-classification.shx
- fundamental-soil-layers-new-zealand-soil-classification.xml
```
**Christchurch Zone Shapefile:**
```
backend/right_tree/api/data/resources/chch_zone/
- Greater_Christchurch_Area.cpg
- Greater_Christchurch_Area.shp
- Greater_Christchurch_Area.dbf
- Greater_Christchurch_Area.shx
- Greater_Christchurch_Area.prj
```
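The layer downloads above arrive as ZIP archives; a minimal extraction sketch (the ZIP filename is a placeholder, and the extracted file names should match the listings above) is:
```bash
# A sketch: extract a downloaded layer into its expected resources subdirectory
mkdir -p backend/right_tree/api/data/resources/ecological_districts
unzip ~/Downloads/ecological-districts.zip \
  -d backend/right_tree/api/data/resources/ecological_districts
```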
### Add spreadsheet data for database population
### Plant Data Spreadsheet
The plant spreadsheet should be renamed as `plant_data.xlsx` and placed in the `./backend/right_tree/api/data/resources` directory.
## Running application for development
### Initial build
A valid plant data spreadsheet must be provided and placed at `backend/right_tree/api/data/resources/plant_data.xlsx` within the filesystem.
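For example (the source filename here is hypothetical; the destination path and name are the ones required above):
```bash
# Rename and place a supplied spreadsheet as the expected plant_data.xlsx
mv ~/Downloads/supplied-plant-spreadsheet.xlsx \
   backend/right_tree/api/data/resources/plant_data.xlsx
```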
Builds the Django backend docker image. This may need to be re-run if any new dependencies are added.
```
./dev build
```
## Data Ingest
---
**WARNING:** The supplied `Makefile` does not differentiate between development and production environments. If configuring the application for production, it is recommended to rename `docker-compose.production.yaml` to `docker-compose.yaml`, overwriting the existing file.
---
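A sketch of that rename (not a required step for development; it overwrites the existing development compose file as described above):
```bash
# Production only: rename the production compose file over the development one
mv -f docker-compose.production.yaml docker-compose.yaml
```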
To initialise the database, first run the Django migrations to create the appropriate tables.
```bash
$ make migrate
```
### Initialise database
It is then safe to apply the Django fixtures and load the geospatial layers (ecological districts, region boundaries, etc.).
Creates `right_tree` database and installs `postgis` extensions.
```
./dev init_database
```
```bash
$ make ingest
```
### Run web application
For address searching, the LINZ 'NZ Street Address' layer must be loaded into a separate schema.
Starts up the applications including the frontend, backend and database.
```
./dev start
```
```bash
$ make ingest_linz
```
Finally, create a superuser with which the admin interface can be accessed. The tool will prompt you to supply a username, email address, and password interactively.
```bash
$ make superuser
```
## Running the Application
Assuming the database has been configured in accordance with the [Data Ingest](#data-ingest) section, you should be able to start the application with
```bash
$ make start
```
`CTRL + C` will detach from the log output, but the containers will keep running. To stop all running containers, execute
```bash
$ make stop
```
## The User Interface
The development web interface can be accessed at http://localhost. The production server is hosted at https://rightplant.biospherecapital.com/.
The application is contained within a single page and offers a questionnaire to its users. To access the questionnaire, a user must supply a valid activation key, which they may have been given or can purchase for a fee via a [Stripe](https://stripe.com) payment portal.
The user workflow is as follows:
![User workflow](docs/img/flowchart.png)
## The Admin Interface
The admin interface can be accessed from http://localhost/admin or https://rightplant.biospherecapital.com/admin.
Several sections within this interface are especially important for retrieving and exporting completed questionnaires.
| Model | Description |
| --- | --- |
| Activation key | A consumable activation key that can access the questionnaire a specific number of times |
| Activation key set | A named group of Activation key objects, most useful for bulk exports. 'Stripe - digital' and 'Stripe - physical' are reserved names for digital and physical copy orders by individuals. |
| Customer | An identity tied to a physical copy order |
| Customer address | The physical address of a Customer, used for mailing physical copies. This is the address specified when a user fills out their shipping information from a Stripe checkout. |
| Questionnaire | A set of location and habitat information for a completed questionnaire, tied to an Activation key. These are filterable by their associated Activation key set in order to bulk export questionnaires and planting guides for only those who have ordered physical copies. |
| Export | An export job of one or more questionnaires, which can be downloaded as a ZIP archive of PDF documents containing the planting guide for each exported Questionnaire. |

Once running the components can be accessed as follows:

| Application | Hosted |
| --- | --- |
| React Frontend | http://localhost:3000 |
| Django Backend | http://localhost:8000 |
| Database | postgis://localhost:5432 |
## Available commands
## Maintenance
Other commands can be run using the following.
```
./dev <command>
```
### Renewing SSL Certificates
To generate new SSL certificates, run
```bash
$ make cert
```
A summary of available commands is outlined below. Note that if a command requires the application to be running (`Requires Run`), please execute `./dev start` in another terminal before running that command.
You may need to restart the running NGINX container for the change to take effect. You can do this by running
| Command | Description | Requires Run |
| --- | --- | --- |
| `create_database` | Removes the existing database and data. Then it creates the `right_tree` database within a fresh postgis database instance. | No
| `makemigrations` | Performs the django `makemigrations` command in the backend container. | Yes
| `migrate` | Performs the django `migrate` command in the backend container. | Yes
| `createsuperuser` | Performs the django `createsuperuser` command in the backend container. | Yes
| `load_fixtures` | Performs the django `loaddata` command in the backend container. This loads all the fixtures found in the `/backend/right_tree/api/data/fixtures` directory. | Yes
| `load_shapefiles` | Performs the custom `loadshapefiles` command in the backend container. This loads the ecological districts and soil layers shape files in `c`. | Yes
| `create_plant_fixtures` | Performs the custom `createplantfixtures` command in the backend container. This loads the plant spreadsheet data from `/backend/right_tree/api/data/resources/plant_data.xlsx`. Requires the fixtures to be applied and shapefiles loaded. | Yes
| `reset_plants` | Performs the custom `resetplants` command in the backend container. This removes all plant entries from the database. | Yes
| `load_plant_fixtures` | Loads the `/backend/right_tree/api/data/fixtures/plants.json` fixture. Requires the `plants.json` file to be created (`./dev create_plant_fixtures`) and the plant table to be empty (`./dev reset_plants`). | Yes
| `load_plants` | Creates plants fixtures and loads them into a fresh plant table in the database. Requires the fixtures to be applied and shapefiles loaded. | Yes
| `load_sites_from_spreadsheet` | Loads site spreadsheet data into the database initially (replaced with fixtures containing further information) | Yes
| `populate_database` | Populates the `right_tree` database with base data (fixtures), provided shapefiles and plant spreadsheet data. Requires the database to be created. | No
| `init_database` | Creates and populates the database | No
| `reset_database` | Removes, recreates and populates the database | No
| `start` | Runs all services including the frontend, backend and postgres database | No
| `build` | Builds required images (frontend and backend) for development | No
| `build_production` | Builds required images (frontend and backend) for production | No
| `start_production` | Runs all services in production mode including the frontend, backend and postgres database | No
| `renew_certificate` | Renews certificates for production | No
| `process_svg_files` | Removes semi-colons from raw svg files to be compatible with the application | No
```bash
$ docker compose restart nginx
```
### View Logs
To view and follow the logs of all running containers, run
```bash
$ make logs
```
To view individual container logs, you may interact with `docker-compose` directly, e.g.
```bash
$ docker compose logs [-f] <container-name>
```
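For example, to follow only the Django backend's logs (assuming the compose service is named `backend`):
```bash
$ docker compose logs -f backend
```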
### Django Shell
**WARNING:** The following assumes you have an understanding of Python, Django, and its ORM. As these tools are powerful and potentially destructive, it is recommended to use the [Admin Interface](#the-admin-interface) unless you know what you are doing.
To access an interactive Python shell on the Django project, run
```bash
$ make shell
```
### PostgreSQL `psql` CLI Access
**WARNING:** The `psql` shell provided has superuser privileges and can make permanent changes to actively used production data if not handled properly.
To perform administrative tasks at the database-level, you may access an interactive superuser `psql` shell by running
```bash
$ make psql
```
### Application Reset
---
**WARNING:** This is a destructive operation and will permanently remove all data in the running application. It is recommended that you backup any required questionnaires, keys, layers, etc. before continuing.
---
To reset your working environment, run
```bash
$ make reset
```
This will restore the repository to its initial state, from which you will need to follow the [Initial Setup](#initial-setup) steps again before running the application.
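A condensed sketch of that re-setup sequence, using the targets described earlier in this README (shapefiles, spreadsheet and `.env` still need to be in place first):
```bash
$ make reset
# then repeat the setup described above, for example:
$ make migrate
$ make ingest
$ make ingest_linz
$ make superuser
$ make start
```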
## Running Configuration
Right Plant is currently live and operates with the following configuration.
### Cloud Tenant
The Right Plant VM lives in the Hamilton (nz-hlz-1) region of Catalyst Cloud, with an IP address of 103.197.61.141.
It runs a minimal installation of Ubuntu 20.04, which will be supported until April 2025. Patching the server is as simple as
```
# apt update
# apt dist-upgrade -y
```
and a system reboot, if required. It has not been configured for automatic upgrades.
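If you want to reboot only when the OS actually requests it, Ubuntu's standard marker file can be checked (a sketch, run as root):
```bash
# Reboot only if Ubuntu has flagged that one is required
[ -f /var/run/reboot-required ] && reboot
```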
### SSH
SSH is available on port 22 for Catalyst management purposes, and 43212 for general user access. SSH keys are required to log into the server; password authentication should not be enabled.
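For example, general user access might look like the following (the username is a placeholder; key-based authentication is assumed):
```bash
$ ssh -p 43212 your-user@103.197.61.141
```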
### Code
The code and all configuration can be found on the VM in `/opt/rightplant`.
Code is deployed as a git repository - any changes made will be tracked by git. It is not possible to push changes to the Catalyst-stored git repository.
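To see what, if anything, has drifted from the deployed revision, a quick read-only check on the VM is:
```bash
$ cd /opt/rightplant
$ git status
$ git log -3 --oneline
```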
## Development
### Backend
The backend code is stored under `backend/`. It uses
* Python 3.11
* Django web framework (version 3.2)
* Django REST framework for its API
* Pandas for CSV parsing
* pdfkit and PyPDF2 for generating planting guide PDFs
* Celery for async tasks like bulk questionnaire exports
* Redis
* firstly as a message queue for Celery
* secondly as a key-value store for mapping activation keys to Stripe checkout sessions
* not used for database caching at all
* Gunicorn as a WSGI server
* PostGIS
* `public.righttree` is used as the main Django database
* `linz.nz_street_address` is a copy of the [LINZ layer of the same name](https://data.linz.govt.nz/layer/53353-nz-street-address/) used for searching addresses
### Frontend
The frontend code is stored under `frontend/`. It uses
* React 17
* React Router (used by a past feature which has since been integrated into the single-page app)
* React Context for state management
* Material UI for components
* Axios for XHR requests
* Sass for styling
* Leaflet for map viewer
* Bootstrap/Reactstrap for styling conveniences (should double check if we need this)
### Creating Habitat Images
## Creating zones for habitat images
1. Create a png image from the original svg with an appropriate crop.
2. Create zone polygons/rectangles on the original svg, using the divider lines' anchor points as a guide
3. Copy the zone polygons/rectangles to the png image and size to fit (this is to ensure the only paths on the image are the selectable ones)
4. Ensure all overlays have an almost transparent fill (lowest transparency value - in Inkscape this is 1) and no outline
5. Add a 'label' (not an id) to each overlay to match a column name relating to the zone segment; labels may be repeated. In Inkscape this is under 'Object Properties'.
6. Save the png with overlays as an svg (it may be either Inkscape or plain svg)
7. Place the svg in the relevant directory (`./frontend/src/assets/img/habitatSVG/`) in the frontend
8. Find and replace any instance of colons (:) in property names in the raw svg, e.g. inkscape:label -> inkscapelabel. A helper script has been written to do this automatically; please run `python process_svg.py` (an equivalent one-off `sed` sketch is shown below).
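For reference, the substitution from step 8 could also be done as a one-off with `sed` (a sketch; the filename is a placeholder, and `python process_svg.py` remains the supported route):
```bash
# Strip the namespace colon from Inkscape label attributes in a single SVG
sed -i 's/inkscape:label/inkscapelabel/g' frontend/src/assets/img/habitatSVG/your-habitat.svg
```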
## Setting up and running the application for production
1. Ensure the prerequisites are met as defined in [Initial Setup](#initial-setup)
2. Create an `.env` file (if not done prior) in the root directory using `default.env` as an example. Uncomment values relating to production and fill in the values as appropriate.
3. Build backend image `sudo ./dev build_production`
4. Create the database `sudo ./dev create_database`
5. Manually create a postgres user with a password and grant the user all permissions on the `righttree` database.
Create an interactive terminal into the postgres container
```bash
sudo docker-compose -f docker-compose.production.yaml up -d postgres
sudo docker exec -it postgres bash
```
Within the interactive terminal, connect to the database, add the `righttree_admin` user and grant permissions. Please use the same credentials as defined in `.env`.
```bash
psql -U postgres
\c righttree
CREATE USER righttree_admin;
ALTER USER righttree_admin with encrypted password 'YOUR PASSWORD';
GRANT ALL PRIVILEGES ON DATABASE righttree TO righttree_admin;
```
Exit the container and stop postgres service:
```
[CTRL-D] - to exit psql THEN [CTRL-D] to exit container
sudo docker-compose -f docker-compose.production.yaml down
```
6. Populate the database using `sudo ./dev populate_database`
7. Build the optimised frontend and collect static files together: `sudo ./dev create_staticfiles`
8. Create a django superuser for access to the admin interface, using the same credentials as defined in `.env`: `sudo ./dev createsuperuser`
9. Run the production application using `sudo ./dev start_production`
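Once step 9 has completed, a quick way to confirm the production stack is up (a sketch) is:
```bash
sudo docker-compose -f docker-compose.production.yaml ps
```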
### Setting up certificates
Create a certificate using certbot and Let's Encrypt; choose option 1 and provide an appropriate email address. Ensure ports 80 and 443 are externally exposed for the domain before running this command. To retrieve a staging certificate, use the `--test-cert` flag.
```
sudo docker run -i --rm --name certbot -p 443:443 -p 80:80 -v /etc/letsencrypt:/etc/letsencrypt/ certbot/certbot certonly -d [YOUR DOMAIN] --logs-dir /etc/letsencrypt/logs
```


@ -1,12 +1,19 @@
#!/usr/bin/env bash
# Uses https://gitlab.wgtn.cat-it.co.nz/nickg/pandoc-catalyst-template
# to build Markdown -> LaTeX -> PDF
set -e
BASE_DIR=$(dirname $(dirname $0))
cp $BASE_DIR/docs/template.md $BASE_DIR/docs/userguide.md
cat $BASE_DIR/README.md >> $BASE_DIR/docs/userguide.md
sed -i 's/docs\///' $BASE_DIR/docs/userguide.md
docker run \
--rm \
-v $PWD:/source \
-v $BASE_DIR/docs:/source \
-w /source \
-u $(id -u):$(id -g) \
gitlab.wgtn.cat-it.co.nz:4567/nickg/pandoc-catalyst-template:latest \
userguide.md > rightplant-documentation.pdf
convert userguide.md
mv $BASE_DIR/docs/userguide.md.pdf $BASE_DIR/docs/rightplant-documentation.pdf
rm -f $BASE_DIR/docs/userguide.md

docs/img/flowchart.md Normal file

@ -0,0 +1,29 @@
flowchart TB
access[Access site]
activation[Activation]
valid[Valid key?]
remaining[Activations remaining?]
checkout[Stripe Checkout]
questionnaire[Questionnaire]
activate[Activate key]
results[Show table of results]
error[Error]
physical[Ordered physical copy?]
receive[Receive physical copy in mail]
done[Done]
access-->activation
activation-->checkout
checkout-->activation
activation-->valid
valid--"No"-->error
valid--"Yes"-->remaining
remaining--"Yes"-->questionnaire
remaining--"No"-->results
error-->activation
questionnaire-->activate
activate-->results
results-->physical
physical--"No"-->done
physical--"Yes"-->receive
receive-->done

docs/img/flowchart.png Normal file

Binary file added (63 KiB); content not shown.

docs/template.md Normal file

@ -0,0 +1,12 @@
---
mainfont: DejaVuSans
documentclass: article
geometry: margin=2.54cm
title: Right Plant Systems Management Guide
toc: true
coverpage: true
covertitle: Right Plant Systems Management Guide
clientlogo: img/biosphere.png
clientlogoheight: 1.5cm
packed-figures: true
---


@ -1,211 +0,0 @@
---
mainfont: DejaVuSans
documentclass: article
geometry: margin=2.54cm
title: Right Plant Systems Management Guide
toc: true
coverpage: true
covertitle: Right Plant Systems Management Guide
clientlogo: img/biosphere.png
clientlogoheight: 1.5cm
packed-figures: true
---
# Architecture
Right Plant is built on a stack of a React frontend built in JavaScript, a simple Django REST API as a backend and PostGIS for data storage and persistence.
All the components are configured to be deployed using `docker-compose`.
# Running Configuration
## OS and Supporting Software
The Right Plant VM lives in the Hamilton (nz-hlz-1) region of Catalyst Cloud, with an IP address of 103.197.61.141.
It runs a minimal installation of Ubuntu 20.04, which will be supported until April 2025. Patching the server is as simple as `sudo apt update; sudo apt dist-upgrade -y` and a system reboot if required. It has not been configured for automatic upgrades.
### SSH
SSH is available on port 22 for Catalyst management purposes, and 43212 for general user access. SSH keys are required to log into the server; password authentication should not be enabled.
### Code
The code and all configuration can be found on the VM in `/opt/rightplant`.
Code is deployed as a git repository - any changes made will be tracked by git. It is not possible to push changes to the Catalyst-stored git repository.
### Observing running configuration
Because Right Plant is deployed using Docker, the root user can access the docker daemon and execute CLI commands, such as `docker ps` to view running containers and `docker logs <container-name>` to view logs from the containers.
In general, running `docker <command> --help` will get you useful information about how to interact with the docker CLI.
\pagebreak
# Developer README
> This section has been lifted from the `README.md` file, that can be found in the git repository. It is aimed at developers and how to develop and run Right Plant.
### Add shapefiles for database population
Please unzip and add the following shapefiles to the `./backend/right_tree/api/data/resources` directory. Each extracted directory should include all the files that make up the shapefile and use the naming conventions shown below:
**Ecological Districts Shapefile:**
```
backend/right_tree/api/data/resources/ecological_districts/
- DOC_EcologicalDistricts_2021_08_02.cpg
- DOC_EcologicalDistricts_2021_08_02.dbf
- DOC_EcologicalDistricts_2021_08_02.prj
- DOC_EcologicalDistricts_2021_08_02.sbn
- DOC_EcologicalDistricts_2021_08_02.sbx
- DOC_EcologicalDistricts_2021_08_02.shp
- DOC_EcologicalDistricts_2021_08_02.shp.xml
- DOC_EcologicalDistricts_2021_08_02.shx
```
**Fundamental Soil Layers Shapefile:**
```
backend/right_tree/api/data/resources/fundamental_soil_layers/
- fundamental-soil-layers-new-zealand-soil-classification.cpg
- fundamental-soil-layers-new-zealand-soil-classification.dbf
- fundamental-soil-layers-new-zealand-soil-classification.prj
- fundamental-soil-layers-new-zealand-soil-classification.shp
- fundamental-soil-layers-new-zealand-soil-classification.shx
- fundamental-soil-layers-new-zealand-soil-classification.xml
```
**Christchurch Zone Shapefile:**
```
backend/right_tree/api/data/resources/chch_zone/
- Greater_Christchurch_Area.cpg
- Greater_Christchurch_Area.shp
- Greater_Christchurch_Area.dbf
- Greater_Christchurch_Area.shx
- Greater_Christchurch_Area.prj
```
### Add spreadsheet data for database population
The plant spreadsheet should be renamed as `plant_data.xlsx` and placed in the `./backend/right_tree/api/data/resources` directory.
## Running application for development
### Initial build
Builds the Django backend docker image. This may need to be re-run if any new dependencies are added.
```
./dev build
```
### Initialise database
Creates `right_tree` database and installs `postgis` extensions.
```
./dev init_database
```
### Run web application
Starts up the applications including the frontend, backend and database.
```
./dev start
```
Once running the components can be accessed as follows:
| Application | Hosted |
| --- | --- |
| React Frontend | http://localhost:3000 |
| Django Backend | http://localhost:8000 |
| Database | postgis://localhost:5432 |
## Available commands
Other commands can be run using the following.
```
./dev <command>
```
A summary of available commands is outlined below. Note that if a command requires the application to be running (`Requires Run`), please execute `./dev start` in another terminal before running that command.
| Command | Description | Requires Run |
| --- | --- | --- |
| `create_database` | Removes the existing database and data. Then it creates the `right_tree` database within a fresh postgis database instance. | No
| `makemigrations` | Performs the django `makemigrations` command in the backend container. | Yes
| `migrate` | Performs the django `migrate` command in the backend container. | Yes
| `createsuperuser` | Performs the django `createsuperuser` command in the backend container. | Yes
| `load_fixtures` | Performs the django `loaddata` command in the backend container. This loads all the fixtures found in the `/backend/right_tree/api/data/fixtures` directory. | Yes
| `load_shapefiles` | Performs the custom `loadshapefiles` command in the backend container. This loads the ecological districts and soil layers shape files in `c`. | Yes
| `create_plant_fixtures` | Performs the custom `createplantfixtures` command in the backend container. This loads the plant spreadsheet data from `/backend/right_tree/api/data/resources/plant_data.xlsx`. Requires the fixtures to be applied and shapefiles loaded. | Yes
| `reset_plants` | Performs the custom `resetplants` command in the backend container. This removes all plant entries from the database. | Yes
| `load_plant_fixtures` | Loads the `/backend/right_tree/api/data/fixtures/plants.json` fixture. Requires the `plants.json` file to be created (`./dev create_plant_fixtures`) and the plant table to be empty (`./dev reset_plants`). | Yes
| `load_plants` | Creates plants fixtures and loads them into a fresh plant table in the database. Requires the fixtures to be applied and shapefiles loaded. | Yes
| `load_sites_from_spreadsheet` | Loads site spreadsheet data into the database initially (replaced with fixtures containing further information) | Yes
| `populate_database` | Populates the `right_tree` database with base data (fixtures), provided shapefiles and plant spreadsheet data. Requires the database to be created. | No
| `init_database` | Creates and populates the database | No
| `reset_database` | Removes, recreates and populates the database | No
| `start` | Runs all services including the frontend, backend and postgres database | No
| `build` | Builds required images (frontend and backend) for development | No
| `build_production` | Builds required images (frontend and backend) for production | No
| `start_production` | Runs all services in production mode including the frontend, backend and postgres database | No
| `renew_certificate` | Renews certificates for production | No
| `process_svg_files` | Removes semi-colons from raw svg files to be compatible with the application | No
## Creating zones for habitat images
1. Create a png image from the original svg with an appropriate crop.
2. Create zone polygons/rectangles on the original svg, using the divider lines' anchor points as a guide
3. Copy the zone polygons/rectangles to the png image and size to fit (this is to ensure the only paths on the image are the selectable ones)
4. Ensure all overlays have an almost transparent fill (lowest transparency value - in Inkscape this is 1) and no outline
5. Add a 'label' (not an id) to each overlay to match a column name relating to the zone segment; labels may be repeated. In Inkscape this is under 'Object Properties'.
6. Save the png with overlays as an svg (it may be either Inkscape or plain svg)
7. Place the svg in the relevant directory (`./frontend/src/assets/img/habitatSVG/`) in the frontend
8. Find and replace any instance of colons (:) in property names in the raw svg, e.g. inkscape:label -> inkscapelabel. A helper script has been written to do this automatically; please run `python process_svg.py`.
## Setting up and running the application for production
1. Ensure the prerequisites are met as defined in [Initial Setup](#initial-setup)
2. Create an `.env` file (if not done prior) in the root directory using `default.env` as an example. Uncomment values relating to production and fill in the values as appropriate.
3. Build backend image `sudo ./dev build_production`
4. Create the database `sudo ./dev create_database`
5. Manually create a postgres user with a password and grant the user all permissions on the `righttree` database.
Create an interactive terminal into the postgres container
```bash
sudo docker-compose -f docker-compose.production.yaml up -d postgres
sudo docker exec -it postgres bash
```
Within the interactive terminal, connect to the database, add the `righttree_admin` user and grant permissions. Please use the same credentials as defined in `.env`.
```bash
psql -U postgres
\c righttree
CREATE USER righttree_admin;
ALTER USER righttree_admin with encrypted password 'YOUR PASSWORD';
GRANT ALL PRIVILEGES ON DATABASE righttree TO righttree_admin;
```
Exit the container and stop postgres service:
```
[CTRL-D] - to exit psql THEN [CTRL-D] to exit container
sudo docker-compose -f docker-compose.production.yaml down
```
6. Populate the database using `sudo ./dev populate_database`
7. Build the optimised frontend and collect static files together: `sudo ./dev create_staticfiles`
8. Create a django superuser for access to the admin interface, using the same credentials as defined in `.env`: `sudo ./dev createsuperuser`
9. Run the production application using `sudo ./dev start_production`
### Setting up certificates
Create a certificate using certbot and Let's Encrypt; choose option 1 and provide an appropriate email address. Ensure ports 80 and 443 are externally exposed for the domain before running this command. To retrieve a staging certificate, use the `--test-cert` flag.
```
sudo docker run -i --rm --name certbot -p 443:443 -p 80:80 \
-v /etc/letsencrypt:/etc/letsencrypt/ certbot/certbot certonly \
-d [YOUR DOMAIN] --logs-dir /etc/letsencrypt/logs
```