Get information about Origami repositories. See the production service for API information.
Running Origami Repo Data requires Node.js 14.x and npm. A PostgreSQL database is also required.
If you're working on a Mac, the simplest way to install PostgreSQL is to use Homebrew. Run the following, and pay attention to the instructions that Homebrew outputs after installing:
brew install postgresql
Before we can run the application, we'll need to install dependencies:
npm install
Run PostgreSQL locally. If you used brew to install PostgreSQL on a Mac, run:
brew services start postgresql
Then create a local PostgreSQL database. You may need to provide credentials for the following command, depending on your local setup:
make db-create
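If `make db-create` fails because of your local credentials, you can create the database directly with PostgreSQL's own tooling. This is a sketch only: the database name below is an assumption (mirroring the `origami-repo-data-test` name used for the integration tests later in this document), so check the Makefile for the name the make target actually uses.

```sh
# Assumed database name – confirm against the db-create target in the Makefile
createdb origami-repo-data
```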
Now you'll need to migrate the database, which sets up the required tables. You'll also need to run this command if you pull commits which include new database migrations:
make db-migrate-up
Run the application in development mode with:
make run-dev
Now you can access the app over HTTP on port 8080: http://localhost:8080/
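To confirm the server is up you can hit it with curl. The `/__health` path below is an assumption based on the FT service convention (the runbook section later mentions healthcheck endpoints); the root URL alone is enough to check the app is responding.

```sh
# Check the app responds locally
curl -i http://localhost:8080/

# Assumed FT-convention health endpoint – not documented here, treat as illustrative
curl -i http://localhost:8080/__health
```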
We configure Origami Repo Data using environment variables. In development, configurations are set in a `.env` file. In production, these are set through Heroku config. Further documentation on the available options can be found in the Origami Service documentation.
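As a rough sketch, a local `.env` might look like the example below. Every value is illustrative (the database name and token are placeholders, not real credentials); the full set of options is described next.

```
# Illustrative values only – replace with your own local settings
DATABASE_URL=postgres://localhost:5432/origami-repo-data
GITHUB_AUTH_TOKEN=replace-with-a-real-token
PORT=8080
NODE_ENV=development
ENABLE_SETUP_STEP=true
```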
`ENABLE_SETUP_STEP`
: Set to `true` in order to allow the creation of an admin key using the `/v1/setup` endpoint. Once a key has been created this way, this configuration should be removed for security reasons.

`BUILD_SERVICE_URL`
: The URL of the Origami Build Service to use for component URLs such as demos.

`DATABASE_URL`
: A PostgreSQL connection string, with write permission on a database.

`GITHUB_AUTH_TOKEN`
: A GitHub auth token which has read access to all Financial Times repositories.

`NPM_REGISTRY`
: The URL of the npmjs registry to use when ingesting via npm.

`NODE_ENV`
: The environment to run the application in. One of `production`, `development` (default), or `test` (for use in automated tests).

`PORT`
: The port to run the application on.

`CMDB_API_KEY`
: The CMDB API key to use when updating health checks and the application runbook.

`FASTLY_PURGE_API_KEY`
: A Fastly API key which is used to purge URLs (when somebody POSTs to the `/purge` endpoint).

`GRAPHITE_API_KEY`
: The FT's internal Graphite API key.

`PURGE_API_KEY`
: The API key to require when somebody POSTs to the `/purge` endpoint. This should be a non-memorable string, for example a UUID.

`REGION`
: The region the application is running in. One of `QA`, `EU`, or `US`.

`CHANGE_API_KEY`
: The change-log API key to use when creating and closing change-logs.

`RELEASE_ENV`
: The Salesforce environment to include in change-logs. One of `Test` or `Production`.

`SENTRY_DSN`
: The Sentry URL to send error information to.

`SLACK_ANNOUNCER_AUTH_TOKEN`
: The Slack auth token to use when announcing new repo versions on Slack.

`SLACK_ANNOUNCER_CHANNEL_ID`
: The Slack channel to announce new repo versions in (unique ID, not channel name).

`GRAFANA_API_KEY`
: The API key to use when using Grafana push/pull.

The service can also be configured by sending HTTP headers; these would normally be set in your CDN config:

`FT-Origami-Service-Base-Path`
: The base path for the service; this gets prepended to all paths in the HTML and ensures that redirects work when the CDN rewrites URLs.
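For local testing you can simulate the CDN by sending this header yourself; the path value in this sketch is purely illustrative.

```sh
# Illustrative base path – in production the CDN sets this header
curl -i -H 'FT-Origami-Service-Base-Path: /repo-data/' http://localhost:8080/
```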
Most of the files which are used in maintaining your local database are in the `data` folder of this repo. This is split into migrations and seed data.
You can use the following commands to manage your local database:
make db-migrate-up # migrate up to the latest version of the schema
make db-migrate-down # revert the last applied migration
make db-seed # add seed data to the database for local testing
To create a new migration file, you'll need to run:
./script/create-migration.js <NAME-OF-MIGRATION>
This will generate a file in `data/migration` which you can update to include `up` and `down` migrations. We use Knex for migrations; copying from an existing file may help.
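For reference, a Knex migration file generally looks like the sketch below. The `widgets` table is hypothetical and not part of this service's schema; copy a real file from `data/migration` as your starting point.

```js
'use strict';

// Hypothetical example – the widgets table is illustrative only
exports.up = async knex => {
	await knex.schema.createTable('widgets', table => {
		table.increments('id').primary();
		table.string('name').notNullable();
		table.timestamps(true, true);
	});
};

exports.down = async knex => {
	await knex.schema.dropTable('widgets');
};
```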
Seed data for local development is in `data/seed/demo`. Every file in this directory will be used to seed the database when `make db-seed` is run.
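If the seed data follows the standard Knex seed format (an assumption; check an existing file in `data/seed/demo` for the real shape), a seed file looks roughly like this, with a hypothetical table and rows:

```js
'use strict';

// Hypothetical example – copy an existing file in data/seed/demo for real table names
exports.seed = async knex => {
	await knex('widgets').insert([
		{name: 'Example widget one'},
		{name: 'Example widget two'}
	]);
};
```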
The source documentation for the runbook and healthcheck endpoints (EU/US) is stored in the `operational-documentation` folder. These files are pushed to CMDB upon every promotion to production. You can push them to CMDB manually by running the following command:
make cmdb-update
The tests are split into unit tests and integration tests. To run tests on your machine you'll need to install Node.js and run `make install`. Then you can run the following commands:
make test # run all the tests
make test-unit # run the unit tests
make test-integration # run the integration tests
You can run the unit tests with coverage reporting, which expects 90% coverage or more:
make test-unit-coverage verify-coverage
The code will also need to pass linting on CI; you can run the linter locally with:
make verify
To run the integration tests, you'll need a local PostgreSQL database named `origami-repo-data-test`. You can set this up with:
make db-create-test
We run the tests and linter on CI; you can view [results on CI][ci]. `make test` and `make lint` must pass before we merge a pull request.
The production (EU/US) and QA applications run on Heroku. We deploy continuously to QA via [CI][ci], you should never need to deploy to QA manually. We use a Heroku pipeline to promote QA deployments to production.
You can promote either through the Heroku interface, or by running the following command locally:
make promote
We've outlined some common issues that can occur when running Origami Repo Data:
For now, restart the Heroku dynos:
heroku restart --app origami-repo-data-eu
heroku restart --app origami-repo-data-us
If this doesn't help, then a temporary measure could be to add more dynos to the production applications, or switch the existing ones to higher performance dynos.
If you really need to deploy manually, you should only do so to QA (production deploys should always be a promotion). Use the following command to deploy to QA manually:
make deploy
The Financial Times has published this software under the MIT license.