E18-api

E18 is a system for a central asynchronous job queue, error handling and statistics.
It is composed of the following components:

  • API (This project)
  • Database (MongoDB)
  • Orchestrator (Microsoft Azure Logic App)
  • Web frontend

The philosophy behind E18 is that systems can fire and forget jobs to it and be certain that each job will either run successfully or that any errors will be surfaced and dealt with via the frontend.

E18 jobs

An E18 job contains information about the system that requested it and the purpose of the job, along with other optional metadata and configuration options. It also contains one or more tasks that should be run.
The specifics of the fields are documented in detail.

How it works:

The E18 API exposes an endpoint that lets other systems register jobs on it.
A system creates a job request with one or more tasks and sends it to E18 (fire and forget)

E18 then does the following:

  1. The orchestrator runs periodically and checks for tasks that are ready to run
    Tasks run sequentially by default, but they can be grouped to run in parallel
  2. It verifies that each task targets an approved third-party system
  3. The task is executed
  4. The result is logged back to the job entry

If a task fails

  1. The error is logged back to the task
  2. The orchestrator will attempt to re-run the task X times before giving up
  3. Tasks that don't run successfully will stay in the queue until dealt with via the frontend

When all tasks are completed successfully

  1. When the last task for a job is successful, the job is set to completed
  2. When the orchestrator runs maintenance, it strips away any non-metadata and moves the job from the queue to statistics
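
The request side of this flow is a single POST to the jobs endpoint. Below is a minimal fire-and-forget sketch in TypeScript; the base URL and the X-API-KEY header are assumptions for illustration, so check them against your own deployment.

// Minimal fire-and-forget sketch. The base URL and the X-API-KEY header
// are assumptions for illustration; verify against your own deployment.
const job = {
  system: 'my-system',
  type: 'send-email',
  tasks: [
    { system: 'mail-api', method: 'send', data: { to: 'someone@example.com' } }
  ]
}

const response = await fetch('https://e18.example.com/jobs', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-API-KEY': process.env.E18_API_KEY ?? '' // hypothetical auth header
  },
  body: JSON.stringify(job)
})

// Fire and forget: a 2xx response means the job is queued; the orchestrator
// handles execution, retries and error reporting from here.
if (!response.ok) throw new Error(`E18 rejected the job: ${response.status}`)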

Usage

Update the OpenAPI specification in API Management

Microsoft uses its own servers object in the specification, so to update the OpenAPI specification in APIM:

  • Copy all content from the YAML file
  • Open the API in APIM
  • Select the down arrow in Frontend and choose OpenAPI editor (YAML)
  • Paste in the new YAML, replacing the old one
  • Remove the servers section
  • Click Save
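
If you want to strip the servers section programmatically before pasting, here is a small sketch using the js-yaml package (an assumed tool choice; any YAML editor works as well):

// Sketch: remove the servers section from the spec before pasting it into
// APIM. js-yaml is an assumption; APIM injects its own servers object.
import { readFileSync, writeFileSync } from 'node:fs'
import yaml from 'js-yaml'

const spec = yaml.load(readFileSync('openapi.yaml', 'utf8')) as Record<string, unknown>
delete spec.servers
writeFileSync('openapi.yaml', yaml.dump(spec))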

Add statistics from a system

POST /jobs

{
  "system": "<unique-identifier-for-your-system>",
  "type": "<unique-identifier-for-this-operation>",
  "status": "completed", // must be set to completed for statistics
  "projectId": 0, // Project id
  "e18": false, // must be set to false for statistics
  "tasks": [ // add one task entry pr item this job has performed
    {
      "system": "<which-system/api-has-been-used>", // see system overview
      "method": "<which-endpoint-in-system/api-has-been-called",
      "status": "completed", // must be set to completed for statistics
      "regarding": "something", // optional free text field (no sensitive information!)
      "tags": [ // optional array of tags (can be handy for sorting in statistics overview)
        "hey there"
      ]
    }
  ]
}

Register a job for E18 to execute

POST /jobs

{
  "system": "<unique-identifier-for-your-system>",
  "type": "<unique-identifier-for-this-operation>",
  "projectId": 0,
  "tasks": [
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "regarding": "something",
      "data": { "someData": "test" }, // Optional: Object that will be sent to the task endpoint
      "tags": [
        "hey there"
      ]
    }
  ]
}

Register job with tasks that can run in parallel

POST /jobs

{
  "system": "<unique-identifier-for-your-system>",
  "type": "<unique-identifier-for-this-operation>",
  "projectId": 0,
  "parallel": true, // If set to true all tasks will run in parallel if dependency information is provided
  "tasks": [
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "data": { "someData": "test" }
    },
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "data": { "someData": "test" }
    }
  ]
}

Register a job with tasks that run in parallel by grouping them

POST /jobs

{
  "system": "<unique-identifier-for-your-system>",
  "type": "<unique-identifier-for-this-operation>",
  "projectId": 0,
  "tasks": [
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "data": { "someData": "test" }
    },
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "group": "group1",
      "data": { "someData": "test" }
    },
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "group": "group1", // This task will run together with the above task
      "data": { "someData": "test" }
    }
  ]
}

Register job where all tasks run in parallel

POST /jobs

{
  "system": "<unique-identifier-for-your-system>",
  "type": "<unique-identifier-for-this-operation>",
  "projectId": 0,
  "parallel": true, // This will make all tasks run in parallel
  "tasks": [
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "data": { "someData": "test" }
    },
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "data": { "someData": "test" }
    },
    {
      "system": "<which-system/api-has-been-used>",
      "method": "<which-endpoint-in-system/api-has-been-called",
      "data": { "someData": "test" }
    }
  ]
}

Register new API key

POST /apikeys

{
  "name": "Key name"
}

Add ?fullitem=true to the URL to retrieve the full item from the database

Update API key

Enable / Disable

PUT /apikeys/:id

{
  "name": "Key name", // required field
  "enabled": true | false
}

Remove API key

DELETE /apikeys/:id

Data mapping

E18 can merge data from previous task operations into subsequent tasks.
This is done through the task property dataMapping.
There are three types of dataMapping: 1:1, template and full.
The dataMapping property can also be an array with multiple mappings.

Examples

1:1 mappings

/*
  Mappings:
  name from previous operations will be mapped to fullname
*/
{
  "system": "e18",
  "type": "example",
  "tasks": [
    {
      "system": "system1",
      "method": "system1-method"
    },
    {
      "system": "system2",
      "method": "system2-method",
      "dataMapping": "fullname=name"
    }
  ]
}
/*
  Mappings:
  firstname will be mapped to person.name.firstname
  lastname will be mapped to person.name.lastname
  The object hierarchy does not need to already be present in the data property
*/
{
  "system": "e18",
  "type": "example",
  "tasks": [
    {
      "system": "system1",
      "method": "system1-method"
    },
    {
      "system": "system2",
      "method": "system2-method",
      "dataMapping": [
        "person.name.firstname=firstname",
        "person.name.lastname=lastname"
      ]
    }
  ]
}

Full mapping

/*
  Mappings:
  All data from previous operations will be merged into the tasks data
*/
{
  "system": "e18",
  "type": "example",
  "tasks": [
    {
      "system": "system1",
      "method": "system1-method"
    },
    {
      "system": "system2",
      "method": "system2-method",
      "dataMapping": "*"
    }
  ]
}
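
The exact merge logic lives in the orchestrator, but the two mapping forms above can be illustrated with a short sketch. Everything here (the function names, and resultData as the combined output of previous tasks) is an assumption for illustration:

type Dict = Record<string, unknown>

// Set a value at a dot-separated path, creating the hierarchy as needed
// (matching the note that the hierarchy need not already exist in data).
function setPath (target: Dict, path: string, value: unknown): void {
  const keys = path.split('.')
  let node = target
  for (const key of keys.slice(0, -1)) {
    if (typeof node[key] !== 'object' || node[key] === null) node[key] = {}
    node = node[key] as Dict
  }
  node[keys[keys.length - 1]] = value
}

// Apply one mapping string to a task's data object.
function applyMapping (mapping: string, resultData: Dict, data: Dict): Dict {
  if (mapping === '*') return { ...data, ...resultData } // full mapping
  const [target, source] = mapping.split('=')            // 1:1: "target=source"
  setPath(data, target, resultData[source])
  return data
}

// "fullname=name" copies resultData.name into data.fullname, and an array
// of mappings can simply be applied one after the other:
const mapped = ['person.name.firstname=firstname', 'person.name.lastname=lastname']
  .reduce((data, m) => applyMapping(m, { firstname: 'Ola', lastname: 'Nordmann' }, data), {})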

Template mapping

Template mapping works by providing a JSON object template with placeholders to be replaced.
The placeholders can be Mustache or Sjablong fields.
It also supports Handlebars expressions.

/* 
  Mappings:
  Replaces all data that matches the placeholders
*/
{
  "system": "e18",
  "type": "example",
  "tasks": [
    {
      "system": "system1",
      "method": "system1-method"
    },
    {
      "system": "system2",
      "method": "system2-method",
      "dataMapping": "{\"person\": {\"firstname\": \"{{firstname}}\"}}"
    }
  ]
}
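
As a rough illustration of what a template mapping produces, here is how such a template could render with the handlebars npm package (the actual engine E18 uses is not specified here; the data is an assumption):

import Handlebars from 'handlebars'

// Hypothetical output from a previous task.
const resultData = { firstname: 'Ola' }

const template = '{"person": {"firstname": "{{firstname}}"}}'
const rendered = Handlebars.compile(template)(resultData)

console.log(JSON.parse(rendered)) // { person: { firstname: 'Ola' } }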

Local development

For local development, create a .env.development file:

DBCONNECTIONSTRING=mongodb+srv://<username>:<password>@<mongoHost>?retryWrites=true&w=majority
AZURE_BLOB_CONNECTIONSTRING=get-connection-string-from-access-keys-on-your-storage-account
AZURE_BLOB_CONTAINERNAME=files

If no 'DBCONNECTIONSTRING' is provided, the API will create a MongoMemoryServer instance on port 9000
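
A sketch of what that fallback looks like with the mongodb-memory-server package (an assumption about the exact package; see the project's dependencies):

import { MongoMemoryServer } from 'mongodb-memory-server'

// Spin up an in-memory MongoDB on port 9000, mirroring the fallback
// described above.
const mongod = await MongoMemoryServer.create({ instance: { port: 9000 } })
console.log(`In-memory MongoDB running at ${mongod.getUri()}`)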

Environment variables

Variable                                Description                                                         Example
DBCONNECTIONSTRING                      MongoDB connection string
STORAGE_ACCOUNT_CONNECTIONSTRING        Connection string for the Azure storage account
STORAGE_ACCOUNT_BLOB_NAME               Name of the blob container
APPLICATIONINSIGHTS_CONNECTION_STRING   Connection string for Azure Application Insights (optional)
RATELIMIT_WINDOW_MS                     The number of milliseconds per rate limit window                    60000
RATELIMIT_WINDOW_LIMIT                  The number of requests per requester in the current limit window    150
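
A sketch of how the two rate-limit variables could feed a middleware such as express-rate-limit (an assumption about the library; adjust to the actual implementation):

import rateLimit from 'express-rate-limit'

const limiter = rateLimit({
  windowMs: Number(process.env.RATELIMIT_WINDOW_MS ?? 60000), // length of one window in ms
  max: Number(process.env.RATELIMIT_WINDOW_LIMIT ?? 150)      // requests per requester per window
})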