omega2-apps

Onion Omega 2 IoT MIPS32LE projects

Emotion recognition app

Emotion recognition app is a distributed, highly scalable serverless computer vision app that discovers the mood of an audience based on their tweets with embedded images.

Architecture

This app consists of the following components:

Twitter daemon

Twitter daemon is a highly concurrent tweet dispatcher. It does the following:

The Twitter daemon supports two kinds of recognition functions:

Emotion recognition function

Emotion recognition function, aka emokognition, is a TensorFlow + OpenCV deep neural network prediction application. It does the following:

Written in Python.

Emotion recorder function

Emotion recorder function is a highly concurrent persistence-layer entry point for all emokognition results. It does the following:

Written in Golang.

Mood statistics UI

Mood statistics UI is a function that calls the emotion results function to retrieve the set of emotions written by the emotion recorder. It does the following:

Written in Python.

Emotion results

Emotion results is a function that reads all records from the persistence layer. It does the following:

Written in Python.

Persistence layer

The app relies on PostgreSQL as its persistence layer.

Workflow

Each function has its own type - sync or async. A sync function runs in attached mode: the caller invokes it and waits for its result. An async function runs in detached mode from the caller: instead of a result, the caller gets a call identifier that can be used to check the execution status some time later. The functions above are deployed with the following types:

As well as a type, each function has a format - default or http. A default-format function is started for a single call and then dies. An http-format function stays alive while calls keep arriving and dies if there have been no calls within a certain time frame. The functions above are deployed with the following formats:
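The sync/async semantics above can be sketched in Python. This is a toy in-process model, not the actual platform API; `FunctionRunner`, `call_sync`, and `call_async` are hypothetical names:

```python
import threading
import uuid


class FunctionRunner:
    """Toy model of sync vs. async function calls (hypothetical API)."""

    def __init__(self):
        self._results = {}  # call_id -> result, filled in when a call finishes

    def call_sync(self, fn, payload):
        # Sync / attached mode: the caller blocks until the result is ready.
        return fn(payload)

    def call_async(self, fn, payload):
        # Async / detached mode: return a call identifier immediately;
        # the result can be fetched later by that identifier.
        call_id = str(uuid.uuid4())

        def run():
            self._results[call_id] = fn(payload)

        thread = threading.Thread(target=run)
        thread.start()
        return call_id

    def status(self, call_id):
        # Poll execution status some time later.
        return self._results.get(call_id, "running")


runner = FunctionRunner()
sync_result = runner.call_sync(lambda tweet: "happy", "tweet body")  # result directly
call_id = runner.call_async(lambda tweet: "happy", "tweet body")     # identifier only
```

A sync caller gets the value back in the same call; an async caller only gets `call_id` and must poll `status` until the function has finished.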

The following figure shows how emotion recognition works:

emotion recognition flow

For each new tweet, the daemon starts a new sync emotion recognition function; each emotion recognition function then calls the emotion recorder function to make a record of its results.
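The per-tweet flow can be sketched as below. The stubs stand in for the real components (the recognizer is a TensorFlow + OpenCV model and the recorder is a Go service), and all function and field names here are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def emokognition(image_url):
    # Stub for the TensorFlow + OpenCV prediction function.
    return {"image": image_url, "emotion": "happy"}

records = []

def emotion_recorder(result):
    # Stub for the persistence-layer entry point; here an in-memory list.
    records.append(result)

def handle_tweet(tweet):
    # The daemon starts a sync recognition call per tweet, and the
    # recognition function passes its result on to the recorder.
    result = emokognition(tweet["image_url"])
    emotion_recorder(result)

tweets = [{"image_url": "http://example.com/a.jpg"},
          {"image_url": "http://example.com/b.jpg"}]

# Highly concurrent dispatch: one worker per incoming tweet.
with ThreadPoolExecutor() as pool:
    list(pool.map(handle_tweet, tweets))
```

After the pool drains, every tweet's recognition result has been handed to the recorder, mirroring the daemon -> emokognition -> emotion-recorder chain.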

The figure below shows how the mood UI works:

mood ui compiling flow

Twitter daemon -> emokognition -> emotion-recorder; mood statistics UI <- emotion results

Configuration

In order to make the app work, the twitter daemon must be properly configured with the following environment variables:
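As an illustration only: a daemon of this kind typically needs Twitter API credentials from its environment. The variable names below are hypothetical, not the daemon's actual configuration (check its source for the real ones):

```python
import os

# Hypothetical variable names -- see the daemon's source for the real list.
config = {
    "consumer_key": os.environ.get("TWITTER_CONSUMER_KEY", ""),
    "consumer_secret": os.environ.get("TWITTER_CONSUMER_SECRET", ""),
    "access_token": os.environ.get("TWITTER_ACCESS_TOKEN", ""),
    "access_token_secret": os.environ.get("TWITTER_ACCESS_TOKEN_SECRET", ""),
}

# Fail early with a clear message if anything is unset.
missing = sorted(key for key, value in config.items() if not value)
if missing:
    print("missing configuration:", ", ".join(missing))
```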

Deployment

To start the PostgreSQL container, use the following command:

docker run --name postgres -p 5432:5432 -e POSTGRES_PASSWORD=postgres -e POSTGRES_DB=emokognition -e POSTGRES_USER=postgres -d postgres
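The connection parameters implied by those flags can be assembled into a standard PostgreSQL URL. This is a sketch of the resulting DSN; the recorder and results functions may build their connection strings differently:

```python
# Values taken from the docker run flags above.
params = {
    "user": "postgres",
    "password": "postgres",
    "host": "localhost",
    "port": 5432,
    "dbname": "emokognition",
}

# Standard libpq-style connection URL.
dsn = "postgresql://{user}:{password}@{host}:{port}/{dbname}".format(**params)
print(dsn)  # postgresql://postgres:postgres@localhost:5432/emokognition
```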

TODOs