OntoMotion - Ontologic eMotion
   
 

Introduction
OntoMotion is the component of OntoLinux that handles all kinds of tasks related to emotions, for example (a structural sketch follows the list):

  • capturing signals provided by multimodal input and other components,
  • recording signals,
  • processing signals,
  • recognizing emotions,
  • analyzing emotions,
  • interpreting emotions,
  • modelling emotions,
  • generating artificial emotions,
  • simulating emotions,
  • generating signals, and
  • sending signals for multimodal output or other components.
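
The following is a minimal structural sketch, in Python, of how such a capture-process-recognize-generate chain could be organized; all class and method names are illustrative assumptions, not OntoMotion's actual interface.

    # Illustrative sketch only: the names below are assumptions, not
    # OntoMotion's actual API. It models the task chain listed above as
    # a pipeline from multimodal input signals to recognized emotions
    # and back to generated output signals.
    from dataclasses import dataclass, field

    @dataclass
    class Signal:
        modality: str              # e.g. "audio", "video", "gsr", "eeg"
        samples: list = field(default_factory=list)

    @dataclass
    class Emotion:
        label: str                 # e.g. "joy", "anger"
        intensity: float           # normalized to [0, 1]

    class EmotionPipeline:
        def capture(self, source: dict) -> Signal:
            # Capture a raw signal from a multimodal input or another component.
            return Signal(modality=source["modality"], samples=source["samples"])

        def recognize(self, signal: Signal) -> Emotion:
            # Map a processed signal to an emotion (placeholder heuristic).
            mean = sum(signal.samples) / len(signal.samples) if signal.samples else 0.0
            return Emotion(label="arousal", intensity=min(max(mean, 0.0), 1.0))

        def generate(self, emotion: Emotion) -> Signal:
            # Synthesize an output signal expressing an (artificial) emotion.
            return Signal(modality="audio", samples=[emotion.intensity] * 8)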

Like the other software components, the OntoMotion component is a result of our research and development activities in the OntoLab, so that the:

  • affective computing (see chapter 6, Ausblick (Outlook), of The Proposal),
  • conduction of multidimensional tasks in all directions,
  • realization of Multilingual Multimodal Multidimensional Multiparadigmatic Multimedia User Interfaces (M⁵UIs),
  • execution of the tasks and functions in relation with emotions and affective computing in real-time, and
  • seamless interplay with the other software components

are included in the design of our integrating Ontologic System Architecture (OSA) right from the start, so that very interesting, advanced, and complex Ontologic Applications can be realized.

Based on Ontology
Instead of implementing a specific classification paradigm and recognition system for emotions, we defined a related ontology on the basis of, for example:

This ontology is used in the Bridge from NI to AI, which in turn is based on the Artificial Intelligence (AI) capabilities and the Machine Learning (ML) functionalities of our OntoBot.
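
As a loose illustration of the idea only (the actual ontology is not reproduced here), the following Python sketch encodes a few emotion concepts as subsumption triples that a recognizer can query instead of hard-coding a fixed label set; all concept names are assumptions.

    # Illustrative sketch only: a tiny emotion ontology as is_a triples.
    # The concept names are assumptions; the real ontology behind
    # OntoMotion is not reproduced here.
    ONTOLOGY = {
        ("Joy", "is_a", "PositiveEmotion"),
        ("Anger", "is_a", "NegativeEmotion"),
        ("PositiveEmotion", "is_a", "Emotion"),
        ("NegativeEmotion", "is_a", "Emotion"),
    }

    def ancestors(concept):
        # Collect all superclasses of a concept by following is_a edges.
        result, frontier = set(), [concept]
        while frontier:
            current = frontier.pop()
            for subj, rel, obj in ONTOLOGY:
                if subj == current and rel == "is_a" and obj not in result:
                    result.add(obj)
                    frontier.append(obj)
        return result

    # A recognizer can then classify against the ontology, e.g. testing
    # whether "Joy" is subsumed by the general concept "Emotion":
    assert "Emotion" in ancestors("Joy")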

Application

The OntoMotion component provides the foundation for the development of many interesting, sophisticated, and complex Multilingual Multimodal Multidimensional Multiparadigmatic Multimedia (M⁵) software and hardware applications and systems, respectively Ontologic Applications, by incorporating emotions or feelings and providing affection in ways that have not been envisioned before.

On the input side, various applications are possible, for example using a:

  • heart rate sensor and a humidity sensor to measure the galvanic resistance of the skin and to analyze the heart rate variability, as done with e.g. a lie detector (see the HRV sketch after this list),
  • camera to capture eye movements, grimaces, gestures, as well as other physical expressions and movements to recognize emotions in the behavior of a user,
  • microphone to capture sound and sound levels to recognize emotions in speech, and
  • electrodes of a Brain Computer Interface (BCI) to capture brainwaves and to recognize emotions in thinking and dreaming.
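
As a concrete illustration of the first item, heart rate variability is commonly derived from the intervals between successive heartbeats. The Python sketch below computes RMSSD, a standard time-domain HRV measure; the interval values are hypothetical sensor readings.

    import math

    def rmssd(rr_intervals_ms):
        # Root mean square of successive differences between heartbeat
        # (RR) intervals, a standard time-domain HRV measure (milliseconds).
        if len(rr_intervals_ms) < 2:
            raise ValueError("need at least two RR intervals")
        diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
        return math.sqrt(sum(d * d for d in diffs) / len(diffs))

    # Hypothetical RR intervals in milliseconds:
    print(rmssd([812.0, 798.0, 825.0, 790.0, 810.0]))  # about 25 ms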

On the output side, the same applications are possible in the opposite direction via displays, loudspeakers, the electrodes of a BCI, etc.
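
A minimal sketch of this reverse direction, with hypothetical output parameters: a recognized or generated emotion is mapped to rendering parameters for expressive speech synthesis on a loudspeaker.

    # Illustrative sketch only: map an emotion label and intensity to
    # hypothetical speech-synthesis parameters for a loudspeaker.
    def expressive_speech_params(label, intensity):
        base = {"pitch_hz": 120.0, "rate_wpm": 150.0}
        if label == "joy":
            return {"pitch_hz": base["pitch_hz"] * (1 + 0.3 * intensity),
                    "rate_wpm": base["rate_wpm"] * (1 + 0.2 * intensity)}
        if label == "sadness":
            return {"pitch_hz": base["pitch_hz"] * (1 - 0.2 * intensity),
                    "rate_wpm": base["rate_wpm"] * (1 - 0.3 * intensity)}
        return base  # neutral fallback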

Such applications can be executed on or integrated with devices such as:

  • smartphones,
  • handheld Ontoscopes,
  • smartwatches,
  • wristworn Ontoscopes,
  • smart eyewear and Head-Mounted Displays (HMDs),
  • headworn Ontoscopes,
  • vehicles, specifically automobiles, rotorcraft, and planes, and for sure
  • robotic systems (keep in mind that the Ontoscope is inspired by the head of a robot or humanoid),

to form complete affective computing systems.

   
 