The LED-dome displays various animations and can visualize neural activity. Live neural visualization has not yet been implemented (only playback from prerecorded files is supported). The available animations include:

  • eyes and face animations
  • text
  • yellow construction lights
  • red/blue emergency lights
  • and more

The neural activity comes from living biological neural networks, as described in the next section.

MEA live streaming

Neuronal cell cultures grown at the neuroscience department can be recorded using Micro-Electrode Arrays (MEAs). This MEA data can be visualized on the LED-dome. Two options exist for this:

  1. reading prerecorded data from a CSV file (a minimal sketch follows this list)
  2. streaming live activity directly from the MEA (MEAME2) server at Øya.
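
As a rough illustration of option 1, the sketch below reads per-sample channel values from a CSV file. The column layout (one row per sample, one column per electrode channel) and the function name are assumptions for illustration only, not taken from the LED-dome code.

# Minimal sketch of reading prerecorded MEA data from a CSV file.
# Assumed layout (not taken from LED-dome): one row per sample,
# one column per electrode channel, values parseable as floats.
import csv

def read_mea_csv(path):
    """Yield one list of per-channel values for each sample row."""
    with open(path, newline="") as f:
        for row in csv.reader(f):
            yield [float(value) for value in row]

# Hypothetical usage: feed each sample frame to the dome's renderer.
# for frame in read_mea_csv("recording.csv"):
#     dome.render(frame)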

Reading from file is already implemented in LED-dome, while live streaming has yet to be implemented. Two standalone projects have been developed for communication with MEAME, one of them being SINRi (described below).

Both of these projects communicate directly with the MEAME2 server. One of them should be selected and integrated with LED-dome to enable live streaming of MEA data.
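
Whichever project is chosen, integration becomes simpler if LED-dome consumes MEA frames through a single source-agnostic entry point, so that prerecorded playback and live streaming are interchangeable. The sketch below only illustrates that idea: all names are hypothetical, read_mea_csv refers to the earlier CSV sketch, and live_mea_stream stands in for whatever the selected project provides (a possible shape of it is sketched under the Grinder note further below).

# Sketch of a source-agnostic entry point for MEA frames.
# All names are illustrative and not taken from the LED-dome code base.
from typing import Iterable, List

def run_dome(frames: Iterable[List[float]]) -> None:
    """Consume per-sample channel values from any source."""
    for frame in frames:
        ...  # map channel values to LED colours here

# Prerecorded playback and live streaming then plug in the same way:
# run_dome(read_mea_csv("recording.csv"))   # option 1: CSV playback
# run_dome(live_mea_stream())               # option 2: live MEAME stream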

Example of using SINRi in LED-dome

An overview of the SINRi project:

The Grinder module can receive an MEA stream over TCP from MEAME. This module could be used in LED-dome.
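
As a rough sketch of what receiving that stream could look like on the LED-dome side, the code below opens a TCP connection and decodes newline-terminated, comma-separated sample frames. The host, port, and wire format are assumptions for illustration only; the actual protocol is defined by MEAME/SINRi and must be checked against their documentation. This is the live_mea_stream placeholder used in the integration sketch above.

# Minimal sketch of a TCP client for a live MEA stream, in the spirit of
# SINRi's Grinder module. Host, port, and wire format below are assumptions,
# not the real MEAME2 protocol.
import socket

def live_mea_stream(host="meame.example", port=12340):
    """Yield one list of per-channel values per received line
    (assumed format: comma-separated floats, one frame per line)."""
    with socket.create_connection((host, port)) as sock:
        buffer = b""
        while True:
            chunk = sock.recv(4096)
            if not chunk:  # server closed the connection
                break
            buffer += chunk
            while b"\n" in buffer:
                line, buffer = buffer.split(b"\n", 1)
                if line:  # skip empty lines
                    yield [float(v) for v in line.decode().split(",")]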