
Emotions

Algorithm of emotional states

Overview

The library is designed to process the signal from the BrainBit headband. It allows you to get the emotional state of a person (mental levels), in terms of the degree of relaxation or concentration, as well as the rhythms of the brain, in terms of relative expression levels for the Delta, Theta, Alpha, Beta and Gamma bands. The library provides adjustable artifact detection and elimination techniques, and signal quality analysis: it reports the presence of artifacts in the EEG signal and an average estimate of the signal quality, so you can judge how suitable the signal is for calculations. In most cases, the presence of artifacts means a poor fit of the device to the skin, or eye and muscle movements.

Pipeline description

Bipolar vs multiple channels modes

The library can work in bipolar mode or in multi-channel processing mode (mathLibSetting.bipolar_mode).

Input data

The algorithm processes raw EEG data in Volts iteratively, using a sliding window of a given length (mathLibSetting.fft_window) with a given processing frequency (mathLibSetting.process_win_freq).

The data is added to the library in short fragments with the MathLibPushData() method (bipolar mode) or MathLibPushDataArr() (multi-channel mode). There is no strict limit on the length of the input arrays in these methods; the library's internal buffer processes them according to the chosen sliding window length and processing frequency. However, it is recommended to keep the array length <= mathLibSetting.sampling_rate / mathLibSetting.process_win_freq.
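As a minimal sketch (using the C API and the RawChannels type described later in this document, and assuming tMathPtr is an already created library instance), pushing bipolar data in chunks of the recommended size looks like this:

```cpp
// One sliding-window shift is the recommended maximum chunk length.
// With the defaults used in this document (250 Hz, 25 Hz) this is 10 samples.
const int samplingRate = 250; // mathLibSetting.sampling_rate
const int processFreq  = 25;  // mathLibSetting.process_win_freq
const int chunkLength  = samplingRate / processFreq;

RawChannels chunk[chunkLength];
// ... fill chunk with chunkLength bipolar samples (in Volts) ...
MathLibPushData(tMathPtr, chunk, chunkLength);
```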

Filtering

The default filter set is specifically designed to eliminate the low-frequency components (Delta and partially Theta), because for mobile neurodevices these components in most cases represent artifacts rather than neural activity. If you are interested in these components, you can still obtain them with the internal filters by executing MathLibSetZeroSpectWaves(tMathPtr, true, 1, 1, 1, 1, 1, &opSt) with coefficient 1 for Delta.


You can also apply your own filters before sending data to the library. In that case, the internal filtering can be turned off with MathLibUseInternalFilters(False).

Data processing

After adding the data to the library, MathLibProcessDataArr() has to be executed.

Obtaining the results

Then, the resulting spectral and mental values are obtained with MathLibReadMentalDataArr() and MathLibReadSpectralDataPercentsArr().

These values are averaged over a number of windows defined by mentalAndSpectralSetting.n_sec_for_averaging.


The library returns new spectral and mental values once enough new points have been added for the next sliding window shift. The sliding window shift is an internal parameter defined as mathLibSetting.sampling_rate / mathLibSetting.process_win_freq (with the default 250 Hz sampling rate and 25 Hz processing frequency, this is 10 samples). When the length of the input arrays passed to the PushData methods is at least twice the sliding window shift, the read methods return an array with several consecutive instances of spectral and mental values corresponding to the successive signal parts.

Artifacts processing

There are two main methods for artifact checking:
MathLibIsBothSidesArtifacted() and MathLibIsArtifactedSequence().

The first detects artifacts on all channels in the current window;
the second detects artifacts on all channels over several consecutive windows (defined in seconds by artifactDetectSetting.global_artwin_sec).

 

In bipolar mode, if artifacts are detected on one of the bipolar channels, the second bipolar channel is checked, and if it has no artifacts, signal processing switches to that channel. If there are artifacts on both bipolar channels, this region is not used for estimation: the spectral and mental values are filled with the previous valid values, and the counter of consecutive artifacted windows increases. You can specify which bipolar channel has priority for signal processing when neither side has artifacts (MathLibSetPrioritySide()).
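For example, to prefer the left side when both are clean (a sketch: the exact MathLibSetPrioritySide() signature is an assumption modeled on the other MathLib calls in this document, using the SideType enum from the Types section):

```cpp
// Assumed signature: prefer the left bipolar channel for processing
// whenever both sides are artifact-free.
EMOpStatus opSt;
MathLibSetPrioritySide(tMathPtr, LEFT, &opSt);
```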

 

In multi-channel mode, if artifacts are present on the current channel, processing switches to the channel with the best signal quality (see the Quality of the signal section). If all channels contain artifacts, the spectral values and mental level values are filled with the previous valid values, and the counter of consecutive artifacted windows increases.


When the maximum number of consecutive artifacted windows (artifactDetectSetting.global_artwin_sec) is reached, MathLibIsArtifactedSequence() returns True, which is intended to tell the user to check the position of the device. With the default value of artifactDetectSetting.global_artwin_sec, this flag is raised about 4 seconds after continuous artifacts begin. If you do not need to report momentary artifacts, you can use this function as the primary source of artifact notifications. Otherwise, use MathLibIsBothSidesArtifacted(), which returns True when all channels are artifacted in the current window.
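A possible notification pattern, sketched with the C API calls shown later in this document (assuming tMathPtr is an initialized library instance):

```cpp
EMOpStatus opSt;
bool momentary = false, prolonged = false;
MathLibIsBothSidesArtifacted(tMathPtr, &momentary, &opSt);
MathLibIsArtifactedSequence(tMathPtr, &prolonged, &opSt);
if (prolonged) {
    // about 4 s of continuous artifacts with the default settings:
    // ask the user to check the position of the device
} else if (momentary) {
    // optional: show a brief signal-corruption indicator
}
```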

Quality of the signal

The signal quality metric is based on the ratio between the total power of the filtered EEG signal and the corresponding border parameter (artifactDetectSetting.total_pow_border). It reports the quality of the signal as a percentage and also serves as the basis for artifact detection. The sensitivity of artifact detection can be adjusted by raising or lowering this border parameter: with higher values, a more distorted signal will still be considered to be of higher quality.


The MathLibGetEEGQuality() method returns the quality of the filtered EEG signal (the current implementation supports bipolar mode only). The metric is averaged across several windows (artifactDetectSetting.num_wins_for_quality_avg); lowering or raising this parameter makes the metric more or less reactive in time to changes in the signal (with lower values it reacts faster).

 

Brain Rhythms: Delta, Theta, Alpha, Beta, Gamma (%)

The library provides an estimation of the relative expression of brain waves:
Delta [1..3] Hz, Theta [4..6] Hz, Alpha [7..13] Hz, Beta [14..24] Hz, Gamma [25..49] Hz.
To obtain these values, an FFT is first performed on the current window. Each FFT bin magnitude is computed as 2 * sqrt(binval_real*binval_real + binval_imag*binval_imag). Then, the bins up to 50 Hz are accumulated into a total power sum, and the bins of each rhythm interval are accumulated into the corresponding variables. The ratio of each band's bin sum to the total power sum gives that band's expression level for the current signal window. The final brain rhythm values are obtained by averaging across several signal windows (mentalAndSpectralSetting.n_sec_for_averaging) using the MathLibReadSpectralDataPercentsArr() method.
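For illustration only, a standalone sketch of this computation (not the library's internal code; the band edges and the bin formula are taken from the description above, and the fftReal/fftImag/binHz inputs are hypothetical):

```cpp
#include <cmath>
#include <vector>

// Band edges in Hz, as listed above: Delta, Theta, Alpha, Beta, Gamma.
const double bandLow[5]  = { 1,  4,  7, 14, 25 };
const double bandHigh[5] = { 3,  6, 13, 24, 49 };

// fftReal/fftImag: FFT of the current window; binHz: frequency step per bin.
std::vector<double> bandPercents(const std::vector<double>& fftReal,
                                 const std::vector<double>& fftImag,
                                 double binHz)
{
    double total = 0.0;
    std::vector<double> bands(5, 0.0);
    for (size_t i = 0; i < fftReal.size(); ++i) {
        const double freq = i * binHz;
        if (freq > 50.0) break; // only bins up to 50 Hz enter the total power sum
        const double mag = 2.0 * std::sqrt(fftReal[i] * fftReal[i] + fftImag[i] * fftImag[i]);
        total += mag;
        for (int b = 0; b < 5; ++b)
            if (freq >= bandLow[b] && freq <= bandHigh[b]) bands[b] += mag;
    }
    for (double& b : bands)
        b = (total > 0.0) ? b / total * 100.0 : 0.0; // expression level in percent
    return bands;
}
```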

Additionally, there is an option to normalize the spectral band values by band width. It is turned on by default; to turn it off, use MathLibSetSpectNormalizationByBandsWidth(False). There is also an option to adjust each band level with specific weight coefficients [0..1] via the MathLibSetWeightsForSpectra() and MathLibSetSpectNormalizationByCoeffs() methods. Only one of the two normalizations can be used at a time.

The MathLibSetZeroSpectWaves() method allows you to eliminate certain bands entirely.

Mental Levels: Attention, Relaxation (%)

There are two options for obtaining mental level values, Relative and Instant;
both are returned in the structure from the MathLibReadMentalDataArr() or MathLibReadAverageMentalData() methods. Mental level estimation is mostly based on the relation between the Alpha and Beta band expression levels.

Relative values are estimated as a deviation from the initial state of the user, which is determined during the calibration time of the algorithm. Therefore, Relative mental levels become available only after the calibration procedure is completed. Note that if the person is in a pronounced state during calibration, for example deep relaxation or high concentration, it will then be very difficult to reach higher Relaxation or Attention values.


Instant values are estimated from the user state over a short period of the last several seconds (mentalAndSpectralSetting.n_sec_for_instant_estimation). There are two modes of Instant level estimation, Dependent and Independent. In the Dependent mode (the library default), the Attention and Relaxation levels are based only on the Alpha and Beta values and are always in strict opposition: together they sum to 100%. The Independent mode uses another approach, which also accounts for the Theta values; to turn it on, use MathLibSetMentalEstimationMode(True).

Calibration

The calibration procedure is needed to determine the baseline user state, which is then used to estimate the Relative mental levels. During the calibration phase, the user should sit still with eyes open and try to stay in a neutral mental state while the algorithm processes the signal and estimates its spectral values. At this time the baseline Alpha and Beta expression levels (%) are determined; these values are then used to compute the Relative Attention and Relaxation metrics. The baseline Alpha and Beta levels can be obtained with the MathLibReadCalibrationVals() method.


The calibration can be started at any time by executing the MathLibStartCalibration() method. Without artifacts it lasts for a fixed time period (8 seconds by default), which can be adjusted with the MathLibSetCallibrationLength() method. The calibration lasts longer if the signal contains artifacts (on all channels); it continues until the algorithm has collected enough clean signal parts for the specified time. There are methods to query the status of the calibration procedure: MathLibGetCallibrationPercents() for the progress toward completion, in percent, and MathLibCalibrationFinished() for the end of the calibration. The calibration can be performed only once for the whole session of work with the library.
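A typical polling loop around these calls might look like this (a sketch using the C API from this document; the pushing and processing of new samples is omitted):

```cpp
EMOpStatus opSt;
MathLibStartCalibration(tMathPtr, &opSt);

bool finished = false;
while (!finished) {
    // ... push newly received samples and call MathLibProcessDataArr() here ...
    int percents = 0;
    MathLibGetCallibrationPercents(tMathPtr, &percents, &opSt);
    MathLibCalibrationFinished(tMathPtr, &finished, &opSt);
}
```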

Getting started

To get the emotional state of a person, follow these steps:

  1. Install the package for the desired platform;
  2. Create an instance of the library;
  3. Get data from the BrainBit device;
  4. Use the library to process the data;
  5. Finish working with the library.

Install

Windows

Download all folders from [GitHub](https://github.com/BrainbitLLC/EmotionsStateArtifacts-cpp) and add the .dll to your project in your preferred way.

Linux
#### First way

1. Download `libem_st_artifacts.deb` from [GitHub](https://github.com/BrainbitLLC/linux_em_st_artifacts/tree/main/package).
2. Install the package with `apt`:

```
sudo apt install ./libem_st_artifacts.deb
```

#### Second way

Download from [GitHub](https://github.com/BrainbitLLC/linux_em_st_artifacts/tree/main/raw) and add .so to your project by your preferred way.

The library was built on Astra Linux CE 2.12.46, kernel 5.15.0-70-generic, arch x86_64 GNU/Linux:

```
user@astra:~$ ldd --version
ldd (Debian GLIBC 2.28-10+deb10u1) 2.28
Copyright (C) 2018 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.
```

The library depends on the following libraries:
```
linux-vdso.so.1
libblkid.so.1
libc++.so.1
libc++abi.so.1 
libc.so.6
libdl.so.2
libffi.so.6
libgcc_s.so.1
libgio-2.0.so.0
libglib-2.0.so.0
libgmodule-2.0.so.0
libgobject-2.0.so.0
libm.so.6
libmount.so.1
libpcre.so.3
libpthread.so.0
libresolv.so.2 
librt.so.1
libselinux.so.1
libstdc++.so.6
libudev.so.1
libuuid.so.1
libz.so.1
ld-linux-x86-64.so.2
```

If you are using an OS other than ours, these dependencies will be required for the library. You can download the dependencies from [here](https://github.com/BrainbitLLC/linux_neurosdk2/tree/main/dependencies).

Android

The Android version is designed for API level 21 and higher.

The library for Android is distributed via JitPack as an aar package. Here is an example of adding it to an Android Studio project using Gradle:

Add to `build.gradle` of project:

###### Groovy:
```groovy
dependencyResolutionManagement {
	repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
	repositories {
		...
		maven { url 'https://jitpack.io' }
	}
}
```

###### Kotlin:
```kotlin
dependencyResolutionManagement {
    repositoriesMode.set(RepositoriesMode.FAIL_ON_PROJECT_REPOS)
    repositories {
        ...
        maven { setUrl("https://jitpack.io") }
    }
}
```

and to `build.gradle` of app:

###### Groovy:
```groovy
dependencies {
    implementation 'com.github.BrainbitLLC:Emotional-state-artifacts:1.0.3'
}
```

###### Kotlin:
```kotlin
dependencies {
    implementation("com.github.BrainbitLLC:Emotional-state-artifacts:1.0.3")
}
```

To prevent build errors, add these settings to `build.gradle`:

###### Groovy:
```groovy
android {
    packagingOptions {
        pickFirst 'lib/x86_64/libc++_shared.so'
        pickFirst 'lib/x86/libc++_shared.so'
        pickFirst 'lib/arm64-v8a/libc++_shared.so'
        pickFirst 'lib/armeabi-v7a/libc++_shared.so'
    }
    ...
}
```

Latest version: `1.0.3`
            
iOS

The iOS version is designed for iOS 12.0 and higher.

By Pods:

Add to your Podfile:

```
pod 'em-st-artifacts', '1.0.7'
```

And run the `pod install` command.

Or you can install the framework manually:

1. download `em-st-artifacts.xcframework` from [GitHub](https://github.com/BrainbitLLC/apple_em-st-artifacts)
2. add `em-st-artifacts.xcframework` to `Frameworks, Libraries, and Embedded Content` section of your project
3. set `Embedded` to `Embed & Sign`

Latest version: `1.0.7`
            
Python

> The Python package is available for Windows, macOS and Ubuntu 24.04

By pip:

```
pip install pyem-st-artifacts
```

Before you can use the library on Linux, you must install the emotion library:
```
sudo apt install ./libem_st_artifacts.deb
```
It can be downloaded [here](https://github.com/BrainbitLLC/linux_em_st_artifacts/tree/main/package).


The package has the following structure:
 - em_st_artifacts - the main package with the implementation of the methods
 - sample - the file `sample.py`, a usage sample of the library
 - libs - contains the dll library files

EmotionalMath is the main module and contains all the methods of the library:

```python
from em_st_artifacts.emotional_math import EmotionalMath
```
Latest version: `1.0.7`
            
Unity

> Available for iOS, Android, Windows and macOS platforms

1. Open the Package Manager.
2. Click the "Add" menu and choose "Add package from git URL...". A text box and an Add button appear.
3. Enter https://github.com/BrainbitLLC/unity_em_st_artifacts.git in the text box and click Add.

If Unity was able to install the package successfully, the package now appears in the package list with the git tag. If Unity was not able to install the package, the Unity Console displays an error message.

Latest version: `1.0.3`
            
.NET

> Available for iOS, Android and UWP platforms

Install the latest version of the `EmStArtifacts` package from the NuGet Gallery in your preferred way to a common project.

[NugetPackage](https://www.nuget.org/packages/EmStArtifacts)

Latest version: `1.0.8`
            
            
Flutter

Run this command with Flutter:

```
flutter pub add em_st_artifacts
```

Or install it manually by adding the package to the dependencies in your `pubspec.yaml`:

```yaml
dependencies:
  em_st_artifacts: ^1.0.3
```

and running the `flutter pub get` command.

Then use it in your Dart code:

```dart
import 'package:em_st_artifacts/em_st_artifacts.dart';
```

Latest version: `1.0.3`

Parameters

Main parameters description

Structure `MathLibSettings` with fields:
1. sampling_rate - sampling frequency of the raw signal, Hz, integer value
2. process_win_freq - frequency of spectrum and emotional level estimation, Hz, integer value
3. fft_window - spectrum calculation window length, in samples, integer value
4. n_first_sec_skipped - number of first seconds skipped after connecting to the device, integer value
5. bipolar_mode - enables bipolar mode, boolean value
6. squared_spectrum - mode of calculating the spectral values. If true, values are calculated as the sum of squares of the FFT bins in the corresponding frequency interval (e.g. Alpha); if false, as the sum of the FFT bins. Boolean value
7. channels_number - number of channels for multi-channel mode, integer value
8. channel_for_analysis - in multi-channel mode, the default channel for computing spectral values and emotional levels, integer value

`channels_number` and `channel_for_analysis` are not used in bipolar mode; you can leave the default values.

The sampling rate parameter must match the sampling frequency of the device. For the BrainBit, this is 250 Hz.

Separate parameters:
1. MentalEstimationMode - type of evaluation of instant mental levels, disabled by default, boolean value
2. SpectNormalizationByBandsWidth - spectrum normalization by band width, boolean value

 

Artifact detection parameters description

Structure `ArtifactDetectSetting` with fields:
1. art_bord - threshold for the long amplitude artifact, mcV, integer value
2. allowed_percent_artpoints - percentage of allowed artifact points in the window, integer value
3. raw_betap_limit - boundary for the spectral artifact (beta power): artifacts on the spectrum are detected by checking whether the absolute value of the raw beta wave power exceeds this limit, integer value
4. total_pow_border - boundary for the spectral artifact (in case of assessment by total power) and for channel signal quality estimation, integer value
5. global_artwin_sec - number of seconds for an artifact sequence: the maximum number of consecutive artifacted windows (on both channels) before issuing a prolonged artifact notification / device position check, integer value
6. spect_art_by_totalp - assessment of spectral artifacts by total power, boolean value
7. hanning_win_spectrum - smoothing of the spectrum calculation with a Hanning window, boolean value
8. hamming_win_spectrum - smoothing of the spectrum calculation with a Hamming window, boolean value
9. num_wins_for_quality_avg - number of windows for signal quality estimation, by default 100, which, for example, with process_win_freq = 25 Hz equals 4 seconds, integer value

Structure `ShortArtifactDetectSetting` with fields:
1. ampl_art_detect_win_size - length of the sliding window segments for the detection of short-term amplitude artifacts, ms, integer value
2. ampl_art_zerod_area - area to the left and right of the extremum point in which the signal is replaced with the previous non-artifacted signal, ms, integer value
3. ampl_art_extremum_border - threshold above which an extremum is considered artifactual, mcV, integer value

Structure `MentalAndSpectralSetting` with fields:
1. n_sec_for_instant_estimation - number of seconds used to calculate the instant mental levels, integer value
2. n_sec_for_averaging - number of seconds for spectrum averaging, integer value

A separate setting is the number of windows after an artifact during which the previous valid value is kept, to smooth the switching process after artifacts (`SkipWinsAfterArtifact`).

 

Initialization

Main parameters

###### C++:
```cpp
MathLibSetting mathLibSetting;
mathLibSetting.sampling_rate = 250;
mathLibSetting.process_win_freq = 25;
mathLibSetting.n_first_sec_skipped = 4;
mathLibSetting.fft_window = 1000;
mathLibSetting.bipolar_mode = true;
mathLibSetting.squared_spectrum = true; // set explicitly so the field is not left uninitialized
mathLibSetting.channels_number = 4;
mathLibSetting.channel_for_analysis = 0;

ArtifactDetectSetting artifactDetectSetting;
artifactDetectSetting.art_bord = 110;
artifactDetectSetting.allowed_percent_artpoints = 70;
artifactDetectSetting.total_pow_border = 100;
artifactDetectSetting.raw_betap_limit = 800000;
artifactDetectSetting.spect_art_by_totalp = false;
artifactDetectSetting.global_artwin_sec = 4;
artifactDetectSetting.num_wins_for_quality_avg = 125;
artifactDetectSetting.hanning_win_spectrum = true;
artifactDetectSetting.hamming_win_spectrum = false;

ShortArtifactDetectSetting shortArtifactDetectSetting;
shortArtifactDetectSetting.ampl_art_detect_win_size = 200;
shortArtifactDetectSetting.ampl_art_zerod_area = 200;
shortArtifactDetectSetting.ampl_art_extremum_border = 25;

MentalAndSpectralSetting mentalAndSpectralSetting;
mentalAndSpectralSetting.n_sec_for_averaging = 2;
mentalAndSpectralSetting.n_sec_for_instant_estimation = 4;

EMOpStatus opSt;
MathLib* tMathPtr = createMathLib(mathLibSetting, artifactDetectSetting, shortArtifactDetectSetting, mentalAndSpectralSetting, &opSt);
```
###### C#:
```csharp
int samplingFrequency = 250;
var mls = new MathLibSetting
{
    sampling_rate        = samplingFrequency,
    process_win_freq     = 25,
    n_first_sec_skipped  = 4,
    fft_window           = samplingFrequency * 4,
    bipolar_mode         = true,
    channels_number      = 4,
    channel_for_analysis = 0
};

var ads = new ArtifactDetectSetting
{
    art_bord                  = 110,
    allowed_percent_artpoints = 70,
    raw_betap_limit           = 800_000,
    total_pow_border          = 100,
    global_artwin_sec         = 4,
    spect_art_by_totalp       = true,
    num_wins_for_quality_avg  = 125,
    hanning_win_spectrum      = false,
    hamming_win_spectrum      = true
};

var sads = new ShortArtifactDetectSetting
{
    ampl_art_detect_win_size = 200,
    ampl_art_zerod_area = 200,
    ampl_art_extremum_border = 25
};

var mss = new MentalAndSpectralSetting
{
    n_sec_for_averaging = 2,
    n_sec_for_instant_estimation = 4
};

var math = new EegEmotionalMath(mls, ads, sads, mss);
```
###### Java:
```java
int samplingFrequency = 250;
MathLibSetting mls = new MathLibSetting(samplingFrequency, 25, samplingFrequency * 4, 4, true, 4, 0);

ArtifactDetectSetting ads = new ArtifactDetectSetting(110, 70, 800000, 100, 4, true, true, false, 125);

ShortArtifactDetectSetting sads = new ShortArtifactDetectSetting(200, 200, 25);

MentalAndSpectralSetting mss = new MentalAndSpectralSetting(2, 4);

EmotionalMath tMathPtr = new EmotionalMath(mls, ads, sads, mss);
```
###### Python:
```python
mls = lib_settings.MathLibSetting(sampling_rate=250,
                                  process_win_freq=25,
                                  n_first_sec_skipped=4,
                                  fft_window=1000,
                                  bipolar_mode=True,
                                  squared_spectrum=True,
                                  channels_number=4,
                                  channel_for_analysis=0)

ads = lib_settings.ArtifactDetectSetting(art_bord=110,
                                         allowed_percent_artpoints=70,
                                         raw_betap_limit=800_000,
                                         global_artwin_sec=4,
                                         num_wins_for_quality_avg=125,
                                         hamming_win_spectrum=True,
                                         hanning_win_spectrum=False,
                                         total_pow_border=100,
                                         spect_art_by_totalp=True)

sads = lib_settings.ShortArtifactDetectSetting(ampl_art_detect_win_size=200,
                                               ampl_art_zerod_area=200,
                                               ampl_art_extremum_border=25)

mss = lib_settings.MentalAndSpectralSetting(n_sec_for_averaging=2,
                                            n_sec_for_instant_estimation=4)

math = EmotionalMath(mls, ads, sads, mss)
```
###### Swift:
```swift
let mathLibSetting = EMMathLibSetting(samplingRate: 250,
                                      andProcessWinFreq: 25,
                                      andFftWindow: 1000,
                                      andNFirstSecSkipped: 4,
                                      andBipolarMode: true,
                                      andSquaredSpectrum: true,
                                      andChannelsNumber: 1,
                                      andChannelForAnalysis: 0)

let artifactDetectSetting = EMArtifactDetectSetting(artBord: 110,
                                                    andAllowedPercentArtpoints: 70,
                                                    andRawBetapLimit: 800000,
                                                    andTotalPowBorder: 100,
                                                    andGlobalArtwinSec: 4,
                                                    andSpectArtByTotalp: true,
                                                    andHanningWinSpectrum: false,
                                                    andHammingWinSpectrum: true,
                                                    andNumWinsForQualityAvg: 125)

let shortArtifactDetectSetting = ShortArtifactDetectSetting(ampl_art_detect_win_size: 200,
                                                            ampl_art_zerod_area: 200,
                                                            ampl_art_extremum_border: 25)

let mentalAndSpectralSetting = MentalAndSpectralSetting(n_sec_for_instant_estimation: 4,
                                                        n_sec_for_averaging: 2)

let emotionalMath = EMEmotionalMath(libSettings: mathLibSetting,
                                    andArtifactDetetectSettings: artifactDetectSetting,
                                    andShortArtifactDetectSettigns: shortArtifactDetectSetting,
                                    andMentalAndSpectralSettings: mentalAndSpectralSetting)
```
###### Objective-C:
```objc
EMMathLibSetting* mathLibSetting = [[EMMathLibSetting alloc] initWithSamplingRate:250
                                                                andProcessWinFreq:25
                                                                     andFftWindow:1000
                                                              andNFirstSecSkipped:4
                                                                   andBipolarMode:true
                                                               andSquaredSpectrum:true
                                                                andChannelsNumber:1
                                                            andChannelForAnalysis:0];

EMArtifactDetectSetting* artifactDetectSetting = [[EMArtifactDetectSetting alloc] initWithArtBord:110
                                                                       andAllowedPercentArtpoints:70
                                                                                 andRawBetapLimit:800000
                                                                                andTotalPowBorder:100
                                                                               andGlobalArtwinSec:4
                                                                              andSpectArtByTotalp:true
                                                                            andHanningWinSpectrum:false
                                                                            andHammingWinSpectrum:true
                                                                          andNumWinsForQualityAvg:125];

ShortArtifactDetectSetting shortArtifactDetectSetting;
shortArtifactDetectSetting.ampl_art_detect_win_size = 200;
shortArtifactDetectSetting.ampl_art_zerod_area = 200;
shortArtifactDetectSetting.ampl_art_extremum_border = 25;

MentalAndSpectralSetting mentalAndSpectralSetting;
mentalAndSpectralSetting.n_sec_for_averaging = 2;
mentalAndSpectralSetting.n_sec_for_instant_estimation = 4;

EMEmotionalMath* math = [[EMEmotionalMath alloc] initWithLibSettings:mathLibSetting
                                         andArtifactDetetectSettings:artifactDetectSetting
                                      andShortArtifactDetectSettigns:shortArtifactDetectSetting
                                        andMentalAndSpectralSettings:mentalAndSpectralSetting];
```
###### Flutter:
```dart
int samplingRate = 250;
final mathSettings = MathLibSettings(
  samplingRate: samplingRate,
  fftWindow: samplingRate * 2,
  channelsNumber: 1,
  channelForAnalysis: 0,
  squaredSpectrum: true,
  bipolarMode: true,
);
final detectionSettings = ArtifactsDetectSetting(
  artBord: 110,
  allowedPercentArtpoints: 70,
  totalPowBorder: 100,
  hammingWinSpectrum: true,
);
final shortDetectionSettings = ShortArtifactsDetectSetting(
  amplArtExtremumBorder: 25,
);
final mentalSpectralSettings = MentalAndSpectralSetting(
  nSecForAveraging: 4,
);
EmotionalMath math = EmotionalMath(
  mathSettings,
  detectionSettings,
  shortDetectionSettings,
  mentalSpectralSettings,
);
```
Optional parameters 

###### C++:
```cpp
EMOpStatus opSt;

// setting calibration length
int calibration_length = 6;
MathLibSetCallibrationLength(tMathPtr, calibration_length, &opSt);

// type of evaluation of instant mental levels
bool independent_mental_levels = false;
MathLibSetMentalEstimationMode(tMathPtr, independent_mental_levels, &opSt);

// number of windows after the artifact with the previous actual value
int nwins_skip_after_artifact = 10;
MathLibSetSkipWinsAfterArtifact(tMathPtr, nwins_skip_after_artifact, &opSt);

// zeroing selected spectrum bands (here Delta and Gamma are excluded)
MathLibSetZeroSpectWaves(tMathPtr, true, 0, 1, 1, 1, 0, &opSt);

// spectrum normalization by bandwidth
MathLibSetSpectNormalizationByBandsWidth(tMathPtr, true);
```
###### C#:
```csharp
// setting calibration length
int calibrationLength = 6;
math.SetCallibrationLength(calibrationLength);

// type of evaluation of instant mental levels
bool independentMentalLevels = false;
math.SetMentalEstimationMode(independentMentalLevels);

// number of windows after the artifact with the previous actual value
int nwinsSkipAfterArtifact = 10;
math.SetSkipWinsAfterArtifact(nwinsSkipAfterArtifact);

// zeroing selected spectrum bands (here Delta and Gamma are excluded)
math.SetZeroSpectWaves(true, 0, 1, 1, 1, 0);

// spectrum normalization by bandwidth
math.SetSpectNormalizationByBandsWidth(true);
```
###### Java:
```java
// setting calibration length
int calibrationLength = 6;
math.setCallibrationLength(calibrationLength);

// type of evaluation of instant mental levels
boolean independentMentalLevels = false;
math.setMentalEstimationMode(independentMentalLevels);

// number of windows after the artifact with the previous actual value
int nwinsSkipAfterArtifact = 10;
math.setSkipWinsAfterArtifact(nwinsSkipAfterArtifact);

// zeroing selected spectrum bands (here Delta and Gamma are excluded)
math.setZeroSpectWaves(true, 0, 1, 1, 1, 0);

// spectrum normalization by bandwidth
math.setSpectNormalizationByBandsWidth(true);
```
###### Python:
```python
# setting calibration length
calibration_length = 6
math.set_calibration_length(calibration_length)

# type of evaluation of instant mental levels
independent_mental_levels = False
math.set_mental_estimation_mode(independent_mental_levels)

# number of windows after the artifact with the previous actual value
nwins_skip_after_artifact = 10
math.set_skip_wins_after_artifact(nwins_skip_after_artifact)

# zeroing selected spectrum bands (here Delta and Gamma are excluded)
math.set_zero_spect_waves(True, 0, 1, 1, 1, 0)

# spectrum normalization by bandwidth
math.set_spect_normalization_by_bands_width(True)
```
###### Swift:
```swift
// setting calibration length
let calibrationLength: Int32 = 6
math.setCallibrationLength(calibrationLength)

// type of evaluation of instant mental levels
let independentMentalLevels = false
math.setMentalEstimationMode(independentMentalLevels)

// number of windows after the artifact with the previous actual value
let nwinsSkipAfterArtifact: Int32 = 10
math.setSkipWinsAfterArtifact(nwinsSkipAfterArtifact)

// zeroing selected spectrum bands (here Delta and Gamma are excluded)
math.setZeroSpectWavesWithActive(true, andDelta: 0, andTheta: 1, andAlpha: 1, andBeta: 1, andGamma: 0)

// spectrum normalization by bandwidth
math.setSpectNormalizationByBandsWidth(true)
```
###### Objective-C:
```objc
// setting calibration length
int calibration_length = 6;
[math setCallibrationLength:calibration_length];

// type of evaluation of instant mental levels
bool independent_mental_levels = false;
[math setMentalEstimationMode:independent_mental_levels];

// number of windows after the artifact with the previous actual value
int nwins_skip_after_artifact = 10;
[math setSkipWinsAfterArtifact:nwins_skip_after_artifact];

// zeroing selected spectrum bands (here Delta and Gamma are excluded)
[math setZeroSpectWavesWithActive:true andDelta:0 andTheta:1 andAlpha:1 andBeta:1 andGamma:0];

// spectrum normalization by bandwidth
[math setSpectNormalizationByBandsWidth:true];
```
###### Flutter:
```dart
// setting calibration length
math.setCalibrationLength(6);

// type of evaluation of instant mental levels
math.setMentalEstimationMode(false);

// number of windows after the artifact
math.setSkipWinsAfterArtifact(10);

// zeroing selected spectrum bands (here Delta and Gamma are excluded)
math.setZeroSpectWaves(true, 0, 1, 1, 1, 0);

// spectrum normalization by bandwidth
math.setSpectNormalizationByBandsWidth(true);
```
Types

RawChannels

Structure containing the left and right bipolar values for the bipolar library mode, with fields:
1. LeftBipolar - left bipolar value, double value
2. RightBipolar - right bipolar value, double value

 

RawChannelsArray

Structure containing an array of channel values, with the field:
1. channels - double array

 

MindData

Mental levels. Structure with fields:
1. Rel_Attention - relative attention value
2. Rel_Relaxation - relative relaxation value
3. Inst_Attention - instant attention value
4. Inst_Relaxation - instant relaxation value

 

SideType

Side of the current artifact. Enum with values:
1. LEFT
2. RIGHT
3. NONE

 

Processing data

1. If you need calibration, start it right after library initialization:

###### C++:
```cpp
EMOpStatus opSt;
MathLibStartCalibration(tMathPtr, &opSt);
```

###### C#:
```csharp
math.StartCalibration();
```

###### Java:
```java
math.startCalibration();
```

###### Python:
```python
math.start_calibration()
```

###### Swift:
```swift
emotionalMath.startCalibration()
```

###### Objective-C:
```objc
[math startCalibration];
```

###### Flutter:
```dart
math.startCalibration();
```
 

2. Add and process the data

In bipolar mode:

Receive samples from the BrainBit and convert them to bipolar values, then push them to the library. Usually a bipolar value is calculated as the difference between two channels on the same side of the headband:

###### C++:
```cpp
void onBrainBitSignalDataReceived(Sensor *pSensor, BrainBitSignalData *pData, int32_t size, void *userData)
{
    RawChannels* bipolars = new RawChannels[size];
    for(int i = 0; i < size; i++){
        bipolars[i].left_bipolar = pData[i].T3 - pData[i].O1;
        bipolars[i].right_bipolar = pData[i].T4 - pData[i].O2;
    }

    MathLibPushData(tMathPtr, bipolars, size);
    MathLibProcessDataArr(tMathPtr);
    delete[] bipolars; // the library buffers the pushed data internally
}
OpStatus outStatus;
BrainBitSignalDataListenerHandle lHandle = nullptr;
addSignalDataCallbackBrainBit(_sensor, onBrainBitSignalDataReceived, &lHandle, nullptr, &outStatus);
```

###### C#:
```csharp
private void onBrainBitSignalDataRecived(ISensor sensor, BrainBitSignalData[] data)
{
    RawChannels[] bipolars = new RawChannels[data.Length];
    for (var i = 0; i < data.Length; i++)
    {
        bipolars[i].LeftBipolar  = data[i].T3 - data[i].O1;
        bipolars[i].RightBipolar = data[i].T4 - data[i].O2;
    }
    math.PushData(bipolars);
    math.ProcessDataArr();
}
...
sensor.EventBrainBitSignalDataRecived += onBrainBitSignalDataRecived;
```

###### Java:
```java
sensor.brainBitSignalDataReceived = data -> {
    RawChannels[] bipolars = new RawChannels[data.length];
    for (int i = 0; i < data.length; i++) {
        bipolars[i] = new RawChannels(); // each element must be created
        bipolars[i].leftBipolar = data[i].T3 - data[i].O1;
        bipolars[i].rightBipolar = data[i].T4 - data[i].O2;
    }
    math.pushData(bipolars);
    math.processDataArr();
};
```

###### Python:
```python
def on_brain_bit_signal_data_received(sensor, data):
    raw_channels = []
    for sample in data:
        left_bipolar = sample.T3 - sample.O1
        right_bipolar = sample.T4 - sample.O2
        raw_channels.append(support_classes.RawChannels(left_bipolar, right_bipolar))

    math.push_data(raw_channels)
    math.process_data_arr()
...
sensor.signalDataReceived = on_brain_bit_signal_data_received
```

###### Swift:
```swift
sensor?.setSignalDataCallbackBrainBit( { data in
    var bipolars : [EMRawChannels] = []
    for sample in data {
        let bipolarElement = EMRawChannels(leftBipolar: sample.t3.doubleValue - sample.o1.doubleValue,
                                           andRightBipolar: sample.t4.doubleValue - sample.o2.doubleValue)
        bipolars.append(bipolarElement!)
    }
    math.pushData(bipolars)
    math.processDataArr()
})
```

###### Objective-C:
```objc
[sensor setSignalDataCallbackBrainBit:^(NSArray<NTBrainBitSignalData *> * _Nonnull data) {
    NSMutableArray<EMRawChannels*>* bipolars = [NSMutableArray new];

    for(int i = 0; i < data.count; i++){
        double leftBipolar = [data[i].T3 doubleValue] - [data[i].O1 doubleValue];
        double rightBipolar = [data[i].T4 doubleValue] - [data[i].O2 doubleValue];
        EMRawChannels* bipolarElement = [[EMRawChannels alloc] initWithLeftBipolar:leftBipolar andRightBipolar:rightBipolar];
        [bipolars addObject: bipolarElement];
    }

    [math pushData:bipolars];
    [math processDataArr];
}];
```

###### Flutter:
```dart
void processSamples(List<BrainBitSignalData> event) {
  final samples = event
      .map((data) => RawChanel(leftBipolar: data.t3 - data.o1, rightBipolar: data.t4 - data.o2))
      .toList();
  try {
    math.pushBipolars(samples);
    math.processData();
  } on ArtifactsException catch (e) {
    print(e.message);
  }
}
...
sensor.signalDataStream.listen(processSamples);
```
In multi-channel mode: 

###### C++:
```cpp
RawChannelsArray* samples = new RawChannelsArray[SAMPLES_COUNT];
...
MathLibPushDataArr(tMathPtr, samples, SAMPLES_COUNT);
MathLibProcessDataArr(tMathPtr);
```

###### C#:
```csharp
var samples = new RawChannelsArray[SAMPLES_COUNT];
math.PushDataArr(samples);
math.ProcessDataArr();
```

###### Java:
```java
var samples = new RawChannelsArray[SAMPLES_COUNT];
math.pushDataArr(samples);
math.processDataArr();
```

###### Python:
```python
samples = []
math.push_data_arr(samples)
math.process_data_arr()
```

###### Swift:
```swift
var samples: [[NSNumber]] = [[]]
math.pushDataArr(samples)
math.processDataArr()
```

###### Objective-C:
```objc
NSArray<NSArray<NSNumber*>*>* samples = [NSArray new];
[math pushDataArr:samples];
[math processDataArr];
```

###### Flutter:
```dart
List<List<double>> samples = [];
math.pushMonopolars(samples);
math.processData();
```

3. Then check the calibration status, if you need calibrated values:

###### C++:
```cpp
EMOpStatus os;
bool calibrationFinished = false;
MathLibCalibrationFinished(tMathPtr, &calibrationFinished, &os);
// and calibration progress
int calibrationProgress = 0;
MathLibGetCallibrationPercents(tMathPtr, &calibrationProgress, &os);
```

###### C#:
```csharp
bool calibrationFinished = math.CalibrationFinished();
// and calibration progress
int calibrationProgress = math.GetCallibrationPercents();
```

###### Java:
```java
boolean calibrationFinished = math.calibrationFinished();
// and calibration progress
int calibrationProgress = math.getCallibrationPercents();
```

###### Python:
```python
calibration_finished = math.calibration_finished()
# and calibration progress
calibration_progress = math.get_calibration_percents()
```

###### Swift:
```swift
let calibrationFinished = math.calibrationFinished()
// and calibration progress
let calibrationProgress = math.getCallibrationPercents()
```

###### Objective-C:
```objc
bool calibrationFinished = [math calibrationFinished];
// and calibration progress
UInt32 calibrationProgress = [math getCallibrationPercents];
```

###### Flutter:
```dart
bool calibrationFinished = math.isCalibrationFinished();
// and calibration progress
int calibrationProgress = math.getCalibrationPercents();
```

4. If the calibration has finished (or you don't need to calibrate), read the output values:

###### C++:
```cpp
EMOpStatus opSt;

int size = 0;

// Reading mental levels in percent
MathLibReadMentalDataArrSize(tMathPtr, &size, &opSt);
MindData* mental_data = new MindData[size];
MathLibReadMentalDataArr(tMathPtr, mental_data, &size, &opSt);

// Reading relative spectral values in percent
MathLibReadSpectralDataPercentsArrSize(tMathPtr, &size, &opSt);
SpectralDataPercents* sp_data = new SpectralDataPercents[size];
MathLibReadSpectralDataPercentsArr(tMathPtr, sp_data, &size, &opSt);
```

###### C#:
```csharp
// Reading mental levels in percent
MindData[] mentalData = math.ReadMentalDataArr();

// Reading relative spectral values in percent
SpectralDataPercents[] spData = math.ReadSpectralDataPercentsArr();
```

###### Java:
```java
// Reading mental levels in percent
MindData[] mentalData = math.readMentalDataArr();

// Reading relative spectral values in percent
SpectralDataPercents[] spData = math.readSpectralDataPercentsArr();
```

###### Python:
```python
# Reading mental levels in percent
mental_data = math.read_mental_data_arr()
# Reading relative spectral values in percent
sp_data = math.read_spectral_data_percents_arr()
```

###### Swift:
```swift
// Reading mental levels in percent
let mindData = math.readMentalDataArr()

// Reading relative spectral values in percent
let spData = math.mathLibReadSpectralDataPercentsArr()
```

###### Objective-C:
```objc
// Reading mental levels in percent
NSArray<EMMindData*>* mindData = [math readMentalDataArr];

// Reading relative spectral values in percent
NSArray<EMSpectralDataPercents*>* spData = [math MathLibReadSpectralDataPercentsArr];
```

###### Flutter:
```dart
// Reading mental levels in percent
List<MindData> mindData = math.readMentalData();

// Reading relative spectral values in percent
List<SpectralDataPercents> spectralPercents = math.readSpectralDataPercents();
```
 

5. Check artifacts

   5.1. During calibration

###### C++:
```cpp
EMOpStatus opSt;
bool artefacted = false;
MathLibIsBothSidesArtifacted(tMathPtr, &artefacted, &opSt);
if(artefacted){
    // signal corruption
}
```

###### C#:
```csharp
if(math.IsBothSidesArtifacted()){
    // signal corruption
}
```

###### Java:
```java
if(math.isBothSidesArtifacted()){
    // signal corruption
}
```

###### Python:
```python
if math.is_both_sides_artifacted():
    pass  # signal corruption
```

###### Swift:
```swift
if(math.isBothSidesArtifacted()){
    // signal corruption
}
```

###### Objective-C:
```objc
if([math isBothSidesArtifacted]){
    // signal corruption
}
```

###### Flutter:
```dart
if(math.isBothSidesArtefacted()){
    // signal corruption
}
```
   5.2. After (without) calibration

###### C++:
```cpp
EMOpStatus opSt;
bool artefacted = false;
MathLibIsArtifactedSequence(tMathPtr, &artefacted, &opSt);
if(artefacted){
    // signal corruption
}
```

###### C#:
```csharp
if(math.IsArtifactedSequence()){
    // signal corruption
}
```

###### Java:
```java
if(math.isArtifactedSequence()){
    // signal corruption
}
```

###### Python:
```python
if math.is_artifacted_sequence():
    pass  # signal corruption
```

###### Swift:
```swift
if(math.isArtifactedSequence()){
    // signal corruption
}
```

###### Objective-C:
```objc
if([math isArtifactedSequence]){
    // signal corruption
}
```

###### Flutter:
```dart
if(math.isArtifactedSequence()){
    // signal corruption
}
```
   5.3. Artifacts by side (bipolar mode only)

###### C++:
```cpp
EMOpStatus opSt;
bool artefactedLeft = false;
MathLibIsArtifactedLeft(tMathPtr, &artefactedLeft, &opSt);

bool artefactedRight = false;
MathLibIsArtifactedRight(tMathPtr, &artefactedRight, &opSt);
if(artefactedLeft || artefactedRight){
    // signal corruption
}
```

###### C#:
```csharp
if(math.IsArtifactedLeft() || math.IsArtifactedRight()){
    // signal corruption
}
```

###### Java:
```java
if(math.isArtifactedLeft() || math.isArtifactedRight()){
    // signal corruption
}
```

###### Python:
```python
if math.is_artifacted_left() or math.is_artifacted_right():
    pass  # signal corruption
```

###### Swift:
```swift
if(math.isArtifactedLeft() || math.isArtifactedRight()){
    // signal corruption
}
```

###### Objective-C:
```objc
if([math isArtifactedLeft] || [math isArtifactedRight]){
    // signal corruption
}
```

###### Flutter:
```dart
if(math.isArtifactedLeft() || math.isArtifactedRight()){
    // signal corruption
}
```
6. Check EEG quality (bipolar mode only)

###### C++:
```cpp
EMOpStatus opSt;
int qLeft = 0;
int qRight = 0;
MathLibGetEEGQuality(tMathPtr, &qLeft, &qRight, &opSt);
```

###### C#:
```csharp
QualityValues qValues = math.GetEEGQuality();
```

###### Java:
```java
QualityValues qValues = math.getEEGQuality();
```

###### Python:
```python
q_values: QualityValues = math.get_eeg_quality()
```

###### Swift:
```swift
let qValues = math.getEEGQuality()
```

###### Objective-C:
```objc
EMQualityVals* qValues = [math getEEGQuality];
```

###### Flutter:
```dart
QualityValues qValues = math.getEEGQuality();
```
QualityValues contains integer values of the signal quality on the left and right sides.

Finishing work with the library

###### C++:
```cpp
freeMathLib(tMathPtr);
```

###### C#:
```csharp
math.Dispose();
```

###### Java:
```java
math = null;
// the native finalizer is run
```

###### Python:
```python
del math
```

###### Swift:
```swift
math = nil
```

###### Objective-C:
```objc
math = nil;
```

###### Flutter:
```dart
math.dispose();
```