Benchmarking on MOABB with Braindecode (PyTorch) deep net architectures#

This example shows how to use MOABB to benchmark a set of Braindecode pipelines (deep learning architectures) on all available datasets. Here we restrict the benchmark to two datasets to keep the computation time low, but it is designed to scale easily to many datasets.

# Authors: Igor Carrara <igor.carrara@inria.fr>
#          Bruno Aristimunha <b.aristimunha@gmail.com>
#          Sylvain Chevallier <sylvain.chevallier@universite-paris-saclay.fr>
#
# License: BSD (3-clause)

import os

import matplotlib.pyplot as plt
import torch
from absl.logging import ERROR, set_verbosity

from moabb import benchmark, set_log_level
from moabb.analysis.plotting import score_plot
from moabb.datasets import BNCI2014_001, BNCI2014_004
from moabb.utils import setup_seed


set_log_level("info")
# Avoid extra warning output from TensorFlow/absl
set_verbosity(ERROR)
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "3"

# Print the PyTorch version
print(f"Torch Version: {torch.__version__}")

# Use the GPU if one is available
cuda = torch.cuda.is_available()
device = "cuda" if cuda else "cpu"
print("GPU is", "AVAILABLE" if cuda else "NOT AVAILABLE")
Torch Version: 1.13.1+cu117
GPU is NOT AVAILABLE

In this example, we will use only 2 subjects from the datasets BNCI2014_001 and BNCI2014_004.

Running the benchmark#

The benchmark is run using the benchmark function. You need to specify the folder containing the pipelines, the kind of evaluation, and the paradigm to use. By default, the benchmark will use all available datasets for all paradigms listed in the pipelines. You can restrict it to specific evaluations and paradigms using the evaluations and paradigms arguments.

To save computation time, the results are cached. If you want to re-run the benchmark, you can set the overwrite argument to True.

It is possible to indicate the folder where results are cached and the one where the analysis & figures are saved. By default, the results are saved in the results folder, and the analysis & figures are saved in the benchmark folder.

This code is implemented to run on CPU. If you are using a GPU, do not use multithreading (i.e. set n_jobs=1).
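For instance, the number of jobs could be chosen from the device detected earlier. The snippet below is only an illustrative sketch that reuses the cuda flag and device variable defined above; it is not part of the original script.

# Illustrative sketch: run a single job when a GPU is available,
# otherwise fall back to all CPU cores.
n_jobs = 1 if cuda else -1
print(f"Running on {device} with n_jobs={n_jobs}")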

In order to allow the benchmark function to work with return_epochs=True (required to use Braindecode), each pipeline must be named "braindecode_xxx", where xxx is the name of the model, so that it is handled correctly by the benchmark function.
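As an illustration of this naming convention, a pipeline could look roughly like the sketch below. The model choice, hyperparameters, and the pipelines variable are assumptions for illustration only; the actual files shipped in pipelines_braindecode and the exact format expected by the benchmark function are documented in MOABB.

# Hypothetical sketch of a "braindecode_" pipeline: a Braindecode model wrapped
# as a scikit-learn estimator. Names and hyperparameters are illustrative only.
import torch
from braindecode import EEGClassifier
from braindecode.models import ShallowFBCSPNet
from sklearn.pipeline import Pipeline

clf = EEGClassifier(
    ShallowFBCSPNet,  # passed as a class; recent Braindecode versions can infer
                      # the input shape and number of classes when fit() is called
    max_epochs=10,
    batch_size=64,
    device="cuda" if torch.cuda.is_available() else "cpu",
)
# The "braindecode_" prefix is what tells the benchmark to pass MNE epochs
# (return_epochs=True) to this estimator.
pipelines = {"braindecode_ShallowFBCSPNet": Pipeline([("Net", clf)])}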

# Set the random seeds for reproducibility
setup_seed(42)

# Restrict this example to the first two subjects of BNCI2014_001 and BNCI2014_004
dataset = BNCI2014_001()
dataset2 = BNCI2014_004()
dataset.subject_list = dataset.subject_list[:2]
dataset2.subject_list = dataset2.subject_list[:2]
datasets = [dataset, dataset2]

results = benchmark(
    pipelines="./pipelines_braindecode",
    evaluations=["CrossSession"],
    paradigms=["LeftRightImagery"],
    include_datasets=datasets,
    results="./results/",
    overwrite=False,
    plot=False,
    output="./benchmark/",
    n_jobs=-1,
)
BNCI2014-001-CrossSession:   0%|          | 0/2 [00:00<?, ?it/s]/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  24 events (all good), 2 – 6 s, baseline off, ~4.1 MB, data loaded,
 'left_hand': 12
 'right_hand': 12>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/paradigms/base.py:355: RuntimeWarning: Concatenation of Annotations within Epochs is not supported yet. All annotations will be dropped.
  X = mne.concatenate_epochs(X)
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/torch/nn/modules/conv.py:459: UserWarning: Using padding='same' with even kernel lengths and odd dilation may require a zero-padded copy of the input be created (Triggered internally at ../aten/src/ATen/native/Convolution.cpp:895.)
  return F.conv2d(input, weight, bias, self.stride,
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4688        0.8837       0.5172        0.7108  0.6206
      2       0.4844        0.9449       0.5172        0.7103  0.5758
      3       0.6094        0.7891       0.5172        0.7095  0.5601
      4       0.4375        1.0690       0.5172        0.7093  0.5726
      5       0.5156        0.8475       0.5172        0.7091  0.5647
      6       0.4688        0.9607       0.5172        0.7089  0.5676
      7       0.4375        1.0614       0.5172        0.7086  0.5635
      8       0.5625        0.8471       0.5172        0.7082  0.5629
      9       0.4844        0.8999       0.5172        0.7079  0.5629
     10       0.4688        0.9013       0.5172        0.7075  0.5697
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5469        0.9092       0.5862        0.6819  0.5699
      2       0.4375        0.9730       0.5517        0.6820  0.5397
      3       0.5000        0.9071       0.5517        0.6821  0.5413
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.6250        0.6554       0.4138        0.6960  0.1083
      2       0.5938        0.6640       0.4138        0.6960  0.0929
      3       0.5000        0.7298       0.4138        0.6959  0.0899
      4       0.6250        0.6693       0.4138        0.6959  0.0889
      5       0.6250        0.6813       0.4138        0.6959  0.0883
      6       0.5156        0.6895       0.4138        0.6958  0.0884
      7       0.5312        0.6986       0.4138        0.6958  0.0904
      8       0.4844        0.6853       0.4138        0.6958  0.0885
      9       0.5938        0.7050       0.4138        0.6957  0.0881
     10       0.5156        0.6961       0.4138        0.6957  0.0895
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5312        0.7113       0.3103        0.7049  0.0881
      2       0.4531        0.7398       0.3103        0.7046  0.1019
      3       0.4688        0.7539       0.3103        0.7044  0.0878
      4       0.5312        0.7288       0.3103        0.7042  0.0880
      5       0.6406        0.6908       0.3448        0.7040  0.0873
      6       0.5000        0.7800       0.3448        0.7039  0.0881
      7       0.6406        0.6531       0.3448        0.7037  0.0877
      8       0.5625        0.7416       0.3448        0.7036  0.0886
      9       0.4219        0.7501       0.3448        0.7035  0.0880
     10       0.5000        0.7685       0.3448        0.7034  0.0885
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5938        0.9378       0.4483        0.9357  0.2790
      2       0.3438        1.1412       0.4483        0.9063  0.2752
      3       0.4375        0.9843       0.4828        0.8822  0.2866
      4       0.4219        0.9976       0.5172        0.8586  0.2759
      5       0.5625        0.9187       0.5517        0.8363  0.2747
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4844        0.8690       0.5172        1.1665  0.2762
      2       0.6094        0.7410       0.5517        1.1349  0.2762
      3       0.5469        0.8589       0.5172        1.1001  0.2768
      4       0.6406        0.6981       0.5172        1.0683  0.2858
      5       0.5156        0.8577       0.5172        1.0400  0.2761

BNCI2014-001-CrossSession:  50%|#####     | 1/2 [00:20<00:20, 20.04s/it]/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  24 events (all good), 2 – 6 s, baseline off, ~4.1 MB, data loaded,
 'left_hand': 12
 'right_hand': 12>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/paradigms/base.py:355: RuntimeWarning: Concatenation of Annotations within Epochs is not supported yet. All annotations will be dropped.
  X = mne.concatenate_epochs(X)
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5625        0.8743       0.5517        0.6963  0.5536
      2       0.5781        0.7499       0.5517        0.6957  0.5571
      3       0.4375        1.0817       0.5862        0.6951  0.5548
      4       0.4375        1.0247       0.5862        0.6948  0.5623
      5       0.4531        0.9465       0.5517        0.6944  0.5553
      6       0.4688        0.9765       0.5862        0.6941  0.5704
      7       0.5469        0.8158       0.6207        0.6941  0.5657
      8       0.5000        0.9634       0.6207        0.6941  0.5576
      9       0.5000        0.9056       0.6207        0.6940  0.5624
     10       0.5625        0.9748       0.6207        0.6940  0.5554
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4688        0.9456       0.4828        0.7555  0.5677
      2       0.5469        1.0201       0.4828        0.7557  0.5553
      3       0.5469        0.8478       0.4828        0.7553  0.5685
      4       0.5938        0.8190       0.4828        0.7552  0.5662
      5       0.5000        0.9454       0.4828        0.7558  0.5601
      6       0.5938        0.9660       0.4828        0.7557  0.5907
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4375        0.7541       0.5172        0.7017  0.0911
      2       0.4062        0.7586       0.5172        0.7010  0.1025
      3       0.5469        0.7198       0.5172        0.7003  0.0889
      4       0.4844        0.7471       0.5172        0.6998  0.0881
      5       0.5781        0.7175       0.5172        0.6993  0.0887
      6       0.5938        0.6738       0.5172        0.6989  0.0889
      7       0.5469        0.7188       0.5172        0.6985  0.0881
      8       0.5938        0.7138       0.5517        0.6982  0.0877
      9       0.4531        0.7493       0.5517        0.6980  0.0890
     10       0.4219        0.7350       0.5517        0.6978  0.0879
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4531        0.7809       0.5172        0.6870  0.0894
      2       0.5156        0.7138       0.5172        0.6869  0.0981
      3       0.4688        0.8271       0.5172        0.6867  0.0890
      4       0.4688        0.8024       0.5172        0.6866  0.0883
      5       0.5000        0.7661       0.5172        0.6865  0.0888
      6       0.5469        0.6993       0.5172        0.6864  0.0884
      7       0.5000        0.7718       0.5172        0.6863  0.0893
      8       0.4531        0.7396       0.5172        0.6862  0.0898
      9       0.5312        0.7611       0.5172        0.6861  0.0893
     10       0.4688        0.7611       0.5172        0.6861  0.0882
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4219        1.3747       0.4483        0.9366  0.2798
      2       0.3750        1.2758       0.4828        0.8790  0.2760
      3       0.5625        0.9987       0.4138        0.8315  0.2839
      4       0.4375        1.1600       0.4138        0.8155  0.2756
      5       0.4531        1.0910       0.4483        0.8107  0.2757
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.6094        0.8983       0.5862        0.7402  0.2797
      2       0.4375        1.2841       0.6207        0.7468  0.2765
      3       0.5781        0.7875       0.6207        0.7496  0.2755
Stopping since valid_loss has not improved in the last 3 epochs.

BNCI2014-001-CrossSession: 100%|##########| 2/2 [00:41<00:00, 20.77s/it]
BNCI2014-001-CrossSession: 100%|##########| 2/2 [00:41<00:00, 20.66s/it]

BNCI2014-004-CrossSession:   0%|          | 0/2 [00:00<?, ?it/s]/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'lampx.tugraz.at'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
  warnings.warn(


100%|██████████████████████████████████████| 34.2M/34.2M [00:00<00:00, 120GB/s]
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'lampx.tugraz.at'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
  warnings.warn(


100%|█████████████████████████████████████| 18.6M/18.6M [00:00<00:00, 80.0GB/s]
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  120 events (all good), 3 – 7.5 s, baseline off, ~3.1 MB, data loaded,
 'left_hand': 60
 'right_hand': 60>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  120 events (all good), 3 – 7.5 s, baseline off, ~3.1 MB, data loaded,
 'left_hand': 60
 'right_hand': 60>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  160 events (all good), 3 – 7.5 s, baseline off, ~4.1 MB, data loaded,
 'left_hand': 80
 'right_hand': 80>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  160 events (all good), 3 – 7.5 s, baseline off, ~4.1 MB, data loaded,
 'left_hand': 80
 'right_hand': 80>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  160 events (all good), 3 – 7.5 s, baseline off, ~4.1 MB, data loaded,
 'left_hand': 80
 'right_hand': 80>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/paradigms/base.py:355: RuntimeWarning: Concatenation of Annotations within Epochs is not supported yet. All annotations will be dropped.
  X = mne.concatenate_epochs(X)
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5268        0.8728       0.6000        0.6849  1.0541
      2       0.5156        0.9240       0.6167        0.6841  1.0415
      3       0.5402        0.8472       0.6167        0.6837  1.0444
      4       0.5045        0.9586       0.6167        0.6829  1.0447
      5       0.5379        0.8781       0.6167        0.6823  1.0464
      6       0.5312        0.9116       0.6083        0.6816  1.0900
      7       0.5201        0.8951       0.6250        0.6809  1.0458
      8       0.5089        0.8897       0.6083        0.6801  1.0454
      9       0.4799        0.9377       0.6083        0.6792  1.0491
     10       0.5469        0.8595       0.6250        0.6779  1.0515
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5134        0.9702       0.5417        0.6898  1.0533
      2       0.5246        0.9658       0.5500        0.6872  1.0551
      3       0.4933        0.9663       0.6000        0.6850  1.0470
      4       0.5223        0.9169       0.6250        0.6828  1.0432
      5       0.4933        1.0141       0.6000        0.6806  1.0389
      6       0.5134        0.9173       0.6250        0.6783  1.0490
      7       0.5156        0.9557       0.6333        0.6759  1.0511
      8       0.4955        0.9721       0.6250        0.6732  1.0542
      9       0.5536        0.8938       0.6167        0.6703  1.0528
     10       0.4955        0.9590       0.6250        0.6673  1.0535
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4844        0.9674       0.5357        0.6901  1.0512
      2       0.4643        1.0456       0.5179        0.6896  1.0448
      3       0.5089        0.9567       0.5357        0.6892  1.0509
      4       0.5112        0.9992       0.5357        0.6888  1.0464
      5       0.5201        0.8963       0.5536        0.6885  1.0562
      6       0.5067        0.9243       0.5536        0.6880  1.0572
      7       0.5179        0.9366       0.5446        0.6875  1.0533
      8       0.5067        0.9621       0.5357        0.6871  1.0545
      9       0.4777        0.9535       0.5357        0.6865  1.0459
     10       0.4710        0.9677       0.5625        0.6861  1.0415
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5223        0.9475       0.5446        0.6931  1.0496
      2       0.4688        1.0016       0.5446        0.6928  1.0513
      3       0.4888        0.9268       0.5446        0.6925  1.0467
      4       0.5112        0.8781       0.5357        0.6922  1.0506
      5       0.5112        0.9517       0.5357        0.6918  1.0425
      6       0.5112        0.9222       0.5357        0.6912  1.0442
      7       0.5134        0.9051       0.5446        0.6905  1.0324
      8       0.4777        0.9531       0.5446        0.6898  1.0401
      9       0.4643        0.9489       0.5357        0.6891  1.0424
     10       0.4978        0.9368       0.5446        0.6886  1.0381
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5134        0.9003       0.5357        0.6779  1.0417
      2       0.5134        0.9302       0.5268        0.6771  1.0504
      3       0.4799        1.0533       0.5268        0.6760  1.0476
      4       0.4821        0.9575       0.5268        0.6750  1.0490
      5       0.5089        0.9342       0.5446        0.6741  1.0448
      6       0.5134        0.9132       0.5446        0.6731  1.0532
      7       0.4888        0.9302       0.5446        0.6721  1.0544
      8       0.5424        0.8900       0.5714        0.6710  1.0490
      9       0.5223        0.9138       0.5804        0.6700  1.0604
     10       0.4531        0.9628       0.5893        0.6690  1.0554
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4933        0.7355       0.5333        0.6928  0.1920
      2       0.4844        0.7237       0.5167        0.6937  0.1859
      3       0.4888        0.7347       0.4917        0.6944  0.1950
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5089        0.7321       0.5417        0.6929  0.1918
      2       0.4866        0.7287       0.5250        0.6930  0.2024
      3       0.4844        0.7390       0.5333        0.6931  0.1887
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4732        0.7831       0.4821        0.6934  0.1986
      2       0.4777        0.7372       0.4821        0.6934  0.1878
      3       0.4955        0.7595       0.4911        0.6934  0.1870
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4777        0.7389       0.4643        0.6946  0.1881
      2       0.5156        0.7284       0.4732        0.6945  0.1869
      3       0.4643        0.7520       0.4732        0.6944  0.1852
      4       0.5112        0.7335       0.4732        0.6943  0.1846
      5       0.5112        0.7312       0.4732        0.6943  0.1867
      6       0.5134        0.7237       0.4732        0.6943  0.1936
      7       0.5737        0.7014       0.4643        0.6943  0.1842
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5580        0.7129       0.5268        0.6924  0.2020
      2       0.5714        0.6880       0.5268        0.6921  0.1908
      3       0.5290        0.7084       0.5268        0.6918  0.1895
      4       0.5246        0.7209       0.5357        0.6916  0.1856
      5       0.5558        0.6838       0.5268        0.6914  0.1837
      6       0.5201        0.7132       0.5357        0.6912  0.1958
      7       0.5223        0.7013       0.5357        0.6911  0.1837
      8       0.5558        0.6887       0.5357        0.6909  0.1837
      9       0.5759        0.6969       0.5446        0.6908  0.1851
     10       0.5446        0.7050       0.5446        0.6907  0.1870
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4933        0.8867       0.5833        0.7748  0.7551
      2       0.5469        0.8407       0.5583        0.9758  0.7607
      3       0.5268        0.8136       0.6000        0.8091  0.7668
      4       0.5513        0.7877       0.5833        0.6934  0.7888
      5       0.5670        0.7809       0.5750        0.6761  0.7630
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5223        0.9496       0.4833        0.9299  0.7588
      2       0.5246        0.8457       0.5333        0.7850  0.7544
      3       0.5379        0.8245       0.5000        0.7341  0.7594
      4       0.5804        0.7769       0.5417        0.7109  0.7518
      5       0.5826        0.8032       0.5833        0.6885  0.7575
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5201        0.9490       0.5000        0.9558  0.7561
      2       0.5536        0.8375       0.5089        0.9280  0.7556
      3       0.5871        0.7990       0.5179        0.7928  0.7466
      4       0.6071        0.8078       0.5893        0.6746  0.7519
      5       0.5603        0.7828       0.6875        0.5999  0.7569
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5045        0.9390       0.5357        0.8534  0.7613
      2       0.5179        0.8981       0.5625        0.7521  0.7477
      3       0.5558        0.8441       0.5804        0.7348  0.7613
      4       0.5759        0.8036       0.6071        0.7023  0.7586
      5       0.5759        0.8032       0.5982        0.6742  0.7657
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5536        0.8681       0.6518        0.6613  0.7667
      2       0.6094        0.7723       0.6607        0.6204  0.7452
      3       0.6094        0.7237       0.6875        0.5982  0.7529
      4       0.6518        0.7014       0.7054        0.5836  0.7670
      5       0.6518        0.6994       0.7232        0.5675  0.7570

BNCI2014-004-CrossSession:  50%|#####     | 1/2 [01:28<01:28, 88.37s/it]/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'lampx.tugraz.at'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
  warnings.warn(


100%|██████████████████████████████████████| 33.1M/33.1M [00:00<00:00, 132GB/s]
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/urllib3/connectionpool.py:1061: InsecureRequestWarning: Unverified HTTPS request is being made to host 'lampx.tugraz.at'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
  warnings.warn(


100%|█████████████████████████████████████| 16.5M/16.5M [00:00<00:00, 72.1GB/s]
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  120 events (all good), 3 – 7.5 s, baseline off, ~3.1 MB, data loaded,
 'left_hand': 60
 'right_hand': 60>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  120 events (all good), 3 – 7.5 s, baseline off, ~3.1 MB, data loaded,
 'left_hand': 60
 'right_hand': 60>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  160 events (all good), 3 – 7.5 s, baseline off, ~4.1 MB, data loaded,
 'left_hand': 80
 'right_hand': 80>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  120 events (all good), 3 – 7.5 s, baseline off, ~3.1 MB, data loaded,
 'left_hand': 60
 'right_hand': 60>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/datasets/preprocessing.py:273: UserWarning: warnEpochs <Epochs |  160 events (all good), 3 – 7.5 s, baseline off, ~4.1 MB, data loaded,
 'left_hand': 80
 'right_hand': 80>
  warn(f"warnEpochs {epochs}")
/home/runner/work/moabb/moabb/moabb/paradigms/base.py:355: RuntimeWarning: Concatenation of Annotations within Epochs is not supported yet. All annotations will be dropped.
  X = mne.concatenate_epochs(X)
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5156        0.9867       0.5089        0.6962  1.0562
      2       0.5045        0.9637       0.5089        0.6951  1.0465
      3       0.5045        0.9875       0.5089        0.6940  1.0364
      4       0.5312        0.9360       0.5268        0.6929  1.0403
      5       0.4821        0.9903       0.5446        0.6918  1.0446
      6       0.4866        0.9762       0.5446        0.6908  1.0401
      7       0.5268        0.9155       0.5357        0.6897  1.0502
      8       0.4732        0.9850       0.5446        0.6888  1.0489
      9       0.4978        0.9809       0.5268        0.6879  1.0439
     10       0.4955        0.9432       0.5446        0.6871  1.0436
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4554        1.0125       0.3571        0.7017  1.0517
      2       0.4888        1.0102       0.3929        0.7084  1.0374
      3       0.4710        1.0082       0.4018        0.7155  1.0464
Stopping since valid_loss has not improved in the last 3 epochs.
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5286        0.9036       0.5673        0.6935  0.9065
      2       0.5208        0.8991       0.4808        0.6946  0.9010
      3       0.5052        0.9895       0.4904        0.6963  0.9081
Stopping since valid_loss has not improved in the last 3 epochs.
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4576        0.9355       0.5089        0.6887  1.0553
      2       0.4710        0.9823       0.5179        0.6882  1.0369
      3       0.5246        0.8910       0.5179        0.6878  1.0327
      4       0.4688        0.9600       0.5268        0.6874  1.0365
      5       0.5000        0.9251       0.5268        0.6870  1.0355
      6       0.5536        0.8774       0.5268        0.6866  1.0370
      7       0.4911        0.9521       0.5268        0.6862  1.0390
      8       0.4821        0.9585       0.5268        0.6858  1.0371
      9       0.4955        0.8971       0.5268        0.6855  1.0483
     10       0.4821        0.9494       0.5268        0.6850  1.0507
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4818        0.9675       0.5288        0.6932  0.9032
      2       0.5156        0.8960       0.5192        0.6931  0.9038
      3       0.5417        0.8389       0.5192        0.6930  0.8970
      4       0.5234        0.9498       0.5192        0.6929  0.9109
      5       0.5182        0.8601       0.5096        0.6928  0.8935
      6       0.4948        0.9401       0.4904        0.6927  0.8949
      7       0.4948        0.9359       0.5096        0.6926  0.9286
      8       0.5130        0.8834       0.5288        0.6925  0.8880
      9       0.4844        0.9651       0.5288        0.6924  0.8930
     10       0.5026        0.9493       0.5288        0.6923  0.8948
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5022        0.7134       0.4732        0.6937  0.1845
      2       0.5491        0.7009       0.4732        0.6938  0.1844
      3       0.4799        0.7134       0.4554        0.6938  0.1840
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5134        0.7209       0.5893        0.6927  0.1851
      2       0.4911        0.7329       0.5089        0.6936  0.1841
      3       0.5045        0.7167       0.4732        0.6944  0.1855
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5312        0.7371       0.5385        0.6928  0.1614
      2       0.4844        0.7511       0.4904        0.6931  0.1595
      3       0.4922        0.7556       0.4904        0.6934  0.1594
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4978        0.7633       0.4732        0.6933  0.1867
      2       0.4665        0.7590       0.4643        0.6933  0.1869
      3       0.4911        0.7394       0.4821        0.6934  0.1855
Stopping since valid_loss has not improved in the last 3 epochs.
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5312        0.7093       0.4808        0.6936  0.1582
      2       0.5286        0.6961       0.4712        0.6937  0.1602
      3       0.5391        0.7024       0.5000        0.6938  0.1578
Stopping since valid_loss has not improved in the last 3 epochs.
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5201        0.9050       0.4286        1.8069  0.7490
      2       0.5290        0.8636       0.4286        0.9869  0.7459
      3       0.5357        0.8190       0.4732        0.8161  0.7560
      4       0.5491        0.7906       0.4732        0.7716  0.7560
      5       0.5335        0.7790       0.4732        0.7497  0.7510
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5268        0.8833       0.6071        1.1230  0.7514
      2       0.5223        0.8304       0.5268        0.7775  0.7632
      3       0.5022        0.8444       0.3929        0.8547  0.7586
      4       0.5312        0.8049       0.4107        0.8532  0.7536
Stopping since valid_loss has not improved in the last 3 epochs.
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.4609        0.9349       0.4904        2.0327  0.6462
      2       0.5312        0.8608       0.4904        1.6707  0.6541
      3       0.5026        0.8657       0.5000        1.1806  0.6393
      4       0.5260        0.8144       0.5096        0.9011  0.6481
      5       0.5182        0.8213       0.4904        0.8225  0.6483
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5424        0.8460       0.5714        0.9240  0.7410
      2       0.5201        0.8518       0.5268        0.7383  0.7566
      3       0.5357        0.8623       0.4732        0.7679  0.7471
      4       0.5491        0.8279       0.5268        0.7383  0.7477
      5       0.5246        0.7980       0.5268        0.7321  0.7426
/home/runner/work/moabb/moabb/.venv/lib/python3.9/site-packages/braindecode/models/base.py:180: UserWarning: LogSoftmax final layer will be removed! Please adjust your loss function accordingly (e.g. CrossEntropyLoss)!
  warnings.warn("LogSoftmax final layer will be removed! " +
  epoch    train_acc    train_loss    valid_acc    valid_loss     dur
-------  -----------  ------------  -----------  ------------  ------
      1       0.5104        0.8633       0.5096        1.2202  0.6408
      2       0.5078        0.9056       0.5385        0.7995  0.6535
      3       0.4792        0.8832       0.4808        0.7899  0.6373
      4       0.5286        0.8430       0.4904        0.8064  0.6457
      5       0.5443        0.8118       0.4808        0.7916  0.6463

BNCI2014-004-CrossSession: 100%|##########| 2/2 [02:38<00:00, 77.71s/it]
BNCI2014-004-CrossSession: 100%|##########| 2/2 [02:38<00:00, 79.31s/it]
        dataset    evaluation                       pipeline  avg score
0  BNCI2014-001  CrossSession       braindecode_EEGInception   0.529755
1  BNCI2014-001  CrossSession    braindecode_ShallowFBCSPNet   0.506535
2  BNCI2014-001  CrossSession  braindecode_EEGNetv4_resample   0.505883
3  BNCI2014-004  CrossSession       braindecode_EEGInception   0.566102
4  BNCI2014-004  CrossSession    braindecode_ShallowFBCSPNet   0.599238
5  BNCI2014-004  CrossSession  braindecode_EEGNetv4_resample   0.516589

The deep learning architectures implemented in MOABB using Braindecode are listed below (an import sketch follows the list):

  • Shallow Convolutional Network [1]

  • Deep Convolutional Network [1]

  • EEGNetv4 [2]

  • EEGInception [3]
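These architectures correspond roughly to model classes in braindecode.models. The import below is a hedged sketch: exact class names depend on the installed Braindecode version (EEGInception, for instance, was later split into ERP and MI variants).

# Hedged sketch: Braindecode classes roughly matching the architectures above.
# Class names may differ across Braindecode versions.
from braindecode.models import Deep4Net, EEGInception, EEGNetv4, ShallowFBCSPNet

architectures = {
    "ShallowConvNet": ShallowFBCSPNet,
    "DeepConvNet": Deep4Net,
    "EEGNetv4": EEGNetv4,
    "EEGInception": EEGInception,
}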

The benchmark function prints a summary of the results. Detailed results are returned as a pandas DataFrame and can be used to generate figures. The analysis & figures are saved in the benchmark folder.

score_plot(results)
plt.show()
Scores per dataset and algorithm
/home/runner/work/moabb/moabb/moabb/analysis/plotting.py:70: UserWarning: The palette list has more values (6) than needed (3), which may not be intended.
  sea.stripplot(
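The results DataFrame can also be summarized directly with pandas. The sketch below assumes it has at least dataset, pipeline and score columns, consistent with the summary printed above.

# Hedged sketch: average score per dataset and pipeline from the results
# DataFrame returned by benchmark() (column names assumed from the summary).
summary = (
    results.groupby(["dataset", "pipeline"])["score"]
    .mean()
    .reset_index()
    .sort_values("score", ascending=False)
)
print(summary)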

References#

Total running time of the script: ( 3 minutes 21.573 seconds)

Estimated memory usage: 590 MB
