Tutorial instructions

All the documents we have received in advance are in the following online folder. You can download the ones you want and install them beforehand, then come with your laptop, or you can simply discover everything at Télécom ParisTech using the PCs on site. The password will be provided by email.


Session 1: Monday morning - 9:00 AM to 12:00 PM

Hierarchical General Linear Modelling and Robust Statistics for EEG

Cyril Pernet, Arnaud Delorme

During the workshop, we will analyze the full data space of a publicly available dataset using the open-source LIMO EEG toolbox (in the time domain, but it works the same way in the frequency domain). The LInear MOdeling of EEG (LIMO) toolbox is an EEGLAB toolbox that integrates seamlessly with ‘Studies’ and provides all the tools needed to analyze any experimental design, including all sorts of covariates at the subject or group level. It allows analyzing all electrodes and all time and/or frequency frames, and implements robust statistical methods along with several multiple comparison procedures. Depending on the time available (i.e., the speed of the group), the various options of the toolbox will be explored. By the end of the session, attendees should have learned enough to use the toolbox on their own data.
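For readers new to the hierarchical approach, the following is a minimal sketch of the two-level logic in Python (NumPy/SciPy), not LIMO's MATLAB implementation: a first-level GLM estimates beta weights per subject, and a second-level mass-univariate test assesses those betas across subjects. All dimensions, regressors and data are placeholders, and the robust estimators and multiple comparison corrections that LIMO provides are omitted.

```python
import numpy as np
from scipy import stats

# Hypothetical dimensions: subjects x trials x electrodes x time frames
n_subjects, n_trials, n_elec, n_times = 20, 100, 64, 200
rng = np.random.default_rng(0)

# First level: one GLM per subject, regressing single-trial EEG on a design
# matrix (here an intercept plus one condition regressor, purely illustrative).
betas = np.zeros((n_subjects, 2, n_elec, n_times))
for s in range(n_subjects):
    X = np.column_stack([np.ones(n_trials), rng.integers(0, 2, n_trials)])
    Y = rng.standard_normal((n_trials, n_elec * n_times))   # fake EEG data
    b, *_ = np.linalg.lstsq(X, Y, rcond=None)                # OLS estimates
    betas[s] = b.reshape(2, n_elec, n_times)

# Second level: test the condition effect (beta 1) against zero across subjects
# at every electrode and time frame (mass-univariate one-sample t-test).
t_vals, p_vals = stats.ttest_1samp(betas[:, 1], popmean=0, axis=0)
print(t_vals.shape)  # (n_elec, n_times)
```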

Teaching material available

 

Decoding EEG signals 

Alexandre Gramfort

Over the last decade, multivariate analyses have played a major role in interpreting complex neural time series such as EEG and MEG recordings. Here, we will combine introductory lectures and hands-on exercises to introduce the audience to multivariate decoding. Using MNE-Python and scikit-learn, we will first show how users can decode EEG and MEG signals in less than 10 lines of code. We will then cover the motivation and interpretability of linear decoders in temporally resolved neuroimaging. Finally, we will review a series of common analytical methods implemented in MNE, ranging from temporal generalization to common spatial patterns and receptive fields. Overall, the tutorial requires basic knowledge of Python and will be taught through online exercises.
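To give a flavour of the “less than 10 lines of code” claim, here is a minimal time-resolved decoding sketch with MNE-Python and scikit-learn; the epochs file name is a placeholder and the event codes are assumed to define two classes.

```python
import mne
from mne.decoding import SlidingEstimator, cross_val_multiscore
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

epochs = mne.read_epochs("my_subject-epo.fif")     # placeholder file name
X = epochs.get_data()                              # (n_epochs, n_channels, n_times)
y = epochs.events[:, 2]                            # two condition codes assumed
clf = make_pipeline(StandardScaler(), LogisticRegression(solver="liblinear"))
time_decod = SlidingEstimator(clf, scoring="roc_auc")          # one decoder per time point
scores = cross_val_multiscore(time_decod, X, y, cv=5).mean(axis=0)  # AUC over time
```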

Teaching material available

Simulating EEG data  

Mike X Cohen

This is the “data era” of neuroscience. But there are countless ways to analyze data, and not all of them are appropriate for your data. The purpose of this tutorial is to teach you the tools to simulate EEG data in order to evaluate analysis methods. This will allow you to (1) test the accuracy of analysis/reconstruction methods; (2) know how analysis parameters affect results; and (3) understand the assumptions that your analysis methods make. The result of this workshop will be MATLAB code that simulates single- and multichannel EEG data, including 1/f noise, phase-locked and non-phase-locked activity, and sinusoidal and nonstationary features.
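The workshop code itself is in MATLAB; purely as an illustration of the ingredients listed above, here is a hedged Python/NumPy analogue that builds single-channel trials from 1/f noise plus a non-phase-locked 10 Hz burst (all parameters are arbitrary choices for the example).

```python
import numpy as np

srate, n_trials, n_pnts = 500, 50, 1000          # Hz, trials, samples per trial
times = np.arange(n_pnts) / srate
rng = np.random.default_rng(1)

def pink_noise(n, srate, rng):
    """1/f noise: shape a flat amplitude spectrum by 1/f and invert the FFT."""
    freqs = np.fft.rfftfreq(n, 1 / srate)
    amp = np.zeros_like(freqs)
    amp[1:] = 1 / freqs[1:]                       # skip DC to avoid division by zero
    phases = rng.uniform(0, 2 * np.pi, freqs.size)
    return np.fft.irfft(amp * np.exp(1j * phases), n)

data = np.zeros((n_trials, n_pnts))
for t in range(n_trials):
    noise = pink_noise(n_pnts, srate, rng)
    noise /= noise.std()                          # put noise on a unit scale
    # Non-phase-locked 10 Hz burst: random phase per trial, Gaussian-tapered
    burst = np.sin(2 * np.pi * 10 * times + rng.uniform(0, 2 * np.pi))
    burst *= np.exp(-(times - 1.0) ** 2 / 0.05)   # taper centred at 1 s
    data[t] = noise + burst
```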

Teaching material available

SEEG/ECOG analysis with Brainstorm  

François Tadel, Anne-Sophie Dubarry

Participants will learn how to import and process SEEG epilepsy recordings with the Brainstorm interface: co-registration of pre- and post-implantation anatomical images, manual placement of the SEEG electrodes with the MRI viewer, display and pre-processing of the signals, and computation of epileptogenicity maps. The example dataset will be the same as in the online Brainstorm SEEG tutorial:
http://neuroimage.usc.edu/brainstorm/Tutorials/Epileptogenicity.

Teaching material available

 


Session 2: Monday afternoon - 2:00 to 5:00 PM

Frequency Tagging (steady state analysis) in EEG

Molly Henry

“Frequency tagging”, or steady-state analysis, traditionally refers to the practice of presenting a repetitive sensory stimulus (a flashing light or a tone sequence, for example) and then analyzing the brain’s representation of that stimulus in the frequency domain. This technique has been fruitful in the past for determining hearing thresholds, especially in individuals who cannot provide a behavioral response (e.g., infants), and for determining a person’s ability to selectively attend to one stimulus at the expense of another. More recently, the technique has been applied in the domain of neural entrainment, whereby neural oscillations become synchronized with rhythmic environmental stimuli. I will give a theoretical background comparing these approaches. Then, I will demonstrate how frequency-tagging analysis can be taken further, relating high-dimensional neural dynamics to moment-to-moment variations in perception and providing a powerful tool for inferring the neural “states” that underlie human perception.
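To make the frequency-domain side concrete, the sketch below (Python/NumPy, simulated data, arbitrary parameters) computes the amplitude spectrum of a signal containing a steady-state response and quantifies it as a signal-to-noise ratio at the tagging frequency relative to neighbouring frequency bins, one common way of scoring frequency-tagged responses.

```python
import numpy as np

srate, n_pnts, tag_freq = 250, 2500, 6.0        # Hz, 10 s of data, 6 Hz tag
times = np.arange(n_pnts) / srate
rng = np.random.default_rng(2)

# Simulated trial average: 6 Hz steady-state response buried in noise
signal = 0.5 * np.sin(2 * np.pi * tag_freq * times) + rng.standard_normal(n_pnts)

# Amplitude spectrum
amp = np.abs(np.fft.rfft(signal)) / n_pnts
freqs = np.fft.rfftfreq(n_pnts, 1 / srate)

# SNR at the tagging frequency: amplitude divided by the mean of neighbouring
# bins (skipping the two bins on each side of the tag frequency).
tag_bin = np.argmin(np.abs(freqs - tag_freq))
neighbours = np.r_[tag_bin - 12:tag_bin - 2, tag_bin + 3:tag_bin + 13]
snr = amp[tag_bin] / amp[neighbours].mean()
print(f"SNR at {tag_freq} Hz: {snr:.1f}")
```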

Teaching material available

Combining eye-tracking & EEG

Olaf Dimigen

During every waking hour, we move our eyes about 10,000 times. The combination of EEG recordings with eye-tracking is a promising approach to study visual cognition in such natural situations. This workshop will introduce students and researchers to this relatively new technique and its advantages, with a focus on data analysis. It will cover the following topics: properties of saccade- and fixation-related brain potentials, building a suitable laboratory setup, data synchronization and integration, optimal strategies for removing eye movement artifacts from the data, and the use of advanced linear deconvolution models to control for overlapping potentials and other confounds during natural vision. In hands-on exercises, we will analyze a combined dataset using the EYE-EEG toolbox and the brand-new unfold toolbox.

 

Human Neocortical Neurosolver: A New Tool for Cellular and Circuit Level Interpretation of EEG

Stephanie Jones, Sam Neymotin, Dylan Daniels

We developed the Human Neocortical Neurosolver (HNN), an open-source modeling tool designed to help researchers interpret cellular and circuit origins of EEG/MEG. HNN presents a user-friendly GUI to a biophysically principled model of a neocortical circuit, under thalamic and cortical drive, that simulates the primary electrical currents underlying EEG/MEG recordings. We will describe the model and teach participants how to study the origins of commonly measured signals, including event related potentials and low frequency rhythms (alpha/beta/gamma). Participants will learn how to compare model results to recorded data and to adjust parameters to develop and test hypotheses on circuit-level mechanisms.

Teaching material available

Inverted encoding models of EEG signals

Thomas C Sprague

In cognitive neuroscience, we’re often interested in understanding how cognitive operations impact mental representations. For example, how are neural representations transformed by visual attention, or how are they updated when manipulating the contents of working memory? I will walk through a recently developed analysis procedure I call an “inverted encoding model” that enables us to reconstruct representations of feature values (like spatial position or visual orientation) from single-trial neural activity patterns, including fMRI activation from individual regions of interest and evoked and induced scalp potentials measured with EEG.
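A minimal sketch of the two-step encode-then-invert logic, in Python with NumPy and simulated orientation data; the basis functions, matrix orientations and variable names are illustrative assumptions, not the exact procedure used in the tutorial.

```python
import numpy as np

rng = np.random.default_rng(3)
n_elec, n_chan, n_train, n_test = 64, 8, 200, 50

# Basis set: 8 orientation channels with raised-power cosine tuning curves
centers = np.arange(0, 180, 180 / n_chan)
def channel_resp(ori):
    d = ((ori - centers + 90) % 180) - 90        # wrapped orientation difference (deg)
    return np.cos(np.deg2rad(d)) ** 6

# Hypothesized channel responses for training and test stimuli
ori_train = rng.uniform(0, 180, n_train)
ori_test = rng.uniform(0, 180, n_test)
C1 = np.array([channel_resp(o) for o in ori_train]).T       # (n_chan, n_train)
C2 = np.array([channel_resp(o) for o in ori_test]).T

# Fake electrode data generated from a random weight matrix plus noise
W_true = rng.standard_normal((n_elec, n_chan))
B1 = W_true @ C1 + 0.5 * rng.standard_normal((n_elec, n_train))
B2 = W_true @ C2 + 0.5 * rng.standard_normal((n_elec, n_test))

# Step 1 (encoding): estimate weights from training data
W_hat = B1 @ C1.T @ np.linalg.inv(C1 @ C1.T)                 # (n_elec, n_chan)
# Step 2 (inversion): reconstruct channel responses for held-out trials
C2_hat = np.linalg.inv(W_hat.T @ W_hat) @ W_hat.T @ B2       # (n_chan, n_test)
```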


Session 3: Wednesday morning - 9:00 AM to 12:00 PM

Modeling EEG-behavior relationships

Valentin Wyart, Aurélien Weiss

This hands-on tutorial will present methods and code for relating EEG activity to behavior, a key analysis for characterizing the role of a neural signal of interest in cognition. The parametric modeling framework on which these brain-behavior analyses rest will first be presented and then applied to EEG datasets collected in different perceptual decision-making contexts. The goal of the tutorial is to outline the explanatory power of these methods and to show their generalizability to various fields of research in cognition (perception, decision-making, learning, memory). Participants are expected to have at least minimal experience with programming and basic knowledge of statistics.
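As one concrete instance of this parametric approach (an assumption for illustration, not the tutorial's own code), the sketch below regresses simulated single-trial EEG at each time point onto a trial-wise model variable, yielding a time course of regression weights that shows when the signal encodes the behavioral quantity.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(4)
n_trials, n_times = 300, 150

# Trial-wise parametric regressor, e.g. model-derived evidence strength
evidence = rng.standard_normal(n_trials)

# Simulated single-trial EEG at one electrode: evidence encoded around sample 80
eeg = rng.standard_normal((n_trials, n_times))
eeg[:, 70:90] += 0.8 * evidence[:, None]

# Regress the EEG at every time point onto the parametric regressor
X = evidence[:, None]
betas = np.zeros(n_times)
for t in range(n_times):
    betas[t] = LinearRegression().fit(X, eeg[:, t]).coef_[0]
# 'betas' traces when and how strongly the EEG relates to the behavioral variable
```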

Separating different alpha sources

Rasa Gulbinaite

Although the M/EEG community tends to agree on the existence of multiple generators of the alpha-band (7-13 Hz) rhythm, there is little agreement on ways to separate them. In this tutorial, you will learn analytical and experimental approaches that will allow you to isolate different alpha sources using: (1) independent component analysis (ICA), a multivariate source-separation technique that combines information from all the electrodes and identifies independent sources of activity based on the statistical structure of the data; (2) resonance responses to rhythmic visual stimulation. You will also learn how to determine the spectral properties of multiple alpha generators (peak frequency, width of the alpha band, etc.).
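A minimal illustration of the ICA-plus-spectrum idea in Python, using scikit-learn's FastICA and SciPy's Welch estimator on simulated multichannel data; in practice you would run ICA on real preprocessed EEG (e.g., in EEGLAB or MNE), so treat this purely as a sketch.

```python
import numpy as np
from scipy.signal import welch
from sklearn.decomposition import FastICA

srate, n_chan, n_pnts = 250, 32, 250 * 60        # one minute of 32-channel data
times = np.arange(n_pnts) / srate
rng = np.random.default_rng(5)

# Two simulated alpha generators with different peak frequencies, mixed into channels
src = np.vstack([np.sin(2 * np.pi * 9 * times), np.sin(2 * np.pi * 11.5 * times)])
mixing = rng.standard_normal((n_chan, 2))
data = mixing @ src + 0.5 * rng.standard_normal((n_chan, n_pnts))

# ICA unmixing (FastICA expects samples x features, hence the transposes)
ica = FastICA(n_components=10, random_state=0)
sources = ica.fit_transform(data.T).T            # (n_components, n_pnts)

# Spectral properties of each component: alpha peak frequency between 7 and 13 Hz
freqs, psd = welch(sources, fs=srate, nperseg=srate * 4)
alpha = (freqs >= 7) & (freqs <= 13)
peak_freqs = freqs[alpha][np.argmax(psd[:, alpha], axis=1)]
print(np.sort(peak_freqs))                       # the 9 and 11.5 Hz generators stand out
```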

Temporal response functions – Extraction of the neural response to continuous stimuli

Lorenz Fiedler

Going beyond conventional ERP designs (i.e., multi-trial averaging), encoding models allow the extraction of the neural response to continuously varying stimulus features, such as luminance or the speech envelope. In this tutorial, we will implement and discuss the extraction of the neural response to continuous stimulus features. First, we will obtain relevant stimulus features. Second, we will discuss how to preprocess the EEG data. Third, we will extract the neural response using several methods. Finally, we will test how well the extracted response predicts unknown data. I will prepare some data and code, but the participants are welcome to bring their own datasets.
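For orientation, here is one common way these steps can be implemented in Python: a time-lagged design matrix built from a continuous stimulus feature (a simulated envelope in this sketch), a ridge-regularized estimate of the temporal response function, and a prediction test on held-out data. Variable names and parameters are assumptions, not the tutorial's code.

```python
import numpy as np
from sklearn.linear_model import Ridge

srate, n_pnts = 100, 100 * 120                   # 2 minutes at 100 Hz
rng = np.random.default_rng(6)

# Continuous stimulus feature (e.g., a speech envelope) and a known "true" TRF
envelope = np.abs(rng.standard_normal(n_pnts))
lags = np.arange(0, 40)                          # 0-390 ms of lags at 100 Hz
true_trf = np.exp(-(lags - 15) ** 2 / 20)        # response peaking around 150 ms
eeg = np.convolve(envelope, true_trf)[:n_pnts] + 2 * rng.standard_normal(n_pnts)

# Time-lagged design matrix: column k holds the stimulus delayed by k samples
X = np.zeros((n_pnts, lags.size))
for k in lags:
    X[k:, k] = envelope[:n_pnts - k]

# Fit the TRF on the first half of the data, predict the second half
half = n_pnts // 2
model = Ridge(alpha=100.0).fit(X[:half], eeg[:half])
pred = model.predict(X[half:])
r = np.corrcoef(pred, eeg[half:])[0, 1]          # prediction accuracy on unseen data
print(f"TRF prediction r = {r:.2f}")
```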

Teaching material available

Making sense of (large amounts of) human intracranial EEG data

Jean-Philippe Lachaux

In this workshop, we will learn how to make sense of the large-scale cortical dynamics supporting cognitive functions using intracranial EEG data from epileptic patients. I will rely on a novel iEEG visualization software package (HiBoP) and a large iEEG dataset that will be freely distributed within the Human Brain Project (including visual and auditory perception, attention, language and memory tasks). The demonstration will mostly focus on the issue of comparative timing (relative latencies of activation across cortical sites) and functional connectivity, as revealed by amplitude-amplitude co-fluctuations.

Teaching material available soon (we are still debugging it)