Categories: Adaptive Control, Ageing, Auditory Cortex, Auditory Neuroscience, EEG / MEG, Evoked Activity, Executive Functions, Neural Oscillations, Neural Phase, Papers, Perception, Publications

New paper in press: Henry et al., Nature Communications

Here comes a new paper in Nature Communications by former AC postdoc Molly Henry, with fellow former postdoc and AC alumnus Björn Herrmann, our tireless lab manager Dunja Kunke, and myself! It is a late (and to us quite important) result from our lab's tenure at the Max Planck in Leipzig.

Henry, M.J., Herrmann, B., Kunke, D., & Obleser, J. (in press). Aging affects the balance of neural entrainment and top-down neural modulation in the listening brain. Nature Communications.

Congratulations, Molly!

Categories: Attention, Auditory Cortex, Auditory Neuroscience, Auditory Speech Processing, EEG / MEG, Papers, Psychology, Publications

New paper in press in Journal of Neural Engineering: Fiedler et al. on in-ear EEG and the focus of auditory attention

Towards a brain-controlled hearing aid: PhD student Lorenz Fiedler shows how attended and ignored auditory streams are differently represented in the neural responses, and how the focus of auditory attention can be extracted from EEG signals recorded at electrodes placed inside the ear canal and around the ear.

Abstract
Objective. Conventional, multi-channel scalp electroencephalography (EEG) allows the identification of the attended speaker in concurrent-listening ('cocktail party') scenarios. This implies that EEG might provide valuable information to complement hearing aids with some form of EEG and to install a level of neuro-feedback. Approach. To investigate whether a listener's attentional focus can be detected from single-channel, hearing-aid-compatible EEG configurations, we recorded EEG from three electrodes inside the ear canal ('in-ear EEG') and additionally from 64 electrodes on the scalp. In two different concurrent listening tasks, participants (n = 7) were fitted with individualized in-ear EEG pieces and were asked to attend either to one of two dichotically presented, concurrent tone streams or to one of two diotically presented, concurrent audiobooks. A forward encoding model was trained to predict the EEG response at single EEG channels. Main results. Each individual participant's attentional focus could be detected from single-channel EEG responses recorded from short-distance configurations consisting only of a single in-ear EEG electrode and an adjacent scalp-EEG electrode. The differences in neural responses to attended and ignored stimuli were consistent in morphology (i.e., polarity and latency of components) across subjects. Significance. In sum, our findings show that the EEG response from a single-channel, hearing-aid-compatible configuration provides valuable information to identify a listener's focus of attention.
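The forward encoding model named in the abstract can be illustrated with a generic sketch: ridge-regress a single EEG channel onto time-lagged copies of a stimulus envelope, then ask how well the resulting temporal response function predicts the EEG from the attended versus the ignored stream. This is a minimal sketch under those assumptions, not the authors' actual pipeline; all function names, lags, and the regularization parameter here are illustrative.

```python
import numpy as np

def lagged_design(envelope, max_lag):
    """Build a time-lagged design matrix: row t holds
    envelope[t], envelope[t-1], ..., envelope[t-max_lag+1]."""
    n = len(envelope)
    X = np.zeros((n, max_lag))
    for lag in range(max_lag):
        X[lag:, lag] = envelope[:n - lag]
    return X

def fit_forward_model(envelope, eeg, max_lag=32, alpha=1.0):
    """Ridge regression of one EEG channel onto the lagged envelope.
    Returns the temporal response function (TRF) weights."""
    X = lagged_design(envelope, max_lag)
    return np.linalg.solve(X.T @ X + alpha * np.eye(max_lag), X.T @ eeg)

def predict_and_score(envelope, eeg, trf):
    """Predict the EEG channel from the envelope via the TRF and
    return the Pearson correlation with the measured EEG."""
    pred = lagged_design(envelope, len(trf)) @ trf
    return np.corrcoef(pred, eeg)[0, 1]
```

In an attention-decoding setting, one would fit the model on training data and then compare `predict_and_score` for the attended and the ignored speech envelope: the envelope yielding the higher correlation is taken as the attended stream.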
Categories: Adaptive Control, Attention, Auditory Cortex, Auditory Neuroscience, Auditory Perception, Auditory Speech Processing, Degraded Acoustics, EEG / MEG, Evoked Activity, Executive Functions, Neural Oscillations, Noise-Vocoded Speech, Papers, Perception, Psychology, Publications, Speech

New paper in press in Cerebral Cortex: Wöstmann et al. on ignoring degraded speech

Auditory Cognition's own Malte Wöstmann is in press in Cerebral Cortex with his latest offering on how attentional control manifests in alpha power changes: Ignoring speech can be beneficial (if comprehending speech potentially detracts from another task), and we here show how this change in listening goals reverses the pattern of alpha-power changes with changing speech degradation. (We will update as the paper becomes available online.)

Wöstmann, M., Lim, S.J., & Obleser, J. (2017). The human neural alpha response to speech is a proxy of attentional control. Cerebral Cortex. In press.

Abstract
Human alpha (~10 Hz) oscillatory power is a prominent neural marker of cognitive effort. When listeners attempt to process and retain acoustically degraded speech, alpha power is enhanced. It is unclear whether these alpha modulations reflect the degree of acoustic degradation per se or the degradation-driven demand on a listener's attentional control. Using an irrelevant-speech paradigm in electroencephalography (EEG), the current experiment demonstrates that the neural alpha response to speech is a surprisingly clear proxy of top-down control, entirely driven by the listening goals of attending versus ignoring degraded speech. While listeners (n = 23) retained the serial order of 9 to-be-recalled digits, one to-be-ignored sentence was presented. Distractibility of the to-be-ignored sentence varied parametrically in acoustic detail (noise-vocoding), with more acoustic detail of distracting speech increasingly disrupting listeners' serial memory recall. Where previous studies had observed decreases in parietal and auditory alpha power with more acoustic detail (of target speech), alpha power here showed the opposite pattern and increased with more acoustic detail in the speech distractor. In sum, the neural alpha response reflects almost exclusively a listener's exertion of attentional control, which is decisive for whether more acoustic detail facilitates comprehension (of attended speech) or enhances distraction (of ignored speech).
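For readers unfamiliar with the dependent measure: "alpha power" is simply the signal power in the roughly 8-12 Hz band. A minimal sketch of how one might estimate it from a single EEG channel via a plain periodogram; published analyses typically use multi-taper or wavelet methods with baseline correction, and the function name and band limits here are illustrative.

```python
import numpy as np

def bandpower(signal, fs, band=(8.0, 12.0)):
    """Estimate power in a frequency band from a single-channel signal
    using the periodogram (squared magnitude of the real FFT)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    # Integrate the PSD over the band (rectangle rule over FFT bins)
    return psd[mask].sum() * (freqs[1] - freqs[0])
```

Comparing `bandpower` across experimental conditions (attend vs. ignore, or levels of noise-vocoding) is the kind of contrast the abstract describes.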
Categories: Auditory Cortex, Auditory Neuroscience, Editorial Notes, Neural Oscillations, Papers, Psychology

Story time: Henry & Obleser (2012) revisited

Story time: Some time in early 2011, I sat down with an American fresh PhD graduate, who had just joined my new lab, in a Leipzig bar (Café Cantona; if you are interested, you can find this great 24/7 bar with exquisite food also in the acknowledgments of, e.g., Obleser & Eisner, Trends Cogn Sci, 2009).
To this day, I could still point you to the table she and I sat down at, and the wall I faced (which is notable because we actually spent an unhealthy amount of time and money there over the years). Soon thereafter, we grabbed a beer mat, started scribbling waves, marked where we would place so-called targets (psychologist lingo), and talked a lot of gibberish about frequency modulation. I remember vividly that I had just read an insanely long review paper on neural oscillations by Wolfgang Klimesch (which, more in passing, cited old-school tales of Schmitt filters by the late great Francesco Varela, or pioneers sounding like record producers, Dustman & Beck, 1965), while the young American opposite me turned out to be a die-hard, if adventurous, expert on auditory psychophysics.

Who would have thought that this very night would carry me towards tenure in three years' time, and her around the globe as an esteemed young colleague.
When I nowadays check Google Scholar, I am amazed to see that more than 100 other papers have already cited what directly grew out of that beer mat one and a half years later, not counting the many more papers said postdoc, Molly Henry, has produced since.

Here is the link to how excited we were when the paper appeared in PNAS in 2012, and a link to the little movie a German science program kindly produced on all of this in 2013.

Categories: Adaptive Control, Auditory Neuroscience, EEG / MEG, Evoked Activity, Hearing Loss, Neural Phase, Perception, Preprints (not peer-reviewed yet), Publications, Speech, Uncategorized

New preprint: Fiedler et al. on predicting the focus of attention from in-ear EEG

Very proud: PhD student Lorenz Fiedler goes live (pre-peer-review) with his work on predicting the focus of attention with single-channel forward models in in-ear EEG!
Here is the preprint of the paper, which will now undergo peer review. Thanks for checking it out!


Categories: Auditory Speech Processing, EEG / MEG, Papers, Speech

New Review Paper out: Wöstmann, Fiedler & Obleser in Language, Cognition and Neuroscience

A review article for those interested in how to use magneto-/electroencephalography (M/EEG) to study speech comprehension. We provide a historically informed overview of dependent measures in the time and frequency domains, highlight recent advances resulting from these measures, and review the notorious challenges and solutions speech and language researchers face when studying electrophysiological brain responses.

Now available online:

http://www.tandfonline.com/doi/full/10.1080/23273798.2016.1262051

Abstract

Magneto- and electroencephalographic (M/EEG) signals recorded from the human scalp have allowed for substantial advances in neural models of speech comprehension over the past decades. These methods are currently advancing rapidly and continue to offer unparalleled insight into the near-to-real-time neural dynamics of speech processing. We provide a historically informed overview of dependent measures in the time and frequency domain and highlight recent advances resulting from these measures. We discuss the notorious challenges (and solutions) speech and language researchers are faced with when studying auditory brain responses in M/EEG. We argue that a key to understanding the neural basis of speech comprehension will lie in studying interactions between the neural tracking of speech and functional neural network dynamics. This article is intended both for non-experts who want to learn how to use M/EEG to study speech comprehension and for scholars aiming for an overview of state-of-the-art M/EEG analysis methods.

Categories: Auditory Cortex, Auditory Perception, Cross-Modal Integration, EEG / MEG, Neural Oscillations, Perception

New paper out: Plöchl, Gaston, Mermagen, König & Hairston, Scientific Reports

An article by our new AC group member Michael Plöchl, from his PhD project in Osnabrück, has been accepted for publication in Scientific Reports. In their study, Plöchl, Gaston, Mermagen, König and Hairston demonstrate that “Oscillatory activity in auditory cortex reflects the perceptual level of audio-tactile integration”.


Abstract
Cross-modal interactions between sensory channels have been shown to depend on both the spatial disparity and the perceptual similarity between the presented stimuli. Here we investigate the behavioral and neural integration of auditory and tactile stimulus pairs at different levels of spatial disparity. Additionally, we modulated the amplitudes of both stimuli in either a coherent or non-coherent manner. We found that both auditory and tactile localization performance was biased towards the stimulus in the respective other modality. This bias increases linearly with stimulus disparity and is more pronounced for coherently modulated stimulus pairs. Analyses of electroencephalographic (EEG) activity at temporal–cortical sources revealed enhanced event-related potentials (ERPs) as well as decreased alpha and beta power during bimodal as compared to unimodal stimulation. However, while the observed ERP differences are similar for all stimulus combinations, the extent of oscillatory desynchronization varies with stimulus disparity. Moreover, when both stimuli were subjectively perceived as originating from the same direction, the reduction in alpha and beta power was significantly stronger. These observations suggest that in the EEG the level of perceptual integration is mainly reflected by changes in ongoing oscillatory activity.
Categories: Auditory Cortex, Auditory Perception, Media, Neural Oscillations, Papers, Publications, Uncategorized

New featurette in eLife: Tell me something I don’t know

For those interested in auditory cortex and how a regime of predictions, prediction updates and surprise (a version of “prediction error”) might be implemented there, I contributed a brief featurette (“Insight”, they call it) to eLife on a recent paper by Will Sedley, Tim Griffiths, and others. Check it out.

[For those not so familiar with it: “eLife”, despite its aesthetically questionable name, is an interesting and relatively new, high-profile, open-access publishing effort led by the Nobel-prize-winning Randy Schekman, former SfN president Eve Marder, and others.]