Categories
Adaptive Control Ageing Auditory Cortex Auditory Neuroscience EEG / MEG Evoked Activity Executive Functions Neural Oscillations Neural Phase Papers Perception Publications

New paper in press: Henry et al., Nature Communications

Here comes a new paper in Nature Communications by former AC postdoc Molly Henry, with fellow former AC postdoc Björn Herrmann, our tireless lab manager Dunja Kunke, and myself! It is a late (to us quite important) result from our lab's tenure at the Max Planck in Leipzig.

Henry, M.J., Herrmann, B., Kunke, D., & Obleser, J. (in press). Aging affects the balance of neural entrainment and top-down neural modulation in the listening brain. Nature Communications.

—Congratulations, Molly!

Categories
Attention Auditory Cortex Auditory Neuroscience Auditory Speech Processing EEG / MEG Papers Psychology Publications

New paper in press in Journal of Neural Engineering: Fiedler et al. on in-ear-EEG and the focus of auditory attention

Towards a brain-controlled hearing aid: PhD student Lorenz Fiedler shows how attended and ignored auditory streams are differently represented in the neural responses and how the focus of auditory attention can be extracted from EEG signals recorded at electrodes placed inside the ear canal and around the ear.

Abstract
Objective. Conventional, multi-channel scalp electroencephalography (EEG) allows the identification of the attended speaker in concurrent-listening ('cocktail party') scenarios. This implies that EEG might provide valuable information to complement hearing aids with some form of EEG and to install a level of neuro-feedback. Approach. To investigate whether a listener's attentional focus can be detected from single-channel hearing-aid-compatible EEG configurations, we recorded EEG from three electrodes inside the ear canal ('in-Ear-EEG') and additionally from 64 electrodes on the scalp. In two different, concurrent listening tasks, participants (n = 7) were fitted with individualized in-Ear-EEG pieces and were either asked to attend to one of two dichotically-presented, concurrent tone streams or to one of two diotically-presented, concurrent audiobooks. A forward encoding model was trained to predict the EEG response at single EEG channels. Main results. Each individual participant's attentional focus could be detected from single-channel EEG responses recorded from short-distance configurations consisting only of a single in-Ear-EEG electrode and an adjacent scalp-EEG electrode. The differences in neural responses to attended and ignored stimuli were consistent in morphology (i.e. polarity and latency of components) across subjects. Significance. In sum, our findings show that the EEG response from a single-channel, hearing-aid-compatible configuration provides valuable information to identify a listener's focus of attention.
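For readers curious what a forward encoding model looks like in practice, here is a minimal sketch (not the authors' actual pipeline; the toy data, lag range, and ridge parameter are all illustrative): a ridge-regularized regression that predicts a single EEG channel from time-lagged copies of the stimulus envelope.

```python
import numpy as np

def lag_matrix(env, lags):
    """Design matrix of time-lagged copies of the stimulus envelope."""
    n = len(env)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        X[lag:, j] = env[:n - lag]  # column j holds env shifted by `lag` samples
    return X

def fit_trf(env, eeg, lags, lam=1.0):
    """Ridge-regularized forward model (temporal response function)."""
    X = lag_matrix(env, lags)
    XtX = X.T @ X + lam * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ eeg)

# Toy data: a single EEG channel that follows the envelope at a 5-sample lag
rng = np.random.default_rng(0)
env = rng.standard_normal(1000)
eeg = np.roll(env, 5) + 0.1 * rng.standard_normal(1000)

lags = np.arange(0, 20)
w = fit_trf(env, eeg, lags)            # w peaks at the true lag (5 samples)
pred = lag_matrix(env, lags) @ w
r = np.corrcoef(pred, eeg)[0, 1]       # prediction accuracy on training data
```

In the attention-decoding setting, a model like this is fitted per channel and per stream; the stream whose model predicts the measured EEG best is taken as the attended one.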
Categories
Adaptive Control Attention Auditory Cortex Auditory Neuroscience Auditory Perception Auditory Speech Processing Degraded Acoustics EEG / MEG Evoked Activity Executive Functions Neural Oscillations Noise-Vocoded Speech Papers Perception Psychology Publications Speech

New paper in press in Cerebral Cortex: Wöstmann et al. on ignoring degraded speech

Auditory Cognition's own Malte Wöstmann is in press in Cerebral Cortex with his latest offering on how attentional control manifests in alpha power changes: Ignoring speech can be beneficial (if comprehending speech potentially detracts from another task), and we here show how this change in listening goals reverses the pattern of alpha-power changes with changing speech degradation. (We will update as the paper becomes available online.)

Wöstmann, M., Lim, S.-J., & Obleser, J. (2017). The human neural alpha response to speech is a proxy of attentional control. Cerebral Cortex. In press.

 

Abstract
Human alpha (~10 Hz) oscillatory power is a prominent neural marker of cognitive effort. When listeners attempt to process and retain acoustically degraded speech, alpha power is enhanced. It is unclear whether these alpha modulations reflect the degree of acoustic degradation per se or the degradation-driven demand on a listener's attentional control. Using an irrelevant-speech paradigm in electroencephalography (EEG), the current experiment demonstrates that the neural alpha response to speech is a surprisingly clear proxy of top-down control, entirely driven by the listening goals of attending versus ignoring degraded speech. While (n = 23) listeners retained the serial order of 9 to-be-recalled digits, one to-be-ignored sentence was presented. Distractibility of the to-be-ignored sentence parametrically varied in acoustic detail (noise-vocoding), with more acoustic detail of distracting speech increasingly disrupting listeners' serial memory recall. Where previous studies had observed decreases in parietal and auditory alpha power with more acoustic detail (of target speech), alpha power here showed the opposite pattern and increased with more acoustic detail in the speech distractor. In sum, the neural alpha response reflects almost exclusively a listener's exertion of attentional control, which is decisive for whether more acoustic detail facilitates comprehension (of attended speech) or enhances distraction (of ignored speech).
Categories
Auditory Cortex Auditory Neuroscience Editorial Notes Neural Oscillations Papers Psychology

Story time: Henry & Obleser (2012) revisited

Story time: Some time in early 2011, I sat down with an American fresh PhD graduate who had just joined my new lab, in a Leipzig bar (Café Cantona; if you are interested, you can find this great 24/7 bar with exquisite food also in the acknowledgments of, e.g., Obleser & Eisner, Trends Cogn Sci, 2009).
To this day, I could still point you to the table she and I sat down at, and the wall I faced (which is notable because we actually spent an unhealthy amount of time and money there over the years). Soon thereafter, we grabbed a beer mat, started scribbling waves and marking where we would place so-called targets (psychologist lingo), and talked a lot of gibberish about frequency modulation. I remember vividly that I had just read an insanely long review paper on neural oscillations by Wolfgang Klimesch (that, more in passing, cited old-school tales of Schmitt filters by the late great Francisco Varela, or pioneers sounding like record producers, Dustman & Beck, 1965), while the young American opposite me turned out to be an—if adventurous—die-hard expert on auditory psychophysics.

Who would have thought that this very night would carry me towards tenure in three years' time, and her around the globe as an esteemed young colleague.
When I nowadays check Google Scholar, I am amazed to see that more than 100 other papers have already cited what directly grew out of that beer mat one and a half years later—not counting the many more papers said postdoc, Molly Henry, has produced since.

Here is the link to how excited we were when the paper appeared in PNAS in 2012, and a link to the little movie a German science program kindly produced on all of this in 2013.

Categories
Auditory Cortex Auditory Perception Cross-Modal Integration EEG / MEG Neural Oscillations Perception

New paper out: Plöchl, Gaston, Mermagen, König & Hairston, Scientific Reports

An article by our new AC group member Michael Plöchl from his PhD project in Osnabrück has been accepted for publication in Scientific Reports. In their study, Plöchl, Gaston, Mermagen, König and Hairston demonstrate that "Oscillatory activity in auditory cortex reflects the perceptual level of audio-tactile integration".


Abstract
Cross-modal interactions between sensory channels have been shown to depend on both the spatial disparity and the perceptual similarity between the presented stimuli. Here we investigate the behavioral and neural integration of auditory and tactile stimulus pairs at different levels of spatial disparity. Additionally, we modulated the amplitudes of both stimuli in either a coherent or non-coherent manner. We found that both auditory and tactile localization performance was biased towards the stimulus in the respective other modality. This bias linearly increases with stimulus disparity and is more pronounced for coherently modulated stimulus pairs. Analyses of electroencephalographic (EEG) activity at temporal–cortical sources revealed enhanced event-related potentials (ERPs) as well as decreased alpha and beta power during bimodal as compared to unimodal stimulation. However, while the observed ERP differences are similar for all stimulus combinations, the extent of oscillatory desynchronization varies with stimulus disparity. Moreover, when both stimuli were subjectively perceived as originating from the same direction, the reduction in alpha and beta power was significantly stronger. These observations suggest that in the EEG the level of perceptual integration is mainly reflected by changes in ongoing oscillatory activity.
Categories
Auditory Cortex Auditory Perception Media Neural Oscillations Papers Publications Uncategorized

New featurette in eLife: Tell me something I don't know

For those interested in auditory cortex and how a regime of predictions, prediction updates and surprise (a version of "prediction error") might be implemented there, I contributed a brief featurette ("insight", they call it) to eLife on a recent paper by Will Sedley, Tim Griffiths, and others. Check it out.

[For those not so familiar with it, "eLife", despite its aesthetically questionable name, represents an interesting and relatively new, high-profile, open-access publishing effort by Nobel-prize-winning Randy Schekman, former SfN president Eve Marder, and others.]
Categories
Auditory Cortex Auditory Neuroscience Auditory Perception Auditory Speech Processing Editorial Notes EEG / MEG Executive Functions Neural Oscillations Neural Phase Papers Publications Speech Uncategorized

[UPDATE] New paper in PNAS: Spatiotemporal dynamics of auditory attention synchronize with speech, Woestmann et al.

Wöstmann, Herrmann, Maess and Obleser demonstrate that the hemispheric lateralization of neural alpha oscillations measured in the magnetoencephalogram (MEG) synchronizes with the speech signal and predicts listeners' speech comprehension.

Now available online:

http://www.pnas.org/content/early/2016/03/18/1523357113

Press release:

https://www.uni-luebeck.de/forschung/aktuelles-zur-forschung/aktuelles-zur-forschung/artikel/aufmerksamkeit-in-wellen-erfolgreich-zuhoeren-im-rhythmus-der-sprache.html


Abstract
Attention plays a fundamental role in selectively processing stimuli in our environment despite distraction. Spatial attention induces increasing and decreasing power of neural alpha oscillations (8–12 Hz) in brain regions ipsilateral and contralateral to the locus of attention, respectively. This study tested whether the hemispheric lateralization of alpha power codes not just the spatial location but also the temporal structure of the stimulus. Participants attended to spoken digits presented to one ear and ignored tightly synchronized distracting digits presented to the other ear. In the magnetoencephalogram, spatial attention induced lateralization of alpha power in parietal, but notably also in auditory cortical regions. This alpha power lateralization was not maintained steadily but fluctuated in synchrony with the speech rate and lagged the time course of low-frequency (1–5 Hz) sensory synchronization. Higher amplitude of alpha power modulation at the speech rate was predictive of a listener's enhanced performance of stream-specific speech comprehension. Our findings demonstrate that alpha power lateralization is modulated in tune with the sensory input and acts as a spatiotemporal filter controlling the read-out of sensory content.
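As a toy illustration of the alpha-lateralization measure at the heart of such analyses, here is a minimal sketch (the (right − left)/(right + left) index is a common convention, not necessarily the exact formula used in the paper; the simulated signals are purely illustrative):

```python
import numpy as np

def alpha_power(sig, fs, band=(8, 12)):
    """Mean spectral power within the alpha band, via an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    spec = np.abs(np.fft.rfft(sig)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spec[mask].mean()

def lateralization_index(left, right, fs):
    """(right - left) / (right + left) alpha power; sign indicates the
    hemifield with relatively more alpha."""
    p_l, p_r = alpha_power(left, fs), alpha_power(right, fs)
    return (p_r - p_l) / (p_r + p_l)

# Attend-left toy example: stronger 10-Hz alpha over the left sensors
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
left = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(len(t))
right = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(len(t))

ali = lateralization_index(left, right, fs)  # negative: more alpha on the left
```

Tracking this index over time (e.g. in a sliding window) is what lets one ask, as the paper does, whether the lateralization itself fluctuates at the speech rate.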
Categories
Auditory Cortex Auditory Neuroscience Auditory Perception EEG / MEG Neural Oscillations Papers Publications Speech

New paper: Herrmann, Henry, Haegens & Obleser in NeuroImage

And again: AC alumnus Björn Herrmann has a new paper in press (online at NeuroImage) on

Temporal expectations and neural amplitude fluctuations in auditory cortex interactively influence perception

Abstract
Alignment of neural oscillations with temporally regular input allows listeners to generate temporal expectations. However, it remains unclear how behavior is governed in the context of temporal variability: What role do temporal expectations play, and how do they interact with the strength of neural oscillatory activity? Here, human participants detected near-threshold targets in temporally variable acoustic sequences. Temporal expectation strength was estimated using an oscillator model, and pre-target neural amplitudes in auditory cortex were extracted from magnetoencephalography signals. Temporal expectations modulated target-detection performance, but only when neural delta-band amplitudes were large. Thus, slow neural oscillations act to gate influences of temporal expectation on perception. Furthermore, slow amplitude fluctuations governed linear and quadratic influences of auditory alpha-band activity on performance. By fusing a model of temporal expectation with neural oscillatory dynamics, the current findings show that human perception in temporally variable contexts relies on complex interactions between multiple neural frequency bands.
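The idea of deriving "temporal expectation strength" from an oscillator can be caricatured in a few lines. This is a purely illustrative sketch (a resultant-vector measure of phase consistency, not the specific oscillator model fitted in the paper): a target that falls in phase with a regular sequence yields a stronger expectation than one following a jittered sequence.

```python
import numpy as np

def temporal_expectation(event_times, target_time, period):
    """Toy expectation strength at target_time: length of the mean resultant
    vector of oscillator phases implied by the preceding events."""
    phases = 2 * np.pi * (target_time - np.asarray(event_times)) / period
    return np.abs(np.mean(np.exp(1j * phases)))

period = 0.5  # assumed stimulus rate of 2 Hz, for illustration
regular = np.arange(0, 5, period)  # temporally regular sequence of onsets
jittered = regular + np.random.default_rng(2).uniform(-0.15, 0.15, len(regular))

e_reg = temporal_expectation(regular, 5.0, period)   # in phase: near 1
e_jit = temporal_expectation(jittered, 5.0, period)  # jittered: smaller
```

The paper's central interaction would then amount to asking whether such an expectation value predicts detection only on trials where pre-target delta-band amplitude is high.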

Cheers.

References

  • Herrmann, B., Henry, M.J., Haegens, S., & Obleser, J. (2015). Temporal expectations and neural amplitude fluctuations in auditory cortex interactively influence perception. NeuroImage, 124(Pt A), 487–497. PMID: 26386347.