Categories: Attention, Auditory Neuroscience, EEG / MEG, Papers, Publications, Speech Tracking, Unilateral Vocoding

New Paper in Trends in Hearing by Kraus et al.

Frauke Kraus, Sarah Tune, Anna Ruhe, Jonas Obleser & Malte Wöstmann demonstrate that unilateral acoustic degradation delays attentional separation of competing speech.

Unilateral cochlear implant (CI) users have to integrate acoustically intact speech at one ear with acoustically degraded speech at the other ear. How do unilateral acoustic degradation and spatial attention interact in a multi-talker situation?
N = 22 participants took part in a competing-listening experiment in which they attended to an intact audiobook while being distracted by an acoustically degraded audiobook, and vice versa. Speech tracking revealed that attentional separation of acoustically degraded speech was not reduced per se, but rather delayed in time relative to intact speech. These findings might explain the listening challenges experienced by unilateral CI users.
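For readers curious what "speech tracking" boils down to in practice, here is a minimal, hypothetical sketch (not the authors' pipeline): it cross-correlates the attended and ignored speech envelopes with a single EEG channel over a range of lags and compares the lag at which tracking peaks. All data and names are simulated stand-ins; the actual study used far richer multichannel response-function models.

```python
# Hypothetical sketch: how strongly, and at which lag, does one EEG channel
# track the attended vs. the ignored speech envelope? Simulated data only.
import numpy as np

fs = 125                                   # sampling rate of envelopes and EEG (Hz)
n = fs * 60                                # one minute of simulated data
rng = np.random.default_rng(0)

env_attended = np.abs(rng.standard_normal(n))   # stand-in speech envelopes
env_ignored = np.abs(rng.standard_normal(n))

# Fake EEG that tracks the attended envelope at ~150 ms and the ignored one at ~250 ms
eeg = (np.roll(env_attended, int(0.15 * fs))
       + 0.5 * np.roll(env_ignored, int(0.25 * fs))
       + rng.standard_normal(n))

def tracking_curve(envelope, eeg, fs, max_lag_s=0.5):
    """Correlation between envelope and EEG for EEG lags of 0 .. max_lag_s seconds."""
    m = len(envelope)
    env_z = (envelope - envelope.mean()) / envelope.std()
    eeg_z = (eeg - eeg.mean()) / eeg.std()
    lags = np.arange(int(max_lag_s * fs))
    r = np.array([np.mean(env_z[:m - lag] * eeg_z[lag:]) for lag in lags])
    return lags / fs, r

lags_s, r_att = tracking_curve(env_attended, eeg, fs)
_, r_ign = tracking_curve(env_ignored, eeg, fs)

print(f"attended tracking peaks at {lags_s[np.argmax(r_att)] * 1e3:.0f} ms")
print(f"ignored tracking peaks at  {lags_s[np.argmax(r_ign)] * 1e3:.0f} ms")
# A delayed attentional separation would show up as the difference r_att - r_ign
# emerging only at later lags for degraded (vs. intact) attended speech.
```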

To learn more, the paper is available here.

Categories: Editorial Notes, Neural Oscillations, Neural Phase, Papers, Uncategorized

A quiet innovator: Peter Lakatos (1972–2021)

Our dear colleague and collaborator Peter Lakatos passed away suddenly two months ago. With Peter's untimely death at the age of 49, neuroscience has suffered an unimaginable loss.
It has been an honour and privilege to contribute Peter Lakatos' obituary to Nature Neuroscience.

— Jonas Obleser

The picture shows Peter just after or during his talk at our SNAP 2013 workshop at the Max Planck Institute in Leipzig. Incidentally, this is also the talk I referenced in my recent obituary, linked above.

Categories: Ageing, Auditory Cortex, Auditory Neuroscience, EEG / MEG, Hearing Loss, Neural Filters, Papers, Publications

New paper in Nature Communications by Tune et al.

We are very excited to share that Obleserlab postdoc Sarah Tune has a new paper in Nature Communications. "Neural attentional-filter mechanisms of listening success in middle-aged and older participants" is our latest and to-date most extensive output of the longitudinal ERC Consolidator project on adaptive listening in ageing individuals (AUDADAPT).

This co-production with current (Mohsen Alavash and Jonas Obleser) and former (Lorenz Fiedler) Obleserlab members takes an in-depth and integrative look at how two of the most extensively studied neurobiological attentional-filter implementations, alpha power lateralization and selective neural speech tracking, relate to one another and to listening success.

Leveraging our large, representative sample of aging listeners (N = 155, 39–80 years), we show that both neural filter implementations are robustly modulated by attention but operate surprisingly independently of one another.

In a series of single-trial linear models that account for variation in neural filter strength within and between individuals, we demonstrate how the preferential neural tracking of attended versus ignored speech, but not alpha lateralization, boosts listening success.
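As a rough illustration of what such a single-trial model can look like, the sketch below separates within- from between-participant variation in a simulated neural speech-tracking measure by person-mean centering and fits a linear mixed-effects model. Variable names and data are hypothetical, and the model is a strong simplification of the published analysis.

```python
# Hypothetical sketch: single-trial mixed model with within/between decomposition
# of a neural-filter predictor via person-mean centering. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subjects, n_trials = 30, 100
subject = np.repeat(np.arange(n_subjects), n_trials)

# Trial-wise tracking measure with stable between-person differences added in
tracking = rng.standard_normal(n_subjects * n_trials) + rng.standard_normal(n_subjects)[subject]

# Simulated single-trial performance that depends weakly on trial-wise tracking
accuracy = 0.7 + 0.05 * (tracking - tracking.mean()) + 0.1 * rng.standard_normal(len(tracking))

df = pd.DataFrame({"subject": subject, "tracking": tracking, "accuracy": accuracy})
# Decompose the predictor: between-person mean and within-person (trial-wise) deviation
df["tracking_between"] = df.groupby("subject")["tracking"].transform("mean")
df["tracking_within"] = df["tracking"] - df["tracking_between"]

model = smf.mixedlm("accuracy ~ tracking_within + tracking_between",
                    data=df, groups=df["subject"])
print(model.fit().summary())
```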

To learn more, the paper is available here.

Categories: Ageing, EEG / MEG, fMRI, Papers, Publications

New Perspective paper in Neuron by Waschke et al.

We are excited to share that former Obleserlab PhD student Leo Waschke, together with his new (Doug Garrett, Niels Kloosterman) and old (Jonas Obleser) labs, has published an in-depth perspective piece in Neuron, with the provocative title "Behavior needs neural variability".
Our article is essentially a long and extensive tribute to the "second moment" of neural activity, in statistical terms: Variability, be it quantified as variance, entropy, or spectral slope, is the long-neglected twin of averages, and it holds great promise for understanding neural states (how does neural activity differ from one moment to the next?) and traits (how do individuals differ from each other?).
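To make the "second moment" concrete, here is a small, hypothetical sketch that computes three such variability measures (variance, Shannon entropy of the amplitude distribution, and the spectral 1/f slope) for a simulated signal. Real analyses would use more refined estimators; the bin count and fit range below are arbitrary choices.

```python
# Hypothetical sketch: three variability descriptors of a neural time series.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(2)
fs = 250
x = np.cumsum(rng.standard_normal(fs * 30))     # brown-noise stand-in for neural activity
x = x - x.mean()

variance = np.var(x)

# Shannon entropy (in bits) of the binned amplitude distribution
counts, _ = np.histogram(x, bins=64)
p = counts / counts.sum()
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))

# Spectral slope: linear fit of log power vs. log frequency (fit range is a free choice)
freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
keep = (freqs >= 1) & (freqs <= 40)
slope = np.polyfit(np.log10(freqs[keep]), np.log10(psd[keep]), 1)[0]

print(f"variance: {variance:.2f}, entropy: {entropy:.2f} bits, spectral slope: {slope:.2f}")
```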
Congratulations, Leo!

Categories: Auditory Neuroscience, Auditory Perception, EEG / MEG, Papers, Perception, Uncategorized

New paper in press in eLife: Waschke et al.

Obleserlab senior PhD student Leo Waschke, alongside co-authors Sarah Tune and Jonas Obleser, has a new paper in eLife.

The processing of sensory information from our environment is not constant but rather varies with changes in ongoing brain activity, or brain states. Thus, the acuity of perceptual decisions also depends on the brain state during which sensory information is processed. Recent work in non-human animals suggests two key processes that shape brain states relevant for sensory processing and perceptual performance. On the one hand, the momentary level of neural desynchronization in sensory cortical areas has been shown to impact neural representations of sensory input and related performance. On the other hand, the current level of arousal and related noradrenergic activity has been linked to changes in sensory processing and perceptual acuity.

However, it is unclear at present whether local neural desynchronization and arousal constitute distinct brain states that entail varying consequences for sensory processing and behaviour, or whether they represent two interrelated manifestations of ongoing brain activity that jointly affect behaviour. Furthermore, the exact shape of the relationship between perceptual performance and each of these brain-state markers (e.g., linear vs. quadratic) remains unknown.

In order to transfer findings from animal physiology to human cognitive neuroscience and to test the exact shape of the unique as well as shared influences of local cortical desynchronization and global arousal on sensory processing and perceptual performance, we recorded electroencephalography and pupillometry in 25 human participants while they performed a challenging auditory discrimination task.

Importantly, auditory stimuli were selectively presented during periods of especially high or low auditory cortical desynchronization, as approximated by an information-theoretic measure of time-series complexity (weighted permutation entropy). By means of a closed-loop real-time setup, we were not only able to present stimuli during different desynchronization states but also made sure to sample the whole distribution of such states, a prerequisite for the accurate assessment of brain-behaviour relationships. The recorded pupillometry data additionally enabled us to draw inferences regarding the current level of arousal, due to the established link between noradrenergic activity and pupil size.
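For illustration, the sketch below implements a basic version of weighted permutation entropy, with variance-weighted ordinal patterns and normalization to the range 0 to 1. The embedding order and delay are arbitrary illustration values, not the parameters used in the study.

```python
# Hypothetical sketch: weighted permutation entropy (WPE) of a 1-D signal.
import numpy as np
from itertools import permutations

def weighted_permutation_entropy(x, order=3, delay=1):
    """WPE of a 1-D signal, normalized to [0, 1]."""
    x = np.asarray(x, dtype=float)
    n_windows = len(x) - (order - 1) * delay
    # Build all embedding windows as rows of shape (n_windows, order)
    windows = np.stack([x[i * delay: i * delay + n_windows] for i in range(order)], axis=1)
    patterns = {p: i for i, p in enumerate(permutations(range(order)))}
    weights = np.var(windows, axis=1)          # each window's weight = its variance
    acc = np.zeros(len(patterns))
    for w, win in zip(weights, windows):
        acc[patterns[tuple(int(i) for i in np.argsort(win))]] += w
    p = acc[acc > 0] / acc.sum()
    return -np.sum(p * np.log2(p)) / np.log2(len(patterns))

rng = np.random.default_rng(3)
print("white noise:", round(weighted_permutation_entropy(rng.standard_normal(5000)), 3))
print("sine wave:  ", round(weighted_permutation_entropy(np.sin(np.linspace(0, 100, 5000))), 3))
# In a closed-loop setup, a running estimate like this over the most recent EEG
# segment would decide whether the next stimulus is presented during a high- or
# low-desynchronization state.
```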

 

Single-trial analyses of EEG activity, pupillometry and behaviour revealed clearly dissociable influences of both brain-state markers on ongoing brain activity, early sound-related activity and behaviour. High desynchronization states were characterized by a pronounced reduction in oscillatory power across a wide frequency range, while high arousal states coincided with a decrease in oscillatory power that was limited to high frequencies. Similarly, early sound-evoked activity was differentially impacted by auditory cortical desynchronization and pupil-linked arousal. Phase-locked responses and evoked gamma power increased with local desynchronization, with a tendency to saturate at intermediate levels. Post-stimulus low-frequency power, on the other hand, increased with pupil-linked arousal.

Most importantly, local desynchronization and pupil-linked arousal displayed different relationships with perceptual performance. While participants performed fastest and least biased following intermediate levels of auditory cortical desynchronization, intermediate levels of pupil-linked arousal were associated with the highest sensitivity. Thus, although both processes constitute behaviourally relevant brain states that affect perceptual performance following an inverted U, they impact distinct subdomains of behaviour. Taken together, our results speak to a model in which independent states of local desynchronization and global arousal jointly shape states for optimal sensory processing and perceptual performance. The published manuscript including all supplemental information can be found here.
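The inverted-U claim can be made concrete with a toy regression: include the squared brain-state marker as a predictor, and a reliably negative quadratic coefficient indicates best performance at intermediate states. The sketch below uses simulated data and ordinary least squares, which is much simpler than the single-trial models reported in the paper.

```python
# Hypothetical sketch: testing an inverted-U brain-state/performance relationship.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
state = rng.standard_normal(2000)                    # z-scored brain-state marker
sensitivity = 2.0 - 0.4 * state**2 + 0.5 * rng.standard_normal(len(state))  # true inverted U

X = sm.add_constant(np.column_stack([state, state**2]))
fit = sm.OLS(sensitivity, X).fit()
print(fit.params)   # expect: intercept ~2, linear ~0, quadratic ~ -0.4 (inverted U)
```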

Categories: Attention, Auditory Neuroscience, Neural Oscillations, Papers, Psychology, Uncategorized

New paper in press in the Journal of Neuroscience

Wöstmann, Alavash and Obleser demonstrate that alpha oscillations in the human brain implement distractor suppression independent of target selection.

In theory, the ability to selectively focus on relevant objects in our environment is based on the selection of targets and the suppression of distraction. As it is unclear whether target selection and distractor suppression are independent, we designed an electroencephalography (EEG) study to directly contrast these two processes.

Participants performed a pitch discrimination task on a tone sequence presented at one loudspeaker location while a distracting tone sequence was presented at another location. When the distractor was fixed in the front, attention to upcoming targets on the left versus right side induced hemispheric lateralisation of alpha power, with relatively higher power ipsi- versus contralateral to the side of attention.

Critically, when the target was fixed in front, suppression of upcoming distractors reversed the pattern of alpha lateralisation; that is, alpha power increased contralateral to the distractor and decreased ipsilaterally. Since the two lateralized alpha responses were uncorrelated across participants, they can be considered largely independent cognitive mechanisms.

This was further supported by the fact that alpha lateralisation in response to distractor suppression originated in more anterior, frontal cortical regions compared with target selection (see figure).
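As an aside for readers new to alpha lateralisation, one common way to quantify it is a lateralization index contrasting alpha power ipsilateral versus contralateral to the attended (or to-be-suppressed) side. The sketch below computes such an index on simulated two-channel data; the convention and channel choice are illustrative, not necessarily those used in the paper.

```python
# Hypothetical sketch: alpha lateralization index (ALI) on simulated data,
# defined here as (ipsilateral - contralateral) / (ipsilateral + contralateral) alpha power.
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(5)
fs = 250
t = np.arange(fs * 10) / fs
alpha = np.sin(2 * np.pi * 10 * t)                   # 10-Hz alpha component

# Fake trial with attention to the LEFT: more alpha over the left (ipsilateral) hemisphere
eeg_left_hemi = 1.5 * alpha + rng.standard_normal(len(t))
eeg_right_hemi = 0.5 * alpha + rng.standard_normal(len(t))

def alpha_power(x, fs, band=(8, 12)):
    freqs, psd = welch(x, fs=fs, nperseg=fs * 2)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    return psd[sel].mean()

ipsi = alpha_power(eeg_left_hemi, fs)       # ipsilateral to the attended (left) side
contra = alpha_power(eeg_right_hemi, fs)
ali = (ipsi - contra) / (ipsi + contra)
print(f"alpha lateralization index: {ali:.2f}  (> 0: higher power ipsi- than contralateral)")
```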

The paper is also available as a preprint here.

 

Categories: Attention, Auditory Cortex, Auditory Speech Processing, EEG / MEG, Psychology, Speech

AC postdoc Malte Wöstmann scores DFG grant to study the temporal dynamics of the auditory attentional filter

In this three-year project, we will use the auditory modality as a test case to investigate how the suppression of distracting information (i.e., "filtering") is neurally implemented. While it is known that the attentional sampling of targets (a) is rhythmic, (b) can be entrained, and (c) is modulated by top-down predictions, the existence and neural implementation of these mechanisms for the suppression of distractors is at present unclear. To test this, we will use adaptations of established behavioural paradigms of distractor suppression and recordings of human electrophysiological signals in the magneto-/electroencephalogram (M/EEG).

Abstract of research project:

Background: Goal-directed behaviour in temporally dynamic environments requires focusing on relevant information and not getting distracted by irrelevant information. To achieve this, two cognitive processes are necessary: On the one hand, attentional sampling of target stimuli has been the focus of extensive research. On the other hand, it is less well known how the human neural system exploits temporal information in the stimulus to filter out distraction. In the present project, we use the auditory modality as a test case to study the temporal dynamics of attentional filtering and its neural implementation.

Approach and general hypothesis: In three variants of the "Irrelevant-Sound Task" we will manipulate temporal aspects of auditory distractors. Behavioural recall of target stimuli despite distraction, together with responses in the electroencephalogram (EEG), will reflect the integrity and neural implementation of the attentional filter. In line with preliminary research, our general hypothesis is that attentional filtering is based on similar but sign-reversed mechanisms to attentional sampling: For instance, while attention to rhythmic stimuli increases neural sensitivity at time points of expected target occurrence, filtering of distractors should instead decrease neural sensitivity at the time of expected distraction.

Work programme: In each of three Work Packages (WPs), we will take as a model an established neural mechanism of attentional sampling and test the existence and neural implementation of a similar mechanism for attentional filtering. This way, we will investigate whether attentional filtering follows an intrinsic rhythm (WP1), whether rhythmic distractors can entrain attentional filtering (WP2), and whether foreknowledge about the time of distraction induces top-down tuning of the attentional filter in frontal cortex regions (WP3).

Objectives and relevance: The primary objective of this research is to contribute to the foundational science of human selective attention, which requires a comprehensive understanding of how the neural system achieves the task of filtering out distraction. Furthermore, hearing difficulties often arise from distraction by salient but irrelevant sound. The results of this research will translate into the development of hearing aids that take neuro-cognitive mechanisms into account to filter out distraction more efficiently.

Categories: Attention, Auditory Cortex, Auditory Neuroscience, EEG / MEG, Papers, Perception, Psychology, Publications

New paper in NeuroImage by Fiedler et al.: Tracking ignored speech matters

Listening requires selective neural processing of the incoming sound mixture, which in humans is borne out by a surprisingly clean representation of attended-only speech in auditory cortex. How this neural selectivity is achieved even at negative signal-to-noise ratios (SNR) remains unclear. We show that, under such conditions, a late cortical representation (i.e., neural tracking) of the ignored acoustic signal is key to successful separation of attended and distracting talkers (i.e., neural selectivity). We recorded and modeled the electroencephalographic response of 18 participants who attended to one of two simultaneously presented stories, while the SNR between the two talkers varied dynamically between +6 and −6 dB. The neural tracking showed an increasing early-to-late attention-biased selectivity. Importantly, acoustically dominant (i.e., louder) ignored talkers were tracked neurally by late involvement of fronto-parietal regions, which contributed to enhanced neural selectivity. This neural selectivity, by way of representing the ignored talker, poses a mechanistic neural account of attention under real-life acoustic conditions.
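For the technically inclined, neural tracking of a talker's envelope is often estimated with a forward model (temporal response function, TRF) obtained by ridge regression on time-lagged envelope features. The sketch below shows the core computation on simulated single-channel data with an arbitrary regularization value; the published pipeline (two talkers, many channels, cross-validated regularization) is considerably more involved.

```python
# Hypothetical sketch: forward TRF from a speech envelope to one EEG channel
# via ridge regression on time-lagged envelope copies. Simulated data only.
import numpy as np

fs = 125
n = fs * 120
rng = np.random.default_rng(6)
envelope = np.abs(rng.standard_normal(n))

# Ground-truth TRF: a damped oscillatory response within the first 400 ms
lags = np.arange(int(0.4 * fs))                      # lags of 0 .. 400 ms
true_trf = np.exp(-lags / (0.1 * fs)) * np.sin(2 * np.pi * lags / (0.2 * fs))
eeg = np.convolve(envelope, true_trf)[:n] + rng.standard_normal(n)

# Design matrix of lagged envelope copies (n samples x n_lags features)
X = np.stack([np.pad(envelope, (lag, 0))[:n] for lag in lags], axis=1)

lam = 1e2                                            # ridge parameter (arbitrary here)
trf_hat = np.linalg.solve(X.T @ X + lam * np.eye(len(lags)), X.T @ eeg)

print("correlation of estimated with true TRF:",
      round(np.corrcoef(trf_hat, true_trf)[0, 1], 2))
```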

The paper is available here.