Categories
Ageing EEG / MEG fMRI Papers Publications

New Perspective paper in Neuron by Waschke et al.

We are excited to share that former Obleserlab PhD student Leo Waschke, together with his new (Doug Garrett, Niels Kloosterman) and old (Jonas Obleser) labs, has published an in-depth perspective piece in Neuron, with the provocative title “Behavior needs neural variability”.
Our article is essentially a long and extensive tribute to the “second moment” of neural activity, in statistical terms: Variability — be it quantified as variance, entropy, or spectral slope — is the long-neglected twin of the average, and it holds great promise for understanding neural states (how does neural activity differ from one moment to the next?) and traits (how do individuals differ from each other?).
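The three variability measures named here can be sketched on a toy signal. This is a hedged illustration in NumPy: the signal, sampling rate, and fitting band are our own illustrative choices, not anything from the paper.

```python
# Sketch: three common "second moment" measures on a synthetic neural signal.
import numpy as np

rng = np.random.default_rng(0)
signal = rng.standard_normal(2048)  # stand-in for one EEG channel

# (1) Variance: the classic second moment.
variance = np.var(signal)

# (2) Shannon entropy of the amplitude distribution (histogram estimate).
counts, _ = np.histogram(signal, bins=32)
p = counts / counts.sum()
p = p[p > 0]
entropy = -np.sum(p * np.log2(p))

# (3) Spectral slope: linear fit of log-power vs. log-frequency,
# here over an illustrative 1-40 Hz band at an assumed 256 Hz sampling rate.
freqs = np.fft.rfftfreq(signal.size, d=1 / 256)
power = np.abs(np.fft.rfft(signal)) ** 2
mask = (freqs >= 1) & (freqs <= 40)
slope, _ = np.polyfit(np.log10(freqs[mask]), np.log10(power[mask]), 1)
```

For white noise the slope is near zero; steeper (more negative) slopes are commonly read as a shift toward more "ordered" neural activity.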
Congratulations, Leo!

Categories
Ageing Auditory Cortex Auditory Neuroscience Auditory Perception fMRI Hearing Loss Papers Perception Psychology Publications

New paper in eLife: Erb et al., Temporal selectivity declines in the aging human auditory cortex

Congratulations to Obleserlab postdoc Julia Erb for her new paper to appear in eLife, “Temporal selectivity declines in the aging human auditory cortex”.

It’s a trope that older listeners struggle more to comprehend speech (think of Professor Tournesol in the famous Tintin comics!). The neurobiology of why and how ageing and speech comprehension difficulties are linked at all has proven much more elusive, however.

Part of this lack of knowledge is directly rooted in our limited understanding of how the central parts of the hearing brain – auditory cortex, broadly speaking – are organized.

Does the auditory cortex of older adults have different tuning properties? That is, do young and older adults differ in the way their auditory subfields represent certain features of sound?

A specific hypothesis following from this, derived from what is known about age-related change in neurobiological and psychological processes in general (the idea of so-called “dedifferentiation”), was that the tuning to certain features would “broaden” and thus lose selectivity in older compared to younger listeners.

More mechanistically, we aimed not only to observe so-called “cross-sectional” (i.e., age-group) differences, but to link a listener’s chronological age as closely as possible to changes in cortical tuning.

Amongst older listeners, we observe that temporal-rate selectivity declines with increasing age. In line with senescent neural dedifferentiation more generally, our results highlight decreased selectivity to temporal information as a hallmark of the aging auditory cortex.

This research is generously supported by the ERC Consolidator project AUDADAPT, and data for this study were acquired at the CBBM at the University of Lübeck.

Categories
Auditory Neuroscience Auditory Perception EEG / MEG Papers Perception Uncategorized

New paper in press in eLife: Waschke et al.

Obleserlab senior PhD student Leo Waschke, alongside co-authors Sarah Tune and Jonas Obleser, has a new paper in eLife.

The processing of sensory information from our environment is not constant but rather varies with changes in ongoing brain activity, or brain states. Thus, the acuity of perceptual decisions also depends on the brain state during which sensory information is processed. Recent work in non-human animals suggests two key processes that shape brain states relevant for sensory processing and perceptual performance. On the one hand, the momentary level of neural desynchronization in sensory cortical areas has been shown to impact neural representations of sensory input and related performance. On the other hand, the current level of arousal and related noradrenergic activity has been linked to changes in sensory processing and perceptual acuity.

However, it is at present unclear whether local neural desynchronization and arousal constitute distinct brain states with differing consequences for sensory processing and behaviour, or whether they represent two interrelated manifestations of ongoing brain activity that jointly affect behaviour. Furthermore, the exact shape of the relationship between perceptual performance and each of these brain-state markers (e.g., linear vs. quadratic) remains unknown.

In order to transfer findings from animal physiology to human cognitive neuroscience and test the exact shape of unique as well as shared influences of local cortical desynchronization and global arousal on sensory processing and perceptual performance, we recorded electroencephalography and pupillometry in 25 human participants while they performed a challenging auditory discrimination task.

Importantly, auditory stimuli were selectively presented during periods of especially high or low auditory cortical desynchronization, as approximated by an information-theoretic measure of time-series complexity (weighted permutation entropy). By means of a closed-loop real-time setup, we were not only able to present stimuli during different desynchronization states but also made sure to sample the whole distribution of such states, a prerequisite for the accurate assessment of brain-behaviour relationships. The recorded pupillometry data additionally enabled us to draw inferences regarding the current level of arousal, given the established link between noradrenergic activity and pupil size.
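For readers curious about the complexity measure mentioned here: weighted permutation entropy scores a signal by the diversity of its ordinal (rank-order) patterns, with each pattern occurrence weighted by the local signal variance. Below is our own minimal sketch of the idea, not the authors' closed-loop implementation; order and delay parameters are illustrative defaults.

```python
# Minimal sketch of weighted permutation entropy for a 1-D signal.
import numpy as np
from math import factorial

def weighted_permutation_entropy(x, order=3, delay=1):
    """Normalized weighted permutation entropy (0 = fully regular, 1 = maximally complex)."""
    x = np.asarray(x, dtype=float)
    n = x.size - (order - 1) * delay
    # Embed the signal: each row is one length-`order` snippet.
    idx = np.arange(0, order * delay, delay) + np.arange(n)[:, None]
    snippets = x[idx]
    # Ordinal pattern of each snippet, encoded as a single integer.
    codes = (np.argsort(snippets, axis=1) * order ** np.arange(order)).sum(axis=1)
    # Weight each pattern occurrence by its snippet's variance,
    # so flat, low-amplitude stretches contribute less.
    weights = snippets.var(axis=1)
    probs = np.array([weights[codes == c].sum() for c in np.unique(codes)])
    probs = probs[probs > 0] / weights.sum()
    return -(probs * np.log2(probs)).sum() / np.log2(factorial(order))

rng = np.random.default_rng(1)
print(weighted_permutation_entropy(rng.standard_normal(4000)))  # close to 1 for white noise
```

A strictly monotonic ramp contains only one ordinal pattern and scores 0; a desynchronized, noise-like signal approaches 1.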

 

Single-trial analyses of EEG activity, pupillometry and behaviour revealed clearly dissociable influences of both brain-state markers on ongoing brain activity, early sound-related activity and behaviour. High desynchronization states were characterized by a pronounced reduction in oscillatory power across a wide frequency range, while high arousal states coincided with a decrease in oscillatory power that was limited to high frequencies. Similarly, early sound-evoked activity was differentially impacted by auditory cortical desynchronization and pupil-linked arousal. Phase-locked responses and evoked gamma power increased with local desynchronization, with a tendency to saturate at intermediate levels. Post-stimulus low-frequency power, on the other hand, increased with pupil-linked arousal.

Most importantly, local desynchronization and pupil-linked arousal displayed different relationships with perceptual performance. While participants performed fastest and with the least bias following intermediate levels of auditory cortical desynchronization, intermediate levels of pupil-linked arousal were associated with the highest sensitivity. Thus, although both processes constitute behaviourally relevant brain states that affect perceptual performance following an inverted U, they impact distinct subdomains of behaviour. Taken together, our results speak to a model in which independent states of local desynchronization and global arousal jointly shape states for optimal sensory processing and perceptual performance. The published manuscript including all supplemental information can be found here.
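The inverted-U claim is, statistically, the question of whether a quadratic model with a negative quadratic term beats a linear one. A toy version of that comparison, on synthetic data rather than the study's measurements, looks like this:

```python
# Sketch: linear vs. quadratic (inverted-U) fit of performance on a brain-state marker.
import numpy as np

rng = np.random.default_rng(2)
state = rng.uniform(-1, 1, 200)                    # e.g. momentary desynchronization level
perf = 1.0 - state**2 + rng.normal(0, 0.1, 200)    # true inverted-U relationship + noise

def fit_rss(x, y, degree):
    """Polynomial fit plus its residual sum of squares."""
    coefs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coefs, x)
    return coefs, np.sum(resid**2)

lin_coefs, lin_rss = fit_rss(state, perf, 1)
quad_coefs, quad_rss = fit_rss(state, perf, 2)

# For an inverted U, the leading (quadratic) coefficient is negative and
# the quadratic model fits markedly better than the line.
print(quad_coefs[0] < 0, quad_rss < lin_rss)
```

In practice one would penalize the extra parameter (e.g. via AIC or cross-validation) rather than compare raw residuals, since the quadratic model can never fit worse.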

Categories
Attention Auditory Neuroscience Neural Oscillations Papers Psychology Uncategorized

New paper in press in the Journal of Neuroscience

Wöstmann, Alavash and Obleser demonstrate that alpha oscillations in the human brain implement distractor suppression independent of target selection.

In theory, the ability to selectively focus on relevant objects in our environment is based on the selection of targets and the suppression of distraction. As it is unclear whether target selection and distractor suppression are independent, we designed an electroencephalography (EEG) study to directly contrast these two processes.

Participants performed a pitch discrimination task on a tone sequence presented at one loudspeaker location while a distracting tone sequence was presented at another location. When the distractor was fixed in the front, attention to upcoming targets on the left versus right side induced hemispheric lateralisation of alpha power, with relatively higher power ipsi- versus contralateral to the side of attention.

Critically, when the target was fixed in front, suppression of upcoming distractors reversed the pattern of alpha lateralisation; that is, alpha power increased contralateral to the distractor and decreased ipsilaterally. Since the two lateralized alpha responses were uncorrelated across participants, they can be considered largely independent cognitive mechanisms.

This was further supported by the fact that alpha lateralisation in response to distractor suppression originated in more anterior, frontal cortical regions compared with target selection (see figure).
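The lateralisation pattern described above is typically summarized as a normalized index contrasting alpha power in the two hemispheres. A hedged sketch, with made-up power values rather than data from the study:

```python
# Sketch: a standard alpha lateralization index, (ipsi - contra) / (ipsi + contra).
def lateralization_index(ipsi_power, contra_power):
    """Positive when alpha power is higher ipsi- than contralateral to the attended side."""
    return (ipsi_power - contra_power) / (ipsi_power + contra_power)

# Target selection: higher alpha ipsilateral to the attended side -> positive index.
print(lateralization_index(12.0, 8.0))   # 0.2

# Distractor suppression reverses the pattern: alpha increases contralateral
# to the distractor, so the same index computed relative to it flips sign.
print(lateralization_index(8.0, 12.0))   # -0.2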

The paper is also available as a preprint here.

 

Categories
Attention Auditory Cortex Auditory Speech Processing Papers Psychology Publications Speech

New paper in press in the Journal of Cognitive Neuroscience

Wöstmann, Schmitt and Obleser demonstrate that closing the eyes enhances the attentional modulation of neural alpha power but does not affect behavioural performance in two listening tasks.

Does closing the eyes enhance our ability to listen attentively? In fact, many of us tend to close our eyes when listening conditions become challenging, for example on the phone. It is thus surprising that there is no published work on the behavioural or neural consequences of closing the eyes during attentive listening. In the present study, we demonstrate that eye closure not only increases the overall level of absolute alpha power but also the degree to which auditory attention modulates alpha power over time, in synchrony with attending to versus ignoring speech. However, our behavioural results provide evidence for the absence of any difference in listening performance with closed versus open eyes. The likely reason for this is that the impact of eye closure on neural oscillatory dynamics does not match the alpha power modulations associated with listening performance precisely enough (see figure).

The paper is available as a preprint here.

 

Categories
Adaptive Control Ageing Attention Auditory Cortex Auditory Neuroscience Auditory Speech Processing Executive Functions fMRI Papers Psychology Uncategorized

New paper in PNAS by Alavash, Tune, Obleser

How brain areas communicate shapes human communication: The hearing regions in your brain form new alliances as you try to listen at the cocktail party.

Obleserlab postdocs Mohsen Alavash and Sarah Tune rock out an intricate graph-theoretical account of modular reconfigurations in challenging listening situations, and how these predict individuals’ listening success.

Available online now in PNAS! (Also, our uni is currently featuring a German-language press release on it, as well as an English-language version.)

Categories
Auditory Cortex Auditory Neuroscience fMRI Papers Publications

New paper by Erb et al. in Cerebral Cortex: Human but not monkey auditory cortex is tuned to slow temporal rates

In a new comparative fMRI study just published in Cerebral Cortex, AC postdoc Julia Erb and her collaborators in the Formisano (Maastricht University) and Vanduffel (KU Leuven) labs provide us with novel insights into speech evolution. These data by Erb et al. reveal homologies and differences in natural-sound encoding in human and non-human primate cortex.

From the Abstract: “Understanding homologies and differences in auditory cortical processing in human and nonhuman primates is an essential step in elucidating the neurobiology of speech and language. Using fMRI responses to natural sounds, we investigated the representation of multiple acoustic features in auditory cortex of awake macaques and humans. Comparative analyses revealed homologous large-scale topographies not only for frequency but also for temporal and spectral modulations. Conversely, we observed a striking interspecies difference in cortical sensitivity to temporal modulations: While decoding from macaque auditory cortex was most accurate at fast rates (> 30 Hz), humans had highest sensitivity to ~3 Hz, a relevant rate for speech analysis. These findings suggest that characteristic tuning of human auditory cortex to slow temporal modulations is unique and may have emerged as a critical step in the evolution of speech and language.”

The paper is available here. Congratulations, Julia!

Categories
Attention Auditory Cortex Auditory Neuroscience EEG / MEG Papers Perception Psychology Publications

New paper in Neuroimage by Fiedler et al.: Tracking ignored speech matters

Listening requires selective neural processing of the incoming sound mixture, which in humans is borne out by a surprisingly clean representation of attended-only speech in auditory cortex. How this neural selectivity is achieved even at negative signal-to-noise ratios (SNR) remains unclear. We show that, under such conditions, a late cortical representation (i.e., neural tracking) of the ignored acoustic signal is key to successful separation of attended and distracting talkers (i.e., neural selectivity). We recorded and modeled the electroencephalographic response of 18 participants who attended to one of two simultaneously presented stories, while the SNR between the two talkers varied dynamically between +6 and −6 dB. The neural tracking showed an increasing early-to-late attention-biased selectivity. Importantly, acoustically dominant (i.e., louder) ignored talkers were tracked neurally by late involvement of fronto-parietal regions, which contributed to enhanced neural selectivity. This neural selectivity, by way of representing the ignored talker, poses a mechanistic neural account of attention under real-life acoustic conditions.
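"Neural tracking" in this line of work usually means regressing the EEG on lagged copies of a speech envelope, i.e. estimating a temporal response function. Below is a toy sketch of that idea on synthetic data; the ridge parameter, lag count, and invented impulse response are our illustrative choices, not the modeling pipeline of the paper.

```python
# Sketch: estimating a temporal response function (TRF) by ridge regression
# of a synthetic "EEG" on lagged copies of a speech envelope.
import numpy as np

rng = np.random.default_rng(3)
n, lags = 5000, 32
envelope = rng.standard_normal(n)             # stand-in for a speech envelope
true_trf = np.exp(-np.arange(lags) / 8.0)     # invented impulse response
eeg = np.convolve(envelope, true_trf)[:n] + rng.normal(0, 1.0, n)

# Design matrix: one column per time lag of the envelope.
X = np.stack([np.roll(envelope, k) for k in range(lags)], axis=1)
X[:lags] = 0                                  # discard samples wrapped by np.roll

# Ridge solution: (X'X + aI)^-1 X'y, with an illustrative penalty a = 1.
a = 1.0
trf_hat = np.linalg.solve(X.T @ X + a * np.eye(lags), X.T @ eeg)

# The recovered TRF should closely resemble the true one.
r = np.corrcoef(trf_hat, true_trf)[0, 1]
```

With one such model per talker, the relative fidelity of the attended and ignored reconstructions is what the post calls neural selectivity.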

The paper is avail­able here.