Categories: Auditory Neuroscience, Auditory Perception, EEG / MEG, Papers, Perception, Uncategorized

New paper in press in eLife: Waschke et al.

Obleserlab senior PhD student Leo Waschke, alongside co-authors Sarah Tune and Jonas Obleser, has a new paper in eLife.

The processing of sensory information from our environment is not constant but varies with changes in ongoing brain activity, or brain states. The acuity of perceptual decisions thus depends on the brain state during which sensory information is processed. Recent work in non-human animals suggests two key processes that shape brain states relevant for sensory processing and perceptual performance. On the one hand, the momentary level of neural desynchronization in sensory cortical areas has been shown to affect neural representations of sensory input and related performance. On the other hand, the current level of arousal and the associated noradrenergic activity has been linked to changes in sensory processing and perceptual acuity.

However, it is at present unclear whether local neural desynchronization and arousal constitute distinct brain states with differing consequences for sensory processing and behaviour, or whether they represent two interrelated manifestations of ongoing brain activity that jointly affect behaviour. Furthermore, the exact shape of the relationship between perceptual performance and each of these brain-state markers (e.g., linear vs. quadratic) has remained unclear.

To transfer findings from animal physiology to human cognitive neuroscience, and to test the exact shape of the unique as well as shared influences of local cortical desynchronization and global arousal on sensory processing and perceptual performance, we recorded electroencephalography and pupillometry in 25 human participants while they performed a challenging auditory discrimination task.

Importantly, auditory stimuli were selectively presented during periods of especially high or low auditory cortical desynchronization, as approximated by an information-theoretic measure of time-series complexity (weighted permutation entropy). By means of a closed-loop real-time setup, we were not only able to present stimuli during different desynchronization states but also made sure to sample the whole distribution of such states, a prerequisite for the accurate assessment of brain-behaviour relationships. The recorded pupillometry data additionally enabled us to draw inferences about the current level of arousal, given the established link between noradrenergic activity and pupil size.
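For the technically inclined reader, here is a minimal Python sketch of weighted permutation entropy (after Fadlallah et al., 2013). It illustrates the measure itself, not our closed-loop pipeline; the embedding order, delay, and white-noise example are arbitrary demonstration choices.

```python
import math

import numpy as np


def weighted_permutation_entropy(x, order=3, delay=1):
    """Weighted permutation entropy of a 1-D signal (Fadlallah et al., 2013).

    Each ordinal pattern of `order` samples (spaced by `delay`) is weighted
    by the variance of its window, so patterns riding on large-amplitude
    fluctuations count more than near-flat ones. Normalised to [0, 1];
    higher values indicate a more desynchronised, complex signal.
    """
    x = np.asarray(x, dtype=float)
    n_windows = len(x) - (order - 1) * delay
    # All windows as rows of a 2-D array.
    idx = np.arange(order) * delay + np.arange(n_windows)[:, None]
    windows = x[idx]
    # Ordinal pattern of each window, encoded as a single integer.
    patterns = np.argsort(windows, axis=1)
    codes = (patterns * (order ** np.arange(order))).sum(axis=1)
    # Weight each pattern occurrence by its window's variance.
    weights = windows.var(axis=1)
    freq = np.zeros(order ** order)
    np.add.at(freq, codes, weights)
    p = freq[freq > 0] / freq.sum()
    return -(p * np.log2(p)).sum() / math.log2(math.factorial(order))


# Example: white noise should yield entropy near 1 (highly desynchronised).
rng = np.random.default_rng(0)
print(weighted_permutation_entropy(rng.standard_normal(10_000)))
```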

 

Single-trial analyses of EEG activity, pupillometry, and behaviour revealed clearly dissociable influences of the two brain-state markers on ongoing brain activity, early sound-related activity, and behaviour. High desynchronization states were characterized by a pronounced reduction in oscillatory power across a wide frequency range, whereas high arousal states coincided with a decrease in oscillatory power limited to high frequencies. Similarly, early sound-evoked activity was differentially affected by auditory cortical desynchronization and pupil-linked arousal. Phase-locked responses and evoked gamma power increased with local desynchronization, with a tendency to saturate at intermediate levels. Post-stimulus low-frequency power, on the other hand, increased with pupil-linked arousal.

Most importantly, local desynchronization and pupil-linked arousal displayed different relationships with perceptual performance. While participants responded fastest and with the least bias following intermediate levels of auditory cortical desynchronization, intermediate levels of pupil-linked arousal were associated with the highest sensitivity. Thus, although both processes constitute behaviourally relevant brain states whose impact on perceptual performance follows an inverted U, they affect distinct subdomains of behaviour. Taken together, our results speak to a model in which independent states of local desynchronization and global arousal jointly shape the conditions for optimal sensory processing and perceptual performance. The published manuscript including all supplemental information can be found here.
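For illustration, an inverted-U relationship of this kind can be probed by regressing single-trial behaviour on linear and quadratic terms of a brain-state marker; a significantly negative quadratic coefficient indicates the inverted U. The Python sketch below uses simulated data and plain OLS, whereas the actual analyses relied on more elaborate single-trial models.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical single-trial data: a z-scored brain-state marker (e.g.
# pre-stimulus entropy or pupil size) and a behavioural outcome per trial.
rng = np.random.default_rng(1)
state = rng.standard_normal(500)
behaviour = 1.0 - 0.4 * state**2 + 0.5 * rng.standard_normal(500)  # inverted U

# Regress behaviour on linear + quadratic state terms; an inverted U shows
# up as a reliably negative quadratic coefficient.
X = sm.add_constant(np.column_stack([state, state**2]))
fit = sm.OLS(behaviour, X).fit()
print(fit.params)    # [intercept, linear, quadratic]
print(fit.pvalues)
```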

Categories: Attention, Auditory Neuroscience, Neural Oscillations, Papers, Psychology, Uncategorized

New paper in press in the Journal of Neuroscience

Wöstmann, Alavash and Obleser demonstrate that alpha oscillations in the human brain implement distractor suppression independently of target selection.

In theory, the ability to focus selectively on relevant objects in our environment rests on the selection of targets and the suppression of distraction. As it is unclear whether target selection and distractor suppression are independent, we designed an electroencephalography (EEG) study to directly contrast these two processes.

Participants performed a pitch discrimination task on a tone sequence presented at one loudspeaker location while a distracting tone sequence was presented at another location. When the distractor was fixed in the front, attention to upcoming targets on the left versus right side induced hemispheric lateralisation of alpha power, with relatively higher power ipsi- versus contralateral to the side of attention.

Critically, when the target was fixed in front, suppression of upcoming distractors reversed the pattern of alpha lateralisation; that is, alpha power increased contralateral to the distractor and decreased ipsilaterally. Since the two lateralised alpha responses were uncorrelated across participants, they can be considered largely independent cognitive mechanisms.

This was further supported by the fact that alpha lateralisation in response to distractor suppression originated in more anterior, frontal cortical regions compared with target selection (see figure).
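For readers who want the lateralisation measure spelled out, the sketch below shows one generic way to compute single-trial alpha power and a normalised lateralisation index in Python. Function names, the 8–12 Hz band, and the filter settings are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert


def alpha_power(eeg, fs, band=(8.0, 12.0)):
    """Single-channel alpha power: band-pass filter, then squared Hilbert
    envelope (illustrative 4th-order Butterworth)."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    return np.abs(hilbert(filtfilt(b, a, eeg))) ** 2


def lateralisation_index(power_ipsi, power_contra):
    """(ipsi - contra) / (ipsi + contra), relative to the attended side.
    Positive values mirror target selection (higher ipsilateral alpha);
    distractor suppression reverses the sign relative to the distractor."""
    return (power_ipsi - power_contra) / (power_ipsi + power_contra)


# Toy usage with simulated data for two electrodes (10 s at 250 Hz).
fs = 250
rng = np.random.default_rng(0)
ipsi = alpha_power(rng.standard_normal(fs * 10), fs).mean()
contra = alpha_power(rng.standard_normal(fs * 10), fs).mean()
print(lateralisation_index(ipsi, contra))
```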

The paper is also available as a preprint here.

 

Categories: Adaptive Control, Auditory Neuroscience, Auditory Speech Processing, Auf deutsch, Events, Executive Functions, Hearing Loss, Media, Speech

Jonas presented for the KIND Hörstiftung in Berlin (Video)

In February, I had the honour of presenting, for the KIND Hörstiftung at their 2019 symposium in Berlin, our work on predicting listening success, illustrated in an accessible way with several of our studies. A 25-minute video of this talk (in German) is now online.


Categories: Attention, Auditory Cortex, Auditory Speech Processing, EEG / MEG, Psychology, Speech

AC postdoc Malte Wöstmann scores DFG grant to study the temporal dynamics of the auditory attentional filter

In this three-year project, we will use the auditory modality as a test case to investigate how the suppression of distracting information (i.e., "filtering") is neurally implemented. While it is known that the attentional sampling of targets (a) is rhythmic, (b) can be entrained, and (c) is modulated by top-down predictions, the existence and neural implementation of these mechanisms for the suppression of distractors is at present unclear. To test this, we will use adaptations of established behavioural paradigms of distractor suppression and recordings of human electrophysiological signals in the magneto-/electroencephalogram (M/EEG).

Abstract of the research project:

Background: Goal-directed behaviour in temporally dynamic environments requires focusing on relevant information and not getting distracted by irrelevant information. Two cognitive processes are necessary to achieve this: on the one hand, attentional sampling of target stimuli, which has been the focus of extensive research; on the other hand, filtering out distraction, where it is far less well understood how the human neural system exploits temporal information in the stimulus. In the present project, we use the auditory modality as a test case to study the temporal dynamics of attentional filtering and its neural implementation.

Approach and general hypothesis: In three variants of the "Irrelevant-Sound Task", we will manipulate temporal aspects of auditory distractors. Behavioural recall of target stimuli despite distraction, together with responses in the electroencephalogram (EEG), will reflect the integrity and neural implementation of the attentional filter. In line with preliminary research, our general hypothesis is that attentional filtering rests on similar but sign-reversed mechanisms as attentional sampling: for instance, while attention to rhythmic stimuli increases neural sensitivity at time points of expected target occurrence, filtering of distractors should instead decrease neural sensitivity at the time of expected distraction.

Work programme: In each of three Work Packages (WPs), we will take an established neural mechanism of attentional sampling as a model and test the existence and neural implementation of a similar mechanism for attentional filtering. In this way, we will investigate whether attentional filtering follows an intrinsic rhythm (WP1), whether rhythmic distractors can entrain attentional filtering (WP2), and whether foreknowledge about the time of distraction induces top-down tuning of the attentional filter in frontal cortex regions (WP3).

Objectives and relevance: The primary objective of this research is to contribute to the foundational science of human selective attention, which requires a comprehensive understanding of how the neural system achieves the task of filtering out distraction. Furthermore, hearing difficulties often stem from distraction by salient but irrelevant sound. Results of this research will thus inform the development of hearing aids that exploit neuro-cognitive mechanisms to filter out distraction more efficiently.

Categories: Attention, Auditory Cortex, Auditory Speech Processing, Papers, Psychology, Publications, Speech

New paper in press in the Journal of Cognitive Neuroscience

Wöstmann, Schmitt and Obleser demonstrate that closing the eyes enhances the attentional modulation of neural alpha power but does not affect behavioural performance in two listening tasks

Does closing the eyes enhance our ability to listen attentively? Many of us tend to close our eyes when listening conditions become challenging, for example on the phone. It is thus surprising that there is no published work on the behavioural or neural consequences of closing the eyes during attentive listening. In the present study, we demonstrate that eye closure increases not only the overall level of absolute alpha power but also the degree to which auditory attention modulates alpha power over time, in synchrony with attending to versus ignoring speech. However, our behavioural results provide evidence for the absence of any difference in listening performance with closed versus open eyes. The likely reason is that the impact of eye closure on neural oscillatory dynamics does not match the alpha power modulations associated with listening performance precisely enough (see figure).
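Operationally, the "attentional modulation of alpha power" can be thought of as an attend-versus-ignore contrast, computed separately for eyes-open and eyes-closed listening. The toy Python sketch below uses simulated numbers only, to illustrate the logic rather than reproduce the paper's analysis.

```python
import numpy as np

# Simulated single-trial alpha power (z-scored), split by eye state and by
# whether speech was to be attended or ignored; all values are made up.
rng = np.random.default_rng(0)
alpha = {
    ("open", "attend"): rng.normal(-0.2, 1.0, 100),
    ("open", "ignore"): rng.normal(+0.2, 1.0, 100),
    ("closed", "attend"): rng.normal(-0.5, 1.0, 100),
    ("closed", "ignore"): rng.normal(+0.5, 1.0, 100),
}

# Attentional modulation = mean alpha while ignoring minus while attending;
# the paper's pattern: larger modulation with eyes closed, unchanged behaviour.
for eyes in ("open", "closed"):
    depth = alpha[(eyes, "ignore")].mean() - alpha[(eyes, "attend")].mean()
    print(f"eyes {eyes}: modulation depth = {depth:.2f}")
```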

The paper is available as a preprint here.

 

Categories: Adaptive Control, Ageing, Attention, Auditory Cortex, Auditory Neuroscience, Auditory Speech Processing, Executive Functions, fMRI, Papers, Psychology, Uncategorized

New paper in PNAS by Alavash, Tune, Obleser

How brain areas communicate shapes human communication: The hearing regions in your brain form new alliances as you try to listen at the cocktail party

Obleserlab postdocs Mohsen Alavash and Sarah Tune rock out an intricate graph-theoretical account of modular reconfigurations in challenging listening situations, and how these predict individuals' listening success.

Available online now in PNAS! (Also, our uni is currently featuring a German-language press release on it, as well as an English-language version.)

Categories: Auditory Cortex, Auditory Neuroscience, fMRI, Papers, Publications

New paper by Erb et al. in Cerebral Cortex: Human but not monkey auditory cortex is tuned to slow temporal rates

In a new comparative fMRI study just published in Cerebral Cortex, AC postdoc Julia Erb and her collaborators in the Formisano (Maastricht University) and Vanduffel (KU Leuven) labs provide us with novel insights into speech evolution. These data by Erb et al. reveal homologies and differences in the encoding of natural sounds in human and non-human primate cortex.

From the Abstract: "Understanding homologies and differences in auditory cortical processing in human and nonhuman primates is an essential step in elucidating the neurobiology of speech and language. Using fMRI responses to natural sounds, we investigated the representation of multiple acoustic features in auditory cortex of awake macaques and humans. Comparative analyses revealed homologous large-scale topographies not only for frequency but also for temporal and spectral modulations. Conversely, we observed a striking interspecies difference in cortical sensitivity to temporal modulations: While decoding from macaque auditory cortex was most accurate at fast rates (> 30 Hz), humans had highest sensitivity to ~3 Hz, a relevant rate for speech analysis. These findings suggest that characteristic tuning of human auditory cortex to slow temporal modulations is unique and may have emerged as a critical step in the evolution of speech and language."
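Schematically, the decoding logic behind such claims can be expressed as a cross-validated classifier on auditory-cortex response patterns labelled by modulation rate, with accuracy then compared across rate bins and species. The Python sketch below is entirely simulated and generic; it is not the study's actual multivariate analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Simulated stand-ins: one multi-voxel response pattern per natural sound,
# labelled by that sound's dominant temporal modulation rate bin.
rng = np.random.default_rng(0)
patterns = rng.standard_normal((120, 300))   # 120 sounds x 300 voxels
rate_bins = rng.integers(0, 4, size=120)     # e.g. bins near 1, 3, 10, 30 Hz

# Cross-validated decoding accuracy; comparing such accuracies across rate
# bins is how one would see macaque decoding peak at fast rates (> 30 Hz)
# while human sensitivity peaks near ~3 Hz.
clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, patterns, rate_bins, cv=5).mean())
```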

The paper is available here. Congratulations, Julia!

Categories: Attention, Auditory Cortex, Auditory Neuroscience, EEG / MEG, Papers, Perception, Psychology, Publications

New paper in NeuroImage by Fiedler et al.: Tracking ignored speech matters

Listening requires selective neural processing of the incoming sound mixture, which in humans is borne out by a surprisingly clean representation of attended-only speech in auditory cortex. How this neural selectivity is achieved even at negative signal-to-noise ratios (SNR) remains unclear. We show that, under such conditions, a late cortical representation (i.e., neural tracking) of the ignored acoustic signal is key to successful separation of attended and distracting talkers (i.e., neural selectivity). We recorded and modeled the electroencephalographic response of 18 participants who attended to one of two simultaneously presented stories, while the SNR between the two talkers varied dynamically between +6 and −6 dB. The neural tracking showed an increasing early-to-late attention-biased selectivity. Importantly, acoustically dominant (i.e., louder) ignored talkers were tracked neurally by late involvement of fronto-parietal regions, which contributed to enhanced neural selectivity. This neural selectivity, by way of representing the ignored talker, provides a mechanistic neural account of attention under real-life acoustic conditions.
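Such neural tracking is typically quantified with linear forward models (temporal response functions, TRFs) that map the speech envelope onto the EEG. Below is a minimal single-channel ridge-regression sketch in Python, under assumed parameters; the authors' full model (and toolboxes such as the mTRF toolbox) handle multiple channels, cross-validated regularisation, and attended versus ignored envelopes.

```python
import numpy as np


def estimate_trf(envelope, eeg, fs, tmax=0.5, ridge=1.0):
    """Temporal response function mapping a stimulus envelope to one EEG
    channel via time-lagged ridge regression (lags 0..tmax seconds)."""
    lags = np.arange(int(tmax * fs) + 1)
    # Design matrix: one column per lagged copy of the envelope.
    X = np.column_stack([np.roll(envelope, int(l)) for l in lags])
    for k, l in enumerate(lags):
        X[:l, k] = 0.0  # zero out samples that wrapped around
    # Ridge solution: w = (X'X + lambda * I)^(-1) X'y
    w = np.linalg.solve(X.T @ X + ridge * np.eye(len(lags)), X.T @ eeg)
    return lags / fs, w


# Toy check: an "EEG" that is the envelope delayed by 100 ms should yield
# a TRF peaking near 0.1 s.
fs = 100
rng = np.random.default_rng(0)
env = rng.standard_normal(10_000)
eeg = np.roll(env, int(0.1 * fs)) + 0.1 * rng.standard_normal(10_000)
lags_s, trf = estimate_trf(env, eeg, fs)
print(lags_s[np.argmax(np.abs(trf))])  # ~0.1
```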

The paper is available here.