Categories
Auditory Perception Auditory Speech Processing Speech

Hot off the press: New chapter on neural oscillations in speech perception by Tune & Obleser

Neural oscillations are a prominent feature of the brain’s electrophysiology and target variables in many speech perception studies. For the latest edition of the Springer Handbook of Auditory Research – this time focused on speech perception – lab members Sarah Tune and Jonas Obleser teamed up to take stock of what has been learned about the functional relationship of neural oscillations and speech perception.

By focusing on core functions and computational principles, the chapter offers a parsimonious account of the stable patterns that have emerged across studies and levels of investigation.

You can find a preprint of the chapter here and the entire collection of chapters here.

Categories
Ageing Auditory Cortex Auditory Neuroscience Auditory Perception fMRI Hearing Loss Papers Perception Psychology Publications

New paper in eLife: Erb et al., Temporal selectivity declines in the aging human auditory cortex

Congratulations to Obleserlab postdoc Julia Erb for her new paper to appear in eLife, “Temporal selectivity declines in the aging human auditory cortex”.

It’s a trope that older listeners struggle more in comprehending speech (think of Professor Tournesol in the famous Tintin comics!). The neurobiology of why and how ageing and speech comprehension difficulties are linked at all has proven much more elusive, however.

Part of this lack of knowledge is directly rooted in our limited understanding of how the central parts of the hearing brain – auditory cortex, broadly speaking – are organized.

Does the auditory cortex of older adults have different tuning properties? That is, do young and older adults differ in the way their auditory subfields represent certain features of sound?

A specific hypothesis following from this, derived from what is known about age-related change in neurobiological and psychological processes in general (the idea of so-called “dedifferentiation”), was that the tuning to certain features would “broaden” and thus lose selectivity in older compared to younger listeners.

More mechanistically, we aimed not only to observe so-called “cross-sectional” (i.e., age-group) differences, but to link a listener’s chronological age as closely as possible to changes in cortical tuning.

Amongst older listeners, we observe that temporal-rate selectivity declines with higher age. In line with senescent neural dedifferentiation more generally, our results highlight decreased selectivity to temporal information as a hallmark of the aging auditory cortex.

This research is generously supported by the ERC Consolidator project AUDADAPT, and data for this study were acquired at the CBBM at the University of Lübeck.

Categories
Auditory Neuroscience Auditory Perception EEG / MEG Papers Perception Uncategorized

New paper in press in eLife: Waschke et al.

Obleserlab senior PhD student Leo Waschke, alongside co-authors Sarah Tune and Jonas Obleser, has a new paper in eLife.

The processing of sensory information from our environment is not constant but varies with changes in ongoing brain activity, or brain states. Thus, the acuity of perceptual decisions also depends on the brain state during which sensory information is processed. Recent work in non-human animals suggests two key processes that shape brain states relevant for sensory processing and perceptual performance. On the one hand, the momentary level of neural desynchronization in sensory cortical areas has been shown to impact neural representations of sensory input and related performance. On the other hand, the current level of arousal and related noradrenergic activity has been linked to changes in sensory processing and perceptual acuity.

However, it is at present unclear whether local neural desynchronization and arousal constitute distinct brain states that entail varying consequences for sensory processing and behaviour, or whether they represent two interrelated manifestations of ongoing brain activity that jointly affect behaviour. Furthermore, the exact shape of the relationship between perceptual performance and each of these brain-state markers (e.g., linear vs. quadratic) is unknown.

In order to transfer findings from animal physiology to human cognitive neuroscience and to test the exact shape of unique as well as shared influences of local cortical desynchronization and global arousal on sensory processing and perceptual performance, we recorded electroencephalography and pupillometry in 25 human participants while they performed a challenging auditory discrimination task.

Importantly, auditory stimuli were selectively presented during periods of especially high or low auditory cortical desynchronization, as approximated by an information-theoretic measure of time-series complexity (weighted permutation entropy). By means of a closed-loop real-time setup, we were not only able to present stimuli during different desynchronization states but also made sure to sample the whole distribution of such states, a prerequisite for the accurate assessment of brain–behaviour relationships. The recorded pupillometry data additionally enabled us to draw inferences regarding the current level of arousal, given the established link between noradrenergic activity and pupil size.
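For readers curious how such a complexity measure works: weighted permutation entropy counts how evenly the short ordinal patterns in a signal are distributed, with each pattern weighted by the local signal variance. The sketch below is a minimal numpy illustration of that general idea, not the authors’ real-time implementation:

```python
import numpy as np
from math import factorial

def weighted_permutation_entropy(x, order=3, delay=1):
    """Weighted permutation entropy of a 1-D signal, normalised to [0, 1].

    Each ordinal pattern of length `order` is weighted by the variance of
    the embedding vector it came from, so large-amplitude fluctuations
    contribute more than near-flat segments.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    # Embedding matrix: one row per length-`order` snippet of the signal.
    emb = np.stack([x[i * delay : i * delay + n] for i in range(order)], axis=1)
    patterns = np.argsort(emb, axis=1)   # ordinal pattern of each snippet
    weights = np.var(emb, axis=1)        # weight = variance of each snippet
    # Accumulate total weight per unique ordinal pattern.
    totals = {}
    for key, w in zip(map(tuple, patterns), weights):
        totals[key] = totals.get(key, 0.0) + w
    p = np.array(list(totals.values()))
    p = p / p.sum()
    h = -np.sum(p * np.log2(p))
    return h / np.log2(factorial(order))  # 0 = fully regular, 1 = maximally complex
```

A monotonic ramp contains only one ordinal pattern and yields an entropy of 0, whereas white noise visits all patterns roughly equally and yields values near 1, which is why the measure can serve as a proxy for cortical desynchronization.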

 

Single-trial analyses of EEG activity, pupillometry and behaviour revealed clearly dissociable influences of both brain-state markers on ongoing brain activity, early sound-related activity and behaviour. High-desynchronization states were characterized by a pronounced reduction in oscillatory power across a wide frequency range, while high-arousal states coincided with a decrease in oscillatory power that was limited to high frequencies. Similarly, early sound-evoked activity was differentially impacted by auditory cortical desynchronization and pupil-linked arousal. Phase-locked responses and evoked gamma power increased with local desynchronization, with a tendency to saturate at intermediate levels. Post-stimulus low-frequency power, on the other hand, increased with pupil-linked arousal.

Most importantly, local desynchronization and pupil-linked arousal displayed different relationships with perceptual performance. While participants performed fastest and least biased following intermediate levels of auditory cortical desynchronization, intermediate levels of pupil-linked arousal were associated with the highest sensitivity. Thus, although both processes constitute behaviourally relevant brain states that affect perceptual performance following an inverted U, they impact distinct subdomains of behaviour. Taken together, our results speak to a model in which independent states of local desynchronization and global arousal jointly shape states for optimal sensory processing and perceptual performance. The published manuscript including all supplemental information can be found here.
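An inverted-U relationship of this kind is typically tested by adding a quadratic term to the model relating brain state to performance: a reliably negative quadratic coefficient, with the parabola’s vertex at intermediate state levels, is the statistical signature of the inverted U. A minimal sketch with simulated (hypothetical) data:

```python
import numpy as np

# Hypothetical data: performance peaks at intermediate brain-state levels,
# mimicking the inverted-U relationships reported in the study.
state = np.linspace(-2, 2, 41)          # z-scored brain-state marker
perf = 1.0 - 0.5 * state**2             # true underlying inverted U
rng = np.random.default_rng(1)
perf_noisy = perf + rng.normal(scale=0.05, size=state.size)

# Fit a quadratic model; an inverted U shows up as a negative
# coefficient on the squared term.
b2, b1, b0 = np.polyfit(state, perf_noisy, deg=2)
peak = -b1 / (2 * b2)                   # vertex of the fitted parabola

print(f"quadratic coefficient: {b2:.3f}")        # negative -> inverted U
print(f"estimated optimum state level: {peak:.3f}")  # near intermediate levels
```

This is why sampling the whole distribution of desynchronization states matters: without observations at the extremes, a quadratic and a linear relationship are hard to tell apart.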

Categories
Auditory Cortex Auditory Perception Auditory Speech Processing Hearing Loss Papers Perception Publications Speech

New paper in Ear and Hearing: Erb, Ludwig, Kunke, Fuchs & Obleser on speech comprehension with a cochlear implant

We are excited to share the results from our collaboration with the Cochlea Implant Center Leipzig: AC postdoc Julia Erb’s new paper on how 4-Hz modulation sensitivity can inform us about the 6-month speech comprehension outcome in cochlear implant users.

Erb J, Ludwig AA, Kunke D, Fuchs M, & Obleser J (2018). Temporal sensitivity measured shortly after cochlear implantation predicts six-month speech recognition outcome.

Now avail­able online:

https://insights.ovid.com/crossref?an=00003446-900000000-98942

Abstract:

Objec­tives:

Psychoacoustic tests assessed shortly after cochlear implantation are useful predictors of the rehabilitative speech outcome. While largely independent, both spectral and temporal resolution tests are important to provide an accurate prediction of speech recognition. However, rapid tests of temporal sensitivity are currently lacking. Here, we propose a simple amplitude modulation rate discrimination (AMRD) paradigm that is validated by predicting future speech recognition in adult cochlear implant (CI) patients.

Design:

In 34 newly implanted patients, we used an adaptive AMRD paradigm, where broadband noise was modulated at the speech-relevant rate of ~4 Hz. In a longitudinal study, speech recognition in quiet was assessed using the closed-set Freiburger number test shortly after cochlear implantation (t0) as well as the open-set Freiburger monosyllabic word test 6 months later (t6).

Results:

Both AMRD thresholds at t0 (r = −0.51) and speech recognition scores at t0 (r = 0.56) predicted speech recognition scores at t6. However, AMRD and speech recognition at t0 were uncorrelated, suggesting that those measures capture partially distinct perceptual abilities. A multiple regression model predicting 6-month speech recognition outcome with deafness duration and speech recognition at t0 improved from adjusted R² = 0.30 to adjusted R² = 0.44 when AMRD threshold was added as a predictor.
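The adjusted R² statistic used here penalizes every added predictor, so the improvement from 0.30 to 0.44 reflects explanatory power beyond what a third predictor would earn by chance. A minimal sketch of the formula (the raw R² inputs below are hypothetical, chosen only to illustrate the adjustment; the paper reports only the adjusted values):

```python
def adjusted_r2(r2: float, n: int, p: int) -> float:
    """Adjusted R^2 for a model with p predictors fit to n observations."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

n = 34  # sample size in the study

# Hypothetical raw R^2 values for illustration: adding a predictor only
# raises adjusted R^2 if the gain in fit outweighs the penalty.
print(round(adjusted_r2(0.345, n, 2), 2))  # deafness duration + speech at t0
print(round(adjusted_r2(0.49, n, 3), 2))   # ... plus AMRD threshold
```

Note that with the same raw R², a model with more predictors always has the lower adjusted R², which is what makes the reported increase a meaningful, non-redundant contribution of the AMRD threshold.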

Con­clu­sions:

These findings identify AMRD thresholds as a reliable, nonredundant predictor above and beyond established speech tests for CI outcome. This AMRD test could potentially be developed into a rapid clinical temporal-resolution test to be integrated into the postoperative test battery to improve the reliability of speech outcome prognosis.

 

Categories
Attention Auditory Cortex Auditory Perception Brain stimulation Papers Psychology Publications Speech

New paper in press in Brain Stimulation: Wöstmann, Vosskuhl, Obleser, and Herrmann demonstrate that externally amplified oscillations affect auditory spatial attention

In a fine collaboration we combine expertise on auditory cognition (Malte Wöstmann & Jonas Obleser, University of Lübeck) and brain stimulation (Johannes Vosskuhl and Christoph S. Herrmann, University of Oldenburg) to show that externally stimulated alpha and gamma oscillations differentially affect spatial attention to speech. Our participants performed a dichotic listening task while being stimulated using transcranial alternating current stimulation (tACS) at alpha or gamma frequency (vs sham) over the left hemisphere. Alpha-tACS relatively decreased recall of targets contralateral to stimulation, while gamma-tACS reversed this effect. These results suggest that externally amplified oscillations are functionally relevant to spatial attention.

Wöstmann, M., Vosskuhl, J., Obleser, J., & Herrmann, C.S. (2018). Opposite effects of lateralised transcranial alpha versus gamma stimulation on auditory spatial attention.

Now avail­able online:

https://www.sciencedirect.com/science/article/pii/S1935861X18301074

Abstract:

Background: Spatial attention relatively increases the power of neural 10-Hz alpha oscillations in the hemisphere ipsilateral to attention, and decreases alpha power in the contralateral hemisphere. For gamma oscillations (>40 Hz), the opposite effect has been observed. The functional roles of lateralised oscillations for attention are currently unclear.

Hypothesis: If lateralised oscillations are functionally relevant for attention, transcranial stimulation of alpha versus gamma oscillations in one hemisphere should differentially modulate the accuracy of spatial attention to the ipsi- versus contralateral side.

Methods: 20 human participants performed a dichotic listening task under continuous transcranial alternating current stimulation (tACS, vs sham) at alpha (10 Hz) or gamma (47 Hz) frequency. On each trial, participants attended to four spoken numbers at the left or right ear, while ignoring numbers at the other ear. In order to stimulate a left temporo-parietal cortex region, which is known to show marked modulations of alpha power during auditory spatial attention, tACS (1 mA peak-to-peak amplitude) was applied at electrode positions TP7 and FC5 over the left hemisphere.

Results: As predicted, unihemispheric alpha-tACS relatively decreased the recall of targets contralateral to stimulation, but increased recall of ipsilateral targets. Importantly, this spatial pattern of results was reversed for gamma-tACS.

Conclusions: Results provide a proof of concept that transcranially stimulated oscillations can enhance spatial attention and facilitate attentional selection of speech. Furthermore, opposite effects of alpha versus gamma stimulation support the view that states of high alpha are incommensurate with active neural processing as reflected by states of high gamma.

Categories
Auditory Perception Clinical relevance Papers Perception Psychology Publications

New paper out in the ‘European Journal of Neuroscience’: Tune, Wöstmann & Obleser

AC postdocs Sarah Tune and Malte Wöstmann have a new paper out online in the special issue on Neural Oscillations in the European Journal of Neuroscience! We are excited to share the results from our first study of the ERC-funded project on listening behavior and adaptive control in middle-aged adults. In this study, we asked whether the fidelity of alpha power lateralization would serve as a neural marker of selective auditory attention in the ageing listener. The results of our multivariate approach demonstrate that understanding inter-individual differences is paramount to understanding the role of alpha oscillations in auditory attention across age.

Tune, S., Wöstmann, M., & Obleser, J. (2018). Probing the limits of alpha power lateralisation as a neural marker of selective attention in middle-aged and older listeners.

Now avail­able online:

http://onlinelibrary.wiley.com/doi/10.1111/ejn.13862/full/

 

Categories
Adaptive Control Attention Auditory Cortex Auditory Neuroscience Auditory Perception Auditory Speech Processing Degraded Acoustics EEG / MEG Evoked Activity Executive Functions Neural Oscillations Noise-Vocoded Speech Papers Perception Psychology Publications Speech

New paper in press in Cerebral Cortex: Wöstmann et al. on ignoring degraded speech

Auditory Cognition’s own Malte Wöstmann is in press in Cerebral Cortex with his latest offering on how attentional control manifests in alpha-power changes: ignoring speech can be beneficial (if comprehending speech potentially detracts from another task), and we here show how this change in listening goals reverses the pattern of alpha-power changes with changing speech degradation. (We will update as the paper becomes available online.)

Wöstmann, M., Lim, S.J., & Obleser, J. (2017). The human neural alpha response to speech is a proxy of attentional control. Cerebral Cortex. In press.

 

Abstract
Human alpha (~10 Hz) oscillatory power is a prominent neural marker of cognitive effort. When listeners attempt to process and retain acoustically degraded speech, alpha power increases. It is unclear whether these alpha modulations reflect the degree of acoustic degradation per se or the degradation-driven demand on a listener’s attentional control. Using an irrelevant-speech paradigm in electroencephalography (EEG), the current experiment demonstrates that the neural alpha response to speech is a surprisingly clear proxy of top-down control, entirely driven by the listening goals of attending versus ignoring degraded speech. While listeners (n = 23) retained the serial order of 9 to-be-recalled digits, one to-be-ignored sentence was presented. Distractibility of the to-be-ignored sentence was parametrically varied in acoustic detail (noise-vocoding), with more acoustic detail of distracting speech increasingly disrupting listeners’ serial memory recall. Where previous studies had observed decreases in parietal and auditory alpha power with more acoustic detail (of target speech), alpha power here showed the opposite pattern and increased with more acoustic detail in the speech distractor. In sum, the neural alpha response reflects almost exclusively a listener’s exertion of attentional control, which is decisive for whether more acoustic detail facilitates comprehension (of attended speech) or enhances distraction (of ignored speech).
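Noise-vocoding, the manipulation used here to vary the distractor’s acoustic detail, splits speech into frequency bands, extracts each band’s amplitude envelope, and re-imposes that envelope on band-limited noise; fewer bands mean less spectral detail. A simplified FFT-based sketch of the general technique (illustrative only, not the stimulus code used in the study):

```python
import numpy as np

def noise_vocode(signal, fs, n_bands=4, fmin=100.0, fmax=8000.0):
    """Minimal FFT-based noise vocoder.

    Splits the input into log-spaced frequency bands, extracts each band's
    amplitude envelope, and uses it to modulate band-limited noise.
    """
    rng = np.random.default_rng(0)
    edges = np.geomspace(fmin, fmax, n_bands + 1)   # log-spaced band edges
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spec = np.fft.rfft(signal)
    noise_spec = np.fft.rfft(rng.standard_normal(len(signal)))
    out = np.zeros(len(signal))
    win = max(1, int(0.01 * fs))                    # 10-ms smoothing window
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (freqs >= lo) & (freqs < hi)
        band = np.fft.irfft(spec * mask, n=len(signal))
        # Crude envelope: rectify, then smooth with a moving average.
        env = np.convolve(np.abs(band), np.ones(win) / win, mode="same")
        carrier = np.fft.irfft(noise_spec * mask, n=len(signal))
        out += env * carrier                        # envelope-modulated noise
    return out
```

Real stimulus pipelines typically use filter banks and Hilbert envelopes rather than FFT masks, but the principle is the same: temporal envelope information is preserved while spectral fine structure is replaced by noise.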
Categories
Auditory Cortex Auditory Perception Cross-Modal Integration EEG / MEG Neural Oscillations Perception

New paper out: Plöchl, Gaston, Mermagen, König & Hairston, Scientific Reports

An article by our new AC group member Michael Plöchl from his PhD project in Osnabrück has been accepted for publication in Scientific Reports. In their study, Plöchl, Gaston, Mermagen, König and Hairston demonstrate that “Oscillatory activity in auditory cortex reflects the perceptual level of audio-tactile integration”.


Abstract
Cross-modal interactions between sensory channels have been shown to depend on both the spatial disparity and the perceptual similarity between the presented stimuli. Here we investigate the behavioral and neural integration of auditory and tactile stimulus pairs at different levels of spatial disparity. Additionally, we modulated the amplitudes of both stimuli in either a coherent or non-coherent manner. We found that both auditory and tactile localization performance was biased towards the stimulus in the respective other modality. This bias increases linearly with stimulus disparity and is more pronounced for coherently modulated stimulus pairs. Analyses of electroencephalographic (EEG) activity at temporal–cortical sources revealed enhanced event-related potentials (ERPs) as well as decreased alpha and beta power during bimodal as compared to unimodal stimulation. However, while the observed ERP differences are similar for all stimulus combinations, the extent of oscillatory desynchronization varies with stimulus disparity. Moreover, when both stimuli were subjectively perceived as originating from the same direction, the reduction in alpha and beta power was significantly stronger. These observations suggest that in the EEG the level of perceptual integration is mainly reflected by changes in ongoing oscillatory activity.