Categories
Adaptive Control Auditory Neuroscience Auditory Speech Processing Auf deutsch Events Executive Functions Hearing Loss Media Speech

Jonas presented for the KIND Hörstiftung in Berlin (Video)

Im Februar hatte ich die Ehre, für die Kind Hörstiftung auf deren 2019er Symposium in Berlin unsere Arbeiten zur Vorhersage des Hörerfolgs exemplarisch anhand einiger unserer Studien allgemeinverständlich zu beleuchten. Ein 25-minütiges Video dieses Vortrags ist jetzt online.

(In February, I had the honour of presenting some of our recent work on predicting individuals' listening success at the 2019 symposium of the Kind Hearing Foundation in Berlin. A 25-minute video of the talk, in German, is now available.)

Categories
Attention Auditory Cortex Auditory Speech Processing EEG / MEG Psychology Speech

AC postdoc Malte Wöstmann scores DFG grant to study the temporal dynamics of the auditory attentional filter

In this three-year project, we will use the auditory modality as a test case to investigate how the suppression of distracting information (i.e., "filtering") is neurally implemented. While it is known that the attentional sampling of targets (a) is rhythmic, (b) can be entrained, and (c) is modulated by top-down predictions, the existence and neural implementation of these mechanisms for the suppression of distractors is at present unclear. To test this, we will use adaptations of established behavioural paradigms of distractor suppression and recordings of human electrophysiological signals in the magneto-/electroencephalogram (M/EEG).

Abstract of research project:

Background: Goal-directed behaviour in temporally dynamic environments requires focusing on relevant information and not getting distracted by irrelevant information. To achieve this, two cognitive processes are necessary: On the one hand, attentional sampling of target stimuli has been the focus of extensive research. On the other hand, it is less well known how the human neural system exploits temporal information in the stimulus to filter out distraction. In the present project, we use the auditory modality as a test case to study the temporal dynamics of attentional filtering and its neural implementation.

Approach and general hypothesis: In three variants of the "Irrelevant-Sound Task" we will manipulate temporal aspects of auditory distractors. Behavioural recall of target stimuli despite distraction, and responses in the electroencephalogram (EEG), will reflect the integrity and neural implementation of the attentional filter. In line with preliminary research, our general hypothesis is that attentional filtering rests on similar but sign-reversed mechanisms as attentional sampling: for instance, while attention to rhythmic stimuli increases neural sensitivity at time points of expected target occurrence, filtering of distractors should instead decrease neural sensitivity at the time of expected distraction.
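The sign-reversed hypothesis can be caricatured in a few lines of Python. This is purely illustrative: the 2-Hz rhythm and the 0.3 gain depth are invented for the sketch, not parameters of the project.

```python
import numpy as np

fs = 100                                # samples per second (arbitrary)
t = np.arange(0, 2, 1 / fs)             # 2 s of time
rhythm = np.cos(2 * np.pi * 2 * t)      # a 2-Hz stimulus rhythm (made up)

# Attentional sampling: neural sensitivity peaks when targets are expected
sampling_gain = 1 + 0.3 * rhythm
# Hypothesised attentional filtering: same rhythm, reversed sign, so
# sensitivity dips at the time of expected distraction
filtering_gain = 1 - 0.3 * rhythm

print(sampling_gain[0], filtering_gain[0])  # 1.3 vs 0.7 at an expected event
```

The two gain profiles are mirror images around baseline, which is exactly the "similar but sign-reversed" relation the hypothesis posits.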

Work programme: In each of three Work Packages (WPs), we will take as a model an established neural mechanism of attentional sampling and test the existence and neural implementation of a similar mechanism for attentional filtering. This way, we will investigate whether attentional filtering follows an intrinsic rhythm (WP1), whether rhythmic distractors can entrain attentional filtering (WP2), and whether foreknowledge about the time of distraction induces top-down tuning of the attentional filter in frontal cortex regions (WP3).

Objectives and relevance: The primary objective of this research is to contribute to the foundational science on human selective attention, which requires a comprehensive understanding of how the neural system achieves the task of filtering out distraction. Furthermore, hearing difficulties often stem from distraction by salient but irrelevant sound. Results of this research will translate to the development of hearing aids that take into account neuro-cognitive mechanisms to filter out distraction more efficiently.

Categories
Attention Auditory Cortex Auditory Speech Processing Papers Psychology Publications Speech

New paper in press in the Journal of Cognitive Neuroscience

Wöstmann, Schmitt, and Obleser demonstrate that closing the eyes enhances the attentional modulation of neural alpha power but does not affect behavioural performance in two listening tasks

Does closing the eyes enhance our ability to listen attentively? In fact, many of us tend to close our eyes when listening conditions become challenging, for example on the phone. It is thus surprising that there is no published work on the behavioural or neural consequences of closing the eyes during attentive listening. In the present study, we demonstrate that eye closure not only increases the overall level of absolute alpha power but also the degree to which auditory attention modulates alpha power over time, in synchrony with attending to versus ignoring speech. However, our behavioural results provide evidence for the absence of any difference in listening performance with closed versus open eyes. The likely reason is that the impact of eye closure on neural oscillatory dynamics does not match the alpha power modulations associated with listening performance precisely enough (see figure).
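For readers curious how "alpha power over time" is typically quantified, here is a minimal, generic sketch: band-pass the signal to 8–13 Hz and take the squared Hilbert envelope. This is not the authors' analysis pipeline; the toy signal and all parameters are invented for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_power(eeg, fs, band=(8.0, 13.0)):
    """Band-pass one EEG channel to the alpha band and return its
    instantaneous power (squared Hilbert envelope)."""
    nyq = fs / 2
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    return np.abs(hilbert(filtfilt(b, a, eeg))) ** 2

# Toy data: a 1-s burst of 10-Hz "alpha" embedded in weak noise
rng = np.random.default_rng(1)
fs = 250
t = np.arange(0, 4, 1 / fs)
eeg = 0.2 * rng.standard_normal(t.size)
eeg[fs:2 * fs] += np.sin(2 * np.pi * 10 * t[fs:2 * fs])

power = alpha_power(eeg, fs)
print(power[fs:2 * fs].mean() > power[2 * fs:3 * fs].mean())  # True
```

Attentional modulation of alpha power, as in the study, is then a contrast of such power time courses between attend and ignore conditions.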

The paper is available as a preprint here.


Categories
Adaptive Control Ageing Attention Auditory Cortex Auditory Neuroscience Auditory Speech Processing Executive Functions fMRI Papers Psychology Uncategorized

New paper in PNAS by Alavash, Tune, Obleser

How brain areas communicate shapes human communication: the hearing regions in your brain form new alliances as you try to listen at the cocktail party

Obleserlab postdocs Mohsen Alavash and Sarah Tune rock out an intricate graph-theoretical account of modular reconfigurations in challenging listening situations, and how these predict individuals' listening success.

Available online now in PNAS! (Also, our uni is currently featuring a German-language press release on it, as well as an English-language version.)

Categories
Attention Auditory Cortex Auditory Neuroscience EEG / MEG Papers Perception Psychology Publications

New paper in Neuroimage by Fiedler et al.: Tracking ignored speech matters

Listening requires selective neural processing of the incoming sound mixture, which in humans is borne out by a surprisingly clean representation of attended-only speech in auditory cortex. How this neural selectivity is achieved even at negative signal-to-noise ratios (SNR) remains unclear. We show that, under such conditions, a late cortical representation (i.e., neural tracking) of the ignored acoustic signal is key to successful separation of attended and distracting talkers (i.e., neural selectivity). We recorded and modeled the electroencephalographic response of 18 participants who attended to one of two simultaneously presented stories, while the SNR between the two talkers varied dynamically between +6 and −6 dB. The neural tracking showed an increasing early-to-late attention-biased selectivity. Importantly, acoustically dominant (i.e., louder) ignored talkers were tracked neurally by late involvement of fronto-parietal regions, which contributed to enhanced neural selectivity. This neural selectivity, by way of representing the ignored talker, poses a mechanistic neural account of attention under real-life acoustic conditions.
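"Neural tracking" in this line of work rests on linear stimulus-response (temporal response function) models. As a rough illustration of the idea, here is a generic ridge-regression sketch on synthetic data; it is not the authors' code, and the filter length, regularisation strength, and data are invented.

```python
import numpy as np

def fit_trf(envelope, eeg, n_lags, ridge=1.0):
    """Estimate a temporal response function: ridge regression from a
    lagged stimulus envelope onto one EEG channel."""
    n = envelope.size
    # Design matrix: column k holds the envelope delayed by k samples
    X = np.zeros((n, n_lags))
    for lag in range(n_lags):
        X[lag:, lag] = envelope[:n - lag]
    # Closed-form ridge solution: w = (X'X + rI)^-1 X'y
    return np.linalg.solve(X.T @ X + ridge * np.eye(n_lags), X.T @ eeg)

# Synthetic data: EEG = envelope convolved with a known kernel, plus noise
rng = np.random.default_rng(0)
env = rng.random(2000)
true_trf = np.array([0.0, 0.5, 1.0, 0.5, 0.0])
eeg = np.convolve(env, true_trf)[:env.size] + rng.normal(0, 0.1, env.size)

w = fit_trf(env, eeg, n_lags=5)
print(np.round(w, 1))  # ≈ true_trf
```

Attended and ignored talkers each get their own envelope regressor in practice, and the relative strength of the two recovered responses indexes neural selectivity.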

The paper is available here.

Categories
Attention Auditory Cortex Brain stimulation Papers Perception Publications

New paper in press in JASA: Kreitewolf et al. on the role of voice-feature continuity for cocktail-party listening

Obleserlab postdoc Jens Kreitewolf is in press in The Journal of the Acoustical Society of America!

Together with our colleagues Marc Schönwiesner (Montreal/Leipzig), Samuel Mathias (Yale), and Régis Trapeau (Montreal/Marseille), we investigated the roles of two of the most salient voice features, glottal-pulse rate (GPR) and vocal-tract length (VTL), for perceptual grouping at the cocktail party. Using carefully controlled stimuli, we show that listeners exploit continuity in both voice features to solve the cocktail-party problem, but that VTL continuity plays a stronger role for perceptual grouping than GPR continuity. Our findings are in line with the differential importance of VTL and GPR for the identification of natural talkers and have clinically relevant implications for cocktail-party listening in cochlear-implant users.

Data were recorded using the Dome at BRAMS during Jens' ACN Erasmus Mundus exchange in Montreal.

The paper is available as a preprint:

https://www.biorxiv.org/content/early/2018/07/30/379545


Categories
Adaptive Control Auditory Cortex Auditory Neuroscience Auditory Working Memory Neural Oscillations Papers Perception Psychology Uncategorized

New paper in The Journal of Neuroscience: Wilsch et al., Temporal expectation modulates the cortical dynamics of short-term memory

Congratulations to Obleserlab alumna Anna Wilsch, who is – for now – leaving academia on a true high with her latest offering on how temporal expectations ("foreknowledge" about when something is to happen) shape the neural make-up of memory!

Recorded while the Obleserlab was still in Leipzig at the Max Planck, and analysed with great input from our co-authors Molly Henry, Björn Herrmann, as well as Christoph Herrmann (Oldenburg), Anna used magnetoencephalography in an intricate but ultimately very simple sensory-memory paradigm.


While sensory memories of the physical world fade quickly, Anna here shows that this decay of short-term memory can be counteracted by temporal expectation.

Notably, spatially distributed cortical patterns of alpha (8–13 Hz) power showed opposing effects in auditory versus visual sensory cortices. Moreover, alpha-tuned connectivity changes within supramodal attention networks reflect the allocation of neural resources as short-term memory representations fade.

– to be updated as the paper becomes available online –

Categories
Attention Auditory Cortex Auditory Perception Brain stimulation Papers Psychology Publications Speech

New paper in press in Brain Stimulation: Wöstmann, Vosskuhl, Obleser, and Herrmann demonstrate that externally amplified oscillations affect auditory spatial attention

In a fine collaboration, we combine expertise on auditory cognition (Malte Wöstmann & Jonas Obleser, University of Lübeck) and brain stimulation (Johannes Vosskuhl and Christoph S. Herrmann, University of Oldenburg) to show that externally stimulated alpha and gamma oscillations differentially affect spatial attention to speech. Our participants performed a dichotic listening task while being stimulated using transcranial alternating current stimulation (tACS) at alpha or gamma frequency (vs sham) over the left hemisphere. Alpha-tACS relatively decreased recall of targets contralateral to stimulation, while gamma-tACS reversed this effect. These results suggest that externally amplified oscillations are functionally relevant to spatial attention.

Wöstmann, M., Vosskuhl, J., Obleser, J., & Herrmann, C.S. (2018). Opposite effects of lateralised transcranial alpha versus gamma stimulation on auditory spatial attention.

Now available online:

https://www.sciencedirect.com/science/article/pii/S1935861X18301074

Abstract:

Background: Spatial attention relatively increases the power of neural 10-Hz alpha oscillations in the hemisphere ipsilateral to attention, and decreases alpha power in the contralateral hemisphere. For gamma oscillations (>40 Hz), the opposite effect has been observed. The functional roles of lateralised oscillations for attention are currently unclear.

Hypothesis: If lateralised oscillations are functionally relevant for attention, transcranial stimulation of alpha versus gamma oscillations in one hemisphere should differentially modulate the accuracy of spatial attention to the ipsi- versus contralateral side.

Methods: 20 human participants performed a dichotic listening task under continuous transcranial alternating current stimulation (tACS, vs sham) at alpha (10 Hz) or gamma (47 Hz) frequency. On each trial, participants attended to four spoken numbers in the left or right ear, while ignoring numbers in the other ear. In order to stimulate a left temporo-parietal cortex region, which is known to show marked modulations of alpha power during auditory spatial attention, tACS (1 mA peak-to-peak amplitude) was applied at electrode positions TP7 and FC5 over the left hemisphere.
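To make the stimulation parameters concrete: a 1 mA peak-to-peak sinusoid has a 0.5 mA amplitude. A minimal sketch of such a waveform follows; the sampling rate and ramp duration are arbitrary choices for illustration, not taken from the paper.

```python
import numpy as np

def tacs_waveform(freq_hz, duration_s, fs, peak_to_peak_ma=1.0, ramp_s=1.0):
    """Sinusoidal tACS current with linear on/off ramps.
    1 mA peak-to-peak corresponds to a 0.5 mA amplitude."""
    t = np.arange(0, duration_s, 1 / fs)
    current = (peak_to_peak_ma / 2) * np.sin(2 * np.pi * freq_hz * t)
    # Ramp up over the first ramp_s seconds, down over the last ramp_s
    ramp = np.minimum(1.0, np.minimum(t / ramp_s, (duration_s - t) / ramp_s))
    return current * ramp

fs = 1000                                 # illustrative sampling rate
alpha_stim = tacs_waveform(10, 10, fs)    # 10-Hz "alpha" condition
gamma_stim = tacs_waveform(47, 10, fs)    # 47-Hz "gamma" condition
print(alpha_stim.max())                   # ≈ 0.5 (mA peak)
```

The two conditions differ only in frequency, which is what lets the design attribute any behavioural difference to the stimulated rhythm itself.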

Results: As predicted, unihemispheric alpha-tACS relatively decreased the recall of targets contralateral to stimulation, but increased recall of ipsilateral targets. Importantly, this spatial pattern of results was reversed for gamma-tACS.

Conclusions: Results provide a proof of concept that transcranially stimulated oscillations can enhance spatial attention and facilitate attentional selection of speech. Furthermore, opposite effects of alpha versus gamma stimulation support the view that states of high alpha are incommensurate with active neural processing, as reflected by states of high gamma.