Categories: Auditory Working Memory, EEG / MEG, Executive Functions, Neural Oscillations, Papers, Publications

New Paper by Lim, Wöstmann, & Obleser in Journal of Neuroscience

Can you attentively “highlight” auditory traces in memory? If so, what are the potential neural mechanisms?

Sung-Joo Lim’s paper in J Neurosci,

Selective Attention to Auditory Memory Neurally Enhances Perceptual Precision

is now available online (full text).

Abstract
Selective attention to a task-relevant stimulus facilitates encoding of that stimulus into a working memory representation. It is less clear whether selective attention also improves the precision of a stimulus already represented in memory. Here, we investigate the behavioral and neural dynamics of selective attention to representations in auditory working memory (i.e., auditory objects) using psychophysical modeling and model-based analysis of electroencephalographic signals. Human listeners performed a syllable pitch discrimination task where two syllables served as to-be-encoded auditory objects. Valid (vs neutral) retroactive cues were presented during retention to allow listeners to selectively attend to the to-be-probed auditory object in memory. Behaviorally, listeners represented auditory objects in memory more precisely (expressed by steeper slopes of a psychometric curve) and made faster perceptual decisions when valid compared to neutral retrocues were presented. Neurally, valid compared to neutral retrocues elicited a larger frontocentral sustained negativity in the evoked potential as well as enhanced parietal alpha/low-beta oscillatory power (9–18 Hz) during memory retention. Critically, individual magnitudes of alpha oscillatory power (7–11 Hz) modulation predicted the degree to which valid retrocues benefitted individuals’ behavior. Our results indicate that selective attention to a specific object in auditory memory does benefit human performance not by simply reducing memory load, but by actively engaging complementary neural resources to sharpen the precision of the task-relevant object in memory.
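For readers curious about the precision measure mentioned in the abstract: the slope of a psychometric curve is typically estimated by fitting a sigmoid to listeners’ binary responses as a function of the pitch difference between probe and memorized syllable, with a steeper slope indicating a more precise memory representation. Below is a minimal, hypothetical sketch of that fitting step; it is not the paper’s analysis code, and the simulated data points are purely illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, threshold, slope):
    """Psychometric function: probability of a "probe is higher" response
    as a function of the pitch difference x (probe minus memorized syllable)."""
    return 1.0 / (1.0 + np.exp(-slope * (x - threshold)))

# Simulated example data (illustrative only): pitch differences in semitones
# and the proportion of "probe is higher" responses at each difference.
pitch_diff = np.array([-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0])
p_higher   = np.array([0.05, 0.15, 0.35, 0.50, 0.70, 0.85, 0.95])

# Fit threshold and slope; a steeper fitted slope corresponds to sharper
# discrimination, i.e., a more precise representation in memory.
(threshold, slope), _ = curve_fit(logistic, pitch_diff, p_higher, p0=[0.0, 1.0])
print(f"threshold = {threshold:.2f} semitones, slope = {slope:.2f}")
```

In the study, the comparison of interest is between such slopes under valid versus neutral retrocues; the sketch above only illustrates the general curve-fitting idea.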

Congrats!

Categories: Editorial Notes, Events, Posters, Publications

See you at SfN

Society for Neuroscience 2015 is coming up. Please come and check out our stuff! Also, Jonas will be chairing the symposium on cortical encoding of complex sound (with talks by former PhD student Julia Erb and former postdoc Björn Herrmann) on Tuesday morning.

Posters by the Obleser lab:

Tuesday morning session:
FIEDLER et al., In-ear-EEG …, Board M46
WILSCH et al., Cortical patterns of alpha power …, Board Y1
Wednesday afternoon session:
LIM et al., Evoked responses and alpha oscillations …, Board BB37

See you there.

 

Categories: Editorial Notes

Welcome Sung-Joo Lim & Alex Brandmeyer

We welcome Sung-Joo Lim (KR) & Alex Brandmeyer (US) as new postdoctoral researchers in the group.

Sung-Joo very recently received her Ph.D. from Carnegie Mellon University, Pittsburgh, PA (US), with her dissertation

Investigating the Neural Basis of Sound Category Learning within a Naturalistic Incidental Task

See her abstract:
Adults have notorious difficulty learning non-native speech categories even with extensive training with standard tasks providing explicit trial-by-trial feedback. Recent research in general auditory category learning demonstrates that videogame-based training, which incorporates features that model the naturalistic learning environment, leads to fast and robust learning of sound categories. Unlike standard tasks, the videogame paradigm does not require overt categorization of or explicit attention to sounds; listeners learn sounds incidentally as the game encourages the functional use of sounds in an environment, in which actions and feedback are tightly linked to task success. These characteristics may engage reinforcement learning systems, which can potentially generate internal feedback signals from the striatum. However, the influence of striatal signals on perceptual learning and plasticity online during training has yet to be established. This dissertation work focuses on the possibility that this type of training can lead to behavioral learning of non-native speech categories, and on the investigation of neural processes postulated to be significant for inducing incidental learning of sound categories within the more naturalistic training environment by using fMRI. Overall, our results suggest that reward-related signals from the striatum influence perceptual representations in regions associated with the processing of reliable information that can improve performance within a naturalistic learning task.

Alex very recently received his Ph.D. from Radboud University Nijmegen (NL), with his thesis

Auditory brain-computer interfaces for perceptual learning in speech and music

See his abstract:
We perceive the sounds in our environment, such as language and music, effortlessly and transparently, unaware of the complex neurophysiological mechanisms that underlie our experiences. Using electroencephalography (EEG) and techniques from the field of machine learning, it’s possible to monitor our perception of the auditory world in real-time and to pinpoint individual differences in perceptual abilities related to native-language background and auditory experience. Going further, these same methods can be used to provide individuals with neurofeedback during auditory perception as a means of modulating brain responses to sounds, with the eventual aim of incorporating these methods into educational settings to aid in auditory perceptual learning.
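As a rough illustration of the single-trial decoding idea described in the abstract (real-time EEG monitoring and neurofeedback are considerably more involved), a classifier can be trained to distinguish brain responses to two sound categories from trial-wise EEG features. The pipeline below is a hypothetical sketch using scikit-learn with simulated features, not code from the thesis.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated single-trial EEG features (illustrative only):
# 200 trials x 64 channels; labels encode which of two sound categories was heard.
X = rng.standard_normal((200, 64))
y = rng.integers(0, 2, size=200)

# Standardize features, then classify; cross-validated accuracy estimates how
# reliably the recorded responses separate the two categories.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

In a real application, the feature matrix would come from preprocessed, epoched EEG rather than random numbers, and the trained model would be applied to incoming data to drive feedback.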

Wishing you all the best.