Cortical Resonance Frequencies Emerge from Network Size and Connectivity

Neural oscillations occur across a wide frequency range, with different brain regions exhibiting resonance-like characteristics at specific points in the spectrum. At the microscopic scale, single neurons possess intrinsic oscillatory properties, yet it is not known whether cortical resonance is a consequence of these intrinsic neural oscillations or an emergent property of the networks that interconnect them. Using a network model of loosely coupled Wilson-Cowan oscillators to simulate a patch of cortical sheet, we demonstrate that the size of the activated network is inversely related to its resonance frequency. Further analysis of the parameter space indicated that the number of excitatory and inhibitory connections, as well as the average transmission delay between units, determined the resonance frequency. The model predicted that if an activated network within the visual cortex increased in size, its resonance frequency would decrease. We tested this prediction experimentally using the steady-state visual evoked potential, stimulating the visual cortex with stimuli of different sizes at a range of driving frequencies. We demonstrate that the frequency corresponding to the peak steady-state response was inversely correlated with the size of the network. We conclude that although individual neurons possess resonance properties, oscillatory activity at the macroscopic level is strongly influenced by network interactions, and that the steady-state response can be used to investigate functional networks.
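The delay–frequency relationship the model identifies can be illustrated with a much simpler toy than the full Wilson-Cowan network: a single delayed negative-feedback loop dx/dt = -k·x(t - τ) oscillates at roughly 1/(4τ), so a longer effective transmission delay (a proxy for a larger network) yields a lower frequency. A minimal sketch, not the authors' model; all parameters are illustrative:

```python
import numpy as np

def delayed_feedback_freq(delay, dt=1e-4, t_max=4.0):
    """Oscillation frequency (Hz) of the delayed feedback loop
    dx/dt = -k * x(t - delay), with k set to the oscillatory threshold
    k = pi / (2 * delay); linear theory predicts f = 1 / (4 * delay)."""
    k = np.pi / (2.0 * delay)
    d = int(round(delay / dt))              # delay in integration steps
    n = int(round(t_max / dt))
    x = np.zeros(n)
    x[:d] = 1e-3                            # small initial perturbation
    for i in range(d, n - 1):               # forward Euler with delayed term
        x[i + 1] = x[i] + dt * (-k * x[i - d])
    tail = x[n // 2:]                       # discard the transient
    up = np.where((tail[:-1] < 0) & (tail[1:] >= 0))[0]  # upward zero crossings
    return 1.0 / (np.diff(up).mean() * dt)

# Doubling the delay roughly halves the oscillation frequency
f_short = delayed_feedback_freq(delay=0.05)  # theory: 1/(4*0.05) = 5 Hz
f_long = delayed_feedback_freq(delay=0.10)   # theory: 2.5 Hz
```

This is only the delay mechanism in isolation; the paper's model additionally varies the number of excitatory and inhibitory connections.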

Left superior temporal gyrus is coupled to attended speech in a cocktail-party auditory scene

Interesting low frequency (delta) results here.

Using a continuous listening task, we evaluated the coupling between the listener's cortical activity and the temporal envelopes of different sounds in a multitalker auditory scene using magnetoencephalography and corticovocal coherence analysis. Neuromagnetic signals were recorded from 20 right-handed healthy adult humans who listened to five different recorded stories (attended speech streams), one without any multitalker background (No noise) and four mixed with a “cocktail party” multitalker background noise at four signal-to-noise ratios (5, 0, −5, and −10 dB) to produce speech-in-noise mixtures, referred to here as the Global scene. Coherence analysis revealed that the modulations of the attended speech stream, presented without multitalker background, were coupled at ∼0.5 Hz to the activity of both superior temporal gyri, whereas the 4–8 Hz modulations were coupled to the activity of the right supratemporal auditory cortex. In cocktail-party conditions, with the multitalker background noise present, the coupling at both frequencies was stronger for the attended speech stream than for the unattended multitalker background, and the coupling strength decreased as the level of the multitalker background increased. During the cocktail-party conditions, the ∼0.5 Hz coupling became left-hemisphere dominant, compared with the bilateral coupling observed without the multitalker background, whereas the 4–8 Hz coupling remained right-hemisphere lateralized in both conditions. Brain activity was not coupled to the multitalker background or to its individual talkers. These results highlight the key role of the listener's left superior temporal gyrus in extracting the slow ∼0.5 Hz modulations, likely reflecting the attended speech stream, within a multitalker auditory scene.
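The core of this envelope-tracking approach can be sketched with synthetic data: coherence between a toy "speech envelope" and a noisy signal that tracks it peaks at the envelope's modulation rate. A minimal illustration with made-up signals and parameters, not the study's MEG pipeline:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 200.0                          # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)        # 60 s of toy data

# Hypothetical "speech envelope": a slow ~0.5 Hz modulation in noise
modulation = np.sin(2 * np.pi * 0.5 * t)
envelope = modulation + 0.5 * rng.standard_normal(t.size)

# Hypothetical "cortical" signal: tracks the modulation at a 100 ms lag,
# buried in much stronger broadband noise
neural = np.roll(modulation, int(0.1 * fs)) + 2.0 * rng.standard_normal(t.size)

# Magnitude-squared coherence resolves the shared ~0.5 Hz component
f, coh = coherence(envelope, neural, fs=fs, nperseg=int(8 * fs))
peak_freq = f[np.argmax(coh)]       # lands near the 0.5 Hz modulation
```

Coherence is phase-insensitive to a fixed lag, which is why a delayed but consistent cortical response still yields a strong peak.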

Neural cross-frequency coupling: Connecting architectures, mechanisms, and functions

Looks good.

Neural oscillations are ubiquitously observed in the mammalian brain, but it has proven difficult to tie oscillatory patterns to specific cognitive operations. Notably, the coupling between neural oscillations at different timescales has recently received much attention, both from experimentalists and theoreticians. We review the mechanisms underlying various forms of this cross-frequency coupling. We show that different types of neural oscillators and cross-frequency interactions yield distinct signatures in neural dynamics. Finally, we associate these mechanisms with several putative functions of cross-frequency coupling, including neural representations of multiple environmental items, communication over distant areas, internal clocking of neural processes, and modulation of neural processing based on temporal predictions.
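The most widely studied form of cross-frequency coupling, phase-amplitude coupling, can be sketched in a few lines: a mean-vector-length measure (in the style of Canolty et al.) is large when the amplitude of a fast rhythm is concentrated at a preferred phase of a slow rhythm. A toy example with simulated theta-modulated gamma; frequencies and strengths are illustrative, not taken from the review:

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def bandpass(x, lo, hi, fs, order=2):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

rng = np.random.default_rng(1)
fs = 200.0
t = np.arange(0, 20, 1 / fs)

# Toy signal: a 60 Hz "gamma" whose amplitude follows the 6 Hz "theta" phase
theta = np.sin(2 * np.pi * 6 * t)
gamma = (1 + theta) * np.sin(2 * np.pi * 60 * t)
signal = theta + 0.5 * gamma + 0.2 * rng.standard_normal(t.size)

phase = np.angle(hilbert(bandpass(signal, 4, 8, fs)))    # theta phase
amp = np.abs(hilbert(bandpass(signal, 50, 70, fs)))      # gamma amplitude

# Mean vector length: near 0 when amplitude and phase are unrelated,
# larger when gamma amplitude clusters at one theta phase
pac = np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)
# Null estimate: shuffling the amplitude series destroys the coupling
pac_null = np.abs(np.mean(rng.permutation(amp) * np.exp(1j * phase))) / np.mean(amp)
```

In practice the null distribution is built from many time-shifted or shuffled surrogates rather than a single permutation.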

Congruent visual speech enhances cortical entrainment to continuous auditory speech

Nice article that bears out predictions made by Peelle & Sommers (2015).

We demonstrate that the cortical representation of the speech envelope is enhanced by the presentation of congruent audiovisual speech in noise-free conditions. Furthermore, we show that this is likely attributable to the contribution of neural generators that are not particularly active during unimodal stimulation and that it is most prominent at the temporal scale corresponding to syllabic rate (2–6 Hz). Finally, our data suggest that neural entrainment to the speech envelope is inhibited when the auditory and visual streams are incongruent both temporally and contextually.

Measuring directionality between neuronal oscillations of different frequencies

This measure is based on the phase-slope index (PSI) between the phase of slower oscillations and the power envelope of faster oscillations. Further, we propose a randomization framework for statistically evaluating the coupling measures while controlling for multiple comparisons over the investigated frequency ranges. The method was first validated on simulated data and then applied to resting-state electrocorticography (ECoG) data. These results demonstrate that the method works reliably. In particular, we found that the power envelope of gamma oscillations drives the phase of slower oscillations in the alpha band.
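The phase-slope logic at the heart of the measure can be sketched on a simpler case: two broadband signals with a known delay, rather than a slow phase and a fast power envelope as in the paper. The sign of the summed imaginary part of the coherency increments indicates which signal leads. A minimal sketch with toy signals and illustrative settings:

```python
import numpy as np
from scipy.signal import butter, filtfilt, csd, welch

def psi(x, y, fs, band, nperseg=512):
    """Phase-slope index of x relative to y over `band` (after Nolte et al.).
    Positive values indicate that x temporally leads y.  scipy's csd returns
    conj(X)*Y, so arguments are swapped to obtain S_xy = X * conj(Y)."""
    f, sxy = csd(y, x, fs=fs, nperseg=nperseg)
    _, sxx = welch(x, fs=fs, nperseg=nperseg)
    _, syy = welch(y, fs=fs, nperseg=nperseg)
    c = sxy / np.sqrt(sxx * syy)                     # complex coherency
    i = np.where((f >= band[0]) & (f <= band[1]))[0]
    return float(np.imag(np.sum(np.conj(c[i[:-1]]) * c[i[1:]])))

rng = np.random.default_rng(2)
fs = 200.0
n = int(120 * fs)
b, a = butter(2, [8 / (fs / 2), 12 / (fs / 2)], btype="band")

# 8-12 Hz "driver" and a copy delayed by 25 ms plus independent noise:
# the driver leads, so its index relative to the copy should be positive
driver = filtfilt(b, a, rng.standard_normal(n))
delayed = np.roll(driver, int(0.025 * fs)) + 0.5 * filtfilt(b, a, rng.standard_normal(n))

forward = psi(driver, delayed, fs, band=(8, 12))     # > 0: driver leads
backward = psi(delayed, driver, fs, band=(8, 12))    # sign reversed
```

To approximate the paper's setting, `y` would be replaced by the Hilbert power envelope of a gamma-band signal, and significance assessed with the proposed randomization framework.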

Identifying neuronal oscillations using rhythmicity

Here, we present lagged coherence, a frequency-indexed measure that quantifies the rhythmicity of neuronal activity. We use this method to identify the sensorimotor alpha and beta rhythms in ongoing magnetoencephalographic (MEG) data, and to study their attentional modulation. Using lagged coherence, the sensorimotor rhythms become visible in ongoing activity as local rhythmicity peaks that are separated from the strong posterior activity in the same frequency bands.
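Lagged coherence can be computed directly from Fourier coefficients of non-overlapping epochs: a sustained rhythm has a predictable phase from one epoch to the next, whereas unstructured noise does not. A minimal sketch with toy signals; the epoch length and taper are illustrative choices, not the authors' exact settings:

```python
import numpy as np

def lagged_coherence(x, freq, fs, lag=1, n_cycles=3):
    """Lagged coherence at one frequency: consistency of the Fourier phase
    across non-overlapping epochs separated by `lag` epochs (in the spirit
    of the lagged-coherence measure)."""
    win = int(n_cycles * fs / freq)             # epoch length in samples
    n_ep = len(x) // win
    epochs = x[: n_ep * win].reshape(n_ep, win)
    taper = np.hanning(win)
    # Fourier coefficient of each tapered epoch at the target frequency
    carrier = np.exp(-2j * np.pi * freq * np.arange(win) / fs)
    coeffs = (epochs * taper * carrier).sum(axis=1)
    num = np.abs(np.sum(coeffs[:-lag] * np.conj(coeffs[lag:])))
    den = np.sqrt(np.sum(np.abs(coeffs[:-lag]) ** 2) *
                  np.sum(np.abs(coeffs[lag:]) ** 2))
    return num / den                            # bounded in [0, 1]

rng = np.random.default_rng(3)
fs = 250.0
t = np.arange(0, 30, 1 / fs)

rhythm = np.sin(2 * np.pi * 10 * t) + rng.standard_normal(t.size)  # sustained 10 Hz rhythm
noise = rng.standard_normal(t.size)                                # no rhythm

lc_rhythm = lagged_coherence(rhythm, freq=10, fs=fs)
lc_noise = lagged_coherence(noise, freq=10, fs=fs)
```

Unlike raw power, this index stays low for non-rhythmic signals even when their spectrum contains energy at the probed frequency, which is what makes it useful for separating sensorimotor rhythms from strong posterior activity in the same bands.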

Rhythmic auditory cortex activity shapes stimulus–response gain and background firing

We found that phase-dependent models better reproduced the observed responses than static models, during both stimulation with a series of natural sounds and epochs of silence. This was attributable to two factors: (1) phase-dependent variations in background firing (most prominent for delta; 1–4 Hz); and (2) modulations of response gain that rhythmically amplify and attenuate the responses at specific phases of the rhythm (prominent for frequencies between 2 and 12 Hz).
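The structure of such a phase-dependent model can be sketched abstractly: background firing and stimulus-response gain are each modulated by the phase of the slow rhythm, whereas a static model keeps only their phase-averaged values. A toy parameterization with illustrative numbers, not the fitted model from the study:

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
phase = (2 * np.pi * 2.0 * t) % (2 * np.pi)       # phase of a 2 Hz delta rhythm
stim = (rng.random(t.size) < 0.01).astype(float)  # sparse sound-driven input events

# Hypothetical phase-dependent model: both the background rate and the
# stimulus-response gain vary with the phase of the slow rhythm
background = 5.0 + 3.0 * np.cos(phase)            # spikes/s, phase-dependent
gain = 1.0 + 0.8 * np.cos(phase - np.pi / 4)      # multiplicative response gain
rate = background + gain * 20.0 * stim

# The corresponding static model keeps only the phase-averaged values
rate_static = background.mean() + gain.mean() * 20.0 * stim
```

The two cosine terms capture the paper's two factors separately: the `background` term modulates firing even in silence, while the `gain` term rhythmically amplifies and attenuates stimulus-driven responses.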

Oscillatory visual activity modulates spatial attention to rhythmic inputs

Using high-density electroencephalography in humans, we observed a bilateral entrainment response to the rhythm (1.3 or 1.5 Hz) of an attended stimulation stream, concurrent with a considerably weaker contralateral entrainment to a competing rhythm. That ipsilateral visual areas strongly entrained to the attended stimulus is notable because competitive inputs to these regions were being driven at an entirely different rhythm. Strong modulations of phase locking and weak modulations of single-trial power suggest that entrainment was primarily driven by phase alignment of ongoing oscillatory activity.
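The phase-locking versus power dissociation in the last sentence can be illustrated directly: if each trial contains an ongoing oscillation of fixed amplitude whose phase is aligned by the stimulus, inter-trial phase coherence at the driving frequency is high while single-trial power is identical to the phase-random case. A toy sketch with simulated trials and illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 250.0
n_trials = 60
t = np.arange(0, 2.0, 1 / fs)                 # 2 s trials
f_drive = 1.5                                 # rhythm of the stimulation stream (Hz)

# Ongoing oscillations of identical amplitude; in one condition the phase is
# pulled toward the stimulus (small jitter), in the other it is random
aligned = np.array([np.sin(2 * np.pi * f_drive * t + 0.3 * rng.standard_normal())
                    for _ in range(n_trials)])
random_phase = np.array([np.sin(2 * np.pi * f_drive * t + rng.uniform(0, 2 * np.pi))
                         for _ in range(n_trials)])

def itc_and_power(trials, freq, fs):
    """Inter-trial phase coherence (0..1) and mean single-trial power at freq."""
    carrier = np.exp(-2j * np.pi * freq * np.arange(trials.shape[1]) / fs)
    coeffs = (trials * carrier).sum(axis=1)   # one Fourier coefficient per trial
    itc = np.abs(np.mean(coeffs / np.abs(coeffs)))
    power = np.mean(np.abs(coeffs) ** 2)
    return itc, power

itc_aligned, pow_aligned = itc_and_power(aligned, f_drive, fs)
itc_random, pow_random = itc_and_power(random_phase, f_drive, fs)
```

Phase alignment leaves single-trial power untouched while driving inter-trial coherence toward 1, which is the signature the authors use to argue for entrainment of ongoing oscillations rather than an additive evoked response.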