Google’s Brain2Music: Reconstructing Music from Human Brain Activity

 

The process of reconstructing experiences from human brain activity offers a unique lens into how the brain interprets and represents the world. Recently, researchers at Google and international collaborators introduced a method for reconstructing music from brain activity alone, captured using functional magnetic resonance imaging (fMRI). The approach uses either music retrieval or the MusicLM music generation model, conditioned on embeddings derived from the fMRI data. The generated music resembles the musical stimuli the human subjects experienced with respect to semantic properties such as genre, instrumentation, and mood. The researchers further investigate the relationship between different components of MusicLM and brain activity through a voxel-wise encoding model analysis, and examine which brain regions represent information derived from purely textual descriptions of the music stimuli.
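To make the decoding idea concrete, here is a minimal sketch of such a pipeline, assuming (purely for illustration) that the fMRI-to-embedding mapping is a ridge regression and that the retrieval variant is a nearest-neighbour search over a pool of candidate music embeddings. The array shapes, the 128-dimensional embedding size, and all variable names are hypothetical, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(0)

# Hypothetical training data: fMRI responses (trials x voxels) paired with
# embeddings (trials x embedding_dim) of the music clips the subject heard.
n_train, n_voxels, emb_dim = 400, 6000, 128
X_train = rng.standard_normal((n_train, n_voxels))
Y_train = rng.standard_normal((n_train, emb_dim))

# Fit a regularized linear map from brain activity to the embedding space.
decoder = Ridge(alpha=1e3)
decoder.fit(X_train, Y_train)

# Decode an embedding from a held-out fMRI response.
x_test = rng.standard_normal((1, n_voxels))
predicted_embedding = decoder.predict(x_test)

# Retrieval variant: pick the candidate clip whose embedding is closest
# (by cosine similarity) to the decoded embedding.
candidate_embeddings = rng.standard_normal((1000, emb_dim))
similarities = cosine_similarity(predicted_embedding, candidate_embeddings)
best_clip_index = int(np.argmax(similarities))
print(f"Retrieved candidate clip index: {best_clip_index}")

# Generation variant (not shown here): the decoded embedding would instead
# condition a music generation model such as MusicLM.
```

In this sketch the same decoded embedding serves both variants: it either indexes into a library of existing clips or conditions a generative model, which mirrors the retrieval-versus-generation distinction described above.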

Read the full article at: google-research.github.io
