The aim of this project was to provide a full neuroimaging workflow, from preprocessing of raw data to visualisation of results; to explore longitudinal analysis between two treatments in this dataset; and to visualise resting-state networks linked to the default mode network and attention. In my GitHub repository you will find scripts and documentation covering the BIDS to NIfTI conversion, fMRIPrep, and resting-state visualisation for a single participant. There is also a PowerPoint presentation to guide you through the work.
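The resting-state visualisation step typically amounts to seed-based functional connectivity: correlating the time series of a seed region (e.g. the posterior cingulate cortex, a default mode network hub) with every other voxel. Below is a minimal sketch of that idea on synthetic data; the region sizes, signal strengths, and variable names are illustrative assumptions, not values from this project, where the inputs would instead be preprocessed fMRIPrep outputs.

```python
import numpy as np

# Synthetic seed-based connectivity sketch (illustrative only):
# correlate a simulated "seed" time series with simulated voxel
# time series. Real inputs would come from preprocessed fMRI data.
rng = np.random.default_rng(0)
n_timepoints, n_voxels = 200, 500

seed = rng.standard_normal(n_timepoints)              # seed-region signal
data = rng.standard_normal((n_timepoints, n_voxels))  # voxel noise
data[:, :50] += 2.0 * seed[:, None]                   # first 50 voxels share the seed signal

# Pearson correlation of the seed with each voxel
seed_z = (seed - seed.mean()) / seed.std()
data_z = (data - data.mean(axis=0)) / data.std(axis=0)
conn_map = (seed_z @ data_z) / n_timepoints

in_network = conn_map[:50].mean()    # high correlation (shared signal)
out_network = conn_map[50:].mean()   # near zero (pure noise)
print(in_network, out_network)
```

In the real pipeline, `conn_map` would be reshaped back into brain space and thresholded to display the network on an anatomical image.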
In this project I aim to combine data from different modalities (fMRI, EEG, and behavioural) to better understand sound and music processing. My main focus was to reproduce some of the results from a published paper, starting from the raw data.