This page describes how we analyze cortical thickness with FreeSurfer.
First, make sure that your .bashrc file contains the lines needed to run FreeSurfer:

export FREESURFER_HOME=/usr/local/freesurfer
source $FREESURFER_HOME/SetUpFreeSurfer.sh > /dev/null
export SUBJECTS_DIR=/mnt/diskArray/projects/freesurfer
Next, make sure that the right toolboxes are on your MATLAB search path. This should be done through your startup.m file:

addpath(genpath('~/git/yeatmanlab'));
Step 1: In a terminal, convert the PAR/REC files to NIfTI images. You may not need to do this if you have already gone through the Anatomy Pipeline.

cd /mnt/diskArray/projects/MRI/[subid]
parrec2nii -c -b *.PAR
Step 2: In MATLAB, compute the root mean squared (RMS) image. Once again, this might already have been done in the Anatomy Pipeline, in which case you can re-use that RMS image.

T1path = 'Path to t1 weighted image';
T1path = mri_rms(T1path); % Root mean squared image
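The combination that mri_rms performs can be summarized as taking, at each voxel, the square root of the mean of the squared echo intensities. A minimal pure-Python sketch of that formula follows; the voxel values are hypothetical, and the real mri_rms operates on whole NIfTI volumes rather than single voxels.

```python
import math

def rms_combine(echoes):
    """Root-mean-square combination at one voxel: sqrt(mean(x_i^2)).

    `echoes` holds the intensity of that voxel at each echo time.
    """
    return math.sqrt(sum(x * x for x in echoes) / len(echoes))

# Hypothetical voxel sampled at four echo times:
print(rms_combine([100.0, 80.0, 60.0, 40.0]))  # ≈ 73.48
```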
FreeSurfer is a useful tool for segmenting a T1-weighted image and building a cortical mesh. To segment the subject's T1-weighted image with FreeSurfer, type the following at the command line:
recon-all -i /home/projects/MRI/[subjid]/[YYYYMMDD]/[subjid]_WIP_MEMP_VBM_SENSE_13_1_MSE.nii.gz -subjid [subid] -all
The MATLAB script parallel_recon.m automates and parallelizes this process (with slight adjustments to specify the input and output files); segmentation typically takes 6-10 hours per file. It is also recommended to run the process on multiple servers to further divide the work.
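The lab's actual tool here is the MATLAB script parallel_recon.m; as a rough sketch of the same idea, the commands for several subjects can be built and dispatched in parallel. The subject IDs and file paths below are hypothetical, and the sketch only reports the commands it would launch rather than actually running recon-all.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def recon_cmd(nifti_path, subjid):
    """Build one recon-all invocation as an argument list."""
    return ["recon-all", "-i", nifti_path, "-subjid", subjid, "-all"]

def run(cmd, dry_run=True):
    """Launch one job; with dry_run the command is only reported."""
    if not dry_run:
        subprocess.run(cmd, check=True)  # blocks until recon-all finishes
    return " ".join(cmd)

# Hypothetical subject IDs and input RMS images:
subjects = {
    "sub01": "/mnt/diskArray/projects/MRI/sub01/sub01_T1_RMS.nii.gz",
    "sub02": "/mnt/diskArray/projects/MRI/sub02/sub02_T1_RMS.nii.gz",
}

# Each recon-all run is largely single-threaded, so running subjects
# side by side makes good use of a multi-core server.
with ThreadPoolExecutor(max_workers=2) as pool:
    launched = list(pool.map(run, (recon_cmd(p, s) for s, p in subjects.items())))
print("\n".join(launched))
```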
Following the segmentation of the T1-weighted image, two further pre-processing steps are necessary before longitudinal analysis of the data. First, an unbiased template must be created for each subject, taking all time-points/sessions into account. Second, each time-point must be processed again using this template so that subject data are standardized across time-points.
The command to create the unbiased template is:
recon-all -base [output file name] -tp [time-point 1 file] -tp [time-point 2 file] ... -tp [time-point n file] -all
Next, each time-point is processed using this template (note that recon-all expects the time-point ID before the template ID):

recon-all -long [time-point file] [template file] -all
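In parallel_recon.m these commands are generated by helper functions; a rough Python sketch of that generation step is below. The template and session names are hypothetical. One detail worth keeping straight: recon-all's -long flag takes the time-point ID first and the template ID second, a common trip-up.

```python
def base_cmd(template, timepoints):
    """Build: recon-all -base <template> -tp <tp1> ... -tp <tpN> -all"""
    cmd = ["recon-all", "-base", template]
    for tp in timepoints:
        cmd += ["-tp", tp]
    return cmd + ["-all"]

def long_cmds(template, timepoints):
    """Build one 'recon-all -long <tp> <template> -all' per time-point."""
    return [["recon-all", "-long", tp, template, "-all"] for tp in timepoints]

# Hypothetical subject with two sessions:
tps = ["sub01_session1", "sub01_session2"]
print(" ".join(base_cmd("sub01_base", tps)))
for c in long_cmds("sub01_base", tps):
    print(" ".join(c))
```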
The recon-all -base command takes about as long as the original FreeSurfer segmentation; however, each iteration of the recon-all -long command takes approximately half that time. Using helper functions, the parallel_recon.m script will also generate and execute these commands.
There are several different options for longitudinal data analysis, each of which provides unique and useful information about the data and how subjects change over time.
A first step for several forms of longitudinal analysis is the generation of a qdec file: a text file that consolidates the subjects of interest along with any other relevant measures or covariates. The file is used to extract, organize, and analyze data for the subjects/time-points listed. A longitudinal qdec file has one row per time-point, listing the time-point, its base template, and the covariates.
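As a concrete illustration, a longitudinal qdec table can be written as whitespace-separated columns headed fsid and fsid-base (the column names FreeSurfer expects for the time-point and its base template), followed by covariates. The subject IDs, sessions, and covariate values below are hypothetical.

```python
import os
import tempfile

# Hypothetical longitudinal qdec table: the first column (fsid) names each
# time-point, the second (fsid-base) names that subject's base template,
# and any remaining columns hold covariates of interest.
rows = [
    ["fsid",           "fsid-base",  "years", "reading_score"],
    ["sub01_session1", "sub01_base", "0.0",   "85"],
    ["sub01_session2", "sub01_base", "1.0",   "92"],
    ["sub02_session1", "sub02_base", "0.0",   "78"],
]

qdec_path = os.path.join(tempfile.gettempdir(), "long.qdec.table.dat")
with open(qdec_path, "w") as f:
    for row in rows:
        f.write(" ".join(row) + "\n")

# Read the header back to confirm the layout.
with open(qdec_path) as f:
    header = f.readline().split()
print(header)
```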
Because some analyses sort data numerically using information from the qdec file, one version of the file should identify subjects' time-points and base templates strictly numerically. However, an original file should also be kept in which time-points and base templates correspond directly to the file names containing the data for those time-points.
In order to complete the following analyses, there are two more steps of collecting and processing the data, using the commands mris_preproc and mri_surf2surf. For example:
mris_preproc --qdec-long long.qdec.table.dat --target fsaverage --hemi lh --meas thickness --out lh.thickness.stack.mgh
Followed by:
mri_surf2surf --hemi lh --s fsaverage --sval lh.thickness.stack.mgh --tval lh.thickness.stack.fwhm10.mgh --fwhm-trg 10 --cortex --noreshape
Further documentation for both mris_preproc and mri_surf2surf can be found in the FreeSurfer documentation.
The output of mri_surf2surf, along with the numerical qdec file, can then be used as input to the function long_prepare_LME.m. The output of this function is an array containing (among other things) data for the relevant subjects and a data matrix with information from the qdec file. These outputs are used in the analyses that follow.
A whole-brain correlational analysis can illuminate the relationship between behavioral measures and brain anatomy. It also provides an assortment of regions of interest whose structural change over time can be investigated.
Using the outputs of long_prepare_LME, the function make_correlation_map.m will create a .mgh file that can be mapped onto the cortical surface to visualize regions correlated with a given covariate. This file can later be used to extract clusters (a process described under ROI Analysis), from which the function corrplot_ROIs.m calculates the correlation value and its significance and plots the data. An optional output of this function is a 3D matrix containing information about the ROIs, which can be used for plotting the data longitudinally (using long_mtx_2_longplot.m) or analyzed with the linear mixed effects model (described below).
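At its core, this kind of correlation map amounts to computing, at every vertex, a Pearson correlation between thickness values across subjects and a behavioral covariate. A toy pure-Python sketch of that idea follows; the thickness values and scores are made up, and the real make_correlation_map.m works on the stacked .mgh data rather than small lists.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

# Toy data: thickness at 3 vertices for 4 subjects, plus a behavioral score.
thickness = [  # one row per vertex, one column per subject
    [2.1, 2.3, 2.2, 2.5],
    [3.0, 2.9, 3.1, 3.2],
    [2.4, 2.6, 2.5, 2.8],
]
scores = [85, 92, 88, 97]

# One correlation value per vertex; mapped to the surface, these values
# form the correlation map.
corr_map = [pearson(vertex, scores) for vertex in thickness]
print(corr_map)
```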
The linear mixed effects model is a useful and robust method of statistical analysis. To use this model, the 3D data matrix output by corrplot_ROIs.m must be reformatted to a 2D matrix with the function consol_long_mtx.m. Once the 2D matrix is created, it can be passed to lme_long_fitandplot.m, which calculates significance and plots the change in brain anatomy over time.
The linear mixed effects model can be used to statistically analyze ROIs found through other forms of analysis. However, it is also possible to use the model to identify regions of interest based on the significance of their change over time. The function lme_whole_brain_analysis.m uses information from long_prepare_LME.m and analyzes the longitudinal change at each vertex of the brain across time. The output is a collection of significance files that can be mapped onto the cortical surface for visualization in freeview or used for ROI analysis.
Note: As there are tens of thousands of vertices to analyze, the whole-brain analysis will take anywhere from 4 to 12 hours, depending on the number of time-points and the complexity of the hypothesis.
As described above, analysis of ROIs provides a lot of useful information, such as change in those regions over time or correlations with behavioral measures. The FreeSurfer command mri_surfcluster takes an .mgh file, among other inputs, and outputs several files containing useful information. The command can be run either at the command line or in MATLAB with the function do_mri_surfcluster.m. Further documentation on mri_surfcluster can be found in the FreeSurfer documentation.
In order to use functions such as lme_long_fitandplot.m, the outputs of these files must be processed and reformatted into matrices. For the analyses described above, corrplot_ROIs.m will reformat information for correlational analysis, and cluster_2_lme_longmtx.m should be used to process the clusters extracted from the output of lme_whole_brain_analysis.m.