HCP Process
Software to process HCP data through AFQ is available from https://github.com/yeatmanlab/BrainTools/tree/master/projects/HCP
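The HCP scripts are MATLAB code within the BrainTools repository and presumably also require AFQ and vistasoft (which provides dtiInit) on the MATLAB path. As a minimal sketch (the clone location below is an assumption, not a required layout), a local copy can be added to the path before calling the HCP scripts:

<syntaxhighlight lang="matlab">
% Add a local clone of BrainTools to the MATLAB path so the HCP project
% scripts (e.g., HCP_run_dtiInit) are callable. The clone location is an
% assumption; adjust it to your own setup.
brainToolsDir = fullfile(getenv('HOME'), 'git', 'BrainTools');
addpath(genpath(brainToolsDir));
</syntaxhighlight>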
==HCP Extension for AFQ==
Processing of HCP diffusion data through AFQ has been fully automated. The output directory is roughly twice the size of the input: around 2.5 TB for the 900-subject diffusion data set (3.6 TB if the zipped files are left on the drive).
===HCP_run_dtiInit===
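HCP_run_dtiInit itself lives in the repository linked above; the sketch below is not its actual code or interface, only an illustration, under stated assumptions, of the kind of per-subject pipeline being automated: tensor fitting with vistasoft's dtiInit followed by AFQ. The data location, HCP file names, output directory name, and parameter choices are all assumptions.

<syntaxhighlight lang="matlab">
% Hypothetical per-subject sketch: fit tensors with dtiInit, then run AFQ.
hcpDir = '/mnt/diskArray/HCP';                    % assumed data location
subID  = '100307';                                % example HCP subject ID
subDir = fullfile(hcpDir, subID);

% Diffusion data and T1 as distributed in the HCP preprocessed package
% (file names assumed).
dwRaw = fullfile(subDir, 'T1w', 'Diffusion', 'data.nii.gz');
t1    = fullfile(subDir, 'T1w', 'T1w_acpc_dc_restore_1.25.nii.gz');

% dtiInit parameters. HCP diffusion data are already eddy/motion corrected,
% so correction is skipped here (an assumption about the intended settings).
dwParams             = dtiInitParams;
dwParams.eddyCorrect = -1;   % -1 = skip eddy current / motion correction
dwParams.clobber     = 1;    % overwrite any existing outputs
dwParams.bvecsFile   = fullfile(subDir, 'T1w', 'Diffusion', 'bvecs');
dwParams.bvalsFile   = fullfile(subDir, 'T1w', 'Diffusion', 'bvals');

% Fit tensors and write the dt6 directory that AFQ expects.
dtiInit(dwRaw, t1, dwParams);

% Run AFQ on the resulting dt6 directory (directory name assumed).
sub_dirs  = {fullfile(subDir, 'dti90trilin')};
sub_group = 0;               % 0 = control group
afq = AFQ_Create('sub_dirs', sub_dirs, 'sub_group', sub_group);
afq = AFQ_run(sub_dirs, sub_group, afq);
</syntaxhighlight>

In practice the wrapper would presumably loop over all downloaded subjects and handle unzipping and cleanup of the HCP archives, which is where the disk-space figures above come from.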