Automated classification of depression from structural brain measures across two independent community‐based cohorts

Aleks Stolicyn* (Corresponding Author), Mathew A. Harris, Xueyi Shen, Miruna C. Barbu, Mark J. Adams, Emma L. Hawkins, Laura de Nooij, Hon Wah Yeung, Alison D Murray, Stephen M. Lawrie, J. Douglas Steele, Andrew M. McIntosh, Heather C. Whalley

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

18 Citations (Scopus)
4 Downloads (Pure)


Major depressive disorder (MDD) has been the subject of many neuroimaging case–control classification studies. Although some studies report accuracies ≥80%, most have investigated relatively small samples of clinically‐ascertained, currently symptomatic cases, and did not attempt replication in larger samples. Here, we first aimed to replicate previously reported classification accuracies in a small, well‐phenotyped community‐based group of current MDD cases with clinical interview‐based diagnoses (from the STratifying Resilience and Depression Longitudinally cohort, 'STRADL'). We performed a set of exploratory predictive classification analyses with measures related to brain morphometry and white matter integrity. We applied three classifier types — support vector machine (SVM), penalised logistic regression, or decision tree — either with or without optimisation, and with or without feature selection. We then determined whether similar accuracies could be replicated in a larger independent population‐based sample with self‐reported current depression (UK Biobank cohort). Additional analyses extended to lifetime MDD diagnoses — remitted MDD in STRADL, and lifetime‐experienced MDD in UK Biobank. The highest cross‐validation accuracy (75%) was achieved in the initial current MDD sample with a decision tree classifier and cortical surface area features. The most frequently selected decision tree split variables included surface areas of the bilateral caudal anterior cingulate, left lingual gyrus, left superior frontal, right precentral and paracentral regions. High accuracy was not achieved in the larger samples with self‐reported current depression (53.73%), with remitted MDD (57.48%), or with lifetime‐experienced MDD (52.68–60.29%). Our results indicate that high predictive classification accuracies may not immediately translate to larger samples with broader criteria for depression, and may not be robust across different classification approaches.
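The cross-validated decision-tree analysis described in the abstract could be sketched as follows with scikit-learn. This is a minimal illustration, not the authors' code: the data here are synthetic stand-ins for cortical surface-area features, and the hyperparameter grid is a hypothetical choice assumed for the "with optimisation" condition.

```python
# Hedged sketch of a cross-validated decision-tree classification pipeline,
# loosely analogous to the abstract's "decision tree with optimisation" setup.
# All data and parameter values below are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import GridSearchCV, cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in for regional cortical surface-area features
# (e.g. 68 Desikan-Killiany regions) for 100 subjects, with binary
# case/control labels. The real study used measured MRI features.
X = rng.normal(size=(100, 68))
y = rng.integers(0, 2, size=100)

# Inner loop: hyperparameter optimisation via grid search.
grid = GridSearchCV(
    DecisionTreeClassifier(random_state=0),
    param_grid={"max_depth": [2, 3, 5], "min_samples_leaf": [5, 10]},
    cv=5,
)

# Outer loop: cross-validated accuracy of the optimised classifier.
scores = cross_val_score(grid, X, y, cv=5)
print(f"Mean cross-validation accuracy: {scores.mean():.2f}")
```

With random features as here, accuracy should hover near chance (~50%); the nested structure (grid search inside an outer cross-validation loop) is what keeps the reported accuracy unbiased by hyperparameter tuning.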
Original language: English
Pages (from-to): 3922-3937
Number of pages: 16
Journal: Human Brain Mapping
Issue number: 14
Early online date: 19 Jun 2020
Publication status: Published - 1 Oct 2020

Bibliographical note

This study was supported and funded by the Wellcome Trust Strategic Award 'Stratifying Resilience and Depression Longitudinally' (STRADL) (Reference 104036/Z/14/Z), and the Medical Research Council Mental Health Pathfinder Award 'Leveraging routinely collected and linked research data to study the causes and consequences of common mental disorders' (Reference MRC-MC_PC_17209). MAH is supported by research funding from the Dr Mortimer and Theresa Sackler Foundation. The research was conducted using the UK Biobank resource, with application number 4844. Structural brain imaging data from the UK Biobank were processed at the University of Edinburgh Centre for Cognitive Ageing and Cognitive Epidemiology (CCACE), which is part of the cross-council Lifelong Health and Wellbeing Initiative (MR/K026992/1). CCACE received funding from the Biotechnology and Biological Sciences Research Council (BBSRC) and the Medical Research Council (MRC), and was also supported by Age UK as part of The Disconnected Mind project. This work has made use of the resources provided by the Edinburgh Compute and Data Facility (ECDF).


  • brain structure
  • classical twin
  • depression
  • diffusion MRI
  • machine learning
  • major depressive disorder
  • structural MRI
  • MRI
  • classification


