Fulfilling humans' right to explanation by integrating machine learning

  • Forrest, James Robert (Student)
  • Sripada, Somayajulu (Academic Supervisor)

Project: Studentship

Project Details

Description / Abstract

Advances in machine learning (ML) are transforming our society. As machine-learnt models increasingly become work colleagues to humans (loan applications, for example, are processed mainly by algorithms, with humans called in only occasionally), humans expect improved access to these models, particularly to their inner workings. New regulatory regimes around the world are introducing a human 'right to explanation'. This means, for example, that a customer whose loan application has been turned down could ask for an explanation. Evidently, new research is required to investigate computational techniques for explaining to humans the inner workings of machine-learnt models. This project aims to bring together techniques from natural language generation (NLG), machine learning and information visualization (InfoVis).
Status: Finished
Effective start/end date: 1/10/17 – 31/12/20