Information-theoretic measures of incremental parser load were generated from a phrase structure parser and a dependency parser, and then compared with incremental eye movement metrics collected for the same temporarily syntactically ambiguous sentences, focussing on the disambiguating word. The findings show that surprisal and entropy reduction metrics computed over a phrase structure grammar are promising predictors of text readability for human comprehenders. This suggests a role for such metrics in Natural Language Generation (NLG).
Published in: Proceedings of The 3rd Workshop on Predicting and Improving Text Readability for Target Reader Populations, 2014.