The Linguistic Association of Korea

Title Neural Network Language Models as Psycholinguistic Subjects: Focusing on Reflexive Dependency
Authors Wonil Chung & Myung-Kwan Park
Volume/Issue Vol. 30, No. 4
Pages 169-190
Published 2022-12-31
Abstract Chung, Wonil & Park, Myung-Kwan. (2022). Neural network language models as psycholinguistic subjects: Focusing on reflexive dependency. The Linguistic Association of Korea Journal, 30(4), 169-190. The purpose of this study is to investigate how neural network language models (LMs) resolve the reflexive-antecedent dependency when it accompanies a wh-filler-gap dependency in sentence processing, comparing the LMs' processing results with those of humans. To do so, we adopt the psycholinguistic methodology that Frazier et al. (2015) used with human subjects. Four neural LMs are employed: a Long Short-Term Memory (LSTM) network trained on large datasets, a Generative Pre-trained Transformer-2 (GPT-2) trained on large datasets, an LSTM trained on small (L2) datasets, and a GPT-2 trained on small (L2) datasets. We found that only the LMs trained on large datasets were sensitive to the gender match between a reflexive and its antecedent, whereas all four neural LMs failed to learn the reflexive-antecedent dependency when it accompanies a wh-filler-gap dependency. We also found that the neural LMs show a learning bias with respect to gender mismatch.
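The abstract does not spell out the evaluation measure, but work that treats LMs as psycholinguistic subjects typically compares the model's surprisal at the reflexive across gender-match and gender-mismatch conditions. The sketch below is a minimal illustration of that paradigm, not the authors' code; the model checkpoint, the example sentences, and the surprisal function are all assumptions for illustration, using an off-the-shelf GPT-2 via the Hugging Face transformers library.

```python
# Illustrative sketch only: standard surprisal-based probing of a
# pretrained GPT-2, not the authors' released code or materials.
import math

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def surprisal(sentence: str, target: str) -> float:
    """Summed surprisal in bits (-log2 p) of `target` given its left context."""
    prefix = sentence[: sentence.index(target)].rstrip()
    prefix_ids = tokenizer(prefix, return_tensors="pt").input_ids
    target_ids = tokenizer(" " + target, return_tensors="pt").input_ids
    ids = torch.cat([prefix_ids, target_ids], dim=1)
    with torch.no_grad():
        log_probs = torch.log_softmax(model(ids).logits, dim=-1)
    bits = 0.0
    for i in range(target_ids.size(1)):
        pos = prefix_ids.size(1) + i  # position of the i-th target token
        # logits at pos-1 give the distribution over the token at pos
        bits += -log_probs[0, pos - 1, ids[0, pos]].item() / math.log(2)
    return bits

# Gender-(mis)matching antecedents for the reflexive; the sentences are
# made-up examples, not items from Frazier et al. (2015).
match = surprisal("The boy who met the actress hurt himself.", "himself")
mismatch = surprisal("The girl who met the actor hurt himself.", "himself")
print(f"match: {match:.2f} bits, mismatch: {mismatch:.2f} bits")
# An LM sensitive to the reflexive-antecedent dependency should assign
# lower surprisal to the reflexive in the gender-matching condition.
```

Under this paradigm, sensitivity is read off the surprisal difference (mismatch minus match), aggregated over many item pairs; the paper's wh-filler-gap manipulation would add conditions in which the matching antecedent is a fronted wh-filler rather than a local subject.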