Abstract
A central question in language acquisition is how children build linguistic representations that allow them to generalize verbs from one construction to another (e.g., The boy gave a present to the girl → The boy gave the girl a present), whilst appropriately constraining those generalizations to avoid non-adultlike errors (e.g., I said no to her → *I said her no). Although a consensus is emerging that learners solve this problem using both statistical and semantics-based learning procedures (e.g., entrenchment, pre-emption, and semantic verb class formation), there currently exist few, if any, proposals for a learning model that combines these mechanisms. The present study used a connectionist model to test an account that argues for competition between constructions based on (a) verb-in-construction frequency, (b) relevance of constructions for the speaker's intended message, and (c) fit between the fine-grained semantic properties of individual verbs and individual constructions. The model was able not only (a) to simulate the overall pattern of overgeneralization-then-retreat, but also (b) to use the semantics of novel verbs to predict their argument structure privileges (just as real learners do), and
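The abstract describes the model only at a high level. As a rough illustration of the kind of architecture it implies, the sketch below sets up a toy competition network in which a verb's semantic features and the intended message feed competing construction units, with verb-in-construction frequency reflected in how often each pairing appears in training. All verbs, feature values, and frequencies here are invented for illustration; this is not the authors' implementation.

```python
# Minimal sketch (assumed details, not the published model): input units code a
# verb's semantic features plus the intended message; output units are two
# competing dative constructions. Weights are learned with a simple delta rule,
# so verb-in-construction frequency shapes each verb's construction preferences.
import numpy as np

rng = np.random.default_rng(0)

CONSTRUCTIONS = ["PO-dative", "DO-dative"]           # competing output units
VERBS = {                                            # hypothetical semantic features
    "give": np.array([1.0, 0.9]),                    # e.g. [transfer, caused-possession]
    "say":  np.array([0.2, 0.1]),
}
MESSAGE = np.array([1.0])                            # intended "X transfers Y to Z" message

def make_input(verb):
    # Concatenate verb semantics with the message representation.
    return np.concatenate([VERBS[verb], MESSAGE])

W = rng.normal(scale=0.1, size=(len(CONSTRUCTIONS), 3))

def forward(x):
    # Softmax implements the competition between constructions.
    a = W @ x
    e = np.exp(a - a.max())
    return e / e.sum()

# Hypothetical corpus: "give" attested in both datives, "say" only in the PO-dative.
corpus = [("give", 0)] * 40 + [("give", 1)] * 40 + [("say", 0)] * 80

lr = 0.1
for epoch in range(200):
    for i in rng.permutation(len(corpus)):
        verb, target = corpus[i]
        x = make_input(verb)
        p = forward(x)
        t = np.eye(len(CONSTRUCTIONS))[target]
        W += lr * np.outer(t - p, x)                 # delta rule on the softmax output

for verb in VERBS:
    print(verb, dict(zip(CONSTRUCTIONS, forward(make_input(verb)).round(2))))
```

With these made-up frequencies, the trained network assigns "say" almost entirely to the PO-dative while leaving "give" free to occur in both, which is the qualitative statistical effect (entrenchment/pre-emption-like competition) the account relies on.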
| Original language | English |
| --- | --- |
| Pages (from-to) | 1245-1276 |
| Number of pages | 32 |
| Journal | J Child Lang |
| Volume | 43 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 2016 |
Keywords
- Child
- Child, Preschool
- Comprehension
- Computer Simulation
- Concept Formation
- Female
- Generalization, Psychological
- Humans
- Intention
- Language Development
- Linguistics
- Male
- Neural Networks, Computer
- Semantics