For more than 50 years, language scientists have assumed that sentence structure is fundamentally hierarchical, made up of small parts in turn made of smaller parts, like Russian nesting dolls.
A new Cornell study suggests language use is actually based on simpler sequential structures, like clusters of beads on a string.
Cornell psychology professor Morten Christiansen and his colleagues cite research in evolutionary biology indicating that humans, unlike other animals, acquired language because we evolved abilities in a number of areas, such as correctly guessing others' intentions and learning a large number of sounds that we then relate to meaning to create words.
In contrast, the hierarchy concept suggests humans have language thanks only to highly specialized "hardware" in the brain, which neuroscientists have yet to find.
The sequential concept holds that the human language system handles words by grouping them into small clumps that are then associated with meaning. Sentences are built from such word clumps, or "constructions," which are understood when they appear in a particular order. The word sequence "bread and butter," for example, might be represented as a construction, whereas the reversed sequence "butter and bread" likely would not be.
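As a rough illustration of how such constructions might be picked up from usage, the short Python sketch below counts three-word sequences in a toy corpus and treats the frequent ones as stored chunks. The corpus, the trigram counting, and the frequency threshold are all illustrative assumptions made for this sketch, not the mechanism proposed in the study.

    from collections import Counter

    # Toy corpus; a real learner would draw on vastly more input.
    corpus = (
        "she spread bread and butter on the toast . "
        "bread and butter is a familiar phrase . "
        "he asked for more bread and butter ."
    ).split()

    # Count how often each three-word sequence (trigram) occurs.
    trigrams = Counter(zip(corpus, corpus[1:], corpus[2:]))

    # Treat any sequence seen at least twice as a stored "construction";
    # the threshold of 2 is an arbitrary choice for this illustration.
    constructions = {seq for seq, n in trigrams.items() if n >= 2}

    print(("bread", "and", "butter") in constructions)  # True: frequent, stored as a chunk
    print(("butter", "and", "bread") in constructions)  # False: never seen in this order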
The sequence concept has simplicity on its side: language is naturally sequential, given the temporal cues that help us understand and be understood as we use it. Moreover, the hierarchy concept fails to account for the many other cues that help convey meaning, such as the setting, what was said before, and the speaker's intention.
The Cornell team's theory could influence natural language processing, the area of computer science that deals with human language, by encouraging researchers to focus on sequential structure when building systems that generate human-like speech or perform other language-processing tasks.
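To make the contrast concrete, here is a minimal sketch of a purely sequential language model: a bigram model that records which words follow which and extends a phrase one word at a time, with no parse trees or nested structure anywhere. The toy training text and the function name continue_phrase are hypothetical, chosen for this example rather than taken from any particular system.

    import random
    from collections import defaultdict

    # Toy training text; a real system would be trained on a large corpus.
    text = "the cat sat on the mat and the dog slept on the mat".split()

    # Purely sequential knowledge: for each word, which words followed it.
    following = defaultdict(list)
    for prev, nxt in zip(text, text[1:]):
        following[prev].append(nxt)

    def continue_phrase(word, length=5):
        # Extend a phrase left to right, always picking a word that
        # followed the current word somewhere in the training text.
        out = [word]
        for _ in range(length):
            options = following.get(out[-1])
            if not options:
                break
            out.append(random.choice(options))
        return " ".join(out)

    print(continue_phrase("the"))  # e.g. "the mat and the dog slept"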