To fully comprehend what is being said, we process language at multiple levels, breaking down both its meaning and its structure.
A talking ape
A bonobo (a species of great ape) named Panbanisha was born in captivity in 1985 and received language training throughout her life, until her death in 2012. Using a specially designed keyboard of 400 geometric patterns and a computer that translated them into a synthetic voice, she had built a vocabulary of 3,000 words by the age of 14.
While she had little grammar, she was able to combine symbols to form simple sentences such as “Please can I have an iced coffee?” A later study of three great apes found they were able to talk about both the past and the present, with language skills comparable to those of a young child.
Animals, it seems, can acquire language, even human language, yet their communication lacks the complexity and capacity for novel sentences that ours has. So we are left with a question: if language is not unique to humans, **what ‘hardware’ within the brain, and which cognitive processes, do we use to understand and produce such elaborate communication**?
A ‘universal’ grammar
Language is incredibly complex: it can refer to objects, events, and people without clearly naming them, and it can embed clauses within sentences to generate a seemingly infinite variety of expressions.
Take the following sentence, “The dog chased the cat.” It can be made more complex by adding extra detail, such as “The dog called Ralph chased the cat” or, indeed, “The dog called Ralph, who had narrowly missed the car, chased the ginger cat with the blue collar.”
**Noam Chomsky**’s original 1960s ‘**Universal Grammar**,’ suggesting an **innate set of grammatical rules common to all humans**, has been challenged and revised over time. Yet the idea that we are born with certain key capacities for language acquisition may explain how children, despite limited language exposure, are able to generate such complex and novel constructions from an early age.
However, **the jury is still out**: ongoing research into the world’s 6,000 to 8,000 languages has produced conflicting findings on whether there are universal features common to them all.
Language and thought
While language comprehension and production are clearly vital for successful and complex communication, **language itself may also be an essential element of our cognition**.
Indeed, in George Orwell’s dystopian classic _Nineteen Eighty-Four_, the invented language ‘Newspeak’ was intended to control and limit thought by removing unnecessary words and making language less expressive.
The imagined totalitarian government’s control of language was designed to prevent subjects from being able to verbalize rebellion or plan an uprising.
The ‘**Whorfian hypothesis**’ suggests that how we think is influenced by the language we speak. Originally put forward by **Benjamin Lee Whorf** in 1956, the hypothesis in its strong form argues that differences between languages produce differences in thinking, while a more moderate form suggests that language modifies certain aspects of cognition, including perception and memory.
There is some evidence to support such a hypothesis. Indeed, **research shows that our native language can impact performance on specific tasks**, including color discrimination and object categorization, and may even influence our ability to remember something.
And yet, while language may result in cognitive tendencies or preferences, there is little or no evidence to suggest that language ‘determines’ thinking.
Grammar
Grammar, or syntax, refers to the **set of rules belonging to a language**, which offer the potential to create a seemingly infinite set of sentences. These rules make it possible to produce novel, complex, information-rich sentences that may never have been uttered before.
And yet the set of rules that form the grammar of a language can result in ambiguity.
For example, consider the sentence, “I saw the man with the telescope.” Was ‘I’ using the telescope to see the man? Or was I looking at the ‘man’ carrying the telescope?
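To make the ambiguity concrete, here is a minimal sketch using Python and the NLTK library (assumed to be installed), with a toy grammar invented purely for illustration rather than taken from any real description of English. The same string of words yields two distinct syntactic structures, one for each reading.

```python
import nltk  # assumed installed: pip install nltk

# A toy grammar invented for this example; not a model of real English.
toy_grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> 'I' | Det N | Det N PP
    VP -> V NP | V NP PP
    PP -> P NP
    Det -> 'the'
    N  -> 'man' | 'telescope'
    V  -> 'saw'
    P  -> 'with'
""")

parser = nltk.ChartParser(toy_grammar)
words = "I saw the man with the telescope".split()

# The parser recovers two structures for the same words:
#   1. 'with the telescope' attaches to the verb phrase (I used the telescope)
#   2. 'with the telescope' attaches to the noun phrase (the man carried it)
for tree in parser.parse(words):
    tree.pretty_print()
```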
‘**Prosodic cues**’ in spoken language can help – our use of **stress, rhythm, intonation**, and **word duration** can all provide clues to the intended meaning behind what was said.
Furthermore, **the overall pattern of prosodic phrasing is an important, if not vital, aspect of language comprehension**. It offers clues to the relationships between the phrases and objects within a sentence, and to its overall message.
Parsing up the garden path
Parsing is a crucial aspect of language comprehension, involving the analysis of the ‘**syntax**’, or structure of what is being read or listened to, and the ‘**semantics**,’ the meaning behind the words.
The timing and connection between them are vital. Yet, **theories differ**. Breaking down the structure of a sentence, or ‘syntactic analysis’, may precede and influence the interpretation of its meaning (semantic analysis), follow it, occur alongside it, or be integrated within a single process. Either way, **the relationship between sentence structure and meaning is an important one**.
What’s known as the ‘**garden-path**’ model has been put forward to explain why sentences such as “The horse raced past the barn fell” are so hard to understand.
It could be that we initially consider only one syntactic structure, usually the simplest, and that we attach new words, where grammatically possible, to the phrase or clause currently being processed.
This model is in line with findings from some brain-damaged patients, which suggest that syntactic analysis can occur in the absence of semantic information.
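To make the garden-path effect concrete, here is a second minimal sketch (again Python with NLTK and a toy grammar invented purely for illustration). The fragment readers initially commit to, with ‘raced’ as the main verb, is itself a complete sentence; the full string only parses when ‘raced’ is treated as a passive participle in a reduced relative clause, ‘the horse [that was] raced past the barn’.

```python
import nltk  # assumed installed: pip install nltk

# A toy grammar invented for illustration: 'raced' can be a main verb (V)
# or a passive participle (Vpart) heading a reduced relative clause (RelCl).
toy_grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> Det N | NP RelCl
    RelCl -> Vpart PP
    VP -> V | V PP
    PP -> P NP
    Det -> 'the'
    N -> 'horse' | 'barn'
    Vpart -> 'raced'
    V -> 'raced' | 'fell'
    P -> 'past'
""")

parser = nltk.ChartParser(toy_grammar)

prefix = "the horse raced past the barn".split()      # the reading built first
full = "the horse raced past the barn fell".split()   # the whole sentence

# The prefix parses as a complete sentence with 'raced' as the main verb,
# exactly the analysis that leaves 'fell' with nowhere to attach.
print(len(list(parser.parse(prefix))), "parse(s) of the prefix")

# The full string parses only when 'raced' is a participle modifying 'horse'.
for tree in parser.parse(full):
    tree.pretty_print()
```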
Models of sentence processing
Several models have been proposed to explain our initial interpretation of a sentence. The **‘Constraint-based model’ combines multiple information sources, including general world knowledge, syntax, and semantics, to consider all possible interpretations of a sentence, with the most appropriate being selected**. After all, it seems logical and even efficient for listeners and readers to use all available information.
On the other hand, the **‘Unrestricted race model’ combines elements of several approaches, using multiple constraints to settle on an initially favored syntactic structure**. That interpretation is then ‘held’, or maintained, until evidence arrives that challenges it.
And yet, most sentence processing theories have similar limitations. They assume that language processing results in an **accurate, complete, and detailed representation** based on the input received, and that, sooner or later, it will find the correct syntactic structure.
In contrast, the **‘good-enough’ representation approach suggests that parsing often generates a representation that is merely adequate for the task at hand**; this may help explain both the limited processing and the error-prone nature of this vital cognitive process.
Pragmatics – what we really want to say
If your partner shouts, “There’s someone at the door,” they probably aren’t simply acknowledging an individual is waiting outside and ringing the doorbell. More likely, the intended meaning is “Someone, answer the door and see who it is.”
**‘Pragmatics’ refers to how language is practically used and comprehended in the real world** – relating to the intended rather than literal meaning. To fully understand a speaker’s intended meaning, we typically need contextual information, such as the environment we are in, the tone, and the conversation so far.
**Without pragmatics, figurative language, such as metaphors, would be impossible to comprehend.** If you get ‘cold feet,’ you most likely don’t need warmer socks, but you may need encouragement and support before embarking on something new.
Such language processing is complex and often requires taking the speaker’s perspective to fully comprehend what they have said or written.
Autism and understanding other people’s intentions
Studying people who find it difficult to distinguish between literal and pragmatic meaning can clarify its importance to, and involvement in, language comprehension.
While often very good at understanding the syntax and semantics of a sentence, such individuals may become lost when it comes to its intended meaning.
Individuals with **autism** frequently find it challenging to understand others’ beliefs and intentions and may not be able to integrate information from multiple sources – negatively impacting communication.
Most, but not all, such individuals also have general learning difficulties, and their language may develop more slowly than that of others of the same age.
Interestingly, those with the relatively mild autistic spectrum disorder **‘Asperger’s syndrome,’ despite having at least average intelligence and typically developing language normally, may still face challenges with social communication**.
Drawing inferences can be particularly tough, as can recognizing sarcasm and irony, further highlighting the important role pragmatics plays within social situations.
Reaching common ground
Most researchers agree that the majority of speakers and listeners conform to the ‘**cooperative principle**.’ The listener expects the speaker to refer to shared knowledge and beliefs, also known as ‘common ground,’ rather than introducing new discussion points without defining them.
**In a successful conversation, the speaker shares their points and the listener identifies and recognizes them, leading to a shift in common ground**; the goal is not for it to remain static.
However, where references are not apparent in the common ground, the listener often adopts an ‘**egocentric heuristic**,’ whereby they consider what is being said from their own perspective. While some researchers suggest this heuristic is infrequently used, it does help explain some of our communication failures.
Establishing common ground is effortful and attentionally demanding but can be helped by forming simple associations in long-term memory or relying on shared cultural backgrounds – where appropriate.
Working memory affects comprehension
**Working memory capacity can affect performance in language comprehension**. Indeed, low-capacity individuals may be less able to form mental models of what is being discussed and to draw inferences beyond the literal language used.
On the other hand, **those with high-capacity working memory are frequently better able to distinguish what is relevant and important from what is irrelevant and superfluous**.
This may, in part, be a result of improved abilities in the use of schemas – organized packets of information that assist top-down processing.
However, as working memory capacity is typically associated with higher IQ scores, clarifying and disentangling all the dimensions involved in individual differences is tricky. Still, it is vital to recognize that we do not all comprehend language equally, and that this may be due to factors other than language processing itself.