Artificial Neural Network Learns Language


Researchers from the University of Sassari and Plymouth University have built a cognitive model of networked artificial neurons that can learn language with no prior knowledge. The Artificial Neural Network with Adaptive Behaviour Exploited for Language Learning, or Annabell, relies on synaptic plasticity and neural gating to learn through conversation with a human. “The system is capable of learning to communicate through natural language starting from tabula rasa, without any prior knowledge of the structure of phrases, meaning of words [or] role of the different classes of words, and only by interacting with a human through a text-based interface,” researchers said. “It is also able to learn nouns, verbs, adjectives, pronouns and other word classes and to use them in expressive language.”
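To give a flavour of the two mechanisms named above, here is a minimal sketch in Python of Hebbian synaptic plasticity controlled by a gating signal. It is not the Annabell implementation: the network sizes, learning rate, activation function, and gating schedule are all illustrative assumptions, and the real model wires these mechanisms into a far larger architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 20, 10          # toy sizes; the real Annabell model has ~2 million neurons
LEARNING_RATE = 0.1           # arbitrary value for illustration

# Near-blank initial synapses, standing in for the "tabula rasa" start.
W = 0.01 * rng.standard_normal((N_OUT, N_IN))

def gated_hebbian_step(W, pre, gate, lr=LEARNING_RATE):
    """One update step: activity flows through the synapses, and a gating
    signal decides whether Hebbian plasticity is applied."""
    post = np.tanh(W @ pre)            # post-synaptic activity
    if gate:                           # neural gating: plasticity only when the gate is open
        W += lr * np.outer(post, pre)  # Hebbian rule: co-active neurons strengthen their link
    return post

# Stand-in for the text-interface interaction loop: random input patterns,
# with the gate open on alternating turns.
for turn in range(100):
    pattern = rng.random(N_IN)
    gated_hebbian_step(W, pattern, gate=(turn % 2 == 0))
```

The gate is the key idea: the same synapses either pass activity unchanged or get rewired, depending on a control signal, which is what lets a fixed network learn new behaviour from interaction alone.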

The Annabell model is made up of 2 million interconnected artificial neurons and was tested with a database of 1,500 input phrases, from which the model produced 500 output sentences with no pre-coded language knowledge. “The results show that, compared to previous cognitive neural models of language, the Annabell model is able to develop a broad range of functionalities, starting from a tabula rasa condition,” researchers wrote. “The current version of the system sets the scene for subsequent experiments on the fluidity of the brain and its robustness. It could lead to the extension of the model for handling the developmental stages in the grounding and acquisition of language.” The source code and database for the project are available online.


#tech #linguistics #language #neuroscience
