Human language follows a specific set of rules. For instance, a plural noun should be paired with a plural verb, as in the following sentence: “The children from the neighborhood are studying.” As a result, dependencies arise in which information at the beginning of a sentence remains relevant at the end of that same sentence. In German, for example, subordinate clauses can be inserted before the sentence ends. Such constructions are known as nested structures or embeddings.
The ability to recognize such embeddings is believed to be exclusive to humans. Even though some animals can learn individual words, they seem incapable of grasping the combinatorial possibilities and complexity of human language. The study at the Department of Neuropsychology led by Dr. Claudia Männel now asks: if this capability is part of what makes us human, when does it begin? At the Children’s Language Laboratory at MPI CBS, the researchers tested 38 babies to determine whether and how infants can process embeddings. Remarkably, babies as young as five months old were able to recognize embeddings in novel tone sequences created for the study.
In a related vein, it is difficult to tell from a baby’s cry whether the baby is hungry or in pain. Now, the skill of distinguishing between the two is being combined with continuous facial expression recognition software, with the goal of providing a novel method to help healthcare providers more accurately estimate whether a baby simply needs a diaper change or is experiencing pain.
A research team at the University of South Florida is working to evaluate discomfort levels in newborns. The team consists of engineers specializing in facial recognition software and neonatal nurses who have developed a keen sense for whether a baby is crying simply because it is hungry or because it is in serious pain.