09 October 2023

"Electronic tongue" has been taught to recognize the taste of foods

Researchers are developing emotional intelligence for AI: the ability to perceive and respond to sensory information the way humans do.

Researchers at Pennsylvania State University have developed a simplified model that mimics how taste influences what we eat, depending on our needs and desires. The system combines "electronic tongue" sensors with a model of the gustatory cortex, the taste-processing area of the brain.

The taste receptors on the human tongue convert chemical information into electrical impulses. These signals travel along neurons to the gustatory cortex, where a complex network of neurons forms the perception of taste. The researchers built a simplified biomimetic version of this pathway from two-dimensional materials just one to several atoms thick.

The artificial taste buds consist of tiny graphene-based electronic sensors called chemitransistors, which detect gases or chemical molecules. The other part of the circuit uses memtransistors, transistors that remember past signals, made of molybdenum disulfide. The complementary properties of the two 2D materials together form an artificial taste system. For example, by detecting sodium ions, the system can "sense" a salty flavor, the study's authors say.
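As a rough illustration only (not the authors' actual circuit model), the interplay of a sensing element and a memory element can be sketched in a few lines of Python: a hypothetical "chemisensor" converts an ion concentration into a signal, and a leaky integrator stands in for the memtransistor's ability to retain past signals.

```python
# Toy sketch of the sensor-plus-memory idea (illustrative, not the study's model):
# a "chemisensor" turns an ion concentration into a signal, and a
# memtransistor-like element keeps a decaying memory of past signals.

class ToyMemtransistor:
    """Leaky integrator: remembers a weighted history of inputs."""
    def __init__(self, decay=0.8):
        self.decay = decay      # how strongly past signals persist
        self.state = 0.0

    def step(self, signal):
        # New state = decayed old state + current input signal
        self.state = self.decay * self.state + signal
        return self.state

def chemisensor(sodium_ppm, gain=0.01):
    """Toy chemisensor: output proportional to sodium-ion concentration."""
    return gain * sodium_ppm

mem = ToyMemtransistor()
readings = [100, 120, 150, 80]          # hypothetical sodium levels (ppm)
responses = [mem.step(chemisensor(r)) for r in readings]

# Repeated salty stimuli accumulate in the memory element, which is the
# kind of history-dependent behavior memtransistors are said to provide.
print(responses[-1] > chemisensor(readings[-1]))
```

The decay constant and gain here are arbitrary; the point is only that the memory element's response depends on the whole stimulus history, not just the latest reading.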

The process is universal, so it can be applied to all five major taste profiles: sweet, salty, sour, bitter and umami. According to the developers, such a robotic taste system has promising potential applications, from emotional-intelligence-based diets for weight loss to personalized food suggestions in restaurants. The research team's next goal is to expand the flavor range of the electronic tongue.
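Conceptually, extending the system across all five tastes amounts to mapping different chemical cues to different taste labels. A minimal sketch, with simplified stand-in cue names chosen here for illustration:

```python
# Toy mapping from detected chemical cues to the five basic tastes
# (cue names are simplified stand-ins, not the sensors' actual targets).
TASTE_CUES = {
    "sodium_ion": "salty",
    "glucose": "sweet",
    "hydrogen_ion": "sour",
    "quinine": "bitter",
    "glutamate": "umami",
}

def classify(detected_cues):
    """Return the tastes suggested by a set of detected chemical cues."""
    return sorted({TASTE_CUES[c] for c in detected_cues if c in TASTE_CUES})

print(classify(["sodium_ion", "glutamate"]))  # ['salty', 'umami']
```

A real system would of course weigh graded sensor responses rather than look up discrete cues, but the one-cue-per-taste mapping shows why the same sensing-plus-memory architecture generalizes across taste profiles.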
"We are trying to create arrays of graphene devices to mimic the roughly 10,000 taste buds on our tongue, each slightly different from the others, allowing us to distinguish subtle differences in flavors," said Saptarshi Das, co-author of the study.

The researchers believe that this concept of gustatory emotional intelligence in an AI system will translate to other senses such as visual, auditory, tactile and olfactory emotional intelligence to help in the development of a true robotic system that works like a human brain.