Friday, December 30, 2016

Robots that can read your mind: a breakthrough for manufacturing


Those who wish others could read their minds will enjoy a breakthrough technology out of the lab of Thenkurussi (Kesh) Kesavadas. The professor of Industrial and Enterprise Systems Engineering at the University of Illinois and his team have used a brain-computer interface (BCI) to control a robot (watch the demonstration).
In its third year of funding, this National Science Foundation project has shown that a human expert can look at an object on an assembly line and, through sensors reading signals from their brain, tell a robot to remove a defective object from the conveyor belt.

PhD student Yao Li demonstrates the technology, which uses a brain-computer interface to send signals to a robot.
“The robot is actually monitoring your thinking process,” Kesavadas said. “If the robot realizes you saw something bad, it should go take care of it. That is the fundamental idea in manufacturing we are trying to explore.”
In the virtual reality lab, Kesavadas and PhD student Yao Li have devised a system that runs parts down a conveyor belt; a camera photographs each object and relays the pictures to a computer screen. The operator, wearing a helmet fitted with sensors, watches those pictures on the screen. When the operator spots a defective object, the brain generates a signal at a characteristic frequency. That signal is sent to the robot, which removes the object from the belt.
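To make that loop concrete, here is a minimal sketch of the camera-screen-helmet-robot cycle. Every class and method name below is hypothetical (the article does not describe the team's software), and the EEG-decoding step is reduced to a random stand-in:

```python
# Hypothetical sketch of the inspect-and-remove loop described above.
# The Camera/Screen/EEGHelmet/Robot classes are illustrative stand-ins,
# not the actual system; they simulate five parts moving down the belt.
import random

class Camera:
    def capture(self, part_id):          # photograph the part on the belt
        return f"image_of_part_{part_id}"

class Screen:
    def show(self, image):               # relay the picture to the operator
        print(f"Displaying {image}")

class EEGHelmet:
    def operator_flags_defect(self):     # stand-in for real SSVEP decoding
        return random.random() < 0.2     # roughly 1 in 5 parts looks bad

class Robot:
    def remove(self, part_id):           # pick the flagged part off the belt
        print(f"Robot removed part {part_id}")

camera, screen, helmet, robot = Camera(), Screen(), EEGHelmet(), Robot()
for part_id in range(5):                 # parts arriving on the conveyor
    screen.show(camera.capture(part_id))
    if helmet.operator_flags_defect():   # brain signal says "defective"
        robot.remove(part_id)
```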
The project, funded by the NSF's National Robotics Initiative, uses a technique called SSVEP (steady-state visually evoked potentials), which exploits brain signals that are natural responses to visual stimulation at specific frequencies. When the retina is excited by a visual stimulus flickering anywhere from 3.5 Hz to 75 Hz, the brain generates electrical activity at the same frequency as the stimulus, or at multiples of it. In effect, it creates a frequency in the brain that matches the frequency of the object the person is looking at.
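In software terms, SSVEP detection typically amounts to finding which stimulus frequency (or one of its harmonics) carries the most power in the EEG's spectrum. The sketch below illustrates that idea on synthesized data; the sampling rate, the 8 Hz and 12 Hz stimulus frequencies, and the "OK vs. defect" mapping are all assumptions for illustration, not details from the project:

```python
# Minimal SSVEP-detection sketch (illustrative only; not the Illinois
# team's code). A real system would read EEG from occipital electrodes;
# here we synthesize a noisy 12 Hz response to show the idea.
import numpy as np

FS = 250                      # assumed sampling rate (Hz)
STIMULUS_FREQS = [8.0, 12.0]  # e.g. 8 Hz = "part OK", 12 Hz = "defect"

def detect_ssvep(eeg: np.ndarray, fs: int = FS) -> float:
    """Return the stimulus frequency with the most spectral power,
    scoring both the fundamental and its second harmonic."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

    def power_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]

    scores = {f: power_at(f) + power_at(2 * f) for f in STIMULUS_FREQS}
    return max(scores, key=scores.get)

# Fake 2-second EEG window: a weak 12 Hz oscillation buried in noise.
t = np.arange(0, 2.0, 1.0 / FS)
eeg = 0.5 * np.sin(2 * np.pi * 12.0 * t) + np.random.randn(len(t))

print(detect_ssvep(eeg))  # expected: 12.0 -> operator saw a defect
```

A real deployment would replace the synthesized window with a live stream from the helmet's electrodes and calibrate per operator, since, as Kesavadas notes below, picking up the signal precisely is the hard part.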
“The signals from the brain are very similar for everybody and we know which part of the brain gives certain signals,” Kesavadas explained. “Implementing that in the real world is tougher in that through BCI, you have to pick up the signal precisely.”
Kesavadas indicates that in high-volume manufacturing, robots can be programmed to detect defects on their own, but that programming is often time-consuming and expensive.
“Currently programming robots takes a significant amount of time and expertise and technicians who are fully trained to use them,” he said. “In high volume manufacturing, the time for programming the robot is well spent. However, if you go into an unstructured environment, not just in manufacturing but even in agriculture or medicine, where the environment keeps changing, you don’t get nearly the return on your investment. Our goal is to take the knowledge and expertise of the operator and communicate that to a robot in certain situations. If we can prove that process is effective, it can save significant time and money.”
Kesavadas has long been at the forefront of bringing virtual reality to medicine and directs the Health Care Engineering Systems Center on the Illinois campus. So while this technology has immediate benefits for manufacturing, he believes it can have an even greater impact on the medical field. For example, a paraplegic could tell a robot to bring them a certain object simply by generating the right signal.
Kesavadas notes that while the technology exists, it requires a surgeon to place the sensor inside the brain.
“As we devise an external system to become much more consistent and reliable, it will benefit many people,” he said. “Surgically placing the sensors is a more expensive, invasive, and risky process.”
For now, Kesavadas is striving to ignite excitement in manufacturing to realize the technology’s potential. He presented his findings to the NSF in early December.
“Until now, there has been no research in using brain computer interfacing for manufacturing,” Kesavadas said. “Our goal at the onset was to prove these technologies can actually work and that the robots can be used in a more friendly way in manufacturing. We have done that. The next stage is to coordinate with industries that would need this kind of technology and do a demonstration in a real-life environment. We want industry to know the potential of this technology, to ignite their thinking about how they can use brain-computer interfaces to bring a more competitive edge to the industry.”
________________
For more information on this story or for other College of Engineering media inquiries, contact Mike Koon, Marketing & Communications Coordinator, 217/244-1256

https://engineeringatil.scienceblog.com/2016/12/22/robots-that-can-read-your-mind-a-breakthrough-for-manufacturing/