When electrical engineering professor Roman Kuc is not in his laboratory in Becton Center, he may be at the Grove Street Cemetery, studying bats in their natural setting.

Kuc, who is associate dean of educational affairs at the Yale School of Engineering & Applied Science, models bat and dolphin ears to build sensors. These sensors, he said, can help wheelchairs weave between objects and help cars detect objects in their blind spots.

“Nature is wonderful,” Kuc said about his research. “I think it has a sense of humor.”

Bats use sonar to navigate their way through dark caves and to catch prey, said Kuc, who has worked with robots for 25 years. His sensors, he added, mimic bat sonar. After building models and putting them on robots, Kuc does simulations using mathematical software.

While developing his robot, or “Robat,” Kuc noticed that bat ears do not point straight ahead but angle off to the sides. As a result, Kuc’s model has angled pairs of sensors, called vergence sensors, which resolve acoustic detail better than sensors that point straight ahead. These sensors can detect even the slightest movement, he added.

“The basis of all information is energy,” he said, “so I look for the energy of the signal.”

Kuc has also worked with dolphin robot models and with ultrasound, which is used in medical imaging. Colleague and biomedical engineering professor James Duncan said ultrasound signals often have a lot of interference noise, making them difficult to interpret. The challenge is to parse out the relevant information, he said.

For his dolphin robot, or “Rodolph,” Kuc analyzes the data to represent sound energy using 16 signals.

Kuc said he may expand his research into human ears, which are similar to the ears of some bats in that they have ridges. While sound carries a great deal of information, he said, humans rely on vision as their primary sense.

But blind people are capable of interpreting sound information, he added. Kuc said a blind person can tell from tapping a cane whether a floor is carpeted or whether a person is approaching. Sometimes people who are blind can even judge how far away a wall is by how long the taps take to echo back.
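The echo-timing idea Kuc describes comes down to a simple calculation: sound travels to a surface and back, so the distance is the speed of sound times half the round-trip delay. A minimal sketch of that principle (a hypothetical illustration, not code from Kuc's lab):

```python
# Hypothetical sketch of echo-based ranging: estimating distance
# from how long a tap's echo takes to return.
SPEED_OF_SOUND = 343.0  # meters per second in air at roughly 20 °C

def distance_from_echo(round_trip_seconds: float) -> float:
    """Sound travels to the wall and back, so halve the round trip."""
    return SPEED_OF_SOUND * round_trip_seconds / 2

# A tap whose echo returns after 20 milliseconds implies a wall
# about 3.4 meters away.
print(distance_from_echo(0.020))
```

Sonar systems, whether a bat's or a robot's, apply the same relation at ultrasonic frequencies, where shorter wavelengths allow finer spatial detail.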

For future research, Kuc said he plans to add ridges to the sensors to better mimic bat and human ears, which should allow for even more precise sensors.

The next step for sensor research, said electrical engineering professor and colleague Lawrence Steib, is to make sensors effective in real-time applications.

“The challenge is to reinvent the solution in actual hardware and software,” he said.