N883A6_ch2_main_20240412182501_20240412183001.mp4

This video was recorded during a research expedition to the Smithsonian Tropical Research Institute (STRI) in Panama, in collaboration with Dr. Inga Geipel. The footage captures our experimental setup designed to monitor the hunting behaviour of Micronycteris microtis bats. We created an environment with artificial leaves where prey items could be placed, and each leaf was equipped with an ultrasonic microphone to record the bats' echolocation calls. This clip is one of many currently being processed and analysed to gain a deeper understanding of the sophisticated hunting strategies employed by these bats, particularly how they utilise echolocation to detect prey on vegetation surfaces. The research aims to provide insights into the sensory ecology and foraging behaviour of neotropical gleaning bats.

Bats.mp4

The video above is an outreach video we created about some of our work on modelling M. microtis. It features a cartoon version of one of our early robots.

<aside> 🎵

Mind Games by @sublunarycanon910 | Suno

Link to paper: https://doi-org.uc.idm.oclc.org/10.1016/j.econlet.2024.111979

</aside>

The link above leads to an AI-generated song based on our paper in Economics Letters, which explores how people play the Ultimatum Bargaining Game against AI agents. Every paper should be turned into a musical hit.

output.mp4

The video above shows early footage of a robot built from the ground up in our lab. The robot features several sonar systems (both broadband and narrowband), a camera, IR obstacle detectors, and an IMU. It has independent power sources for the electronics and the drive train, and carries a tracker that records its position and orientation with high fidelity. This robot was built to model sonar-based navigation in bats.

P30dg_leaf3_ANMR0037.mp4

This video demonstrates a robotic model that mimics the foraging behaviour of the Micronycteris microtis bat. The robot is equipped with a wireless sonar head and scans artificial leaves for simulated insect prey. All leaves and prey are scaled to match the wavelengths used by the robot’s sonar head.
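For a rough sense of what this scaling involves: the sonar wavelength is λ = c / f, so the scale factor follows from the ratio of the robot's and the bat's call frequencies. The sketch below uses assumed, illustrative frequencies, not the actual parameters of the robot or the bat.

```python
# Rough sketch of how a geometric scale factor could be derived from the
# ratio of sonar wavelengths (lambda = c / f). The frequencies below are
# illustrative assumptions, not the values used in the experiments.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def wavelength(frequency_hz: float) -> float:
    """Wavelength of a sound wave in air at the given frequency."""
    return SPEED_OF_SOUND / frequency_hz

bat_call_freq = 100_000.0    # Hz, assumed dominant frequency of the bat's call
robot_sonar_freq = 40_000.0  # Hz, assumed centre frequency of the robot's sonar

# Leaves and prey would be enlarged by this factor so that they span the same
# number of wavelengths for the robot as real foliage does for the bat.
scale_factor = wavelength(robot_sonar_freq) / wavelength(bat_call_freq)
print(f"lambda_bat   = {wavelength(bat_call_freq) * 1000:.1f} mm")
print(f"lambda_robot = {wavelength(robot_sonar_freq) * 1000:.1f} mm")
print(f"scale factor = {scale_factor:.1f}x")
```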

The robot employs a simple decision-making mechanism to determine which echoes to pursue and when to abandon its approach to an echo source. Once it "believes" it has located prey, it stops, and the experimenter removes the artificial prey (simulating a successful capture). The experiment then continues with the robot resuming its search pattern.
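For illustration only, a minimal sketch of what such a pursue/abandon rule could look like is given below; the thresholds, the give-up count, and the function names are hypothetical and do not describe the rule actually running on the robot.

```python
# Hypothetical sketch of a threshold-based pursue/abandon rule, loosely
# following the description above. All thresholds and counts are invented.

DETECTION_THRESHOLD = 0.2   # minimum echo amplitude worth pursuing (arbitrary units)
CAPTURE_THRESHOLD = 0.8     # amplitude at which the robot declares "prey located"
MAX_STEPS_WITHOUT_GAIN = 5  # abandon the target if the echo stops getting stronger

def pursue_echo(get_echo_amplitude, approach_step):
    """Approach an echo source until it is 'captured' or abandoned.

    get_echo_amplitude: callable returning the current peak echo amplitude.
    approach_step: callable moving the robot one step towards the echo source.
    Returns True if the robot stopped at (simulated) prey, False if it gave up.
    """
    best = get_echo_amplitude()
    if best < DETECTION_THRESHOLD:
        return False                      # nothing worth approaching
    steps_without_gain = 0
    while True:
        approach_step()
        amplitude = get_echo_amplitude()
        if amplitude >= CAPTURE_THRESHOLD:
            return True                   # stop; experimenter removes the prey
        if amplitude > best:
            best = amplitude
            steps_without_gain = 0
        else:
            steps_without_gain += 1
        if steps_without_gain >= MAX_STEPS_WITHOUT_GAIN:
            return False                  # abandon and resume the search pattern
```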

In this particular video, we can observe the robot successfully locating prey on the centre leaf. This experimental setup allows us to test hypotheses about the fundamental principles guiding bat foraging behaviour without the complexities of working with live animals.

test.mov

The video above is a replay of a robotic experiment. Top panel: a top-down view of the setup. The outlines of the leaves are shown in gray, and black crosses mark the centre positions of the leaves, where dragonflies are placed. The red cross indicates the current position of the dragonfly, and the blue marker represents the position and orientation of the sonar head. Past positions of the dragonfly are shown in gray. Bottom panel: the echoes received at the left and right ears, along with the threshold level and integration window.
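The caption does not specify the exact detection rule, but as an illustration, the sketch below shows one way a threshold and an integration window could be combined per ear; the sample rate, window bounds, and threshold value are all assumed.

```python
import numpy as np

# Hypothetical illustration of combining an amplitude threshold with an
# integration window to decide whether an echo counts as a detection.
# Sample rate, window bounds, and threshold are invented for the example.

SAMPLE_RATE = 400_000     # Hz, assumed sampling rate of the sonar head
WINDOW_START_S = 0.5e-3   # integration window opens 0.5 ms after emission
WINDOW_END_S = 4.0e-3     # ...and closes 4 ms after emission
THRESHOLD = 0.1           # amplitude threshold (arbitrary units)

def detect(echo_left: np.ndarray, echo_right: np.ndarray) -> dict:
    """Return per-ear detection flags and in-window peak amplitudes."""
    start = int(WINDOW_START_S * SAMPLE_RATE)
    end = int(WINDOW_END_S * SAMPLE_RATE)
    result = {}
    for ear, echo in (("left", echo_left), ("right", echo_right)):
        window = np.abs(echo[start:end])
        peak = float(window.max()) if window.size else 0.0
        result[ear] = {"detected": peak >= THRESHOLD, "peak": peak}
    return result
```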

2022-07-13 18.41.12.jpg

2022-07-15 18.52.53.jpg

2022-07-12 13.32.13.jpg

The images above are from a research trip to New Mexico with Dr. Laura Kloepper, where we recorded the emissions of Mexican free-tailed bats inside their roost and as they emerged from the caves. The grey boxes contain in-house-built amplifiers for the ultrasonic microphones embedded in the 3D-printed bat ears at the ends of the poles.

media3.mov

media4.mov

The videos above demonstrate some of Thinh Nguyen's work on using reinforcement learning to train a simulated nectarivorous bat to approach a flower and find its opening. The pink arrow shows the position and orientation of the flower. To be successful, the bat has to dock with the flower by approaching it from within its opening angle, indicated by the pink sector. The orange arrow gives the bat's current estimate of the flower's position and orientation, derived from the echo envelopes at the left and right ears (depicted in the top right). The echoes are simulated but based on real echoes collected from a 3D model of a real flower that was ensonified in the lab.
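The caption does not describe the state, action, or reward definitions used in training; purely as an illustration, the sketch below shows what a docking check based on a distance limit and an opening angle (the pink sector) could look like, with all numerical values assumed.

```python
import numpy as np

# Hypothetical docking check for the simulated nectar-feeding bat: the bat
# succeeds if it is close enough to the flower AND sits within the flower's
# opening angle. All names and numbers here are assumptions for illustration.

DOCKING_RADIUS = 0.05                   # m, assumed maximum distance for a dock
OPENING_HALF_ANGLE = np.radians(30.0)   # assumed half-angle of the flower opening

def is_docked(bat_pos, flower_pos, flower_dir) -> bool:
    """True if the bat is inside the docking sector in front of the flower.

    bat_pos, flower_pos: 2D positions (array-like of length 2).
    flower_dir: unit vector pointing out of the flower opening.
    """
    flower_dir = np.asarray(flower_dir, dtype=float)
    offset = np.asarray(bat_pos, dtype=float) - np.asarray(flower_pos, dtype=float)
    distance = np.linalg.norm(offset)
    if distance > DOCKING_RADIUS:
        return False
    if distance == 0.0:
        return True
    angle = np.arccos(np.clip(offset @ flower_dir / distance, -1.0, 1.0))
    return angle <= OPENING_HALF_ANGLE
```

In a training loop, a check like this would typically mark the end of a successful episode.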

Amazing bat echolocation.mp4

This footage shows M. microtis capturing prey from a leaf instrumented with multiple ultrasonic microphones, which we used to document its acoustic behaviour and approach strategies.

Biology Meets Engineering.mp4

This video features some of our work on the Biology Meets Engineering program. This NSF-funded program aims to introduce students to transdisciplinary thinking and discovery by exposing them to the mutual inspiration between robotics and animal sensing.