
Robotics and RF: From X-Ray Vision with WiFi to Communication-Aware Robotics

Abstract: RF signals are everywhere these days. As we go about our daily lives, we constantly leave our signature on these signals by blocking them. This naturally raises the question of how much information these signals carry about us or, more generally, about their environment. For instance, imagine two unmanned vehicles arriving behind thick concrete walls. They have no prior knowledge of the area behind these walls, yet they are able to see every square inch of the invisible area through the walls, fully imaging what is on the other side with high accuracy. Can the robots achieve this with only WiFi signals and no other sensors? As another example, consider the WiFi network of a building. Can it estimate the occupancy level of the building and the spatial concentration of the people with good accuracy?

In the first part of the talk, I will discuss our latest theoretical and experimental results toward achieving these goals. More specifically, I will show that it is possible to achieve X-ray vision with only WiFi signals and to image details through thick concrete walls. Furthermore, I will discuss occupancy estimation, showing how to extract the level of occupancy from WiFi measurements. With the vision of unmanned vehicles becoming part of our everyday society soon, the talk will also show how WiFi signals can give X-ray vision to robots.

In the second part of the talk, I will focus on communication-aware robotics. I will start by developing a foundational understanding of the spatial predictability of wireless channels. This allows each robot to go beyond the over-simplified but commonly-used disk model for connectivity and realistically assess the impact of a motion decision on its link. Building on this framework, I will then show how each unmanned vehicle can best co-optimize its communication, sensing, and navigation objectives under resource constraints. As I will discuss in the talk, this co-optimized approach results in a significant performance improvement.

Bio: Yasamin Mostofi received the B.S. degree in electrical engineering from Sharif University of Technology, Tehran, Iran, in 1997, and the M.S. and Ph.D. degrees from Stanford University, Stanford, California, in 1999 and 2004, respectively. She is currently an associate professor in the Department of Electrical and Computer Engineering at the University of California Santa Barbara.

Yasamin is the recipient of the Presidential Early Career Award for Scientists and Engineers (PECASE), the National Science Foundation (NSF) CAREER award, the IEEE 2012 Outstanding Engineer Award of Region 6 (more than 10 Western U.S. states), and the 1999 Bellcore fellow-advisor award from Stanford Center for Telecommunications, among other awards. Her research is on mobile sensor networks. Current research thrusts include RF sensing, see-through imaging with WiFi, X-ray vision for robots, communication-aware robotics, and robotic networks. Her research has appeared in several news outlets such as BBC and Engadget.

Host: Bhaskar Krishnamachari

robotics_and_rf/from_x-ray_vision_with_wifi_to_communication-aware_robotics.txt · Last modified: 2016/09/01 19:15 (external edit)