They’re all over the headlines: morphing from hard to soft, darting around like fish, hovering above photo shoots. Robots are big news at MIT, and the list of what they’re capable of keeps growing.

Shape-shifters

By coating foam with wax, researchers have taken the first steps toward creating a “squishy” robot capable of squeezing through tiny spaces and then expanding again—much like an octopus does.

Mechanical engineering professor Anette Hosoi and Nadia Cheng SM ’09, PhD ’13, along with researchers at the Max Planck Institute for Dynamics and Self-Organization and at Stony Brook University, developed the material as part of the Chemical Robots program of the Defense Advanced Research Projects Agency (DARPA).

Robots made of such material could be used during surgery, moving through the body without damaging organs. Or they could squeeze through rubble to locate survivors during search-and-rescue operations.


Go Fish

Soft robots that move like real fish, changing direction in a fraction of a second, aren’t just fun to watch—they’re also safe. “As robots penetrate the physical world and start interacting with people more and more, it’s much easier to make robots safe if their bodies are so wonderfully soft there’s no danger if they whack you,” explains Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory.

Andrew Marchese SM ’12, Rus, and Cagdas Onal of the Worcester Polytechnic Institute built the autonomous robot fish. Marchese says that in performance tests, the robot maneuvered nearly as well as real fish do when fleeing predators. The researchers hope to one day see their creation swimming among real schools of fish and gathering information about their behavior in their natural habitat.


A Photographer’s Best Friend

Forget the bulky lighting equipment and time-consuming setup: this little helicopter can provide the perfect lighting for a photo shoot.

Researchers from MIT and Cornell University say their prototype system can produce a difficult lighting effect called “rim lighting,” in which only the edge of the photographer’s subject is strongly lit. The photographer selects the direction from which the light should come, as well as the width of the rim. The helicopter flies to the indicated position and moves as the subject changes position.

Manohar Srikanth PhD ’13, who worked on the system, explains: “If somebody is facing you, the rim you would see is on the edge of the shoulder, but if the subject turns sideways, so that he’s looking 90 degrees away from you, then he’s exposing his chest to the light, which means that you’ll see a much thicker rim light. So in order to compensate for the change in the body, the light has to change its position quite dramatically.”

The system also responds to the movements of the photographer. The photographer’s camera produces approximately 20 photographs per second and transmits them to a computer running a control algorithm, which adjusts the helicopter’s position.
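One way to picture that feedback loop is as a simple proportional controller: each incoming frame yields a measured rim width, the computer compares it to the photographer’s target, and the helicopter’s position is nudged to shrink the error. The Python sketch below is purely illustrative—the function names, the toy rim-width model, and the gain are assumptions, not the researchers’ actual algorithm.

```python
def measured_rim_width(light_angle, subject_angle):
    """Toy model (assumption): rim width grows linearly as the light
    swings away from the subject's facing direction."""
    return 0.5 * (light_angle - subject_angle)

def control_step(light_angle, subject_angle, target_width, gain=0.5):
    """One control iteration: move the light in proportion to the
    error between the target rim width and the measured one."""
    error = target_width - measured_rim_width(light_angle, subject_angle)
    return light_angle + gain * error

# Simulate the loop at roughly frame rate while the subject slowly
# turns 45 degrees; the light repositions to hold the rim width.
light = 0.0
subject = 0.0
for frame in range(200):              # ~20 frames/sec in the real system
    subject = min(frame * 0.5, 45.0)  # subject gradually turns away
    light = control_step(light, subject, target_width=60.0)
```

Because the correction at each step is proportional to the remaining error, the light settles at whatever angular offset produces the requested rim width, and it re-converges automatically whenever the subject moves—mirroring the behavior Srikanth describes.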

The research team will demonstrate their prototype system at the International Symposium on Computational Aesthetics in Graphics, Visualization and Imaging in August.
