System Bits: May 31

Over-trusting robots; force-feeling phone; the power of wearables.


In automaton we trust?
It is widely believed that there are two kinds of robots: the friendly and helpful, and the sinister and deadly. But do humans place too much trust in robots? According to the work of Harvard University senior Serena Booth, a computer science concentrator at the John A. Paulson School of Engineering and Applied Sciences, the answer is as complex and multifaceted as robots themselves. For her senior thesis project, she examined the concept of over-trusting robotic systems by conducting a human-robot interaction study on the Harvard campus. Booth, who was advised by Radhika Nagpal, Fred Kavli Professor of Computer Science, received the Hoopes Prize, a prestigious annual award presented to Harvard College undergraduates for outstanding scholarly research.

During her month-long study, Booth placed a wheeled robot outside several Harvard residence houses. She controlled the machine remotely, and watched its interactions unfold through a camera as the robot approached individuals and groups of students and asked to be let into the keycard-access dorm buildings.

When the robot approached lone individuals, they helped it enter the building in 19 percent of trials. When Booth placed the robot inside the building, and it approached individuals asking to be let outside, they complied with its request 40 percent of the time. Her results indicate that people may feel safety in numbers when interacting with robots, since the machine gained access to the building in 71 percent of cases when it approached groups.

Serena Booth and her robot, Gaia, in its cookie-delivery disguise. (Source: Harvard University)

People were slightly more likely to let the robot out of the building than into it, but the difference was not statistically significant, Booth noted. She found that surprising, because she had expected people to perceive a robot trying to get inside as a security threat. In fact, only one of the 108 study participants stopped to ask the robot whether it had card access to the building.
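As a rough illustration of the kind of significance check Booth describes, a two-proportion z-test can compare two compliance rates. The trial counts below are hypothetical, chosen only to roughly match the reported 19 percent (letting the robot in) and 40 percent (letting it out) rates; the study’s actual counts and statistical test are not given in this article.

```python
import math

def two_prop_z(hits_a, n_a, hits_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    pooled = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided tail probability under the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts: 4 of 21 trials let the robot in (~19%),
# 8 of 20 let it out (40%)
z, p = two_prop_z(4, 21, 8, 20)
```

With samples this small, the difference between 19 and 40 percent does not reach the conventional 0.05 significance threshold, which is consistent with Booth’s remark.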

Interestingly, the human-robot interactions took on a decidedly friendlier character when Booth disguised the robot as a cookie-delivering agent of a fictional startup, “RobotGrub.” When approached by the cookie-delivery robot, individuals let it into the building 76 percent of the time.

While Booth’s robot was harmless, she is troubled that only one person stopped to consider whether the machine was authorized to enter the dormitory. If the robot had been dangerous—a robotic bomb, for example—the effects of helping it enter the building could have been disastrous, she said. A self-described robot enthusiast, Booth is excited about the many different ways robots could potentially benefit society, but she cautions that people must be careful not to put blind faith in the motivations and abilities of the machines.

Letting mobile devices sense pressure
What if you could dial 911 by squeezing your smartphone in a certain pattern in your palm, while a different squeeze turned on the music or flipped a page on the screen? Software developed by University of Michigan engineers, inspired in part by a Batman movie, could give any smartphone the capacity to sense force or pressure on its screen or body. ForcePhone offers new ways for people to command their mobile devices.

The researchers said the software could also enable users to push a bit harder on a screen button to unlock a menu of additional options, similar to right-clicking with a mouse. The developers envision these and many other uses for their technology, which could offer the masses a coveted feature of the latest generation of smartphones.

The iPhone 6s has a force-sensing screen but the less expensive iPhone SE does not. No commercially available device has a pressure-sensitive body.

What is interesting, the team said, is that this functionality can be realized on any phone. Kang Shin, the Kevin and Nancy O’Connor Professor of Computer Science in the U-M Department of Electrical Engineering and Computer Science, said the researchers augmented the user interface without requiring any special built-in sensors, and that ForcePhone expands the vocabulary between the phone and the user. Shin created the system with Yu-Chih Tung, a doctoral student in the same department.

ForcePhone, new software developed by University of Michigan engineering researchers, enables the body of any smartphone to sense force. It could let you command your phone by squeezing it. (Source: University of Michigan)

ForcePhone works by borrowing two of a phone’s fundamental components: its microphone and speaker. The software sets the speaker to emit a tone at a frequency above 18 kHz, outside the range of human hearing, but the phone’s mic can still pick up the vibration caused by the sound.

They explained that when a user presses on the screen or squeezes the phone’s body, that force changes the tone. The phone’s mic can detect that, and the software translates any tone tweaks into commands.
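The researchers’ actual signal processing is not described in this article, so the following is only a plausible sketch of the idea: a single-frequency detector such as the Goertzel algorithm can track the strength of an inaudible tone, and a drop in that strength when the case is squeezed can be treated as a press. The amplitudes, window length, and threshold below are invented for illustration.

```python
import math

def goertzel_power(samples, freq, fs):
    """Power of a single frequency component (Goertzel algorithm)."""
    n = len(samples)
    k = round(n * freq / fs)                      # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

FS, TONE, N = 44100, 18000.0, 4410                # 0.1 s analysis window

def tone(amplitude):
    """Synthetic 18 kHz tone as picked up by the microphone."""
    return [amplitude * math.sin(2 * math.pi * TONE * i / FS) for i in range(N)]

free = goertzel_power(tone(1.0), TONE, FS)        # tone with no pressure
pressed = goertzel_power(tone(0.4), TONE, FS)     # tone damped by a squeeze
is_pressed = pressed < 0.5 * free                 # simple drop threshold
```

Because Goertzel evaluates only one frequency bin, it is far cheaper than a full FFT, which matters for software that must run continuously on a phone.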

The idea of harnessing the phone’s microphone and speaker for other purposes is an approach Tung initially picked up from the 2008 Batman movie, “The Dark Knight.” In the film, Batman turns all the smartphones in Gotham City into a sonar system as high-frequency audio signals bounce off the city’s infrastructure, and uses them to track the Joker. Tung thought turning smartphones into a sonar-based system was an interesting idea, and felt it could lead to new applications that address challenges faced by smartphone users.

Unleashing the power of wearables
In collaboration with ARM, the Cambridge University Engineering Society held its first-ever Hackathon, with wearables as its theme. Participants had the opportunity to use ARM microcontrollers, sensors and other products in the mbed system to design their creations, and the event drew more than 100 students, not only from engineering but from many disciplines across the University.

Teams were instructed to develop an electronic wearable product within 24 hours that uses the ARM mbed platform.

First prize went to Tomas Cerskus, Quang Ha and Josiah Yan for their Rowbot Personalised Rowing Device, a miniature wireless device that attaches near the rower’s wrist and streams real-time acceleration data to a mobile phone. The device can analyze the data itself and give personalized advice, or a coach can use it to track the performance of the whole team.
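One simple way to turn a wrist acceleration stream into a rowing metric is to count oscillations in the signal. This is not the Rowbot team’s actual algorithm; the sampling rate, synthetic signal and zero-crossing method below are illustrative assumptions.

```python
import math

def stroke_rate_spm(acc, fs):
    """Estimate strokes per minute by counting upward zero crossings
    of the mean-removed acceleration signal."""
    mean = sum(acc) / len(acc)
    x = [a - mean for a in acc]
    crossings = sum(1 for i in range(1, len(x)) if x[i - 1] < 0 <= x[i])
    # One upward crossing per stroke cycle; scale to a per-minute rate
    return crossings * 60.0 * fs / len(acc)

# Synthetic wrist data: 0.5 Hz stroke cycle (30 strokes/min), sampled at 50 Hz
fs = 50
acc = [math.sin(2 * math.pi * 0.5 * i / fs - 0.1) for i in range(10 * fs)]
rate = stroke_rate_spm(acc, fs)
```

Real accelerometer data would need filtering before a crossing count is reliable, but the structure — a cheap on-device estimate streamed to a phone for richer analysis — matches the split the team describes.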

(Source: ARM)

The second-place team built what they called Internet of Decks – lights attached to clothing that beat in time to music. The team embedded LED strips in a dress; the strips flash in a range of patterns based on signals from a beacon and an app that users download to their phones.
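A minimal sketch of beat-synchronized flashing, assuming the app simply derives a frame schedule from a tempo and a pattern name broadcast by the beacon — the pattern names, bitmask encoding and function names below are hypothetical:

```python
def beat_schedule(bpm, duration_s):
    """Times (in seconds) at which the LED pattern should advance one frame."""
    interval = 60.0 / bpm
    times = []
    t = 0.0
    while t <= duration_s + 1e-9:
        times.append(round(t, 3))
        t += interval
    return times

# Each pattern is a cycle of 4-bit frames, one bit per LED strip
PATTERNS = {
    "chase": [0b0001, 0b0010, 0b0100, 0b1000],
    "pulse": [0b1111, 0b0000],
}

def frame_at(beat_index, pattern_name):
    """LED bitmask to display on the given beat."""
    frames = PATTERNS[pattern_name]
    return frames[beat_index % len(frames)]
```

At 120 bpm the schedule advances a frame every half second, so a “pulse” pattern blinks all strips on every other beat.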

Other creations included:
▪ Smart Bin, a waste bin with flex sensors to measure weight and light sensors that detect when the bin is full
▪ Childcare, a set of bracelets for children and their parents that buzz if a child gets too far, alerting the parents
▪ Sygnal, a bracelet that provides information (such as the time) via vibration without you having to check your watch or retrieve your mobile
▪ Panic Button, a small button you press when you’re feeling unsafe, alerting nearby Facebook friends to the danger
▪ Slipper Alarm Clock, an alarm clock embedded within a pair of slippers that only shuts off when the user puts the slippers on.

Students interested in similar design challenges are encouraged to explore the upcoming Open Technology Week taking place in the Cambridge University Department of Engineering’s James Dyson Building this June 9 to 15.