Malicious hackers could be spying on you — through your own eyes.
That’s what two Cal Poly computer science graduate students proved was possible — albeit not likely — with their spyware proof-of-concept for Google Glass.
Mike Lady and Kim Paterson created an application called Malnotes, which masquerades as note-taking software but actually takes a photo every 10 seconds without the wearer knowing.
There are two reasons the graduate students decided to design this malware, Paterson said: partly to see if they could, and partly because of the privacy concerns surrounding Google Glass.
Taking a picture with Google Glass is easy; the wearer simply winks or gives a voice command. However, it’s not as stealthy as people may think: whenever the camera is in use, the display is on, so someone nearby could see the light from the Glass.
At least, that is what’s supposed to happen. As Lady and Paterson found out, there is no enforcement for this rule.
According to Lady, Google has specific policies that tell developers what they can and cannot do. But those policies aren’t enforced by the underlying software Google gives developers for writing apps; developers can still write malicious code, he said. Theoretically, the review process should catch devious code and stop it from entering the Google Play store (where Android apps can be downloaded).
It turns out the store is “terrible at reviewing apps,” Lady said.
The two uploaded their Malnotes app to the Google Play store, and it was live within a couple of hours.
Google Play vs. MyGlass
There is a difference between Google Play — which is just for Android phones and tablets — and the app store for Google Glass. The MyGlass store is the primary way Google wants you to get apps for your Glass, Lady said. According to him, this store is a closed, very heavily reviewed market.
However, people can still put apps on their Glass without the MyGlass store through a process called “sideloading”: downloading an app from a third-party website and installing it yourself.
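As a rough illustration, sideloading onto Glass works the way it does on any Android device with debugging enabled: connect it to a computer and install the app package with the `adb` tool from the Android SDK. (The filename `malnotes.apk` below is hypothetical, used only for the example.)

```shell
# Sideloading an APK with the Android Debug Bridge (adb).
# Debug mode must first be enabled on the device itself.
adb devices                # verify the Glass shows up as a connected device
adb install malnotes.apk   # push and install the APK, bypassing MyGlass review
```

Because this path never touches the MyGlass store, none of Google’s review for Glass apps applies to it.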
“That’s where the danger lies, because these apps can be put up on a random website and look legit,” Lady said.
Paterson and Lady did not submit their malware to the MyGlass store.
Was this legal?
According to Lady and Paterson, yes, but “very borderline,” Paterson said.
The two felt comfortable submitting it to the Google Play store for a few reasons, Paterson said: they were obviously going to take it down; they didn’t advertise any real functionality for it; it was obviously bogus; and with the number of apps available, it was very unlikely anyone would download it.
Lady said one could call the project a “white-hat hack,” because it was simply for research purposes. They were not actively seeking to hack anyone; they just wanted to show Google what was possible and that the company should fix it, Lady said.
The app was only in the Google Play store for a couple of hours, the two said. They unpublished it themselves, and Google deleted it entirely once the company found out. No one downloaded the app.
“Also, the Play store is a wasteland of malware,” Lady said. “Google is actively seeking out retribution against people who do upload malware. So we were pretty safe if we just took it down as soon as we saw it.”
How Google found out
The duo started this project in a computer security graduate course. Paterson is a Google Glass owner, and Lady has experience developing for Android.
“We wanted to test the boundaries and do things that Google tells us not to do, and see if we could,” Lady said.
Computer science professor Zachary Peterson tweeted about the project, and that tweet got approximately 979 retweets and 325 favorites (he has just 238 followers). @googleglass responded, asking if they could “chat further over DM.”
To make the situation stickier, after Paterson graduates in spring, she’ll be starting a job at Google.
Paterson was worried about losing her job and about “people making a bigger deal about it than it is,” she said. However, she emailed her recruiter and went through the proper channels to make sure Google knew the project was honest research, and that they were trying to improve the platform, not exploit it.
According to Paterson, when someone doesn’t understand how the device actually works underneath, it’s hard to get a good picture of how big a security threat can be. And just because something is possible, that doesn’t necessarily mean it’s likely to happen.
“There are much cheaper ways to spy on people,” Paterson said.