Apple is touting the $3,499 Vision Pro, which goes on sale Friday, as its next big thing after smartphones. When you wear it, you see the world around you with computer-generated images and information superimposed on top. You may be intrigued, or you may think the idea of a face computer is ridiculous. Either way, you may be interested to know that this device collects more data than any other personal device I’ve ever seen.
If this is what our future looks like, it raises a lot of questions. At launch, Apple took steps to limit some of the data Vision Pro collects, such as what people’s eyes are looking at. That’s very good. But there are also new kinds of risks that Apple doesn’t seem to be addressing, and questions about how the technology works that it won’t, or can’t, answer.
We can see the privacy chaos coming. Among the new dilemmas privacy researchers flagged to me: Who gets access to the data these devices collect as they map our homes and track how we move our bodies? Vision Pro can reveal much more than you might think.
The last time a gadget raised this kind of social alarm was Google Glass in 2013. It had a small screen and a single camera, and people feared it could be used to record them secretly. Glass became so stigmatized that wearers earned the nickname “glassholes.” Now we should probably prepare for the Vision Bros.
At this point, most of my concerns about the Vision Pro are speculative. But as Apple and others develop technologies to replace smartphones, the question of whether familiar online problems, such as location tracking, loss of anonymity, and data brokers compiling detailed dossiers on our lives, will get even worse matters to all of us.
“Should we, as a society, be embracing virtual and augmented reality in our lives before we have strong privacy laws?” said Cooper Quintin, senior public interest technologist at the Electronic Frontier Foundation. “Data brokers already have far too detailed knowledge of everything I do. I don’t want them to have this level of knowledge.”
Adding to my concerns is that Apple, which stakes its reputation on privacy, won’t answer most of my questions about how Vision Pro addresses these issues. So far, The Washington Post has also not been allowed to independently test the hardware.
But from Apple’s limited statements and from conversations with developers building apps for the Vision Pro, I can piece together a picture of Apple’s early privacy strategy, and of what’s being left unsaid.
Who gets access to your home and body?
Apple certainly doesn’t want to be known for creating the ultimate surveillance machine. But for an app to do its magic inside the goggles, it needs a ton of information about you and what’s going on around you. Apple has gone further than rivals like Meta to restrict access to some of this data, but developers will continue to demand more.
“There’s a tension between this kind of experience and privacy,” says Jarrett Webb, director of technology at design firm Argo, which is looking to develop for Vision Pro. “In order to evoke these experiences, we need to capture this data to understand the world.”
And once a developer has the data, it’s hard to guarantee it won’t also be used for purposes that feel like violations.
On some issues, Apple has drawn a line, at least initially. To prevent people from being secretly filmed with Vision Pro, an indicator appears on the device’s front screen whenever it’s taking a photo or video. Apple also doesn’t allow third-party Vision Pro apps to access the cameras to capture photos and videos. In theory, that also prevents third-party apps from doing creepy things such as running facial recognition algorithms on the people you look at.
But privacy researchers tell me that photos aren’t the biggest concern here. Since the days of Google Glass, we’ve grown accustomed to the idea that the smartphones around us could be filming at any moment.
The new question is what else the device is collecting: a map of the space around it. The device needs to know the contours of the world around you so it knows where in your line of sight to insert digital things.
Joseph Jerome, a visiting professor at the University of Tampa who formerly worked on sensor data policy at Meta’s Reality Labs, says mapping what’s in the room around you can end up being even more invasive than taking a photo.
Vision Pro apps can access this data if you give them permission, much as iPhone apps ask for your location. To humans, these world maps look like wireframe meshes, but they can tell computers a lot.
At a basic level, Vision Pro might detect that it’s in a room with four walls, a 12-foot ceiling, and windows. So far, so good, Jerome says. But a room with a 75-inch TV suggests its owner has more money to spend than someone with a 42-inch TV. And because the device can recognize objects, it could detect whether a room contains a crib, a wheelchair, or even drug paraphernalia, he said.
Advertisers and data brokers building consumer profiles would salivate at the chance to obtain this data. So would governments.
Think of this as an extension of the kind of problem we already know occurs with location tracking. A phone might reveal only that you’re near a hospital or a strip club, Jerome said. “These devices know where you are down to the centimeter and combine that with a number of other sensors to know exactly what you are looking at at the same time,” he said.
Apple didn’t respond to my questions about what visibility it has into how apps handle this data, or how it plans to vet them. “It’s your responsibility to protect the data your apps collect and to use it in a responsible and privacy-protecting manner,” Apple warns on its website for Vision Pro developers. So users are just supposed to trust them?
Other privacy researchers say an even bigger risk is that devices like the Vision Pro expose reams of data about the one thing we can’t change: our bodies.
Information about how you move and what you look at “can not only identify a person, but also provide important insights into their emotions, characteristics, behaviors and desires in ways that weren’t possible before,” says Jameson Spivak, senior policy analyst at the Future of Privacy Forum.
Apple has committed to privacy around one highly sensitive organ: the eyeball. Vision Pro tracks your eyes so you can select objects with your gaze, much like moving a mouse on your computer. But Apple says it doesn’t share where users are looking with apps, websites, or even itself. Instead, the device reports only what you select with your gaze after you tap your fingers together, the Vision Pro’s equivalent of a mouse click.
That’s definitely a good start. But what about the rest of the body? Developers say apps have access to a stream of data about your movements, right down to your fingers.
I was surprised when researchers at the University of California at Berkeley explained just how revealing data about body movements can be.
Last year, they found they could uniquely and consistently identify around 55,000 different VR users based solely on data about their head and hand movements. That makes such data as good as a fingerprint, if not better.
Another study used head and hand movements during a game to infer about 40 different personal attributes, from age and gender to drug use and disability status.
What would stop a Vision Pro app from doing the same thing? “If that motion data is being streamed to the cloud, even Apple has little visibility into what happens after it leaves the device,” said Vivek Nair, one of the researchers. “Our proposal is to develop privacy-preserving tools for VR motion data, because this data can’t simply be removed from most applications.”
I asked Apple what it does to protect this kind of data. The answer: crickets.
Mixed reality devices are “very exciting and have a lot of potential,” says Berkeley computer science professor James O’Brien. “But I also think privacy considerations need to be a key design criterion, not an afterthought.”