FaceApp — the mobile application that has blown up your Instagram feed with pictures of your followers as old people, the opposite gender or babies — has raised a lot of concerns about potential privacy violations for users who upload their photos to be edited. Rumors have circulated that the application might even be taking users’ photos from their phones and uploading them to the FaceApp cloud server without explicit permission.
We reached out to experts in security and data privacy from academia, government agencies, startups and more to comment on the issues surrounding users’ privacy, asking them their opinions about the concerns associated with traditional applications as opposed to blockchain-based decentralized applications (DApps).
FaceApp uses artificial intelligence and neural networks to edit users’ images. The feature that made the mobile app suddenly popular last month, two years after its 2017 release, is the filter that predicts how you will look as you age.
United States Senate Minority Leader Chuck Schumer asked the Federal Trade Commission and the FBI to conduct a privacy investigation into FaceApp, underlining that “it is not clear how the artificial intelligence application retains the data of users or how users may ensure the deletion of their data after usage.”
Justin Brookman, a former policy director for the Federal Trade Commission’s Office of Technology Research and Investigation, said, “I would be cautious about uploading sensitive data to this company that does not take privacy very seriously, but also reserves broad rights to do whatever they want with your pictures.”
Meanwhile, FaceApp denied selling or sharing user data with third parties without permission, adding: “We might store an uploaded photo in the cloud. The main reason for that is performance and traffic: we want to make sure that the user doesn’t upload the photo repeatedly for every edit operation. Most images are deleted from our servers within 48 hours from the upload date.”
FaceApp’s terms of service, however, grant the company sweeping rights over uploaded content:
“You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you.”
Oh, for sure DApps can be better for privacy and security — if they work, and they work for more than 50 people at a time!
Scaling vs. security is a classic dilemma. Privacy vs. security is the other one. My question would be: Why does the world need another app/DApp? Why aren’t you building infrastructure and interoperability toward intelligent decentralization, personal agency and transparency?
I guess DApps could in an ideal world — but honestly, I’m not seeing useful things work in a decentralized way as much as I’d like.
— Susan Oh, CEO of Muckr.AI and board member of Blockchain for Impact at the United Nations General Assembly
Native mobile applications leak a lot of data. Every app on your phone claims rights to your information while you’re using it, and some continue collecting data in the background even when you’re not, without your consent (a practice that is especially prevalent with third-party software development kits).
The entire app ecosystem is due for an overhaul. Decentralized applications are a move in the right direction; however, many will not be truly decentralized if one party controls the transactions or the data. The purpose of decentralization is to distribute the transactions and data so that no central party owns them. Therefore, in some cases, “decentralized application” will be a misnomer, as the app developer or publisher may maintain control.
Facebook’s Libra is a misnomer when it comes to decentralization. Its crypto payments will be centralized through Facebook and easily trackable. In many ways, this works against the ideology of cryptocurrencies, because every transaction a person makes will be tracked, with the person identified by the developer of the protocol and coin (in this case, Facebook). The risk is that other app developers will pursue a similar model, using blockchain to record every transaction while also verifying identity in various ways.
Facial data is permanent: you can change your Social Security number, your phone number and even your name, but you cannot change your face. Combine this with blockchain transactions and one can easily imagine a dystopian level of surveillance. The best blockchain apps will be truly decentralized and not linked to data like facial recognition, social media data, bank data (like the JPMorgan coin), etc.
— Beth Kindig, product evangelist for Intertrust, former developer evangelist for Personagraph, specialist in security and data privacy
Many privacy concerns arise from what companies choose to do with the data they collect. Storing data on its servers for a given duration is a choice made by apps like FaceApp. A blockchain application would therefore be better for people’s privacy only insofar as it is designed to be better, and “better” is a value-laden term.
Companies can exert a lot of control over how they design an application, through its architecture, default settings, what it communicates in its privacy policies, and what it does in practice. The value for a consumer concerned about her privacy would depend on the blockchain application and the kind of data collected and processed by it.
— Deirdre K. Mulligan, assistant professor at the University of California, Berkeley School of Information, clinical professor of law at Berkeley Law
With the existing, centralized way of doing things, someone merely needs to gain access to a server to then steal, alter or basically do whatever they want with the data stored there. You only need to look at the high-profile hacks of Capital One and Equifax to see that.
Blockchains are built around the principles of decentralization, removing the single-point-of-failure risk (think Equifax servers) and cutting out unnecessary third parties by establishing a more direct, peer-to-peer network. This also protects your privacy and keeps your data out of the hands of third-party apps, since data rests at the protocol layer instead of the application layer.
For something like FaceApp, this means you could temporarily grant access to your photo stored on the blockchain in order to use its fun filters, but FaceApp wouldn’t be able to maintain a copy (due to encryption and the control of your private key resting with you). Something like this will definitely exist in the not-so-distant future, and we will wonder why we ever blindly gave up so much control of our personal data to use things like today’s social media platforms.
— Timothy Paolini, board member, NYU Blockchain
FaceApp, and any entity that uses facial recognition, should be of concern for everyone. FaceApp’s terms state that once you give it access to your face and name, the company has a permanent license to do whatever it wants with them. This includes sharing/selling your face and name to unknown third parties. You can always change a password if it becomes compromised — you can’t change your face.
We believe in decentralization as a promising path to ensure web users worldwide have control of their data. MeWe is advised by the inventor of the web, Sir Tim Berners-Lee, and we are closely following Tim’s current work on the Solid project. Solid decentralizes the web by giving web users the freedom to choose where their data resides and who is allowed to access it. MeWe plans to be an early adopter of Solid.
— Mark Weinstein, CEO and founder of MeWe
FaceApp exposed what infosec experts have long known — video, image, audio and especially written content is extremely difficult to accurately authenticate as unmodified or produced by a given individual. At Audius, we focus on audio: Determining which part of a song came from where is nearly impossible.
Technology like FaceApp will lead to the proliferation of more hoaxes and fake content purporting to be generated authentically, exacerbating problems with inaccurate news that we already deal with every day. As a society, we will need to be more skeptical of the authenticity of digital content. The identity of the publisher will become a more important part of that equation in the absence of other cues.
With Audius, for example, you can authenticate that a specific artist produced a given piece of content, because that artist’s private key was used to sign the transaction that added the content to the network. Similarly, I believe we’ll see media outlets like CNN or The New York Times starting to authenticate that they actually produced given content by signing it with a public/private key mechanism.
— Roneil Rumburg, CEO and co-founder of Audius
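The sign-then-verify idea Rumburg describes can be sketched in a few lines. Below is a deliberately toy, insecure RSA-style example (tiny hand-picked primes, no padding, a hypothetical track name) meant only to illustrate the mechanism: the content’s hash is signed with a private key, and anyone holding the matching public key can confirm who produced it and that it hasn’t been altered. Real systems use vetted libraries and modern schemes such as Ed25519, not code like this.

```python
import hashlib

# Toy RSA-style keypair with tiny primes, chosen for readability only.
p, q = 61, 53
n = p * q                   # public modulus
phi = (p - 1) * (q - 1)
e = 17                      # public exponent
d = pow(e, -1, phi)         # private exponent (modular inverse, Python 3.8+)

def sign(content: bytes) -> int:
    """Hash the content, then apply the private key to the digest."""
    digest = int.from_bytes(hashlib.sha256(content).digest(), "big") % n
    return pow(digest, d, n)

def verify(content: bytes, signature: int) -> bool:
    """Recompute the digest and check it against the signature using the public key."""
    digest = int.from_bytes(hashlib.sha256(content).digest(), "big") % n
    return pow(signature, e, n) == digest

track = b"hypothetical artist track bytes"
sig = sign(track)
print(verify(track, sig))          # True: the signer's key produced this content
print(verify(track + b"x", sig))   # a tampered copy fails verification
```

The same pattern scales to the publisher use case in the quote: a media outlet signs each article with its private key, and readers (or their software) verify against the outlet’s published public key.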
These quotes have been edited and condensed.
The views, thoughts and opinions expressed here are the authors’ alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.