Most of the time, I'm unable to share the work I've done with companies, or even talk about what was made, because of rather lengthy NDAs. That's not odd, of course: companies want to protect their upcoming features and processes as much as possible. Honestly, it also has a lot to do with not letting freelancers get any of the glory. Whether it's been Spotify, IKEA, or iZettle - bulletproof NDAs are the reason you won't find those case studies in my portfolio. This level of privacy is understandable.
Out in the wider world, the same secretive, privacy-focused concerns remain in some areas. Look at the way public cameras are viewed today: they're seen as an intrusion into our private lives.
When it comes to our personal tech, however, we seem to lose all our morals as long as there's some fancy new feature to give us a quick buzz. We are all guilty of this to one degree or another. We fail to ask ourselves, "Who's watching?" and "Who's listening?".
Who's watching?
A while back, there was a huge buzz around the app FaceApp. It takes a picture of you and adds wrinkles to give you an idea of what you'll look like in 30 or so years. Funny, right? Lots of people certainly seem to think so, given that the app has 800,000 reviews on the App Store with a 4.7 rating. In 2017, it was reported to have around 80 million active users.
But when it really comes down to it, you've just given a company you know nothing about permission to use your picture. A company that specializes in face-detection algorithms. To top it all off? It's a Russian company.
Shortly after the playful photo-transforming FaceApp went viral Wednesday as the most downloaded smartphone app in America, a nationwide panic began to set in: Who was this shadowy Russian tech firm everyone had been sending their photos to? And what did they want with millions of people’s faces?
Washington Post: Panic over Russian company’s FaceApp is a sign of new distrust of the Internet
Luckily, with FaceApp it seems there wasn't much cause for alarm - we think. The photos are all stored on American servers run by American companies (mostly Amazon), and no evidence was found that the company has any ties to the Russian government - but then, neither did Cambridge Analytica and Facebook at first, so keep that in mind.
Who's listening?
While sharing our image with little thought can be scary, it's not just our faces that we're mindlessly sharing. People talking on their phones seem to forget that they're still in a public space. Even without eavesdropping, I'm sure you've heard plenty of things that weren't meant for public knowledge. Even worse, some people use the phone's external speaker rather than the earpiece, practically inviting everyone to hear both sides of the conversation.
It's amazing how many (actionable) company secrets you can overhear by just sitting in a public place for an hour. And we keep talking like "advanced IT security" is what's most important...
— Erik Bernskiöld (@ErikBernskiold) October 24, 2019
We're concerned about having our meetings "in private" while, at the same time, we keep filling these conference rooms with 'smart assistants' even though we all know better.
By now, the privacy threats posed by Amazon Alexa and Google Home are common knowledge. Workers for both companies routinely listen to audio of users—recordings of which can be kept forever.
Ars Technica: Alexa and Google Home abused to eavesdrop and phish passwords
So far these smart assistants have mostly lived in our homes (still listening to private conversations, just of a different kind). That's changing, though, as Google gets ready to roll out its Assistant for work. It's strange that even though we all know they listen in on far more than they should, we're still too focused on the new features:
Assistant is finally getting ready for work, and what's fascinating is coming to realize how compelling it could be for businesses that are already using Google's tools all day long, even at a basic level. In the near future, Assistant will be able to:
- Access work calendars, finally, on Assistant devices (provided your admin doesn't disable it)
- Jump into work meetings from Assistant devices by saying 'Hey Google, join my next meeting'
- Hangouts Meet hardware will allow voice commands as well, so you can say "Hey Google, end the meeting" or "turn on spoken feedback" to get accessibility features.
Charg.ed: Google Assistant goes to work
So in return for having to click ONE BUTTON to join or end a call, we're willing to give away the contents of the entire meeting to Google? We're just not thinking this through...
What does it take?
I wouldn't describe myself as someone who's overly concerned about privacy. Years ago, I deleted my Facebook account, but that was more for personal health reasons than out of privacy concerns - I still have Instagram, so... Still, I AM amazed at how upset people were about the Facebook-Cambridge Analytica scandal, only to run straight to the next thing waiting to abuse their privacy.
Still, experts said, the FaceApp anxiety highlighted how quickly public attitudes about the Internet have changed amid a widespread reckoning over data privacy and election interference, with more people beginning to think twice about the personal data they freely give up — and the companies they decide to trust.
Washington Post: Panic over Russian company’s FaceApp is a sign of new distrust of the Internet
I agree that our awareness of and concern about privacy have grown, but if anything the FaceApp anxiety showcased how we act as humans - a big dose of 'too little, too late'. There's almost no point in thinking twice about how much personal data you've given up, and to which companies, after the fact. What's done is done, and you're in damage-control mode now.
We still find ourselves blinded by features and don't think about the consequences until afterwards, once the buzz has settled. We praise Tesla for its always-on connectivity but have little to no knowledge of what they're tracking. We use Google and Facebook services daily, even though we've seen the evidence over and over that our data is being misused for their benefit.
I believe it's our duty as designers not only to design better products, but also to raise awareness of the privacy concerns that some of these beloved products come with. With a united voice, perhaps we can begin to wake consumers up to the danger they're in when they consider a product's features before its intent.
Stay safe everyone and keep your data as private as you can.