Privacy predictions for 2021, according to global cybersecurity company Kaspersky. 2020 saw an unprecedented increase in the importance and value of digital services and infrastructure.
From the rise of remote working and the global shift in consumer habits to huge profits booked by internet entertainers, we are witnessing how overwhelmingly important the connected infrastructure has become for the daily functioning of society.
What does all this mean for privacy?
With privacy more often than not being traded for convenience, we believe that for many, 2020 has fundamentally changed how much privacy people are willing to sacrifice in exchange for security (especially from the COVID-19 threat) and access to digital services.
How are governments and enterprises going to react to this in 2021? Here are some of our thoughts on what the coming year may look like from the privacy perspective, and which diverse and sometimes contrary forces are going to shape it.
- Smart health device vendors are going to collect increasingly diverse data – and use it in increasingly diverse ways:
Heart rate monitors and step counters are already standard in even the cheapest smart fitness bands. More wearables, however, now come with an oximeter and even an ECG, letting you detect possible heart rhythm issues before they cause you any trouble. We think more sensors are on the way, with body temperature among the most likely candidates.
And with your body temperature being an actual public health concern nowadays, how long before health officials want to tap into this pool of data? Remember, heart rate and activity tracker data – as well as consumer gene-sequencing results – have already been used as evidence in a court of law.
Add in more smart health devices, such as smart body scales, glucose monitors, blood pressure monitors and even toothbrushes, and you have huge amounts of data that are invaluable for marketers and insurers.
- Consumer privacy is going to be a value proposition – and in most cases it will cost money:
Public awareness of the perils of unfettered data collection is growing, and the free market is taking notice. Apple has publicly clashed with Facebook, saying it has a duty to protect its users’ privacy, while Facebook is wrestling with regulators over its plans to implement end-to-end encryption in its messaging apps.
People are more and more willing to choose services that have at least a promise of privacy and even pay for them. Security vendors are promoting privacy awareness, backing it with privacy-oriented products; incumbent privacy-oriented services like DuckDuckGo show they can have a sustainable business model while leaving you in control of your data, and startups like You.com claim you can have a Google-like experience without the Google-like tracking.
- Governments are going to be increasingly jealous of big-tech data hoarding – and increasingly active in regulation:
The data that the big tech companies have on people is a gold mine for governments, democratic and oppressive alike. It can be used in a variety of ways, from using geodata to build more efficient transportation to sifting through cloud photos to fight child abuse and peeking into private conversations to silence dissent.
However, private companies are not really keen on sharing it. We have already seen governments around the world oppose companies’ plans for end-to-end encrypted messaging and cloud backups, pass legislation forcing developers to plant backdoors in their software, voice concerns over DNS-over-HTTPS, and enact ever more laws regulating cryptocurrency. But big tech is called big for a reason, and it will be interesting to see how this confrontation develops.
- Data companies are going to find ever more creative, and sometimes more intrusive, sources of data to fuel the behavioural analytics machine:
Some sources of behavioural analytics data are so common we can call them conventional, such as using your recent purchases to recommend new goods or using your income and spending data to calculate credit default risk.
But what about using data from your webcam to track your engagement in work meetings and decide your yearly bonus? Using online tests that you take on social media to determine what kind of ad will make you buy a coffee brewer? Using the mood of your music playlist to choose which goods to market to you, or how often you charge your phone to determine your credit score?
We have already seen these scenarios in the wild, but we are expecting marketers to get even more creative with what some data experts call AI snake oil. The main implication of this is a chilling effect: people having to weigh every move before acting. Imagine knowing that choosing your Cyberpunk 2077 hero’s gender, romance line and play style (stealth or open assault) will somehow influence some unknown factor in your real life down the line. Would it change how you play the game?
- Multi-party computations, differential privacy and federated learning are going to become more widely adopted – as well as edge computing:
It is not all bad news. As companies become more conscious of what data they actually need and consumers push back against unchecked data collection, more advanced privacy tools are emerging and becoming more widely adopted.
From the hardware perspective, we will see more powerful smartphones and more specialized data processing hardware, like Google Coral, Nvidia Jetson and Intel NCS, enter the market at affordable prices. This will allow developers to create tools that perform heavy data processing, such as running neural networks, on-device instead of in the cloud, dramatically limiting the amount of data transferred from you to the company.
From the software standpoint, more companies like Apple, Google and Microsoft are adopting differential privacy techniques to give people strict (in the mathematical sense) privacy guarantees while continuing to make use of data. Federated learning is going to become the go-to method for dealing with data deemed too private for users to share and for companies to store.
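To make the differential privacy idea concrete, here is a minimal sketch of the Laplace mechanism, one of the standard techniques behind such guarantees. The function name and parameters are illustrative, not any vendor’s actual API: a counting query has sensitivity 1, so adding Laplace noise scaled to 1/epsilon yields an epsilon-differentially-private answer.

```python
import math
import random

def dp_count(values, predicate, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1 (one person's record changes the
    count by at most 1), so noise drawn from Laplace(0, 1/epsilon) makes
    the released answer epsilon-differentially private.
    """
    true_count = sum(1 for v in values if predicate(v))
    # Inverse-CDF sampling from the Laplace distribution.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(math.log(1 - 2 * abs(u)), u)
    return true_count + noise
```

The smaller the epsilon, the stronger the privacy guarantee and the noisier the released count; production systems (such as those Apple and Google describe) build far more machinery around this core trade-off.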
With more educational and non-commercial initiatives, such as OpenMined, surrounding them, these methods might lead to groundbreaking collaborations and new results in privacy-heavy areas such as healthcare.
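The federated learning approach mentioned above can be sketched in a few lines. This is a toy illustration of its aggregation step (often called federated averaging), not any real framework’s API: each client trains on its own device and shares only model weights, so the raw data never leaves the device, and the server combines the weights in proportion to each client’s dataset size.

```python
def federated_average(client_weights, client_sizes):
    """One aggregation round of federated averaging.

    client_weights: one weight vector (list of floats) per client.
    client_sizes: number of local training examples per client,
    used to weight each client's contribution.
    """
    total = sum(client_sizes)
    dim = len(client_weights[0])
    # Weighted average of each weight coordinate across clients.
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]
```

In a real deployment this loop repeats over many rounds, and techniques like secure aggregation or differential privacy are layered on top so that even the shared weight updates reveal little about any individual.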
We have seen over the last decade, and the last few years in particular, how privacy has become a hot-button issue at the intersection of governmental, corporate and personal interests, and how it has given rise to such different and sometimes even conflicting trends.
In more general terms, we hope this year helps us, as a society, to move closer to a balance where the use of data by governments and companies is based on privacy guarantees and respect of individual rights.