Big Tech uses data to the detriment of consumer health, and we have grown accustomed to it. So what is the solution?

Opinions expressed by Entrepreneur contributors are their own.

As brilliantly chronicled in the 2020 Netflix documentary The Social Dilemma, most technology today uses data on the human brain in ways that harm consumers' health. Examples range from bombarding people with ads to building addictive social-media algorithms that keep them online 24/7. However, awareness of these practices is growing as former tech leaders speak out against them and demand for data privacy increases.

But what if data privacy isn’t the only solution to these invasive practices?

Hooked

Big Tech today uses our data to target some of our worst insecurities and play on our dopamine receptors, creating a web-browsing experience that has even been compared to gambling.

Examples of these practices are well documented: prompting people to subscribe to newsletters, add items to their carts, sign up for services, etc. Social-media platforms, such as Twitter and Facebook, have gone from notifying users when someone has interacted with them to informing users of activities that have nothing to do with them. Big Tech’s approach to leveraging data and behavioral science to boost its bottom line has come at the expense of consumers, who have become literally addicted to its products.

A 2015 study found that nearly half of those who pledged to leave Facebook for just 99 days didn’t even make it through the first few days. And many of those who quit successfully had access to another social-networking site, like Twitter, and simply shifted their addiction. And, to put it mildly, it’s not as if social media has become any less addictive since then.

Of course, businesses are inherently profit-driven and will not scale back their exploitation of consumer data out of sheer goodwill. But they, of all organizations, know that the customer is always right. And these days, customers want healthy data practices.

RELATED: 3 Reasons Your Company Should Prioritize Data-Privacy Compliance and Security Issues

The road ahead

So far, governments have addressed this concern by forcing companies to let users decide “what privacy rights to give up, what data you’re willing to part with,” says Colin Gray, a human-computer interaction researcher at Purdue University.

Since 2018, the General Data Protection Regulation (GDPR) has required EU companies to ask people for consent to collect certain types of data. Yet the banners of many apps outside Europe ask users to accept privacy policies with no option to opt out. Facebook’s new Privacy Checkup guides users through a range of options with brightly colored images, though the defaults are often set with very little privacy in mind. The endless grid of different checkboxes has the effect of overwhelming users.

Despite the shortcomings of the GDPR, it is clear that restraint on data exploitation has begun to be written into law, and enforced by it. More than three quarters of countries around the world have either drafted or enacted some type of personal data-privacy protection in recent years, including China, Russia, Brazil and Australia. In September, WhatsApp was fined 225 million euros by the Irish Data Protection Commission for not being transparent enough about its privacy policies. In 2019, Facebook paid a $5 billion fine for making “misleading claims about consumers’ ability to control the privacy of their data”.

The way forward is twofold: First, tech companies need to learn to respect data privacy and collect data ethically. Second, when they inform their products with user data and behavioral science, they must do so in a way that promotes user well-being, not exploits it.

RELATED: 50 things you need to know to optimize your company’s data privacy and cybersecurity

How pervasive can healthy data practices be?

The exploitation of data has by now become so ingrained in daily life that it seems almost impossible to reverse. But there are technologies and approaches already working toward that goal.

Decentralization and privacy have played a major role in the conversation surrounding, for example, Web 3.0 – the new Internet. Privacy advocates, many of whom come from the cryptocurrency sector, regularly argue that blockchain and decentralization should play a central role in the development of Web 3.0 and the Internet of Things (IoT) to prevent the kinds of exploitative data practices we see on the current web. Through decentralization, this healthy approach that treats data privacy as a cornerstone of Web 3.0 paves the way for freedom from corporate surveillance. Like the early Internet, the new Internet emphasizes a community-driven opportunity to drive change, and that includes the way we handle our data.

If the developers who build Web 3.0 are committed to decentralization and privacy, then bad practices can be discouraged and slowly weeded out. Privacy-focused browsers, such as Brave and DuckDuckGo, are only a sideshow to Google today, but they may point to a world in which data privacy is the norm on the web.

On the messaging front, cross-platform apps like Signal serve as an alternative to WhatsApp’s intrusive practices, offering end-to-end encryption and privacy. Signal is open source, peer reviewed and funded entirely by grants and donations. This stands in contrast to the monetization model that has commercialized the Internet, and it gives people more control over their experience.

Beyond mere privacy, the companies that collect and handle user data should eventually find better uses for that data if they want to survive – making people’s lives better rather than turning them into dopamine addicts. This extends beyond social media and messaging and into areas like MedTech.

As the health care system begins to screen entire populations for disease, it will be important to act on red flags and find ways to prevent further health complications. Behavioral science and data can instead be leveraged to encourage healthy behavior. Applications where data is stored privately and used only in the context of an individual’s health journey can initiate conversations and behavior analysis aimed at improving personal well-being rather than actively harming it.

In the automobile sector, companies such as Tesla are using streams of data from their large fleets of vehicles to implement real-time safety improvements. Tesla’s cars vacuum up all kinds of data from their environment with sensors and cameras, which machine-learning algorithms then analyze to monitor the car’s position and detect deviations. The company claims its system can detect, within ten milliseconds, what type of collision the driver is about to experience. When an accident occurs, Tesla knows the exact position of the seats and steering wheel and deploys the airbags accordingly for optimal safety.

RELATED: Americans want Facebook and TikTok to be banned over privacy concerns

Tech giants that want to make the most of the data they collect have lessons to learn from companies already using data privately and for healthy applications. The era of Big Tech surveillance is nearing its end. Investing in the well-being of the people a business’s success depends on, rather than in short-term profits, will drive true growth going into 2022 and beyond.
