WhatsApp Sets Up End-to-End Encryption Marketing Campaign
Updated · Feb 21, 2022
At an April charity event for the National Society for the Prevention of Cruelty to Children (NSPCC), Priti Patel, the home secretary, said she had concerns that the move could put children in danger. The NSPCC also added that private messaging could become a breeding ground for the online grooming of children.
The company's announcement in January that it would change its terms and conditions (T&Cs) led to a mass exodus to competitors like Telegram and Signal.
But it's now clear that users misunderstood the T&Cs. Most thought WhatsApp would share their data with Facebook, which couldn't be further from the truth.
Speaking to the BBC, WhatsApp boss Will Cathcart said the company was taking accountability for the confusion arising from its announcements at the beginning of 2021. He added that part of the decision to run the adverts was to clear up any doubts.
Not Backing Down
Cathcart said that the technology was vital in a world full of hostile governments and continuous threats from hackers.
He further said that the authorities should be demanding more privacy and security rather than fighting WhatsApp over this move.
He went on to say that states shouldn't be pressuring tech companies to build in less protection for fear of repercussions from authorities. Instead, they should be mandating more robust safety measures for the benefit of their citizens.
He also said the company was happy to educate the general public about data protection and encryption directly. He explained that the term was abstract, and WhatsApp was determined to translate it for people.
How It Will Work
The campaign will begin running in Germany and the United Kingdom today. After that, it’ll run globally.
It will run across different mediums, such as TV, radio, digital outdoor advertising, and online.
End-to-end encryption ensures that only the sender's and recipient's devices can read the messages they exchange. In other words, neither the government, WhatsApp, nor Facebook can access them.
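The core idea can be illustrated with a toy sketch: the two endpoints agree on a shared key via a Diffie-Hellman exchange, so the key itself never travels over the network, and the server only ever sees ciphertext. This is a deliberately simplified illustration using a tiny modulus and an XOR stream cipher; it is not real cryptography, and real apps like WhatsApp use the far more sophisticated Signal protocol.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters. Real systems use 2048-bit-plus groups;
# this small prime (2^32 - 5) is for illustration only -- NOT secure.
P = 0xFFFFFFFB
G = 5

def make_keypair():
    """Generate a private exponent and the public value g^priv mod p."""
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)
    return priv, pub

def shared_key(my_priv, their_pub):
    """Both sides compute g^(ab) mod p and hash it into a 256-bit key."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_cipher(key, data):
    """Toy stream cipher: XOR the data with a SHA-256-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob each generate a keypair; only the public halves are sent.
a_priv, a_pub = make_keypair()
b_priv, b_pub = make_keypair()

# Each side derives the same key without the key ever being transmitted.
k_alice = shared_key(a_priv, b_pub)
k_bob = shared_key(b_priv, a_pub)
assert k_alice == k_bob

ciphertext = xor_cipher(k_alice, b"hello")  # all the relay server sees
plaintext = xor_cipher(k_bob, ciphertext)   # only Bob can recover this
print(plaintext)  # b'hello'
```

The point of the sketch is the property the article describes: any party in the middle, including the service operator, holds only `ciphertext`, while the decryption key exists solely on the two endpoint devices.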
What WhatsApp is Doing About Misinformation
Using machine learning and user reports, WhatsApp can flag inappropriate content. It also monitors message volumes and the groups a user joins.
Additionally, there are limits on the number of chats a user can forward a message to within the app. The platform also flags messages that have been forwarded many times.
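The forwarding rules described above can be sketched in a few lines. The thresholds and function names here are assumptions chosen for illustration, not WhatsApp's actual values or API: a cap on how many chats one forward action can target, and a counter that marks a message as highly forwarded.

```python
# Hypothetical forward-limit check; thresholds are assumed, not WhatsApp's.
MAX_FORWARD_TARGETS = 5         # assumed cap on chats per forward action
HIGHLY_FORWARDED_THRESHOLD = 5  # assumed count that triggers a "forwarded many times" flag

def check_forward(forward_count, target_chats):
    """Return (allowed_targets, flagged) for one forward attempt.

    forward_count -- how many times the message has already been forwarded
    target_chats  -- chats the user is trying to forward it to
    """
    allowed = target_chats[:MAX_FORWARD_TARGETS]      # cap the fan-out
    flagged = forward_count >= HIGHLY_FORWARDED_THRESHOLD
    return allowed, flagged

# A message already forwarded 7 times, sent toward 8 chats:
allowed, flagged = check_forward(7, [f"chat{i}" for i in range(8)])
print(len(allowed), flagged)  # 5 True
```

Capping fan-out per action slows viral spread multiplicatively, while the flag gives recipients a visible cue that a message did not originate with the sender.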
In 2020, thanks to these tools, the app reported 300,000 images of missing and exploited children. Additionally, the company bans around two million accounts monthly for violating its code of ethics.
Eve is a lover of everything technology. Figuring out how software works and creating content to shed more light on the value it offers users is her favorite pastime. When not evaluating apps or programs, she's busy trying out new healthy recipes, doing yoga, meditating, or taking nature walks with her little one.