After Facebook
Many of us struggle to define the concept of privacy and to know our boundaries online. So what is internet privacy?
“Internet privacy involves the right or mandate of personal privacy concerning the storing, repurposing, provision to third parties, and displaying of information pertaining to oneself via the Internet.”
The public relations crisis that Facebook faced in recent weeks after the Cambridge Analytica scandal brought forward many questions about privacy: who has access to all this data, what can be done with it, and how that affects the way we use these platforms.
My main concern is the privacy of kids. Although COPPA (the Children’s Online Privacy Protection Act, passed in 1998 and in effect since 2000) presumably protects the privacy of children under the age of 13, I’m no longer sure that they are protected.
After the Facebook scandal, I have grown less and less confident that we can protect their privacy. Take YouTube, for example: you need to be 13 to have an account, yet both of my kids are under 13 and have access. Even if it is not their own account and they are using their parents’ accounts, how can an algorithm distinguish between the parent and the child?
YouTube announced an option to filter videos by approved content, so parents can check the channels and subjects and have more control over what their kids see, especially after incidents in which a favorite Disney character appeared in violent or sexual situations.
This action comes four months after it was discovered that unsuitable videos were available on so-called kid-friendly pages. The screening process depends on machine learning algorithms, but nevertheless, some cartoons disguised as appropriate content for kids manage to find their way onto the platform.
When a kids’ video is uploaded to the main platform, it does not automatically find its way into the YouTube Kids library; it is screened and reviewed by an algorithm to ascertain whether it is suitable for the kids’ app, and the process can take days.
Hopefully, once this new setting switches on, together with the new tool that YouTube plans to launch later this year allowing parents to screen anything their kids can see, users will be able to choose appropriate and acceptable content for children. That still leaves us with the problem that a human, the parent, will have to create this list and keep screening it. No matter how hands-on parents try to be, something will slip, and damage could be done.
And even if we manage to protect the information that reaches the kids, are we able to protect the information taken from them, which violates their privacy even more?
Google, YouTube’s parent company, has maintained that protecting kids and families is, and always will be, a priority. But maybe this is a wake-up call for the tech industry, as people reevaluate their relationships with these platforms, with information sharing, and with protecting privacy.