The algorithms that suggest your kids’ content

Alžbeta Kovaľová

When considering mental health issues linked to online behavior, our thoughts may turn to cyberbullying, but there is another concern far more formidable than we might realize — access. Search engines have given us easy access to ever more, and seemingly ever-fresh, sources of content that have the potential to be just as detrimental as cyberbullying, if not more so. Of course, search engines, along with personal computers, the internet, and the World Wide Web, are not inherently bad, but there is reason to be cautious. While we have seen great leaps in their development and functionality, have we as technology users understood the accompanying risks?
 
Paradoxically, while search engine algorithms represent one of the greatest areas of progress in computing, some of the most concerning issues are rooted in them. As search engines grew in use, their design evolved around deep learning, location data, and ever-greater processing power. This combination has made them more powerful, making it easier for users to find the content they request. However, it has also increased the opportunity for unwanted or harmful content to appear, or be requested, and potentially disturb the user.

Early search engines
The first web search engines were built in the 1990s, following Tim Berners-Lee’s successful proposal for the World Wide Web. Most of their key development took place in that decade; modern search engines such as Google, however, are now self-optimizing, with algorithms tuned in real time and daily improvements to the user experience to suit the “modern” user.

Before Google Search became the preferred search engine, there was another giant: Yahoo! Search. Founded in 1994, Yahoo! was one of the pioneers of the web, offering a hierarchical directory of websites organized by category — the Yahoo! Directory. At first, Yahoo! Search could only search this directory; it then started using its own web crawler to index the wider web, and eventually served up results from other search engines, such as Google and Bing. One of the reasons Yahoo! stumbled was that it prioritized old, trusted websites over newer, more relevant ones. In contrast, Google brought fresh content to its users, making it increasingly popular.

Over time search engines have become even more sophisticated and elaborate and have gone mobile. Users, many of them children, have the entire web at their disposal at all times. This also means that the potential to access or receive inappropriate or harmful content is very high. The internet is an expansive place that gives various groups and communities the opportunity to meet and scale their influence for good and bad.

Not your neighborhood library
One thing may lead to another, and a child, a minor, or even an adult might stumble upon, be drawn to, or deliberately seek out content that might be harmful to them. This issue has been present ever since the creation of the first search engines. A growing problem, risky content is readily available on social media, online forums, websites, and in ads.
 
Back to the evolution of the search engine. To better serve users, search engines and social media alike started using predictive search and monetizing it; as a result, the algorithms leveraged by search engines began not only to locate content but also to suggest it. Large social platforms and search companies employ these developments to drive profit, via ads for example, but also to “feed” users content that has the potential to (artificially) broaden their interests. In this manner, search behavior informs users’ “for you” or “suggested” pages. This can be particularly problematic for children and young adults, whose interests and personalities may not have fully formed. The pattern also opens up children and their interests to immediate and future monetization.
 
When “search” gets personal
Parents and educators need to be aware of the dangers awaiting minors online and be educated enough to help them. To highlight how direct a correlation there is between behavior-based search and the provided results, let’s consider how easily a “What I eat in a day” video may land you on a pro-ana (pro-anorexia) online forum, a thinspiration (thin inspiration) message board, or even a thread full of self-harm tips or other explicit content.

Both the social issues and the technology have evolved to a point where stemming the flow of harmful content is a difficult task. Algorithms work tirelessly to bring users content they calculate users might enjoy and interact with. Therefore, we have to do everything in our power to protect children, minors and ourselves.

This issue has now captured the attention not only of popular media, but also of some governments that recognize the danger this brings. The story of Molly Russell was one of the first to bring the issue to light and get people talking. Even though large social media platforms endeavor to protect their users, their efforts certainly lag behind rapid developments in business and technology.
 
Some states have taken it upon themselves to protect the most vulnerable. In early March 2022, lawmakers in the US state of Minnesota set out to pass a law prohibiting social media platforms from using algorithms to suggest content to anyone below 18 years of age. However, this initiative has met opposition. Tech industry lobbyists claim passing the bill would violate the First Amendment, preventing companies from recommending useful content to users, and would require the companies to collect more data on their users. Another argument in opposition is that the law, however well intended, would undermine parental choice and restrict access to useful technologies.

A toolbox of prevention
It is natural that kids want to spend time on the internet, but they should not be wholly unsupervised. A great tool to help you keep tabs on your child’s behavior online is Parental Control. In addition to providing limits on how long your child can access certain apps and websites, it can also block specific content types and URLs for PCs and mobile devices alike.

One of the best features of ESET Parental Control, found in ESET Smart Security Premium, is Web Guard. Since websites can be categorized according to keywords, Web Guard blocks categories it deems inappropriate for your child’s age group. Of course, adult sites featuring pornography and gambling are blocked for all age groups. For Android devices, there is even a Safe Search feature that filters search engine results, so you do not have to worry about search engines suggesting inappropriate content your child is not ready to view. You can also manually blacklist websites and apps you deem inappropriate for your child. The same applies to whitelisting appropriate resources.

Whether or not you start using Parental Control, an even more important task remains: educating yourself about the content that is on the web and having regular conversations with your children about the online and offline world. Talking to your children is one of the best tools you can give them to protect themselves. Education on any subject should start in the family, and that is especially true for personal and private topics and our online presence.

Children and minors deserve to be treated with respect and educated about the choices we make about or for them. Talking to them about their online behavior may make them feel like we are invading their privacy, so be sensitive and make sure they feel heard and understood.

To learn more about safety online for children, visit saferkidsonline.eset.com.