
Dec 15, 2016 01:07 AM EST

The inner workings of Google's search engine have long been a subject of study among scholars. Researchers have pressed the company to share the principles that determine how it decides what information to show users. Given that lack of transparency, the digital era has never seen careless misinformation wield so much influence, with users delegating their critical thinking to whatever is served up on the web.

In a response to the Guardian, Google acknowledged the negative coverage and withdrew the offensive autosuggest results for specific search queries. When a user typed "jews are", Google suggested completing the query as "jews are evil" and then surfaced links to numerous right-wing antisemitic hate sites.

This concern also fed the controversy over misinformation during the US general election. Facebook CEO Mark Zuckerberg addressed the controversy, conceding that the main problem is structural: Facebook financially rewards fake news and sensationalism spread through the social network without regard to its impact or truthfulness. The platform does not determine whether reporting is erroneous, nor does it distinguish fake news from satire.

Business Insider UK reports that during the US presidential election, legitimate news stories were overshadowed by fake ones, some of them spread by well-known media companies. A tool that monitored Facebook found that the top 20 fake news stories generated more engagement than the top 20 real ones.

The social network is now trying to solve a problem it helped create, chiefly by developing algorithmic solutions that evaluate the trustworthiness of content. However, Facebook is not using its vast reach to promote media literacy, nor does it encourage users to think critically about what they read or share.

This strategy is believed to carry dangerous long-term social consequences: Facebook would, in effect, be training its users to outsource their critical thinking to computer algorithms. It also undercuts the very 21st-century digital skills users need, including reflective judgment about how technology shapes their relationships and beliefs.

The dynamics that Google and Facebook have created do little to encourage users to exercise critical thinking. People who believe fake news are not lacking in intelligence; they have simply been conditioned to accept what they are led to believe.
