Personalization also gets in the way of the tension between people’s present, impulsive self and their aspirational self (Eli Pariser, 2012). According to Pariser, the way we behave is a blend of our present and future selves: in the future we want to be cultivated and well informed about what is happening in the world, but right now we want to watch ‘silly shows’ on Netflix. At its best, the media tries to strike a balance between ‘should’ and ‘want’ stories. Because filter bubbles mainly look at what we click on first, they can throw off that balance: instead of a balanced information diet, we end up being fed information junk food. This suggests we should always be aware that what we see on social media is mostly a reflection of who we are in the present, and that an over-personalized algorithm disregards our future identity. The flow of stimulating information has become omnipresent on the web; people are increasingly addicted to newsfeeds and related articles, creating compulsive media consumption and reinforcing ‘ideological polarization’ (Spohr, D., 2017). Coupled with the partial online isolation that hyper-personalization produces, curation is shifting from human editors to algorithms. The problem is that algorithms do not have the ethics that editors do. If algorithms are going to decide what actions we take and what we do or do not get to see, we need to make sure they also show us uncomfortable, challenging content and other points of view, that they are transparent enough, and that they give us some sense of control over what gets through our filters and what does not (Eli Pariser, 2012). We need the Internet to connect us all together, as we once dreamed it would. We need it to introduce us to new perspectives, new ideas and new people, instead of leaving us each in a web of one.
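The feedback loop described above can be made concrete with a minimal sketch. Assuming a feed that simply reweights topics by past click counts (all function names, topic labels and click rates here are hypothetical, chosen only for illustration, not taken from Pariser), a feed that starts evenly split between ‘should’ and ‘want’ stories quickly tilts toward whatever gets clicked more:

```python
# Illustrative sketch only: a naive click-driven recommender that
# reinforces topics proportionally to past clicks. Topic names and
# click rates are hypothetical.

def next_batch(weights, size=10):
    """Allocate feed slots to topics in proportion to their weights."""
    total = sum(weights.values())
    return {topic: round(size * w / total) for topic, w in weights.items()}

def simulate(rounds=5):
    weights = {"should": 1.0, "want": 1.0}      # feed starts balanced
    click_rate = {"should": 0.2, "want": 0.8}   # impulsive clicks dominate
    for _ in range(rounds):
        batch = next_batch(weights)
        for topic, shown in batch.items():
            # each shown item earns clicks, which feed back into the weights
            weights[topic] += shown * click_rate[topic]
    return next_batch(weights)

print(simulate())
```

After only a handful of rounds the ‘should’ stories occupy a small fraction of the feed, even though nothing in the system ever decided to exclude them: the imbalance emerges purely from click reinforcement, which is the mechanism the filter-bubble argument points at.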
Finally, we tend to want to be around people who are like us, to strengthen our vision of the world, and our online behavior is no different. People gather in groups based on interests, location, affiliations and other details, each with its own rules, norms, values and even vocabulary, intensifying shared beliefs and rejecting anyone who disagrees with the community: this creates what is called “communal reinforcement” (Morozov, E., 2011). In recent decades, search engines and social media have allowed the public to look up health-related questions online, and the information found through these searches is largely tailored to the individual by complex algorithms. In his paper, Holone explores how this technology poses an issue for both patients and clinicians. Childhood vaccination has become a hot topic on social media, and the stakes are serious: in certain situations, the filter bubble could, to a large degree, drive the decision between vaccinating a child and leaving it exposed to easily preventable diseases. In 2014, 23 measles outbreaks and more than 644 cases of measles were reported in the United States; perhaps most famous is the 2014/2015 outbreak in Disneyland, California. One of the reasons for the outbreak was a growing concern among parents about the real efficacy of vaccination (Holone, H., 2016). Furthermore, anti-vaccine organizations have been successful in spreading fear and misinformation, which contributed to lower vaccination rates and, in turn, to many people catching a preventable disease. In this case, the filter bubble has contributed to the dissemination of misinformation. The same issue applies to many other areas of health information, from cancer treatments, nutrition and diets to epidemic outbreaks.
Hence, representatives of the medical profession have to take precautions and be deliberately vague in their communication with the public, which only deepens the problem. The filter bubble has turned the health context into a complex issue, and solutions are hard to establish. However, the case study proposes a few possible directions: informing the public about what they see online and how it shapes their perception of reality; providing the option of unfiltered search, allowing people to get unbiased and relevant content; or switching to another search engine or service, which remains a challenge since most people are loyal to their trusted brands and services (Holone, H., 2016). All in all, even though the filter bubble was built with good intentions, to give users content they want to see, its side effects have been significant. In a world partially led by technology, it has become crucial to understand the flow of information we receive in our feeds and to be able to analyze it and judge its relevance. Because that flow is instrumentalized by social media, we must take into consideration that information is already shaped by opinion leaders and big institutions. Through the case study, we took a deeper look at how people get misinformed and influenced regarding health issues, leading to diseases that could easily be avoided and treated. Generally speaking, we tend to amplify our desires and ignore the unknown, mostly because we are scared of it. In the societies we live in, it has become essential to encounter different perspectives and ideas, and to be introduced to novelty, in order to create better cohesion among us all.