When someone asks us about the suppression of freedom of speech, we tend to think of totalitarian regimes or dictatorships, where the government or leader uses a powerful network of spies, agents, police, and other services to silence any idea opposed to the regime's widespread propaganda.
In reality, there are other, underlying forces that can shape our freedoms. Because they are built in by design, they can pass unnoticed and become part of the common norm.
In his famous conformity experiments, psychologist Solomon Asch showed how a group can influence what a person says, even when everything the person said is factually correct. Under the influence of the group, the person can start doubting that what he or she is saying has any substance.
Now, imagine a slightly different experiment: every time a person asks a group of peers something, the group ignores it, giving the questioner the silent treatment. In the end, the person will get the impression that no one is interested in what he or she has to say and that his or her ideas and thoughts have no value.
Most of the online tools we use — search engines, news aggregators, and social networking services — have exactly the same effect on us. Even worse, they build harmful, addictive habits, creating a completely distorted impression of what the real world looks like. Instead of helping us build communities and spread ideas, they actually suppress ideas. By favoring only those who are already at the top, they prevent those at the bottom from rising to the surface.
Our internet search engines, regardless of which one we use, are bad for us. The simple reason is that, most of the time, search engines give us “relevant” results, but “relevant” does not mean that the results are what we are actually looking for; it only means that certain parameters — interlinking with other sites, keywords, number of views, and especially number of mentions in popular news — have pushed the article to the top. It does not mean that the article is truthful, that it brings fresh and critical ideas, or that it corresponds to what we were looking for in the first place.
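To make the point concrete, here is a toy sketch (not any real engine's algorithm; the signals and weights are invented purely for illustration) of how a “relevance” score can be nothing more than a weighted sum of popularity signals, with no term anywhere for truthfulness or originality:

```python
from dataclasses import dataclass

@dataclass
class Page:
    title: str
    inbound_links: int   # interlinking with other sites
    keyword_hits: int    # keyword matches for the query
    views: int           # number of views
    news_mentions: int   # mentions in popular news

def relevance(page: Page) -> float:
    # Hypothetical weights, chosen only for illustration.
    # Note: nothing here measures whether the page is true or useful.
    return (0.4 * page.inbound_links
            + 0.2 * page.keyword_hits
            + 0.1 * page.views
            + 0.3 * page.news_mentions)

pages = [
    Page("Obscure but insightful essay", inbound_links=2,
         keyword_hits=9, views=40, news_mentions=0),
    Page("Heavily promoted listicle", inbound_links=300,
         keyword_hits=3, views=90000, news_mentions=25),
]
ranked = sorted(pages, key=relevance, reverse=True)
print([p.title for p in ranked])
# -> ['Heavily promoted listicle', 'Obscure but insightful essay']
```

Under any score of this shape, a heavily linked and heavily mentioned page always outranks an obscure but insightful one, regardless of which of the two actually answers the question.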
When we shop in bookstores, we spend time there; we gaze, we stroll, and, sometimes, choose things without any special reason. We talk with our friends, and they tell us stories about what they have read. Frequently, we just randomly stumble upon something that completely changes our perception.
Search engines do not have magical powers to determine what we need and what we desire — or even what will surprise us — they just collect previous data and give us results similar to the things we have already read or visited. Let’s say we are curious to find the meaning behind the term “Aspergillus.” We will type the term into a search engine, get results, and then click on whatever is on the first page — at least, that is what the great majority of users do, since a recent study found that 91.5% of users don’t go to the second page of results *1. If we are not satisfied with those results, we might click through the next few pages, but that is something people rarely do — just 4.8% reach the second page and 1.1% the third.
Out of a million results, in the great majority of cases, we will check only those displayed on the front page of the search engine, or maybe those on the page after. We will never find out what is on page 756, not even accidentally — even if that page holds interesting findings about mould’s connection to the ground-breaking research we are currently working on.
Now, you may think, “Why don’t we change our keywords to get that result?”
The answer is both simple and complex at the same time: we cannot change keywords to something we do not know.
Our search engines go even further to dull our senses and narrow our open-mindedness. They remember our searches and offer similar pages in the future, even if we no longer want to know anything about “Aspergillus.”
Furthermore, on any site that carries AdSense adverts or is affiliated with the search engine company (like your favorite video platform), the engine will saturate our online space with adverts from pharmaceutical companies (in this case), secretly sending subliminal messages into our minds. It will also use the same advertisements to track our behavior on the internet — what kind of sites we visit, what we are looking for, and much more.
And often, articles at the top of the search results are pushed there by marketing strategies rather than by the quality of their content. They are saturated with adverts and banners and rarely have content worth mentioning.
If it is not something extremely shocking, it will be quickly forgotten. There is almost a “requirement” to have shocking news, and mainstream journalists compete to write news that will shock you more than what the other stations are offering. Usually, that creates the opposite effect: just as people who are scared too many times become immune to shock, the news audience may simply stop caring. That creates a paradox: because news agencies know about this effect, they start fighting for attention even harder, like children who seek attention by misbehaving.
Eventually, saturated with marketing content, you may buy one of their products, puzzled as to why you did it in the first place. However, that is only a mild effect compared to what is going on under the surface: all that saturation will make you dull, less creative, and significantly more ignorant.
In real life, we are randomly exposed to different kinds of things, and that makes life interesting. People tell different stories and make our lives more colorful. The more we are exposed to the same content, the more our stories will align. We will all start to look alike, constantly repeating the same things, like mindless automatons.
For new authors, that is really bad news. Even if they have a nice story to tell, if they have not built a social network and community around themselves, they are destined to fail.
Unfortunately, the worst thing is that we currently do not have any reasonable alternatives, as the concept of digital searching requires precision, leaving no room for coincidence. The user interfaces of search engines are built in a way that roots out accidents. They have strict analytical models that rate pages by strict rules and in a defined order.
Interestingly enough, since many major breakthroughs come by pure accident, maybe the results we get should not be in a strict order; instead, they should include those random “events.” Imagine if search engines gave you 30% of the results as a random sample from other pages, maybe some random new or old results, instead of only the most relevant ones.
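As a rough sketch of that idea (the function, the page size, and the 30% share are just the hypothetical proposal above, not a feature of any existing engine), one could blend the top-ranked hits with a random sample drawn from deeper in the result list:

```python
import random

def blended_results(ranked, page_size=10, random_share=0.3, seed=None):
    """One results page: the top hits, plus a random sample
    drawn from deeper in the ranked list for serendipity."""
    rng = random.Random(seed)
    n_random = int(page_size * random_share)  # e.g. 3 of 10 slots
    n_top = page_size - n_random              # e.g. 7 of 10 slots
    top = list(ranked[:n_top])
    tail = list(ranked[n_top:])
    # Pick the "accidental" results uniformly from everything
    # that would normally be buried below the fold.
    surprises = rng.sample(tail, min(n_random, len(tail)))
    return top + surprises

# With 1000 ranked results, a page holds the top 7 plus 3 surprises.
page = blended_results(list(range(1000)), seed=42)
print(page)
```

The first seven entries are still the best-ranked ones, so precision is mostly preserved, but the last three give page 756 a real, if small, chance of being seen.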
Search engines are just a part of our online freedom of speech problem. Modern news aggregators all suffer from a similar illness.
When we think about suppression of speech, for most of us, the first thing that comes to mind is censorship. We tend to imagine some kind of government agency from the Cold War-era Soviet Union, where appointed officials worked in dusty, dark rooms to edit everything that was to be published. In more severe cases, they worked closely with the secret police, fighting self-publishers and those who wanted to act independently, preventing any possibility of independent thought.
Censorship, by definition, is the suppression or prohibition of any parts of books, films, news, etc. that are considered obscene, politically unacceptable, or a threat to security.*2 Censorship is the simplest tool of suppressing the freedom of speech by filtering out content or information and deciding who needs to have access to what. Through censorship, authorities (government, company, agency ...) are acting as a moral compass, not allowing others to find out what is not in their best interests.
Fashion and taste have changed over time; we moved from moldy, dusty rooms to almost futuristically styled ones, and, in the process, we also rebranded “censorship,” giving it a new name: “moderation.” Almost every news aggregator has it — partly to stop ever-increasing malicious spam and marketing content — but, most of the time, to stop “self-promotion.” An army of moderators is there to do the job once held by secret agencies, and the only difference is that many of them do it without being paid. They do it because they believe it is the right thing to do. But the problem with information morality is similar to being a drug addict: how do you know when to stop?
What is “self-promotion”?
By definition, it is the action of promoting or publicizing oneself or one’s activities, especially in a forceful way. In reality, this means that most news aggregation sites will ban you for posting content from your own website or blog. Effectively, you are welcome to share someone else’s things, but you are not welcome to share your own.
Do you see anything terribly wrong with this concept?
Imagine a marketplace where you cannot sell the pears you have grown on your own but instead have to sell someone else’s, and, if you sell a large number of them, you could occasionally slip in a few of yours. Yes, many artists do that. They play music from other, famous bands, they play other people’s gigs, and, now and then, they slip in a few of their own songs.
But, is that really the way to go? Will that boost or suppress an artist’s creativity?
Furthermore, how did the term “sharing” come to mean “sharing other people’s things”?
Why is it acceptable to share what others did, but not permitted to share what you have done?
At least, that is the message news aggregation sites are promoting.
What bothers me about this is not that people do it. What bothers me is that they are “forced” to do it by design, and there is no other option. If you hand your marketing strategy over to search engines, you will experience the issues I mentioned earlier, ending up in a real-life Catch-22.
This is not some old-school, communist-style censorship, like in China, where everything is forcefully blocked. This is harder to notice, because it is there by design. The reason no one is complaining is that it was there from the beginning, and most people are unaware of the consequences. Nowadays, the largest news aggregating platforms, like reddit, are little more than spaces for promoting mainstream media. Ironically, everyone complains about how mainstream media are largely controlled by government agencies and corporations, yet they reach for the very media outlets that are those platforms’ largest advertisers.
Additionally, the concept where you are not permitted to share your own links and content is flawed and simply invites loopholes. In reality, if you wanted to get things promoted, you would probably open several accounts using different remote servers and IP addresses, and then occasionally pop your content in. A more elaborate approach would be to hand your content to individuals who do link-sharing as a full-time job.
The really wrong thing about our online systems is that they have become pyramidal, suppressing information flow by the sheer amount of information out there.
Corporations and big money control the media, dictating what the majority will know, think, and feel, and how they are going to behave. In this day and age, someone who wants to censor your content does not need to employ the police; they can just make your content hard to reach by pushing it far down the results.
Overwhelming users with the amount of data they must wade through before reaching something that actually corresponds to their inner context is a much easier and less obtrusive form of censorship. The content is still accessible, so no one can call it censorship, but, in reality, accessing it is so hard that the chance of it being noticed by the targeted group is very low.
It is not that our freedom was taken from us forcefully, or that we gave it away. We were, more or less, cooked slowly, like the proverbial boiling frog. We were seduced by the easiness of one approach over another. Just as it is easier to watch TV than to read books, it is easier to let someone pour lots of nonsensical information into our brains than to look for it on our own. It is easy to get infected by viral content just by letting go — allowing the current to take us wherever it goes — but is that really what we want our future to look like?
The real question is why all of that is necessary, and whether it is possible to avoid it altogether.
The marketplace should be a marketplace, and everyone should be allowed to be there, with equal opportunity. People/customers should be those who decide what is good and bad for them. As usually happens in an open market, every product will find its customers.