The following article by Brad Deflin was published as "The Filter Bubble Is Controlling Your Thoughts" by Ted Bauman, privacy guru at Banyan Hill, in the January 2019 edition of the Bauman Letter.
Google's Data Center in Delfzijl, on the North Sea in the Netherlands.
"Brad Deflin is CEO of Total Digital Security, a company that provides industrial-strength data security and privacy solutions to individuals and small businesses. One of the things I like most about Brad is that his entrepreneurial activities are guided by a clear understanding of the need to address threats many of us don't see until they are pointed out. In this article, he adds another reason to like his work: the same things that can protect our privacy can also help protect our rapidly degenerating democracy." By Ted Bauman in the January 2019 edition of the Bauman Letter, published by Banyan Hill.
Here's a term to know: filter bubble.
It's what happens when Big Tech's algorithms twist our reality to suit their own agenda. Those algorithms determine the results we see from search engines, news feeds and our social media experiences. Sophisticated software programs fed with our personal information enable Big Tech to know who we are, what we do, and what we like and don't like.
With that, they know in which direction to nudge us to do what they want, which isn't always what we might choose for ourselves. Filter bubbles prevent us from seeing information that doesn't fit the manipulative agenda of the algorithm. We become intellectually isolated.
Here's Techopedia's definition of a filter bubble:
"A filter bubble is the intellectual isolation that can occur when websites make use of algorithms to selectively assume the information a user would want to see, and then give information to the user according to this assumption."
Here's an example of the impact of filter bubbles ... from an actual case from the field.
A Bearish Landslide
An investor sold everything she had in the stock market at its peak in January of this year. It amounted to several million dollars. She used her Chrome browser to place the trades online and managed the transactions with her Gmail account.
Weeks later, as the market corrected, she sent emails to close friends and family, sharing her satisfaction with being "100% in cash" while everyone else suffered.
After that, her Gmail inbox, her Google search results and her website experiences were dominated by bearish investment strategies, precious metals dealers and other sales pitches, all slanted toward those looking for a place for their cash other than the stock market.
All because her momentary choice to adopt a cash position had been noted, coded and converted into an algorithm by the services she uses on the internet.
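To make the mechanism concrete, here is a deliberately simplified sketch, in Python, of how this kind of personalization works in principle. It is not any real company's code; the function names, topics and scoring rule are invented for illustration. The idea is just that engagement is tallied into a profile, and the profile then decides what surfaces first:

```python
# Toy illustration (hypothetical, not any vendor's actual system):
# a naive personalization loop that produces a filter bubble.
from collections import Counter

def update_profile(profile, clicked_topics):
    """Record which topics the user engaged with."""
    profile.update(clicked_topics)
    return profile

def rank_items(profile, items):
    """Score each item by how often its topic was engaged with before."""
    return sorted(items, key=lambda item: profile.get(item["topic"], 0),
                  reverse=True)

profile = Counter()
# The investor's emails and searches signal bearish, cash-oriented interest.
update_profile(profile, ["bearish", "bearish", "gold"])

feed = [
    {"title": "Bull market ahead?", "topic": "bullish"},
    {"title": "Why cash is king",   "topic": "bearish"},
    {"title": "Gold outlook",       "topic": "gold"},
]

ranked = rank_items(profile, feed)
# Bearish and gold stories now outrank the bullish one,
# so the contrary view sinks out of sight.
```

The bubble here is simply the feedback loop: whatever you clicked yesterday scores highest today, so you click it again, and the gap widens on every pass.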
The same thing happens to an even greater degree when it comes to political opinions. Express one, and expect to see nothing but more of the same from then on unless you go looking for alternatives.
Trapped and Isolated
The effect of filter bubbles is to push each of us in a particular direction. They nudge, incline, influence and even enrage us to go a little further to the side we're already on.
But what if the "side" we appear to be on is only momentary?
It doesn't matter. Filter bubbles deprive us of information that might lead us to change our views.
We still don't know how much filter bubbles are changing our democracy. But from recent events, we can see that things are getting worse fast.
Without actively seeking out information, developing personal awareness and making adjustments, we are at best pawns ... and at worst, slaves to the system.
Apple CEO Tim Cook spoke in Brussels recently at the International Conference of Data Protection and Privacy Commissioners.
He said this on the matter:
"Our own information - from the everyday to the deeply personal - is being weaponized against us with military efficiency. These scraps of data, each one harmless enough on its own, are carefully assembled, synthesized, traded and sold ... and let companies know you better than you may know yourself. Your profile is a bunch of algorithms that serve up increasingly extreme content, pounding our harmless preferences into harm. We shouldn't sugarcoat the consequences. This is surveillance."
Some of the Apple chief's most vivid comments centered on the growing threat from Big Tech's "data-industrial complex." It's a throwback to President Eisenhower's "military-industrial complex." An "industrial complex" is an alliance of industry players with enough power to influence public policy. In Eisenhower's time, it was an alliance of the U.S. military and defense contractors. They influenced U.S. policy to emphasize military power and its use to intervene in world affairs. Eisenhower correctly saw that this ran counter to the United States' long, peaceful history. We only engaged in foreign involvements in extreme circumstances, like the World Wars. Since then, the U.S. has become ever more militarized and has used its armed forces hundreds of times in foreign countries.
Beware the Data-Industrial Complex
In our time, the data-industrial complex is influencing the way we think with its filter bubbles. In its pursuit of profit, the big data companies are transforming the way we behave as a society. The challenge facing all of us is how to remain individual, autonomous and free in the face of this onslaught. Recognizing the existence of the filter bubble - and adopting the tools to combat it - is the first step.
If you would like to learn more about our cybersecurity products and services, please contact us.