The Web Design Group

... Making the Web accessible to all.

> Google’s "search engine manipulation effect" (SEME) is one of the most powerful forms of influence ever discovered in the behavioral sciences
Mike777
post May 7 2022, 06:36 PM
Post #1


Member
***

Group: Members
Posts: 39
Joined: 20-March 22
Member No.: 28,287



Source: https://articles.mercola.com/sites/articles...;rid=1482425811

Robert Epstein Warns Against Big Tech Manipulation
Analysis by Dr. Joseph Mercola
May 06, 2022

Free speech is fleeting, this content will disappear in: 5 Hours: 41 Minutes

Story at-a-glance

Robert Epstein, Ph.D. warns about Google’s ability to control public policy, swing elections and brainwash our children
The methods Google uses are ephemeral and leave no paper trail behind, making it very difficult to track and prove that they’re using humans as pawns, manipulating us in ways that we can’t counteract
Research by Epstein and colleagues has found that biased search results can change people’s opinions and voting preferences, shifting opinions in undecided voters by 20% to 80% in certain demographic groups
Google’s “autocomplete” feature on its search engine can turn a 50/50 split among undecided voters into nearly a 90/10 split — all without people realizing they’re being manipulated
The first step to breaking free from Google’s dictatorship is recognizing that the manipulation is occurring; the next involves consciously opting out of it as much as possible by protecting your privacy online

Google has the power to manipulate what you see online, targeting you with certain advertisements and burying search results they’d rather you not see. But can they go so far as to control the outcome of political elections? Absolutely, according to Robert Epstein, Ph.D., a senior research psychologist at the American Institute for Behavioral Research and Technology (AIBRT).

Epstein, a Harvard-trained psychologist who founded the Cambridge Center for Behavioral Studies, likens Google to a dictator with unprecedented power because it relies on techniques of manipulation that have never existed before in human history. The free services they provide really aren’t free, he warns. “You pay for them with your freedom.”1
Google Uses Ephemeral Manipulation Tools

In the video above, Epstein speaks with Jan Jekielek, senior editor of The Epoch Times, about Google’s ability to control public policy, swing elections and brainwash our children. Google has the power “to censor content, to track our every move, to tear societies apart, to alter the human mind, and even to reengineer humanity,” Epstein writes in his report, “Google’s Triple Threat,”2 which he details in his interview with Jekielek.

The methods Google uses are ephemeral and leave no paper trail, making it very difficult to track and prove that they’re using humans as pawns, manipulating us in ways that we can’t counteract. Ephemeral experiences occur briefly, then disappear, and include things like a list of suggested videos on YouTube, search suggestions and topics in a newsfeed.

“They affect us, they disappear, they’re stored nowhere and they’re gone,” Epstein says. “It’s the ideal form of manipulation. People have no idea they’re being manipulated, number one, and number two, authorities can’t go back in time to see what people were being shown, in other words, how they were being manipulated.”3

Epstein and his team, however, have found ways to track Google’s invisible, almost subliminal, tools, including the search engine manipulation effect (SEME). According to Epstein:4

“SEME is one of the most powerful forms of influence ever discovered in the behavioral sciences … It leaves people thinking they have made up their own minds, which is very much an illusion. It also leaves no paper trail for authorities to trace. Worse still, the very few people who can detect bias in search results shift even farther in the direction of the bias, so merely being able to see the bias doesn’t protect you from it.”

Research by Epstein and colleagues has found that biased search results can change people’s opinions and voting preferences, shifting opinions in undecided voters by 20% to 80% in certain demographic groups.5 Internal emails leaked from Google talk about “ephemeral experience,” and the company makes a point to engineer ephemeral experiences intended to alter the way people think.

SEME, however, is just one of about a dozen subliminal tools that Epstein’s team has discovered. Others include the “search suggestion effect,” the “opinion matching effect” and the “YouTube manipulation effect.”6
Google Shifted Millions of Votes in 2020

As Epstein and his team began to preserve politically related ephemeral experiences, extreme political bias was uncovered on Google and YouTube, which is owned by Google’s parent company Alphabet.

In the days leading up to the 2020 Presidential election and 2021 Senate runoff elections in Georgia, for instance, they preserved 1.5 million ephemeral experiences and more than 3 million web pages, which were sufficient to shift “at least 6 million votes in the presidential election without people’s knowledge.”7

This isn’t an isolated incident. In 2016, Google’s search algorithm generated biased search results that influenced undecided voters, shifting between 2.6 million and 10.2 million votes to Hillary Clinton.

Epstein makes a point to state that he leans left politically, but despite Google’s bias working to support the candidates he supported, he can’t applaud it, “because rigorous research I have been conducting since 2013 has shown me how dangerous these companies are – Google-and-the-Gang, I call them.”8

Even displaying a “Go Vote” reminder on Google on election day in 2018, Epstein found, gave one political party an extra 800,000 to 4.6 million votes compared to what the other party got. What’s more, Epstein says those numbers are “quite conservative.”9 “In other words,” Epstein explained, “Google’s ‘Go Vote’ prompt was not a public service; it was a vote manipulation. This type of vote manipulation is an example of what I call the ‘Differential Demographics Effect.’”10

Epstein also had a monitoring system in place in 2018, which preserved more than 47,000 election-related searches on Google, Bing and Yahoo, along with nearly 400,000 web pages that the search results linked to. The political bias that was uncovered in the results may have shifted 78.2 million votes to one political party.11

Even the “autocomplete” feature that occurs when you start to type in Google’s search engine is a powerful manipulation tool. “A growing body of evidence suggests that Google is manipulating people’s thinking and behavior from the very first character people type into the search box,” Epstein writes.12 Just from this feature alone, Epstein’s research found Google can turn a 50/50 split among undecided voters into nearly a 90/10 split — all without people realizing they’re being manipulated.
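Epstein’s broader point is that suggestions like these vanish without a trace unless someone preserves them. As a rough sketch of what preserving such an "ephemeral experience" might look like: Google exposes a public, undocumented suggest endpoint that returns autocomplete suggestions as JSON. The endpoint and response shape below are assumptions based on that service, and the sample payload is invented for illustration.

```python
# Sketch of preserving "ephemeral" autocomplete suggestions, in the spirit of
# Epstein's monitoring projects. The suggest endpoint and its response shape
# are assumptions based on Google's public (undocumented) suggest service.
import json
import time
from urllib.parse import urlencode

SUGGEST_URL = "https://suggestqueries.google.com/complete/search"

def build_request_url(query: str) -> str:
    """Build the suggest request URL (client=firefox is assumed to return plain JSON)."""
    return SUGGEST_URL + "?" + urlencode({"client": "firefox", "q": query})

def parse_suggestions(payload: str) -> list[str]:
    """Extract the suggestion list from a payload shaped like [query, [suggestions...]]."""
    data = json.loads(payload)
    return list(data[1])

def snapshot(query: str, payload: str) -> dict:
    """Timestamped record, since suggestions disappear and are stored nowhere."""
    return {"ts": time.time(), "query": query, "suggestions": parse_suggestions(payload)}

# Example with a canned payload (a live call would fetch build_request_url(query)):
sample = '["climate", ["climate change", "climate crisis", "climate hoax"]]'
record = snapshot("climate", sample)
print(record["suggestions"])  # ['climate change', 'climate crisis', 'climate hoax']
```

Archiving timestamped snapshots like this is the only way to audit suggestions after the fact, which is exactly the gap Epstein says authorities face.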

Further, because Google’s persuasive technologies are so powerful, and many elections worldwide are very close, Epstein’s data suggest Google has likely been determining the outcomes of up to 25% of national elections worldwide since at least 2015.13

Google Is a Surveillance Agency (Video)
Download Interview Transcript:
https://mercola.fileburst.com/PDF/ExpertInt...tein-Google.pdf

It’s important to understand that Google is a surveillance agency with significant yet hidden surveillance powers, and this is one of their primary threats to society. As noted by Epstein:14

"The search engine … Google Wallet, Google Docs, Google Drive, YouTube, these are surveillance platforms. In other words, from their perspective, the value these tools have is they give them more information about you. Surveillance is what they do."

While surveillance is Google's primary business, their revenue — which exceeds $130 billion a year — comes almost exclusively from advertising. All that personal information you've provided them through their various products is sold to advertisers looking for a specific target audience. Meanwhile, they also have an unprecedented censorship ability. By restricting or blocking access to websites, they decide what you can and cannot see.

The most crushing problem with this kind of internet censorship is that you don't know what you don't know. If a certain type of information is removed from search, and you don't know it should exist somewhere, you will never know and you won’t go looking for it. This is how hundreds of millions of people have been deprived of learning the power of natural healing from me and many other clinicians who have been censored by Google.

For example, Google has been investing in DNA repositories for quite a long time, and adding DNA information to our profiles. According to Epstein, Google has taken over the national DNA repository, but articles about that — which he has cited in his own writings — have all vanished. As it stands, Epstein is worried for the future if no one steps in to stop Google’s power:15

“As the father of five children, I am especially concerned about what humanity’s future will look like if Big Tech is allowed to continue unobstructed on its path toward world domination. In the 1950s, British economist Kenneth Boulding wrote, ‘A world of unseen dictatorship is conceivable, still using the forms of democratic government.’

I am writing this essay because I believe that such a world already exists, and that unless we act quickly and decisively, the power that the technology company executives have garnered will become so firmly entrenched that we will never be able to unseat them from their invisible thrones.”

Epstein’s Six Top Privacy Tips

The first step to breaking free from Google’s dictatorship is recognizing that the manipulation is occurring. The next involves consciously opting out of it as much as possible. It’s especially important that children are protected, as they are among the most vulnerable to the onslaught of manipulation, which will have serious consequences to future generations. Epstein noted:16

“We’re trying to figure out how the manipulation works. But most importantly, we’re trying to quantify it … Because I think that what’s really happening is that there is a cumulative effect, not just of political bias, but of values: literally a cumulative effect of being exposed to certain kinds of values, over and over and over again, on one tech platform after another.

And I think that the people who are most vulnerable to being impacted by that kind of process are children.”

Epstein has compiled six steps that can help protect your privacy online, noting that he hasn’t received a targeted ad on his computer or mobile phone since 2014 as a result. To take back some of your online privacy, for yourself as well as your children, he recommends:17

1. Get rid of Gmail. If you have a Gmail account, try a non-Google email service instead such as ProtonMail, an encrypted email service based in Switzerland.

2. Uninstall Google Chrome and use Brave browser instead, available for all computers and mobile devices. It blocks ads and protects your privacy.

3. Switch search engines. Try the Brave search engine instead, which you can access in the Brave browser and which will not compromise your privacy or surveil you.

4. Avoid Android. Google phones and phones that use Android track virtually everything you do and do not protect your privacy. It’s possible to de-Google your cellphone by getting an Android phone that doesn’t have a Google operating system, but you’ll need to find a skilled IT person who can reformat your cellphone’s hard drive.

5. Avoid Google Home devices. If you have Google Home smart speakers or the Google Assistant smartphone app, there’s a chance people are listening to your requests, and even may be listening when you wouldn’t expect.

6. Consider using a proxy or VPN (Virtual Private Network). This service creates a buffer between you and the internet, “fooling many of the surveillance companies into thinking you’re not really you.”

This post has been edited by Mike777: May 7 2022, 06:47 PM
Mike777
post May 7 2022, 06:43 PM
Post #2





Since they say the .pdf of the transcript mentioned above will be taken down in 5 hours I have posted it below.
I think it is important to website developers. If this post is too long I apologize to the moderator.

Exposing the Deception and Fraud Committed by Google:
A Special Interview With Dr. Robert Epstein
By: Dr. Joseph Mercola
JM: Dr. Joseph Mercola
RE: Dr. Robert Epstein
JM: Welcome everyone. This is Dr. Joseph Mercola helping you take control of your health. Today we are
honored to be joined by Dr. Robert Epstein, who received his PhD in psychology from Harvard in 1981,
was formerly editor in chief of Psychology Today, and now serves as a senior research psychologist
for the American Institute for Behavioral Research and Technology. From our perspective, he's really
exposing the fraud, deception and manipulation that Google has been doing for at least the last decade.
We're going to have a really engaging discussion. It's going to open your eyes.
JM: I really wasn't quite aware of Dr. Epstein's work prior to our interview, but I'm just really impressed
with the knowledge he's uncovered about the surreptitious behavior of Google. Welcome and thank you for
joining us today.
RE: Sure. It's my pleasure.
JM: We're about the same age and we've both written about the same number of books and we've both been
targeted and censored by Google. I'm wondering if, and I hope to have a really long engaging discussion
on this and I wanted just to give people an idea of where you're coming from, what your personal experience
was with Google and I believe it started from you writing how they were manipulating an election in 2012
and then the blow back started to happen. If my details are incorrect, certainly update them and tell us how
this whole journey started. It was about eight years ago, I believe.
RE: Sure. Well, in 2012 actually January 1st, it was New Year’s Day, I received some emails from Google
saying that my website contained malware and that they were somehow blocking access. This means I had
gotten onto one of Google's blacklists. My website did contain some malware. It was pretty easy to get rid
of, but it turns out it's hard to get off of a Google blacklist. That's a big problem. I started looking at Google
just a little bit differently. I wondered first of all, number one, why were they notifying me about this rather
than some government agency or some nonprofit organization? Why was a private company notifying me?
RE: In other words, who made Google sheriff of the internet? Second, I learned that they had no customer
service department, which seemed very, very strange so if you have a problem with Google, then you have
a problem because they don't help you solve the problem. I learned also that although you can get onto a
blacklist in a split second, it can take weeks to get off a blacklist. There have been businesses that have
gotten onto their blacklists and have gone out of business while they're trying to straighten out the problem.
RE: The thing that really caught my eye was because I've been a programmer my whole life, I couldn't
figure out how they were blocking access to my website, not just through their own products, for example,
not just through google.com the search engine or through Chrome, which is their browser, but through
Safari, which is an Apple product, through Firefox, which is a browser run by Mozilla, which is a nonprofit
organization. How was Google somehow blocking access through so many different means? The point is I
just started to get more curious about the company and later in 2012, I happened to be looking at a growing
literature, which was about the power of search rankings to impact sales.
RE: This was in the marketing field and it just was astonishing. In other words, if you could push yourself
up one more notch in their search results, that could make the difference between success or failure for your
company or it could mean a lot more income. It turns out that this initial research was saying that people
really trust those higher ranked search results. I simply asked a question. I wondered whether if people trust
those higher rank search results, if I could use search results to influence people's opinions, maybe even
their votes.
RE: That's where everything got started because early 2013 I started doing experiments, randomized
controlled experiments to see whether I could use search results that are biased somehow rather to shift
people's thinking, opinions, and even their votes. I was completely, completely, completely shocked by
what I found.
JM: Wow. It sounds like your experience with the malware notification from Google and them blocking
you from these other platforms really was independent of your investigation of them. They just
serendipitously or coincidentally targeted you for some reason or even mistakenly. Did you ever determine
how they were able to do that from a...?
RE: Yes, I did. In fact, in 2016 I published a lengthy investigative article for U.S. News & World Report,
which is called The New Censorship, and it was about nine of Google's blacklists. I explained exactly how
this works and exactly how they block access to websites and exactly how they're able to block access even
on other people's browsers such as Safari and Firefox. Yeah, I did eventually work all that out and it's genius
on their part. It's genius what they do, but it's also very, very frightening because basically, well, as you
know, they have the power and not just to remove you from search results or demote you in search results
or to block access to YouTube videos because YouTube is part of Google, of course.
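The interview never names the mechanism, but cross-browser blocking of the kind Epstein describes is consistent with Google's Safe Browsing service, which Chrome, Safari and Firefox all consult before loading pages. As a hedged sketch, this is roughly what a Safe Browsing v4 Lookup API request body looks like; the client ID and example URL are placeholders, not details from the interview.

```python
# Sketch of a Google Safe Browsing v4 Lookup request body: the kind of
# blacklist check that browsers other than Chrome also rely on, which would
# explain blocking "even on other people's browsers such as Safari and
# Firefox". The clientId and example URL are placeholders.
import json

LOOKUP_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def lookup_body(urls: list[str]) -> dict:
    """Request body asking whether any of `urls` appears on a Google threat list."""
    return {
        "client": {"clientId": "example-client", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }

body = lookup_body(["http://example.com/"])
print(json.dumps(body, indent=2))
```

A real lookup would POST this body, with an API key, to the endpoint above; the point here is only that a single centralized list can gate page loads across many browsers.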
RE: I'm studying YouTube videos right now. They can block access to websites, millions of websites. In
fact, they do block access to millions of websites every day. That's their biggest blacklist called their
Quarantine list. On January 31st, 2009 for 40 minutes, Google blocked access to the entire internet. I'm not
making that up, by the way. That was reported by The Guardian and Google did not deny it. And just recently,
for I think 11 minutes, they blocked access to every website in Japan. They have power, which it boggles
the mind and they actually use the power that they have to serve their purposes, which are sometimes
monetary, sometimes political.
JM: Yeah. I think if we can just summarize and then I'm going to let you go really deep in each one of these
is they have three powers. One is that they are a surveillance agency, second is that they censor as you just
alluded to, and then thirdly and perhaps most importantly they manipulate, which the implications of that
are just profound. You've got some shocking information to share and you, I think aptly called Google, the
GSA, the Google Surveillance and Ad agency. Why don't you take it from there? Because I think it's just a
brilliant analysis.
RE: Well, sure. We call Google Google, and it's a cool word. It's a misspelling of a word invented by a
mathematician to indicate a one with a hundred zeros after it: a very large number. So Google
means very large number. Again, they spelled it a little differently, but that's the way they chose to name
the company. Google itself, that name doesn't tell you what they actually do. Now, Kentucky Fried Chicken,
that tells you what the company does, right? Apple Computers tells you what the company does, but Google
doesn't. If you really wanted to have the full name of the company, it would have to be GSA because
surveillance is actually what they do.
RE: All of the tools that you use that are Google's tools you may think they serve this purpose or that
purpose, but from Google's perspective, they're just surveillance tools. That's all they are. They dress them
up in various ways. The search engine itself, we as users think of as a free search engine. But from their
perspective, it's just another surveillance tool. Google Wallet, Google Docs, Google Drive, YouTube, these
are surveillance platforms. In other words, from their perspective, the value that these tools have is it gives
them more information about you. Surveillance is what they do. Now where do they get their money from?
Almost all of their money comes from advertising.
RE: Because they take the information that they've obtained about you and your family and your children
and they monetize it more or less, you could say they sell it, they put the vendors in touch with you based
on what they know about you, and as a result, now make more than $130 billion a year doing so. Really the
company, if we wanted to name it like Kentucky Fried Chicken, it would be called GSA or Google
Surveillance and Advertising. Surveillance is what they do, advertising is how they make their money. Now
there are three big areas of threat, however, to us as individuals to us as citizens of a country, and you
mentioned them.
RE: The first of course is that there's surveillance, which I've just talked about a little bit and second is the
censorship because they decide what people are going to see or not see. That's not just people in the US,
that's two and a half billion people around the world. That number will soon be over four billion. They
decide what you're going to see or not see. I can tell you more about that. Then the third area is the one I
study in my experimental research, and that's manipulation.
RE: To me, that's the scariest area because as it turns out that Google is shaping the opinions and the
thinking and the beliefs and the attitudes and the purchases and the votes of billions of people around the
world without anyone knowing that they're doing so except a handful of people like me and perhaps even
more shocking without leaving a paper trail for authorities to trace. They're using new techniques of
manipulation that have never existed before in human history and they are for the most part, subliminal.
Now, that's a scary term that people used to talk about a lot.
RE: They used to talk about a movie theater in New Jersey that put in subtle signals into the movie
suggesting that people buy more Coke and more popcorn, and supposedly some people did. That kind
of subliminal manipulation turns out doesn't really work very well. The point is, what I've stumbled onto
are a whole set of techniques that Google has developed that work extremely well. I can give you some
numbers when we start to talk about specifics here, but these are invisible effects, so they're subliminal in
that sense, but they don't produce tiny shifts. They produce enormous shifts in people's thinking, very
rapidly.
RE: Literally, some of the techniques that I've discovered are among the largest behavioral effects ever
discovered in the behavioral sciences. In other words, they're very, very powerful new forms of influence.
The first one I discovered was in early 2013, but at this point I've discovered about a dozen of these and I'm
currently studying, I think, seven or eight of them simultaneously, trying to understand them and quantify them.
It's a long process and I'm always just first playing catch up because I'm sure that what I've discovered is
the tip of the iceberg. I'm sure the capabilities that Google has really go beyond what I've been able to
discover so far.
JM: Yeah, and the world deeply appreciates that or if not, currently they will because you've uncovered
some really amazing information and largely because you have the training to do it. You're really well
trained. You got your PhD in Harvard in psychology, so you've got the tools and the disciplines to analyze
this. Why don't you describe some of the experiments you performed and elaborate on the amazing power
that this tool has to manipulate and shift people's perceptions?
RE: Sure. Well, the first effect I discovered, it's called SEME, S-E-M-E, which stands for search engine
manipulation effect. I discovered this in early 2013. The basic experiment is pretty straightforward and I
should point out that all of these experiments meet the very highest standards of scientific integrity. They're
randomized, they're double-blind, they're counterbalanced, they're controlled, etc. I know how to do good
research and I've used the highest standards in conducting these experiments. In the simplest experiment,
basic one, people are randomly assigned to one of three groups. Now, who are the people, first of all?
RE: Well, in the first experiments that I ran, I had small groups in Southern California, but they were very
diverse. They weren't college sophomores. In other words, I had tried to match demographic characteristics
of the US voting population because this was a study about voting. I had people of all ages and all ethnicities
and balanced for gender, and so on and so forth. I really did try to make them representative of this voting
population, although they were all from the San Diego area initially in the early experiments, and they're
randomly assigned to one of three groups. One group, they're going to end up seeing search results that are
biased in favor of one political candidate.
RE: The second group, they're going to see the search results that are biased in favor of that candidate's
opponents or the other political candidate. The third group, they're going to be seeing search results that are
not biased, they're all mixed up. That's the control group. The way it works is we ask our participants a bunch
of questions. We give them some very, very basic information, a short paragraph about each candidate. We
then ask, who do you trust? Who do you like? Who would you vote for if you had to vote today? Questions
like that.
RE: Then we let them do a search and they're using our search engine, which is called Kadoodle. Kadoodle
looks and works exactly like Google does and we're using, it turns out real search results that we got from
Google and real webpages. This is very important. We're using real search results, real webpages. The only
difference between the three groups is the order in which they see the search results. Everything else is the
same. It's just the order that's changing in the three groups. After they search for up to 15 minutes and again,
they're clicking on search results, they're reading articles, they can move back and forth between the
different pages in search results.
RE: Again, it works exactly like Google. After up to 15 minutes of searching, we ask them those questions
again. In other words, we say, "Okay, who do you like? Who do you trust? Who would you vote for?"
Questions like that to see if there's any shift in their thinking and in their voting preferences. Now I had
predicted when we first did this that we would get a shift because I thought people do trust higher ranked
search results, and of course we had biased the search results so that if in that first group, if someone was
clicking on a high ranking search result, that would connect them to a webpage which made one candidate
look much better than the other.
RE: We'd rated all the webpages before we ran the experiment. I thought, well maybe people would trust
that because it's near the top of the list, and maybe that would produce a shift in their opinion, their thinking
and maybe their voting preference. I predicted we could get a shift in voting preferences of 2% to 3%; that
was my prediction. I was way off. We got in that first experiment, a shift of 48% which I thought must be
an error because that's crazy. One search and we get a shift of 48%? Let's put this another way, before
they've done the search, we look at how they answer the questions and they're basically split 50-50 and then
after the search we look again at the answers to those questions and then we get this enormous shift.
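The protocol described above (random assignment to three ranking conditions, preference questions before and after the search) can be sketched as a toy simulation. The bias parameter below is an arbitrary illustration of a large per-group shift, not Epstein's data.

```python
# Toy simulation of the SEME protocol Epstein describes: participants are
# randomly assigned to a pro-A, pro-B, or control ordering, and we compare
# post-search preferences across groups. The bias value is illustrative only.
import random

random.seed(42)

def run_group(n: int, bias: float) -> float:
    """Fraction preferring candidate A after the search.

    Participants start undecided (50/50); `bias` is the per-participant
    probability shift induced by the ranking order (negative pushes toward B).
    """
    prefers_a = 0
    for _ in range(n):
        p = 0.5 + bias  # pre-search 50/50, shifted by the ordering
        prefers_a += random.random() < p
    return prefers_a / n

n = 1000
pro_a = run_group(n, +0.24)   # results ordered to favor A
pro_b = run_group(n, -0.24)   # results ordered to favor B
control = run_group(n, 0.0)   # mixed ordering (control)

print(f"pro-A group: {pro_a:.0%} prefer A")
print(f"pro-B group: {pro_b:.0%} prefer A")
print(f"control:     {control:.0%} prefer A")
```

The control group stays near 50/50 while the biased groups diverge sharply, which is the signature of the effect; a real experiment would also need the double-blinding and counterbalancing Epstein mentions.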
JM: This was for both candidates, either way?
RE: Yeah, we can push them either way. Exactly. They're randomly assigned to the three groups. We're
pushing them any way we want. Exactly. That's what makes this so important. I should note here that in
almost all of our experiments, especially those early ones, we deliberately used undecided voters. That's the
key. You can't easily push the opinions or voting preferences of people who are partisan, who are strongly
committed to one party or another, but people who are undecided, those are the people who are very
vulnerable. In our experiments, we always find a way to use undecided voters.
RE: In these early experiments, the way we guaranteed that our voters were undecided was we used people
from the US as our participants, but the election that we chose was the 2010 election for the prime minister
of Australia. They're real candidates, a real election, real search results, real webpages, and of course,
because our participants were from the US they were not familiar with the candidates. In fact, that's why
before they do the search, we get this almost perfect 50-50 split regarding who they're going to vote for
because they don't know these candidates, they don't know them at all. The information that they're getting
from the search, that's presumably why we're going to get a shift.
RE: It can only be happening because of the information they're getting from the search, and of course if
there's a difference between the groups, which there was, then that's only happening because of the
difference in the way we order search results. That first experiment was astonishing in producing such a
big shift, but there was another thing that I noticed and that is that very few people seemed to realize that
they were seeing biased search results. This is where it starts to get a little scary because think about it, if I
can produce big shifts and people don't even realize they're seeing biased search results, then this is an invisible
manipulation.
RE: Second experiment, we got a 63% shift, third experiment, also another very large shift. What we were
doing as we move from experiment to experiment is we were trying to see whether we could fool more and
more people into thinking that these are just average search results that are unbiased. What we did was a
little bit of masking. Let's say it's, I don't know, Clinton and Trump, which it wasn't of course in those early
experiments, but let's say it's Clinton and Trump. What that means is that in group one you might see pro-Clinton, pro-Clinton, pro-Clinton, pro-Clinton, right? But we did some masking.
RE: Just to confuse people a little bit, we did pro-Clinton, Clinton, Clinton, Trump, Clinton, Clinton,
Clinton, Clinton, and so on in group one. Then again in the other group, the other bias group, we would
have Trump, Trump, Trump, Clinton, Trump, Trump, Trump. Just mix it up a little bit. We call this masking
and it turns out by just doing a little bit of masking, we could fool everybody. In other words, we could get
enormous shifts in opinions and voting preferences, enormous shifts with no one, no one able to detect the
bias in the search results we were showing them. This is where, again, it starts to get scary. Scarier still is
when we moved on to do a national study, our first national study of more than 2000 people in all 50 states.
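The masking scheme just described, a heavily biased ordering with an occasional opposing result mixed in, can be sketched as a small function. The every-fourth-slot rule is an illustrative assumption; the interview only says the results were mixed up a little.

```python
# Sketch of the "masking" Epstein describes: a ranking overwhelmingly biased
# toward one candidate, with an occasional opposing result inserted so the
# bias is harder to detect. Inserting at every 4th slot mirrors the
# "Clinton, Clinton, Clinton, Trump, ..." pattern given in the interview.
def masked_ranking(favored: str, other: str, length: int = 10, mask_every: int = 4) -> list[str]:
    """Mostly `favored` results, with `other` at every `mask_every`-th position."""
    return [other if (i + 1) % mask_every == 0 else favored for i in range(length)]

ranking = masked_ranking("pro-Clinton", "pro-Trump")
print(ranking)
# Positions 4 and 8 carry the opposing result; every other slot favors one side.
```

Because clicks concentrate at the top of the list, the two masked slots barely dilute the bias while making the ordering look mixed to participants.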
RE: Now we're out of San Diego at this point so we're done with that. Again, very large shift, almost no
one aware of the bias in the search results, but because at this point we have thousands of people, we can
now look at demographic effects, we can now look at subgroups. When you have that many people, you
are going to find a few people who can see the bias, and to me this is the scariest thing of all. The very
small number of people who can see the bias in the search results, they shift even farther in the direction of
the bias, even farther. In other words, merely being able to detect the bias doesn't protect you from its
effects.
JM: Fascinating. I have some questions about the design of the study, just for clarification. You had a list of webpages, I would assume somewhere around 10 or so, in the typical Google layout, and the only thing you shifted was the order of those, so they were the same pages, is that correct?
RE: Same search results, same pages. The only thing that's changing is the order of the search results. That's
right.
JM: Okay. I just wanted to confirm that before my next question, which is: what is your best guess as to the mechanism of what happened? I believe the design of the study allowed only 15 minutes for review, and since there's a load of information there, if it's a candidate's webpage they could spend the whole 15 minutes reading just one page. Do you think it was because they trusted the objectivity of the search and only clicked on the first few and didn't have an opportunity to review the rest? Did that cause a distortion?
RE: Well, we looked at that very carefully. That's a very important issue, and what we found is that the pattern of clicks in our experiments matches the pattern found previously in very large studies, not by our team, but by other people who have looked at the clicks in literally millions of searches. The pattern is basically this: 50% of all clicks go to the top two items, and 95% of all clicks go to the first page of search results. The point is, the pattern of clicks we got in our experiments matched perfectly the pattern found in those studies of millions of searches.
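The click distribution Epstein describes can be sketched numerically. The per-rank shares below are hypothetical, chosen only so that the cumulative totals match the two aggregate figures cited in the interview (about 50% of clicks on the top two results, about 95% on the first page):

```python
# Hypothetical per-rank click-through shares for the 10 first-page results.
# These exact values are illustrative, not from Epstein's data; they are
# chosen so the cumulative totals match the interview's figures:
# ~50% of clicks on the top two items, ~95% on the whole first page.
CTR_BY_RANK = [0.30, 0.20, 0.12, 0.09, 0.07, 0.06, 0.05, 0.03, 0.02, 0.01]

def share_of_clicks(top_n):
    """Fraction of all clicks captured by the top `top_n` search results."""
    return sum(CTR_BY_RANK[:top_n])

print(share_of_clicks(2))   # top two results capture about half of all clicks
print(share_of_clicks(10))  # the whole first page captures about 95%
```

The steep drop-off by rank is the mechanism behind SEME: whichever candidate dominates the top two positions gets roughly half of all attention, regardless of what sits further down the page.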
RE: It's that pattern of clicks that's key. In other words, you're right, people are just spending most of their
time clicking on and reading content that comes from high ranking search results. If those high ranking
search results favor one candidate, that's pretty much all they see and that impacts their opinions and their
voting preferences. Yeah, you're right. It just has to do with what people click on. Now we also, much later, did experiments trying to figure out why people click mainly on those high-ranking items and why they trust them so much. I won't bore you with the details because it would take too long, but the point is that it turns out this is a conditioning phenomenon.
RE: We are basically rats in a Skinner box. My doctorate from Harvard, by the way, was with B. F. Skinner. I was his last doctoral student there. I know, very ironic.
JM: [inaudible 00:26:40].
RE: Yeah, and basically we're all rats in a Skinner box, because this is the way it works. Almost all the searches that we conduct are for pretty simple things. For example, I recently flew to Boston and I don't use Google, no one should ever use Google, but using another search engine I typed in "Boston weather" and what comes up right at the top of the search is just all the numbers describing Boston weather, or a link to AccuWeather's Boston weather page. The point is, most of the searches we conduct are routine searches for simple things. What is the capital of Iowa? I don't even know, but whatever it is, it's going to turn up right at the top of the list.
RE: Over and over again we are conditioned to learn that what's at the top is better, what's at the top is
truer. As a result, when the day comes, when we type in something a little more vague, like "What's the
best restaurant in Las Vegas?" Or, "Where should I go on vacation?" Or, "Who should I vote for?" Or,
"Who's the best candidate?" You see what I'm saying? When the day comes, when you type in something
a little more open ended, that doesn't have a clear answer, the fact is we still trust what's at the top of the
list. That's why the opinions shift so dramatically because that trust has been conditioned every single day.
It never stops.
RE: That conditioning never stops. SEME, the search engine manipulation effect is a list effect and
scientists have been studying list effects for over a hundred years, but it's a list effect with a difference
because it's supported by a daily regimen of operant conditioning and that regimen never stops. The training
never stops.
JM: Reinforced continuously.
RE: Correct.
JM: Yeah. That is absolutely fascinating, but for some people the immediate response could be, "So what?" Well, tell us the implications of this, because they are beyond profound and frightening as to the impact this behavior shifting can have.
RE: Well, "So what?" is in fact a good response because you could say, all right, well, okay, so Google has
this power presumably, in other words, especially in an election, if they actually presented people with
search results that were biased in favor of one candidate or for that matter, day by day, if they were
presenting search results that were biased in favor of one dog food or one restaurant chain or whatever it
may be. I mean, think of all the tens of thousands of things we search for. If they were presenting us with search results that were systematically biased, wow, that would be a problem, right? But are they? See, that's a separate question. They have the power; do they use the power? That's the question.
RE: So in 2015 a couple of things happened. One is I published the findings from my early experiments, the first five experiments, in the Proceedings of the National Academy of Sciences. Now, that's a pretty prestigious
place to publish something. I mean it's hard to publish in there. And that particular paper, by the way, last
time I checked, had been downloaded or accessed from the website of the National Academy of Sciences
more than 250,000 times. Now I've never in my whole career seen anything like that. I mean, if you get a
few hundred downloads or a few thousand, that's good. But 250,000, that's a lot. So there's definitely an
interest here.
RE: So I published that in 2015, and I also got a phone call from the Attorney General of Mississippi, who recently ran for governor there and lost, by the way. He had sued Google and Google had sued him back, literally had sued him as an individual. So he was in this big battle with Google, he was up for re-election as attorney general, and he was very concerned. He was wondering whether Google could somehow mess with the votes in Mississippi. And I said, "Oh yeah, quite easily." He wanted to know how and I explained it to him and so on. And then he said, "But how would we know if they're doing it?"
RE: Okay. That just lit my head on fire, because from that moment on, I just became obsessed. And then he said that in law enforcement, what they would do is use a bot or sock puppet, as they're called, fake people, and collect the data coming into the fake people and analyze it. And I said to him, "General Hood," you address an attorney general as general, which is kind of funny, "General Hood, that won't work." "Why not?" he said. I said, "Because Google has a profile on all of us. The profiles are immensely large." We can talk about that if you want when we talk about surveillance some more. I mean, people will be shocked, just shocked, to know how big these profiles are.
RE: I said, "So the point is, if you set up a bot or a sock puppet, they know it's not a real person, and they personalize results, so they'll be sending unbiased search results to your fake people. You won't learn anything." He said, "Well, I don't know. Then what would we do? How could we find out what they're doing?" And so the point is, I started to obsess about that. Then in early 2016, I set up the first-ever monitoring system, which allowed me to look over the shoulders of people as they were conducting election-related searches on Google, Bing, and Yahoo, three search engines, in the months leading up to the 2016 presidential election. I had 95 field agents, as we call them, in 24 states.
RE: We kept their identities secret, which took a lot of work. This, by the way, is exactly what the Nielsen company does to generate ratings for television shows. They have several thousand families whose identities are secret. They equip the families with special boxes, which allow Nielsen to tabulate what programs they're watching. And Nielsen does that now, by the way, in 47 countries. So, inspired by the Nielsen model, we recruited our field agents and equipped them with custom software that we designed, which is passive software. In other words, no one can detect the fact that they have the software on their computers. But that software allowed us basically to look over their shoulders as they conducted election-related searches. We got our first trickle of data coming in May of 2016, and the closer we got to the election, the more field agents we had and the more data we were collecting.
RE: And we ended up preserving 13,207 election-related searches and nearly 100,000 webpages to which the search results linked. And after the election, now that we'd preserved this information, because remember, we know what search results these people were actually seeing, so we know where these links were occurring. So after the election we rated the webpages for bias, either pro-Clinton or pro-Trump. I personally supported Hillary Clinton in that election. Then we just did an analysis to see whether there was any bias in the search results people were seeing. The results we got were crystal clear, highly significant statistically. This is something I tried to explain to a reporter, and I've taught statistics at the doctoral level and have been using statistics in my work for almost 40 years, but I just couldn't quite get this reporter to understand what I was saying.
RE: But the point is, the results we got were significant at the 0.001 level. What that says is we can be pretty confident that the bias we were seeing was real, and that it didn't occur just because of some random forces. What we found was a pro-Clinton bias in all 10 search positions on the first page of Google search results, but not on Bing or Yahoo. That's very important. So there it was: a significant pro-Clinton bias on Google. Now, because of the experiments that at this point I had been doing since 2013, I was also able to calculate how many votes could have been shifted with that level of bias. It depends on a number of things, but the bare minimum would have been about 2.6 million votes shifted to Hillary Clinton. In other words, those are undecided people who are going online, getting information about anything that's election-related, anything, and being brought to webpages that are basically pro-Clinton.
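For readers unfamiliar with the phrase "significant at the 0.001 level": it means that if the search results had really been unbiased (each rated page equally likely to favor either candidate), a split as lopsided as the one observed would arise by chance less than one time in a thousand. The interview doesn't give the raw counts, so the sketch below uses hypothetical numbers purely to illustrate the kind of test involved:

```python
from math import comb

def two_sided_binomial_p(k, n, p=0.5):
    """Exact two-sided binomial test: probability of an outcome at least
    as extreme as observing k 'pro-candidate' pages out of n rated pages,
    if each page independently had probability p of favoring that candidate."""
    probs = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    observed = probs[k]
    # Sum the probabilities of all outcomes no more likely than the observed one.
    return sum(q for q in probs if q <= observed + 1e-12)

# Hypothetical example: 80 of 120 rated pages favor one candidate.
print(two_sided_binomial_p(80, 120) < 0.001)  # lopsided split: significant
print(two_sided_binomial_p(60, 120) < 0.001)  # perfectly even split: not significant
```

The counts (80 of 120) are invented for illustration; Epstein's actual analysis covered thousands of preserved searches across all ten result positions.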
RE: And that shifts opinions, and it shifts votes, that we know for sure. So basically I calculated that with that level of bias, over a period of months, somewhere between 2.6 and 10.4 million votes would have shifted among undecided voters, with no one knowing that they'd been influenced, number one. Number two, without leaving a paper trail for authorities to trace. The paper trail issue, that's a very interesting issue. Last year, in 2018, in one of the leaks of material from Google to the media, one of the email conversations within Google was about Trump's travel ban, and one of the employees says to other employees there, how can we use ephemeral experiences, okay? Ephemeral experiences. I'm actually writing an article now about ephemeral experiences, to change people's opinions about Trump's travel ban. In other words, to get people to be opposed to his travel ban.
RE: And that's the key to everything here. The point is, people within Google themselves know how powerful ephemeral experiences are, and search results are ephemeral experiences. What are ephemeral experiences? They're experiences that you have with your computer in which you're shown content that is fleeting. It's generated on the fly just for you. It impacts you, and it disappears, and it's gone forever. It's not stored anywhere. That's the key to all this. If you are influencing people using ephemeral experiences, as that Google employee said, then that's gold. Because you can shift opinions and thinking and attitudes and beliefs, and no one knows it's happening, number one. Number two, there's no paper trail. Authorities can't go back in time and recreate that experience. I mean, it's brilliant and it's frightening and it's diabolical.
JM: Absolutely. Great summary. I think it's important to emphasize, as you previously stated, that you were a Clinton supporter. So the bias you uncovered can't be attributed to any bias of your own: you were her supporter, yet you uncovered a bias in her favor, and reporting it was not beneficial to her in any way. In fact, when Trump, I believe, tweeted about your testimony to Congress, she responded adversely and denounced you.
RE: Yes, this was horrible. This actually set in motion what has so far been one of the most horrible periods of my life. But yes, I testified before Congress about the same research and other research, because that was just the beginning, and I did indicate that in 2016, according to the 13,000 searches that we preserved, yes, the bias in Google search could have shifted somewhere between 2.6 and 10.4 million votes to Hillary Clinton. Yes, I was a Hillary Clinton supporter. So I did report that. It's part of my congressional testimony. Now, a few weeks later, President Trump tweeted about my testimony. He didn't get the numbers quite right, he makes lots of mistakes, but he did no harm in his tweet, as far as I was concerned.
RE: But then Hillary Clinton responded to his tweet, saying that my work had been debunked and was based on data from 21 undecided voters. I don't know where either of those things came from, because my work has never been debunked. I'm not even sure what debunked means, but the point is, 21 undecided voters? This was a massive amount of data that we had preserved, and we spent months analyzing it. Then there was a deluge of mainstream newspaper articles, including in the New York Times, just dismissing me and my work, calling me a fraud and saying my work was garbage and so on. I've had a spotless reputation as a scholar and scientist for almost 40 years. And so this was brutal for me. This was horrible, it still is. It still is. I don't understand it. I can't understand it. I can't understand how you can just destroy someone's reputation.
JM: Well, don't take it personally. This is a classic strategy that they use to discredit anyone who opposes
their agenda. The tobacco industry did it. The wireless industry is doing it. The GMO industry and Monsanto did it. I mean, that is the absolute standard playbook strategy: discredit the people who are speaking out, the most reputable and the loudest. Why do they do it? Because it's effective, it works, and it requires very little investment on their end.
RE: Well, I'm a researcher. I'm not a politician. I'm a researcher. I report what I find; if I had found pro-Trump search results, I would have reported that. I reported what I found, and politically speaking, I thought it was great that they were helping Hillary Clinton to get votes. She actually won the popular vote in 2016 by over 2.8 million votes, and I had predicted that at a minimum they could shift 2.6 million. So what that means is, if you take Google out of the equation in 2016, the popular vote might have been very, very close. Because again, I know that they had the ability to shift at least 2.6 million votes. So the point is, I report what I find. And yet I'm excoriated, I'm trashed, by mainstream news and by Hillary Clinton herself.
RE: I can't understand it. I've been a supporter of the Clintons for decades. I have a signed letter from Bill
Clinton from when he was president, up on the wall over my desk. I still haven't taken it down, although
maybe I should.
JM: Well, I guess nothing personal there. They're just opposing anyone that's getting in their way, and you certainly were an obstacle. I still don't think the average viewer or listener is aware of the full implications of what you just said. So let's extend it to a worldwide view. You've made some pretty amazing projections of what Google has done and is doing to this day, how they're influencing elections around the entire planet.
RE: I was just thinking that, so I'm glad you're shifting things in that direction, because I find a lot of people, when they look at my work, just keep thinking about the U.S. and U.S. elections. And the problem is much bigger than that. As I explained when I testified before Congress, the reason why I'm speaking out about these issues is because, first of all, I believe in democracy. I believe in free and fair elections. And I think it's important that we preserve democracy and preserve free and fair elections.
RE: And I think that's more important than my support for any particular candidate or party. To me, it's pretty straightforward. But the problem is much bigger than elections or democracy or the United States. Because I calculated back in 2015 that even at that point in time, Google's search engine... because more than 90% of search worldwide is conducted on Google.
JM: That's an astounding number. It is an astounding number.
RE: Yes. It's actually 92% currently, and it has been for about a decade now. But because basically for
most people in the world, Google is the gateway to knowledge. It's the gateway to all knowledge of any
source.
JM: That has to be refined: it's also the gateway to the suppression of knowledge.
RE: Well, that's kind of a separate issue, which I'm happy to talk about.
JM: We'll get to that later, but finish...
RE: Yeah. Because people around the world rely on Google to find information, I had calculated as of
2015, Google was determining the outcomes of upwards of 25% of the national elections in the world. Now
how can that be? Well, it's because a lot of elections are very close. And that's the key to this, to
understanding this. So in other words, we actually looked at the margins, the win margins in elections
around the world, national elections, which tend to be very close. Local elections tend not to be so close,
but national elections tend to be very close. The 2010 Australian election, for example, had a win margin of something like 0.2%, less than 1%. So if an election is very, very close, and people are just searching for various topics related to the election over a period of weeks or months...
RE: And if Google is biased, no, if the results people are getting on Google are biased toward one candidate, well, that shifts a lot of votes among undecided people. And it's very, very simple for them to flip an election, or, as an article I published in Politico put it, to "rig" an election. I didn't put that title on my article; the folks at Politico did. But it's very, very simple for Google to do that. Now, they can do it deliberately, which is kind of scary. In other words, some top executives at Google could decide who they want to win an election in South Africa or the UK or anywhere. Or it could be just a rogue employee at Google who does it, and you're thinking, that's impossible, how could a single employee at Google rig an election? Actually, it's incredibly simple.
RE: Your viewers may have heard of the famous Google Street View scandal. A few years ago, a professor just like me figured out that the Google Street View vehicles, which drive up and down our streets photographing our homes and our businesses, and which at that point had been driving around the streets of 30 countries for more than four years, weren't just taking photographs of our houses. They were also vacuuming up all of our Wi-Fi data, in more than 30 countries, for four years.
RE: Literally just capturing our Wi-Fi data: capturing our passwords, capturing whatever activity we're engaged in, capturing which websites we're visiting, which allows them to figure out sexual orientation, political leanings, et cetera, et cetera. And so Google got caught, again just because someone like me, a researcher, figured it out, and Google got investigated and sued and this and that. Google blamed the entire Street View project on one employee, Marius Milner. One employee, they blamed it on just one employee. Because Google employees themselves have a lot of power. They can change a parameter here and there, and they have a lot of power. And Marius Milner, again, got blamed for this entire project.
RE: So naturally then they fired him. No, no, they didn't fire him. Marius Milner is still at Google and he's
a hero at the company. And if you look on his LinkedIn page, you'll see he identifies himself as working
for Google. And he says his profession is hacker, so they don't fire people like that at Google. So an
individual employee, a rogue employee could flip an election.
RE: And then there's the third possibility. The third possibility is, they've got the algorithm running. They don't care about, say, Cameroon, which is in Africa. They don't pay any attention, they don't care, or Nigeria, say. They don't care. And so the algorithm favors one candidate over another, and you're thinking, wait a minute, why would the algorithm do that? Because the algorithm is built that way. It's built always to favor one dog food over another, one, I don't know, flavor of ice cream over another, and one candidate over another. It has no equal-time rule built into it. It's always going to favor one candidate over another, and that can flip a close election. And that's the scariest possibility, because now you've got an algorithm, a computer program, which is an idiot. It's an idiot deciding who rules us. It's crazy.
JM: The implications are quite profound. And what you described actually is illegal behavior, but because these are ephemeral experiences and they leave no paper trail, they are very difficult to prove. But if you set up some pre-emptive capturing strategies, you can document this. Because there are rules, regulations and penalties for contributions to campaigns, which have to be acknowledged, and if you exceed the limits, you are going to be fined and maybe even put in jail. So why don't you expand on that, because I think the implications are really quite profound.
RE: Yes. Well, I think we should be clear here. Not only are there no laws or regulations that restrict the way Google ranks its search results, the courts have said over and over again, in case after case, that Google can order its search results any old way it pleases. They're just exercising, I'm not kidding you, their right to free speech. That's Google's First Amendment right. And by the way, the courts have also said that they can eliminate you from their search results or demote you, and I know you have some experience with these types of things.
JM: Sure, of course.
RE: Yeah, I know. It's terrible. The courts have said they can do it because they're just exercising their right to free speech. In other words, they can take your business and destroy it. A rogue employee can do it, an executive can do it, the algorithm itself could do it, and the courts have said, no problem. Now, where the legal issue arises is in the context of an election. As you pointed out, there are laws that say that if you donate to a campaign above a certain amount, you have to declare that. And of course there are also limits on how much you can donate, unless you're donating to these new super PACs.
RE: So you could say that if Google's search results or search suggestions or YouTube videos and on and on are favoring one candidate, that could be considered an in-kind donation, which they're not declaring. One could put a value on it, and so one could go after them in that way. Now, has anyone done that yet? No. One reason is because the monitoring system I set up in 2016 was a unique project. I set up a similar system in 2018 that captured even more data; I'd be happy to talk about it. That was another unique project. And to my knowledge, no one else has done this anywhere in the world. In other words, no one is setting up systems to capture and preserve ephemeral content, and you have to do this right.
RE: You have to do it on a large scale. You have to be looking over the shoulders, you must look over the
shoulders of real people. In other words, you can't do it with bots. It's very, very, very expensive and labor
intensive to run a project like this. In my opinion, though, these monitoring systems must be set up, and not
just in the U.S. They must be set up around the world. Because it's the only protection we could possibly
have. I mean as a species, it's the only way we can protect ourselves from new types of online technologies
that can be used to influence us.
RE: Now, one of the leaks from Google was an eight-minute video, which you can find online. The video is called The Selfish Ledger, and this is an internal video of Google prepared by one of their advanced, kind of rocket-science divisions, their X division, in which they explain that they have the ability to re-engineer humankind, and they specifically mention in this video, "according to company values." Now, I made a transcript of this, and I'd be happy to share it with your viewers; I can get you a link to it if people want to really dig into this. But this is serious stuff. It's not just me discovering some manipulative techniques and trying to quantify them. I'm saying there's an awareness in the company of the power of ephemeral experiences, and an awareness of the power the company has to reshape humankind. So this is a kind of threat, in my opinion, that humanity has never faced. No dictator, no Mussolini, no Hitler, no dictator anywhere has ever had even a tiny fraction of the power that this company has.
JM: Yeah, it really is brilliant, if you look at it objectively, I mean...
RE: Yes. I agree with that.
JM: How could you design a more sophisticated and effective strategy to control the population? It's virtually hidden. Virtually no one understands or knows this.
RE: Well in 20... Let's see now if I can figure out what year it is. 2016 I made my second discovery. Since
then, they've been coming faster and faster, but 2016 I discovered another type of manipulation that Google
is capable of and also got incredible numbers. When you start to type a search term, so you're typing letters
into a search box, or a search bar, Google flashes suggestions at you. So this is sometimes called their autocomplete tool. I'm just going to call them search suggestions. They're flashing search suggestions at you.
When they first developed this, they would flash 10 at you. I think Yahoo still does 10,
and Bing maybe does eight, or vice versa. So, they still show you those long lists.
RE: Initially the idea was they were going to save you some time. That's the way they presented this new
feature of the search suggestions. They were going to save you some time. They're going to anticipate based
on your history, or based on what other people are searching for, they are going to anticipate what it is
you're looking for so you don't have to type the whole thing. Just click on one of the suggestions. I think it
really started out that way but then it changed into something else. It changed into a tool for manipulation.
RE: In June of 2016, I believe, a small news organization posted a video on YouTube, which is kind of funny because YouTube is part of Google. It was a seven-minute video in which they had a very cool guy, just an ultra-cool guy, explaining that their news outfit had discovered that it was virtually impossible to get negative search suggestions related to Hillary Clinton, but easy to get them for other people, including Donald Trump. They were very concerned about this because, I don't know, maybe that could influence people somehow. So I tried this myself, and I have a wonderful image that I preserved showing this. I typed in "Hillary Clinton is". I did it on Bing and on Yahoo, and I got those long lists, eight and 10 items, saying, "Hillary Clinton is the devil. Hillary Clinton is sick. Hillary Clinton is..."
RE: All negative things, and all things that people were actually searching for. How do we know that? Because we checked on Google Trends. Google Trends shows you what people are actually searching for. Sure enough, people were actually searching for all these negative things related to Hillary Clinton. Those were the most popular search terms. So we tried it on Google: "Hillary Clinton is", and we got "Hillary Clinton is winning, Hillary Clinton is awesome." And that's it. Now, you check those phrases on Google Trends and you find no one searching for them. Who's typing in "Hillary Clinton is awesome"? Nobody. No one. But
that's what they're showing you in their search suggestions. So that got my research gears running again. I started doing experiments because I said, "Wait a minute, why would they do this? What is the
point?" Here's what I found in a series of experiments. Just by manipulating search suggestions I could turn
a 50/50 split among undecided voters into a 90/10 split.
JM: Wow.
RE: With no one having the slightest idea that they've been manipulated. So search suggestions, this
became a second effect, the SSE or the Search Suggestion Effect. It's an incredibly powerful tool which
you can't... It's not billboards. Okay? In other words, you put up a billboard supporting your run for
governor. Right? I put up my own billboard across the street. Right? I can counteract your billboard. Same
with your television commercial, same with your radio ad. Whatever it is you're doing I can counteract it,
I can fight it. But Google search suggestions, how would you fight those? How would you counteract them?
RE: Of course, they're ephemeral. Wow, ephemeral again. They're ephemeral. So there's no record kept of
them, and you can't go back in time and reconstruct what they were showing people. So now we have a
second effect. I mean I could go on and on with more effects. But I'm just trying to give you a glimpse here
of the way the process has worked for me, the way I've stumbled onto things, and what I have found. I
report what I find. I don't care which candidate or which political party...
JM: You're neutral.
RE: Yeah, well, I'm [crosstalk 01:04:28] not neutral. I'm not neutral as a human being.
JM: You are neutral with respect to the outcomes of your research.
RE: Correct. Yes.
JM: Yeah, that's what I meant to say. Obviously, we all have biases. So I think you've really highlighted and illustrated an enormous amount of information that almost everyone watching this, like most of the population, was absolutely clueless about. So I'd like to go on to eventually... Let me come back to some of the... Well, let's go into the censorship and the blacklisting. Then I want to really focus on what we can effectively do, because this can get very discouraging and hopeless. I mean, you've got 92% of the world using Google; you might as well just give up now if they're going to control everything!
JM: But I want to present some hope here, because there are very clear, specific strategies we can use that can actually make a difference. But let's talk a little bit about censorship and blacklisting. We've been focusing on the manipulation, and I want to talk about those a little bit, because those are other tools they're using that are going to encourage more of us to take up the recommendations you're going to go over shortly.
RE: Sure. Well, censorship, in my opinion, is the second big threat that companies like Google, to a lesser
extent Facebook, and to a much lesser extent Twitter, pose to humankind.
There could be a company after Google, which could do the same thing. So the point is that Google is
determining what billions of people right now, two and a half billion people around the world, see or don't
see. That's the problem. Because in other words, let's say it's a health issue, whether certain kinds of vitamins
are helpful or not, or certain kinds of foods are helpful or not. If they eliminate a certain perspective
from search results, or a certain website, or a certain set of websites, how would anyone know that?
RE: See, that's the problem with censorship: you don't know what you don't know. Is there evidence that
Google at times just demotes or removes material from its search results? Absolutely. I mean this has been
known for a long time, and again, Google now and then gets sued by someone who says, "You've demoted
me or you've removed me." So a lot of the stuff is kind of out in the open. I mean at one point Google
demoted JCPenney, for example. They said JCPenney was violating Google's policies by using what are
called SEO techniques. SEO, search engine optimization, is a big industry. Every company uses it to try to
boost its rankings in Google search results. But somehow or other they focused on JCPenney and demoted them.
RE: I mean this was a big demotion because they didn't just knock them off the first page. They knocked
them into the 50th position, which no one ever sees. So there are many, many cases now in which we know
Google does this quite deliberately. There was a company called Eventures, based in Florida, where Google
decided they didn't like the quality of their web pages. There's so much subjectivity going into this. It's
horrible. They literally blocked access to, I think, almost all of the URLs that this Eventures company
had, which I'm thinking was about a hundred. No explanation. They never explained.
RE: See, that's the other problem. That's why Tulsi Gabbard, one of the presidential candidates now, has
sued Google because after the first presidential debate, they shut down her ability to place ads. She was the
most searched for candidate at the time, and she needed to place those ads to be running on Google in order
to raise funds. They literally blocked access. They shut her down for, I think about six hours, immediately
following the debate. Well, again, think of it from the perspective of the public, not the candidate. The
public just doesn't even see, they don't see the ads. You're not seeing the search results, or the ads, or the
search suggestions, whatever it may be. You don't know what you don't know.
RE: So you don't even know that you're being manipulated. You don't understand. So it turns out that
Google censors all kinds of content in all kinds of ways. As I mentioned earlier, I wrote a big article about
this for U.S. News and World Report called "The New Censorship." I talked about Google's blacklists. I
focused on nine of them; later, in a little addendum, we added a 10th one. But the point is, I said, "These
blacklists exist in the company." Now, had I ever seen one? No. But I'm a programmer. I know how this
works. I know Google suppresses content. So I wrote about the blacklists, I wrote about... and I explained
exactly how they worked. Because every aspect of Google has a blacklist.
RE: So there are YouTube blacklists, for example. Whatever Google does, there are blacklists, always. Now,
just a few months ago, after I testified before Congress, a new whistleblower turned up. His name
is Zachary Vorhies, and he left Google after serving there eight and a half years as a senior software engineer.
Unlike the other whistleblowers, he walked out of there with more than 950 pages of documents and a video.
Among those documents were two blacklists. They were actually called blacklists within the company. He
actually walked out with two of their blacklists. Meanwhile, I should point out that when I testified before
Congress, just before I did, Google's representative testified in the same hearing. He was asked point blank,
I think it was by Senator Josh Hawley, "Does Google have blacklists?" He said under oath, "No, we have no
blacklists."
JM: Well, I think testifying under oath before Congress isn't illegal, right?
RE: Well, you mean lying though? People do lie.
JM: That's what I meant to say, lie.
RE: Yeah. Yeah. [crosstalk 01:11:45] No, people do lie. A long time ago, Congress used to exercise
its authority to punish people for lying before Congress. Of course, Michael Cohen, Trump's attorney, is
in prison now, in part for lying before Congress. So it can happen, but generally speaking, now you can lie
to Congress and no one cares, and Congress doesn't do anything. The last time Congress actually used
its muscle, literally its police (it has a kind of police), to arrest someone for lying, I think that was 1930, if
I'm not mistaken. So they don't really do that anymore. Yeah, you can just lie all you want.
JM: I'm reminded of the presidents of the four or five U.S. tobacco companies in the late 90s.
RE: Yes, yes, yes.
JM: Every single one of them said smoking was not addictive.
RE: Right.
JM: ... And it doesn't cause cancer. In the late 90s! [crosstalk 01:12:50] There was no repercussion for
those testimonies.
RE: Right.
JM: All right, so this is discouraging.
RE: Can I just talk a little bit more about censorship? Because...
JM: Sure, absolutely.
RE: Because right now, what's happening in the U.S. is there's a lot of hullabaloos about Google's
censorship. It's coming mainly from conservatives, conservative candidates, conservative organizations,
and there is evidence that Google is aggressively censoring conservative content. So there is some evidence
to support this. I think conservatives have reason to be concerned. However, they don't just censor
conservative content. So this issue is much bigger than that, because I've received communications from
people in socialist organizations, progressive organizations, whose content has been censored by Google.
You could be anybody, and be censored by Google. Remember that even individual employees at Google
sometimes have the power to make changes like that, to demote or remove.
RE: Another senior software engineer at Google, his name is Shumeet Baluja, who's been at Google almost
since the very beginning, he published a novel that no one's ever heard of and it's called The Silicon Jungle.
Of course that title comes from a very old book from the early 1900s called The Jungle, which is a
remarkable book about the meat packing industry in the United States. The Silicon Jungle is about Google.
It's fictional, but it's about Google, and it's about the power that individual employees at Google have to
make or break any company or any individual. It's a fantastic novel. I asked Baluja how Google let
him get away with publishing it and he said, "Well, they made me promise I would never promote it." That's
why no one's ever heard of this book.
RE: There's no question that they do suppress content and it's not just conservative content, it's any content
they want to. Another article I'm working on looks at how Google operates in different countries. It turns
out yes, in the U.S. Google leans heavily to the left, but in Cuba they lean heavily to the right. Of course, it
came to light not long ago that Google was about to go back into China to help the Chinese government
control its population. The politics that we think they have in the U.S., which are pretty clear, they don't
have those politics everywhere. They do what they want to do that serves their company, serves the bottom
line, presumably. Or serves other agendas that they may have.
RE: But a lot of this is just