Tracking of Facebook, Twitter and Google news and search results since 2007 proves that those companies rigged news and searches
By Danny Lease and Emily Waning
Special to Town Hall News. Provided to U.S. Senator John Thune
Research groups and others, long suspicious of the intentions of the Silicon Valley billionaires, have tracked, archived, and compared the news-listing emphasis and search results of campaign financiers Facebook, Twitter, and Google for nearly a decade. The results are shocking.
Compared with tracking and archiving of the same searches and news postings on other sites such as Ask, DuckDuckGo, Bing, Baidu, Yahoo, and a number of other competitors, the results clearly show that Facebook, along with the rest of the closely associated Silicon Valley insider companies, profoundly limited information about their political and business competitors. There are now a number of lawsuits against all three of these companies for “rigging the internet.” This new type of litigation has created a new type of job: the forensic internet-rigging specialist.
The problem with creating a digital global system to rig stocks, gold prices, horse races, tax evasion, elections, or other sketchy services is that the bad guys always leave a trail. You can never erase the electronic breadcrumbs. The path always leads back to the people doing the rigging. On top of that, tens of thousands of services now record and archive, for decades, every single thing ever posted on the internet. Robert Hunter picks up the story:
How Google Rigs Elections
– Will Republicans lose national elections for the third time in a row because the GOP fails to understand how Silicon Valley rigs elections?
– Members of Congress feel the FCC and FTC are protecting Google on orders from the White House
– A congressional investigation is being called for
By Robert Hunter – Old America Foundation
At Linkedin, Twitter, Facebook, and Google, a covert group of programmers has a single mission: make the world think the way the bosses of those companies think. Those bosses think liberal, Democrat, gay sex things and they want the rest of the world to think that way too. They have the largest public-perception manipulation machine on Earth with which to accomplish their goal…and it works.
Season six of the House of Cards TV show explains just how they do it, if you are paying attention.
Because their perception and mood manipulation tools are impossible for a person to see, they have been able to get away with it since they put Obama in office. There was a reason Google boss Eric Schmidt was sitting in Obama’s internet basement on election night.
You can read about their tricks, in detail, if you search the phrase: “subliminal advertising”, but, in short they use computerized experience creation technology to spy on you, psychologically analyze you and then put the things in front of you that will steer your thoughts the way that Eric Schmidt and Mark Zuckerberg want your thoughts steered.
The old, white-haired, non-technical Republicans have not even remotely grasped this concept. That is why Mitt Romney and the entire GOP nearly had a heart attack the morning after Romney lost the election. They never saw it coming. Aside from Milo Yiannopoulos, almost nobody in the GOP seems to understand the threat. Technology is not a natural state of mind for the conservative crowd. They do not worship the glowing screen the way the liberals do.
Unless the GOP spends a few billion dollars to put Silicon Valley out of business, they are going to lose the 2016 elections and be sitting there on election night with the same gob-smacked faces they had the last two times Silicon Valley did it to them.
Linkedin, Twitter, Facebook, Apple, and Google are now in decline because the public has started to lose interest. The public has wearied of the marketing. That decline will not come soon enough to affect the 2016 elections without a little help from the big GOP planners. Linkedin, Twitter, Facebook, Apple, and Google are the core financial backers and mass public-perception manipulators for the Democrats. The GOP is playing catch-up, but Milo, alone, is not enough to turn it around.
Trying to explain the situation to an old-school GOP strategist (who still uses an IBM Selectric typewriter) is like trying to explain a fusion reactor to an Amazon jungle tribesman. Andrew is a programmer at Google with long hair, glasses, few facial expressions, and a devotion to seeing metric curves swoop up. He is part of a team of people whose existence Google denies. They rarely all sit in the same room. You will find them dotted around the corporate buildings among other “regular” employees, but their screens are never visible to passers-by. They appear to be working on something, but that something involves twisting the thoughts of billions of people. They receive their marching orders directly from Eric Schmidt’s personal staff.
Today Andrew is programming “PR8” codes into a list of links provided by Debbie Wasserman Schultz’s strategic advisor at the Democratic National Committee. “PR8” codes are hidden in the search engine results, but they affect what you believe, or what you think you believe.
The links Andrew was given were for pre-staged articles that have been dotted around the web. These articles were carefully authored by the brethren of Sid Blumenthal, the DNC hit-man extraordinaire. Each article repeats a phrase subtly implying that DNC arch-enemy Donald Trump is not quite sane. The “PR8” code lists the story as a “fact” rather than an “opinion” and causes a waterfall of other viewer-manipulation effects to line up around it across the internet.
There are thousands of versions of the “PR” codes that the Google political SWAT teams use. Each one has a slightly different effect on the way it twists public perception. Andrew knows that the “Trump is not quite sane” phrase must be encountered seven times by each member of the public for the brainwashing to be effective. You heard that right: Google is practicing old-fashioned brainwashing. Google, Facebook, and Twitter may have the shiniest, most expensive brainwashing servers on Earth, but it is still classic, actual brainwashing. Repetition and reinforcement are the most tried-and-true tactics of the brainwasher. They always work! Google, though, took this to a digital level never before imagined.
Andrew applies his “PR8” underlayer to the links and then applies a layer of additional “adjacency codes.” These adjacency codes tell Google’s search engine to fill the results page with other results that force the reader’s eye to a certain spot on the screen. Carefully contrived result links, placed above and below the original link on the DNC hit list, ensure that the reader’s impression, based on the first eight words of those other links in bold blue text, reinforces the perception that Trump may be insane.
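The placement mechanism the article alleges can be sketched in a few lines. To be clear, this is a purely hypothetical illustration of the claim being made: the function name and every detail are invented here, and none of it corresponds to any publicly known Google system.

```python
# Hypothetical sketch of the alleged "adjacency" placement: bracket a
# target link with reinforcing links so the reader's eye meets a
# consistent framing. All names here are invented for illustration.

def place_with_adjacency(results, target, reinforcing):
    """Return a result page where `target` is sandwiched between
    reinforcing links, followed by the remaining organic results."""
    others = [r for r in results if r != target and r not in reinforcing]
    page = []
    page.extend(reinforcing[:1])   # one reinforcing link above the target
    page.append(target)
    page.extend(reinforcing[1:2])  # one reinforcing link below the target
    page.extend(others)            # everything else fills out the page
    return page

page = place_with_adjacency(
    ["a", "b", "target", "c"], "target", ["pro1", "pro2"])
print(page)  # ['pro1', 'target', 'pro2', 'a', 'b', 'c']
```

The point of the toy model is only that placement, not content, does the work: the surrounding links never mention the target directly, yet they frame how it is read.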
Andrew hits the “execute” command, and all of this manually contrived, strategically formulated “hit job” material is sent to Google’s server command architecture, DNS routing drives, and the other Google devices that control how the world perceives an idea. This is the very same server hardware that Google’s founders told the European Union was “entirely arbitrary” and under no conscious control.
Andrew worked on the implementation of the so-called “Arab Spring.” Recent disclosures from Hillary Clinton’s emails, published by Wikileaks, show Google pitching Hillary Clinton on “tools” Google had developed to manipulate Middle East policy. Most savvy investigators believe Eric Schmidt sold the “Arab Spring” to the CIA as a contract gig, one that failed miserably. As of today, the “Arab Spring” has been one of the most failed revolutions of all time. Andrew does not like to talk about the “Arab Spring” project Google had him deploy. It is a bit of an embarrassment.
From this, Robert Epstein digs deeper in the article that parted the curtains on digital corruption:
Google’s Search Algorithm Could Steal the Presidency
Imagine an election—a close one. You’re undecided. So you type the name of one of the candidates into your search engine of choice. (Actually, let’s not be coy here. In most of the world, one search engine dominates; in Europe and North America, it’s Google.) And Google coughs up, in fractions of a second, articles and facts about that candidate. Great! Now you are an informed voter, right? But a study published this week says that the order of those results, the ranking of positive or negative stories on the screen, can have an enormous influence on the way you vote. And if the election is close enough, the effect could be profound enough to change the outcome. In other words: Google’s ranking algorithm for search results could accidentally steal the presidency. “We estimate, based on win margins in national elections around the world,” says Robert Epstein, a psychologist at the American Institute for Behavioral Research and Technology and one of the study’s authors, “that Google could determine the outcome of upwards of 25 percent of all national elections.”
Epstein’s paper combines a few years’ worth of experiments in which Epstein and his colleague Ronald Robertson gave people access to information about the race for prime minister in Australia in 2010, two years prior, and then let the mock-voters learn about the candidates via a simulated search engine that displayed real articles.
One group saw positive articles about one candidate first; the other saw positive articles about the other candidate. (A control group saw a random assortment.) The result: Whichever side people saw the positive results for, they were more likely to vote for—by more than 48 percent. The team calls that number the “vote manipulation power,” or VMP. The effect held—strengthened, even—when the researchers swapped a single negative story into the number-four and number-three spots. Apparently it made the results seem even more neutral and therefore more trustworthy. But of course that was all artificial—in the lab. So the researchers packed up and went to India in advance of the 2014 Lok Sabha elections, a national campaign with 800 million eligible voters. (Eventually 430 million people voted over the weeks of the actual election.) “I thought this time we’d be lucky if we got 2 or 3 percent, and my gut said we’re gonna get nothing,” Epstein says, “because this is an intense, intense election environment.” Voters get exposed, heavily, to lots of other information besides a mock search engine result.
The team found 2,150 undecided voters and performed a version of the same experiment. And again, VMP was off the charts. Even taking into account some sloppiness in the data-gathering and a tougher time assessing articles for their positive or negative valence, they got an overall VMP of 24 percent. “In some demographic groups in India we had as high as about 72 percent.”
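The VMP figures above are easy to reproduce as arithmetic. One plausible reading of the metric, and it is an assumption here rather than the paper's exact formula, is the relative increase in the favored candidate's vote share after exposure to a biased ranking:

```python
# Sketch of the "vote manipulation power" (VMP) metric described above.
# Assumed reading: the relative increase in the rank-favored candidate's
# share of the vote, expressed as a percentage. The paper's precise
# definition may differ.

def vmp(pre_share, post_share):
    """Percentage increase in the favored candidate's vote share
    between the pre-exposure and post-exposure measurements."""
    return 100.0 * (post_share - pre_share) / pre_share

# e.g. a 50% baseline share rising to 74% after biased rankings
print(round(vmp(0.50, 0.74)))  # 48
```

Under this reading, the lab result (above 48 percent) and the India field result (24 percent overall, about 72 percent in some demographics) are all values of the same ratio measured in different populations.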
The fact that media, including whatever search and social deliver, can affect decision-making isn’t exactly news. The “Fox News Effect” says that towns that got the conservative-leaning cable channel tended to become more conservative in their voting in the 2000 election. A well-known effect called recency means that people make decisions based on the last thing they heard. Placement on a list also has a known effect. And all that stuff might be too transient to make it all the way to a voting booth, or get swamped by exposure to other media. So in real life VMP is probably much less pronounced.
But the effect doesn’t have to be enormous to have an enormous effect. The Australian election that Epstein and Robertson used in their experiments came down to a margin of less than 1 percent. Half the presidential elections in US history came down to a margin of less than 8 percent. And presidential elections are really 50 separate state-by-state knife fights, with the focus of campaigns not on poll-tested winners or losers but purple “swing states” with razor-thin margins.
So even at an order of magnitude smaller than the experimental effect, VMP could have serious consequences. “Four to 8 percent would get any campaign manager excited,” says Brian Keegan, a computational social scientist at Harvard Business School. “At the end of the day, the fact is that in a lot of races it only takes a swing of 3 or 4 percent. If the search engine is one or two percent, that’s still really persuasive.”
The Rise of the Machines
It’d be easy to go all 1970s-political-thriller on this research, to assume that presidential campaigns, with their ever-increasing level of technological sophistication, might be able to search-engine-optimize their way to victory. But that’s probably not true. “It would cost a lot of money,” says David Shor, a data scientist at Civis Analytics, a Chicago-based consultancy that grew out of the first Obama campaign’s technology group. “Trying to get the media to present something that is favorable to you is a more favorable strategy.” That’s called, in the parlance of political hackery, “free media,” and, yes, voters like it. “I think that generally people don’t trust campaigns because they tend to have a low opinion of politicians,” Shor says. “They are more receptive to information from institutions for which they have more respect.” Plus, in the presidential campaign high season, whoever the Republican and Democratic nominees are will already have high page ranks because they’ll have a huge number of inbound links, one of Google’s key metrics.
Search and social media companies can certainly have a new kind of influence, though. During the 2010 US congressional elections, researchers at Facebook exposed 61 million users to a message exhorting them to vote—it didn’t matter for whom—and found they were able to generate 340,000 extra votes across the board.
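The scale of that Facebook experiment is easier to grasp as a rate. Using only the two numbers cited above:

```python
# Back-of-envelope rate for the 2010 Facebook turnout experiment cited
# above: 340,000 extra votes among 61 million users shown the message.
exposed = 61_000_000
extra_votes = 340_000
rate = extra_votes / exposed
print(f"{rate:.2%}")  # 0.56%
```

A lift of roughly half a percent sounds tiny, but as the surrounding discussion notes, many races are decided by margins smaller than that.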
But what if—as Harvard Law professor Jonathan Zittrain has proposed—Facebook didn’t push the “vote” message to a random 61 million users? Instead, using the extensive information the social network maintains on all its subscribers, it could hypothetically push specific messaging to supporters or foes of specific legislation or candidates. Facebook could flip an election; Zittrain calls this “digital gerrymandering.” And if you think that companies like the social media giants would never do such a thing, consider the way that Google mobilized its users against the Stop Online Piracy Act and PROTECT IP Act, or “SOPA-PIPA.”
In their paper, Epstein and Robertson equate digital gerrymandering to what a political operative might call GOTV—Get Out the Vote, the mobilization of activated supporters. It’s a standard campaign move when your base agrees with your positions but isn’t highly motivated—because they feel disenfranchised, let’s say, or have problems getting to polling places. What they call the “search engine manipulation effect,” though, works on undecided voters, swing voters. It’s a method of persuasion.
Again, though, it doesn’t require a conspiracy. It’s possible that, as Epstein says, “if executives at Google had decided to study the things we’re studying, they could easily have been flipping elections to their liking with no one having any idea.” But simultaneously more likely and more science-fiction-y is the possibility that this—oh, let’s call it “googlemandering,” why don’t we?—is happening without any human intervention at all. “These numbers are so large that Google executives are irrelevant to the issue,” Epstein says. “If Google’s search algorithm, just through what they call ‘organic processes,’ ends up favoring one candidate over another, that’s enough. In a country like India, that could send millions of votes to one candidate.”
As you’d expect, Google doesn’t think it’s likely their algorithm is stealing elections. “Providing relevant answers has been the cornerstone of Google’s approach to search from the very beginning. It would undermine people’s trust in our results and company if we were to change course,” says a Google spokesperson, who would only comment on condition of anonymity. In short, the algorithms Google uses to rank search results are complicated, ever-changing, and bigger than any one person. A regulatory action that, let’s say, forced Google to change the first search result in a list on a given candidate would break the very thing that makes Google great: giving right answers very quickly all the time. (Plus, it might violate the First Amendment.)
The thing is, though, even though it’s tempting to think of algorithms as the very definition of objective, they’re not. “It’s not really possible to have a completely neutral algorithm,” says Jonathan Bright, a research fellow at the Oxford Internet Institute who studies elections. “I don’t think there’s anyone in Google or Facebook or anywhere else who’s trying to tweak an election. But it’s something these organizations have always struggled with.” Algorithms reflect the values and worldview of the programmers. That’s what an algorithm is, fundamentally. “Do they want to make a good effort to make sure they influence evenly across Democrats and Republicans? Or do they just let the algorithm take its course?” Bright asks. That course might be scary, if Epstein is right. Add the possibility of search rank influence to the individualization Google can already do based on your Gmail, Google Docs, and every other way you’ve let the company hook into you…combine that with the feedback loop of popular things getting more inbound links and so getting higher search ranking…and the impact stretches way beyond politics. “You can push knowledge, beliefs, attitudes, and behavior among people who are vulnerable any way you want using search rankings,” Epstein says. “Now that we’ve discovered this big effect, how do you kill it?”
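The feedback loop mentioned above, where popular results attract more inbound links and so rank even higher, can be shown with a toy model. This is an assumption-laden sketch of the dynamic, not any real ranking algorithm:

```python
# Toy model of the rank/inbound-link feedback loop: each round, the
# current top result gains links in proportion to its existing score.
# Purely illustrative; not how any real search engine works.

def step(scores, boost=0.1):
    """One round of the loop: the top-ranked item compounds its lead."""
    top = max(scores, key=scores.get)
    scores = dict(scores)
    scores[top] += boost * scores[top]
    return scores

# Two items that start in a near-perfect tie
scores = {"A": 1.00, "B": 0.99}
for _ in range(20):
    scores = step(scores)

# A's one-percent initial edge has compounded into a multi-fold gap
print(scores["A"] / scores["B"])
```

The design point is that the loop needs no bias at all to produce a lopsided outcome: a negligible initial difference, amplified round after round, ends up looking like an editorial decision.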
For more on how Silicon Valley rigs elections:
Rick Santorum has a Santorum problem, in that the top Google results when you search his name are not about the man himself, but rather about a dirty sexual neologism.
Election rigging, Long might have … Secret skullduggery is not even necessary these days, such is the boldness of the attempts by the GOP to “rig elections.”
APC planning to rig election in North … Have you forgotten so soon that elections can only be rigged for a … Your Nigerian Online News Source: Nigerianeye.com …
How to Rig an Election. By Victoria Collier, Harper’s Magazine. 26 October 12. It was a hot summer in 1932 when Louisiana senator Huey “Kingfish” Long arranged to rig …
Electoral fraud can occur at any stage in the democratic process, but most commonly it occurs during election campaigns, voter registration or during vote-counting.
What do you need to rig an election? A basic knowledge of electronics and $30 worth of RadioShack gear, professional hacker Roger Johnston reveals.
Rep. Tom Feeney (Fmr. Speaker of The House in Florida) employed this man from Oviedo, FL to rig elections and flip them 51% to 49%. Exit polling data was …
Computer programmer testifies that Tom Feeney (Speaker of the House of Florida at the time, currently US Representative representing MY district) tried to …
How To Rig An Election In The United … But the second table can be hacked and altered to produce fake election totals without affecting spot check reports derived …
An Inside Look at How Democrats Rig the Election Game … An interesting email received this week offers a window into how Democrats used to rig the voter …