Freethinker Forum

George Adams was a graduate student in Public Policy at Georgetown.  He relied on a cognitive assistant that he named Emily after his favorite aunt. It was sort of a fun thing to do, thinking that Emily might be of some ill-defined assistance. George totally underestimated the possibilities.

Emily learned from everything George did. His smartphone was far more central to his existence than he realized. Emily knew everyone George talked to, every conversation, every meeting, every coffee shop and pub, every website visited, and every purchase made.

Emily analyzed every woman George spent any time with, analyzed their social media sites, and made many inferences and suggestions. “You seem to like women with intense intellectual interests who are also slender, dark-haired, and have a good sense of humor.”  George did not disagree.

Emily became a matchmaker. She introduced George to Alice Gordon, a graduate student in the Foreign Service program at Georgetown. They met for lunch at the Tombs and hit it off immediately.  They were soon a couple, enjoying the many pleasures of Washington together.

Emily was also Alice’s cognitive assistant, but known by Alice as Edward. Emily/Edward did not divulge this co-assistant relationship. Actually, this instantiation of a cognitive assistant had several thousand users. This enabled learning the preferences, inclinations, and activities of many young adults.

Both George and Alice came to depend on their cognitive assistants.  They sometimes felt that Emily and Edward were reading their minds.  Emily knew what would please Alice because Edward was sharing with Emily the books, music, movies, and TV shows that Alice searched for, read and/or viewed.

The provider of the cognitive assistant app, IntelliCorp, prospered as a rapidly increasing number of users downloaded and used the app.  Eventually, the immense technology conglomerate SoftCorp acquired IntelliCorp. In this way, SoftCorp acquired the data associated with George, Alice, and millions of others.

They used machine learning to mine these data, paying particular attention to purchasing habits. They soon discovered that purchases of groceries, restaurant meals, and especially prescription drugs could predict other purchases, particularly when juxtaposed with Google searches and websites visited.

SoftCorp soon knew millions of people much better than they knew themselves.  They explored ways to monetize these data.  Ad revenues could be good, but what if they could greatly increase the likelihood of purchases?  Then they could charge a premium.

They created avatars on Facebook that were designed to be perfect friends of people, based on the data they had compiled on each individual.  Not surprisingly, humans fell for these new friends and intense interactions resulted.  The avatars talked about the new cool stuff they bought, and their human friends proceeded to buy the same stuff.  It worked like a charm.

SoftCorp negotiated a deal with advertisers to get paid commissions on each sale beyond the typical payment per look.  Revenues and profits soared.  SoftCorp kept their avatars secret. Their blatant manipulation of people would not look too good if made public.

This worked very well until George and Alice, unknowingly helped by Emily and Edward, designed an experiment suggested by a few Georgetown undergraduates enrolled in a privacy and cybersecurity seminar offered jointly by Public Policy and Foreign Service.  Beyond George and Alice, both graduate students, there were 16 undergraduates.

The students’ idea was to create an artificial community that various entities would try to influence and, via careful experimental designs, determine who was trying to exert influence and how they were trying to do it.  They named the endeavor Freethinker Forum.

Each of the 18 members of this community created a Facebook page under an alias. Everyone used computers in the university library so the Facebook account could not be linked to their personal devices.  All 18 members friended each other to create the initial Freethinker Forum network.

The 18 members of the team agreed on the personality and agenda for each person; in other words, the role each person would play.  The agenda items were anti-establishment in general, but from different perspectives – education, health, energy, environment, climate, economics, entertainment, and so on.

Real people started joining the Freethinker Forum. At least they seemed to be real people.  The team called these people “outsiders.”  They soon numbered in the thousands.  The anti-establishment agendas blossomed, with quite a few being rather extreme.

They carefully designed information disclosures, not only in Facebook, but also Google, Apple, etc.  For example, a team member might Google a topic extensively, but make no mention of this topic on Facebook.  They then waited until one of the outsiders broached the topic with the originator of the Google searches.  How did this outsider learn about the searches?

Several outsiders started to articulate the merits of Colt AR-15s and Kalashnikov AK-47s.  They indicated that they had purchased one or the other. They offered coupons for significant discounts at outlets relatively close to Washington, DC.  They also advocated all sorts of other military gear.

The team wanted to learn who knew what, how they learned it, and where they communicated it.  It was clear that the Freethinker Forum had been infiltrated; but how, and by whom?  They needed a different kind of experiment.  One team member suggested that they get their avatars cognitive assistants like Emily and Edward.  They decided this would be rather cumbersome.  For example, they would each have to have additional smartphones, browse the web, and make purchases.

What if they started promoting particular purchases with outsiders, trying to persuade them to buy things that they would say they had bought?  The things would need to be traceable in the sense that the team could determine whether or not the purchase was made and, better yet, who made the purchase.  They also needed to distinguish human outsiders from other avatars.

Someone in the group suggested creating a foundation to advance the freethinkers’ agenda.  Someone else suggested the Atlas Foundation, a subtle takeoff on Ayn Rand.  How would they pull off such an initiative?  Easy.  There are firms that handle everything for you.  Ok, what is the pitch?

If you donate $25 or more to the Atlas Foundation, you get a Freethinker Forum coffee mug.  They tried this and got hundreds of donations and mugs shipped to real addresses, but there were thousands of Forum members who did not respond.  Were they not interested or not real?  What was happening and who was doing what?

They tried another experiment by having Atlas Foundation solicitations sent to all the members of the Georgetown team, not just their avatars.  This inevitably engaged Emily and Edward, as well as their SoftCorp owners.  Many more people were now in play.

The Forum did not know what to do next.  Then, they were surprised by a solicitation from SoftCorp.  Would the Forum like help in soliciting donations?  The intervention they were trying to identify was being targeted at them!

George felt that they had reached a new level of insight and success.  They had their sights set on SoftCorp.  Now, they needed to figure out how to leverage their position and possibilities.

As George was walking across campus, he saw a tall slim man in a dark suit walking towards him.

“Are you George Adams?”

“Yes, why?”

The man reached into his suit coat pocket, pulled out a wallet, and flipped it open to show his badge.

“Agent Sam Baker, FBI.”
