Social Media

Gutenberg, WordPress, & You: the What, the Why, and the How

7 August 2018

From The Digital Reader:

If you have a WordPress site, chances are you have heard of something called Gutenberg. You could have seen one of the posts written about it over the past 18 months (such as mine), or you may have seen the notice when you updated your WordPress site to v4.9.8.

Either way, if you are the average user you are probably wondering what Gutenberg is and how it will affect your site.

The following post is a short explainer that will delve into what Gutenberg is, why it matters, and how it will affect you.

. . . .

Gutenberg is a new official part of WordPress. It is currently in beta, and is scheduled to launch with the release of WordPress 5.0.

I have been following the development of Gutenberg for over a year, and in that time I have learned that the easiest way to explain Gutenberg is to ask whether you are familiar with one of the mailing list services like Mailchimp or Mailerlite. Have you used one of their newsletter builders?

If you have used one then you will better understand Gutenberg when you see it for the first time.

Gutenberg replaces the existing post and page editor in WordPress with one that behaves more like Mailerlite’s newsletter builder. Where the existing editor resembled old word processor apps (think MS Word, circa 2002) and was designed around the concept of typing out paragraphs of text, the new Gutenberg editor is built on the idea of blocks.

It is not supposed to affect your existing content, but I cannot guarantee that will be true 100% of the time. (A WordPress site is just too complex to make that promise.) What I can say is that Gutenberg is intended to let you make new, richer content, not force you to go fix your existing content.

. . . .

Gutenberg, on the other hand, has a pop-up menu where you can select a block. You can open that menu by clicking the plus sign icon and then selecting one of the options.

Once you choose the next block, you can style it with settings that apply only to that one block, add content, etc. That custom styling is perhaps the biggest difference between Gutenberg and the existing content editor.

Link to the rest at The Digital Reader

PG is convinced that if he for some reason became brain-dead, his fingers would still have sufficient intelligence to make blog posts via WordPress.

His brain may be intrigued by Gutenberg, but, before anything changes, PG’s fingers want a nice long vacation far away from all keyboards in a place where they can locate their inner child or some such thing.

The other thing that comes to PG’s mind when considering Gutenberg is how many WordPress themes it’s going to break. PG has one theme in particular on his mind.

TPV’s current theme is 25,000 years old in internet years. Some of the earlier posts are probably full of dinosaur tracks by now.

From time to time, PG has explored installing a new WP theme for TPV, but he can’t find one that will be a great-looking home for the blog out of the box. He has wasted a lot of time tweaking various themes, but nothing he’s developed has the same zing as the old, old, old look.

Incidentally, Nate, the proprietor of The Digital Reader, is also a WordPress whisperer in case your blog or your brain seizes up over Gutenberg.

Outnumbered: From Facebook and Google to Fake News and Filter-bubbles

1 August 2018

From The Guardian:

“Space is big,” wrote Douglas Adams in The Hitchhiker’s Guide to the Galaxy. “You just won’t believe how vastly, hugely, mind-bogglingly big it is. I mean, you may think it’s a long way down the road to the chemist’s, but that’s just peanuts to space.”

Adams’s assertion comes repeatedly to mind when reading David Sumpter’s Outnumbered, which attempts to reckon with the sheer scale of the systems that manage much of our digital lives. It’s easy, when faced with the numbers at hand, to succumb to a kind of vertigo: Facebook has two billion users, who make tens of millions of posts every hour. From this data, along with millions more photos, likes and relationships, Facebook builds models of all of us that extend in hundreds of dimensions – the puny human mind, at best, is capable of visualising four.

Google’s translation systems likewise collapse hundreds of languages into multidimensional matrices of meaning, which generate their own metalanguages unknowable to us – and which contain their own implicit biases. Plugging the UK’s most popular baby names into one such system, designed to understand how words and concepts relate to each other, gives the response: “Oliver is to clever what Olivia is to flirtatious”. “Our future generations’ gender roles,” the author worries, “have already been assigned by the algorithm.”

Sumpter is a professor of applied mathematics; his natural response to such problems is to recreate them and then unpack them – and perhaps deflate some of our wilder fears along the way. Step by step, using the same data as many of the papers he quotes from, he details the maths that underpins each of these systems, laying out the straightforward, if advanced, calculations that govern their outcomes – and their limitations. It’s all very well to apply sophisticated regression models to billions of Facebook likes, but the results are mostly underwhelming: yes, “Democrats are more likely to like Harry Potter” than Republicans, but “it doesn’t necessarily tell us that other Harry Potter fans like the Democrats”. The same no-nonsense approach is deployed to debunk lazy assertions that we are all fooled by fake news stories, or trapped within filter bubbles that mindlessly reassert our prejudices. We are, apparently, both smarter and more aware than that.

. . . .

In example after example, the toxic combination of filter bubbles, simplistic ranking mechanisms and algorithmic recommendation is seen as clearly in elite networks as in more accessible but supposedly less self-aware groups – but, we are told, the effects really aren’t as bad as we’re being told they are: “When we get time for research, we scientists still do it well. Most scientists I meet are motivated by the eternal search for truth and the desire to know the right answer.” Phew.

Link to the rest at The Guardian

From The Jakarta Post:

At a time when society is becoming more and more reliant on technology and the internet, there is widespread concern that they are exerting too much influence over our lives.

Behind it all are algorithms that have the capability to predict our everyday lives, and to be frank, we’re not entirely sure what they are up to.

In Outnumbered: From Facebook and Google to fake news and filter-bubbles – the algorithms that control our lives (featuring Cambridge Analytica), author David Sumpter writes about society’s ever-growing concerns about technology, as well as how algorithms work and their capability to run our lives. It is a fascinating read, drawing on real-life phenomena and stories, such as the Cambridge Analytica affair.

Simply put, Sumpter tells readers that artificial intelligence (AI) and algorithms are more than what meets the eye.

When it comes to technology and the internet, many see it as an open field, accessible to many and a place where we can freely move around. We easily surf the web, clicking on websites, constantly using search engines to keep up with our daily lives. We don’t really put much thought into what happens after we make a single click, but we are now more aware and sensitive of what happens to our data.

That being said, the internet holds a massive digital archive of our opinions, gathered by the algorithms that lie beneath it. These opinions are then analyzed to become a mold of our preferences and interests and manipulated to influence us. So we ask each other why. Why are these data researchers and tech giants collecting our data? And how are they using it? What is it about our data that interests them so much?

In this book, Sumpter investigates and explains it all. A professor of applied mathematics at the University of Uppsala in Sweden, Sumpter pitches to his readers a model that illustrates how these algorithms work, from how they analyze us, influence us and become us, and what we should and should not worry about.

. . . .

Sumpter explains all his findings through fascinating examples and case studies, such as how mathematicians applied mathematics in an effort to locate Banksy, the anonymous street artist; how Google’s neural network had the capability to play Space Invaders against a human being; how companies secretly used our data for political purposes; and how one algorithm analyzes how the number of “likes” and “dislikes” on platforms such as YouTube correlates with popularity. The list goes on, and it is mind-blowing.

Link to the rest at The Jakarta Post

From Publishers Weekly:

At a time of widespread concern about technology exerting too much influence over people’s lives, mathematics professor Sumpter (Soccermatics) devotes this enlightening book to investigating these fears and explaining clearly what algorithms do. He tackles different examples of their appearance in daily life, starting with in-the-news attempts to use the internet to study and influence voters. He discusses data harvested from Facebook users regarding their preferences in politics and other areas (theoretically, Democrats “could focus on getting the vote out among Harry Potter fans”), observing that, thankfully, the data’s accuracy is limited by the algorithm designers’ own inherent biases. As to the fake news disseminated on Facebook and other content aggregators, Sumpter believes that, for most people, it has little real impact.

Link to the rest at Publishers Weekly

From Kirkus Reviews:

Further frighteningly convincing research about the data infiltrating our lives.

Experts regularly warn us that today’s digital technology can extract our innermost secrets. In this ingenious addition to the genre, Sumpter (Applied Mathematics/Univ. of Uppsala, Sweden; Soccermatics: Mathematical Adventures in the Beautiful Game, 2016, etc.) agrees that there is some truth in this assessment but also serious limitations. The book, less a polemic than a combination of investigative journalism and (mostly) painless mathematical lessons, explains how social media, search engines, and merchants extract our opinions and manipulate them with a set of rules called an algorithm, which can often reveal our tastes, personality, and politics. Readers comfortable with ads tailored to previous purchases may flinch to learn that every mouse click such as a “like” under a photo, joke, or film clip enters a massive digital archive that reveals an unnervingly accurate portrait of the clicker. “Unlike our friends—who tend to forget the details and are forgiving in the conclusions they draw about us—Facebook is systematically collecting, processing and analyzing our emotional state,” writes the author. “It is rotating our personalities in hundreds of dimensions, so it can find the most cold, rational direction to view us from.” Persuading us to buy stuff seems benign, but the internet also teems with fake news scientifically designed to influence our votes. Sumpter returns repeatedly to the surprise victories of Donald Trump and Brexit. Wielding his mathematical tools, the author explains how algorithms deal with big data, and it turns out there is less there than meets the eye. Polls only calculate the odds of an event; they can’t “predict” anything. True believers lap up fake news, but it has a barely detectable effect on changing the average reader’s mind.

Link to the rest at Kirkus Reviews

From Psychologist World:

A subliminal message is a signal or message designed to pass below (sub) the normal limits of perception. For example, it might be inaudible to the conscious mind (but audible to the unconscious or deeper mind), or it might be an image transmitted so briefly that it is unperceived consciously yet perceived unconsciously. This definition assumes a division between conscious and unconscious which may be misleading; it may be more accurate to suggest that the subliminal message (sound or image) is perceived by deeper parts of what is a single integrated mind.

In the everyday world, it has often been suggested that subliminal techniques are used in advertising and for propaganda purposes (e.g. party political broadcasts).

The term subliminal message was popularized in a 1957 book entitled The Hidden Persuaders by Vance Packard. This book detailed a study of movie theaters that supposedly used subliminal commands to increase the sales of popcorn and Coca-Cola at their concession stands. However, the study was fabricated, as the author of the study James Vicary later admitted.

In 1973 the book Subliminal Seduction claimed that subliminal techniques were in wide use in advertising. The book contributed to a general climate of fear with regard to Orwellian dangers (of subliminal messaging). Public concern was enough to lead the Federal Communications Commission to hold hearings and to declare subliminal advertising “contrary to the public interest” because it involved “intentional deception” of the public.

Subliminal perception or cognition is a subset of unconscious cognition, where the forms of unconscious cognition also include attending to one signal in a noisy environment while unconsciously keeping track of other signals (e.g. one voice out of many in a crowded room) and tasks done automatically (e.g. driving a car).

In all such cases there has been research into how much of the unattended or unconscious signal or message is perceived (unconsciously), i.e., is the whole message sensed and fully digested, or perhaps only its main and simpler features? There are at least two schools of thought about this. One argues that only the simpler features of unconscious signals are perceived; note, however, that the majority of the research done has tended to test only for simpler features of cognition (rather than testing for complete comprehension). The second school of thought argues that unconscious cognition is comprehensive and that much more is perceived than can be verbalized.

. . . .

Subliminal messages might gain their potential influence/power from the fact that they may be able to circumvent the critical functions of the conscious mind, and it has often been argued that subliminal suggestions are therefore potentially more powerful than ordinary suggestions. This route to influence or persuasion would be akin to auto-suggestion or hypnosis, wherein the subject is encouraged (or somehow induced) to relax so that suggestions are directed to deeper, more gullible parts of the mind; some observers have argued that the unconscious mind is incapable of critical refusal of hypnotic or subliminal suggestions. Research findings do not support the conclusion that subliminal suggestions are peculiarly powerful.

. . . .

A form of subliminal messaging commonly believed to exist involves the insertion of “hidden” messages into movies and TV programs. The concept of “moving pictures” relies on persistence of vision to create the illusion of movement in a series of images projected at 24 to 30 frames per second; the popular theory of subliminal messages usually suggests that subliminal commands can be inserted into this sequence at the rate of perhaps 1 frame in 25 (or roughly 1 frame per second). The hidden command in a single frame will flash across the screen so quickly that it is not consciously perceived, but the command will supposedly appeal to the subconscious mind of the viewer, and thus have some measurable effect in terms of behavior.
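The frame arithmetic in the passage is easy to check. A minimal sketch (assuming the 25-frames-per-second projection rate implied by the "1 frame in 25" figure; variable names are my own):

```python
fps = 25  # projection rate assumed in the passage's "1 frame in 25" figure

frame_duration_ms = 1000 / fps  # how long a single frame stays on screen
inserts_per_second = fps / 25   # one hidden frame per 25 projected frames

print(frame_duration_ms)   # 40.0 -- each hidden frame is visible for only 40 ms
print(inserts_per_second)  # 1.0 -- i.e., roughly one hidden frame per second
```

At 40 milliseconds per frame, the inserted image is indeed brief, which is why the popular theory holds it escapes conscious notice.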

As to the question of whether subliminal messages are widely used to influence groups of people (e.g. audiences), there is no evidence to suggest that any serious or sustained attempt has been made to use the technology on a mass audience. The widely circulated reports that arose in 1957, to the effect that customers in a movie theatre in New Jersey had been induced by subliminal messages to consume more popcorn and more Coca-Cola, were almost certainly false. The current consensus among marketing professionals is that subliminal advertising is counter-productive. To some this is because they believe it to be ineffective, but to most it is because they realise it would be a public relations disaster if its use were discovered. Many have misgivings about using it in marketing campaigns due to ethical considerations.

Link to the rest at Psychologist World

PG has linked to several reviews of a new book, Outnumbered: From Facebook and Google to Fake News and Filter-bubbles – The Algorithms That Control Our Lives (featuring Cambridge Analytica), and one short item discussing subliminal messaging, because each of the reviews reflects, to varying extents, the idea that masses of citizens in free and open societies can be manipulated like pawns on a chessboard by some surreptitious, super-intelligent and powerful group of people for nefarious ends.

Vance Packard’s bestselling book, The Hidden Persuaders, had a huge impact on advertisers and advertising agencies (and politicians as well) in the late 1950s and 1960s. For obvious reasons, the idea that a properly constructed subliminal message could have a substantial effect on consumer behavior was very attractive.

The problem with Packard’s theories is that they didn’t seem to work with consumers. The traditional marketing approach of understanding what consumers wanted and needed (not necessarily the same thing) and creating products and messages that addressed those wants and needs wasn’t enhanced in any measurable way by hidden messages that were not clear and obvious. Handsome men and attractive women were depicted enjoying a product in commercials to catch a viewer’s attention and encourage them to buy a product. Pitching the psychic benefits of a product was not particularly subtle and certainly didn’t pass a secret and influential message below the level of consciousness.

In PG’s observation, large groups of people tend to denigrate those who disagree with them on a subject of some importance as uninformed or stupid. The belief that people make choices or cast votes against their own logical self-interest is widespread among those who believe themselves to be more intelligent or better-informed than those who disagree with them.

PG suggests that the belief that large groups of people are too ignorant or ill-informed to make proper decisions for themselves is a step towards political disaster. One feature of dictatorships is a constant stream of advertising messages about how intelligent, insightful and beneficent the great leader and those who surround him are and how important it is for everyone else to leave important decisions in the leader’s hands.


5 Old School Social Media Tactics That Are No Longer Effective (And What To Do Instead)

30 July 2018

From Buffer Social:

5. Asking people to share your content

You’ve worked hard to create an awesome piece of content—and naturally, you want as many people to see it as possible. So, along with sharing the link on social media, you ask your contacts to post it on their own networks.

The problem? This request puts your connections in a really awkward spot. Saying no feels pretty uncomfortable (after all, you’re asking for a share, not a kidney), but they might want to for any number of reasons: the content doesn’t work with their brand, audience, or social strategy; they don’t agree with everything it says; or they simply resent being asked.

In the end, this strategy might help you get more views on, say, a blog post or Slideshare—but your professional relationships will take a hit. (Want to brush up on social media etiquette? Check out the 29 most common rules and which ones you should actually follow.)

What to do instead:

You want people to link to your content because, well, they want to. With that in mind, focus on making it as shareable as possible.

A recent analysis of 65,000 articles found that a piece’s virality comes down to two main factors: arousal and dominance. In plain English, arousal means “riled up.” Both anger and excitement are high-arousal emotions. Dominance, on the other hand, is the feeling of being in control. When you’re inspired or joyful, you’re experiencing high dominance; when you’re scared, you’re experiencing low dominance.

Articles that perform the best on social use a high-arousal, high-dominance combo. What would that look like? Well, a photo of Vin Diesel with his daughter racked up 8.1 million interactions (making it the fifth most popular Facebook post of 2015), thanks to the strong, positive emotions it generated. But strong, negative emotions can be powerful too—take the Dove “Choose Beautiful” campaign, which put a spotlight on low self-esteem.

Link to the rest at Buffer Social

Big Twitter Accounts See Follower Numbers Drop After Fake-User Purge

14 July 2018

From Variety:

As expected, Twitter’s elimination of “locked” user accounts from public follower counts has resulted in a decline for many users — including millions lost for the biggest celebs on the platform, like Katy Perry, Justin Bieber, Barack Obama, Taylor Swift and Lady Gaga.

One of the biggest losers seems to have been Twitter’s own primary account (@Twitter), which shed 7.5 million fake accounts to drop 12% Thursday, from 62.85 million earlier in the morning to 55.35 million as of 2:45 p.m. ET. By Friday morning, that was down to 55.1 million.

By comparison, the decline of other large accounts has been smaller. The 100 most-followed Twitter accounts saw an average drop in followers of 2% on Thursday, according to social-analytics firm Keyhole, with a median decline of 734,000 followers.

Singer Katy Perry, who has the most-followed account on Twitter, lost 2.8 million followers through Friday at 8 a.m. ET, dropping 2.6% to 106.8 million followers. Follower counts for Justin Bieber and Rihanna fell 2.5%, Ellen DeGeneres dropped 2.6%, Taylor Swift fell 2.7%, and Lady Gaga declined 3.2%.
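The reported declines are straightforward percentage arithmetic, and the excerpt's figures check out. A minimal sketch (all numbers are taken from the excerpt above, in millions of followers; the helper name `pct_drop` is my own):

```python
def pct_drop(before: float, after: float) -> float:
    """Percentage decline from a `before` count to an `after` count."""
    return (before - after) / before * 100

# @Twitter: 62.85M -> 55.35M followers
print(62.85 - 55.35)                     # 7.5 million fake accounts shed
print(round(pct_drop(62.85, 55.35)))     # ~12% decline, as reported

# Katy Perry: lost 2.8M to end at 106.8M, so she started near 109.6M
print(round(pct_drop(106.8 + 2.8, 106.8), 1))  # ~2.6% decline, as reported
```

The same function applied platform-wide would give Twitter's projected overall figure of around 6% once the purge is complete.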

. . . .

On Wednesday, Twitter said it was making the change in order to boost the credibility of follower-count numbers and improve transparency. The change in follower counts doesn’t affect the active user totals Twitter tracks and reports on a quarterly basis to investors, according to the company.

The majority of Twitter users will see a reduction of four followers or fewer, but those with larger follower counts will see a bigger drop, the company said. Twitter began culling locked accounts from follower figures Thursday, and as the process continues the numbers will likely decline further. All told, Twitter expects the number of followers to decline around 6% platform-wide by the time it’s completed the purge.

Link to the rest at Variety and thanks to Nate for the tip.

How Google and Facebook Are Monopolizing Ideas

5 July 2018

From The Wall Street Journal:

In early May Google banned bail-bond companies from advertising on its platforms. Such companies profit from “communities of color and low income neighborhoods when they are at their most vulnerable,” it explained in a blog post. They use “opaque financing offers that can keep people in debt for months or years.”

That Google can ban ads from an industry that offends its values is not, by itself, noteworthy. Media companies have long decided what content or ads to carry for the same reason. The difference is that even after decades of consolidation, no media company enjoys a U.S. market share as dominant as Google’s in Internet search (close to 90%) or Facebook Inc.’s in social networking. Like earlier bans on payday-loan ads, Google’s bail-bond ad ban, which Facebook copied the next day, effectively kicked an entire industry out of a major advertising channel.

The debate over whether Google, a unit of Alphabet Inc., and Facebook are too big usually revolves around economics: Do they suppress competition for goods and services? The bail-bond ad ban raises a different, and potentially more troubling, possibility: that they also undermine competition for values and ideas. While Google and Facebook claim to be neutral platforms connecting users, advertisers and content providers, decisions about which ads to ban and which content to delete or reclassify are inherently value-laden, even when those values are embedded in an algorithm.

Data monopolies “can actually be more dangerous than traditional monopolies,” Maurice Stucke, a law professor specializing in antitrust at the University of Tennessee, Knoxville, wrote earlier this year in Harvard Business Review. “They can affect not only our wallets but our privacy, autonomy, democracy, and well-being.”

Bail bonds aren’t a sympathetic industry. For a steep fee, agents agree to pay the court’s required bail if the client doesn’t show up for a court date. They are, however, legal and, in most states, regulated. And the industry says it serves low-income and minority clients because they are caught up in the criminal-justice system without the means to post bail on their own.

Jeff Clayton, executive director of the American Bail Coalition, whose members insure bail agents, says Google gave the industry no opportunity to comment on or appeal the ban. A Google spokeswoman declined to comment. Facebook did consult with both the industry and criminal-justice-reform groups after announcing its ban, a spokesman said.

Bail-bond agents used to advertise in the yellow pages, but as the public abandoned phone books for Google, so did the industry. “There are just no other options,” Mr. Clayton said. The ban doesn’t extend to regular search results, but it makes it harder for individual companies to stand out.

Conservatives tend to see tech companies’ progressive leanings at work in what gets banned or reclassified—for example, Facebook’s labeling of videos by two prominent supporters of President Donald Trump as “unsafe.” Bail bonds and payday loans have long been targets of progressive activist groups.

But as the companies come under growing pressure to police their platforms and weed out “fake news,” a growing range of content gets banned, labeled or deleted for often opaque or arbitrary reasons. ProPublica and Reveal, both nonprofit news publications, have had content dealing with hate groups and immigrant children, respectively, deleted or rejected by Instagram or Facebook. Video artists complain of viewership and ads being restricted because their content violated YouTube’s community standards.

Unhappy users, advertisers and content providers wouldn’t have as much to complain about if Google (which bought YouTube in 2006) and Facebook (which acquired Instagram in 2012) had strong competitors to which they could switch.

Absent such competition, expect pressure for the government to regulate it. But that’s a slippery slope. Politically appointed overseers may simply replace the companies’ judgments with their own. For that reason the Federal Communications Commission long ago gave up policing the nation’s airwaves for fairness.

Link to the rest at The Wall Street Journal 

Facebook’s New Political Ad Policy Ends Up Censoring Bookstore’s Author Event Ads

29 June 2018

From the American Booksellers Association:

Facebook’s attempts to regulate political advertisements on its social media platform have made it more difficult for bookstores to “boost” author events. Boosted posts are those which have been paid for to ensure that they reach a wider audience.

In early June, A Room of One’s Own Bookstore in Madison, Wisconsin, encountered a problem when trying to advertise author events on Facebook. The bookstore’s events coordinator, Gretchen Treu, requested to boost Facebook posts to promote two author events only to find that they were rejected on the basis of what Facebook characterized as their “political nature.”

The rejected posts are an outcome of a new Facebook political ad policy. The policy, which went into effect in May and applies only to ads targeting an American audience, was established to prevent foreign individuals or groups from running Facebook ads to influence U.S. politics. In order to pay for a “political” ad, advertisers must become authorized to do so. The authorization process includes submitting a government-issued ID and providing a residential mailing address. The new policy represents Facebook’s voluntary compliance with the proposed Honest Ads Act, a bipartisan bill sponsored by Senators Amy Klobuchar (D-MN), Mark Warner (D-VA), and John McCain (R-AZ) that subjects online political advertisements to the same rules as ads sold on TV, radio, and satellite.

“Here, Facebook’s solution might be worse than the problem,” said David Grogan, director of the American Booksellers for Free Expression, Advocacy and Public Policy for the American Booksellers Association. “While we are sympathetic to Facebook’s attempt to filter out false news meant to influence our democratic process, attempts to regulate or control speech will usually result in unintended consequences. And in this case it has, as bookstores that are advertising important author events — critical to the free exchange of ideas — are censored indiscriminately alongside foreign actors. Facebook needs to go back to the drawing board on this policy.”

The posts in question advertised events with Ijeoma Oluo, promoting her book So You Want to Talk About Race (Seal Press), and Cecile Richards, discussing her memoir Make Trouble (Touchstone Books).

Link to the rest at the American Booksellers Association

Facebook’s Latest Problem: It Can’t Track Where Much of the Data Went

27 June 2018

Not exactly to do with book authors, but game and app developers are authors as well. Interesting issues about how your data can potentially become someone else’s intellectual property.

From The Wall Street Journal:

Facebook Inc.’s internal probe into potential misuse of user data is hitting fundamental roadblocks: The company can’t track where much of the data went after it left the platform or figure out where it is now.

Three months after CEO Mark Zuckerberg pledged to investigate all apps that had access to large amounts of Facebook data, the company is still combing its system to locate the developers behind those products and find out how they used the information between 2007 and 2015, when the company officially cut data access for all apps. Mr. Zuckerberg has said the process will cost millions of dollars.

One problem is that many of the app developers that scooped up unusually large chunks of data are out of business, according to developers and former Facebook employees. In some cases, the company says, developers contacted by Facebook aren’t responding to requests for further information.

Facebook is now trying to forensically piece together what happened to large chunks of data, and then determine whether it was used in a way that needs to be disclosed to users and regulators. In cases where the company spots red flags, Facebook said it would dispatch auditors to analyze the servers of those developers and interrogate them about their business practices.

Ime Archibong, Facebook’s vice president of product partnerships, said most developers have been “responsive” but noted that the process requires a fair bit of detective work on their end. “They have to go back and think about how these applications were built back in the day,” Mr. Archibong said.

. . . .

Facebook’s app investigation is a response to broader criticism over revelations earlier this year that data-analytics firm Cambridge Analytica improperly accessed and retained user data obtained from Aleksandr Kogan, a psychology professor at the University of Cambridge. The data, which was gathered by Mr. Kogan and his associates through a personality-quiz app, was used by the Trump campaign in 2016. Facebook eventually notified around 87 million users that their data may have been improperly shared with Cambridge Analytica, though many questions remain about that incident as well.

. . . .

Some developers say they have little incentive to respond to Facebook’s requests to cooperate with the probe, either because they are out of business, have moved on to other projects or are uneasy about allowing another company to look at their servers and the way their apps are constructed. Such intellectual property is “the lifeblood” of a developer’s business, said Morgan Reed, president of ACT | The App Association, a trade group that represents more than 5,000 app makers and connected-device companies.

In addition, Facebook doesn’t have legal authority to force developers to cooperate.

“They can’t really compel these developers to hand over information,” said Ian Bogost, a professor at Georgia Institute of Technology. “This is not a federal inquiry about a crime or something. It’s a private company. What are the consequences?”

Mr. Bogost is also a game developer, and built a game for the Facebook platform called Cow Clicker. He said Facebook hasn’t contacted him about conducting a full-scale audit of Cow Clicker, which drew about 180,000 users.

. . . .

Facebook created its developer platform in 2007, giving outsiders the ability to build businesses by leveraging the Facebook data of users and their friends. Facebook tightened access in 2014 and gave pre-existing apps a one-year grace period to comply with the new rules.

Facebook engineers working on the platform didn’t always document their changes, according to one former employee. At times, apps would stop working because of some unannounced tweak by a Facebook employee and developers would have to complain to get it fixed, developers said.

Over the years, Facebook at times tried to build systems that would allow the company to track down user info gleaned from the developer platform—but those efforts failed in part for technical reasons, former employees said.

The internal investigation is a sign of what Mr. Archibong, echoing other Facebook executives, described as a massive cultural shift within Facebook to focus more on “enforcement as a key component” of its system. Previously, executives have said, the emphasis was on growth and connecting more users to one another around the world.

Link to the rest at The Wall Street Journal

The 80/20 Rule

20 June 2018

From Medium:

A few years ago, when I was single and desperate to find a boyfriend, I asked my friend Amy if she thought my blog made me undatable. She didn’t have an answer, but she did share an anecdote. After Amy and her friend Max met me at a book party in SoHo, she received an email from a friend expressing his surprise that we had become friends. “You and Max are like Statler and Waldorf, heckling from the audience,” Amy’s friend wrote to her. “Tyler Coates is like Miss Piggy, preening on the stage.”

It was not the first time I had been accused of oversharing. I have been writing about myself online in some form since graduating from high school in 2001. My first attempts at blogging were on two sites that focused more on journaling than writing for an audience, although there were clear social aspects to both. Then there was LiveJournal in college (my friends were the only people who read that one) and, post-graduation, Blogger. After joining Tumblr at the beginning of 2008, I used it as more of a scrapbook, casually posting and reblogging pictures and songs. But a year in, I changed my pseudonymous username to my own. Suddenly, because I was writing under my own name — my first byline, really — the criticisms changed. Even though I was hardly anonymous on my earlier blogs, my Tumblr had some deliberate accountability because my name was attached.

Mixed in with the Liz Phair MP3s and pre-selfie-era selfies were brief posts about my feelings and emotions — two things that are never supposed to be put on the internet, I learned.

. . . .

The communal nature of Tumblr offered some solace and support. I made a lot of friends based on the mutual pop-culture interests we were writing about, and a lot of those people crossed the email boundary and offered me emotional support.

At the first Tumblr meetup I had ever attended, in Chicago in 2009, organized by a group of users who barely knew each other offline, strangers told me how they found it inspiring that I would write about things they would never share on the internet. Oversharing never felt like the appropriate word for what I did—it really is such an overused and misunderstood phrase. To overshare means simply to share anything the reader might not share him or herself. Our personal boundaries are subjective, so the term serves to attack a writer for doing what the accuser wouldn’t: revealing something personal, something that makes the reader vulnerable. It carries with it the resentment that the person doing the revealing isn’t embarrassed, even though the reader thinks the writer should be.

. . . .

I have what I call my 80/20 Rule, a theory based entirely on presumption and not at all mathematical. (If I were good at math, I’d be in a different business.) The premise is essentially this: Everything you know about someone based on what they put on the internet represents about 20 percent of who he or she really is, while the other 80 percent is not actually present in that online persona. On the other hand, the 20 percent you put out there can be perceived as 80 percent of your inner life, maybe even more.

Link to the rest at Medium

PG suggests the possibility that, just as some people are socially awkward offline, some people are socially awkward online as well.

That said, does it really matter?

If an author is promoting his/her work online, social awkwardness could cause some problems. But there are many examples of successful online promotion, and adapting one of them for an author’s business purposes is one alternative. Another alternative might be to create an online persona in the same manner an author creates an interesting character.

PG has been on the internet for over 30 years and has seen what might be an analog for community standards develop over time, then pretty much disappear as the number of online interactions increased and communities proliferated to the point where you can pretty much find any sort of community (with accompanying standards) you might like.
