The most prestigious law school admissions discussion board in the world.

REMINDER: Zuckerfuck is *literally* evil incarnate




Date: May 23rd, 2016 8:59 AM
Author: buck-toothed dashing market

If you bothered to read the fine print when you created your Facebook account, you would have noticed just how much of yourself you were giving over to Mark Zuckerberg and his $340 billion social network.

In exchange for an admittedly magical level of connectivity, you were giving them your life as content — the right to run ads around video from your daughter’s basketball game, pictures from your off-the-chain birthday party, or an emotional note about your return to health after serious illness. You also gave them the right to use your information to help advertisers market to you, based on your likely state of pregnancy or new place among the consciously uncoupled.

There are privacy protections. Facebook says it will not share your identity with advertisers without your permission. And you can set limits on what they can know. But at the heart of the relationship is a level of trust and a waiving of privacy that Facebook requires from its users as it pursues its mission to “make the world more open and connected.”

But how open is Facebook willing to be in return? The way it initially handled this month’s flare-up over accusations of political bias in its Trending Topics feed can only lead to this answer: not very.

And that should concern anyone of any political persuasion as Facebook continues to gain influence over the national — and international — conversation.

That influence comes through the astounding growth in its users — 1.6 billion people and counting. Increasingly, those users are spending time on Facebook not only to share personal nuggets with friends, but, for more than 40 percent of American adults, according to Pew Research Center, to stay on top of news, which flows in and out of their individually tailored and constantly updating Facebook News Feeds.

That has helped chip away at the centrality of destination news sites like The New York Times, The Washington Post, the right-leaning Daily Caller and the left-leaning Talking Points Memo. Their articles must now vie for attention in the Facebook algorithms that help determine which items will be prominent in which users’ feeds and which will be highlighted in the Facebook Trending section that is prominent on users’ home pages.



So Facebook, born of the open Internet that knocked down the traditional barriers to information, becomes a gatekeeper itself. It now has an inordinate power to control a good part of the national discussion should it choose to do so, a role it shares with Silicon Valley competitors like Google and Twitter.

It’s a privileged position that Facebook won through its own ingenuity and popularity. Mr. Zuckerberg seemed to approach this new perch with a solemn sense of responsibility when he took the company public in 2012, swearing in an investor letter, “We believe that a more open world is a better world.”

And yet…

There we were earlier this month, with Facebook ensnared in one of those big public relations crises for which openness is always the best salve. The report in Gizmodo that Facebook had a team of editorial contractors who injected their own judgment into its computer-generated Trending list — and at times suppressed “news stories of interest to conservative readers” — ran without a response from Facebook, which ignored Gizmodo’s detailed questions.

Then came the slow and awkward response. There was the initial statement that Facebook could find “no evidence” supporting the allegations; Facebook said it did not “insert stories artificially” into the Trending list, and that it had “rigorous guidelines” to ensure neutrality. But when journalists like my colleague Farhad Manjoo asked for more details about editorial guidelines, the company declined to discuss them.

Only after The Guardian newspaper obtained an old copy of the Trending Topics guidelines did Facebook provide more information, and an up-to-date copy of them. (They showed that humans work with algorithms to shape the lists and introduce headlines on their own under some circumstances, contradicting Facebook’s initial statement, Recode noted.) It was openness by way of a bullet to the foot.

As his staff prepared answers to pointed questions from Senator John Thune of South Dakota, Mr. Zuckerberg took another step into the sunshine last week by holding a grievance session at Facebook’s campus with conservative commentators and media executives, including the Fox host Dana Perino, the Daily Caller editor Tucker Carlson and the Blaze founder and commentator Glenn Beck, who wrote a defense of Facebook afterward.

Many of Mr. Zuckerberg’s visitors seemed at least temporarily placated by his explanation: that Facebook had so far found no systemic attempt to excise conservative thought from the Trending list, and that any such move would harm Facebook’s primary imperative (which is, in lay terms, to get every single person on earth to spend every waking moment on Facebook and monetize the living expletive out of it).

But a more important issue emerged during the meeting, one that had been lying beneath the surface for a while now: the power of the algorithms that determine what goes into individual Facebook pages.

“What they have is a disproportionate amount of power, and that’s the real story,” Mr. Carlson told me. “It’s just concentrated in a way you’ve never seen before in media.”

What most people don’t realize is that not everything they like or share necessarily gets a prominent place in their friends’ News Feeds: The Facebook algorithm sends it to those it determines will find it most engaging.
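To make that idea concrete: an engagement-ranked feed can be sketched as "score each candidate post for this particular viewer, then show only the top few." This is a purely illustrative toy — Facebook's actual ranking system is proprietary and far more complex, and every function name, field and weight below is invented for the example.

```python
# Toy sketch of engagement-based feed ranking. All names and weights are
# hypothetical; this is NOT Facebook's actual algorithm.

def predicted_engagement(viewer_affinity: float, post_likes: int, hours_old: float) -> float:
    """Toy score: closer friends and fresher, more-liked posts rank higher."""
    return viewer_affinity * (1 + post_likes) / (1 + hours_old)

def rank_feed(posts, viewer_affinities, top_k=2):
    """Return the ids of the top_k posts this viewer would see, best first."""
    scored = [
        (predicted_engagement(viewer_affinities[p["author"]], p["likes"], p["hours_old"]), p["id"])
        for p in posts
    ]
    scored.sort(reverse=True)  # highest predicted engagement first
    return [post_id for _, post_id in scored[:top_k]]

posts = [
    {"id": "A", "author": "alice", "likes": 50, "hours_old": 10},
    {"id": "B", "author": "bob",   "likes": 5,  "hours_old": 1},
    {"id": "C", "author": "alice", "likes": 2,  "hours_old": 2},
]
affinities = {"alice": 0.9, "bob": 0.2}  # how close the viewer is to each friend
print(rank_feed(posts, affinities))  # → ['A', 'C']
```

The point of the sketch is the one the column is making: the same shared post gets different placement for different viewers, because the score depends on the viewer, not just the post.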

For outlets like The Daily Caller, The Huffington Post, The Washington Post or The New York Times — for whom Facebook’s audience is vital to growth — any algorithmic change can affect how many people see their journalism.

A cautionary tale came in 2014. The news site Upworthy was successfully surfing the Facebook formula with click bait headlines that won many eyeballs. Then a change in the Facebook algorithm punished click bait, which can tend to overpromise on what it links to. Steep traffic drops followed. (Upworthy has recovered, in part by relying more on video.)

Throughout the media, a regular guessing game takes place in which editors seek to divine how the Facebook formula may have changed, and what it might mean for them. Facebook will often give general guidance, such as announcing last month that it had adjusted its programming to favor news articles that readers engage with deeply — rather than shallow quick hits — or saying that it would give priority to Facebook Live videos, which it is also paying media companies, including The New York Times, to experiment with.

This gives Facebook enormous influence over how newsrooms, almost universally eager for Facebook exposure, make decisions and money. Alan Rusbridger, a former editor of The Guardian, called this a “profound and alarming” development in a column in The New Statesman last week.

For all that sway, Facebook declines to talk in great detail about its algorithms, noting that it does not want to make it easy to game its system. That system, don’t forget, is devised to keep people on Facebook by giving them what they want, not necessarily what the politicos or news organizations may want them to see. There can be a mismatch in priorities.

But Facebook’s opacity can leave big slippery-slope questions to linger. For instance, if Facebook can tweak its algorithm to reduce click bait, then, “Can they put a campaign out of business?” asked John Cook, the executive editor of Gawker Media. (Gawker owns Gizmodo, the site that broke the Trending story.)

No Facebook executive would discuss it with me on the record. That’s not the only reason this column may seem a little cranky. My Facebook Trending list this week included this beaut: “Gastroesophageal Reflux Disease.” First of all, it was dyspepsia, and it was, like, 20 years ago. See that? I shared something pretty revealing. Your turn, Mr. Zuckerberg.

(http://www.autoadmit.com/thread.php?thread_id=3231638&forum_id=2#30538224)




Date: May 23rd, 2016 12:14 PM
Author: Lascivious potus associate

His "money" is a lie and he's an ugly rwink Jew scammer with an ugly wife. I only post crap on there and most of the accounts are fakes

(http://www.autoadmit.com/thread.php?thread_id=3231638&forum_id=2#30539187)




Date: May 23rd, 2016 9:02 AM
Author: buck-toothed dashing market

Facebook is the world’s most influential source of news.

That’s true according to every available measure of size — the billion-plus people who devour its News Feed every day, the cargo ships of profit it keeps raking in, and the tsunami of online traffic it sends to other news sites.

But Facebook has also acquired a more subtle power to shape the wider news business. Across the industry, reporters, editors and media executives now look to Facebook the same way nesting baby chicks look to their engorged mother — as the source of all knowledge and nourishment, the model for how to behave in this scary new-media world. Case in point: The New York Times, among others, recently began an initiative to broadcast live video. Why do you suppose that might be? Yup, the F word. The deal includes payments from Facebook to news outlets, including The Times.

Yet few Americans think of Facebook as a powerful media organization, one that can alter events in the real world. When blowhards rant about the mainstream media, they do not usually mean Facebook, the mainstreamiest of all social networks. That’s because Facebook operates under a veneer of empiricism. Many people believe that what you see on Facebook represents some kind of data-mined objective truth unmolested by the subjective attitudes of fair-and-balanced human beings.

None of that is true. This week, Facebook rushed to deny a report in Gizmodo that said the team in charge of its “trending” news list routinely suppressed conservative points of view. Last month, Gizmodo also reported that Facebook employees asked Mark Zuckerberg, the social network’s chief executive, if the company had a responsibility to “help prevent President Trump in 2017.” Facebook denied it would ever try to manipulate elections.

Even if you believe that Facebook isn’t monkeying with the trending list or actively trying to swing the vote, the reports serve as timely reminders of the ever-increasing potential dangers of Facebook’s hold on the news. That drew the attention of Senator John Thune, the Republican of South Dakota who heads the Senate’s Commerce Committee, who sent a letter on Tuesday asking Mr. Zuckerberg to explain how Facebook polices bias.

The question isn’t whether Facebook has outsize power to shape the world — of course it does, and of course you should worry about that power. If it wanted to, Facebook could try to sway elections, favor certain policies, or just make you feel a certain way about the world, as it once proved it could do in an experiment devised to measure how emotions spread online.

There is no evidence Facebook is doing anything so alarming now. The danger is nevertheless real. The biggest worry is that Facebook doesn’t seem to recognize its own power, and doesn’t think of itself as a news organization with a well-developed sense of institutional ethics and responsibility, or even a potential for bias. Neither does its audience, which might believe that Facebook is immune to bias because it is run by computers.

That myth should die. It’s true that beyond the Trending box, most of the stories Facebook presents to you are selected by its algorithms, but those algorithms are as infused with bias as any other human editorial decision.

“Algorithms equal editors,” said Robyn Caplan, a research analyst at Data & Society, a research group that studies digital communications systems. “With Facebook, humans are never not involved. Humans are in every step of the process — in terms of what we’re clicking on, who’s shifting the algorithms behind the scenes, what kind of user testing is being done, and the initial training data provided by humans.”

Everything you see on Facebook is therefore the product of these people’s expertise and considered judgment, as well as their conscious and unconscious biases, apart from possible malfeasance or potential corruption. It’s often hard to know which, because Facebook’s editorial sensibilities are secret. So are its personalities: Most of the engineers, designers and others who decide what people see on Facebook will remain forever unknown to its audience.

Photo credit: Stuart Goldenberg

Facebook also has an unmistakable corporate ethos and point of view. The company is staffed mostly by wealthy coastal Americans who tend to support Democrats, and it is wholly controlled by a young billionaire who has expressed policy preferences that many people find objectionable. Mr. Zuckerberg is for free trade, more open immigration and a certain controversial brand of education reform. Instead of “building walls,” he supports a “connected world and a global community.”

You could argue that none of this is unusual. Many large media outlets are powerful, somewhat opaque, operated for profit, and controlled by wealthy people who aren’t shy about their policy agendas — Bloomberg News, The Washington Post, Fox News and The New York Times, to name a few.

But there are some reasons to be even more wary of Facebook’s bias. One is institutional. Many mainstream outlets have a rigorous set of rules and norms about what’s acceptable and what’s not in the news business.

“The New York Times contains within it a long history of ethics and the role that media is supposed to be playing in democracies and the public,” Ms. Caplan said. “These technology companies have not been engaged in that conversation.”

According to a statement from Tom Stocky, who is in charge of the trending topics list, Facebook has policies “for the review team to ensure consistency and neutrality” of the items that appear in the trending list.

But Facebook declined to discuss whether any editorial guidelines governed its algorithms, including the system that determines what people see in News Feed. Those algorithms could have profound implications for society. For instance, one persistent worry about algorithmically selected news is that it might reinforce people’s previously held points of view. If News Feed shows news that we’re each likely to Like, it could trap us into echo chambers and contribute to rising political polarization. In a study last year, Facebook’s scientists asserted the echo chamber effect was muted.

But when Facebook changes its algorithm — which it does routinely — does it have guidelines to make sure the changes aren’t furthering an echo chamber? Or that the changes aren’t inadvertently favoring one candidate or ideology over another? In other words, are Facebook’s engineering decisions subject to ethical review? Nobody knows.

The other reason to be wary of Facebook’s bias has to do with sheer size. Ms. Caplan notes that when studying bias in traditional media, scholars try to make comparisons across different news outlets. To determine if The Times is ignoring a certain story unfairly, look at competitors like The Washington Post and The Wall Street Journal. If those outlets are covering a story and The Times isn’t, there could be something amiss about The Times’s news judgment.

Such comparative studies are nearly impossible for Facebook. Facebook is personalized, in that what you see on your News Feed is different from what I see on mine, so the only entity in a position to look for systemic bias across all of Facebook is Facebook itself. Even if you could determine the spread of stories across all of Facebook’s readers, what would you compare it to?

“Facebook has achieved saturation,” Ms. Caplan said. No other social network is as large, popular, or used in the same way, so there’s really no good rival for comparing Facebook’s algorithmic output in order to look for bias.

What we’re left with is a very powerful black box. In a 2010 study, Facebook’s data scientists proved that simply by showing some users that their friends had voted, Facebook could encourage people to go to the polls. That study was randomized — Facebook wasn’t selectively showing messages to supporters of a particular candidate.

But could it? Sure. And if it happens, you might never know.

(http://www.autoadmit.com/thread.php?thread_id=3231638&forum_id=2#30538232)




Date: May 23rd, 2016 12:17 PM
Author: scarlet pontificating whorehouse fortuitous meteor

"you were giving them your life as content — the right to run ads around video from your daughter’s basketball game; pictures from your off-the-chain birthday party, or an emotional note about your return to health after serious illness. "

anyone who poasts these things on FB deserves everything they get

(http://www.autoadmit.com/thread.php?thread_id=3231638&forum_id=2#30539204)




Date: May 23rd, 2016 12:28 PM
Author: Drab free-loading base



(http://www.autoadmit.com/thread.php?thread_id=3231638&forum_id=2#30539257)




Date: May 23rd, 2016 12:46 PM
Author: Lascivious potus associate

His 300+ billion is a lie, no one uses this spam fraud, most of it always has been lies and most of the accounts are fakes

(http://www.autoadmit.com/thread.php?thread_id=3231638&forum_id=2#30539373)