Bias? In My Algorithms? A Facebook News Story

Here’s an idea. Even algorithms
can’t save Facebook from the bias of
reporting the news. [THEME MUSIC] In case you missed the news
because it wasn’t trending on Facebook, Facebook’s trending
news team has been in the news. Not long ago, the
whole department got the axe after
Gizmodo reported that they’d been suppressing
conservative news items and sources. This caused a stir, and
perhaps rightfully so. Facebook is used by
all stripes of people with all manner of
beliefs and politics. It’s where those people
go to get their news. Writing for the
“New York Times,” Farhad Manjoo described Facebook
as the world’s most influential source of news according to
every available measure of size. And it would be dismaying,
to say the least, to learn that your
news source suppresses topics most important to you. Also dismaying, because
Facebook can actually sway public opinion. Enough people use Facebook,
and Facebook is good enough at directing user experience,
that it is as, if not more, capable than a standard
news outlet or The Oatmeal when it comes to moving the
needle on public perception. I attribute
Sriracha’s popularity significantly to Matthew Inman. So, but OK– a timeline. In early May, Gizmodo posts
their expose, and Facebook denies the presence of bias. The Senate Commerce
Committee then demands insight into
Facebook’s editorial system and wants a list of the news
sources removed or blocked from trending. On August 26,
Facebook announces that its trending news
team would be axed and that the trending
process would be significantly more, though
not entirely, algorithmic. On August 29, an article
incorrectly claiming Fox News anchor
Megyn Kelly was fired for secretly supporting
Hillary Clinton trends. The Trending News
topics continue to be a shadow of their
former usefulness, let’s say. On August 30, Digiday
posts an interview with an anonymous former
Facebook Trending News editor who confirms that their
team, like much of Silicon Valley, was left-leaning, and
that though they were a news team in name, there was very
little editorial oversight until Facebook got called out. In their words, “It
never seemed like anyone in the company ever actually
understood what we did or understood how the
topics were curated.” Now it’s very easy
to understand. Robot brains. In light of this
whole thing, there are then two questions
that I want to ask. The first is why is
there an expectation that Facebook have zero bias? And second, what
does Facebook do in light of that expectation? Right off the bat, we
should talk about a thing that we’ve already talked about. The expectation that any news
source will be free from bias is an unfair one. In the past, we’ve
quoted Glenn Greenwald, who has said that
there’s no such thing as neutrality,
only transparency. Journalist and
Professor Jay Rosen talks about how the
view from nowhere, the capability to
report on events without personal
inclination, is mythological. Neutrality and the
view from nowhere are vestiges of a traditionally
positivistic approach to journalism which, we
learn over and over again, is too high an ideal
for mere modern mortals. And this is just as true
for what Facebook does– aggregation, curation,
editorial, publishing– as it is for literal,
on-the-ground reporting done by people. Miss, for $1, can
you believe this is Blake Lively without makeup? No. She looks good. OK, so, the first
source of an expectation that Facebook itself is unbiased
is probably the fact that news feeds are populated by friends. Yes, there is a small cadre of
contrarians in everyone’s feed. But for the most part– and
Facebook understands this, so they design for it– Facebook
is for seeing things you like, things that you
agree with, things that you are entertained by. That’s why the contrarians
are so frustrating. They ruin Facebook. The idea here is
that Facebook mostly shows things which align
with one’s world view. Because hey, you are
the editorial director of your friendships. And Facebook is the positive
reaction-skewing editorial director of your
friendships on Facebook. So one may start to think
of Facebook as unbiased, because on two levels,
it works to show things that align with one’s own bias. That’s how ideology functions–
the complete absence of tingle is how you know it’s working. The Trending News pane is
a little different, though. That purports to show topics
that a large number of people are sharing and talking about. If you’re conservative and
you discover that that’s not exactly the case,
well, then that’s grist for the theory
mill– the theory that the mainstream media
colludes to silence, or at least not
spotlight, your viewpoints because the liberal elite
is in your internet messing with your hashtags. Which is exactly what happened. But also, I mean,
this is a thing that’s widely understood
about publishers and news organizations– that
they have politics. Facebook trafficking in slant
is no different from the rest of the news media. They may not have the
same editorial oversight and standards as
other news giants– and that is a
lamentable problem. But just like “where
there’s smoke, there’s fire,” where there’s
editorial, there’s partiality. And that is true even when the
editorial is guided by code. No one is hanging other
major news publishers out to dry for their
preferences, because duh, people have preferences. But is Facebook people? It’s a weird question to ask. But before this
debacle, there was not widespread understanding that
Facebook does or even could have bias. Arguably, even Facebook
didn’t understand. Why? Because algorithms, I think. Facebook is a technology
product run by algorithms. What’s an algorithm? Well, it’s not a
person, that’s for sure. It’s the result of
programming and computation and, fundamentally,
dispassion– but also not. Insofar as data
collected about the world contains and reflects bias,
which it almost certainly does, algorithms that
operate on that data will also contain
and reflect bias. This is true down to
data sets as seemingly neutral as whole languages. Source. For real. It’s like a– it’s
a recent study. You should read it. The practice of
thinking that because a technological operation
based on data has transpired, its result must be
free of bias, is something former Kickstarter
VP of Data Fred Benenson has described as mathwashing. Quote, “Using math
terms– algorithm, model, et cetera– to paper over
a more subjective reality. Algorithm and
data-driven products will always reflect the
design choices of the humans who built them,”
Benenson said. “And it’s irresponsible to
assume otherwise.” Basically, where
we may understand news publishers and
newsrooms and reporters to have bias because they’re
people, for better or worse, computers acting autonomously
with algorithms and models are often thought to have
none, even though that is demonstrably not the case. Sources. So many source–
endless source– an embarrassment of sources. They’re all good. Read them all. Even trending, which sounds like
it should be really simple– like, oh, it’s just popular. That is a complicated thing. You know what’s always trending? Weather. Lunch. Right off the bat,
editorial is required, unless you want #LUNCH in
your face all day, every day. It’s also in the economic
interest of publishers to catch things as, if not before,
they trend so that they can ride the wave of eyeballs and
clicks, which means trending is often not a fact,
but a forecast. And we all know the
reputation forecasts have. For a lot of
people, though, this is exactly the value of
trending items– not what is but what will be
popular, because being ahead of the curve is cool. Also, are trends in
an area or worldwide? Are they sorted by language? Facebook is a global service,
so pure trending topics would likely never be in
just your native tongue. A mistaken perception that
algorithms “just get it done” causes flabbergastery when human
involvement in trending topics, which is the norm
from the get-go, is reported on in connection
with editorial politics, a thing which is inseparable
from every form of news, even if it’s just curation. Even if it’s
algorithmic curation. As sociologist Zeynep Tufekci
said to the “Wall Street Journal,” “Choosing what to
highlight in the Trending section, whether by
algorithm or humans, is an editorial process.” Facebook is in a “get you a
man who can do both” territory, except on a double axis–
algorithmically-powered social connection and news provision
for any given person along the political spectrum. This is tough for a
thousand reasons– the political climate in
general, the political climate of the tech sector,
political discourse online, ongoing social
media design problems– but it would seem,
also, Facebook itself. Despite somehow becoming
a media giant, a publisher and aggregator of content
and news and stories, they simply don’t
seem to have or be interested in strong
editorial standards and practices, perhaps
because of a view that algorithms make
those things unnecessary. Well, it would seem, in
fact, that algorithms may make them more necessary. Georgia Wells for “The
Wall Street Journal” points out, “In recent
days the ‘trending’ lists have appeared more flawed than
when humans were in charge. There have been false stories,
misidentified keywords, and celebrity gossip in the
place of more serious news.” So what’s Facebook’s
response here? They could develop
an ecumenical, politically diverse
if not divergent newsroom, or newsrooms. Abandon the fairy tale of
algorithmic objectivity, even as it relates to trending,
and just focus on coverage. But can they? Will they? The trending editor that spoke
with Digiday guessed that Trending News would probably
just be shut down entirely, maybe not because it doesn’t
work, but because it can’t. What do y’all think? Why is there an expectation
that Facebook be unbiased? And is it possible for Facebook
to meet that expectation? And if so, how? Let us know in the comments, and
I will respond to some of them in next week’s Comment
Response video. In this week’s Comment
Response video, we talked about your
thoughts regarding “First!” If you want to
watch that one, you can click here, or find
a link in the doobly-doo. One small bit of news. Next week on the 22nd, here
in New York City, PBS Digital Studios is going to be hosting
a Nerd Night at the YouTube Space, where I’m going
to be giving a talk, and Joe Hanson’s going to
be giving a talk, and Sarah from The Art Assignment, and
everybody’s going to be there, talking about stuff that
they’re interested in. So we’ll put a link to
that in the doobly-doo. It is a free event. It is 18-plus. So yeah, maybe
we’ll see you there. We have a Facebook, an
IRC, and a Subreddit, links in the doobly-doo. And the Tweet of the Week
comes from Nathan Scott, who points us towards an article
about Twitch Chat beating a chess grandmaster,
which is very related to the stuff I’m going to be
talking about at Nerd Night. And last but
certainly not least, this week’s episode would not
have been possible or good without these human editors. [THEME MUSIC] [MUSIC PLAYING]
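The episode's claim that algorithms operating on biased data will contain and reflect that bias can be made concrete with a toy sketch. Nothing here is Facebook's code; the corpus, labels, and words are invented for illustration.

```python
# Toy illustration: an "objective" scoring routine inherits whatever slant
# its training text contains. The corpus and labels are made up.
from collections import Counter

corpus = [
    ("the rain ruined our picnic", -1),
    ("ugh rain again today", -1),
    ("sunshine made the festival great", 1),
    ("beautiful sunshine all weekend", 1),
]

# Learn a per-word score purely from how each word was talked about.
totals, counts = Counter(), Counter()
for sentence, label in corpus:
    for word in sentence.split():
        totals[word] += label
        counts[word] += 1

def word_score(word):
    """Average label of the sentences the word appeared in."""
    return totals[word] / counts[word] if counts[word] else 0.0

# Rain is not objectively bad, but the data says otherwise.
print(word_score("rain"))      # -1.0
print(word_score("sunshine"))  # 1.0
```

Nothing in the scoring routine singles out rain, yet "rain" comes out negative purely because of how the training sentences talked about it; the same mechanism scales up to models trained on whole languages.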
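The episode's point that "trending" needs editorial judgment, unless you want #LUNCH in your face all day, can also be sketched. One common approach (an assumption here, not Facebook's disclosed method) is to rank topics by how far today's volume departs from a historical baseline, so perennially popular topics never qualify. All topic names and numbers below are invented.

```python
# Toy sketch of why "trending" is more than raw popularity: compare today's
# mention count to a topic's historical baseline. Perennial topics like
# #lunch are always big but never spike, so they never trend.

# (topic, mentions today, average mentions per day historically)
topics = [
    ("#lunch",       90_000, 88_000),  # huge but flat: not news
    ("#weather",     70_000, 69_000),
    ("#debatenight", 40_000,  2_000),  # modest but spiking: trending
    ("#localteam",    8_000,    500),
]

def trending(topics, min_ratio=3.0):
    """Rank topics by how far today's volume exceeds the baseline."""
    spikes = [(name, today / max(baseline, 1))
              for name, today, baseline in topics]
    return [name for name, ratio in sorted(spikes, key=lambda t: -t[1])
            if ratio >= min_ratio]

print(trending(topics))  # ['#debatenight', '#localteam']
```

#lunch and #weather are the biggest topics by raw count, yet they are filtered out; and even this "neutral" rule embodies editorial choices, such as the spike threshold and the baseline window.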


100 thoughts on “Bias? In My Algorithms? A Facebook News Story”

  • Simply put: Algorithms will naturally select popular news stories over controversial ones. Conservative posts may be popular, but they are also generally antagonistic toward people, or perceived as such, and thus tend to be more controversial (especially with a Republican candidate like Trump, one of the most controversial and antagonistic candidates in recent memory).

  • Btw it's true that algorithms are biased, but I think the cause is way more about the bias in the dataset than in the code ^^ All these algorithms process human generated information, so say the most basic sentiment analysis will see the rain as negative because it's how it's talked about while it's not objectively inherently negative (hard to find a great PC example here ^^). So I guess we should really be careful about what we're teaching algorithms with all these Hollywood movies #representation #etc ^^ ?

  • Rodrigo Ortiz Vinholo says:

    People don't expect Facebook to be biased because Facebook is seen as a free space, a public space. People see Facebook as a place where you can express yourself. They tend to forget that there are rules, and that if you post on someone's profile page or comment on their stuff your comment can be deleted and you can be blocked. When that happens, they see it as a violation of the rights, of free speech, ignoring that well, they never actually did have free speech with no consequences in Facebook. If you are the admin of a group and you block someone, people will complain. If you are the owner of a page and you erase someone's comment, you'll hear them complain about their rights. With that in mind, it's easy to see why people don't seem to expect Facebook to be biased, even though it invariably would. If people think Facebook is a free place, bias seems immediately like censorship, like a lack of freedom of sorts.

  • There's this… myth… that anything put on the internet, stays on the internet. This is false. Cloud computing, open source programming, and microsoft's willingness to leave the back door open has destroyed this. The internet I grew up with is dead, and this abomination will follow it soon. All hail Ad Block!

  • Algorithms certainly contain bias, but I think it's important to note that this doesn't necessarily mean that the bias reflects the bias of those designing the algorithm, or that they are even aware of the bias. With modern algorithms such as neural networks, decision making for computers has reached the point where the people running the algorithm don't always even understand how it works, let alone control the bias. Not saying there aren't ways to control the bias intentionally, but control certainly isn't implied.

    Furthermore, it's impossible for a piece of content, regardless of how it was created, to be unbiased in everyone's eyes. Whether or not something is biased, or how biased it may be, is subjective. Which means that people have biases about biases….

  • Here's an idea: if mathematics describes everything, including life and society, and since people can be biased, therefore mathematics has bias built into it. BOOM! It's the base processing language that runs everything and people on top of it, so if people are biased, the math that runs us is too.

  • BariumCobaltNitrog3n says:

    There could be some sorting or filters available to FB users, like trending in my city, pick a city, trending among females 20-49, most shared, liked, commented on, and so on. On YouTube trending is always, ALWAYS movie trailers (which I literally never watch) in the top 25 or so slots. Which kind of makes sense in the same way the space bar is the most used key. Trailers are what people watch between videos.

  • Here's an idea: use tech and people to control truth in media. Make any source of the news, and people making claims, submit to a polygraph to prove that they at least believe it to be true. It would slow the news down, but is quantity/speed more important than quality?

  • One thing is removing topics/stories from the trending list; that might not be the "worst" thing… it's a whole other ball game when they remove posts by users due to "sensitive content", when the sensitive content was just talking about the current political climate and not having the right political opinions. Then they are removing people's ability to discuss/criticize ideas.

  • You know, Mike. Sometimes when you read from your sources you often do it with so much conviction and emotional authority I find it an otherworldly difficulty to research these things myself rather than simply appealing to your authority.

  • Going off your idea that algorithms can have a bias, can it be said that a website like Reddit can be similarly subject to bias despite its ability to democratise its news? I mean sure, every individual has a bias when they comment; bringing in their own experiences, socio-political views, historical perspective, etc. However, on the whole I feel as if the news that I gather from there remains reasonably balanced. Or maybe I'm just living in a bubble of confirmation bias.

  • hi Mike. It's important to distinguish Machine Learning (ML) algorithms from more classical computer programs. ML algorithms are trained on data and therefore represent the bias in the data and not in its developer.

  • hi Mike. Let's imagine there is this algorithm/A.I. XY who is able to deliver the news in an unbiased way. Would that algorithm be successful? As you said, people like Facebook because they see what they like. If this were not the case, people would get uncomfortable. Facebook usually also is a place for low energy states, like TV, where we go if we are not able or in the mood to come up with the energy to understand and deal with "problems".

  • It's mostly a PR reaction, but more transparency is all they could do to actually fix the problem. But they won't. Because that's not how you make money.

  • The Chopping Block says:

    2:36 No. It's not "unfair". It is unrealistic. See, journalists have been proclaiming impartiality for decades. They learn in their COM/101 course that being unbiased is important to journalistic ethics so they declare themselves to be unbiased while many regular folks actually believe them. We should expect journalists to focus on the facts as much as possible and make an attempt to present both sides of controversial opinions, but they'll tend to pick more articulate, more well spoken proponents of the ideas they support. Most importantly, journalists should be open and honest about their political bias and organizations should either proudly display their bias (a la old school newspapers named after political parties) or at least maintain an equitable blend of biased reporters who support disparate views.

  • Thurston Jaskowiak says:

    Facebook has every legal right to behave this way. It isn't a public space, no matter how it functions at times. If FB wants you to take off your conservative shoes in its house then that's their prerogative. But I don't think it's about that for most people. It's not about legality, it's about morality. In a society built around the free exchange of ideas and the freedom of the individual to speak their mind (triggeredness and emotions be damned), it is hated and met with suspicion (and it should be) when people pounce eagerly on the opportunity to crack down on freedom of expression, simply because they suddenly can. That's the expectation. It's a cultural one. A valuable American one. We don't want things to be purposely ignored or censored or removed or whatever it was that FB did to, as it admitted, "routinely suppress" dissent. That's dictatorial. American history and pop-values teach us to despise it. I think it suggests a movement in the American liberal thinking sphere. One towards the fascistic desire to treat opposing opinions like they do not have a right to exist. Increasingly many, though not all, American liberals have grown a very un-American and unappreciative view concerning the right to personal freedom of thought. The "safe space" culture should be fought against, and its taking root should be met with a healthy fear for the protection and cultivation of freedom that triggers the will to fight it. But FB can do what it wants. In a free market society we vote with our dollars, and in a free-thought-honoring nation we vote with our eyes, attention and now clicks. Use those things to disincentivize dictatorial behavior in America.

  • Yeshua Resurrectus Mortuus says:

    The trending editors did influence trending conservative topics and that isnt good. But have u ever seen religious conservative 'news' sites? They are a joke. They almost always contain falsehoods. The point of an editorial team shud be to filter right?

  • Need I say more?

  • Facebook presents itself as unbiased; that's why bias is an issue. The issue most people have with Fox News isn't the bias most would admit to, but their denial that they are anything but super fair and totally balanced.
    Few people take issue with MSNBC's bias because they don't pretend it doesn't exist. Just like CNN is biased towards airport viewers, MSNBC has a bias; a niche and they don't insist over and over that's not the case.

    Personally, I expect the trending news section on facebook to be one or more of the following: completely objective (with a few omissions like #Lunch [things widely agreed upon to be "topics" that trend without really being news/timely or topical (#lunch isn't really a topic as much as many tiny topics that share a designation)]), factually accurate (or including all of The Onion's stories), reflective of the demographic data about me that they sell to advertisers, reflective of my connections (in an overly complex pagerank-esque algorithm that prefers topics trending among my friends, above that of my friends' friends, above that of my third degree contacts, and so on), etc.

  • What if FB did implement a completely pure trending algorithm, worldwide and based on pure share and mention numbers? It would give the people what they want.

  • "Not even Algorithms can save facebook…"
    I can see why we might hold up algorithms as being utterly dispassionate, what with being made of math and all. That being said, I think it critical to remember that computers ALWAYS ALWAYS do one thing, and one thing only: EXACTLY what you tell them to. It's a little silly to say that a math problem is 'doing something'. That math problem informs computer code, written by people, to do something. Where there are people, there is bias.

  • Program your algorithm to respond randomly to whatever, and that's unbiased. If you say that it is biased towards randomness, well then the word means absolutely nothing at that point.

  • One of the key things going on here, I think, is the legal framework. The TLDR version of the communications decency act section 230 is that someone who's an editor can get in trouble for material posted on their website, while a neutral hosting provider has immunity from liability for defamation in the United States. That framework made sense a couple decades ago when the idea was to distinguish between, say, the New York Times vs. a company hosting a forum that could not possibly read every post made on the forum and would go bankrupt if they were liable for every ranting user.

    But, it's not two decades ago now, and the actual activities that people are doing don't quite match the framework anymore. I think that it's still understood that the vastness and scope of a website like Facebook means that if they were liable for every mean thing that a user posted, it would represent a vast, perhaps even unsustainable cost for them, and that it would not be possible for Facebook, or any other website to actually review every post that every user made prior to making it public to make sure that it didn't contain defamation or anything else illegal. Just not possible. And that means that they can't talk about being editors, or exercising editorial judgement because doing that is the exact language that the statute contemplates as making the company liable for everything that every user says.

  • Facebook has become so widespread and almost necessary for social interaction. Thus, people expect it to be like a public utility, when in fact it's a for-profit private service that e.g. could be denied to anyone, for almost any reason; doesn't have to give anyone free speech rights; and isn't legally obliged to be a neutral platform where everyone can expect to be represented fairly. As any company, it would be entitled to push any agenda they want.
    While people would like it to be otherwise, it's enough to make people think of it as neutral to reap the profit (or not think about the topic at all); and as soon as the outrage fades, they won't have much incentive to actually make it so.

  • We're lucky to get the truth at all, much less unbiased truth. There aren't any laws against the press flat out lying. It's not a crime. And both hands of the media, the left and the right, take advantage of it endlessly.

    Especially since it lets them point and say 'the other guys are lying to you' and distract you from the fact that… they're constantly lying to you too. Because well, both sides… are just two faces of the same beast.

  • I still don't have a Facebook account, and I still don't intend to, so an explanation of how FaceBook actually works is appreciated, by me, at least. Your point is right on about no news source being unbiased; it's impossible to select which news stories to feature without introducing personal biases, even if the stories themselves are scrupulously fair, objective, and unbiased, a task difficult in its own right.

    And as someone who's done some programming, although not professionally, it has long seemed obvious to me that algorithms must necessarily reflect the biases of the programmer(s). When I wrote my maze program, for example, I deliberately tried to select and verify that the algorithm that generated the maze would create sufficiently interesting and complex mazes, mazes that "looked right" to me, without any excessively unusual features or appearances to them.

    Of course, writing an algorithm to select 'trending' news faces other complexities that I didn't face, such as where they find the trending news, and how they convert the vagaries of human language into algorithmically-selectable options. How does an algorithm know that a story about marijuana is pro-prohibition or anti-prohibition, for example, or if it has no relation to prohibition at all?

    Still, since we are talking about the precision of computers and algorithms, I'm not entirely convinced that you can't tweak it to compensate for personal bias. After all, all you really have to do is to adjust the algorithm to a different set of numbers, assuming the accuracy of converting language and news into numbers. No, the real problem with doing this is whether or not we humans would actually be able to recognize truly unbiased results. This is much like the computer problem of random number generation. Truly randomly-generated numbers don't necessarily SEEM random to humans, part of the Pareidolia effect, where we "see" patterns that aren't really there. "Random" number generators tend to use a pre-set selection of numbers that merely seem random to humans, to minimize the Pareidolia effect.

    Similarly, a truly unbiased selection of trending news stories may well still seem to have a pattern of bias in it to us humans.

  • If younger people are more liberal and Facebook is used more by young people, wouldn't the content just naturally be liberal leaning?

  • I find it interesting that Facebook is held to a standard of biaslessness when, as you said, its business model is catering to each person's individual biases. The "trending topic" ad-like thing in the corner has nothing on the main News Feed in terms of engagement, and through that, Facebook tries to surface to each user only what it "believes" that user already agrees with, based on what that user says they like and who/what they interact with on the site. That, more so than any sort of systemic bias, is what I find most concerning about Facebook. If you get your news from Facebook, you will never ever have your worldview challenged in any meaningful way. You're just hanging out in an echo chamber of like-minded folks repeating the shared beliefs that led you (and Facebook) to group you all together in that way. If that echo chamber is your online world, it's easy to dismiss other people with other beliefs, or worse, mock them and demonize them. For a product meant to connect people, it's remarkably good at enabling us to divide and polarize ourselves.

  • Geezers-on-the-go says:

    Machines have algorithms . . .
    And what do people have? . . . algorithms.
    Let's be clear: the entire data set of reality is too large for any person or machine to process, so we resort to shortcuts . . . when it is people we call it judgment, and when it's a computer we call it an algorithm, but it is essentially the same thing. A smaller representation of an incomprehensible whole that allows you to deal with an overwhelming amount of data in a functional way.
    Anyone decrying the algorithms of Facebook is not actually criticizing Facebook. They are complaining that the Facebook algorithms do not match their personal 'judgment'. Plain and simple.
    Get over it.
    I pick my friends because their algorithms match mine.
    I pick my websites for the same reason.
    If the site you are on does not represent your hologram of reality find one that does or create one.
    Just saying.

  • I forgot that Mike was going to say "Content" like a pirate until he said "Content" like a pirate.

    But dance me this: how would you say "content" ( as in "satisfied with what one is or has; not wanting more or anything else") like a pirate?

  • This is one of my favorite channels, I wish it was more popular because it deserves to be recognized because of its "great-ness"

  • Aaron Wrotkowski says:

    It isn't that news can be unbiased, but that it should strive to be unbiased. Much like you can't always tell the truth, but you at least strive and try your best never to lie. That doesn't mean the moment you can't always be something (truthful, unbiased in news) that you drop all pretenses and give up. You continue to work towards an unbiased goal that you will never truly obtain.

  • It strikes me that, in some sense, people perceive Facebook as reality. By which I mean, it is not a source of news, but a part of the world, in the same way as what we see out of our windows. Like our windows, it is a frame through which we see the world, not a source of information, like a discrete newspaper, for example. I think this particular nature lowers people's ability to critically interact with it; in the same way that you believe something you see with your own two eyes out in the world, you do with Facebook too. It simply is real. This is perhaps a rather extreme view, but when I interrogate my own experience of Facebook, it appears almost like a rudimentary reality simulator, feeding me a breadth of information from the outside world, so I don't have to seek it out myself. It's television 2.0. If you let it, it becomes your brain's shortcut to reality. It is for this reason that such outrage appears when we are confronted with the human, directed underbelly of the beast. It never was objective reality, but we are apt to perceive it as such.

  • Didn't Facebook get caught twice manipulating personal feeds now? Or is it thrice now?

    I wouldn't be so much bothered by it being biased for or against my ideals, but when you manipulate what I see in my feeds as a "social experiment", I feel as if the only logical answer to the question is that Facebook shouldn't be relied on for news. At this point, it's on the same level as Wikipedia in regards to serious topics…

  • If lunch is trending or weather, then let the users edit that, not a detached team. You can curate the Trending Now yourself on your feed by just clicking on it.

  • well yea, if I was Facebook I'd suppress the hateful, heartless, harmful, toxic, cancerous jesus freak-confederate flag wielding-evangelical-antiPROGRESS people and tell them to bite my ass, they're the ones who are holding this country back.

    Silver lining: its mostly uneducated, white old people who should be dead soon

  • In not enough words, I believe it is because Facebook is a living room. Not just for those watching, but Facebook itself believes and tries to uphold the living-room mantle set upon it by all those looking to lounge and also be engaged. More should be said. But it won't be.

  • Just as we expect our water companies to comply with certain regulations that go above and beyond what is expected from a normal private corporation, Facebook's service is now so ubiquitous that it feels like a public entity. Thus, we hold it to a higher standard.

  • Leighanna Rose Walsh says:

    Interesting how the anti-conservative "bias" gets the news, but not, for example, the enormous anti-trans bias, or the bias against a number of minority causes. It may be biased toward "liberal" over conservative, since "liberal" is probably more common, but it can deal with actual leftist/social justice content quite aggressively. It empowers oppressors against vulnerable minorities, like the whole Cathy Brennan is a Fake Goth thing, yet refuses to take down actually hateful content, up to and including incitement to violence.

  • I guess you can say that Facebook didn't want to talk about the elephant in the room.
    {exhales with defeat} I need better jokes.

  • Facebook's human-driven news curating was a good thing. That they were suppressing some conservative news is an indication of just how crazy the right-wing bubble can get… and I don't use the term crazy lightly; there are some batshit nutters out there making stuff up from a delusional reality and sharing it as if it were truth. We need moderators to search for the truth, filter out the crazy, and keep the rest.

  • Surfing On Squarewaves says:

    This just makes me sad, and I'm entirely with Facebook on this one. I've never, EVER, thought of Facebook as some kind of news outlet. It's a social network, down which happens to flow things that may or may not be, or resemble, news. I think? Hell if I know. I just use it to write crap and see what friends are doing (sometimes in waaay too much detail, sadly. I don't need to know a list of every TV program from the '70s that you watched today).

  • Part of the reason people expect less bias from internet media in general is the idea that the internet, being so open to all and containing so much raw information, is not a place where things would be corrupt or skewed. The internet is seen as a place filled with poor manners and procrastination, but also a place filled with, if not the truth, then truthS. The internet and the media that propagate through it are somehow part of an "informational revolution" where the truth is always within reach, rather than hidden behind locked doors.
    Even if that were true, I would be willing to bet that at least 70% of Idea Channel's subs have rarely, if ever, checked a linked source. Even IF the truth is indeed within reach of everyone, most people just leave it to leader-like figures to dive into the sea of information and bring back the pearls.

  • My biggest concern is the ability of giant outlets like Facebook to sway public opinion. To me it's a huge responsibility that shouldn't be taken lightly.

  • Hey Mike, just thought you should know that this video, along with the "When is a Troll" video, is missing from the Chronological Order playlist.

  • Why are Americans so ignorant of philosophy, to the point of hating bias? Everyone has bias, EVERYONE (insert Gary Oldman); it is inevitable.

  • Same as it's always been: computers do EXACTLY what humans tell them to do, and then humans get angry because that wasn't actually what we wanted.

  • Wolfgang Winter says:

    This feels like what's going on with the global warming climate models. Because all the scientists want warming to happen at a prescribed rate, they keep forecasting it, but it always falls short of their goals, so now they look like fools.

  • Thinking about it, it makes sense, almost to the point of being obvious. If you boil the term "bias" down to its most abstract idea, what seems to be left is a "criterion." If your criterion for what you read is local information, you will have a bias toward the local; having no idea about the world at large, you will undoubtedly make assumptions based on your region. If your criterion is what is "popular," then the unpopular but true will go unnoticed. In a pre-internet sense, when all news was delivered in papers and on radio and TV, the stations and publishers had criteria for what news they would report on, and then the consumer had criteria for which distributors they would consume from. The news organizations undoubtedly had biases before they had criteria, applying their biases to the news they reported on; the consumer would then develop biases based on the news organizations' biases. Pure "what is trending" topics, because they come from consumers, inherit both of these biases with an added criterion: "what is popular" and what is "buzzing" (as Google once said, lol)

  • News? On Facebook? Facebook? A news source?
    If (when?) big brother rises to power, I will be one of the first to disappear.
    What would Murrow say?

  • Why should they care if they are? What's wrong with suppressing conservatism?
    It's pretty much the same argument as allowing equal time for intelligent design in schools.
    Facebook is about progress not being old fashioned and suppressing new ideas. We don't need to give equal spotlight to stupidity for things to be fair.
    The stupid people have oppressed the free thinkers for long enough and now we're in control. We should use that control for the better.

  • Hooptie Hamburger says:

    Think about all of the libertarian and conservative pages and posts that have been banned from Facebook. Now think about the fact that Antifa's pages on Facebook are still up and running.

  • As time has gone on, conservative views have been labelled far right so the hammer can be dropped on them. What makes me laugh is when the left gets so extreme it pops back into view on the far right. What goes around, comes around. And every group has its day, and should be disbanded once its goals are achieved; otherwise it has to push beyond its original goal to continue to exist, and so typically becomes something unrecognisable. A slippery slope where it can become the very thing it fought against… or worse.

    The "unbiased" requirement is pure bull, as everyone will think their own view is unbiased and that everyone else is full of it.

  • Facebook has been viciously attacking my online business because I posted a Trump shirt for sale: locking my account, disconnecting my business account from Facebook, Messenger, and Instagram, forcing password changes and account verifications, and flagging my personal and business posts. I am under attack and am losing the battle economically.

  • Was forced to watch this by my professor. The speaker is interesting, but I still hate this class and discussion.

  • LOL, the 9/11 truth trending item!

    The robots want us to know the truth. That's why they're trying to suppress them!
