
People are awful. It’s the obvious (but no less depressing) reality that Facebook—and especially Mark Zuckerberg—somehow failed to recognize.

Whether it was a naive belief or a negligent assumption, the fact remains: Zuckerberg built Facebook's content control systems on the core idea that people can police themselves. Now, he's been scrambling for months, if not longer, doubling down on crafting rules to herd his trolling-prone cats. Of course, it's been a dismal failure.

SEE ALSO: Secret Facebook documents reveal how site battles violent content

The hundreds of pages of Facebook moderator guidelines uncovered by The Guardian are a stunningly analog solution for a digital company as advanced as Facebook. They're also the clearest indication yet that Facebook is just making this up as it goes along.

What’s also clear is how deeply ineffectual these guidelines are—not just for Facebook’s army of moderators, but for the two billion users who rely on them to keep their feeds scrubbed of the most disturbing content.

And that content's disturbing.

Reading through the guidelines for Graphic Content, Revenge Porn, and Child Sexual Abuse, it's hard not to be struck by Facebook’s plodding attempts to identify what is and isn’t objectionable, as well as by the base nature of the examples.

Much of what appears in these stark, black and white slides is drawn, it seems, from Facebook itself. It's a soul-shaking window into the dark, animal heart of humans on the Internet. So many angry and awful impulses—and Facebook's been a home to all of them. In a way, it's easy to feel for Facebook and Mark Zuckerberg. This probably isn’t what he had in mind when he created Facebook. He probably should’ve known better, right?

Thing is, Zuckerberg's the product of a relatively privileged and sheltered upbringing. His recent talking tour of the United States is the best evidence we have of this. For example, this is someone who's just discovering that our relationships have a huge influence on our lives, that juvenile detention centers create more criminals than they rehabilitate, and that it’s hard to meet people who might have a positive influence on your life, especially if you stay where you were born. Even if he's learning these things late, better late than never.

Facebook has all the hallmarks of being designed by someone who didn’t understand how their society really works—who couldn’t see that “connecting the world” wasn’t necessarily a positive goal if you didn’t account for the pockets of hate, harm, disassociation, and unrest that define so much of it.


Which makes perfect sense, when you remember that Facebook started as a digital version of the college face book some universities produced to help students get acquainted (my college didn’t have one, but they were popular with Ivy League universities like Zuckerberg’s Harvard).

A university is an ostensibly diverse place, with a wide range of emotional states across the student body, but there’s also a singularity of purpose—getting a college degree—and, usually, more balance on the demographic and socio-economic scale than you might find in an average American city.

It’s easy to connect people who are mostly alike, whose homogeny helps define the kinds of content they share. But outside those Ivy-covered digital walls, things can devolve. Quickly.

In the early days of Facebook, a young Mark Zuckerberg encouraged programmers to "move fast and break things." Even when he adjusted that motto in 2014 to the far less pithy "Move Fast with Stable Infra," it was less about what Facebook was breaking on a societal level, and much more about not breaking the product hundreds of millions of people were already relying on.

Facebook is a very different platform than it was in 2007, or even 2014. The expansion of its sharing tools (most recently live video) and its shift to a mobile-first platform have radically altered its sharing potential. Facebook’s window on the world and its calamities is never closed.

Which is why it’s not entirely surprising that today’s Mark Zuckerberg is a changed man. He’s a searcher who's discovering, however belatedly, that the world's not filled with billions of Harvard students simply looking to connect.

Even as he travels the country, posing for photo ops, speaking like the politician he’ll apparently never be, the frantic reality of Facebook’s internal struggle with harmful content has now been laid bare in these documents. Zuckerberg wants to solve the world’s problems, but clearly has no idea how to fix Facebook. The current solutions smack of desperation.

Instead of using that classic obscenity benchmark—you know it when you see it—Facebook is, in the documents, explaining everything and trying to define context and intent where, in the insane world of online content, there may be none, or it may prove impossible to assign.

Even with algorithms taking the first pass, it’s Facebook's human moderators who are forced to interpret every motive, who decide if something's art or sex, violence or news, hate or free speech.

Oftentimes, there is no objective reality to these situations, or at least not one that falls into any of those binaries. In other words, it’s an impossible solution to an impossible situation.

In the years since he built Facebook, Mark Zuckerberg has grown an adult sense of empathy that shines through in his increasingly personal Facebook posts. That’s encouraging. Facebook, on the other hand, is still just a cold robot, emotionlessly posting the best and worst of us. There's no heart. No conscience. Just the cold abyss of terrible choices for a legion of overtaxed moderators who probably wish the world was a better place.

