FB to curb internal debate over sensitive issues amid employee discord
https://www.wsj.com/articles/facebook-to-curb-internal-debate-over-sensitive-issues-amid-employee-discord-11600368481
ganoushoreilly5 years ago
Good. More companies should move this way rather than the other.
AlexandrB5 years ago
Weird that an outspoken free speech advocate like Mark Zuckerberg would want to limit the speech of his employees.
kurthrAlexandrB5 years ago
Especially since they presumably use FB for internal discussions... Why not use the same (weak) moderation mechanisms everyone else has? Not like you can avoid politics on your regular feed.

I guess he realizes they don't work.

JamesBarneykurthr5 years ago
It seems far easier to police the 50,000 people you write checks to than a billion users.
fatnoahkurthr5 years ago
AFAIK, Facebook is not used for internal discussions. Unlike the Facebook we all have, there are no controls to block users, flag posts as "see fewer like this", etc. I've used those to great effect and see zero political content on my FB feed.
bhupykurthr5 years ago
> Why not use the same (weak) moderation mechanisms everyone else has? Not like you can avoid politics on your regular feed.

Because people don't typically use Facebook at work? It's a recreational tool. The issue is that for Facebook employees, they are working on building that recreational tool, and that hampers productivity and professionalism.

> I guess he realizes they don't work.

Or he realizes that they work for a service that's typically consumed during one's free time, and not as a direct component of one's employment.

ugh123AlexandrB5 years ago
I think the point is that employees that are taking advantage of free speech within the workplace are forcing others into the debate unwillingly, and this is causing strife and resentment.

I applaud the move.

sg47ugh1235 years ago
Just like miscreants are taking advantage of free speech by spreading misinformation and conspiracy theories on FB the platform?
themacguffinmansg475 years ago
Forcing others into debate and spreading conspiracy theories are two different matters. The former impinges on the basic freedom of speech and association for other people, while the latter affects no one's freedoms. The former is a matter of principle, the latter is a subjective judgement.
sg47themacguffinman5 years ago
Where's the proof that others are being forced into debate?
themacguffinmansg475 years ago
Unsure what the GP was referring to, but the article suggests that the pressure comes from activists spamming work feeds:

> “What we’ve heard from our employees is that they want the option to join debates on social and political issues, rather than see them unexpectedly in their work feed.” [said Facebook spokesman Joe Osborne]

Others have organized to skip work in an act of defiance, affecting colleagues who are not engaged in their political cause:

> a group of employees staged a virtual walkout in early June to protest Facebook’s decision to leave up a post from President Trump about social unrest

Not exactly a "gun to your head" type of forced, but these are nonetheless actions designed to foment a political clash and push other employees to engage with certain political causes when they may not want to concern themselves with them at all. What makes this particularly problematic in the workplace is that many employees rely on work feeds and teamwork to earn a livelihood for themselves and their families, so they cannot realistically avoid it.

kelnosthemacguffinman5 years ago
It is kinda funny to me that FB internally has realized that their feed algorithm is awful, but has enough cognitive dissonance built up to not associate that with the public non-work FB as well.
pjc50ugh1235 years ago
"Forcing"? Is there actual force involved, or is it just more speech?
dylan604pjc505 years ago
If you are at work in an open floor plan, just trying to do your job while some of your co-workers are carrying on about whatever, you are pretty much forced to listen to it. You are no longer achieving 100% of what you could be doing because your brain is distracted. That's before you even attempt to participate.
ianmobbsdylan6045 years ago
Facebook has been remote for most of the year and will continue to be remote until next summer, so this is clearly not the case here since this decision is coming now.
derwikidylan6045 years ago
Is that to say, it's less of a problem in companies where there are no open floor plans? Or where everyone is working remotely?
dylan604derwiki5 years ago
No, cubicle farms are just as annoying when people cluster near your cubicle. People just have no consideration. Take it to the break room and/or water cooler. You're clearly not working anyways, so other than hiding "at your workspace" there's no reason to carry on right there.
pjc50dylan6045 years ago
"Free speech is bad because I don't like hearing people talk in an open office" is a really terrible take, and I say that as someone who doesn't like either Facebook's role in far-right politics or people talking in open offices.
kelnosdylan6045 years ago
I mean, that reduces to basically anything that distracts me from work. Ideally there are no non-work conversations of any kind happening in an open floor plan office. And even work conversations should be kept to a minimum, and taken to a conference room if there's going to be any length and substance.

Why focus on political speech when it's really any non-work speech that's distracting and should be banned?

But we all know that's impossible to enforce, and we've developed unfortunate but necessary coping mechanisms, like noise-canceling headphones.

Or maybe, y'know, just give up on the open-office concept, which has always been a productivity destroyer. I imagine post-COVID office spaces will be much more conducive to quiet and productivity.

mhoadAlexandrB5 years ago
I just got done reading this[1] and it's becoming harder and harder to believe the idea that Facebook is just pro "free speech". Their actions aren't matching their words at all.

[1]https://www.bloomberg.com/news/features/2020-09-17/facebook-and-mark-zuckerberg-need-trump-even-more-than-trump-needs-facebook

compiler-guyAlexandrB5 years ago
All but the most strident and radical free-speechers support time, place, and manner restrictions.
patorjk5 years ago
At 3 paragraphs (131 words), that was a really short article. However, I agree with the subheading: "Mark Zuckerberg says employees shouldn’t have to confront social issues in their day-to-day work unless they want to"

That sounds good to me. I've never had to talk about these kinds of things at work. Are there workplaces where this is unavoidable?

daokpatorjk5 years ago
Yes. There are places where employees MUST attend meetings that discuss social differences, inclusion, diversity, etc.
dahartdaok5 years ago
Everyone has to discuss diversity at work because there are laws prohibiting discrimination; that's a good thing. Not everyone has to discuss politics at work.
ikirisdahart5 years ago
I'd love it if there was a difference between the two again.
dahartikiris5 years ago
I think I know what you mean, and me too. But... mandatory diversity training at work is typically very bland, sticks to communicating what the rules are, and is very apolitical compared to forums, employee banter, or stuff you find on the news or Facebook or YouTube. Having been through a bunch of them at several companies, I wouldn't put them in the same camp as 'politics' at all.
stateofnouniondahart5 years ago
> Everyone has to discuss diversity at work because there are laws prohibiting discrimination

How are these two things causally connected in your mind?

dahartstateofnounion5 years ago
Are you implying they're not connected, that discrimination and diversity are unrelated? Your wording makes it sound like some kind of political debate trap you're setting. FWIW, I'm uninterested in arguing here why and whether there should be such laws, or justifying efforts to prevent discrimination. The existence of these laws is a fact, and you're free to study the history and legal precedents for why they exist. Here's a generic but decent starting point: https://en.wikipedia.org/wiki/Discrimination
davidivadaviddahart5 years ago
No. He's implying that the fact something is a law doesn't mean your employer should lecture you about it.
dahartdavidivadavid5 years ago
Why? Isn't making people aware of laws how you abide by them? The risk of not educating people and being caught ignoring abuses is fairly serious. Broadly speaking, some of the laws we're talking about are laws that specifically require making employees aware of their existence.
s1artibartfastdahart5 years ago
I think they are talking about meetings at work which do not focus on the law or workplace etiquette, but serve to raise awareness of diversity/minority/social justice issues outside the workplace.
daharts1artibartfast5 years ago
Where is that happening? And are you sure we're not actually talking about people who misunderstand the purpose and legal requirements of diversity training programs?

As far as I can tell, this is the first comment in the thread to suggest we're talking about widespread mandatory "social justice" meetings that are political in nature and unrelated to the workplace's legal obligations. That's not what I got from @daok's comment or anything in between, and there are widespread mandatory diversity training programs in the U.S. that explain all the comments above, from my perspective.

I've never seen that (mandatory meetings unrelated to work) myself, across employment at 2 multinational corporations, several mid-sized companies, and a handful of startups. The mandatory diversity training programs in most companies are there to meet the legal obligations of discrimination law, whether they tell you that or not. Usually they tell you that.

s1artibartfastdahart5 years ago
I have had department meetings at my workplace in California in the wake of the BLM movement, which also happens to be a multinational with 10s of thousands of employees. Groups of 20 or so employees were put together and encouraged to share stories of how social and racial injustice has impacted their personal lives and ideas for what can be done.

I have a peer who had similar meetings at their firm and the VP of Diversity and Inclusion went as far as to say attendance at such meetings would be tracked on an individual and ongoing basis.

I feel that in general, my company has a very traditional and apolitical work culture, so my imagination runs rampant with what things must be like at FB, Google, & social service sector workplaces.

daharts1artibartfast5 years ago
Okay I totally would expect the VP of Diversity to say things to encourage people to participate in their new diversity program. :P

That's interesting, and I might be out of touch with this year's corporate response to the riots.

kelnosdahart5 years ago
I think just the fact that companies are increasingly hiring executives that head diversity/inclusion departments (mine does as well) shows that they're aiming for a lot more than just compliance with anti-discrimination laws.
dahartkelnos5 years ago
Almost made the same comment myself, I totally agree. Such positions didn't used to exist.
greencabss1artibartfast5 years ago
greencabsstateofnounion5 years ago
BeetleBdahart5 years ago
> Everyone has to discuss diversity at work because there are laws prohibiting discrimination

Sorry, but this is misleading. Most such companies are actively complying with antidiscrimination laws. These diversity discussions/initiatives at work are almost always unrelated to those laws. It's not a case of "Hey, we suddenly realized we're not complying with laws, so let's launch these initiatives." Most such discussions don't have any content about the law or legal aspects because they are not about laws prohibiting discrimination.

dahartBeetleB5 years ago
What are the initiatives about then, if not the laws and/or prohibiting discrimination? Do these companies agree with your claim, are you getting that from HR and/or company lawyers, or is this an opinion?

I've never been to a diversity training program that did not talk about the laws, and I've been to many. You are suggesting that companies are wasting their money for no reason, and consciously running training programs that are unnecessary? Why would "most" companies do that?

> Most such companies are actively complying with antidiscrimination laws.

Complying with US discrimination law means that companies are providing "reasonable accommodations" for people in historically discriminated-against categories, and for anyone who might file a complaint in the future. Educating employees about acceptable behavior is a pretty obvious way to avoid getting sued.

> It's not a case of "Hey, we suddenly realized we're not complying with laws, so let's launch these initiatives."

The risk of running into both PR & legal trouble over diversity complaints has risen over time, due to increasing levels of education, increasing exposure to cases of discrimination and abuse, and generally increasing awareness of diversity issues.

BeetleBdahart5 years ago
I am speaking specifically to diversity initiatives visible to all employees, as opposed to those that are specific to managers or HR.

The law doesn't require us to banish the phrase "master/slave" from our lexicon, yet it is part of our diversity initiative.

The law doesn't require us to banish the words "whitelist/blacklist" from our lexicon, yet it is part of our diversity initiative.

The law requires that we do not discriminate based on a protected category (gender, race, etc). We have always been compliant. However, the law does not require us to hold special recruiting events for underrepresented folks when we are already compliant, yet that is a major part of my company's diversity initiative.[1]

There are pay disparity laws. We were fully compliant before they became the law. They are not part of the diversity initiative and usually not discussed.

The law doesn't prohibit us from saying "Merry Christmas!" but our diversity initiative addresses this.

The list goes on and on.

Note: I am not saying I'm against such initiatives. Merely noting that at least for my company (and jurisdiction), these initiatives are not about what the law requires. We have always had employee expectations training for all employees, and it already covered things required by law, long before the company dove into these initiatives.

It may be that your company's diversity initiative is about legal compliance, but for many companies, it is a lot more than that (by a lot, I mean the majority of the initiative is not required by law).

> The risk of running into both PR & legal trouble

There's a world of difference between PR trouble and legal trouble. Your comment was originally about the legal side, and so is my response. I do believe that most diversity initiatives are about PR, and not about the law, which is why I responded.

[1] These are invite-only, and so it's not open to all groups - just the ones we picked.

dahartBeetleB5 years ago
> these initiatives are not about what the law requires.

Except they are. Your word list is an exceedingly literal straw-man interpretation of a law that is purposefully vague, and doesn’t prescribe which words you can use. You can’t assume that specific policies of a company aren’t there for legal reasons just because the law doesn’t state the exact same policy in the exact same words.

The spirit and the letter of the law, as I already pointed out, requires that employers take “reasonable” actions toward making all employees feel welcome. Society and business have collectively decided that certain words can or do make some people feel unwelcome. Because of that fact, and because companies don’t want their management sued for neglect, we have interpretations of what it means to be compliant that are predictive and speculative. That really does not mean that the initiatives are not about the law, in fact exactly the opposite. Asking people not to use certain words counts as one of those “reasonable” actions, and gives the company a paper trail of attempting to be compliant.

BeetleBdahart5 years ago
Your whole argument is predicated on "The law is vague. Society has decided such phrases make people unwelcome, and thus companies can be penalized for it."

Unfortunately, the only way to know is to test it in the courts. We differ as to where society currently stands on such matters, and it is certainly not clear that were this to go to court, rulings will be made to penalize companies utilizing the lexicon in the manner that it has always been utilized. And of course, my company never said "We should stop using these words because we are interpreting the law this way (or we think society interprets the law this way)." The notion that there may be a lawsuit was not even mentioned with regards to these phrases.

Your argument is "law is intentionally vague, and society believes X". I agree on the former, and not the latter. Unfortunately, anyone can make any statement when it comes to vague laws, which is why having precedent from a court is important.

You've also not addressed the other examples in my comment. My company, which is compliant with the law on hiring-based discrimination, is not going to get sued if they decide not to hold special recruiting events for people of certain protected classes. Your earlier comment asked whether "I am getting it from HR and/or company lawyers", and in this particular case, the answer is definitely "Yes". The need to hold such events has been widely discussed in the company, and it has been made clear that this is a proactive initiative, not required of us at all. Much of the criticism about this particular part of the initiative within the company is very much about "this is not required by law". There is no way HR is going to claim it is.

Furthermore, I just noticed in your earlier reply:

> You are suggesting that companies are wasting their money for no reason, and consciously running training programs that are unnecessary? Why would "most" companies do that?

I suggested no such thing. Companies do a lot of things that are not required by law. I didn't say all such things are a waste and are "for no reason".

dahartBeetleB5 years ago
> Your whole argument is predicated on "The law is vague. Society has decided such phrases make people unwelcome, and thus companies can be penalized for it."

No, my argument is predicated on my experience and discussions with company HR and lawyers at multiple companies. It sounds like your experience differs, and that's fine. My explanation for why the words you chose aren't specifically listed in the law is because the law is intentionally vague, and not prescriptive about which words you can use. The law is stating a goal, and companies are interpreting how to achieve that goal, because they have no other choice.

> Your argument is "law is intentionally vague, and society believes X". I agree on the former, and not the latter.

I feel like this is getting unnecessarily argumentative, and I'm guilty of escalating it. But I did not say, and didn't intend to mean that all of society agrees. However, it's sort of a fact and not a debatable point whether certain groups of people and businesses have decided that some words are sensitive. That's exactly why it's showing up in diversity programs.

> I didn't say all such things are a waste and are "for no reason".

Okay, I apologize for mis-interpreting. You have said multiple times that it is "not about the law" and you haven't offered an alternative explanation. If the reason has nothing to do with the law, then what is it, and why are companies saying it has to do with the law? What is the goal behind the proactive initiatives?

> You've also not addressed the other examples in my comment. My company, which is compliant with the law on hiring based discrimination, is not going to get sued if they decide not to hold special recruiting events for people of certain protected classes.

I tried to address this. Attitudes are changing over time. Being compliant yesterday doesn't necessarily mean you're compliant today, even if the law's wording doesn't change. Growing awareness means that what's "reasonable" is a moving target. Also, we were talking about diversity training, not affirmative action or only hiring discrimination laws. Your company might get sued in the future if it doesn't take reasonable actions along the way to prevent people from feeling marginalized or ostracized, even though it believes it's in line with the law today. That has already happened at other companies, and it's one reason companies are trying to be proactive.

BeetleBdahart5 years ago
> You have said multiple times that it is "not about the law" and you haven't offered an alternative explanation. If the reason has nothing to do with the law, then what is it, and why are companies saying it has to do with the law? What is the goal behind the proactive initiatives?

I'm confused with the question, given that you yourself gave the answer:

> The risk of running into both PR & legal trouble

You yourself stated a reason other than legal.

But even without PR, I'm surprised you're asking. Do you not believe they can be pushing these initiatives because they actually care about diversity? Or because they view it to be a competitive advantage over other companies? Or because they believe diversity will lead to better company performance? The last is one of the main stated goals in my company. I don't know if they themselves believe it, but it's clear that many, many people do.

> I tried to address this. Attitudes are changing over time. Being compliant yesterday doesn't necessarily mean you're compliant today, even if the law doesn't change wording.

I really, really do not see any group winning a court case against a company because they did not have special hiring events for people of their group. Perhaps in the future, but not any time soon. I do not for a second believe my company did this because they were concerned about the law.

If the company's normal recruitment practices are discriminatory towards a certain group, I can understand. That's not the case here. Moreover, even if it were, having such events would not protect them. You can't wipe out discrimination in one part of the company by compensating in another. If your job application page has stuff that discriminates against, say, African Americans, then having special recruiting events for them will not alter the fact that you are discriminating.

> Also, we were talking about diversity training, and not affirmative action nor only hiring discrimination laws.

The thread is about discussions of diversity in the workplace, and we're not limiting it to training.

dahartBeetleB5 years ago
I don't think fears about PR & legal ramifications are so easily separable. PR problems can and do become legal problems very quickly.

Sure, I do think companies care about diversity, and are interested in the competitive advantages. But the only mandatory meetings on diversity I've ever had were about communicating policy that is attempting to adhere to the law, even if in a proactive sense. The goal of the law is to care about diversity and is founded on a belief that a diverse society has a competitive advantage, so I don't necessarily see a hard line between complying with the law and actually caring about diversity.

> I really, really do not see any group winning a court case against a company because they did not have special hiring events for people of their group.

I don't see that either, and it's not something I claimed. We weren't talking about affirmative action, you're moving the goal post. We were talking about widespread mandatory diversity programs.

kelnosdahart5 years ago
> You have said multiple times that it is "not about the law" and you haven't offered an alternative explanation.

In addition to what the parent says about PR, there are some actual good reasons:

1. Some companies are starting to understand and believe that having a diverse team is a competitive advantage when it comes to designing and marketing products intended for a diverse audience.

2. Some companies are starting to believe that it's just the right thing to do to try to increase diversity in their ranks, to attempt to combat systemic sexism and racism that historically has kept certain groups on the sidelines for some roles.

Whether you agree with these things or not, companies are increasingly believing in them, and that's at least a part of why they go far beyond what the law requires when it comes to anti-discrimination. I find it unlikely that a company would lose a court case for not pushing to hire more diverse candidates, or not having implicit-bias training, or not removing terms like master/slave or whitelist/blacklist from their internal lexicon. It does not seem like companies are doing this because they are afraid of running afoul of the law.

dahartkelnos5 years ago
Hey, I do agree with those things. I agree there are some actual good reasons and I agree there are PR reasons; I'm not arguing that. I guess I am miscommunicating, being misleading in a way I don't understand, or my style is grating on some people.

I will just add that I agree that pushing to hire more diverse candidates isn't likely to cause a law suit for most companies. It might for a large company that is lopsided and clearly discriminating, but it'd take evidence which is hard to get. That's really outside the scope of what I thought we were talking about, though, because pushing to hire diversity isn't something that requires all employees to actively participate in the process.

The other two, avoiding implicit bias training and not removing sensitive words, in combination those could cause problems - and I know of companies where they have caused problems. If people actually use words that make multiple employees feel uncomfortable, and the company management has a record of complaints and no record of action to resolve the complaints, there is real liability there in today's world.

By and large I think there's probably a lot more agreement here under the surface than it looks like. My mistake might be failing to clarify that I'm not saying legal reasons are the only reasons. There are other reasons, I'm just saying the legal reasons are usually there, and are important. This is probably getting less true over time, where legal reasons were what it took to get some companies to actually do something, and today growing awareness means that companies are more likely to think it's the right thing to do, more likely to agree with the law, and more willing to begin taking action without any specific legal concern. I guess maybe it's quite a good sign that people here are disagreeing with me because it means things have been going the right direction, compared to my work experience over the last couple of decades.

sidllsdahart5 years ago
> Except they are.

They aren't. In fact, the discussions I've seen at companies in the bay area go far beyond the minimum requirements of the law. They aren't about preventing discrimination, they're about actively increasing diversity (in some areas, but not all).

greencabssidlls5 years ago
strkengreencabs5 years ago
Protecting the company makes them sound like a defensive measure, when they seem equally an offensive measure to get ahead of the competition: "use our product, or come work for us, and feel good about yourself!"
greencabsstrken5 years ago
They cover the company in the case of discrimination lawsuits.
sidllsgreencabs5 years ago
That's a fantastic read; thank you for sharing it.
greencabssidlls5 years ago
Sure! All the articles on that site are fascinating!
an_opabiniaBeetleB5 years ago
It's a really bad look to say that these debates about nomenclature are not worthy of your time, and then go write 11 righteous paragraphs about it.
BeetleBan_opabinia5 years ago
> It's a really bad look to say that these debates about nomenclature are not worthy of your time, and then go write 11 righteous paragraphs about it.

It indeed would look bad if I said any such thing. Fortunately, I did not.

hnreader998dahart5 years ago
Oddly, tech companies are actively and deliberately discriminating on the basis of race and gender in the name of diversity and inclusion. It boggles my mind that none of them have been sued yet for these practices.
daharthnreader9985 years ago
What are you referring to?
hnreader998dahart5 years ago
I should have been more clear. Like someone else mentioned, these diversity (and inclusion) discussions are not related to discrimination laws and equal opportunity. They are always related to increasing diversity in the company by hiring more underrepresented people (previously this just meant women in eng but now it's been expanded to POC). In order to achieve this, companies are actively discriminating against non-minorities, which some may call positive discrimination. Regardless, it's discrimination nonetheless and it's very common yet somehow nobody cares.
daharthnreader9985 years ago
> these diversity (and inclusion) discussions are not related to discrimination laws and equal opportunity.

FWIW, that's a claim that doesn't quite match my experience at work, nor of talking to some of the people who implement these programs. Though to be clear, we were talking about widespread mandatory company-wide meetings, not any old discussion on inclusion that happens to occur while at work. We may need to get more specific about which programs we're talking about; they're certainly not identical everywhere. It also doesn't seem to add up when I read our current laws, which are changing over time.

> In order to achieve this, companies are actively discriminating against non-minorities, which some may call positive discrimination.

The historical term for this is affirmative action (https://en.wikipedia.org/wiki/Affirmative_action), and the idea is to temporarily increase benefits for a disadvantaged group, not to intentionally decrease benefits for the advantaged group. Calling it discrimination, therefore, is a framing that isn't always true, and is somewhat political. It's not always true because things aren't always zero sum. If I choose to give someone a dollar, you don't lose a dollar.

If there really are a fixed number of jobs at a company that is 80/20 men, and the company decides 30% must go to women, then technically yes, that is a form of discrimination. But - just hypothetically - is it a negative discrimination if the reason that the company is 80/20 men in the first place is because it previously discriminated against women, and would have been 50/50 without several decades of history of unspoken discrimination?

It's important to also think about a few things. One, that unlike social prejudices, affirmative action is not intended to be permanent. It's intended to help boost people who've been unfairly and systematically disadvantaged, while they're disadvantaged, and only until things even out. After that, the boost should go away by design. Two, that some of those disadvantages in history have been really extreme, and the kind of discrimination you might imagine you feel when your company tries to hire more women isn't the same order of magnitude as what women and black people as a whole have gone through.

> Regardless, it's discrimination nonetheless and it's very common yet somehow nobody cares.

Is all discrimination bad always? I'm very discriminating about my partners. I'm not sure that nobody cares, I think some people are in favor of seeing that gender and racial injustices actually go away, since not doing anything about it hasn't worked yet.

kelnoshnreader9985 years ago
I don't think you're looking at it from the right perspective, or have the right base assumptions in place.

The thesis here is that current hiring practices are biased (often implicitly and unintentionally) against women and POC. Since removing implicit bias is exceedingly difficult, actively requiring hiring managers to hire more people from underrepresented groups is a way to put your thumb on the scale in order to equalize them.

I can understand how you'd see that as discrimination against white men, and if you squint at it in just the right way, it really seems like it is, but what it's really doing is attempting to reduce an unfair advantage that white men have. No, it's not perfect, and I'm sure occasionally a white male does legitimately get discriminated against. But that's a small price to pay to lift a ton of other people out of the status quo of discrimination they're usually stuck in.

hnreader998kelnos5 years ago
I don't agree.

The fact is that women simply represent a small percentage of the overall workforce in engineering. The only way you can get parity in representation is to get parity in the underlying workforce. The only way to do that is to encourage women to pursue a career in this industry, but that's not something you can change overnight and I doubt companies care enough to invest in something that may pay off in 20 years.

I'm all for doing things that aren't discriminatory and removing unconscious biases in interviews, job descriptions, and whatever, but that will not move the needle. It's a supply issue.

Discrimination is discrimination, no matter how you want to dress it up, and it's never OK.

I'll add that some of my best colleagues have been women. I much rather not work in a sausage fest, but I also don't want to work in a world where active discrimination is supported.

daharthnreader9985 years ago
> Discrimination is discrimination, no matter how you want to dress it up, and it's never OK.

Let’s agree on this 100% and then ask the question: how do we get rid of discrimination? If we have some implicit social bias that is causing a measurable difference in outcome for women, how can we get rid of the bias? If we take it at face value as truth that all discrimination is bad, then no discrimination at all is the ideal. I assume we both agree on that completely. In the meantime, before we’re able to fully eliminate all discrimination, which is better: negative discrimination against women resulting in the outcome of fewer women working and lower pay, or that plus an offsetting positive discrimination that boosts the outcome for women so that there are more in the workforce and the pay is closer to equal?

We can try to push outcomes to be closer to equitable, but the most important question there, I think, is: will the affirmative action actually help remove the original implicit bias against women?

> The fact is that women simply represent a small percentage of the overall workforce in engineering.

That has changed over time, and is different depending on where you live. It went up from 0 a century ago to an average of something like 35% in the 70s, and has declined since then to around 20%. In some countries, the balance is closer to 50%, and in a few places it’s over 50% - spots in India, for example. Isn’t that alone evidence indicating things have not settled, that we can’t rest on some notion that the workforce balance today represents the natural state of things? That we are obligated to ask why, and make sure men aren’t accidentally contributing to the discrepancy? (Especially given that in the past there is a documented history of that happening.)

> The only way to do that is to encourage women to pursue a career in this industry

What if the reason women are choosing not to pursue engineering is because there are still biases, and they know it? Then how would you encourage them?

> Discrimination is discrimination, no matter how you want to dress it up

What if the job you’re talking about being offered to a woman is subsidized and would not have been offered to a man either way? Is that still discrimination?

MertsAdahart5 years ago
I think he's referring to the incident where Google decided to throw out all applications from Caucasian and Asian men back around the beginning of 2018. They were directed to "purge entirely any applications by non-diverse employees from the hiring pipeline". Non-diverse meaning Caucasian or Asian men. They had a diversity problem so they decided to "solve" it in the most racist way possible. They also had another program to assign interviewers based on race and gender to match the applicant. Rather than trying to increase their hiring pool to cover some of the implicit bias, they just decided to double down on the racism.

https://www.theverge.com/2018/3/2/17070624/google-youtube-wilberg-recruiter-hiring-reverse-discrimination-lawsuit

samthecoypatorjk5 years ago
If you work for a large social network with huge social responsibilities, discussions about ethics ought to be unavoidable, in my opinion.

If you're writing accounting software for paper suppliers or something equally banal with few ethical implications, then sure, there's no need (and less reason) to have water cooler conversations about pro-genocide agitprop or whatever.

EDIT to add that of course not all departments at Facebook make the sort of decisions that have a marked social impact. More referring to the content policy teams, and the news feed algo teams, and so on.

toomimsamthecoy5 years ago
> If you work ... with huge social responsibilities, discussions about ethics ought to be unavoidable, in my opinion.

The problem is distinguishing ethics from politics. These are very hard to disentangle, because ethical values are usually based on some political orientation. And I don't want Facebook to be making political decisions on my behalf, as a user. And I don't even want internal employee discussions to be derailed by political considerations.

So how do you distinguish ethics from politics? I don't think it's possible, unless the company defines its own ethical values, a priori, and only considers those when making decisions.

If you read the article, I think that this is precisely what Facebook's new policy is trying to do by putting a fence around "social issues."

t-writescodetoomim5 years ago
> because ethical values are usually based on some political orientation

Are you sure that isn't exactly backwards? Because it definitely should be the other direction in my very strong opinion. One's morals or beliefs on ethics should inform their politics, not the other way around.

One way is based around a person's inner being, the other is molding their being and stances based on a sports team.

sigstoatt-writescode5 years ago
one of you is describing the world as it is, and the other the world as you think it ought to be.
t-writescodesigstoat5 years ago
It's how I vote and how I encourage everyone to vote.
vonmoltkesigstoat5 years ago
I don't agree with that. I don't think people choose their ethics based on their politics, but I do think they compromise them in the name of political tribalism. It's a mistake to ascribe the personal ethics of a specific party member to the ethics of the party platform, particularly when there is limited choice of parties.
samthecoytoomim5 years ago
Facebook must make political decisions, because it must have a stance on content moderation. Even if that stance is "we shouldn't moderate content", that is a political decision. All possible actions and inaction around content moderation require political decisions, and you can't be in the social media business without having a content moderation policy.
accting_discrdsamthecoy5 years ago
I work for an accounting software mega-corp in silicon valley. Our CEO sends company wide emails regarding every notable social issue event. After the George Floyd murder we've been told we need to openly discuss racial issues at work. Managers have been told that they must have these conversations, since if they don't, employees may think that they don't care.

The company's products are in no way social media platforms.

wmfaccting_discrd5 years ago
Maybe you can get some kind of settlement when it inevitably blows up.
jseligerpatorjk5 years ago
It's interesting to watch companies rediscover the old rule about leaving politics and religion at the door.
jjicejseliger5 years ago
It seems like it's slowly died over the past decade. Thankfully, my past two jobs were very work focused without much political involvement.

However, I'm sure it's easier for my jobs since they were for a retail company and an engineering firm.

dylan604jseliger5 years ago
When your company is the place where the rest of the planet gathers to discuss these very topics, it's not easy to avoid talking about them, even just in discussing what your platform is being used for.
munificentjseliger5 years ago
If a business wants its employees to leave politics at the door, the business should too. If Facebook is going to have departments for government affairs, public policy, and lobbying, then it is entirely reasonable for employees to be politically active too.

Otherwise, you're basically saying corporations should participate in the political process but individuals should not. And that's exactly how we got the Earth into the increasingly shitty state it is currently in.

reader_modemunificent5 years ago
That's a straw man argument, just because an employee isn't allowed to bring politics to work doesn't mean they can't be politically active on their own time.
munificentreader_mode5 years ago
So individuals can do recreational politics but corporations are free to sink as much of their resources into it as they want?

Is your claim that we really need more corporate control over politics in the US and less citizen participation?

newcomputermunificent5 years ago
Individuals are also free to sink as much of their resources into it as they want.

The fact of the matter is political conversations have high risk of annoying/frustrating/alienating their participants. To have these conversations at work is just making employees less productive and asking for a controversy.

namuolnewcomputer5 years ago
> To have these conversations at work is just making employees less productive and asking for a controversy.

I dare say it might be time for some employees at Facebook to pause and think about how their work may have an impact on the world.

fxtentaclenamuol5 years ago
That would be frustrating. And it might lower their productivity.

As Facebook, I'd rather pay for some yoga classes so that people don't have time to think about their actions.

kelnosnewcomputer5 years ago
I'm not the parent you're replying to, but I think we're talking about two different things here.

I agree that there's no need to bring up politics in the break room (or worse, during active work) and risk alienating people. It's a bad idea, just like talking about or advocating for particular religious beliefs.

But if your company is being politically active in ways you find unethical, I don't think it's reasonable to expect people to just put their heads in the sand, ignore it, and get their work done. And not everyone has the luxury of quitting a job whenever they don't agree with the company's politics.

Talanesnewcomputer5 years ago
And you believe that people being asked to quietly do work for a company advancing politics they disagree with are going to be MORE productive? I know I'd be quietly fuming and doing minimal work for weeks after a mandate like that.
kelnosreader_mode5 years ago
I don't think it is. If the corporation is going to be donating to groups and lobbying politicians in support of certain policies, are employees who are strongly against those policies supposed to just shut up, ignore it, and do their work?

I'm lucky that I'm in a place financially and career-wise that I can just quit and find a new job if I disagree with my company's politics, but many others don't have that luxury. Their choices are either to talk about it and try to get their company to change, or feel awful keeping quiet.

andreilysmunificent5 years ago
Any sufficiently large company will be involved with politics. That's the nature of being a multi-billion dollar company.
refurbmunificent5 years ago
> Otherwise, you're basically saying corporations should participate in the political process but individuals should not.

I have no idea how you came to this conclusion.

That's like saying "if employees don't have total freedom of speech at work that means only corporations have freedom of speech".

StanislavPetrovmunificent5 years ago
That's not how it works. The business is employing you, not the other way around. They are the ones who set the standards for behavior in the workplace, not you, the employee. If you don't like it, you are free to move on.
kelnosStanislavPetrov5 years ago
> The business is employing you, not the other way around.

It's a two-way street. You are providing your services to the company in exchange for compensation. It's an unequal relationship, to be sure, but the company needs employees to exist and survive.

> If you don't like it, you are free to move on.

If only life were that simple, and if jobs were so plentiful and easy to come by that people could be so picky. Sure, a lot of tech workers are in a great place financially such that they can quit in protest (and lose their health insurance, among other things), but most workers don't have that luxury.

jsabopatorjk5 years ago
That depends on whether your existence is political, I suppose. Ask some of your LGBT colleagues, especially trans colleagues, if they feel like they can just leave politics at the door.
travisoneill1jsabo5 years ago
I have worked with many LGBT colleagues who never bring up politics at work. This "existence is political" thing is just a bullshit phrase that the obnoxious people who can't go an hour without bringing that crap up use to justify it.
jsabotravisoneill15 years ago
Remember there are multiple supreme court cases just this year about trans people's right to both be trans and employed. That's part of what is meant by "existence being political".
nagaiaidatravisoneill15 years ago
How confident are you that the root cause of that political silence is a shared belief that politics shouldn't come up in the workplace rather than potential concern about the effects of openly discussing the issues that affect them?
ketzotravisoneill15 years ago
Correction: they never brought up politics to you. You don't know that they weren't political in their discussions with other colleagues. Perhaps it is worth considering why that might be the case.

People whose existences are deeply politicized (and they do indeed exist) are not often excited to have political conversations with people who say things like "'existence is political' thing is just a bullshit phrase."

I'm not trying to be a dick, but it took me a long time to realize that there were conversations I was not being made a part of because I was not receptive to, or dismissive of, those conversations.

thu2111ketzo5 years ago
You've set up an unfalsifiable belief there. No matter what evidence is brought up to the contrary, you can just blow it off by claiming said people never reveal the truth to whoever is arguing with you. It's a fallacious way to shut other people down and that is, kind of, a little bit dickish.
jcimsthu21115 years ago
>You've set up an unfalsifiable belief there.

This is a tactic that needs to be called out more often.

kelnosthu21115 years ago
I don't think that's the case here, though. The parent admitted that they fell into the exact same trap: assuming that certain conversations didn't go on because they weren't a part of those conversations, but later learning that wasn't the case, and it was their attitude that kept them out of those conversations.

It's of course not universally true that's the case for everyone, but I think it's worth thinking about. If your attitude is dismissive of someone's lived experience, it's not likely that they're going to go out of their way to include you in conversations about it; on the contrary, I'd expect them to explicitly exclude you in order to protect themselves.

travisoneill1ketzo5 years ago
Kind of irrelevant, as the implication of the parent is that it is impossible for the person not to bring up politics. But I have discussed politics extensively, outside of work, with a gay former co-worker. Most of his political beliefs have nothing to do with being gay. The loud activist types represent only themselves and not the people they claim to.
tha0x5ketzo5 years ago
>People whose existences are deeply politicized (and they do indeed exist) are not often excited to have political conversations with people who say things like "'existence is political' thing is just a bullshit phrase."

It is a bullshit phrase though. They don't want to bring it up because any skepticism is viewed as a direct attack on their ideology, and their ideology is the core of their existence/identity, so, calling out the illogic of their ideology is a political attack on their existence.

They want to TELL you, they don't want a discussion.

chance_statejsabo5 years ago
If your very existence is so wrapped up in political and gender/sexuality issues that you can't stand not talking about them at work, maybe you're not emotionally prepared to join the workforce.
lmmjsabo5 years ago
Demanding that other people describe you a certain way is not "your existence".
kelnoslmm5 years ago
In some cases perhaps it isn't, in others it is. Who are you to decide for them? Grow a little empathy and compassion, perhaps. No one is asking you to use particular pronouns in order to piss you off; they're doing it in order to feel comfortable in their own skin. It's petty, selfish, and inhuman to deny someone that.
lmmkelnos5 years ago
What's next, if I'm not handing over $100/week then I'm damaging their existence? No. If you want me to do something to accommodate you then ask nicely. If you demand the right to put words in my mouth and control what I think, you can fuck off.
lackerpatorjk5 years ago
I was working at Facebook during the 2016 election and it was pretty unavoidable.

A big part of the problem was just that everyone was using Facebook for work all the time. So quite often, there would be some enormous thread arguing about whether X or Y was the right policy, was Trump violating the rules and should he be kicked off Facebook, were Facebook's anti-Trump policies violating freedom of speech, or was it racist for an employee to say they supported Trump during a meeting, etc etc.

And you use the same interface for important things like, announcing hey this database service team is launching a new API next week, could you provide feedback on it. Type X of hardware is being deprecated next quarter. So you really have to be checking Facebook-for-work consistently for professional reasons. You have to scroll past the political debates all the time.

mattmlacker5 years ago
I was shocked when I learned that employees at Facebook use Facebook internally for work related discussions. Facebook is not built for that purpose.
lackermattm5 years ago
It is actually built for that purpose! Or at least, there are many people dedicated to optimizing the use of Facebook at work, selling it into companies as an enterprise solution, and so on. There are millions of paying customers. At some point I believe it was called "Facebook for Work" but now it has been rebranded as "Workplace from Facebook".

Some features are work-specific, but it basically feels like you are using the Facebook interface, just with all the content being from your coworkers about work stuff.

https://www.workplace.com/

ryan_lanelacker5 years ago
Easily one of the worst "productivity" tools I've ever been forced to use. Like facebook, it's optimized to bring your eyes to it, and not to make you productive.

Notifications are exactly like facebook notifications, so you'll get an email that says "open workplace to see this". To use the tool effectively, you need to have it open at all times. It's a weird in-between state of slack and email, where it's somewhat async and somewhat sync. You'll get overwhelmed with notifications and will want to have "inbox zero", which is considerably more difficult than in email, where you can optimize your workflow. The sidebar will have a list of groups you're in with read/notification counts that make absolutely no sense.

It also has chat functionality that can't be turned off, and it's not group based. It's like being forced to use gchat for all communications.

It has a "organization tree" feature that requires employees to fill in their own reporting structure. This also can't be turned off.

If you want your employees to spend all day chatting in forums inside of facebook, workplace is the product for you.

patorjklacker5 years ago
Thank you for this insight!

> was it racist for an employee to say they supported Trump during a meeting

This is sort of what I was thinking about in my original comment. I would hate to have to discuss political affiliation, or make public judgements on other people/issues.

> And you use the same interface for important things like, announcing hey this database service team is launching a new API next week,

I miss so much in my news feed already, using it for work would make me nuts.

pjc50lacker5 years ago
It's vaguely hilarious to discover that one of the problems with Facebook's work culture is ... too much Facebook.
wmichelinpatorjk5 years ago
Facebook is not your average workplace. If your platform is profiting off of active political misinformation, you have an obligation to not only discuss these issues, but solve them. If you determine that you can't solve them, it's time to stop accepting the money.
throwaway1777wmichelin5 years ago
Most employees at Facebook are doing nothing of the sort. They are working on databases and internal tools and changing the color of buttons. If you work on an integrity team then yes, you'll have to discuss such issues, but most employees don't.
jonathankorenthrowaway17775 years ago
But they are. Everyone at the company is complicit.

Let's say you're a delivery driver for a pizza parlor that is famous for the level of rat poison in its sauce. Do you continue to knowingly deliver the poison pizzas because you're not the one making the sauce? After all, who's going to deliver the poisonous pizzas?

Come on. This is right up there with, "I was just following orders."

onion2kthrowaway17775 years ago
One of the things I love about startups is feeling like my input is partially responsible for the mission, culture, and direction of the company. I can't imagine doing work for a company and not feeling like I'm making a contribution to whatever it's doing.

Maybe you're right and developers at FB don't see that they have any responsibility or input towards what FB does with the tech they build, but I hope not. It would be horrible to be that detached from the outcome of your work.

kelnosthrowaway17775 years ago
That feels disturbingly close to a "just following orders" defense. Just because you're in the basement keeping the boiler running, it doesn't mean you can ignore what goes on upstairs. If what they're doing upstairs is unethical, it's unethical for you to keep the lights on for them.
hnreader998patorjk5 years ago
It's even worse than unavoidable. My company has removed any kind of anonymous discussion/Q&A, which just fuels a mono culture.

You only get one-sided discussions because going against the grain will be career suicide.

onion2khnreader9985 years ago
All this means is that you value your career over the political statement you want to make. The point you want to contribute isn't all that important to you. This is why it's worthwhile listening to people who speak out when there is a cost to themselves - they believe in what they're saying enough to sacrifice something for it.
hnreader998onion2k5 years ago
I don't disagree. My main point is that at a bay area tech company, the cost is one-sided, which drives company policies and philosophies, all while endlessly preaching about diversity.
NicolasGordenonion2k5 years ago
This seems naïve. Cost is a factor.

Greg, a nobody who nobody likes, says X thing - X isn't a big deal and doesn't affect most people, and about 45% of people would agree X shouldn't be said, even if it isn't a big deal.

Mark, a senior director who everybody loves, says Y thing - Y is a big deal, but only 10% of people would agree it shouldn't be said.

Guess which will get called out?

rjkennedy98onion2k5 years ago
You are right, we should listen more to suicide bombers because they put their lives on the line for what they believe /s.

Seriously, haven't there been studies showing that most of these revolutionaries are actually not very smart people? How does risking your life for a cause have anything to do with being correct about an issue?

tanilamapatorjk5 years ago
You hate how polarized the discourse has become, where not taking a stance is now the same as taking a stance for the opposite side.

I would say FB did the right thing here, in not supporting a platform that is actively politicizing itself.

ashtonkempatorjk5 years ago
Work on projects that have less moral ambiguity.
luckylionashtonkem5 years ago
Such as ... food production? "But your food is consumed by literal fascists"... water supply? "Did you know that evil people also drink the water you are providing and hydrate their bodies so they can do more evil things?"

For people who believe that everything is political, there are no projects with less moral ambiguity, it's just more or less openly visible.

ryan_laneluckylion5 years ago
If you provide catering services to ICE maybe you'd have people getting political about who's eating your food. If you're producing water and only allowing hate groups to consume it, sure maybe you'll have this happen, but otherwise these examples are strawmen.

Everything has some political issue around it, but Facebook has politics baked into it because it's using political issues as a means of making money. They sell advertising to politicians, when they know the ads are lies. Their platform is filled with fake accounts pushing genocidal agendas from dictators, and in many cases facebook is sweeping it under the rug.

The way their platform is built is set up to manipulate people, and that platform is being used at scale to do so in ways facebook knows are fucking up the world. Its very existence is political at this point.

luckylionryan_lane5 years ago
> They sell advertising to politicians, when they know the ads are lies.

I don't really take issue with that; Germany even has that codified, and we're very far from being free-speech absolutists. Media companies are compelled by law to air political ads from all political parties without checking, judging, or commenting on them. Short of an ad being obviously illegal, there's nothing they can do, which led to our center-left state media being told by the supreme court to air a spot from the far-right NPD (actually far right, with skinheads, boots and all the stops, not just anti-low-skill-immigration conservatives).

> Their platform is filled with fake accounts pushing genocidal agendas from dictators, and in many cases facebook is sweeping it under the rug.

But not really. They exist, but the platform isn't "filled" with them. The vast majority of content on FB is not political.

I'm sure that FB would be quite okay with not having politics at all. Sure, people are on the platform, but they'd rather have engagement around cat pictures, celebrity news and similar things, because people shouting at others about their ideology aren't buying sneakers. They're not a political advertising company that relies on political ads as their primary funding.

Banning political speech is simply not an option, because some people sometimes want to argue about politics, and you're going to have to fight your users if you don't allow that. You never want to fight your users.

4ggr0luckylion5 years ago
What a weird example to pick...

Food and water production has VERY big ethical issues. Palm oil, mass-slaughter of animals, deforestation, Nestle taking away water from locals, CO2 emissions etc.

So yes, there are problems in the food and water industry, but I don't really get what your point is. Should we just close our eyes, ears and mouths and say "fuck it, not my problem"?

luckylion4ggr05 years ago
> Palm oil, mass-slaughter of animals, deforestation, Nestle taking away water from locals, CO2 emissions etc.

Not at all what I was aiming at. The problem people have with FB isn't how they produce the product, but who uses it.

The problem with food and water in the equivalent scenario, would be in who consumes it. If you let everyone consume it "woah, that's a political choice". But it really isn't. It's the default, deviating from it is a political choice.

evgenluckylion5 years ago
> The problem people have with FB isn't how they produce the product, but who uses it.

No, the problem is that the product they produce is _specifically designed_ to be used in this manner because conflict and argument increases "engagement" and for a large portion of the employee base their bonus depends on performing work that leads to this outcome.

dexenluckylion5 years ago
Upon reading your comment, and the GP's, it occurred to me that we've lost the idea that it is the buyer who is responsible for the purchases.

It used to be that people and organizations making unethical purchases were the ones we considered, and held, responsible. For a long time we've had good, positive movements centered on informing the buyer. We added expiration dates, ingredient lists, nutritional value information, crashworthiness scores and reliability ratings, country of origin labels, even ethical sourcing labels. Perhaps too much of a good thing caused information overload and resulting numbness? Somehow, between the Prohibition, the "war on drugs", and the supply side moral regulations, we've lost the spirit of "well informed free agents making decisions".

Most of the services (FB and the likes) we're discussing here are morally neutral by their nature, and it takes concerted efforts to make them non-neutral[1]. It is the particular use they are being put to that is moral or immoral. Let's not shift vast moral powers from the wide society to a narrow cadre, shall we? The economy is a neat distributed system. It's the popular democracy before democracy became popular. Let's not give it up.

--

[1] example of non-neutrality: the current trend of algorithmic manipulation

kelnosdexen5 years ago
> Most of the services (FB and the likes) we're discussing here are morally neutral by their nature

I don't think that's the case. Is it moral to exploit human psychology when developing addictive features that pull people into the site over and over? Is it moral to sell user information to advertisers so they can emotionally manipulate you into buying crap you don't need? Is it moral to design interactions that evoke outrage and disagreement in order to increase engagement? Is it moral to track user activity across the web, outside the company's site?

I don't think any of these things are moral. These practices might not be necessary for a site like FB (then again they might), but this is the model they all seem to choose. And that's what actually matters.

dexenkelnos5 years ago
I hear your objections, and I should have worded my idea better.

The gist was, a bare messaging+microblogging platform is, by its own nature, morally neutral[1]. Of course if the operator starts making editorial decisions - like algorithmic timelines, or propping up/pushing down content, or manipulating user mood - then the operator clearly is making moral judgements & decisions.

Funny how respecting user privacy does, at least partly, absolve the operator from a lot of risks related to making moral judgements on a mass scale in a hurry.

--

[1] with the only caveat that, if somebody believes facilitating communication to be evil or good, then it would be considered respectively evil or good.

kelnosdexen5 years ago
I agree that the mere concept of a bare messaging+microblogging platform is morally neutral, but frankly I just don't see what the point is of making that observation, because we don't have one of those, at least not something that's wildly successful enough to matter. (By that I mean that a platform that has 100 or 1000 or even a million users can do whatever it wants; unethical behavior just doesn't move the needle on a global scale.)

It's the classic argument, "technology is neutral; how it's used determines the ethics". Well, yes, I agree with that, but here we have a company that's using it unethically, and has no desire or need to stop their bad behavior. And that bad behavior has been instrumental to their success. That's what matters.

xnyanpatorjk5 years ago
What this means in intent and practice is that you are not allowed to discuss social issues that the hierarchy and those that align with the hierarchy find objectionable.

I'm going to go out on a limb and predict that if a facebook employee wants to talk about poverty in underdeveloped countries on internal social media, then that's going to be ok, but if the discussion concerns people who were harmed because they followed bad medical advice that was spread by use of facebook, all of a sudden that's an unacceptable social issue at work.

clomondxnyan5 years ago
I.e. it’s all good as long as you don’t rock the boat.
searchableguyxnyan5 years ago
I wonder if free-speech or political protections in the workplace can be abused by competitors to destroy your culture, as proposed by people in the comments. Is there a chance of that happening, or does it already happen?
an_opabiniasearchableguy5 years ago
> Is there a chance of that happening or does it already happen?

Nah.

Every Facebook employee I've met, every one that I'm reading, they are sincere about what they're angry about.

It's a bad look, to assume some Facebook employee's opinions are being co-opted by... fucking Google? Apple? That's ridiculous.

I'm not even going to speculate why anyone questions some random Facebook employee's sincerity.

Instead I offer: imagine if someone, for every opinion you had, all the time, talked over you or told you to shut the fuck up and said, "Oh you're getting co-opted by Google, this is exactly what they want you to do, 'destroy our culture.'" And then, in the same breath, that guy defends, breathlessly, some idiot outraged over the removal of master/slave nomenclature, or some idiot trying to mansplain crackpot sex difference theories to his female coworkers.

C'mon, you'd be mad as hell, it's so utterly ridiculous.

fxtentaclean_opabinia5 years ago
"some idiot outraged over the removal of master/slave nomenclature"

Since GitHub repositories renaming their master branch did cost us significant money and almost caused downtime, I would be fully understanding if someone else is feeling outraged about it.

I believe you need to work on your example of a bad person ;)

How about we go with: "And then in the same breath that gal defends selling people's secrets for pennies on the dollar, willingly accepting that it will likely have very real negative consequences for your users once the private data in your database invariably gets leaked onto the internet."

Oh wait, that wouldn't leave much left at Facebook, would it?

So let's just question the moral integrity of anyone working at Facebook. Seems reasonable, given what egregious privacy infringements their work enables.

kelnosfxtentacle5 years ago
> Since GitHub repositories renaming their master branch did cost us significant money and almost caused downtime, I would be fully understanding if someone else is feeling outraged about it.

I think "outrage" is an inappropriate response. We're talking about removing nomenclature that has been (and continues to be) used to oppress an entire segment of society. I think removing that is worth a little money and downtime, if it comes to that. People who are "outraged" that it cost them some time and work probably could stand to show some compassion for their fellow humans.

fxtentaclekelnos5 years ago
Surely it is very easy to take the moral high ground on these issues. But if you're the one working overtime to fix the resulting mess, you start to wonder why those knights in shining armor forgot to show you the same compassion.

Usually, when open source projects introduce a breaking backwards-incompatible change, they will first deprecate things and then wait some months to give people time to update. After this nomenclature had been in use for 10+ years, I can't help but wonder why there was no time to take the user-friendly path in this instance.

So to the people who are fixing the mess, it certainly feels more like you got kicked because someone else wanted to show off his/her moral superiority.

There existed a reasonable way to change the nomenclature, but it wasn't taken.

kelnosfxtentacle5 years ago
> it certainly feels more like you got kicked because someone else wanted to show off his/her moral superiority.

And that's where the question of empathy and compassion comes in.

As a random white dude, the "pain" I face by dealing with problems around these name changes is completely minimal and trivial when compared to the emotional pain they cause people in certain groups that actually have a lived experience of oppression.

Regardless, I do agree that if there is crazy scrambling and short timelines to change these names, that's a problem in your org. There should not be reckless urgency to get this done; it should be done just as any other major change should be: with planning and risk assessments. Where I work, we are doing it slowly and with an eye toward not causing downtime. If your org is not doing that, then I agree that you have a valid complaint. But this complaint should be directed at the bad process, not at the work itself.

vonmoltkekelnos5 years ago
> > Since GitHub repositories renaming their master branch did cost us significant money and almost caused downtime, I would be fully understanding if someone else is feeling outraged about it.

> I think "outrage" is an inappropriate response. We're talking about removing nomenclature that has been (and continues to be) used to oppress an entire segment of society.

In this specific case, outrage is appropriate because the nomenclature as used by git has nothing to do with "master/slave". It's a well-intentioned but misguided attempt at what you describe, unless you are making the preposterous claim that the word "master" should be purged from all contexts.

Of course, WRT this article and the discussion on it, if you try to point this out and discuss it at a company like Github (or mine) where the group making these decisions is convinced of their correctness, you risk ostracization and career suicide. In fact, the statement you closed with

> People who are "outraged" that it cost them some time and work probably could stand to show some compassion for their fellow humans.

implies that you also are convinced of the correctness of this decision, and that anyone who objects to it is not compassionate (and by implication, not worthy of consideration). This is not a good approach to take if your goal is to educate.

kelnosvonmoltke5 years ago
Well, yes, I am convinced it's the correct decision, and by extension I do believe that people who disagree are at best misguided, and at worst are actively invested in perpetuating systemic inequality.

I'll agree that git's use of "master" is not as egregious as "master/slave" in database terminology, but it's still not great.

There are two prevailing uses of the term "master". One refers to the quality of being exceptionally good at a particular skill. By and large, I don't think most people have a problem with uses of "master" where that's the intended meaning. But "master" in the sense of "leader" or "controlling" isn't great, even if (in the case of git's "master" naming) there isn't a corresponding "slave" role.

> if you try to point this out and discuss it at a company like Github (or mine) where the group making these decisions is convinced of their correctness you risk ostracization and career suicide

I agree that this is bad. These sorts of responses have a chilling effect on reasonable conversations and discussion. But in some ways I do understand why this happens; people who are directly affected by terminology like this are getting really tired of having the same conversations over and over about something that evokes significant emotional pain every time it's brought up. Again, it's not great, but I think it's understandable. And it's frankly hard to understand why using a word like "master" in technical terminology is somehow so important that it's even worth getting into repetitive discussion after discussion about it, especially when doing so causes some people pain. That's where the concerns about empathy and compassion come into play, because the people who constantly fight against this change do not seem to be even trying to look at this from someone else's point of view. (And I say this as someone who initially was resistant to these changes, but have since realized that I was wrong to do so.)

Red_Leaves_Flyypatorjk5 years ago
>Are there work places where this is unavoidable?

Many:

Education. Healthcare. Corrections, law, and law enforcement. Social work. Public utilities and subsidized housing. Etc.

satya715 years ago
How about bringing the same rules to the wider FB? I just want to look at baby pictures and connect with friends. I don't want to be part of a machinery that spreads misinformation and conspiracy theories.
drchopchopsatya715 years ago
I can't see how that is even possible at this point. You'd have to remove groups, pages, public profiles, and sharing, which would wreck the advertising and revenue ecosystem. Or, come up with magical AI which could detect politics/memes/disinformation and remove it instantly after it's been posted.
quicklimedrchopchop5 years ago
I’m sure they could develop a classifier that would catch most (~90%) of political content, and make it opt-in - if people want to see it they can, but it could be hidden by default. This would be my preferred approach, so that I can use it for connecting with people but avoid listening to everyone’s political outrage.
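As a rough sketch of what that kind of opt-in filter could look like (all names here are illustrative, and a tiny keyword-weight score stands in for a real trained classifier; this is not any actual Facebook API):

    // Hypothetical types; a keyword-weight score stands in for a trained model.
    interface Post { id: string; text: string; }
    interface UserPrefs { showPolitics: boolean; } // opt-in; political posts hidden by default

    const POLITICAL_WEIGHTS: Record<string, number> = {
      election: 0.8, senator: 0.7, protest: 0.5, policy: 0.4,
    };

    function scorePolitical(text: string): number {
      const words = text.toLowerCase().split(/\W+/);
      const score = words.reduce((sum, w) => sum + (POLITICAL_WEIGHTS[w] ?? 0), 0);
      return Math.min(score, 1); // clamp to [0, 1]
    }

    // Drop likely-political posts unless the user has opted in to seeing them.
    function filterFeed(posts: Post[], prefs: UserPrefs, threshold = 0.5): Post[] {
      if (prefs.showPolitics) return posts;
      return posts.filter((p) => scorePolitical(p.text) < threshold);
    }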
londons_explorequicklime5 years ago
You could build this as a browser extension... Call it "de-politics", and have it scan the HTML of popular sites (facebook, twitter, etc) and simply collapse/hide all content matching some filter.

I bet a simple keyword filter for names of politicians could catch 90%.

I wonder if people would pay for it?
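A minimal content-script sketch of that idea, assuming a hand-maintained keyword list and a [role="article"] selector for feed posts (both are illustrative guesses, not tested against any real site):

    // content-script.ts - collapse feed posts that mention listed keywords.
    const BLOCKED_KEYWORDS = ["election", "senate", "parliament"]; // illustrative list

    function hideMatchingPosts(root: ParentNode): void {
      // Assumption: feed items are marked with role="article".
      root.querySelectorAll<HTMLElement>('[role="article"]').forEach((post) => {
        const text = (post.textContent ?? "").toLowerCase();
        if (BLOCKED_KEYWORDS.some((kw) => text.includes(kw))) {
          post.style.display = "none";
        }
      });
    }

    // Re-run whenever the infinite-scroll feed lazily loads more posts.
    new MutationObserver(() => hideMatchingPosts(document)).observe(document.body, {
      childList: true,
      subtree: true,
    });
    hideMatchingPosts(document);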

davisrlondons_explore5 years ago
Ah, of course, censoring political discussion is the answer!
gfodordavisr5 years ago
Censoring is when an authority removes content without your consent. Installing something that lets you decide what content to see (or not see) on a site is not censorship.
davisrgfodor5 years ago
It is, however, sealing one's self in a chamber of like-minded opinion.
bmarquezdavisr5 years ago
That implies Facebook posts are an accurate assessment of opinion, and not whatever the algorithms promoted to increase engagement.
lozaninglondons_explore5 years ago
If you take the time to set it up you can get close using https://www.fbpurity.com/ chrome plugin.
gfodorquicklime5 years ago
You could def make this not suck by just making it so facebook automatically 'tagged' content and users can filter out certain tags. Since it'd be public, users could decide for themselves if the tags are reasonable for their filtering needs. But they'll never do it, because ad conversion rates and engagement would likely drop significantly.
ardy42quicklime5 years ago
> I’m sure they could develop a classifier that would catch most (~90%) of political content, and make it opt-in - if people want to see it they can, but it could be hidden by default. This would be my preferred approach, so that I can use it for connecting with people but avoid listening to everyone’s political outrage.

Eh, I don't really like that idea. For one, it only really addresses the problem of being exposed to content you find unenjoyable.

Honestly, sometimes I do wonder if consumer-level broadcast technology is the psychic equivalent of doing something like letting everyone fly planes without any training. It might be better to adopt communication technologies with a little more friction.

wnissenquicklime5 years ago
I spent the entire month of September 2016 flagging every single political post, whether it was a news article, friend's status, shared post, whatever, as "See less like this". It was completely ineffective. At the end of the month I was seeing just as much politics as before. So I'm not sure they can, or maybe want to. I was giving them all the input they needed to make a good classifier, and it was a lot of work. Maybe classifiers have improved enough in the past four years.
saghmdrchopchop5 years ago
> You'd have to remove groups, pages, public profiles, and sharing

Or they could just not show posts to groups you're not in and from pages and public profiles you don't follow! Allowing something to interject into your newsfeed should be opt-in, but right now it isn't even opt-out, except for not logging on at all. It would also be cool if there were a way to opt-out from seeing shared posts selectively for people on your friends list, e.g. I want to see things that Overly Political Relative posts themselves, but not things that they share from other places.

That being said, I deactivated my Facebook account a couple years ago, so I'm no longer a user whose opinion they should theoretically care about anymore.

lozaningsaghm5 years ago
I don't really use FB all that much these days, because the people I care to keep up with have largely moved on from it. But when I do log in, I get mostly the experience you want with the https://www.fbpurity.com/ chrome plugin I've spent the time to heavily customize.

My timeline shows as strictly chronological, and only text and photos posted by my immediate friends. No groups, ads, publisher's bullshit, promoted things, no trending, no nothing. Just photos and plain text.

saghmlozaning5 years ago
Yep, back when I still used Facebook, I used that extension. I think that installing a third-party extension is probably a lot more than the average person knows to do, though. I was mostly making a point that if Facebook finds discussion over politics and the like too divisive for their internal company chat, maybe they should consider what they can do for the rest of us to keep things similarly sane.
kelnosdrchopchop5 years ago
I don't think you need to do that. I think just making the timeline a reverse-chronological firehose, and not filtering any posts out or making any posts more prominent, would do wonders. That's how Facebook used to present the feed.

Giving people tools to make sub-lists of certain friends/groups/etc. in order to organize their experience better (on their terms, not at FB's whim) would be great, too.
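As a toy sketch of that kind of unranked feed (types and names invented for the example; nothing here reflects Facebook's actual systems):

    interface FeedItem { authorId: string; createdAt: number; text: string; }

    // Reverse-chronological firehose: no ranking, no injected posts,
    // only items from sources the user explicitly follows.
    function buildFeed(items: FeedItem[], followedIds: Set<string>): FeedItem[] {
      return items
        .filter((item) => followedIds.has(item.authorId))
        .sort((a, b) => b.createdAt - a.createdAt);
    }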

rsynnottsatya715 years ago
I'd give that about 10 minutes before someone started claiming that their baby was immune to covid, and then you're right back where you started.
thrwn_frthr_awyrsynnott5 years ago
But isn't that only happening because of FB allowing disinformation to spread so easily?
dmpk2kthrwn_frthr_awy5 years ago
How would you achieve that?
dntrkvthrwn_frthr_awy5 years ago
Curious how you think it's possible for FB to prevent the spread of disinformation? Everyone likes to pretend like Facebook has the ability to just stop disinformation, when in reality, even defining "disinformation" is basically impossible. Sure, you can bring up examples of blatant lies, but most of the effective disinformation is a lot more subtle and depends on what side of the issues you are on.
nindalfdntrkv5 years ago
I have seen this question asked many times on such threads but have never seen a workable solution. Usually the response is something like "I'd shut the whole website down" or "I'd employ millions of moderators" or "I'd allow people to only post once a month" or "I would remove sharing links, only baby pictures allowed". Nothing practical. If you respond to any of these with examples of positive speech that would be harmed, there is no response. For example, if you curb political speech, people wouldn't be able to organise political protests. People wouldn't even agree on what should be considered 'political'. For example, is organising a BLM event political? Should it be allowed under the proposed rules?

I'm happy to be proven wrong though. Maybe this is the thread where people will make practical suggestions.

munificentnindalf5 years ago
What is "impractical" about "shut the whole website down"?

When it was discovered that tetraethyl lead was widespread in the environment and caused neurological damage, it was banned. Yes, that materially harmed several chemical companies whose livelihood was based on producing tetraethyl lead.

So what?

If your business model harms people, I don't care if stopping harming people eliminates your business. People matter. Businesses do not.

Are we supposed to just go, "Yeah, we know Facebook is harmful to millions, but won't someone think of the poor shareholders?" Then shrug and accept it?

dntrkvmunificent5 years ago
Banning a harmful substance, and banning a platform for communication are not comparable.

There is no shortage of sites that will take Facebook's place.

What is the legislation you propose to prevent Facebook, or the millions of other existing or soon-to-be existing apps, from doing harm to people?

munificentdntrkv5 years ago
> Banning a harmful substance, and banning a platform for communication are not comparable.

OK, consider gambling. That is simply a kind of software that enables people to engage in behavior that turns out to be harmful for a large number of them. And, because of that fact, it is heavily regulated.

> What is the legislation you propose to prevent Facebook, or the millions of other existing or soon-to-be existing apps, from doing harm to people?

I don't know if we know what sort of regulations would help yet. But I do know that if we assume a priori that corporations cannot be forced to change their behavior because it might hurt the poor corporation, then we will never figure out the answer.

nindalfmunificent5 years ago
Well that's all I'm asking for here. A practical proposal.

Removing tetraethyl lead was certainly doable. Removing every car from the road was not. The former was a targeted change that improved the industry, while the latter was so impractical that it was never considered.

Here's a thought - you assume a priori that shutting down social media would be a net win. How did you come to that conclusion? Did you spare a thought for the people whose social lives revolve around spending time with friends online? You'd advocate for taking away these people's social networks because you're certain you know what's best for them?

munificentnindalf5 years ago
> you assume a priori that shutting down social media would be a net win.

I didn't actually say that. I think many social media sites are net positives, like this one here. I think Facebook specifically is a net negative.

> How did you come to that conclusion?

Performing experiments on users' emotional state without their consent: https://www.theatlantic.com/technology/archive/2014/06/everything-we-know-about-facebooks-secret-mood-manipulation-experiment/373648/

Cambridge Analytica: https://en.wikipedia.org/wiki/Facebook%E2%80%93Cambridge_Analytica_data_scandal

Facebook makes users feel worse about themselves: https://www.bbc.com/news/technology-23709009

You get the idea. None of this is new. Some communities are more toxic than others. Some businesses are less ethical than others. I believe Facebook is an unethical business led by an unethical man making a product that is more harmful than good for most people.

> Did you spare a thought for the people whose social lives revolve around spending time with friends online? You'd advocate for taking away these people's social networks because you're certain you know what's best for them?

I did not advocate that.

claudeganonnindalf5 years ago
I don’t think you have to shut down Facebook. Just shut down targeted advertising, algorithmic filtering of content, and a lot of their surveillance practices (e.g. shadow profiles).
disgruntledphd2claudeganon5 years ago
Should Google's targeted advertising also be prohibited?

Should Google's algorithmic filtering of content also be prohibited?

Should Doubleclick/Google's surveillance practices (which almost certainly include some kind of "shadow" profile) be prohibited?

Also, people have been talking about shadow profiles for at least a decade now, and yet no disgruntled FB employee has revealed all. Why do you think that is?

kelnosdisgruntledphd25 years ago
Yes, Google's similar practices should also be prohibited. You ask as if it's obvious that someone would say no and show themselves to be a hypocrite, which is weird.
disgruntledphd2kelnos5 years ago
OK cool, I was interested in whether or not you were being consistent.

I don't find it weird at all, I've noticed that often people get really upset about FB or GOOG doing something, while ignoring the other, so hence my questions.

dntrkvkelnos5 years ago
How do you define "algorithmic" filtering?
claudeganondisgruntledphd25 years ago
Yes, of course? Because Facebook employees are highly paid, I imagine. The reason why that whistleblower got so much coverage is that usually all it takes to keep someone quiet is to pay them off.
disgruntledphd2claudeganon5 years ago
Like, there have been soooo many FB whistleblowers/leakers in the past four years, and yet shadow profiles (which would seem to be important to leak) have never had any backup here.

I don't think that this is a coincidence, and I 100% disagree with the notion that this is because FB employees are well paid.

I honestly think that it's because shadow profiles essentially don't exist in any meaningful form (there's probably some logs for non-FB users, but I don't think that they are aggregatable to a specific individual without an account, mostly because that would be super low value and really hard).

kelnosnindalf5 years ago
> Did you spare a thought for the people whose social lives revolve around spending time with friends online?

I don't agree that banning social networks would be productive or even possible, but this argument doesn't make sense. People had social lives before social networking. People had friends online before social networking. Social networking is not required for these things.

dntrkvmunificent5 years ago
I am not concerned about hurting the poor corporation. What I am concerned about is having actual laws in place, not enforcement based on outrage. If Facebook is doing thing X, we decide we shouldn't allow thing X, we make thing X illegal. If FB continues to do thing X, we apply the law to them. We don't just dish out punishments for vague "You're hurting America" crimes. You need to define the thing you want banned, and it can't be "Facebook."
freehunternindalf5 years ago
Why is it our responsibility to figure out how Facebook can fix their algorithm? They employ thousands of the smartest engineers in the world and pay them absolutely ridiculous amounts of money. If they wanted to fix it, they would. As just random people on the Internet, it’s not our job to figure out how to fix it. It’s our job to complain about it enough to make them fix it themselves.
searchableguynindalf5 years ago
I want people to pay. It will:

- Reduce the majority of bots by making them unsustainable.

- Provide direct money to improve moderation and make platforms liable.

- Make journalism cater to individuals rather than ad networks.

- Remove toxicity because trolls won't pay after getting banned regularly. No need for other fingerprinting methods.

- Reduce the number of users and silo them automatically.

Free business models are anti-competitive and result in worse service for the users by making platforms accountable to advertisers (other companies) rather than to consumers.

Force Facebook to introduce a minimum payment based on purchasing power. Outlaw free/freemium models in software or limit them to a time period (3-6 months).

This won't apply to non-profit services. And open source will be fine since it will only apply to services or for-profit companies.

thrwn_frthr_awydntrkv5 years ago
Facebook's algorithms exploit the brain's attraction to divisiveness. They've known this and have chosen not to take action, except in small ways so Mark can tell Congress they are improving things. Source: https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499

So besides straight up changing the algorithms to promote non-divisive content, here are a couple of things I think could help:

- Limit the spread of information in general in favor of content created by the people you follow

- Un-personalize advertising

dntrkvthrwn_frthr_awy5 years ago
Facebook doesn't profit from divisiveness, they profit from engagement. The fact that divisive posts encourage more engagement tells me more about people in general, rather than Facebook's business model.

> Limit the spread of information in general in favor of content created by the people you follow

I don't think that's what people want from their social networks nowadays. FB, Twitter, YouTube, TikTok, Snapchat, etc all do not work this way anymore. Suggesting that Facebook revert their app to what it was 10 years ago is not a serious suggestion because there are many other apps that will fill that void. If it's not FB, another app will take its place and give people the outrage they're looking for.

> Un-personalize advertising

Advertising plays a very small part in this. Most of what you would call "disinformation" is spread through reposts, which are not affected by advertising.

Sure, there might be some hostile actors out there spending money on pushing propaganda to the masses. But from my experience, people actively seek this nonsense out; the algorithms just make it easier for them to find it.

In my eyes, the real problem is that most people aren't equipped with the right tools to identify bullshit. Simple things like an inability to gauge scale. e.g. "9,000,000 gallons of oil has been spilled from pipelines in the last 10 years" Is that a lot? I have no idea, but what I can do is compare that against other forms of oil transportation. Most people won't do that work though, they will go straight to outrage.

thrwn_frthr_awydntrkv5 years ago
> I don't think that's what people want from their social networks nowadays. FB, Twitter, YouTube, TikTok, Snapchat, etc all do not work this way anymore.

They don't work this way because it makes shareholders the most money, not because it is the best experience for the user.

> Advertising plays a very small part in this. Most of what you would call "disinformation" is spread through reposts, which are not affected by advertising.

A completely false ad about a candidate of a different political party is much less likely to be called out or reported because it is only shown to a highly targeted group of people. This lack of accountability creates disinformation. These ads could not be run as billboard advertisements or in a non-personalized ad space.

All of the counter arguments always come down to this: Facebook would make less money. And, yes, of course that is going to be the case because if any of these changes would make them more money they would have implemented them themselves. It requires a public corporation to accept that they are making the world a worse place, and to choose to make less money to stop doing that.

s1artibartfastthrwn_frthr_awy5 years ago
>All of the counter arguments always come down to this: Facebook would make less money. And, yes, of course that is going to be the case because if any of these changes would make them more money they would have implemented them themselves. It requires a public corporation to accept that they are making the world a worse place, and to choose to make less money to stop doing that.

And it would also require them to make a product that people desire less, and risk losing to a competitor that gave people what they want. People want to cluster in silos, chase novelty, and spout off with 100% confidence about topics they know nothing about.

zapitadntrkv5 years ago
> Facebook doesn't profit from divisiveness, they profit from engagement. The fact that divisive posts encourage more engagement tells me more about people in general, rather than Facebook's business model.

"Crack dealers don't profit from drug addiction, they profit from the pleasurable effects of consuming crack. The fact that very addictive drugs are pleasurable to consume tells me more about people in general, rather than crack dealer's business model."

leetcrewzapita5 years ago
there are a lot of pro-legalization folks on this forum that would likely be inclined to agree.
zapitaleetcrew5 years ago
I’m fine with legalizing crack as long as producers and distributors are heavily regulated and held accountable for their impact on public health, just like the alcohol and tobacco industry.

Social media conglomerates manipulate how billions of people perceive the world around them, with disastrous effects. They should be held accountable for that.

Absolving them of all guilt, and blaming all the nefarious effects of social media on the consumers, accomplishes nothing.

dntrkvzapita5 years ago
Yes, and I don't see crack dealers as the problem in your example. The bigger problem is how society views drugs, addiction, and the criminalization of both. We've all seen how well that's worked for us. Thinking we could apply similar bans on speech we don't agree with is just as stupid.
zapitadntrkv5 years ago
Both things can be true. Some dangerous drugs are excessively criminalized and that causes enormous problems. Others are legal but heavily regulated, like alcohol and tobacco. Some are legal and insufficiently regulated, which causes enormous problems too: see for example the ongoing opioid crisis in the US.

Nothing good comes from denying the dangers of an addictive drug, or leaving its distributors free to misbehave without consequence. That is the current situation with social media companies and their enormous influence on our minds.

I don’t think we should ban social media. But not holding multi-billion dollar social media conglomerates accountable at all is lunacy.

elliekellydntrkv5 years ago
Doesn't HN have some sort of "flame war" prevention feature where people are automatically prevented from commenting/replying in rapid succession? Rate-limiting posts/sharing and comments on Facebook seems like a good place to start. Maybe it isn't necessary (and probably not possible) that we stop disinformation entirely so much as we slow it down.
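For illustration, a sliding-window cap on posts per user might look something like this (the class name and limits are made up for the sketch; HN's actual mechanism isn't public):

    // Allow at most `limit` posts/shares per user within a rolling window.
    class PostRateLimiter {
      private history = new Map<string, number[]>(); // userId -> post timestamps (ms)

      constructor(private limit: number, private windowMs: number) {}

      tryPost(userId: string, now = Date.now()): boolean {
        const cutoff = now - this.windowMs;
        const recent = (this.history.get(userId) ?? []).filter((t) => t > cutoff);
        if (recent.length >= this.limit) {
          this.history.set(userId, recent);
          return false; // over the cap; ask the user to slow down
        }
        recent.push(now);
        this.history.set(userId, recent);
        return true;
      }
    }

    // e.g. at most 5 shares per 10 minutes
    const limiter = new PostRateLimiter(5, 10 * 60 * 1000);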
CaptainZappdntrkv5 years ago
> Curious how you think it's possible for FB to prevent the spread of disinformation?

For example by doing something when they are warned for years by multiple entities that FB is used as a tool to support genocide, like in Myanmar.

It seems that only when such things blow up publicly and the stench of bad publicity gets too strong do they send out the Zuckerbot to announce his usual platitudes, and then get back to business as usual. And that's far from the only example where their product was used for oppression by authoritarian regimes.

This company could do a hell of a lot more to counter this. But they just don't give a shit, unless publicity gets too bad.

edit : word change

beamatronicrsynnott5 years ago
Don’t be friends with those people
dylan604beamatronic5 years ago
To me, this is a valid response. If you can remember back when we as humans used to gather together in public places, we had lots of options on who we talked to at that gathering. If someone always talked about something you just didn't care about, you could walk away and talk to other people. People that you found yourself regularly talking to about things that everyone found pleasant were called friends. People you talked to occasionally were called acquaintances, and people you preferred not to talk to were called many things, but friend was not one of them. If some of your friends like the people you did not, they were called a friend of a friend but you would not refer to them as your friend.

In Facebook, there are only "friends". So, if you don't like what they are always carrying on about, don't have them as a friend. Just like in real life.

paganelrsynnott5 years ago
> claiming that their baby was immune to covid

Some people are indeed immune to covid, babies too, most probably. I've personally heard of numerous cases of persons not getting the virus at all while their spouse was in intensive care or worse.

driverdanpaganel5 years ago
Not getting it is not the same thing as immune.
paganeldriverdan5 years ago
What is it, then?

Later edit: To add to my comment, what do you call sleeping in the same bed, eating from the same plate and having direct physical contact with a person who gets the virus and ends up in the ICU or dead, while the other person tests negative for the virus?

Let's not forget that ever since February we've all known that this virus is particularly easy to transmit/get, so you cannot say "that person got really lucky, that's why he/she hasn't got it".

idealspaganel5 years ago
You have no scientific evidence to support your claims. Take your L and stop spreading misinformation.
acrobatsunfishideals5 years ago
If you're going to ask him to share sources, please show some of your own with the level of rigor you would like to see to the contrary. I'm tired of this trend of people shouting "Sauce! Sauce!" at each other. Collaboration is key.
driverdanpaganel5 years ago
Immune means you won't get sick. Not getting sick just means you didn't catch it.
seattle_springpaganel5 years ago
My wife had the cold once and I didn't get it. Must mean I'm immune to the common cold.
paganelseattle_spring5 years ago
Yes, you were probably immune to that particular cold strain. Or you weren't in close contact with your wife during that timeframe, but that wasn't the case for the persons I've written about.
rsynnottpaganel5 years ago
While it’s possible that some people might be immune, your anecdata is certainly irrelevant to determining that.
thu2111paganel5 years ago
It's hilarious that in a thread about misinformation the idea that children are immune is being treated as not true. Death rates for COVID for people under 20 are ~zero. Babies are in fact immune, perhaps due to relatively lower levels of ACE2 expression compared to adults.

HN readers seem to have totally lost it w.r.t. COVID and misinformation. It's practically guaranteed that in any thread about misinformation/FB/Twitter/etc someone will state something about COVID that's true and then describe it as misinformation, or state something about it that's false and then decry the conspiracy theorists who don't believe it.

kelnosthu21115 years ago
First off, you're moving the goalposts a bit here. No one claimed that the death rate for people under 20 was anywhere near that of older age groups.

Your assertion that "babies are in fact immune" is demonstrably false: https://data.cdc.gov/NCHS/Provisional-COVID-19-Death-Counts-by-Sex-Age-and-S/9bhg-hcku shows 20 deaths for children under 1 year old. Presumably many more than that were infected but survived (unfortunately covid.cdc.gov is timing out for me right now and a quick search didn't give me infection rates for that age group).

Yes, the number is small compared to cases in older people. But "babies are in fact immune" is exactly the kind of misinformation you're railing against.

thu2111kelnos5 years ago
Remember you're dealing with a very noisy, FP-prone test, and there have been millions of tests by now. At those levels there are bound to be some babies that tested positive and then died, so you can't infer from it that COVID actually killed them.

To put it in perspective, according to the UK govt's own analysis, it's very likely that all currently reported positive infections are false positives!

kyleeesatya715 years ago
Simple, just remove humans from the platform
subsubzerosatya715 years ago
Totally this. I get having ads injected into the home timeline (gotta keep the lights on!), but inundating people's feeds with "publisher's stories" showing them news that is blatantly false, overly negative, or polarizing is just not wanted by a vast majority of people.
beamatronicsatya715 years ago
I want an option to pay a yearly or monthly fee, that lets them still make money, but also protect my privacy.
taftsterbeamatronic5 years ago
Ha ha ha, they make too much money on you to allow that. Protecting your privacy can't happen at the platform end, because of the nature of the network itself. And too few people would pay for this feature; it would limit their ability to grow. Which is ultimately what it's about: the stock price.
Consultant32452satya715 years ago
This is a copy/pasta of a previous post of mine, but I feel it's very fitting here.

Why would they give up control of the world by doing something silly like that? Think about how much political influence Twitter has based solely on which tweets they show the President and corporate press. Consider how much untraced in-kind donations these companies can make by tweaking which news stories you see. The crazy thing about it is these things can be tweaked by humans, but it's largely controlled by AI now, which no one person will completely understand what's happening in any of these systems. We're in the early stages of AI controlling the global political future and it will tend to create whatever kind of future generates the most clicks. It's kind of like the game Universal Paperclips, except with clicks/rage/ads.

brian_cloutierConsultant324525 years ago
> Think about how much political influence Twitter has based solely on which tweets they show the President and corporate press. Consider how much untraced in-kind donations these companies can make by tweaking which news stories you see.

I hope you take this as kindly as I intend it, but what you're proposing is a conspiracy theory. This is a relatively nice attribute for a theory to have, because it gives you a nice heuristic for deciding whether the theory is true!

The likelihood of a conspiracy being true decreases as the number of people with knowledge of the theory and an incentive to report on it increases.

To take an extreme example, if the moon landing was faked, tens of thousands of people have somehow held on to that secret. Tens of thousands of people who could gain overnight notoriety by telling their story, and hundreds would have the proof required to gain even more popularity. The fact that nobody has ever broken ranks is a strong sign that the moon landing was not faked.

"Twitter and Facebook are secretly tweaking which news stories Trump and the rest of us are seeing" isn't a conspiracy on nearly the same scale as a faked moon landing. It requires some pretty incredible things to be true though.

- Maybe every employee knows, and none of them have decided to say anything, despite the large incentives to reveal the secret and win their moment in the limelight.

- Maybe not every employee knows, just enough employees know to implement it and hide that implementation from the others. Maybe every employee on the Algorithmic News Feed team knows. I don't know how Twitter and Facebook are structured, the team probably isn't called Algorithmic News Feed, but as one of the more important systems both Facebook and Twitter must dedicate at least a hundred engineers. So, 200 people were quietly chosen for their ideological purity and ability to keep a secret from their peers. These 200 people write code in secret. Somehow they commit lies to the monorepo and apply private patches to the code before deploys. The SREs must also be in on it, because those private patches will still show up in traces and their bugs will show up as errors. All of this happens inside Facebook, a company notorious for employees who speak up and expect transparency. It also happens inside Twitter, a company with such lax controls that until just recently thousands of people could use the internal admin tool to take over any account.

I don't know, I guess it's possible? Maybe you have a better idea for how it could be happening, but it just doesn't seem very likely at all.

Consultant32452brian_cloutier5 years ago
I'm not imagining an intentional conspiracy by anyone. Everyone need only respond to incentives. The AI responds to the incentives it was programmed to respond to such as engagement. The workers are responding to incentives such as profit. They tell you they censor some people because they fear it will radicalize you, will harm profits, or other similarly non-nefarious incentives. No conspiracy or under handed behavior is required.
iateanapplebrian_cloutier5 years ago
> I don't know, I guess it's possible? Maybe you have a better idea for how it could be happening, but it just doesn't seem very likely at all.

I’ve seen this kind of thought pattern a few times and frankly the way you are thinking doesn’t match reality.

I work on a 1000+ person enterprise software project.

Less than 5% of those 1000+ understand our customers' requirements and use cases in any real depth. This is despite trying for years to incentivise developers to have a broader understanding of our business.

Within that core 5% most decisions are driven by the 3-5 people who care about the particular area.

So for a 1000 person+ org you would need to corrupt 3-4 people to drive a hidden agenda.

This is for a project not trying to be secretive in any way.

To relate it back to Twitter you would probably need the right 3-4 people to push hard for content moderators to be hired in San Francisco instead of Bangalore in order to push hard left views.

Consultant32452iateanapple5 years ago
You don't even need to discuss your "evil plans" with anyone. Hell, it doesn't even need to be a plan. You just only hire people who already agree with you. You don't even have to do this consciously, it's the default human behavior.
iateanappleConsultant324525 years ago
> You just only hire people who already agree with you. You don't even have to do this consciously, it's the default human behavior.

Exactly - our product uses angular because two of our core engineers loved angular, helped people who were having trouble with angular, and hired people who also liked angular.

Not because angular was the best tech choice. We didn’t even do a proper evaluation.

And this is for a hundred million+/year project......

throwaway0a5ebrian_cloutier5 years ago
You're making the assumption that the people running these companies are trying to take over the world with AI or whatever. They're not. They're just trying to make the most money and do what's best for the company. The AI, the political influence, etc. are all just side effects of that. There's no conspiracy because nobody is conspiring. Everyone is just doing their jobs the best they can.
xapataConsultant324525 years ago
I believe the word is "paste" and you can link instead.
B-Conxapata5 years ago
"pasta" is a meme replacement of "paste" is there copy/paste context. It's relatively popular, albeit uncommon on HN.
monadgonadxapata5 years ago
Someone's addressed the "paste" part so I'll address the rest: the vast majority of people are not going to want to click on a link to a short post in the middle of an HN discussion. The content belongs here, not behind a link, and mentioning that it's a copy of what the user has said elsewhere is a courtesy to us.
1vuio0pswjnm7satya715 years ago
And if someone even mentions the idea of creating exactly this service -- baby pictures and friends' contact information can be exported from Facebook and imported into such a service -- a torrent of HN commenters would decry "fail" before they even tried it. Tech bloggers and "journalists" would also inject doubt into the minds of their readers. To get to where you want to go, you would need to ignore the critics. To accomplish what you describe, there is no necessity that every user logs in to the same network. Each network would only need to contain family and friends. Connecting one network to another would be optional. You are not asking to be in a graph with the entire world, to be connected to total strangers. Yet that is precisely what FB is constantly trying to achieve. Every person's information collected in a single database controlled by a single entity. One "social" network for everyone. Hence you are connected to people and companies you do not know, who are not your friends or family. Your behaviour can be studied. The ads, marketing and misinformation can flow freely.

(FB == Fish Bowl)

Kattywumpussatya715 years ago
> I don't want to be part of a machinery that spreads misinformation and conspiracy theories.

You're swearing off the internet entirely?

TalanesKattywumpus5 years ago
Internet? I can get all of that from the dude sitting in a wheelchair outside my coffee shop. (Pre-Covid, at least. Hope he's doing okay.)
sevilosatya715 years ago
I wonder what happened... In the early days of FB you were still somewhat close to your friend circle, and therefore most of the content was personal updates. Then at some point Facebook started suggesting links, posts you had not explicitly followed, news articles that may or may not have been verified. Now every time I check the comments under such posts, it's people arguing with each other, and then people share and spread whatever they see from these suggested posts, looking to confirm their existing beliefs even more strongly. And then people for some reason started believing they "own" the right to write whatever they desire on their wall, and that it's a platform for spreading their political opinions.

Just less than 10 years ago, it would've been considered very rude to push your religious or political opinions onto others, and especially in a professional setting it would've been considered highly unprofessional. But nowadays that line doesn't seem to exist anymore.

not2bsevilo5 years ago
Worse: Facebook decides that the "most relevant" comments on such posts are the most inflammatory comments, because their algorithms select for engagement, so they want to show you something that will anger you, so that you write an inflammatory reply.
Udiknot2b5 years ago
But do they decide? Can they actually read the comment and decide it's inflammatory? Or do they just expose comments that people like to engage with (filtering out only the most obvious insults)?
cmaUdik5 years ago
Not prominently exposing the comments they predict will best meet their engagement metrics from the get-go would be a big loss to give up, given the small scale of many threads.
TeMPOraLsatya715 years ago
Agreed, except maybe with an option to filter out the baby pictures. I never want to see those. Becoming a parent made me want to see them even less (my child is the best and the prettiest and that's all I need). :).
qpposatya715 years ago
I distinctly remember during the 2012 election, my friends began posting political materials extensively. Which makes sense because of the Obama campaign's unprecedented spend on social media.

Pre-2012 Facebook was awesome. Now the feed is almost exclusively bullshit from people I don't know.

throwaway423342satya715 years ago
It would hurt engagement.
Kirosatya715 years ago
People use Facebook for different reasons. I use it for Groups and to follow artists but definitely not to see baby pictures.
jessaustinsatya715 years ago
I don't want to be part of a machinery...

No one has to use Facebook.

wmf5 years ago
teddyhwmf5 years ago
From a cursory reading it does not really sound similar; it sounds like Google is picking a side and doubling down on it. The description of the Facebook policy, on the other hand, suggests that Facebook are trying to suppress the drama, not picking a side in it.
hirundo5 years ago
If we were to discuss politics at work at any length we would be at immediate risk of losing valuable people. We all pretty much know where we stand on politics, and it is not together. And many of us feel very strongly about our irreconcilable positions. But by carefully not talking about them (or engaging when someone less in tune starts) we get along just fine. That's not official policy but it is a good one.
benjohnsonhirundo5 years ago
Our small company has a "No drama" policy. We have an astounding diverse team and we've learned to appreciate each other.
ikirisbenjohnson5 years ago
For a lot of people, appreciating all people and letting them live their lives is "politics"
tolbishbenjohnson5 years ago
Does your company do anything remotely controversial such as moderating how people are informed, or providing tech for military organizations?
ry_cotolbish5 years ago
It is a political opinion itself to think those actions are controversial. At scale, anything is political. The only solution is to keep politics out of business, and manifest our political opinions through government. Business, like economics or biology, is dismal. The most efficient and productive continue on.

Government's role is to make the ideologically agnostic machine of business align with our values. In the kind of competitive economy we have, it can only be this way. If we try to apply politics from within a business, we risk introducing instabilities and inefficiencies, making the business less competitive–an existential threat to the values we incorporated into the business.

dragonwriterry_co5 years ago
> It is a political opinion itself to think those actions are controversial.

No, the existence of controversy over an issue is a question of empirical fact, not political opinion.

The ascription of significance to the existence of controversy may be a political opinion (and is certainly a value-based opinion), but not the question of whether controversy exists.

ballenfdragonwriter5 years ago
If it's an empirical fact, how much objection from how many (and which) people is enough to cross the line into controversial? You can always find at least one upset person about any significant decision of any company, thus it's inherently political when you decide which group of people or how big a group you have to have to merit the "controversial" badge.

Those complaining of being deplatformed would probably agree strongly with your definition, however, so I will admit the definition of this word is itself controversial. Or maybe I shouldn't, because the prior sentence feels very political to me.

dragonwriterballenf5 years ago
> If it's an empirical fact, how much objection from how many (and which) people is enough to cross the line into controversial?

Any. Controversial is a continuous-valued, not binary, attribute.

How controversial is enough to justify a particular reaction? That's a political judgement, and in practice has as much to do with where you stand on the controversy as how much controversy there is.

pjc50benjohnson5 years ago
Does it also have a sexual harassment policy, or is the response to one employee reporting that another groped them going to be "you're fired"? As is traditional?
watwutbenjohnson5 years ago
> we've learned to appreciate each other.

How do you know, if people can't express anger at someone? It's not a mock question. I recently found out that colleagues who pretended to have good relationships (because, as a cultural thing, we don't talk negatively about others) had long-term resentments against each other. And those resentments were influencing work under the surface in a negative way - until it blew up into dysfunction, which is how I realized.

benjohnsonwatwut5 years ago
We have an issue system - instead of "Bob always leaves his mess to clean up", if you have a problem you're supposed to put it in generic terms like "kitchen is sometimes left a mess" and then propose a countermeasure: "kitchen cleaning training and checklist".

If the countermeasure is reasonable we implement it.

We've found this keeps people from festering. Heck.. one employee thought he was being underpaid. He was.

He put it in the issue tracking system and we now have a public skill system and remuneration scale.

shakkharbenjohnson5 years ago
Not saying that this can't work in practice, but it sounds a bit passive-aggressive to me. I'd imagine that if someone has some feedback for me, they'd talk to me directly instead of leaving a ticket in an issue tracker. You already mentioned that your company is small and this system has worked for you so far, but I doubt it would work at Facebook scale.
peterlkhirundo5 years ago
This is exactly the argument for privacy. Privacy is about agreed lines because we know that there are some places where we just won't get along. The "if you have nothing to hide" argument assumes that you want to see the things that I'm hiding. In this case, people are hiding their political affiliation (or at least their explicitly expressed opinions) because we know that if it was shared, all of our lives would be harder.
roenxipeterlk5 years ago
Privacy is slightly different - it has a temporal aspect. Eg, religion on a census, for obvious reasons looking back at last century.

The consensus against retrospective punishment is a lot weaker than people might expect, and who knows what new social crimes the future will bring.

PunchTornado5 years ago
People shouldn't be forced to join in political debates in the workplace. If you're not interested, you should be able to avoid debates about climate change, racism, and hate speech.
pklauslerPunchTornado5 years ago
I completely agree with your first sentence, but would not characterize any of your examples in your second as being inherently political as opposed to having been needlessly politicized.
fivre5 years ago
I've made attempts to reach out to some of my old friends there on the T&S team in light of evidence of really blatant Russian agitprop thriving and finding an audience there. Between this and the Zhang memo, however, it looks quite doubtful that I'd be able to do much more than reconnect and share a rather depressing lunch as they explain that their hands are tied because of executive will.

The IRA and/or its successors or friends appear to have taken the same approach as Russian security services have with the rash of targeted murders in Europe, with a "this totally isn't our doing, but anyone slightly educated on the subject will recognize our hand, because we want them to be aware that it's us and we don't actually mind people knowing" wink wink nudge nudge threadbare veneer of disclaiming responsibility.

Normally, I wouldn't really care: the 2016 stuff everyone made a fuss about on social media was largely ineffective and at best served as a smokescreen to distract from their very successful actions outside social media--Buff Bernie is a lasting meme treasure and nothing more. This go 'round, however, they've apparently learned from their mistakes, and I'm seeing evidence that personal friends _are_ receiving and are influenced by their messaging.

I thankfully haven't really had to watch any family or friends succumb to the Fox News media poison, and thought my social circles largely insulated from that sort of problem, but I was apparently quite wrong--right about _what_ wouldn't influence people, but blind to the idea that other actors would follow the same model and create content that _would_ suck in their target audience.

https://twitter.com/evelyndouek is a good source of reporting about Facebook and other social media cos' continued lackluster attempts to stand up potemkin independent review bodies, if you want more info on the space and can stomach more disheartening news.

cltbyfivre5 years ago
How terrifying that sneaky Russians are able to effect arbitrary social change by just throwing $100k of FB ads at the problem! I thought I was a moderately intelligent and well-informed person, but after reading your comment, I now understand that no amount of reason or self-awareness can protect me from minuscule amounts of ad spend by foreign spies.
fivrecltby5 years ago
The point is that they're well-educated themselves, persistent, and capable of learning from their mistakes. Earlier attempts were childishly bad. The contemporary ones are better-crafted, and may not be reaching you--I don't know you and can't speak to that--but they are reaching people I know and care about.
luckylionfivre5 years ago
Can you rule out that the people they are supposedly reaching simply have different values than you, and that you're currently seeing circumstances that make them alter their stance because those circumstances hit a fault line between your values and theirs?
fivreluckylion5 years ago
Not entirely, no--arguably we may have similar values, in a sense, but (and this is where the guesswork comes in) may have arrived at them via different paths. I'll posit that the path matters, a lot.

The people in question are current students at my alma mater, where I studied, among other things, Russian language and the former Soviet Union. Some in the current class are likely studying the same, but most aren't, and even those that are, well, they're just starting to study it.

Again--no certainties there, but while I received a fairly decent US high school education, coverage of the cultural and political history of the former Soviet Union is limited by necessity--there's just not enough time to slot that in among everything else US high schoolers are expected to learn.

My gut feeling is that if I'm seeing them share this sort of content, that it's reaching them organically, not because they're finding it after a long time studying the whole of the space over a decade of hobby interest--that's where I'm coming from. The end viewpoints and values may have similarities, but they will be colored by many other factors, and those factors matter.

If that intuition is right, while we may share views in some sense, their view is quite possibly being shaped by actors whose intent is to shape it in a particular direction, who recognize that there are avenues to do so (the amplification/radicalization potential of internet content rabbit holes is well-documented at this point), and who aren't really interested in building a nuanced perspective grounded in mutual understanding of both FSU and American history.

Intuitively, based on their past actions, those actors want the opposite: to (skillfully, mind you) leverage their own nuanced understanding to craft a shallow, targeted narrative that's believable enough, with the primary goal of supporting their own agenda and political goals, not with the goal of building a strong basis of mutual understanding across borders. Is trying to reason about those aims hard, to the point of being nearly impossible to get right? Yes! Entirely! But I don't think the response warranted is "well, it's hard, we should all give up and just see what happens". We must try to instead do the best we can, both in our words and actions in a given moment and with an expectation that we won't be entirely on the mark always, but that we can and should try to watch for our mistakes and catch them as early as we can--that is how we improve and help one another.

So, to sum up, can I rule that out definitively? No. Can I make what I think is a reasonable assessment of what's going on, based on the information available to me, my own background of knowledge, and recognition of what's changed in the world since I made a similar journey? Hopefully, albeit worryingly, yes. I therefore think it's important not to abdicate any notion of responsibility or to call it a day and agree to disagree on a lot of the nuance--doing so tacitly grants one sort of nuance authority, and the intent behind it may not be entirely benign--historically, it hasn't been, and an about-face seems unlikely at this time.

jessaustinfivre5 years ago
...they've apparently learned from their mistakes...

So the dastardly Russkies didn't intend that Trump be elected? Someone tell Rachel Maddow! This changes everything!

fivrejessaustin5 years ago
They didn't, oddly enough! They were as surprised at the outcome as most everyone else was, and had more intended to put the expected Clinton presidency off on a bad foot. https://www.theatlantic.com/magazine/archive/2020/06/putin-american-democracy/610570/ is fairly on the mark about their goals then and now.

American coverage on their efforts was by and large terrible, at least from major outlets. Focused analyst coverage in the space has been a lot more nuanced, but nobody's reading that without an existing personal or professional interest.

The other half of that analyst coverage is that they rapidly became quite tired of Maddow and friends hammering on a very simple narrative that missed the point, but was very effective at achieving its actual goal, keeping consumers of major media on the left-of-center end of the American political spectrum engaged in their content and bringing in continued advertiser money. That tiredness is relegated to water cooler discussion on Twitter, however, so it's not going to shape major outlet coverage much.

jessaustinfivre5 years ago
Thanks for the link; that seems like a nice summary. I appreciated the warnings about "loose talk" "despite a lack of evidence to justify such", but this lampshade is the size of a tent and swallows the whole article. Jack Cable described real things that could be verified and don't contradict facts we already know. That was good stuff, but everything else seems exactly like loose talk without justifying evidence. Basing the argument for "Russians hacked Hillary's campaign" on the Podesta emails, for one thing, is problematic. Although we're warned "the Russians have grown adept at tailoring bespoke messages that could ensnare even the most vigilant target. Emails arrive from a phony address that looks as if it belongs to a friend or colleague, but has one letter omitted.", in reality the phish that got Podesta was totally generic. [0] There are probably a million people around the world who could have executed this phish. I think I could have done it, if I'd had the inclination.

That's about the extent of the claims that can actually be checked by the reader. Of the rest, I certainly agree with the warnings about poor security for voting machines and other election infrastructure, but that's been a commonplace on HN for a decade, and the most salient if by no means the most egregious example this cycle, the Iowa Primary, is totally dismissed. Also in other parts of the article we're assured without any sort of proof that no one hacked a voting machine in 2016. Can we be so sure? The narrative walks a narrow path. The Russians did bad things but not catastrophically terrible things (i.e. they prepared to discredit the election on social media but didn't change the results). Voting machines should be more secure but let's not even mention requirements for open code and hardware audits (about which I've been writing my legislators for many years). Federal efforts on election security since Trump took office have been paltry but everything before that was great. Did Goldilocks write this? Was she the confidential source who provided most of the information without attribution?

I'm glad that normal neoliberal Democrats will finally distance themselves from the Maddow noise, but I would have preferred actual progress by this date rather than just "yeah sorry we went loopy for 3.5 years". I'd also like some indication that the next president, whether he takes office in January or four years later, will do anything at all to make voting more secure and more accessible to citizens. As it is, I just expect more attacks on the First Amendment. News media firms won't complain; as you observe they're banking fat stacks with Trump to kick around. The concern that keeps me up at night is that they're cooking up a new Russia effigy with which to torment the public now that Covid-19 seems likely to remove Trump himself from public office.

[0] https://www.cbsnews.com/news/the-phishing-email-that-hacked-the-account-of-john-podesta/

jgacook5 years ago
Generally I try and shy away from being too alarmist, but I am so disillusioned with the kind of tech worker HN's userbase seems to represent. I think it's a feckless attitude to think that working in one of the best-paid, global, most influential professions in the world right now means that your only obligation is to clock in on time, code whatever you're told to code, take no ownership of the effect your work may have on the general public and collect your fat paycheck at the end of the month.

Why does it sound good to anyone that Facebook employees should be prevented from discussing the ethical implications of the product they sell their labor to create? Facebook's complete lack of accountability - internal or governmental - has to date:

- incited a genocide [https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html]

- provided a bias for right-wing content in an American election year (and fired the employee who blew the whistle on it) [https://www.independent.co.uk/life-style/gadgets-and-tech/news/facebook-fire-employee-conservative-right-wing-breitbart-charlie-kirk-dimaond-and-silk-a9659301.html]

- exacerbated a global pandemic, indirectly causing 1000s of deaths, by not policing Covid misinformation [https://www.theguardian.com/technology/2020/aug/19/facebook-funnelling-readers-towards-covid-misinformation-study]

- is arguably a contributor to the global rise in authoritarianism [https://www.theguardian.com/commentisfree/2020/feb/24/facebook-authoritarian-platform-mark-zuckerberg-michael-bennet]

and that's really just the tip of the iceberg. If you buy into the notion that Mark Zuckerberg is a nice man in a hoodie trying to run a business that his employees are tearing down with some radical agenda then I'm sorry, but how naive are you? Facebook has a track record of ignoring the consequences of what happens on their platform in order to continue profiting. It's not a mistake, it's the point.

We should be cheering on tech workers challenging the ethics of the work they produce, not talking about how inconvenient it is for Facebook workers to start realizing how questionable the product they're building really is.

fivrejgacook5 years ago
We should indeed. I unfortunately don't know enough people at Facebook well enough to have those conversations in person often, so the internet will have to suffice.

It's unfortunately very much in the interest of Facebook's leadership team to discourage it, however, as a clock-in, clock-out, see-and-hear-no-evil labor culture is good for the leaders' personal wealth, so ethics be damned, number go up.

SpicyLemonZestjgacook5 years ago
I think it's reasonable for someone working on, say, scaling the photo storage service to say that their work is apolitical and these debates aren't relevant to them. The performance characteristics of Facebook photos aren't going to incite a genocide or contribute to the global rise of authoritarianism.
claudeganonSpicyLemonZest5 years ago
I don’t want to go all Godwin’s law, but the “I was just scaling capacity for processing census punch cards” argument doesn’t pan out very well, historically.
SpicyLemonZestclaudeganon5 years ago
I don't think this comparison works at all except through Godwin's law. Nobody argues that, say, Walmart store clerks bear personal moral responsibility for their company's decisions.
jgacookSpicyLemonZest5 years ago
Yes, and this is why nobody is going after, for example, Facebook HQ's janitorial staff for the moral responsibility of Facebook's actions. Their income remains static in spite of Facebook's quarterly profit so it would be unfair to accuse them of trading their ethics for an income.

There is a fundamental difference when you're talking about a stock-owning, educated, in-demand software engineer, even if they are "just" working on scaling Facebook's image service. They have the institutional power at the company that they could leverage to change the product's outcomes, if they so desired.

SpicyLemonZestjgacook5 years ago
There's a difference, but it still strikes me as unreasonable to say that all institutional power must be leveraged towards political ends. I do business with a lot of companies whose owners don't agree with my politics, and I'd be unhappy to see them dedicate more of their institutional power towards fighting for things I don't want.
jgacookSpicyLemonZest5 years ago
Yes, but it's not a political end, it's an ethical end. Facebook is being leveraged by political actors to cause harm in an unethical way - wanting to prevent this is not a political stance, unless you believe that being apolitical means adopting some middle ground between America's Republican and Democratic parties, in which case considering ethics at all is a non-starter, since both parties have shied away from imposing any kind of hard regulation on Facebook.

Institutional power doesn't have to be leveraged towards political ends, but if you profit directly from an institution choosing unethical behavior in pursuit of profits then you are also behaving unethically. It's completely reasonable to apply that standard to the best-paid of Facebook's employees, just as it is completely reasonable for those employees to petition against committing more unethical behavior.

claudeganonSpicyLemonZest5 years ago
Last time I checked, Walmart doesn't sell conspiracy theories, stoke political violence, or run demonstrably false, targeted advertising. You can muddy things with as many analogies as you like, but it's quite obvious that Facebook's engineering staff is closer to my example than yours.
SpicyLemonZestclaudeganon5 years ago
Sorry, I don't mean this to be dismissive, but I don't think I can productively engage with the idea that Facebook is more closely analogous to the Nazi Party than to Walmart. I just wouldn't know where to begin.
claudeganonSpicyLemonZest5 years ago
My comparison was to IBM, not the Nazi party.

Facebook themselves already admitted that they were used to further a genocide, so your dismissal is somewhat beside the point.

https://www.nytimes.com/2018/11/06/technology/myanmar-facebook.html

chillacyclaudeganon5 years ago
Seemed to work out pretty well for all the scientists in Operation Paperclip. So historically, maybe it does pan out.
wmfjgacook5 years ago
I'm convinced that discussing ethics or politics inside Facebook or Twitter will have literally zero effect. Employees should either quit or get back to work.
jgacookwmf5 years ago
Why are you convinced of that? Unionized protests frequently accomplish institutional change - why do you think Facebook or Twitter would be exempt? If anything a unionized tech force striking would have more bargaining power than other groups since they are educated, specialized, and difficult/expensive for either company to replace en masse.
wmfjgacook5 years ago
I agree that unions could be effective, but I also doubt that Facebook/Twitter employees could ever unionize. And for anything less than a full strike, leadership will just ride it out.
itg5 years ago
Looks like tech companies are finding out there's a good reason so many older companies discouraged talk of politics, religion, etc.
iron00135 years ago
How is this different from silencing employee objections to unethical corporate practices? It’s not merely “talking politics at work” to point out that, for example, your company’s practices are helping a political party steal an election. That’s an ethical concern, not a political one.
paganeliron00135 years ago
> your company’s practices are helping a political party steal an election

Nobody is stealing anything; as the rules are set right now, influencing public opinion through media channels is not seen as "stealing". If the powers that be were to physically alter the votes and the voting process, that would be another discussion, but almost everything presented in the media is fair game.

pjc505 years ago
If Facebook doesn't want to talk politics, it should stop making political donations to PACs.
dejavuagainpjc505 years ago
That's right. There is a goal to separate the employer and employee, to absolve the proletariat of the moral connection between their day-to-day actions and the bourgeois, who all share in the protections of the corporate veil.

But this cannot be done, despite all attempts to quiet the cognitive dissonance. Every employee of an evil company is evil.

Every political message lobbied for by the employer is the employee's political statement. Any claims to the contrary reek of hypocrisy.

neonate5 years ago
drewcoo5 years ago
How has it worked at tabloid newspapers for longer than I care to remember? Same problem. This is not about tech.
fgrtr3terwy5 years ago
Facebook wants their employees to stop talking about politics, even though Facebook by its nature takes stances on deeply political topics. How exactly do you avoid political discussion when you're asking what constitutes hate speech, whether a US president should be allowed to violate Facebook's content guidelines, or to what extent governments can spread misinformation in other countries?

If you work at Facebook, your work directly or indirectly supports Facebook's political decisions. Facebook just doesn't want you to talk about it, because Mark and the executives make the decisions and you're just supposed to follow orders. This is how it works at many other companies. But for a long time, Facebook was able to recruit people to work there by promising that they could 'change the world' and 'make a difference.'

Side note: One of Facebook's board members apparently enjoys the company of white supremacists. https://news.ycombinator.com/item?id=24444704 Will Facebook employees be allowed to talk about that? If you work at Facebook, how do you feel about that?

SpicyLemonZestfgrtr3terwy5 years ago
"Curb" here doesn't seem to mean "ban". What Facebook wants is for employees to avoid the totalizing view of politics you're describing, where political discussion needs to occur in all places at all times and declaring a "no politics zone" constitutes taking a side.
elliekellySpicyLemonZest5 years ago
And that's an entirely reasonable stance for management to take at most companies. But not Facebook. They've deliberately tailored the platform to be a monetized political outrage machine. It's like ESPN announcing they plan to "curb" sports talk in the office.
SpicyLemonZestelliekelly5 years ago
If Yankees and Red Sox fans regularly got into shouting matches on the ESPN software development mailing lists, I expect they'd be told to knock it off, and I wouldn't see that as a contradiction of the idea that sports is important.
DaiPlusPlusSpicyLemonZest5 years ago
Even so, in the grand scheme of things sports is utterly inconsequential; politics isn't.

If your team won the Super Bowl or your nation took home a lot of gold at the Olympics, how does that affect you materially?

SpicyLemonZestDaiPlusPlus5 years ago
The reason to avoid political discussions is that they're often as toxic as the worst sports arguments, not that the underlying topics don't matter. No matter how important the underlying issue is, having my coworkers call each other nasty names won't resolve it.
gabereiserSpicyLemonZest5 years ago
this, I’ve worked at places where we could talk politics and have disagreements and it’s totally fine. It’s when those discussions become toxic it’s an issue and management would rather avoid those hard conversations all together. Having said that, I have also worked places where it’s been banned. I also remember certain mailing lists just as toxic for non-political (maybe political in the engineering world) reasons. I think it all comes down to how you foster discussions about discourse.
DaiPlusPlusgabereiser5 years ago
A problem with suppressing unrelated discussion (like politics and public policy) for all the right reasons such as those mentioned above, is that the suppression alone can be validly interpreted as the bosses/mods/admins reinforcing the status-quo if the workers/employees/users already feel oppressed or marginalised.

Advancing progressive politics necessarily requires upsetting the status-quo, whereas if someone is a social-regressive then simply adhering to the status-quo suits their agenda just fine.

...so then some places want to avoid that image, so they do allow discussion on controversial topics with the constraint that people debate things civilly - so far so good - except those same places also want to avoid an image of partisanship - and in the spirit of freedom of speech they'll allow discussion on any topic (again, provided it's done civilly).

...which leads to that place or community falling victim to the paradox of tolerance (as, by their own rules, those communities must allow for the advocacy of genocide and unspeakable crimes provided it's advanced by an individual who conducts themselves with politeness, while their debate opponent might be a gay, black, disabled Ethiopian Jew who is rightfully concerned for their own life and future and who may utter a swear-word a bit too loudly and suffer censure for doing so).

As far as I can tell, the most workable solution to that is for the bounds of the Overton Window to be explicitly declared by the bosses/mods/admins - who, in doing so, instantly open themselves up to accusations of partisanship, especially if extremists take advantage of people acting in good faith.

I believe in most places the current Overton Window permits discussion and advancement of communist utopian ideals but not far-right ethnonationalism - assuming those two are somehow equivalent - and if Facebook - or any other place - had a similar declared Overton Window policy then it can be said they're biased towards the left, which is great fodder for the pundits on America's most popular right-wing TV news channel.

SpicyLemonZestDaiPlusPlus5 years ago
You're not entirely wrong, but I think the endpoints you're picking obscure the problem. Few people mourn the loss of ethnonationalist discussion, but Facebook's Overton window is going to exclude a lot of mainstream right-wing and centrist thought that goes beyond identity politics. (For example, I wouldn't advise Facebook employees to tell their coworkers about opinions like "welfare programs are bad" or "all responsible people should own a firearm".) That's not just "fodder for the pundits" - it's a real problem if huge swaths of the political spectrum are unwelcome as employees of a major communications platform.
lopmotrDaiPlusPlus5 years ago
If politics is consequential, then one of the worst ways to handle it is have people digging themselves into entrenched extreme positions just to "win" the ideological war against their dehumanized imaginary enemy. Better to shut them up so at least they have a chance to engage their brain in peace without the endless emotional need to fight for whatever unreasonable nonsense somebody goaded them into getting angry about.
elliekellySpicyLemonZest5 years ago
You're right. Then again ESPN doesn't bend over backwards to constantly bombard viewers with content that will sow division and hate between franchises so I'd imagine it's less likely to be a recurring issue.
heartbeatselliekelly5 years ago
Don't get high off your own supply.
TalanesSpicyLemonZest5 years ago
They can be a "no politics zone" when they agree to "no politics money."
xamuelfgrtr3terwy5 years ago
>Facebook by its nature takes stances on deeply political topics.

Why is that "by its nature"? I don't think that's "by its nature" at all. Phone companies and ISPs facilitate communication between people, but that doesn't necessitate they take political stances on what communication to allow. Why should Facebook? If certain language should be restricted, then laws should be written restricting said language and Facebook should comply with those laws. Nothing about Facebook's nature forces them to go beyond that and act as de facto language legislators.

joshuamortonxamuel5 years ago
Perhaps this demonstrates why Facebook is fundamentally different from an ISP.

Facebook doesn't "facilitate" communication in the same way a computer "facilitates" communication. It facilitates communication in the same way that a forum or book club or group of people facilitates communication. Groups require some moderation to remain popular. Facebook is driven to moderate by the market.

t-writescodexamuel5 years ago
> Phone companies and ISPs facilitate communication between people, but that doesn't necessitate they take political stances on what communication to allow

Taking a stance to not control what communication is allowed is a very political stance. It just so happens that, I believe in those cases, it's also a legally mandated stance; but, if it weren't a legally mandated stance, it would absolutely be a political stance, whatever they ended up saying.

Where a private company decides to limit free speech (or not limit free speech) is, 100%, a political stance when the laws have not been written that make that decision for them.

Even if we maintain a law around protecting companies that just host other people's content vs curating and publishing content, it could be seen as a political decision whether a given website and company choose to be on the publisher vs public content stance.

I'm forgetting the word for publisher vs ... whatever it is where they take no responsibility for what people post on the site; but I hope my point is clear.

novokt-writescode5 years ago
In these debates, people seem to pretend that if Facebook and others don't preemptively stay as politically neutral as possible, pissed-off politicians won't do it for them, badly, in hundreds of countries. It's why social media companies handle politically powerful people with kid gloves.

A segment of social media company staff also don't like that reality and want their platforms to censor the political parties / discussions they don't like, and thus they toe the line and give unsatisfying non-answers at all-hands meetings and to the media.

ponkert-writescode5 years ago
This is why the Cloudflare CEO has lobbied for regulation that takes these decisions off his hands.
mehrdadnt-writescode5 years ago
> Taking a stance to not control what communication is allowed is a very political stance.

Then in your model of the world, how would one not take a political stance?

If everyone takes a political stance in your model by definition no matter what their intents or actions are then it's a rather useless definition.

elimehrdadn5 years ago
Indeed! We all take political stances every day and it is certainly not possible to run an apolitical social network. The word is probably not very useful. But being able to reflect on “sensitive” issues is extremely useful and, I would argue, necessary.
throwawayseaeli5 years ago
And yet any random person on the street can easily flag something as being political or not political. I don’t accept that everything is automatically political. That is just an unrealistic argument that doesn’t match the reality of how the word is used.
NoodleIncidentthrowawaysea5 years ago
You're correct. Almost everyone can efficiently categorize things they disagree with as "politics", while categorizing things they agree with as "common sense".
throwawayseaNoodleIncident5 years ago
Again this is hyperbole and is not how people think. The politicization of everything and denying that some ideas are apolitical are tied. The latter is a false justification for the former.
shardmehrdadn5 years ago
It's like Rush says in Free Will, "If you choose not to decide, You still have made a choice".
mehrdadnshard5 years ago
Taking that logic at face value, it would mean "if you choose not to take a political stance, then you still have made a choice". Okay, so you've chosen not to make a decision. Yet I don't see how having made the choice implies you've still made a decision ("you have still taken a political stance"). If anything it seems like you just argued against the point?
randallsquaredmehrdadn5 years ago
Choosing not to restrict what political things people can say (or to whom) is definitely a political choice. In many contexts today in 2020, choosing to allow others to speak about their opinions without restriction or opposition is seen as, not only a political choice, but an active attack.
shardmehrdadn5 years ago
If you choose not to decide on a political stance, you have chosen to accept the current political climate as it is.
luckylionshard5 years ago
What is the current political climate? And is it static, will it not change?

Is providing food in supermarkets without political background checks "a political choice"? If so, then everything, including picking your nose with your left or right hand, is a political choice and the term "political choice" becomes utterly meaningless.

shardluckylion5 years ago
The current political climate is the policies enacted by politicians and the populace's reactions to them. It is not static, and it changes with the politicians in power, the laws in effect, and the political mood of the populace.
adamseamehrdadn5 years ago
This is an extreme example. But. Imagine someone knocks on your door and says “please help, the secret police are after me.”

You don’t help, but you don’t report them to the secret police either.

That is a choice.

waterhouseshard5 years ago
It's a lot like, "If you're not with me, then you're my enemy."
CocaKoalamehrdadn5 years ago
> how would one _not_ take a political stance?

A non-political stance would be one which has zero side effects on anybody other than yourself. As soon as the actions you make and the decisions you take have an effect on somebody who isn't you, it becomes political.

Actually _taking_ a non-political stance is an exercise left to the reader.

stale2002t-writescode5 years ago
> Taking a stance to not control what communication is allowed is a very political stance. It just so happens that, I believe in those cases, it's also a legally mandated stance; but, if it weren't a legally mandated stance, it would absolutely be a political stance, whatever they ended up saying.

Not really, no. That is not usually what people mean, when they say "take a political stance".

We can extend this to other examples. Do you think that a grocery store should ban people from their stores if the individual is wearing a pro-Trump or pro-Biden T-shirt?

I think it would be pretty silly to condemn a grocery store, for refusing to ban people from their stores, if they were wearing a "Vote for Biden/Trump" shirt.

Most people would find it absolutely and completely ridiculous to ban people from stores for doing that.

adamseastale20025 years ago
[EDIT: actually what’s probably quite relevant is all the stories about high schools sending young women home because they don’t approve of their outfits even if said outfit doesn’t violate a dress code.]

Imagine someone walks into a grocery store naked.

Or wearing a t-shirt with a explicit image of a man and a woman having sex. Or two men having sex.

Or a t-shirt which says / shows something extremely inflammatory yet not illegal.

I could imagine various stores making various decisions in all of these cases, all of which would be the folks working in that store expressing their beliefs!

Humans are inherently social and thus inherently political (politics in the sense of politics as the negotiation and management of a community).

Or imagine someone walks into a grocery store and doesn’t wear a mask! Lol :):/:(.

stale2002adamsea5 years ago
> Or wearing a t-shirt

Or wearing a "vote for Trump/Biden" t shirt.

In that situation basically everyone would agree that it would be ridiculous to say that the grocery store should ban people for wearing that t-shirt.

I think that most people would not call refusing to ban someone for wearing this t-shirt a "political" decision.

Most people would agree that it is not political to refuse to ban someone for wearing a "vote for Trump/Biden" shirt.

If you do call that political, to refuse to ban someone for that, basically everyone would disagree with you.

ghaffstale20025 years ago
It's hard to parse all the double negatives.

But, in general, businesses should have fairly wide latitude as to what t-shirt slogans they allow customers to wear. (Though I think we can imagine various slogans a business might deem objectionable.)

At the same time, businesses can reasonably have a fairly narrow latitude as to what employees should wear, even barring an official dress code, with respect to even advocating for a specific candidate--and that may even get into matters of company campaigning.

stale2002ghaff5 years ago
> to what the t-shirt slogans they allow customers to wear.

But the point is that basically nobody would call it a "political" decision if a store allowed customers to wear "vote for Biden/Trump" shirts.

Nobody would call that political, if a store allowed customers to wear those shirts in their store.

adamseastale20025 years ago
Actually I do call it a political decision, but one that is not so controversial in many places in the US - the political decision to value freedom of speech!

In China, for example, while I don't know for sure, I would bet you could not wear a t-shirt with the face of, say, a former communist party member who had opposed the current clique in power (i.e. the closest thing to an "opposition" politician that China has).

Or a democracy activist t shirt.

Point is, in the US, though we are lucky in that we often don't have to think about it, our political principles of free speech allow for a lot of behavior.

Stores are demonstrating their political belief in freedom of speech if a store manager doesn’t kick up a fuss when someone walks in wearing a t shirt for a politician the manager dislikes.

Of course there are probably also laws or the manager is savvy enough to know they could get the store sued, but, you get what I’m saying I think / hope :).

You don’t notice it until it’s not there.

And thus, what often appears to be not making a choice, really is making a choice, albeit the default choice :).

ghaffstale20025 years ago
Yes, but a lot of companies would absolutely do so for employees--rather than customers--certainly including grocery stores even if they didn't otherwise have dress codes (which they likely do).
stale2002ghaff5 years ago
> Yes

So then you agree with the vast majority of people that it is not a "political" decision to refuse to ban someone for wearing a "vote for Biden/Trump" shirt?

Cool. That is my point. Basically everyone would not call it political to refuse to ban someone for that.

triceratopsstale20025 years ago
> Do you think that a grocery store should ban people from their stores, if the individual is wearing a pro Trump, or pro Biden Tshirt?

A lot of bars and clubs have dress codes. Many ban wearing clothes that could be perceived as "gang" colors. It would be a terrible business decision for a grocery store to do this, but I don't think it's wrong.

IncRndt-writescode5 years ago
> Taking a stance to not control what communication is allowed is a very political stance.

That's simply not true. One can verify the truth or falsehood of your statement by applying the knife of logic. Draw your statement to its logical conclusion in order to determine if it results in absurdity.

Let's do that.

A person who has had their brain surgically removed will (quite probably) never mention politics or attempt to control others' communications about politics. According to your statement, that person's actions are political. Sorry, but that's absurd.

dragonwriterIncRnd5 years ago
Your conflation of passive inaction with an active choice to refrain from certain action is absurd.
IncRnddragonwriter5 years ago
1. People who live in the Congo may consciously choose not to become involved in Canadian politics. It is absurd to think that is political.

2. You wrote: Your conflation of passive inaction with an active choice to refrain from certain action is absurd.

That is only meaningful as a circular definition. Choosing not to act differs from an inability to act. Why? How would you define the difference between the action of not acting and the action of not acting because of not making a choice to act?

dragonwriterIncRnd5 years ago
> How would you define the difference between the action of not acting and the action of not acting, because of not making a choice to act?

I wouldn't, because not making a choice to act in a particular way is very different from making a choice to avoid as policy acting in that way, or, as was at issue upthread, “Taking a stance to not” act in a particular way. Taking a stance is (for this subject matter, at least) political. Inaction on its own is not.

triceratopst-writescode5 years ago
I see this misconception a lot online, so I have to point out something important. The "publisher" vs "platform" distinction is only about liability.

Publishers are liable for everything posted on their websites. Platforms are not - as long as they make good faith efforts to take down or prevent posting of illegal content.

Both are allowed to engage in moderation, curation or "censorship". Engaging in such does not make a website a publisher.

Talanesxamuel5 years ago
I cannot say something into one end of the telephone, and depending on the content of my message, not have it delivered on the other end. That's the difference.
hartatorTalanes5 years ago
Some govs cut off your phone if you say certain keywords.
heartbeatshartator5 years ago
Which?
Talaneshartator5 years ago
If the state is using your telecom to actively monitor you, would you call that telecom nonpolitical?
triceratopsTalanes5 years ago
If the phone company finds out you're running an illegal/scammy robocalling operation from their phone line, they absolutely will cut you off.
kelnostriceratops5 years ago
Cutting you off entirely is not the same thing as chopping up what you're saying and selectively passing through parts of it and not others.
triceratopskelnos5 years ago
It's not a 1:1 comparison. The point is the phone company can and does take steps to prevent misuse of their network.

Would you prefer it if social networks went straight to bans for rule infractions?

alwayseasyxamuel5 years ago
They do take a stance on what can be amplified through ads or not. Hence its political nature.
r00fusxamuel5 years ago
Neither an ISP nor phone company has a feed algorithm or spam prevention* as a central part of its operations.

These are key to the usability of any social network. And they are inherently biased. Any such organization also has to take money, so ads are also key to their operations, and they have taken political stands on ads too.

The attempted comparison to utility companies is not compelling.

* arguably they do now by limiting "scam calls"

nraynaudxamuel5 years ago
Facebook decides how many feeds each post appears on. It's not a broadcast system; they dampen the spread with an active decision.
okintheoryxamuel5 years ago
At various points in time, Facebook has made it possible to target political ads based on your known interest in pseudoscience or antisemitism [1]. Giving access to this degree of targeting (even if the categories are algorithmically generated) is inherently political.

[1] https://www.getrevue.co/profile/themarkup/issues/probing-facebook-s-misinformation-machine-241739

mikenewxamuel5 years ago
Because Facebook, by its nature, is designed to filter content. It turned everyone and everything into a content generator, even down to the level of "Jimmy liked a post by Burger King!". It gives everyone access to this unfathomable ocean of content, and then filters it down to something a person can consume.

Some of the filtering is based on what the user wants to see, some of it is based on some notion of how "good" a piece of content is (scored by likes and engagement numbers), some of it is from advertisers paying to have their content make it through the filter, and some of it is Facebook deciding what should and shouldn't be seen (mostly driven by their desire to keep you on the platform). Every single thing you see on Facebook has made it through a huge filter that ultimately decides whether it's something you should see or not. And the inevitable outcome of building a gigantic what-information-do-you-get-to-see machine is that there are many, many parties trying to influence the machine.

Phone lines don't have that problem.
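
For illustration, a rough sketch of a feed filter built from those four kinds of signals; every name, weight, and field here is hypothetical, not Facebook's real ranking code:

  # Hypothetical feed filter: every candidate post is scored on the signals above,
  # and only the top slice is ever shown. All names and weights are made up.
  def build_feed(candidates, affinity, slots=20):
      def score(post):
          s = affinity.get(post["author"], 0.0)        # what the user wants to see
          s += post["likes"] + 2 * post["comments"]    # how "good" it is, by engagement
          s += post["paid_boost"]                      # advertisers paying through the filter
          s *= post.get("platform_weight", 1.0)        # the platform's own thumb on the scale
          return s
      return sorted(candidates, key=score, reverse=True)[:slots]

Everything below the cut is, for practical purposes, invisible--which is the part phone lines never had.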

Thorentismikenew5 years ago
This is a fair point, but I think the political aspect is an "emergent property" if you will, rather than an inherent one.

If Facebook limits the filtering to engagement, then it isn't the fault of Facebook that political content is engaging. That's just human nature. Disasters, outrage, politics, polarizing topics - these are all popular topics both online and offline, and spread quickly as town gossip well before Facebook.

It is only when Facebook steps in and says that particular topics need to be exceptions to the filtering rules that apply to everything else, that they make themselves into a political actor.

For instance, let's say that the news feed showed you content based purely on number of likes. If political posts get lots of likes that isn't Facebook's problem. If the same ranking rules apply to all posts (# of likes) then they would remain neutral. As soon as Facebook says "content from x person will have their ranking artificially changed to reduce/increase engagement with it" thereby making an exception to the rule that applies to everything else, they have now become a political actor.
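
One way to picture that distinction, as hypothetical code (nothing here is Facebook's actual implementation):

  # Hypothetical: one neutral rule for everything...
  def neutral_rank(posts):
      return sorted(posts, key=lambda p: p["likes"], reverse=True)

  # ...versus the same rule plus a per-source exception table, which is where
  # neutrality ends. per_author_weight might look like {"x person": 0.1}.
  def adjusted_rank(posts, per_author_weight):
      return sorted(posts,
                    key=lambda p: p["likes"] * per_author_weight.get(p["author"], 1.0),
                    reverse=True)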

tkzed49Thorentis5 years ago
Whether or not a property of a system is emergent does not seem particularly relevant to discussing its effects. It would be nice if we could just look at the way a platform like Facebook works, or see that it does not intend to be a "political actor", but the reality is that its effects are destructive. It has a strong effect on politics. The effects of Facebook are not outside of its responsibility simply because it isn't programmed to be political.
mikenewThorentis5 years ago
I agree with what you're saying; pointing a finger at one particular thing and saying "this gets suppressed" is a bad solution. In fact it probably opens the door to a whole new category of problems.

But... I think what we're seeing with political content is just a symptom of the real problem.

> Disasters, outrage, politics, polarizing topics - these are all popular topics both online and off-line, and spread quickly as town gossip well before Facebook.

This is true. But when information spreads through people's conversations with each other there's limits to how fast it spreads. There's also a lot of room for dialogue and different perspectives. If I have some silly conspiracy theory that I want to spread around, it's going to be pretty hard to convince the people around me that 5G is going to activate microchips that were injected into my bloodstream. They will likely point out that basic laws of physics don't really allow for that. But if I know how to game a social media algorithm[0] to connect me with millions of people that are susceptible to that kind of thinking, I could convince a shockingly huge number of them to believe it[1]. Especially if the social media platform isolates those people from opposing opinions and connects them with people that think similarly.

I think social media is like removing the control rods from a reactor. Those basic human flaws are now being amplified and capitalized on at a scale we can barely even grasp. And it really doesn't matter if Facebook, Twitter, etc. are "at fault" or not. It's a fundamental problem with these services, and the problems will continue to get worse.

[0] https://www.npr.org/2020/07/10/889037310/anatomy-of-a-covid-19-conspiracy-theory

[1] https://www.cnn.com/2020/04/13/us/coronavirus-made-in-lab-poll-trnd/index.html

adamseamikenew5 years ago
Agreed. It’s all pretty new stuff in a lot of ways and we are just starting to figure out what it actually is.
paulgbThorentis5 years ago
> let's say that the news feed showed you content based purely on number of likes

Does any site actually do this successfully? It seems to me that even sites that lean heavily towards algorithmic curation (including HN) still have an element of human veto.

whateveracctThorentis5 years ago
> This is a fair point, but I think the political aspect is an "emergent property" if you will, rather than an inherent one.

This is like saying discrimination that gets baked into an ML model isn't the creators' fault imo.

Thorentiswhateveracct5 years ago
No it isn't. The creator chooses which data to feed in, and which not to. Garbage in, garbage out. If Facebook positions itself as the arbiter of data and actively stops people from posting, then it must take responsibility for what is on the platform. If it simply allows people to sort the data that anybody can post, then any properties of that sorting (whether by likes, or date posted) are emergent.
jschwartziThorentis5 years ago
How could it not be the fault of Facebook when Facebook designed the algorithms that are creating all of the divisiveness on Facebook?

If I build a bridge intending it to stay up and it happens to fall down 6 months later, I'm responsible for it. Facebook created an algorithm that divides people politically and that surfaces content that is provably fictional. So they should be held responsible for it regardless of their intent. They don't get to invoke "common carrier" status when they're writing software that makes decisions about what you do or don't see. What makes a telephone a "common carrier" is the fact that the telephone doesn't decide who you call.

It doesn't matter whether it's software or a human. What matters is that decisions are being made by Facebook about what you do or don't see.

Whether or not it is intentional is immaterial to the effect. The law doesn't care about your intent. I wouldn't intentionally dump toxic waste into a river but I'm liable for dumping whether I intended to or not. Mark Zuckerberg can't just throw up his hands and go "oops it's software I can't help it" when it's his company that made all of the decisions about how the software works.

Thorentisjschwartzi5 years ago
Truth is not the point of social media. Facebook isn't an encyclopaedia. Humans already gravitate towards groups that validate their opinions. It makes sense for Facebook to show people content they want to see. It is incredibly Orwellian to say "Facebook should only show people content which corrects their wrong views". Facebook isn't a social conditioning tool. I find the alternative to "misinformation" much scarier. Misinformation and being mistaken are human flaws we will always have, and therefore any social groups will have them by default. Using technology as a tool to condition people out of their views against their will is scary.

The information is out there. There are reliable news sources. There are reliable databases and encyclopaedias and journalism. If people choose not to read them then that's on them.

adamseaThorentis5 years ago
IMHO we as a society and species are still figuring out mass communication.

Propaganda, misinformation and deception have always been human issues - mass media magnifies them as it does everything else.

And I think we all have the right to critique social media, just like we can critique the news, books, movies, etc.

We don’t have to agree but the discussion is a valid one to have!

We get to help shape our society and world, after all :).

IncRndThorentis5 years ago
Facebook has a fact-checking program. That program has third-party fact-checkers. Facebook has been documented as pressuring those third-party fact-checkers to change their rulings.

  https://www.facebook.com/business/help/2593586717571940?id=673052479947730
  https://www.fastcompany.com/90538655/facebook-is-quietly-pressuring-its-independent-fact-checkers-to-change-their-rulings
They can't claim that they checked facts, remove postings they believe are incorrect, and then quietly put pressure on the fact-checkers to have a different "opinion" as to what is "factual".
freehunterThorentis5 years ago
>It makes sense for Facebook to show people content they want to see.

The problem is, Facebook doesn’t show people content that they want to see. They show people content that they will engage with. That’s a very important distinction.

HN's algorithm/moderators actually explicitly do the opposite: if a thread gets too many comments too quickly, it's ranked downward. The assumption is that too many comments too quickly indicates a flamewar, and the HN moderators want to keep discussion civil. The approach Facebook takes is to "foster active discussion", which on the Internet typically means a flamewar. Nothing generates engagement like controversial political views. So that's what Facebook's algorithm/moderators show to their users.

Facebook absolutely is a social conditioning tool, it’s designed from the ground up to show people content that stirs their emotions enough to click “like” or the mad face icon or even leave a comment and wait around until someone replies back.
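
A back-of-the-envelope sketch of the two approaches; the formulas and thresholds are invented, and HN's real ranking is only approximately public:

  # Flamewar-damping, roughly in the spirit of HN (numbers invented):
  def hn_style_score(story, age_hours):
      score = story["points"] / (age_hours + 2) ** 1.8
      comments_per_hour = story["comments"] / max(age_hours, 0.1)
      if comments_per_hour > story["points"]:   # lots of comments, few upvotes: likely a flamewar
          score *= 0.2                          # rank it downward instead of amplifying it
      return score

  # "Foster active discussion": more comments is strictly better.
  def engagement_style_score(story, age_hours):
      return story["points"] + 5 * story["comments"]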

themacguffinmanfreehunter5 years ago
That's no distinction at all, what people engage with is just one effective way of measuring what people want to see. HN simply optimizes for something else, that's no less of a social conditioning tool than optimizing for engagement, just in a different direction. You could say that it's designed from the ground up to show people content that stirs their curiosity enough to comment cautiously, or to hide content that stirs their emotions enough to engage strongly.
Thorentisfreehunter5 years ago
My point is that this is what happens in real life. People will continue to engage in stuff that they want to engage in. Facebook doesn't force people to engage in anything.

I think it is far worse to attempt to condition people by showing them things *they wouldn't otherwise engage with*. What is scarier, showing somebody something they want to engage in based on their past behaviour, or showing somebody something they wouldn't have otherwise seen if you hadn't gone out of your way to shove it down their throat?

People seem to want Facebook to make people more placid. Oh you have extreme views? Here, let's condition that out of you by only showing you more moderate stuff. Oh you think x is bad? Let's not show you anything to do with x so that you'll hopefully forget about it and not engage with that part of your brain any more.

Like I've already said, this alternative is far more Orwellian and far more of a tool for social control, than simply optimising for engagement.

kelnosThorentis5 years ago
> I think it is far worse to attempt to condition people by showing them things they wouldn't otherwise engage with. What is scarier, showing somebody something they want to engage in based on their past behaviour, or showing somebody something they wouldn't have otherwise seen if you hadn't gone out of your way to shove it down their throat?

I don't think that makes sense, and I don't think that's what anyone's advocating for.

If you friend someone, or follow a page, or whatever, you are explicitly saying "I want to hear what this person/group has to say". They aren't saying "I want FB to carefully curate what this person/group says in order to increase my engagement of FB". FB shouldn't promote, hide, or reorder anything coming from someone who I've explicitly chosen to follow. It should just show me all of it, and let me decide what I do and don't want to see.

TheSpiceIsLifejschwartzi5 years ago
> The law doesn't care about your intent.

This isn't correct. The law in most modern democracies, as far as I'm aware, is very concerned with intent.

This is why we generally define murder and manslaughter as distinct.

Murder is the unlawful killing of another human without justification or valid excuse, especially the unlawful killing of another human with malice aforethought.

https://en.wikipedia.org/wiki/Murder

Manslaughter is a common law legal term for homicide considered by law as less culpable than murder.

https://en.wikipedia.org/wiki/Manslaughter

Murder vs manslaughter is the extreme example, though you'll find courts are broadly quite concerned with intent.

adamseaThorentis5 years ago
1. This sounds like a somewhat naive take on filtering algorithms.

2. Does it matter if it’s Facebook's “fault” or not? The issue is their power.

Imho ideally they would acknowledge and accept responsibility for their power and in the US at least there would also be some laws regulating them in this regard.

clairityThorentis5 years ago
it's not emergent, it's inherent. facebook is political because it was designed from the start to control information. that's coercive, designed literally to change people's behavior, whether covertly or overtly, which is the essence of power and the political. facebook employees cannot absolve themselves of responsibility by saying it's the machine that did it, or that other people were involved and complicit.

being political is not an incidental facet of facebook, it's a core intention.

themacguffinmanmikenew5 years ago
Phone lines do have that problem, just with a weaker effect on a smaller scale. Phone call spam/fraud is notoriously rampant, to the point where many carriers & developers create centralized systems to detect and filter calls down to something a person can tolerate. These systems are scored by spam reports and other metrics that I'm not privy to, some of it is T-Mobile or Google deciding what calls should be auto-blocked and what shouldn't (mostly driven by their desire to keep people using telephony, otherwise they wouldn't bother).

And the inevitable outcome of building a what-calls-go-through machine is that there are many parties trying to influence the machine. Eg. faking caller ID, evading blocks with throwaway numbers, spamming no-response calls to figure out which numbers are valid to target, faking a robot voice to pretend to be a real person.

Practically every modern platform uses centralized systems to filter the noisy world down to something fit for purpose, and sometimes this intersects with political issues. That's no reason to expect a platform like Facebook to become even more political in their stance than the existing level of politicization that is almost impossible to avoid.

wonnagexamuel5 years ago
You pick up the phone and call someone. You don't pick up the phone and have the phone company present you a list of people you might be interested in calling, ranked by how much they bid to be on that list.
csaxamuel5 years ago
> Nothing about Facebook's nature forces them to go beyond that and act as de facto language legislators

The ethics of being a/the major institution of mass communication in large parts of the world may not force FB to act as language legislators, but these ethics certainly should compel them to do so.

Relevant points:

- If FB’s status as a mass comms source is threatened, then the company itself is threatened. This threat can be due to a lack of trust in the platform and/or legislation that effectively legislates them out of existence (see below re free speech). This existential issue should compel them to factor language legislation into their corporate policies.

- Stockholders certainly care about FB’s status as a mass comms source even if no one else does.

- Stakeholders obviously care about this, too.

- Relying on governments to regulate mass communications is a Pandora’s box for FB since FB is an international platform.

- In the US, in order to facilitate and encourage free speech, mass comms laws are not particularly restrictive, but they are built on an underlying assumption about social-based regulation that generally holds up but seems to be completely broken with platforms like FB. If FB doesn’t address this issue, then the laws that end up addressing this issue may end up legislating FB out of existence.

To close, whether playing the language legislator is part of FB’s nature, an emergent property, or something else, there are very real reasons that FB has policies on regulating language. Whether they do this well or not is a completely different issue, but putting the onus on government legislators to address the problem with formal laws seems, at best, overly dismissive.

themacguffinmancsa5 years ago
Relying on governments to regulate mass communications is strictly better for Facebook than the alternative, where governments disagree with Facebook's chosen policies and punish them instead. This is why most companies prefer to comply with local law, only enforcing universal policy where it doesn't potentially conflict with local law.

Companies don't just have no reason to regulate language, they also have no serious authority to do so. The onus has always been and can only be on government legislators to address these issues in the most fundamental sense.

I'd like to see Facebook try to take on the Democrat/Republican eternal conflict in the US or the CCP in China on adopting a universal policy that they don't agree with, armed with powerful arguments like "the power of ethics compels me!" or "it's a Pandora's box because we're international!" or "stockholders care about our status!". Going above and beyond their most basic policy obligations has been a great way to attract the ire of political authorities who are now agitated over whether Facebook's policy is intentionally empowering or weakening their political enemies.

vkouxamuel5 years ago
> Why is that "by its nature"?

Because Facebook explicitly chooses how to construct the timeline it presents to its users, and because some of that timeline contains political content.

If Facebook were a dumb first-in-first-out aggregator, it wouldn't be political.

But it's not.

ashtonkemxamuel5 years ago
By choosing to have a system by which algorithms would recommend social content, they’ve committed to a course that is going to eventually collide with politics. The only way to avoid it is to get out of the recommendation game, which would be tantamount to corporate suicide at this point.
eli_gottliebxamuel5 years ago
>Phone companies and ISPs facilitate communication between people, but that doesn't necessitate they take political stances on what communication to allow.

They were regulated as public utilities and common carriers. Facebook is not.

lackerfgrtr3terwy5 years ago
How exactly do you avoid political discussion when you're asking what constitutes hate speech

Just like any corporate decision, you have a small number of people who are relevant to making the decision, and they discuss among themselves. It isn't productive to have 10,000 people who are all angry if Facebook doesn't make the decision their way, and they each spend an hour complaining about it on Facebook, while claiming that they're doing work because they're discussing a corporate decision.

cyrux004lacker5 years ago
Make that official then. Remove the BS "culture at FB" about being open and inclusive and your opinion matters stuff both internally and externally
cltbyfgrtr3terwy5 years ago
> How exactly do you avoid political discussion when you're asking what constitutes hate speech, whether a US president should be allowed to violate Facebook's content guidelines, or to what extent governments can spread misinformation in other countries?

These issues are not part of your job description. You were hired to write Javascript, not to set corporate strategy. Sit in your seat, content yourself with the $500K/yr you're being paid by your betters, and refrain from sharing with everyone else your facile moralism.

dwaltripcltby5 years ago
No one is forced to work for Facebook.
skjcltby5 years ago
Betters, haha
jessaustinskj5 years ago
Some commentators have not worked in the sector.
hartatorfgrtr3terwy5 years ago
> Facebook by its nature takes stances on deeply political topics

I use Facebook to stay in touch with old friends. If they want to share the latest Tucker Carlson video, not sure why FB should block them if I don't block them.

DaiPlusPlushartator5 years ago
Imagine if it wasn’t a Fox News talking head, but an incendiary video built on falsehoods being spread to incite violence - or what if it’s 2011 again, ISIS beheading videos are being shared, and the evidence shows they are driving ISIS recruitment up?
hartatorDaiPlusPlus5 years ago
I would still rather have a proper judge make that decision than Facebook. I can easily block people that share BS on Facebook, but I don't want Facebook to make that decision for me. That overrides my own choice to follow them in the first place.
elihartator5 years ago
What is a “proper judge”? You mean you want the courts to moderate your social platform?
stale2002eli5 years ago
> You mean you want the courts to moderate your social platform?

In situations regarding literal terrorist propaganda, and active calls for violence (Which were the examples given), which are already illegal?

Yes. The courts are the proper place for determining how literal terrorism/imminent threats of violence should be handled.

I don't think it's controversial to say that people who make imminent calls to violence, which the court system has already defined as illegal, should be handled by the law.

Most everything else should not be blocked by the platform, though.

DaiPlusPluseli5 years ago
> You mean you want the courts to moderate your social platform?

Honestly? Yes. At least that way there's impartiality, accountability, an appeals process, and enforcement.

If Facebook self-moderates you get all of the same downsides of "big government"[1] moderation and none of the benefits listed above.

[1] I assume your argument is predicated in terms of "big government" as though an unregulated and for-profit company having a near-monopoly on key parts of modern society is somehow superior to any kind of state involvement in reining in excesses that the free market fails to address.

not2bhartator5 years ago
Let's be more specific: you're in Myanmar (aka Burma), and there are Facebook groups calling for, and even organizing, massacres of the Rohingya minority. We should wait for a judge to shut that down? When the relevant judge is part of the people organizing the massacres? And no, this isn't hypothetical, this happened, and it happened on Facebook.
hartatornot2b5 years ago
It's not about stopping the posts on FB anymore but about arresting these people. Yes, a judge should make that decision.
monocasahartator5 years ago
The state of Burma allowed the massacres though. You can't trust a judge to shut it down.
rahulsbisenmonocasa5 years ago
Well, so you want FB to step outside the law and filter content. I don't see a way they can be right here.
DaiPlusPlusmonocasa5 years ago
After that episode Facebook should have been subject to an immediate inquiry and then US courts should have ordered FB to not operate in Burma/Myanmar until further notice.
heartbeatsDaiPlusPlus5 years ago
That should be OK. If my friend wants to send it to me, why shouldn't he be allowed to?
DaiPlusPlusheartbeats5 years ago
On an individual basis - one-to-one communication, sure.

But on a publishing platform where a posted article can have millions of viewers with no connection to the author who would miss out on important context... that won't end well.

There's a difference between Facebook facilitating private communication between individuals and small groups with inherently limited information-spread (e.g. phone calls, emails, IM) and Facebook operating a publishing platform that allows for mass communication. The problems we're seeing today stem from that very same mass-communication publishing platform being used as a state-level propaganda tool to sway public opinion (e.g. Russia discouraging Dem-leaning voters in 2016) at one end, to Facebook knowingly allowing and facilitating extremist groups to operate on their platform and coordinate real-life terroristic assaults at the other end.

bleepblorphartator5 years ago
I don't think it's a realistic concern that FB would block content from mass-market media for political reasons. They might block it for copyright, but there's no benefit in deplatforming material that's already been seen by tens of millions of cable viewers.

My problem with Facebook is that it acts as a radicalization pipeline by channeling lies that are too crazy for mass market media into the minds of people who are susceptible to believe them.

For a recent example in America, Facebook was used to spread propaganda to Republican rural residents in Oregon by telling them that Democrats were coming to light wildfires in their towns. Oregon rural people responded by setting up their own vigilante checkpoints.

It goes without saying that the claim that Democrats were coming to set rural Oregon on fire was a lie, but that lie spread like wildfire through Facebook, and it was a lie that could have very easily resulted in fatal violence.

To make matters worse, many of the too-crazy-for-mass-media ideas spread on Facebook are the product of astroturf propaganda and are not organic. The problem is not people sharing videos of Tucker Carlson, or even individuals spewing racist diatribes; the problem is well-funded right wing propaganda groups using Facebook to distribute material that encourages people to hate--or even kill--their fellow citizens.

While I agree that judges should make the ultimate decision on what is acceptable speech, the judicial system just isn't fast enough to respond to the speed of Facebook-propagated propaganda. Society needs a better solution; we can't wait for a perfect one.

Facebook is pouring gasoline on the fires of social division. This is not just unethical but is extremely dangerous to social stability and it ought to be stopped.

pesentifgrtr3terwy5 years ago
Facebook employee here. You can definitely continue talking about any company business or decisions, even if they have political relevance, as you did before. This change only applies to discussions completely unrelated to work.
gilrainpesenti5 years ago
Sure. You don't boil the frog all at once.
jgacookfgrtr3terwy5 years ago
Well put!

My parents are very much of the "no politics at work" generation and I really question why that cultural strain has carried itself into 2020 since it only serves company board members/executives and categorizes rank and file employees as automaton code monkeys who should "shut up and type".

Armchair thought: in this odd period of history where, ostensibly, capitalism "won" as the political system of choice and "the end of history" was declared we have entered an alarming stage of hyper-capitalism mixed with growing discontent/civil unrest. More than ever there seems to be a breathless determination by upper-middle class professionals to not rock the boat in any way in the hopes that these mega-corporations will continue to prop up the stock market, pay out outrageous salaries, and keep the gravy train running. It's a kind of cognitive dissonance where we can see how much damage the big players in tech are wreaking on global society - there's ample evidence - but to recognize and face it would sully the deeply held ideal that tech is some kind of great, benevolent force in our society (more cynically: confronting it would also mean confronting that fact that we as tech workers have ethical responsibilities to society at large that we have at best ignored, at worst defied).

Practically, it's not. Yes, you can catch up on how your cousin's new baby is doing, but you can't disentangle that from the extremist propaganda, disinformation, and real harm that these platforms cause by leveraging human psychology against us. Taking the view that ethics and work are separate silos is hopelessly naive. Almost every profession requires constant awareness and ethics in order to be a benevolent force: doctors, lawyers, builders, scuba gear manufacturers, car designers all have a responsibility to their end user and I can't see how tech is any different. I doubt people would react the same way if this were GM instead of Facebook and their employees were up in arms after learning the car they had been designing and building had a track record of blowing up and killing people.

sidllsjgacook5 years ago
Counterargument: a workplace is, at best, a poor environment in which to advocate for one's political causes. Most people engaged in labor for a company do so because they are not financially independent--that is, they have to in order to simply live. The larger the organization, the more likely the employee base will span incompatible political views. Also, very vocal proponents (regardless of what political view they're advocating) will tend to dominate and effectively marginalize the more soft-spoken: a poor outcome in other contexts; why not this one?

On a more personal side: I honestly cannot stand when most people discuss politics in the Slack at work. The vast majority of comments are snarky, are unsupported (by data) opinions, or are caustically dismissive of opposing views. It's bad enough when people holding political views I disagree with engage in that behavior, but it's much worse when people I do otherwise agree with do. And it happens in just about equal measure, as far as I've experienced.

Work is already stressful enough without adding to it with political fights.

jgacooksidlls5 years ago
1. I agree with you in as far as politicking that has nothing to do with your workplace can be a distraction, but as it pertains to Facebook the politicking is not abstract, but relates directly to Facebook's actions. It might be unacceptable for an employee to use company resources to boost a political candidate: this is not the case here. Facebook is curbing internal criticism of company policy.

2. I think it's disingenuous to imply that Facebook workers - and bear in mind we're not talking about the janitorial staff here, but tech workers who command salaries at and above $100K p.a. - must work at Facebook lest they be destitute. The greatest advantage of being a tech worker is the range of high salary positions available to you. That aside, I return to my previous point about this not being an abstract, culture wars style debate, but specific critique of company actions. It's not politics, but internal politics. Every company has internal debates about the strategic and ethical direction of the company - why not this one?

3. I understand that politics can be exhausting, especially in the highly polarized environment we live in, but I don't think that's sufficient reason to forbid internal critique of any company. Moreover I think the stakes are higher than we are comfortable with - Facebook has already ADMITTED that they provoked the Burmese genocide 2 years ago [https://www.nytimes.com/2018/11/06/technology/myanmar-facebook.html].

To flip the question around: what makes YOU think that YOUR personal right to feeling relaxed at work is more important than an employee's right to ensure that they do not work on a product that can lead to mass murder? Moreover, is it really a political stance to demand that you are not complicit in unethical activity?

sidllsjgacook5 years ago
Nobody said anything about “personal rights”. Also, yes, it is a political stance. And one in this context your own comment indicates you have a solution for: don’t work there (at FB). A lot of people don’t have the luxury to pick and choose employers for their political activities: arguably they’re not in a position to agitate on the inside, either.
anonymousabfgrtr3terwy5 years ago
>How exactly do you avoid political discussion when you're asking what constitutes hate speech, whether a US president should be allowed to violate Facebook's content guidelines, or to what extent governments can spread misinformation in other countries?

Probably gotta just do what Mark thinks is right and, should that not be clear, guess what he would think is right. And suffer the consequences yourself should you guess wrong.

tanilamafgrtr3terwy5 years ago
> its nature takes stances on deeply political topics

Most of its employees' jobs have nothing to do with politics. They imagine such relevance themselves to create greater significance/satisfaction in an otherwise mundane daily job.

ThomPetefgrtr3terwy5 years ago
I don't think they want them to stop; they want them to focus on the job, not take advantage of their well-paid and flexible jobs to be activists.
m0zg5 years ago
Good for them. I haven't worked at FB, but I can only assume they're similar to Google in this regard, maybe worse, since their workforce tends to be younger on average. Things were already getting pretty unbearable when I left Google years ago, and (according to people I know who still work there) took a turn for the _much_ worse in 2016. When recruiters email, I politely decline, without specifying why, but this is largely why. I actually liked working there when it came to _work_, but the environment was extremely politicized and oppressive. No differences of opinion were tolerated at all. You'd immediately be ratted out to HR for a mere suggestion that someone is too aggressive/uncivil in enforcing the dogma on internal Google+.
mensetmanusman5 years ago
Interesting to see young companies fall into line. There is a reason it is against norms to talk about these things in most companies: it causes undue conflict, usually far outside of the context of what is being worked on.
ponker5 years ago
Very glad to work somewhere where people just don’t talk politics or anything serious at work. I have kids and I don’t want to risk my job over saying the wrong thing. I don’t need you to be my partner in discovering myself, let’s just discuss our work, the weather, and the local sportsball results.
throwitawayfb5 years ago
I've just been given an offer from Facebook and I have a few days to decide to take the job or not. The ethical implications of what I'm doing are intense. On one hand, a near 400k total comp package is very nice, but on the other hand I don't want to make the world worse off. I think if I could make that kind of money working from home for another company it'd be an easier decision. Unfortunately, I have to play the hand I'm dealt.
trhwaythrowitawayfb5 years ago
Don't sell yourself that cheap, ask for 600k. Once you get it, your doubts will disappear just like by magic.
bendoernbergthrowitawayfb5 years ago
How much less would you make working at another, less evil company?
throwitawayfbbendoernberg5 years ago
Unknown. No other company has recruited me and made me an offer.
jgacookthrowitawayfb5 years ago
If you're being offered a $400K comp package I can say with a lot of certainty that you have not been dealt a hand and are, in fact, a highly skilled worker with a great many options for employment, so sincere congratulations on your success!

It's therefore hard to see how taking this offer would not be choosing to sell your ethics for money and success, given that you could likely land a well paid job anywhere.

chmaynardthrowitawayfb5 years ago
Consult a lawyer for advice on the legal implications of being directed to work on projects that either violate the law or your own ethical principles. The lawyer can help you draft an employment contract that protects you from retaliation if you object. If Facebook refuses to sign the contract, walk away.
emtelchmaynard5 years ago
Someone should make a museum of bizarre advice found on HN.

No normal company is going to sign _any_ contract provided by a prospective full-time employee (except perhaps if you are a sought after celebrity being hired at a VP level or above), so it would just be a waste of time and money for someone to take your advice.

Even if the hiring manager personally wanted to, there is no process for doing this. They don't have lawyers standing by to review such contracts. It would probably be hard to even find out who would have the authority to sign such a contract.

Further, retaliating against whistle-blowers is already illegal, as is ordering employees to break laws, so I don't know what additional protection you imagine you would get from such a contract.

chmaynardemtel5 years ago
> No normal company is going to sign _any_ contract provided by a prospective full-time employee

Agreed. To take my advice, you would need to be hired as a contractor/consultant. Normal companies do this all the time.

babythrowitawayfb5 years ago
I’m obviously biased since I work there, but I had the exact same concerns before starting two years ago. In reality things are much different from what HN makes it sound like. There are all sorts of people, and not everyone agrees, and much like the current climate in the US people are getting more and more polarized. I think it’s pretty awesome that everyone in the company is free to express themselves and debate and openly challenge management during Q&As and other events, but I also recognize that at some point the debates can turn toxic and I can see why we would want to avoid that. Once you realize how things work from the inside, you realize that the majority of people do want to make the world a better place, and that it’s easy to pick on things that didn’t work quite well and forget all of the positive sides that social networks have brought to the world. You can tell me that I’m drinking the kool aid but IMO internally things are really not at all like HN likes to portray it every day.
ciarannolanbaby5 years ago
I don't think the question is really about how the internal politics of Facebook work. The question is whether you should devote years of your limited working time and your talent to make the world a worse place to live in (which I think FB almost certainly does).
babyciarannolan5 years ago
And I’ll say that you’re entitled to your opinion. If you do think that, then indeed I would not recommend to join. If you are on the verge, like I was, then my advice is to join because you will be surprised to see that it’s not what you thought it was.
ciarannolanbaby5 years ago
It may not be what you thought it was internally, but its negative externalities are there for the world to see.
babyciarannolan5 years ago
What about the upsides?
ciarannolanbaby5 years ago
Also there for the world to see. I just think on balance Facebook is a force for social destruction and individual harm on a scale never even imagined.

I looked at your blog; you seem like a talented, driven person. Why not apply those gifts to something meaningful? Why spend your limited working years building tools for this horrible company?

babyciarannolan5 years ago
Well from my comments it should be obvious that we don’t agree :)
kelnosbaby5 years ago
The problem is that we can't take your agreement or disagreement at face value. The OP that prompted this entire thread admitted, front and center, that their $400k total comp number was very seductive. People do and believe all sorts of things when their fat salary incentivizes them to do so, even if they don't consciously realize it.

What's that Upton Sinclair quote? Ah, yes: "It is difficult to get a man to understand something, when his salary depends on his not understanding it."

I recognize that it might not seem fair to take this position, but understand that it's an easy one to take when I look at FB's negative effects not just on the world, but on the lives of actual people I know. It seems unlikely to me that a disinterested party could truly weigh FB's positives and negatives and think the balance is positive. But you are far from unbiased, and I hope you can at least realize that.

babykelnos5 years ago
Oh yeah for sure, I am indeed biased and I recognize that. I still think that most people here fail to see how the app is used throughout the world in positive ways.
kelnosbaby5 years ago
I don't think that's the case. Most replies I see absolutely acknowledge that FB has positive effects, just that the negative effects outweigh those positives (which is my position as well). And frankly I just don't think a current employee (who is relatively happy and in good standing) can be objective about this at all.
cheezebaby5 years ago
Heavily outweighed by the downsides.

Facebook has done a lot of good, but IMO there is no question that it's done more harm.

And Zuckerberg is a crazy person. There are a lot of people I wouldn't want to ultimately report to, but Zuck is right next to Larry at this point.

babycheeze5 years ago
> there is no question that it's done more harm

Not a great way to argue against someone who disagrees with that point.

Reedxbaby5 years ago
> Once you realize how things work from the inside, you realize that the majority of people do want to make the world a better place

It's hard to square that with the algorithmic feed, likes, etc, which are making the world worse every single day in favor of engagement metrics. We've known for many years how destructive these are.

Facebook and Twitter could literally make the world a better place simply by disabling those kind of features. Just remove them. It doesn't get easier than that to substantially improve the world, yet it's not being done.

luckylionReedx5 years ago
> Facebook and Twitter could literally make the world a better place simply by disabling those kind of features. Just remove them.

I'm not sure about that. I agree that the world might be better, but I'm not sure they could just disable them. The next smaller competitor who doesn't will have more user engagement and grow. If something is a very effective advantage, I believe you can only remove it by coordinated action enforced on a global scale.

Modern weapons are terribly efficient at killing people. But if you're the only country that's removing them from your arsenal, you depend on the mercy of your neighbors.

tpxlluckylion5 years ago
Ah yes, the "if we don't do it, someone else will" defense, that has been shown to have merit time and time again.

If you don't want to make the world a worse place, you don't do it. Hiding behind such logic means you're really just virtue signalling.

jakearluckylion5 years ago
This doesn’t need to be a secret flipped switch. It could be a very public announcement, with lots of supporting data and arguments. Nobody is going to build a successful competitor to FB based off of “we’re doing the same thing that Facebook just very publicly stopped doing because they took a stand against its society-destroying implications”.
luckylionjakear5 years ago
Maybe, but I have doubts. Nobody likes predatory lending, but it's still a blossoming industry. I believe that works for small things, but if the advantage is large enough, somebody will step up and do it.

And it's not like people don't like it. They "want" to be engaged, to feel anger and surprise, etc.; those systems work because they're catering to people's instincts and desires.

throwitawayfbbaby5 years ago
Thanks for your perspective, it helps.
thurnthrowitawayfb5 years ago
I suggest you try and talk to some people who actually get positive value out of Facebook for an alternate perspective. The HN audience (mostly young, nerdy white men) have always been one of the worst demographics for social networking products, and they might give you the impression that it's strictly bad for society because they don't personally derive any benefit from it. But if you step outside of that group, you will find there are actually a lot of people whose lives are enriched by social products.

Incidentally, this is why Google+ failed -- it was a social network marketed to the kind of people that hate social networking :)

PaulStateznythurn5 years ago
> The HN audience (mostly young, nerdy white men) have always been one of the worst demographics for social networking products...

Wow, that's a very racist/sexist statement and you don't even leave a hint about why you think it's true. Worse, it reads like you expect it to be obvious. What about a person's gender or skin makes them "a bad demographic for social networking"?

Edit: Also presumptuous of you about the HN crowd. Where would you even get those statistics? HN doesn't collect that data.

kelnosbaby5 years ago
It's great that y'all discuss this sort of thing internally, but at the end of the day, FB and similar platforms have increased polarization, addicted users, eroded privacy, and allowed state actors to influence elections. And that's the incredibly short list.

Whatever internal discussions you're having, they're not working. I'd posit that they can't work, because FB's entire business model is predicated on user-hostile, polarizing behavior, whether anyone internally will admit it or not.

It frankly does not matter one bit what things are like internally when externally we can see the harm FB has caused, and there is zero evidence that harm is going to stop.

ciarannolanthrowitawayfb5 years ago
Do what you know is right. What you can live with and be proud of.

It sounds like, from reading your comment a couple times, you know what is right but are tempted to ignore that and take the cash.

jballerthrowitawayfb5 years ago
I work at FB. I wouldn’t want to be working alongside people who think I’m a mercenary abdicating my moral precepts. And I wouldn’t expect that of my colleagues. So my guess is that you could get away with it easily, but you‘d be doing a disservice to yourself and your team.
throwitawayfbjballer5 years ago
No, I certainly wouldn't think of my coworkers as mercenaries for hire, as that's a very simplistic worldview that I don't subscribe to. My net concern is: Will me working at FB be a net good, neutral, or bad for society? I'm fine with neutral, I just don't want it to be a net bad.
evgenthrowitawayfb5 years ago
I think it is somewhat clear at this point that it is net bad. While I cannot comment on the claims made by some FB employees rising to defend their own moral choices I can say that I left FB when the consequences of my work on personal privacy could no longer be ignored. Luckily for me (I guess...) this was before it became clear that there was also the long-term destructive effect that FB has on democracy and civil society to deal with, but I think you should know that working there is always going to involve compromising your morals and ethics. You and your co-workers will be surrounded by a lot of motivational posters and routinely be told how your work is serving some nebulous positive benefit, but it will be a lie.

I have worked for a lot of start-ups, including several that grew to become significant giants in their segment of the internet. Facebook has been the only one I worked at where there was a group of employees whose job was to create posters to hang up in all of the offices telling everyone else how important and worthwhile it was to work at FB -- looking back I think this level of internal propaganda should have been a warning sign.

3131sjballer5 years ago
> abdicating my moral precepts

Sounds like there's nothing left to abdicate.

tanilamathrowitawayfb5 years ago
Not intense at all.

As a screw in the Facebook machine, your significance is trivial. This is true regardless of your intention.

Get over the ethical drama, I would say. Big tech is about as ethical as banks. In other words, the companies don't care, and they probably aren't.

skinkestekthrowitawayfb5 years ago
I'm actually in a different position:

I used to loathe Facebook and like Google. These days both seem about the same. Facebook's policy to leave people alone deeply resonates with me even though I still dislike them intensely for what they did to WhatsApp.

And for what it is worth, Facebook unlike Google hasn't insulted me for a decade with the ads they show.

kelnosthrowitawayfb5 years ago
> Unfortunately, I have to play the hand I'm dealt.

Usually you see someone say something like this when they're presented with truly awful options. Seeing it used to refer to a $400k comp package is a bit jarring.

And if you've made it through FB's hiring process and they've given you an attractive offer, I find it hard to believe you don't have other options that don't involve a big ethical quandary, or wouldn't if you interviewed around more.

fblifeadvice645throwitawayfb5 years ago
As someone who also feels they’re evil (left 5+ years ago), I will grant that it’s not all bad like many here like to think. I imagine many small businesses and nonprofits derive a lot of value from their community building and targeted advertising, which is (maybe) a good thing.

That being said, it’s a good idea to understand why the pay is so high (and it’s not because they’re nice people who only want the best for their employees):

You will be expected to leave moral qualms at the door. This an unwritten rule at many companies, but Facebook had to write it. That says something.

You will be expected to work for it. Hard. The people I know at Facebook easily put in 1.5-2x the hours I do at a FAANG-ish (late nights and weekends seem to be the norm), but get paid roughly 1.5-2x what I do. If that’s a tradeoff you’re willing to make, go for it. I however am making more money than I know what to do with, and thus value all the time I’m not working (hobbies, travel, side projects, etc) way more than the money I’d make from working during that time.

At the end of the day you aren’t going to singlehandedly destroy the fabric of society all that much in your first year, so if you’re fine making the above sacrifices for a year or two for some quick cash then fucking off to pursue some real interests, go for it. But I sincerely warn you against sacrificing too much of your life (youth especially) and morals for money -- it really isn’t as valuable as it’s cracked up to be.

gabereiser5 years ago
FB by its nature is political, as it supports its ad network. To say to employees you can’t be is really them admitting they aren’t equipped to deal with this crisis they themselves created.
eli5 years ago
I wonder what the people who justify working for Facebook because they’re “changing it from the inside” think about this.
forgotmysneli5 years ago
are there really people that naive at fb?
x3n0ph3n3forgotmysn5 years ago
_Absolutely_ there are.
secondcoming5 years ago
I'm not surprised. US political 'discussions' are ruining the internet as it is, having to endure it in the workplace must be unbearable. Reddit is fucked from it, Twitter should be avoided by everyone and it's here on HN too. It's also on Slashdot even though it's dead, the San Fran office of The Register seems intent on pulling that site down too.

The sooner this fucking election is over, the better. No more having to read about Marxism, Trump, Racists, Snowflakes and Trannies.

I downloaded nVidia Broadcast a while ago, it's really quite good.

themacguffinmansecondcoming5 years ago
I'd be surprised if it stopped after this election. It's shaping up to be one of the most divisive elections ever. Whichever the outcome, at least one of the two large factions will have a lot to say, which leads to the other faction saying a lot about what they have to say.
BTCOG5 years ago
All this rioting and yet nobody burned down the Facebook offices. A shame, could have brought about the radical change they expected.
jballerBTCOG5 years ago
By “they” do you mean Facebook? Or the rioters?

…or were you intentionally toeing the line between sarcasm and anarchy?

BTCOGjballer5 years ago
I specifically meant the ones burning buildings hoping it would bring about change. Burned the wrong buildings.
kerng5 years ago
This seems like a wrong move. The idea seems to be to disassociate employees from the problems of Facebook, and the problems the platform creates...

When you work at Facebook you should know what's going on and what the company is doing and causing and trying to help fix it.

It sounds like leadership is asking employees to put their heads in the sand - shouldn't a leader propose the opposite? What happened to move fast and break things?

htnsao5 years ago
Probably better to #deleteFacebook and give everyone access to their own Mastodon instance for ~$5/yr.
robjanhtnsao5 years ago
It costs a lot more than that to host a Mastodon instance. The system requirements for all of the dependencies are pretty heavy
htnsaorobjan5 years ago
Of course, currently. $5/yr could just be the targeted subsidized price.

#deleteFacebook

gorgoiler5 years ago
Yesterday, I introduced some of my pupils — in this case, a group of ten rowdy pre-teens — to the idea of decorum and vulgarity. There could well be a whiff of truth to a comparison between my class of children and this story about FB employees.

It feels very old fashioned, but are we not getting a little burned out by a world where people openly nail gun their identity politics to the mast?

When I were a lad (way back in the nineties) I was taught it was rude to talk about politics, religion, or money. This applied to anywhere one was in polite company, not just at home, and definitely not at work.

Apofis5 years ago
They drive their own content moderators insane. This is just corporate protectionism.
btbuildem5 years ago
The cognitive dissonance is nauseating -- on one side, a flaming sphincter of discord, pandering to the lowest common denominator. But they want the other side, the side that picks the diet and tunes the dilation to be disengaged, apolitical and obedient? I really am not sure you can have both, Zucky.
fareesh5 years ago
If the climate is such that the employees are so passionate about politics, is it at all possible that zero employees have their thumb on the scale in terms of using their position to nudge towards their desired election result?

That seems like a bigger issue. If I am an activist and I poison the enormous dataset that's being fed to a ML model, is anyone even going to notice?

disgruntledphd2fareesh5 years ago
They would notice the size of the ETL job necessary to do this (and tbh, I don't think anyone understands the individual level outputs of any large ML model well enough to accomplish this).
thu2111disgruntledphd25 years ago
Evidence suggests the opposite: they would write a self-congratulatory blog post about it.

For example, the work Google does on "de-biasing AI" is all about taking ML models and warping their understanding of the world to reflect ideological priorities.

https://arxiv.org/abs/1607.06520
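
For context, the approach in that paper works roughly by identifying a "gender direction" in the word-embedding space and projecting it out of gender-neutral words. A minimal sketch of that projection step, using toy vectors rather than real embeddings (a simple difference-of-means stands in for the paper's PCA step):

    import numpy as np

    def bias_direction(pairs, embeddings):
        # Estimate a bias direction from definitional pairs, e.g. ("he", "she").
        # The paper uses PCA over several pairs; a mean of differences is a
        # simplified stand-in here.
        diffs = [embeddings[a] - embeddings[b] for a, b in pairs]
        d = np.mean(diffs, axis=0)
        return d / np.linalg.norm(d)

    def neutralize(word_vec, direction):
        # Remove the component of a word vector along the bias direction.
        return word_vec - np.dot(word_vec, direction) * direction

    # Toy usage with random vectors standing in for trained embeddings.
    rng = np.random.default_rng(0)
    emb = {w: rng.normal(size=50) for w in ["he", "she", "man", "woman", "programmer"]}
    d = bias_direction([("he", "she"), ("man", "woman")], emb)
    emb["programmer"] = neutralize(emb["programmer"], d)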

disgruntledphd2thu21115 years ago
That paper (and all other work in this space) is about population-level inferences from the model.

My point is that the individual level outputs (which you'd need to accomplish what the OP was talking about) are essentially impossible to tune so precisely, given our current understandings of the models.

fareeshdisgruntledphd25 years ago
I'm reading a leaked internal Facebook document published on theverge.com where it's suggested that they build a "troll classifier" based on the use of words like "reeee", "normie", "IRL", "Shadilay", "lulz".

They have also suggested a "meme cache" - one of the memes shown is a Folgers coffee cup which says "Best part of waking up, Hillary lost to Trump".

Based on this classifier and hits to the meme cache, "trolls" would experience things like auto-logout and limited bandwidth.

Under "when to trigger this" they also suggest the period "Leading upto elections".

So on the one hand this document seems well-intentioned because there's some bad behavior in these groups like raiding, doxxing, racism, etc.

Rather than focusing on behaviour like doxxing and raids, the approach suggested seems to be directed at a specific group. Why? In the entire universe is it only this group that engages in this kind of behaviour?

It also does a broad classification that lumps together anyone sharing the same memes or vocabulary and subjects them to punitive action.

Also they associate the election with this, which seems especially puzzling.
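
To illustrate how crude this kind of classification is, here is a hypothetical sketch of a keyword/meme-hit scorer along the lines the document reportedly describes; the names, weights, and thresholds are invented for illustration and are not Facebook's actual system:

    # Example keywords from the document's reported list; "IRL" is omitted here
    # because it is common, benign vocabulary, which is exactly the problem.
    TROLL_KEYWORDS = {"reeee", "normie", "shadilay", "lulz"}
    MEME_CACHE_HASHES = {"placeholder_hash_1", "placeholder_hash_2"}  # known meme images

    def troll_score(post_text, image_hashes):
        words = set(post_text.lower().split())
        keyword_hits = len(words & TROLL_KEYWORDS)
        meme_hits = len(set(image_hashes) & MEME_CACHE_HASHES)
        return keyword_hits + 2 * meme_hits

    def apply_friction(account, score, near_election=False):
        # The document reportedly suggests actions like auto-logout and throttled
        # bandwidth, triggered in the period leading up to elections.
        if near_election and score >= 3:
            account["throttled"] = True
            account["force_logout"] = True

Anyone who shares the same vocabulary or memes trips the same score, regardless of whether they ever doxxed or raided anyone.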

Ruthalasfareesh5 years ago
Would you provide a link to that Verge article? I cannot find it.
unabst5 years ago
Zuckerberg touts free speech over Russia interfering with our elections or correcting the president or controlling the viral spread of disinformation, yet moves to control speech within the company due to inconveniences.

“If you don't stick to your values when they're being tested, they're not values: they're hobbies.”

― Jon Stewart

(Many said something similar, but I just love Jon Stewart)

jyrkesh5 years ago
> Ctrl+F -> "quit.", "quit "

I see so much debate about what's right to do within FB, "how will people change the structure from the inside with this rule?", etc.

QUIT. Just quit. Seriously. Make it public why you quit. Quit en masse. FB is not a good company. Your talents are useful in many other places.

Yes, I'm privileged in saying this. No, I wouldn't feel comfortable quitting my job right now.

But if you believe enough that FB is an evil company--as many of us have known for 10+ years now--you should not work there.

If they are doing bad things, and they are not open to people fixing said bad things, stop helping them do bad things.

greencabsjyrkesh5 years ago
They will, in fact, just hire somebody else. American companies have access to an unlimited supply of labor. Better to stick around and try to unionize. It may not happen in your lifetime, but the other fact is the United States is a corporate hellworld. Workers have no say in anything. Until that changes, the corporations won’t.
ggmgreencabs5 years ago
I think this argument has considerable merit.
dangusgreencabs5 years ago
I don’t know that this level of defeatism is justified. Quitting does help.

Why does quitting help?

Consider IBM. Their revenue is about 20% of its peak. It used to be seen as a monopoly power. Now we barely think about them.

I think that IBM has made itself an unattractive place for employees where it used to be seen as an extremely prestigious place to work. And I think poor quality employees, along with average to mediocre management has squandered an incredible, dominant company over the past 20 years.

Facebook will decline if all the most desirable employees just quit. It’s basically just math - those who know the most, interview best, and have the most accomplishments will be able to leave the fastest. Facebook will be left with the D team and they’ll get taken to the cleaners by competitors (as they already are with TikTok - and what’s the average age of the most engaged users of Facebook again?)

Anyway, the point is, you quitting hits a company in precisely the right place - their wallet. Employee turnover is tracked and costs companies money. Higher turnover does bring about changes.

dejavuagaindangus5 years ago
It absolutely does! There's a lot of evidence to support this claim. The smartest, most ethical people leave first. Eventually replacement employees will cost more and more to attract. The company will implode in a toxic talent vacuum. Shareholders will flee as margins decline. The company will have to take increasingly user-hostile actions to survive. Users will flee. Advertisers will flee.

And FB ceases to exist. A loud message will echo through Silicon Valley for years to come.

The only ethical decision for the engineers to make is to quit. Thus all employees there are mathematically unethical. They are writing the code that executes the immoral decisions.

thu2111dangus5 years ago
Quitting doesn't help. For it to mean anything, there has to be a plausible alternative plan that the quitters are supporting.

What is the plausible internal plan of the agitators and activists inside FB and other firms? They have none, beyond systematically ban more and more users who violate ever more bizarre and ad-hoc purity rules. That's not a plan.

Moreover, quitting over this stuff isn't a one way street.

Facebook is not an evil company. I wouldn't work there today but that's exactly because of their vicious internal partisan politics that make these firms so unfriendly to anyone who isn't strongly on the left. For anyone who thinks corporate diversity programmes are sexist against men, Brexit is a commendable move towards localism, that sometimes Trump actually might have a point, etc, Facebook is just not attractive to those people today.

It sounds like Zuck may be getting a grip on his workforce and professionalising it. If so, for every activist quitter they'll suddenly find they're more appealing to 10 more normal employees who just don't want their workplace to be a political battleground. Moreover people get more conservative as they age, and they also get more experienced. So they may suddenly discover they have access to more experienced senior engineers who were previously, uh, content with their current job.

dejavuagainthu21115 years ago
Facebook exactly is an evil company. No question about it. To its very core. Anyone who works there is paid money to propagate an evil impact on the world and thus, all employees who work there are evil.
daveFNbuckgreencabs5 years ago
If they have an unlimited supply of labor and we're living in a corporate hellworld, why do they pay so well and include so many perks? They sure seem to be acting like a company that's competing for a limited talent pool.
vldaveFNbuck5 years ago
Because it makes business sense to hire best of the best. Football clubs have unlimited supply of players, yet they pay ridiculous amounts of money to their players.
dejavuagainvl5 years ago
You're assuming the best of the best decide where to work based on compensation. Also consider the number of pro players is miniscule compared to the number of engineers at fb. It's a false analogy because of the law of diminishing returns and population dynamics.
daveFNbuckvl5 years ago
If it makes sense to hire the best of the best, then it should actually hurt Facebook when those employees leave.
greencabsdaveFNbuck5 years ago
In short: to keep doing whatever they want without you opening your mouth.

To elaborate: They don’t all pay well. Not even close. And none of them pay workers in other countries well, and they can outsource endlessly with no repercussions.

But they want to keep their home in the US (the world military empire) because it lets them do whatever they want. And they don’t pay migrants nearly the same because migrants are bound by their visas. But too much of that and the Americans will catch on.

Unionizing in this case is extremely unlikely. There is too much for the employees to lose. Instead, Facebook and the US Government will continue doing whatever they want.

daveFNbuckgreencabs5 years ago
It sounds like you're saying that Facebook has a lot to lose when employees in the US get upset enough to leave.
greencabsdaveFNbuck5 years ago
Yeah fair haha. But so does the US government if Facebook leaves. I do not actually think the current state of things is sustainable but I don’t think anyone knows how it will resolve.
kelnosgreencabs5 years ago
> They will, in fact, just hire somebody else.

That's fine, let them. "If I don't do it, someone else will" is a poor justification for anything.

praptakjyrkesh5 years ago
Quitting may make you feel better but it doesn't fix social problems. There's just too many people who can't afford to take a lower paying ethical job.
dejavuagainpraptak5 years ago
Saying that doesn't help. It's not a fact, it's an opinion, but you're presenting it as a fact. You're saying the future will be bad because there's an endless supply of bad people willing to do bad things.

It's a horribly dystopian opinion that disparages the moral action and inhibits good people from doing a good thing: taking a stand against unethical behavior.

praptakdejavuagain5 years ago
I did not say you should not quit, just that it does not fix anything.

What fixes things is organized political action.

chephpraptak5 years ago
I for one would be okay if I was not subjected to social problem solutions that facebook employees think are best.

I did not vote for them and I would rather have the people I did vote for (and that I can stop voting for) solving those problems.

That is just me though. I'm sure many other people would much rather have their problems solved by Facebook employees than elected representatives, I mean I for one also think that most people elect literally the worst people in the world and would rather have them ruled over by unelected clerks at facebook.

bzb5jyrkesh5 years ago
“Quit for your right to make the workplace a living hell for those who don’t share your views on politics”
29athrowaway5 years ago
There will always be people who do not care and just want to get paid. FB won't have many problems finding people to do the job without moral objections.
jondubois5 years ago
I think they should be allowed to talk about politics. Verbal conflict is always good. The reason why political conflicts are not resolvable these days is because there is a strong element of financial self-interest which is preventing honest and rational discourse.

On one side, some people have an interest in not accepting that their financial success is arbitrary and illegitimate. On the opposite side, some people feel that they have been locked out of an arbitrary wealth transfer and so they have a strong interest in not accepting that they're incompetent losers and that they deserve to be at the bottom of the food chain because they didn't time the market right (a highly speculative and irrational market too!). Or maybe they didn't pass the Facebook whiteboard test job interview questions several years back (which is also an arbitrary hiring process by many accounts)... So basically they missed out on a huge opportunity because of some fickle arbitrary reason.

I don't think blocking discourse is going to improve things. History has shown time and time again that preventing free speech will stop people from finding compromises. The only solution to the worsening problems will be violence.

If the elites keep suppressing speech, the result will be worse than WW2 and the elites will not stand a chance because it will be fought on their own turf... The elites won't even know who their enemy is. Their own friends and family members could be against them. They won't even realize it until it's too late.

The right thing to do is to find political solutions. I personally think that UBI (Universal Basic Income) would solve most problems. It wouldn't fix the wealth gap immediately, but it would fix the mechanism which is suspected of causing arbitrary (centralizing) wealth transfer and that would at least level the playing field.

UBI is a really good compromise. If the elites are so confident in their superior abilities, surely they have nothing to lose by leveling the playing field right?

BTW, I currently earn 100% passive income so I'm actually saying this as someone who is on the winning side... I've come so close to complete failure - I leaped over the crevasse in the nick of time; the system's fickleness and arbitrariness are crystal clear to me. I'm currently standing on the winning side of a very deep precipice and I can see legions of talented people running straight into it.