3 hours ago by eganist

Relationship Advice lead-ish mod here.

I've posted this elsewhere on Hacker News (https://news.ycombinator.com/item?id=23259595), but I'll repost it here since it's relevant:

---

We (r/relationship_advice) are rarely transparent with removal reasons. Our templatized removal reasons generally look something like this:

> u/[user], please message the mods:

> 1. to find out why this post was removed, and

> 2. prior to posting any updates.

> Thanks.

or

> User was banned for this [submission/comment].

The reason is that we have a high population of people who:

1. post too much information and expose themselves to doxxing risks, and

2. post fantasies that never happened.

So to protect the people who inadvertently post too much information, we remove these posts using the same generic removal template. If onlookers could tell that a post was pulled for one of these two reasons, the submitter might still end up on the receiving end of harassment, so we fuzz that possibility by withholding removal reasons much more broadly.
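A rough sketch of that pattern in Python (hypothetical names like `RemovalReason` and `log_removal`; this is not any real Reddit or moderation-bot API): the true reason is recorded only in a private mod log, while every public removal comment is identical, so nothing can be inferred from the message itself.

```python
# Hypothetical sketch of the "generic removal" approach described above.
from dataclasses import dataclass
from enum import Enum, auto


class RemovalReason(Enum):
    DOXXING_RISK = auto()  # too much identifying information
    FABRICATION = auto()   # post appears to be fantasy/fiction
    OTHER = auto()


GENERIC_TEMPLATE = (
    "u/{user}, please message the mods:\n\n"
    "1. to find out why this post was removed, and\n\n"
    "2. prior to posting any updates.\n\n"
    "Thanks."
)


@dataclass
class Removal:
    post_id: str
    user: str
    reason: RemovalReason


def public_removal_comment(removal: Removal) -> str:
    """Return the same generic text regardless of the real reason,
    so onlookers cannot infer why a given post was pulled."""
    return GENERIC_TEMPLATE.format(user=removal.user)


def log_removal(removal: Removal, modlog: list) -> None:
    """The true reason is kept in a private mod log, never posted publicly."""
    modlog.append((removal.post_id, removal.reason))


modlog: list = []
r = Removal("abc123", "throwaway42", RemovalReason.DOXXING_RISK)
print(public_removal_comment(r))  # identical text for every removal
log_removal(r, modlog)            # real reason stays private
```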

This is specific to r/relationship_advice. Other subreddits have other procedures.

3 hours ago by CM30

As I'd have expected, it seems providing an explanation does indeed reduce the likelihood that the user will break the rules again, and makes them more likely to contribute in future.

Surprised there wasn't much difference between a human-provided reason and a bot-provided one, though (in fact, the latter performed slightly better). Wonder what the reason behind this could be?

> Our results show that, controlling for other factors, explanations provided by automated tools or bots are associated with higher odds of moderated users posting in the future

Either way, I always make sure to provide an explanation of why a piece of content was removed on any site I manage, with the sole exception of obvious bot submissions (since bots are literally incapable of understanding feedback).

2 hours ago by matthewheath

The emotional rejection experienced by the moderated user might be less when they associate that experience with an automated tool/bot rejecting their posting versus a human.

It's possible that a moderated user might feel less dejected when they know it's not a human personally reviewing their content and rejecting it.

Certainly I myself would feel less inclined to try posting again if a human moderator rejected my content (because I feel that I have little chance of changing their mind, and I'd feel judged) whereas I'd be inclined to try again if a bot rejected my content because it's impersonal. This is, of course, entirely irrational since automated tools can't really be persuaded in the same way a human can.

an hour ago by TulliusCicero

Sometimes, you don't want them to contribute anymore.

A large number of the people who get banned on the subs I moderate are just belligerent assholes that we're better off without. Someone posting blatantly racist or misogynistic comments isn't someone we want around.

39 minutes ago by naasking

> A large number of the people who get banned on the subs I moderate are just belligerent assholes that we're better off without.

Nobody is "just" anything.

32 minutes ago by TulliusCicero

I'm talking about their message board contributions, obviously.

That someone might have a normal life outside of screaming at women and minorities online doesn't have a ton of relevance on the forum where they're doing the screaming.

2 hours ago by js8

I wish HN would require an explanation (or a counter-argument) for downvotes. Often I get downvoted and have no idea why.

If the content is obviously spam, it should be flagged and removed, not just downvoted.

32 minutes ago by GuB-42

Slashdot has something like that.

Up/downvotes have categories. For upvotes there are interesting, insightful, informative, and funny; for downvotes, troll, flamebait, redundant, and offtopic. There are also the general categories overrated and underrated, and you can flag posts for things like spam.

It is not a full review, but it is better than nothing. Users can also assign weights per category: for example, if you don't want to see jokes, you can lower the value of "funny".
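A minimal sketch of how that per-category weighting could work, in Python; the category names come from the comment above, but the scoring function is an assumed illustration, not Slashdot's actual karma system.

```python
# Hypothetical sketch of Slashdot-style per-category vote weighting.
UP_CATEGORIES = {"interesting", "insightful", "informative", "funny"}
DOWN_CATEGORIES = {"troll", "flamebait", "redundant", "offtopic"}


def effective_score(votes: dict[str, int], user_weights: dict[str, float]) -> float:
    """Combine categorized votes into one score, letting each reader
    re-weight categories (e.g. set 'funny' near 0 to hide jokes)."""
    score = 0.0
    for category, count in votes.items():
        weight = user_weights.get(category, 1.0)  # default weight: 1.0
        if category in UP_CATEGORIES:
            score += weight * count
        elif category in DOWN_CATEGORIES:
            score -= weight * count
    return score


# Example: a reader who dislikes jokes down-weights "funny".
votes = {"insightful": 3, "funny": 5, "offtopic": 1}
print(effective_score(votes, {"funny": 0.1}))  # 3 + 0.5 - 1 = 2.5
```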

3 hours ago by tumetab1

A few years back, Jeff Atwood had a "guide" on Coding Horror (which I can't find now) that partially matches this data: the poster, and everyone else, must be able to see the original post and the reason it was moderated, so the rule is clearer for everyone.

I think on Reddit, as in other communities, the problem is that most users still don't see moderated content, because only upvoted/popular content is seen.

Maybe a future analysis could compare communities where moderated content is regularly visible to the whole community vs. a small subset of the community vs. only the poster.

3 hours ago by benjaminjosephw

I wonder whether there's a membership size threshold where, after crossing that tipping point, the community implicitly loses some degree of transparency whatever the moderation rules are. At some point, the increasing impact of each decision will skew the potential motivations of members and moderators. Intent becomes very difficult to judge, meaning the degree of transparency decreases.

I've seen so many comments from people about the changing nature of online communities as they grow. Perhaps capping membership could curb the increasing politicisation of user behaviour, and the weight of any individual action would be limited as a result.

17 minutes ago by elliekelly

I wonder whether it's even limited to online communities. I've had similar experiences in a small office and in a small student organization that grew rather large.

3 hours ago by HPsquared

Too much transparency can result in people "gaming the system" if the full workings are known.

37 minutes ago by naasking

If the rules are solid, "gaming the system" just means they're rule-abiding members. If there's a problem with that, improve the rules and don't blame the users. Reducing transparency because you're relying on tricks is a great way to make users start hating your mods.

14 minutes ago by elliekelly

This is the same argument I hear against open-source software, but in practice transparency seems to help identify and solve problems more quickly.
