Study Finds That Banning Trolls Works, To Some Degree (vice.com) 341

An anonymous reader quotes a report from Motherboard: On October 5, 2015, facing mounting criticism about the hate groups proliferating on Reddit, the site banned a slew of offensive subreddits, including r/Coontown and r/fatpeoplehate, which targeted Black people and those with weight issues. But did banning these online groups from Reddit diminish hateful behavior overall, or did the hate just spread to other places? A new study from the Georgia Institute of Technology, Emory University, and University of Michigan examines just that, and uses data collected from 100 million Reddit posts that were created before and after the aforementioned subreddits were dissolved. Published in the journal ACM Transactions on Computer-Human Interaction, the researchers conclude that the 2015 ban worked. More accounts than expected discontinued their use on the site, and accounts that stayed after the ban drastically reduced their hate speech. However, studies like this raise questions about the systemic issues facing the internet at large, and how our culture should deal with online hate speech. First, the researchers automatically extracted words from the banned subreddits to create a dataset that included hate speech and community-specific lingo. The researchers looked at the accounts of users who were active on those subreddits and compared their posting activity from before and after those offensive subreddits were banned. The team was able to monitor upticks or drops in the hate speech across Reddit and if that speech had "migrated" to other subreddits as a result.
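
The comparison the researchers describe (extract a lexicon from the banned subreddits, then measure each user's usage of it before and after the ban) is straightforward to sketch. The Python below is only an illustration of that before/after measurement, not the study's code; the post format, the lexicon contents, and the helper names are placeholders.

from datetime import datetime

BAN_DATE = datetime(2015, 10, 5)            # ban date given in the summary above
HATE_LEXICON = {"slur_a", "slur_b"}         # placeholder; the study mined these terms automatically

def lexicon_rate(posts):
    """Fraction of a user's words that come from the hate lexicon."""
    words = [w.lower() for p in posts for w in p["text"].split()]
    if not words:
        return 0.0
    return sum(w in HATE_LEXICON for w in words) / len(words)

def usage_change(posts_by_user):
    """Per-user change in hate-lexicon usage from before the ban to after it."""
    deltas = {}
    for user, posts in posts_by_user.items():
        before = [p for p in posts if p["created"] < BAN_DATE]
        after = [p for p in posts if p["created"] >= BAN_DATE]
        deltas[user] = lexicon_rate(after) - lexicon_rate(before)
    return deltas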

  • Remind me... (Score:2, Insightful)

    by sycodon ( 149926 )

    ...who gets to define who the trolls are and what constitutes Trolling?

    Is it like Pornography?

    • Re:Remind me... (Score:5, Insightful)

      by Anonymous Coward on Wednesday September 13, 2017 @04:41PM (#55191407)

      Coontown was banned because of the speech it contained, not because of what its users did. Reddit's CEOs Steve Huffman and Ellen Pao both admitted this.

      The Board of Directors pushed for the banning, spez complied.

      Reddit is a left-wing propaganda mill; they hire employees specifically to promote social justice (this has been admitted too!), and they also banned my subreddit /r/alternativeright simply because they didn't want to give /r/altright's userbase to me. My sub didn't have any doxing info on it.

      • Re: (Score:3, Insightful)

        by cayenne8 ( 626475 )
        Maybe we all just need to go back to USENET...where you can do and say most anything.

        Perhaps a public push for more people to host USENET servers for free or something.

        If the speech is short of actually inciting someone to bodily harm, then people should be allowed to post and express it, even if it is distasteful.

        • Re:Remind me... (Score:5, Interesting)

          by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Wednesday September 13, 2017 @05:15PM (#55191661) Homepage Journal

          A lot of us have moved to Mastodon [joinmastodon.org], which is like Twitter but federated like email. You can host your own Mastodon instance (server) and set your own local policies. Then your users can talk to users on my instance, just like Outlook users can email people at Gmail.

          But! I can set my own policies, too. If your users are causing problems for mine, I can completely disconnect from you and end the problem from my side. This is an excellent situation. Instances that are too tolerant of trolls find themselves disconnected from the network. Instances that are too thin-skinned and that sever connections too quickly end up the same. Either way, their more mainstream users are likely to flee to more moderately administered instances, so there's a nice feedback loop that optimizes for common decency above other extremes.
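
          For illustration only, here is a toy model of that defederation mechanic in Python. It is not Mastodon or ActivityPub code, and the class, domains, and post format are made up; the point is simply that delivery is a purely local decision, so any admin can cut off a badly run instance unilaterally.

          class Instance:
              """Toy federated server: keeps its own block list and inbox."""
              def __init__(self, domain):
                  self.domain = domain
                  self.blocked_domains = set()
                  self.inbox = []

              def block(self, other_domain):
                  # Defederate: stop accepting anything from that instance.
                  self.blocked_domains.add(other_domain)

              def deliver(self, post, from_domain):
                  if from_domain not in self.blocked_domains:
                      self.inbox.append(post)

          home = Instance("example.social")
          home.block("trolls.example")
          home.deliver({"text": "spam"}, "trolls.example")   # silently dropped
          print(len(home.inbox))                             # 0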

        • Perhaps a public push for more people to host USENET servers for free or something.

          If you do so, expect a saturated connection as the pirates flock.

        • by AmiMoJo ( 196126 )

          The problem with Usenet is that there is no effective way to block people. All you have is a useless plonk file, which is trivially defeated by changing username/email.

          I guess if you like reading at -1 all the time that sounds great, but most people want some kind of filter.
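
          To make that concrete: a killfile is little more than a lookup on the From header, so a new address walks straight past it. A minimal sketch with made-up addresses:

          KILLFILE = {"troll@example.com"}          # the "plonk" list

          def visible(article):
              return article["From"] not in KILLFILE

          print(visible({"From": "troll@example.com"}))    # False: plonked
          print(visible({"From": "troll2@example.com"}))   # True: same person, new address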

          • Can I interest you in a cup of public-reputation-based proactive filtering? Wouldn't you rather spend your time with nice people, perhaps with a tilt in favor of people who have even better reputations than your own?

            • by AmiMoJo ( 196126 )

              Maybe... The problem with reputation based systems is that they are wide open to trolling as well. Slashdot almost works, but periodically people going against the groupthink or getting mod-bombed have their karma destroyed.

              Effective filtering seems to be the best option.

              • by shanen ( 462549 )

                Sounds like a bit of a request for more information and a suggestion about the direction of the information you seek?

                I think that by making the reputation-source-data available (via links on the analysis page), you can prevent the trolls from gaming the system. You would be able to apply various algorithms to detect trolls and even networks of sock puppets, basically by using the 6 degrees of Kevin Bacon approach. Legitimate people would eventually link to legitimate people you actually know, while sock pup
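
                As a rough illustration of that degrees-of-separation idea (my own sketch, not a worked-out design): walk an endorsement graph outward from identities you already trust, and accounts that never connect back to that web of trust stand out as likely sock-puppet clusters. The graph and names below are hypothetical.

                from collections import deque

                def trust_distance(graph, trusted_seeds):
                    """BFS distance from any trusted seed; unreachable accounts get no entry."""
                    dist = {seed: 0 for seed in trusted_seeds}
                    queue = deque(trusted_seeds)
                    while queue:
                        node = queue.popleft()
                        for neighbour in graph.get(node, ()):
                            if neighbour not in dist:
                                dist[neighbour] = dist[node] + 1
                                queue.append(neighbour)
                    return dist

                # "vouches for" edges; sock1/sock2 form a closed clique with no path from real people
                graph = {"alice": ["bob"], "bob": ["carol"], "sock1": ["sock2"], "sock2": ["sock1"]}
                dist = trust_distance(graph, ["alice"])
                print(dist.get("carol"))   # 2: reachable through people you know
                print(dist.get("sock1"))   # None: isolated cluster, treat with suspicion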

                • by AmiMoJo ( 196126 )

                  Stack Exchange uses a reputation system like the one you describe. They try to detect unwanted behaviour, but it's still extremely hostile to new users and vulnerable to dog-piling.

                  The one thing they do have right is that down-votes carry a cost for the voter, but it's too small. At the moment it's only -1, and people are happy to take that hit to push their political agendas or harass users. Plus it's easy to create new accounts with a +100 rep bonus, giving them plenty of ammunition to down-vote people th

                  • by shanen ( 462549 )

                    I think that much of your concern would be addressed with a "maturity filter", even though it is a relatively trivial aspect of the public reputation. A new identity is young, and I'm willing to wait a month or two for it to mature and develop a ripe reputation--at the expense of people who are more tolerant of newbies than I usually want to be.

                    Actually, I would prefer a mixed mode if it were possible. I'd be willing to see top-level comments from newbies, at least most of the time, but I don't want any per
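
                    A minimal sketch of that maturity filter plus the mixed mode, assuming an arbitrary two-month threshold; the names and parameters here are just for illustration.

                    from datetime import datetime, timedelta

                    MIN_ACCOUNT_AGE = timedelta(days=60)   # the "month or two" above, chosen arbitrarily

                    def show_comment(account_created, is_top_level, now=None):
                        """Mature accounts always show; newbies only get top-level comments through."""
                        now = now or datetime.utcnow()
                        mature = (now - account_created) >= MIN_ACCOUNT_AGE
                        return mature or is_top_level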

        • How does Usenet deal with spam? TBH, spam is a more serious problem than trolls...
        • by shanen ( 462549 ) on Wednesday September 13, 2017 @07:36PM (#55192413) Homepage Journal

          One of the things that destroyed usenet was rampaging trolls. The kill-list was a weak response that ultimately availed naught. That is why I advocate for a more proactive reputation-based-filtering solution. You might choose to stuff your eyes and ears with tripe, but I would prefer not to.

          There is a great deal of confusion about "freedom" and "free speech". Your freedom to speak freely should not block my freedom to ignore idiots. Not that I'm calling you an idiot. Yet. However, if I had to make a prediction based on your short comment...

        • Maybe we all just need to go back to USENET

          - and we will have Kibo back! (https://en.wikipedia.org/wiki/James_Parry)

    • Re:Remind me... (Score:5, Insightful)

      by WrongMonkey ( 1027334 ) on Wednesday September 13, 2017 @04:45PM (#55191433)
      Whoever runs the website gets to decide who the trolls are. Don't like their definition? Start your own site.

      You have a right to free speech, but nobody owes you a soapbox.

      • Whoever runs the website gets to decide who the trolls are. Don't like their definition? Start your own site.

        Damn right we'll start our own site! With blackjack, and hookers! In fact, forget the site!

      • by ron_ivi ( 607351 )

        Whoever runs the website gets to decide who the trolls are

        Remember all the reddit trolls like the GNAA who always attempted to get upvoted first posts?

        /. censored those, and no-one seemed to mind.

      • There has to be compromise. I agree with you so long as everyone has the ability to start their own site (assuming only legal content) and not worry about being shut down by a few select overpowered companies.

        Right now, you forfeit your ability to have a site if you say the wrong thing that a few companies may not like, as stormfags showed us. "Nobody owes you a phone line or access to a road" didn't work then, so why would it work now?

    • Re:Remind me... (Score:5, Insightful)

      by Anonymous Coward on Wednesday September 13, 2017 @04:47PM (#55191449)

      It's more like common decency.

      You know how your parents eventually taught you not to shit all over the house? It is essentially the same thing. My cousin works in a day care and has the unfortunate job of doing this kind of training when the (wealthy, in theory well educated) parents fail to do so.

      I suspect this will be much the same but much older children will have to be educated.

      • You know how your parents eventually taught you not to shit all over the house?

        But I still shit all over the house...

    • The company running the site does. As long as it's not the government, that's fine.

    • it's their site.
    • A company operating a site without government funding should be the one to define who the trolls are on their site. Simple as that.

      The First Amendment is immensely important and must be defended, even when doing so means defending abhorrent people, but we need to get over this false sense of entitlement that suggests organizations have no right to interfere with, discourage, or otherwise supervise the use of the platforms they've built. As the creators of those sites, that's their prerogative. The Constitut

    • I'm going to pretend that was a sincere question instead of another bit of first-post drivel.

      Each person should be free to define what to regard as a waste of time. Given that freedom, I would certainly define trolls as worthless wasters of my precious time. Sometimes a troll can be thought-provoking, but it's only accidental, and I'd much prefer to spend my limited time with nice people, which leads to my suggestion:

      Let the trolls flush themselves. Simply by being rude trolls, their negative reputations shou

  • I've had posts marked down as troll because people disagreed with the points raised.

    Trolling is making inflammatory statements for the sake of getting people to respond. Disagreeing with a position does not make it a troll post.
  • by mentil ( 1748130 ) on Wednesday September 13, 2017 @04:57PM (#55191533)

    Seems the trolls came to Slashdot after the ban.

    • Seems the trolls came to Slashdot after the ban.

      Good sir, they prefer to be called Libertarians! ;)

    • That's why we have comment threshold settings.
  • which targeted Black people

    People with the surname Black?

  • by liquid_schwartz ( 530085 ) on Wednesday September 13, 2017 @05:33PM (#55191775)

    First, the researchers automatically extracted words from the banned subreddits to create a dataset that included hate speech and community-specific lingo. The researchers looked at the accounts of users who were active on those subreddits and compared their posting activity from before and after those offensive subreddits were banned. The team was able to monitor upticks or drops in the hate speech across Reddit and if that speech had "migrated" to other subreddits as a result.

    How do they know if the person using the "N" word is black, in which case it's considered OK, or non-black, in which case it's an obvious crime against all humanity? Or that calling someone a fag is OK for Milo but wrong for normal people? Granted, certain sub-forums are likely largely one demographic, but a word-based approach still seems flawed.

    • by AmiMoJo ( 196126 )

      That's why they didn't ban words, they simply banned subreddits dedicated to things like fat-hate and racism. It's hard to imagine any context in which "coontown" isn't simply overt racism.

      • by swb ( 14022 )

        But within a sub, does overt anything mean much? You aren't likely to stumble across niche subs like that by accident. Hell, I still find useful normal subs I didn't know existed, because finding subs by topic isn't all that easy on reddit, especially if the title and description don't contain the right keywords.

        • by AmiMoJo ( 196126 )

          /r/fatpeoplehate had 150,000 subscribers. The study is really interesting - they didn't have to ban any specific users, just those subreddits, and the main fat haters moved to Voat while the rest of the community stopped being such asshats.

          This is a well-understood principle in sociology. A small number of people behaving like asshats gives others "permission" to do the same. It normalizes it.

          • by swb ( 14022 )

            I think you overstate "normalizing" when you describe a subreddit subscription that way, and you're confusing cause and effect of fat hating. I'd wager the majority of those subscribers didn't visit the sub regularly, and none of them had positive or neutral views of fat people -- they didn't become fat haters because they saw the sub.

            If Reddit were composed of some small number of subreddits, I could see where this would be a problem, but the quantity of subs and their isolation makes this seem like mu

            • by AmiMoJo ( 196126 )

              If there is no causal link between the existence of a fat-hating subreddit and poor behaviour, why was banning it so successful? The study says that users who were part of that subreddit and used a great deal of abusive/foul language actually changed their behaviour afterwards, being nicer on other parts of the site.

  • A few months ago, the Denver Post switched their comment section to something called "Civil Comments"

    Our article comments have been a cesspool of trolls and spam for years. Enter Civil Comments. Civil Comments is intended to bring back the civil in online discourse [denverpost.com]

    The idea is every time you post a comment you are required to rate several other comments as either "Civil" or "Not Civil", but if you are "wrong" too many times you might get banned. That is, if you rate a comment as "Civil" when enough other

  • Study finds limiting free speech successfully stifles the spread of ideas... to some degree.
  • ... results in good behaviour.

    This should not be surprising; the majority of people will follow the community leaders and the social standards they espouse.

  • Here is another good point from a 2005 Munich university study - hate speech wasn't a topic back then, only flaming:

    allowing flaming on a respectable website will
    1. drag down the website's standards in all respects
    2. make flaming more widely accepted, especially on that website but also outside it
    3. drive out old customers who object to flaming
    4. bring in new customers who prefer flaming

    All of this is pretty obvious, but you would be surprised how little editorial staffs are aware of it.

    I have seen exactly this o

  • Fat People Hate was a terrible place, without a doubt, but they stayed in their little box and didn't attempt to invade other subreddits. They moved to Voat, which has also become an awful place, but they maintained their standards of behavior -- being dicks to people who don't really deserve it (and a few who do), but not trying to stir up shit in other groups. Since they deliberately steered away from abusing the network, I felt they had every right to continue. Of course, advertisers call t

  • Actually, this may be a stretch, but I believe that the 2015 reddit crackdown on "hate subs" not only didn't have the intended effect, it contributed a LOT to Donald Trump's election. /r/the_donald arose when all of the people who had their communities destroyed had nowhere else to go, and attracted a lot of other people who would never post to /r/coontown or even read it, but were outraged that reddit decided they didn't get to exist any more. I don't agree with any of those subs, but I'm far more disgust
