Twitter, like most social media websites, is no stranger to controversy. The freedom of the internet allows for a certain degree of anonymity, opening the door for people to say what they want without much consequence. In the case of Twitter, this results in a lot of hateful, and often racially charged, language.

Just how racist can Twitter get, though? According to a study by U.K. think tank Demos, the site averages more than 10,000 racist slurs per day. After analyzing 126,975 English-language tweets collected over a nine-day period, the researchers estimated that roughly 1 in 15,000 tweets contains a racial slur.

The news is not as damaging to Twitter's reputation as it might sound, however. Not only is 1 in 15,000 a fairly rare occurrence, but the study concluded that more than half of the racial slurs were not intended to be offensive, serving instead "to express in-group solidarity or non-derogatory description."

For example, by far the most commonly used slur was "white boy," which appeared in 49 percent of the slur-containing tweets. Other words included "whitey," "pikey," "paki," "nigga," "wigga," "spic" and "squinty." Very few of these tweets were made in a deliberately offensive manner.

The authors of the study did stress, however, that hateful language is sometimes difficult to discern merely on the basis of racist terminology. Just as potentially derogatory words could be used in a non-offensive manner, a person can express racist and hateful intent without actually using racist slurs.

That said, the study may not be all that effective at actually measuring racism online. As the authors admitted, context is key, and the study did little to account for it. Maybe next time Demos should set its sights on a different site: the comments section on YouTube seems to be a popular outlet for racist language.

The study can be found here.