A research team at the University of Colorado Boulder has developed a new technique for detecting cyberbullying on social media platforms.


Spotting and addressing cyberbullying is critical, especially given a recent study which found that victims of cyberbullying under the age of 25 were more than twice as likely to self-harm or attempt suicide.


"The response of the social media networks to fake news has recently started to uptick, even though it took grave consequences to reach that point," study co-author Richard Han, an associate professor in the Department of Computer Science, told Science Daily.


"The response needs to be just as strong for cyberbullying."



The new technique combines several computational tools to scan massive quantities of social media data and identify cyberbullying. Parents and network administrators are then alerted when it occurs.


The approach is highly efficient, using five times fewer computing resources than current tools.


The researchers also released a free Android app called BullyAlert that lets parents know when their children are victims of abuse on Instagram.


"As a parent, I know that a lot of times we are not in full knowledge of what our children are doing on their social networks," study co-author Shivakant Mishra observed.


"An app like this that informs us when something problematic is happening is invaluable."



The team calculated that their tool could detect cyberbullying on Instagram and the now-defunct Vine (included because its data was publicly available) with 70 percent accuracy.


In research presented at a conference in April, they also stated that their technique analyzes the data in real time and can send warnings about cyberbullying within two hours of the abuse occurring, a speed unmatched by available software.


Their system makes a quick scan of comments when a user uploads a new post, and posts with questionable comments are then prioritized for further checks. Posts with comments that seem benign are pushed to the bottom of the queue.


"Our goal is to focus on the most vulnerable sessions. We still continue to monitor all of the sessions, but we monitor more frequently those sessions that we think are more problematic," Han explained.
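The prioritization described above can be illustrated with a short sketch. This is not the team's actual system; the keyword-based risk score, the recheck intervals, and all names here are illustrative assumptions, meant only to show how a priority queue can make high-risk posts come up for rechecking more often than benign ones.

```python
import heapq

# Hypothetical word list standing in for a real abuse classifier.
FLAG_WORDS = {"hate", "loser", "ugly"}

def risk_score(comments):
    """Quick first-pass scan: fraction of comments containing a flagged word."""
    if not comments:
        return 0.0
    flagged = sum(1 for c in comments if FLAG_WORDS & set(c.lower().split()))
    return flagged / len(comments)

class MonitorQueue:
    """Priority queue that schedules rechecks sooner for higher-risk posts."""

    def __init__(self, base_interval=120):
        # Minutes between rechecks for a completely benign post (assumed value).
        self.base_interval = base_interval
        self._heap = []

    def add_post(self, post_id, comments, now=0):
        score = risk_score(comments)
        # Higher risk -> shorter interval, so the post floats to the front;
        # benign posts sink to the bottom of the queue.
        interval = self.base_interval * (1.0 - score) + 1
        heapq.heappush(self._heap, (now + interval, post_id, score))

    def next_check(self):
        """Pop the (due_time, post_id, score) of the post due soonest."""
        return heapq.heappop(self._heap)

queue = MonitorQueue()
queue.add_post("benign_post", ["nice photo", "love this"], now=0)
queue.add_post("risky_post", ["you are a loser", "so ugly", "great pic"], now=0)
due_time, post_id, score = queue.next_check()
# The post with flagged comments is scheduled for rechecking first.
```

Keeping all posts in the queue while varying only the recheck interval matches Han's description: every session is still monitored, but the more problematic ones are checked more often.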