Malaysian police are investigating the case of a teenager believed to have jumped to her death after asking her social media followers to vote on whether she should kill herself.
The 16-year-old girl, who was not named, had run a poll on the photo-sharing app Instagram with the question "Really Important, Help Me Choose D/L" hours before jumping off the roof of a building in Sarawak, in eastern Malaysia, on Monday, district police chief Aidil Bolhassan told Reuters.
The "D/L" stood for "Death/Life", and the poll showed 69% of the girl's followers chose "D", he said.
"We are conducting a post-mortem to determine whether there were other factors in her death," he said, adding that the girl had a history of depression.
Instagram reviewed the teenager's account and found that the online poll, which ran over a 24-hour period, ended with 88% of votes for "L", the company's Malaysia spokeswoman, Serena Siew, told Reuters.
Aidil, however, said that the poll's numbers may have changed after news of the girl's death spread.
The case sparked concern among Malaysian lawmakers, who called for a wider probe.
Ramkarpal Singh, a lawyer and member of parliament, said that those who voted for the teenager to die could be guilty of abetting suicide.
"Would the girl still be alive today if the majority of netizens on her Instagram account had discouraged her from taking her own life?" he said in a statement.
"Would she have heeded the advice of netizens to seek professional help had they done so?"
Youth and Sports Minister Syed Saddiq Syed Abdul Rahman also called for a probe, saying that rising suicide rates and mental health issues among young people needed to be taken seriously.
Under Malaysian law, anyone convicted of abetting the suicide of a minor could face the death penalty or up to 20 years' jail and a fine.
Instagram extended its sympathies to the teenager's family, and said the company had a responsibility to make its users feel safe and supported.
"As part of our own efforts, we urge everyone to use our reporting tools and to contact emergency services if they see any behaviour that puts people's safety at risk," Ching Yee Wong, Instagram's head of communications in the Asia-Pacific, said in a statement.
In February, Instagram banned graphic images and content related to self-harm from its platform, citing a need to keep vulnerable users safe.
The changes came following pressure from the parents of a British teenager who died in 2017; they believed that viewing Instagram accounts related to self-harm and depression had contributed to their daughter's suicide.
REUTERS