Content warning: Suicide, self harm
It all came out just before Christmas last year. Stacey’s* 11-year-old son, Noah*, had not been himself for a few weeks and had been asking rather unusual, specific questions: What does ‘gay’ mean? What is suicide? Why do people commit suicide? He told her about a girl at his school who cut herself.
“One day he just laid down on my bed and started crying and eventually he told me that he was really worried about something and he couldn’t really tell me,” says Stacey.
Unsure of what to make of it all, Stacey initially put it down to anxiety about starting a new school in the coming year. However, a few days later she took an opportunity to check Noah’s phone.
“That’s when I found out pretty much everything and I was, and still am, shocked. It probably sounds really negligent, but I didn’t really know how to actually use Instagram, I don’t use it myself.”
Noah was being cyber-bullied. Stacey decided she needed to have a conversation with him, even though she didn’t know exactly what to say.
“I didn’t want to freak out because that would freak him out, so I got him to come up and I just said, ‘Honey, I think we need to have a talk about your phone’,” she says.
“That’s how I started it; I was just really gentle. He was really upset and I said, ‘I think you need to show me your arm’. He didn’t want to, but eventually he did. I only tackled it a tiny bit at a time because it was very overwhelming for him, and for me. Eventually, after about a week, I think, I took him to the GP.”
After deleting his social media accounts, Stacey let Noah have his phone to listen to music while on a family holiday, during which he accidentally took it swimming with him.
“He only had it for 26 days. My first thought was that [the loss of the phone] was a blessing in disguise because it has been nothing but trouble. He doesn’t have a phone for very organic reasons but he’s not punished, so that solved a lot.”
Although it was not an easy thing to do, Stacey also contacted the parents of the other child involved in the cyber-bullying.
“It was really awkward, but my viewpoint was that if this was my child I’d want to know that this was going down,” she says.
“I just tried to be really kind and say, ‘Look, there’s not really any easy way to tell you this and I’m sure it’s going to be really distressing, but I think you need to know what’s been going on’. They were very apologetic and the child wrote a letter of apology.”
One of the issues around social media and children is that parents often blame each other, says Stacey.
“We need to stop blaming each other; we have to all get on the same side about this. It’s really hard and they’re only children and we have to all look out for them. You think your child won’t do it; I thought my child wouldn’t do anything like that either, but they do, so we have to drop the judgement.”
Stacey also wrote a letter to the principal of Noah’s school and contacted police about some of the messages as they included allegations of abuse.
“I felt that [the principal] should know there was a culture, that this was a thing going on. She wasn’t really that interested; she wrote back, but she was just like, ‘Where do I begin, confidentiality …’ and she just said, ‘All the best at a new school’,” she says.
“I did show the police the messages and they did take screenshots of some of the stuff … they did say that they would follow up, so I felt like that was all I could do.
“You just hope that the children will get what they need.”
Five months on, Noah is doing well at his new school. The cuts on his arms are healing and he no longer has a phone or any active social media accounts. Stacey has created an alert for herself in case he reactivates his social media accounts.
“I guess I’ve tried to shelter him, let him get on with starting at the new school and make that his focus. It’s given him distance from those children who are involved; it’s sort of a fresh start, I suppose. That’s been really good. For children, time can move really quickly so for him two months ago probably seems like six months ago.”
Stacey believes social media organisations should be held accountable for allowing children access to their platforms and unsuitable content. While policies state that images of self-harm and suicide will be banned, the sites are still rife with such content, she says.
“These children are following self-harm accounts, advice accounts, suicide accounts and how-to-suicide accounts,” says Stacey.
“It shouldn’t be that easy, they can just put in a false age or whatever. I feel that [social media organisations] must be aware of this, in fact, I know that they are. I just feel it’s very cynical and they don’t care, why would they care when they’re really popular?”
Teenage influencers’ advice
At a joint launch with Instagram of ‘A Parent’s Guide to Instagram’ earlier this year, NetSafe CEO Martin Cocker said parents want practical advice.
“They want to know how to enable their children to use the technology but, at the same time, keep them safe. The idea that parents aren’t concerned about children’s safety or that they are blasé, that’s crazy; we all know that every parent worries about their child and they worry about those things.”
At the same event, a panel of young people discussed their thoughts on social media. Each member of the panel had multiple social media accounts and outlined a variety of uses for these, including spam, backup, jokes, interest-specific, personal, and personal/deep thoughts. Most members of the youth panel agreed that it was normal to differentiate between one’s personal and public personas online.
The group also discussed how they handled unwanted negative communication on the platform.
Seventeen-year-old Dominic Taylor is New Zealand’s most-followed teenager on Instagram.
“For my comment controls,” he says, “I’ve thought of every swear word possible and put in a filter. I’m aware that a lot of my audience is very young so I’ve got to be very careful to make sure that they’re not being exposed to how toxic some of the fandoms can be. I’ve probably blocked upwards of 300 people because of death threats in comments.
“Some of these people are adults and parents, which is quite sad, because I’ve seen some of their accounts and they’ve got pictures holding their babies and I’m like, ‘You’re commenting this on a teenager’s account’. It’s really scary because I know eight-year-olds with social media, which is ridiculous, and they’re being exposed to a billion people.”
Taylor’s parents and grandparents joined social media to keep up with his posts, which he says is a good conversation starter.
“My dad will bring up anything I post online so when I post something that’s slightly dodgy or anything he’ll bring it up or he’ll text me and be like, ‘take that down’. It’s been a great way to connect with my parents, so I think it’s important that parents do join social media. They both have private pages and they both have like 10 followers, but they’re making the effort and that’s what’s important.”
Twenty-one-year-old panel member Maddie Grant is a member of Sticks ‘n Stones, a youth-led bullying prevention organisation. In most cases, she tries not to take online negativity too seriously.
“If it’s really nasty, you just block it. I’m never afraid to use the block button, I don’t mind at all. I’ve probably blocked over a hundred people,” she says.
“If there’s anything to take away it’s that lack of understanding leads to fear in parents; this fear leads to high emotions, and high emotions lead to overprotection and helicopter parenting, which ultimately makes a teen less likely to come forward if they do need help because they don’t want an ‘I told you so’ reaction.”
The panel agreed that while further controls or settings would not eliminate cyber bullying or negative content online, it was important for searches relating to suicide and self-harm to be flagged.
Seventeen-year-old Emma Kwan says parents should not ban young people who have viewed such content from social media, as this is “the worst thing you could do”.
“You’re punishing them for being vocal about their troubles and ultimately that means they are so much less likely to come forward if they are struggling again in that way. I think what would be good is phrases like ‘I want to die’; if that phrase triggers some kind of warning to Instagram or other organisations who can then check that out, maybe message ‘Are you okay?’.”
Instagram Head of Public Policy Mia Garlick says the platform provides a service that allows people to get closer to others and the things that they love.
“Obviously we want to make sure that people are doing that in a positive and constructive way and also that parents are feeling very empowered about the process as well.”
Instagram has tens of thousands of people working around the world and is increasingly using automation to find and remove inappropriate content, she says.
“I’ve had anecdotal feedback from different school groups that we work with that they’ve reported something on Instagram and it’s been gone within 41 minutes.
“There are ways in which you can filter out particular words or emojis, so if there’s a particular issue that’s sensitive for you at this point in time, you can add that in so it can get filtered out. Again, it’s just about creating that conversation that works for you.”
The platform also has a team working on wellbeing research. Garlick urges anyone who sees any inappropriate or distressing content to report this to the platform.
“We have teams that are staffed 24/7 to try to address those reports,” she says.
“We are getting better at being able to detect that kind of content and sending people support messages, but if you ever do see something like that and you can’t get there, if you let Instagram know we will send them a message saying ‘Hey, someone’s worried about you, here are some tips and here are some ways to get help right now’.
“If it looks incredibly serious … we will actually reach out to New Zealand law enforcement to get them to go to that person’s house and we have done that on several occasions.”
Last year Kowhai Intermediate School encouraged its community to create a social-media-free environment, both in and out of the school.
Principal Louise Broad says the ICT agreement and guidelines sent to parents at the beginning of the year included an explanation of the rationale and a checkbox where parents could signal their agreement to support the initiative by not allowing their child access to social media.
“That was based on our experience in the school of being previously the one-stop-shop for trying to sort out what had been happening in the bedrooms of Mt Albert at 10 o’clock at night when children were on social media, which really isn’t our core business.”
This was part of a wider initiative around digital technology within the school, which also has an ‘off-at-the-gate’ policy, meaning students are not permitted to use their devices on school property.
“Phones are put away in a valuables box and locked away in an office. Our children play at lunchtime; they’re out there socially interacting, they’re kicking balls around and the playground is an amazing place to be. We don’t have pockets of children sitting in corners playing games on phones.”
The underlying aim is to help children relate to others in a positive way, resolve issues face-to-face and to be effective communicators with each other, says Broad.
“I had a student who really sticks in my mind … he had said the most appalling things on social media to another child and he sat at my desk and said, ‘But I would never have said those things in real life, Mrs Broad’.
“That just really reinforced in my mind that there’s a disconnect between the reality of life and being on a social media site, so you are a different person, you can say and do a whole lot of different things,” she says.
“I don’t think that social media is a place where you learn pro-social behaviours.”
The role of schools
However, Broad says schools do have a role in teaching students (and parents) about online safety, and her school has a digital technology curriculum in place to address this.
“At the end of it all, it is part of educating children. It’s the world they live in and it’s real,” she says.
“It’s fronting up with these discussions in class before they launch into social media rather than doing the trial and error learning on the job when you’re using it.”
Parents have responded positively to the initiative, she says. While the school initially expected some kickback from parents, they were pleased to find most were willing to see the line of responsibility moved.
“It gave parents a bit of a wake-up call as to what was going on. I think the idea that your child is quiet and away in a bedroom, it’s easy to think everyone’s happy. Knowing exactly what they were doing, and what the impact of that was, got the parents onside. They’re really empowered, I believe; lots of comments have been that it’s really good and a positive move.”
ICT Director and Deputy Principal Tom Mackintosh says the initiative is not a ban.
“It was an invitation to create a social media-free environment or community outside the school,” he says.
“The tide had been rising with social media and there was no one even saying, ‘At least consider keeping your kids off social media’.”
Within two weeks of the agreement being sent out, 84 percent of parents agreed they would not allow their child access to social platforms.
“There are plenty of things in our society that are age restricted that you can’t do… we don’t allow children to start drinking alcohol at 13 because they might be drinking when they reach the legal age,” he says.
“The main thing is making sure they have those core values of empathy and respect for each other and I think those are the important things. That’s our core business, is growing good people and actually until they get to probably even older than 13, they don’t really have the understanding and maturity to be able to deal with the intricacies of social media.”
Student feedback at the school has indicated they are experiencing less anxiety at school and home about their online interactions.
“We also saw a reduction in the number of social media incidents that were occurring between our children. As educators, it opened the pathway for the discussion to move the line of responsibility back towards parents and whānau,” he says.
“We absolutely bear and are willing to take on the responsibility of education. I guess what we can’t take on is being accountable for our students’ behaviour on social media outside of school and that seemed to be where things were heading.
“Even if there is an issue with social media, we still ask our parents to let us know… it can be hard to repair those relationships but that’s our job, to be alongside them to help them repair those relationships face-to-face so that they can feel happy at school and remove that as a barrier to their learning and progress.”
While the approach has worked for their school, Mackintosh acknowledges that each school and community is different.
“There are other schools that embrace social media and again it’s up to each school to think carefully about the issues that they’re involved in, what the school is able and willing to do and how they see their role fitting in, but certainly I’d encourage any school to at least consider it and think, ‘This is one way of going, how might this work in our school?’
“They might turn around and say, ‘Look it’s not for us,’ but I think the exercise of going through and seeing what might be the benefits and the pitfalls for any school would be really valuable.”
Younger teenagers most at risk
According to research by NetSafe, Kiwi teenagers are twice as likely as adults to be negatively affected by harmful communications online.
Younger teenagers, aged 14 and 15, were at greater risk of harm and were also more likely to say that an unwanted online communication had made it difficult for them to take part in usual daily activities, such as going to school or studying, participating online as usual or eating or sleeping properly.
Girls were at a higher risk of being harmed by online communications than boys and teenagers with disabilities were more negatively impacted by harmful experiences online than those without impairments.
Teenagers’ experiences of harm online also differed by ethnicity, with Māori and Pacific teenagers more likely to have received unwanted digital communications in the past 12 months.
*Names have been changed to protect privacy
WHERE TO GET HELP:
If you are worried about your or someone else’s mental health, the best place to get help is your GP or local mental health provider. However, if you or someone else is in danger or endangering others, call 111.
If you need to talk to someone, the following free helplines operate 24/7:
DEPRESSION HELPLINE: 0800 111 757
LIFELINE: 0800 543 354
NEED TO TALK? Call or text 1737
SAMARITANS: 0800 726 666
YOUTHLINE: 0800 376 633 or text 234
There are lots of places to get support.