‘They are monsters’: Experts warn of online network grooming Canadian children


VANCOUVER — Experts are warning that a violent online extremist network targeting vulnerable Canadian children and teenagers is using grooming tactics to coerce victims into self-harm, sexual exploitation, and violence. However, the motive isn’t necessarily money.

“They are monsters,” said Darren Laur, the founder of The White Hatter, a digital literacy education company. “And their goal here is to inflict as much emotional, psychological and physical harm on their targets as possible.”

The network, known as 764, has been designated a terrorist entity in Canada. Specialists who monitor online extremism say the group preys on young people seeking friendship, validation or a sense of belonging, gradually manipulating them into dangerous and degrading acts.

“We are starting to see these idiosyncratic ideological themes creeping into these exploits,” said Valarie Findlay, an expert in threat analysis. “We’re starting to see a little bit of white supremacy. We’re starting to see gender bias, racial bias, a number of different ideological or worldview themes.”

The grooming often begins on widely used platforms such as Roblox or TikTok before conversations are moved to private messaging apps like Discord or Telegram, where abuse can escalate away from parental oversight.

Victims can be coerced into self-harm, sending sexual images, hurting or killing animals, and other violent acts. “The depth of cruelty that we’re seeing, we really haven’t seen anything like it since probably the 1970s, with some of the apocalyptic groups, occult groups, Satanic panic, that type of thing,” said Findlay.

Parents urged to watch for warning signs

Findlay says there are no easy answers for parents, who have a lot to deal with in the digital age. “It’s unfortunately really left to the parents to be really dialed in with the patterns of their child’s behaviours and what’s changing and not changing,” she said. “If you can have really great open communication with your kid, I think that’s the best defence.”

The federal government is looking at new restrictions aimed at protecting children online, including whether to restrict youth access to social media. One grassroots group of parents says that can’t happen soon enough.

“It’s clear that it’s unsafe,” said Robin Sherk, who advocates on behalf of Unplugged Canada. “It’s clear that parents can’t do this alone. You don’t know every platform that’s out there, every device your kid can touch.”

Jason Sokolowski discusses the loss of his daughter. (CTV News)

One Vancouver family’s devastating loss

For some families, the consequences of that exploitation have been devastating.

Jason Sokolowski, a Vancouver father, says his teenage daughter was groomed by someone identifying himself as “Culprit,” whom Sokolowski later learned was connected to the 764 network.

After his daughter’s first suicide attempt, Sokolowski says he found evidence of online coercion on her laptop while she was in hospital. “There was a bucket list of self-harm… perform these acts, and you get status within the group. You become a celebrity within the group,” he said.


His daughter later took her own life. Sokolowski considers her death a homicide, and he says families need stronger safeguards to prevent children from being targeted.

“We don’t let kids drive cars, use guns, drink alcohol,” he said. “Social media is turning out to be just as dangerous to children as any of those three.”

Sokolowski says he plans to be in Ottawa on Monday, April 27, along with other parents and children from across the country, to push the Liberal government to better protect Canadian kids.