Online Safety Laws Lagging by a Decade


The laws that are supposed to be keeping Kiwis safe online are lagging by a decade. Internal Affairs’ director of digital safety talks about what needs to change.

When the Christchurch terrorist posted a livestream of his horrific attacks at two mosques in March 2019, those in charge of keeping the online space safe were caught off-guard.

There were no emergency protocols, or specific regulations, designed to deal with this type of digital terror.

Despite this, the Department of Internal Affairs responded quickly and efficiently to remove the content, and then to keep it off the internet.




Department of Internal Affairs director of digital safety Jared Mullen said the “surprisingly good” response to the online element of the March 15 attacks came down to the “Kiwi can-do attitude”.

The response relied on the good judgment of people working in this space, and a personal and professional commitment to making things happen.

But with that came the realisation that New Zealand lacked the preparation and tools that could have made life easier at the time.

In DIA’s briefing to incoming minister Jan Tinetti, the department said its operational focus in recent years had largely been to help prevent child sexual exploitation and abuse online. 

“However, the uploading and sharing of content from the Christchurch terrorist attack on 15 March 2019 presented a unique regulatory challenge for us, our partner agencies, internet service providers and social media platform providers,” the briefing said.

Over the past 18 months – in response to this emerging threat – the Department has led an extended work programme to counter online harm, which has had a strong focus on building the capacity and expertise needed to detect and prevent violent extremist content.

Internal Affairs has been included in the work on the high-profile Christchurch Call, as well as working with other government departments, civil society, and private businesses (including the tech giants) to create a set of emergency protocols, which Mullen said were now world-leading.

Those new protocols allowed New Zealand to protect Kiwis against the online element of copycat attacks like the Poway Synagogue shooting.

“A lot’s happened, and now we’re much better prepared and organised, and we have some of those operational tools in place, and relationships,” Mullen said.


While New Zealand has come a long way since March 15, the laws that are supposed to protect Kiwis online are not fit-for-purpose and the pace of change is lagging.

The fundamental legislation governing this area was at least 10 years out of date, Mullen said.

“If you think of the step-change that’s happened in that time with broadband, social media and peer-to-peer networks, it just hasn’t kept pace…

“The big challenge for New Zealand and others is how to ensure their citizens have the same protections and rights online as they do in the physical world.”

Now Mullen is calling on the Government to ramp up its efforts to make sure the laws protecting New Zealand from online extremist content remain relevant.

“I’d like to see some kind of rolling effort on Government’s part to keep up with the backlog,” he said.

“Government passes two bills a year to keep the tax system up-to-date; it’d be really nice to see them commit to at least the same level of change, and ongoing change, in the online space, which actually moves far faster.”

Last year the Labour-led Government took a step in the right direction with the inclusion of video-on-demand streaming services, such as Netflix, Disney+, Apple TV and Neon, in the regulatory regime.

The changes mean age ratings and warnings apply to content hosted on streaming services, just as they do to other movies and videos.

At the time the laws were changed, about a third of New Zealanders were regularly using commercial video on-demand and streaming platforms to access entertainment, but there was no regulatory framework to govern this type of media content.

To fill this gap, the Government introduced the Films, Videos, and Publications Classification (Commercial Video on-Demand) Amendment Act, and updated the Films, Videos, and Publications Classification Regulations 1994.

While there had always been flexibility within this legislation to make these types of changes, to account for innovation and new media, this marked the first change to the law in 10 years.


The Government has also introduced a bill that would allow it to issue takedown notices and create internet filters, with a specific focus on combatting terrorism and violent extremism.

Newsroom first reported a year ago that the Government was moving ahead with the suite of reforms, but its progress was delayed. The proposed legislation wasn’t introduced until May 2020, and it is still awaiting its first reading.

The proposed bill also saw civil society and the tech industry butt heads over the provision that would allow the government to filter out what the Censor deemed to be objectionable extremist content.

“The internet brings many benefits to society but can also be used as a weapon to spread harmful and illegal content and that is what this legislation targets,” former internal affairs minister Tracey Martin said when announcing the proposed legislation.

“Our laws need to reflect the digital age and the Government has worked with industry partners to create this bill, which will ensure law enforcement and industry partners can rapidly prevent and fight harm from illegal online content,” she said.

“This bill is part of a wider government programme to address violent extremism. This is about protecting New Zealanders from harmful content they can be exposed to on their everyday social media feeds.”

Martin also acknowledged the impromptu nature of the digital response to Christchurch. 

Like Mullen, she said the efforts to remove videos of the attack, and the terrorist’s manifesto, were effective, but the experience highlighted the “inefficiencies and ambiguities” in the country’s censorship system for responding to objectionable online content.

In the spirit of future-proofing, when Internal Affairs recently put its internet filter contract out to tender – the filter used to help prevent people accessing child sexual exploitation and abuse online – it required that the new filter also be able to block violent extremist content.

While laws allowing this type of filtering to take place were not yet passed, the Department wanted to make sure it had the capability if, or when, the filtering system was introduced.

Mullen said these recent efforts to change the law were encouraging, but legislative change continued to be one of the biggest challenges in the effort to counter violent extremism online.

Even with everything, and everyone, working efficiently, the process typically took between 12 and 24 months.

Tougher issues, or a stickier political environment, stretched that timeline further – as was the case with this most recent piece of legislation.

“It means our law in New Zealand, and with our partners overseas, is always lagging the innovation that we see online,” Mullen said.


But the former deputy chief censor had some tips and tricks for drafting good law in an internet age. It needed a flexible approach, he said.

The best law clearly set out its objectives and principles, so everybody could understand the purpose. It needed to embed the key rights people had in relation to that law, such as appeal rights or rights of review. But it also needed an appropriate amount of flexibility so the law could endure. 

That could be achieved by allowing some of the operational detail to be set out in regulation, which was still subject to scrutiny but was much easier – and faster – to change than a primary act.

With discussion around online protections and censorship inevitably comes pushback from those concerned about their rights and freedoms being curtailed. This has been a constant source of tension in the wake of the March 15 terror attack.

But there was no suggestion of new levels of censorship through internet regulation, Mullen said.

The Government was committed to the continuation of a free and open internet, while also ensuring New Zealanders’ existing rights, including the right to be protected and feel safe, were upheld online.

For example, it was illegal to print and distribute a magazine that promoted terrorist activity. So, it should be no surprise that it was also illegal to do that online.

It wasn’t a question of additional censorship or new requirements; it was about creating consistent levels of protection across the whole of Kiwis’ lives.

“The reality is that the law, as it applies online, has not kept pace, and it doesn’t actually provide New Zealanders with the same rights and protections that they have in the physical world,” Mullen said.

* This article is the second in a two-part series about DIA’s work in detecting, preventing and countering online extremism. You can read the first article about the department’s online content monitoring here.
