So the Canadian government will soon be introducing legislation aimed at regulating platforms with respect to harmful speech, spearheaded by Heritage Minister Steven Guilbeault. I'm concerned b/c we haven't seen a lot of serious thought from the Gov on this. A thread:

Guilbeault has taken this on as part of a large digital regulatory agenda that also includes things like requiring Canadian content on streaming services, and potentially forcing intermediaries to pay for news (like Australia). https://t.co/oVHCYC8I4J
We don't know what form the legislation will take, but there's cause for concern that parts of it will look a lot like the German NetzDG law, and require hate speech and other illegal speech to be taken down quickly. It will also include a new regulator to do...something
The concern comes from the PM's mandate letter to the Minister, which asked him to "create new regulations for social media platforms, starting with a requirement that all platforms remove illegal content, including hate speech, within 24 hours or face significant penalties."
I haven't seen anything to suggest Guilbeault has moved away from this. His office was quoted recently as saying "our approach will require online platforms to monitor and eliminate illegal content that appears on their platforms." https://t.co/DF4VRlqVhn
Notably, a few weeks ago, a report was released by the Canadian Commission on Democratic Expression, a Public Policy Forum initiative funded by the Canadian Gov. The Report specifically called out NetzDG as a bad idea. No idea if that'll be listened to https://t.co/QZSTVWWTwA
I'm not convinced that something like NetzDG could survive Charter 2(b) scrutiny, given how limited the scope of hate speech laws is in Canada and the danger of incentivizing over-removal. See what happened in France: https://t.co/1wOw2RuiQU
With respect to the regulator, we know that's coming but not what it will actually do. The Canadian Commission on Democratic Expression called for a Duty to Act Responsibly enforced by a regulator. The Report fails to provide any detail about what this duty would be, however.
Possible that the new regulations could move in this direction, requiring the regulator to establish some sort of Code of Conduct (following the UK Online Harms White Paper approach). The Report also called for e-tribunals to be set up to adjudicate moderation decisions.
I've seen this e-court recommendation before in @HeidiTworek's work, but lots of questions. What are the bases of the decisions? Platform policies? Code of conduct? Law? IHRL? How much harmonization do we want? Apply to volunteer moderation? What enforcement actions are in-scope?
In the report, @JameelJaffer's "Concurring Statement" (though he does not exactly concur) is right on. Recommending a Duty to Act Responsibly without specifying its content gets us nowhere, and it's not clear that public e-tribunals are better than regulated company processes
The Report did have some good parts, including a lot of smart proposals around transparency and requirements for impact assessments. If the legislation goes in this direction, I'll be quite happy. I am generally in favour of mandated transparency and due process requirements.
But it's not clear what impact that Report will have on the proposed regulations, if any. My guess is we'll see a NetzDG style fast takedown requirement paired with a regulator mandated to figure out its mandate. I hope to be pleasantly surprised.