In a 5-4 decision at the end of October, the Supreme Court of Canada struck down the one-year mandatory minimum sentence for possessing and accessing child pornography. The case, Quebec (Attorney General) v. Senneville, involved two men accused of possessing child pornography. The accused argued that a mandatory minimum penalty of one year was “cruel and unusual treatment or punishment.”
The Court did not find that the accused men would face cruel and unusual treatment or punishment for their particular crimes if the mandatory minimum applied to them. The offenders were found to be in possession of hundreds of horrific pornographic files.
Rather, the majority relied on a hypothetical scenario in which an 18-year-old receives and retains a “sext” depicting a 17-year-old. Technically, the 18-year-old could be found guilty of a child pornography offence. The majority of the Court found that, based on this hypothetical example, a one-year sentence could be cruel and unusual punishment.
Mandatory Minimum Sentences
Canada’s courts have struck down dozens of mandatory minimums as unconstitutional. Parliament has also repealed some in the past decade, including 14 just a few years ago. In some cases, repealing a mandatory minimum is a reasonable approach, as prison is costly and often serves as a training ground for future criminal activity and associations. But mandatory minimums ensure that certain crimes are always punished with prison time. As such, they should be reserved for violent crimes that pose an unmistakable danger to the community, allowing judges to use restorative sanctions such as restitution and community service for non-violent offenders.
Mandatory minimum sentences indicate the severity of a particular crime, in Parliament’s view. When the prohibition on child pornography (now referred to as child sexual abuse and exploitation material) was added to the Criminal Code, there was no attached mandatory minimum sentence. However, in the years following, Parliament created more severe sentences. In 2015, Parliament increased the minimum penalty to one year so that anyone convicted under this section of the Code would have to spend at least one year in prison for possessing child pornography.
The Dissent in Senneville
The dissenting Supreme Court justices expressed the importance of mandatory minimum sentences for child pornography. “Child pornography has unquestionably become a scourge both nationally and internationally. It destroys countless innocent lives. Each pornographic photograph, video or audio recording that involves a child is an act of exploitation that will leave the child with deep and lasting scars… Through the imposition of more severe sentences, the justice system expresses society’s deep and rightful indignation.”
In response to the Supreme Court’s decision, prominent politicians such as Ontario Premier Doug Ford, Manitoba Premier Wab Kinew, Alberta Premier Danielle Smith, and Conservative leader Pierre Poilievre have condemned the Court’s decision and called on the federal government to use the notwithstanding clause to maintain mandatory minimums.
Bill S-240
Not long after the Supreme Court released its ruling, Senator Leo Housakos introduced Bill S-240 to do just that – maintain mandatory minimum sentences for possessing and accessing child sexual abuse and exploitation material. The bill would invoke section 33 of the Charter, which allows laws that the courts have struck down to continue to stand. The notwithstanding clause would be in effect for five years, at which point it would lapse or need to be renewed by a vote in Parliament.
Conclusion
The government bears the sword to punish wrongdoing, and crime in Canada must be appropriately punished. At the very least, courts should not strike down laws based on hypothetical scenarios. If an actual case arises where the court believes that applying a mandatory minimum would be grossly disproportionate, the court could find a constitutional remedy that fits such a case.
Parliament should respond to this ruling both to publicly condemn and to severely punish the possession and use of child pornography by passing Bill S-240.
Status: Passed 1st Reading in the Senate
Description: Would use the notwithstanding clause to declare that mandatory minimum sentences for possessing or accessing child pornography continue to operate notwithstanding the Supreme Court’s ruling that these penalties are unconstitutional.
Analysis: Mandatory minimum sentences indicate the severity of a crime and are generally reserved for the most egregious crimes. Bill S-240 would maintain minimum sentences for possessing or accessing child pornography and would continue to highlight the severity of such crimes. The scourge of child pornography justifies a mandatory minimum sentence for offenders.
Kids in Canada can easily access internet pornography – often without even meaning to.
Bill S-209, currently being considered in Canada’s Senate, would mandate meaningful age-verification mechanisms for access to pornographic sites. Other jurisdictions have already enacted such laws, which are proving effective.
The first US state to pass an age-verification law was Louisiana, followed by nearly two dozen others. Among those was Texas, which just won a legal battle at the US Supreme Court to uphold its law. The Court, in Free Speech Coalition v. Paxton, found that Texas’ age-verification law does not violate free speech.
Legal Challenge
The Free Speech Coalition challenged Texas’ law, arguing that it violates free speech. Generally, pornography is considered protected speech provided it is not obscene, which is a high bar. The first court to review the Texas law struck it down, reasoning that Texas should have pursued a “less restrictive means” to prevent children from viewing porn, such as encouraging parents to install content filters. Texas appealed, and the higher court upheld the law as a valid means of pursuing the state’s interest in protecting children.
In a 6-3 decision, the US Supreme Court also ruled in Texas’ favour, noting that the purpose of the law (to prevent children from viewing porn) is pressing, and that it does not burden the speech of adults more than necessary.
Although US courts are cautious about violating free speech, there are types of speech that are not protected by the constitution, including obscenity, defamation, and fraud. But because a lot of pornography would not be considered obscenity, it would generally be considered protected speech. However, pornography that is not considered obscene per se may also be obscene for minors, the Court reasoned. The Court also noted that minors can now access both benign and extreme content with an ease that would have been unimaginable when previous court decisions addressed the question of protecting minors from porn. In 2024, 95% of American teens had access to a smartphone, and 93% used smartphones several times per day. Justice Thomas emphasized the sheer quantity and ease of access to internet porn, noting that in 2019, Pornhub published 1.36 million hours (150 years) of content.
Ultimately, the Supreme Court clearly stated that adults have no constitutional right to avoid age-verification, even if adults have a constitutional right to view pornography. Texas’ law is not a ban on pornographic content, but simply a requirement to ensure all viewers are over 18. Age-verification has been used for many years for in-person age-restricted activities, including registering a handgun, voting, getting married, or purchasing a pornographic magazine. These requirements have never been disputed. Age-verification is also already used for online gambling, alcohol and tobacco sales, and renting a car. It is also used by some porn websites, by choice. The specific methods of age-verification required by the Texas law are government-issued ID and transactional data, and both were deemed valid methods by the Court.
Privacy concerns are also commonly raised by opponents of age-verification. They worry about the stigma of pornography use and the potential for data breaches and loss of privacy. The Court addressed these concerns only briefly, but noted that porn companies and third-party age-verification companies have every incentive to protect privacy. After all, they are trying to promote their companies, and users can avoid their sites if they are not protecting privacy. Additionally, the Court noted that social stigma is no reason to exempt porn companies from valid regulation.
Three Takeaways
The United States is possibly the most ardent defender of free speech in the world. Yet its highest court also recognizes that laws protecting children are critical even if they hinder adults’ freedom to some degree. If Canada passes Bill S-209 or other age-verification requirements, our law will likely be challenged as well, for similar reasons. The US Supreme Court’s Paxton ruling explains why such constitutional challenges should fail.
We must protect children from porn and hold companies accountable for who they allow on their websites. As for privacy concerns, adults can choose whether to upload whatever information is necessary to verify their age, and decide which sites to trust. Canadians already use identity-verification methods online for shopping, banking, tax filing, and much more. While laws restricting access to pornography could go much further, age-verification is a good place to start. It aims to protect children from easily accessing pornography from anywhere and holds pornography companies accountable if they fail to use meaningful age-verification methods.
Age-verification may make viewing pornography slightly more difficult for adults. Let’s hope that many adults will choose to forgo porn rather than verify their age, and that overall porn consumption consequently drops. Tellingly, the porn industry hates age-verification laws. In response to such laws, Pornhub, one of the largest porn companies in the world, chose to stop operating in certain jurisdictions rather than implement proper age-verification.
Conclusion
The U.S. Supreme Court’s decision in Paxton is a key development in the global effort to protect kids from pornography. Online porn is not only much more extreme, but also much easier to obtain and view secretly than it once was. Canadian society must abandon the notion that easy access to porn is a right and instead understand the profound damage it does to our lives and our culture. We are thankful to see good court decisions on this anywhere in the world and we pray that Canada will also pass and successfully defend age-verification legislation.
As the Canadian Senate considers Bill S-209, take a few minutes to send an Easymail asking them to pass the legislation as quickly as possible.
Pornography continues to be a public health crisis in Canada, especially for minors. As we’ve documented in our policy report, Canadians consume pornography at some of the highest rates in the world, rewiring the brain of the viewer and exploiting the sexuality of human beings made in God’s image. Although several pieces of legislation were introduced in the last parliament to regulate pornography, none of them passed.
That’s why we’re thankful that, on June 19, 2025, MP Michelle Rempel Garner introduced Bill C-216, the Promotion of Safety in the Digital Age Act, which seeks to protect young people online and to combat deepfake pornography. The bill is very similar to MP Rempel Garner’s bill in the previous Parliament, which never proceeded to the debate stage prior to the election.
Protecting Minors
The Promotion of Safety in the Digital Age Act would require social media operators to protect minors online. This would ensure that their personal data is not used in a manner that compromises their privacy and well-being. Operators would be required to prevent or mitigate the effects of online harassment, sexual violence, sexual images of a minor, the promotion of services or products that are illegal for minors, promotion of self-harm, encouraging addiction-like behaviours, and predatory marketing practices. This duty would impact algorithms, advertisements, and personal data.
The operator would also be required to provide parents of a minor using the platform with safety settings. These settings would relate to privacy, data deletion, time limits, and preventing purchases and financial transactions. The default setting would be the highest level of protection, but parents could opt out of certain safety settings. If the user is under 16, the operator must get express consent from a parent for the child to use the platform.
Bill C-216 would prohibit operators from using personal data to facilitate advertising products that are unlawful for minors and would allow parents to opt out of personalized recommendations based on algorithms. Parents and children would have access to a reporting mechanism to report online harms or risks to minors.
The operator would be required to have an independent review conducted every two years regarding the platform’s effects on minors. These findings must be made publicly available. Every year, an operator must prepare a report including how many minors visit their site, what systems are in place to protect minors, how many reports of harm it received from users, and how those issues have been addressed. Operators would receive a financial penalty if they contravene the Act.
Users who claim to have suffered serious harm (or their parents) would be permitted to bring legal action against the operator, something that is not possible now. Serious harm includes physical or psychological harm, substantial damage to reputation or relationships, and substantial economic loss.
Reporting Child Pornography
The Promotion of Safety in the Digital Age Act would also amend Canada’s legislation requiring Internet services to report child pornography, to ensure that such content is reported quickly and efficiently. The amendments in Bill C-216 would strengthen existing reporting requirements by clarifying the types of Internet services covered and the regulations around the notification process and time periods.
False Intimate Images
The third part of the Promotion of Safety in the Digital Age Act would clarify that the Criminal Code prohibits the sharing of deepfake pornography. These images are edited or altered to falsely depict a person in an intimate way.
The bill would also increase the maximum criminal penalty for the sharing of non-consensual intimate images to 10 or 14 years in prison in certain circumstances. It would also update the offence of criminal harassment to address the ease and anonymity of harassment online, creating a separate criminal offence for online harassment.
We’ve written in the past on the importance – and limitations – of governments combatting deepfake pornography. We’ve also written about British Columbia and Manitoba combatting the non-consensual sharing of intimate images. While some provinces have been combatting deepfake porn, the Criminal Code has not been updated to clearly prohibit it. Such an update is critical since artificial intelligence has developed to a point where people can be misrepresented in horrible ways.
Evaluating the legislation
Overall, Bill C-216 would place more responsibility on social media companies to protect children online, while also giving parents the tools to manage their child’s online use. While the bill addresses online harm for children, it does so without a new government bureaucracy and without other problematic hate speech elements that have previously been included with child protection legislation.
The Promotion of Safety in the Digital Age Act would also give social media users a right to take legal action against social media operators. In addition to potential fines, the threat of legal action incentivizes social media companies to implement better protections for young people online. Rather than requiring the government to police the internet, this legislation requires social media companies to better regulate themselves.
The bill’s approach to parental oversight of children’s internet use is important. Ensuring that social media companies give parents tools to apply safety settings, privacy settings, and limit their child’s social media use recognizes parents’ primary responsibility and authority to raise their children, and that includes how they manage a child’s internet use. The bill seeks to give parents the tools to adjust safety and privacy settings and choose which settings are most appropriate for their child.
Follow the progress
Private members’ bills undergo a long process before they become law and often do not pass. But we are encouraged to see another effort to combat pornography, to better protect young people on the internet, and to recognize the responsibility of parents in managing their children’s online use. We encourage Members of Parliament to support good policies that combat pornography and are thankful to see another good piece of legislation proposed. Bill C-216 will likely be debated in the House of Commons this fall. Send an EasyMail to your MP asking them to support the bill. For future updates and information about this and other bills that we are following, visit our Legislation to Watch page.
Across much of the western world, governments are beginning to recognize the harmful effects of pornography, particularly on young people.
In the United States, 24 states have passed age-verification legislation, requiring viewers of pornography to verify that they are over the age of 18. Most of these states have also called pornography a public health crisis because of the range of negative impacts it has on society. A bill in the U.S. Senate is seeking to implement age-verification requirements across the country.
The United Kingdom and France have also passed age-verification requirements that are coming into effect this year. Because of France’s age-verification law, Pornhub, one of the largest pornography websites in the world, blocked access to its website in France rather than comply with age-verification. Pornhub similarly blocked access to its site in 17 U.S. states in response to their age-verification laws. In a society where pornography has been easily accessible to anyone at any age, such laws are reducing ease of access to pornography.
Before the last federal election, Senator Julie Miville-Dechene introduced Bill S-210, which would have implemented age-verification standards in Canada. You can read more about ARPA’s support for Bill S-210 here. Bill S-210 passed the Senate and only had 3rd reading left in the House of Commons, but the bill died on the order paper when the 2025 federal election was called. At 2nd reading in the House of Commons, the bill had received unanimous support from Conservative, Bloc Quebecois, NDP, and Green MPs, as well as over a dozen Liberals.
Now that Parliament is sitting again, Senator Miville-Dechene reintroduced a slightly revised bill, Bill S-209. The Bill has already passed 2nd reading in the Senate and will be studied in Committee.
Bill S-209
Bill S-209 is an important initiative to protect children from pornography. The bill’s purpose is “to protect public health and public safety and, in particular, to
(a) protect the mental health of young persons by restricting their access to pornographic material;
(b) protect Canadians — in particular, young persons and women — from the harmful effects of the exposure of young persons to pornographic material, including demeaning material and material depicting sexual violence; and
(c) deter organizations that make pornographic material available on the Internet for commercial purposes from allowing young persons to access that material.”
Bill S-209 would require companies to verify that a person is over the age of 18 before they can view pornography online. Currently, in Canada, children can easily access pornography online without verifying their age. Bill S-209 would require effective age-verification, which refers to using physical or digital identity documents to verify age, or age-estimation technology, such as facial scans or other biometric or behavioural scans. Specific, effective methods would be determined by a government regulatory body following the bill’s passage.
If an organization fails to verify age and allows a young person to access pornography, it may be subject to fines of $250,000 for a first offence, or $500,000 for subsequent offences. If a site fails to comply with the regulations, the government agency in charge of enforcing the law can apply to a court to have the site blocked in Canada.
Moving Forward
The previous version of this legislation came close to becoming law and it’s great to see another attempt early in this new Parliament. Despite support on both sides of the political spectrum, the bill will face criticism from those who fear government regulation of the internet or worry about privacy concerns. But this law is an important step in combatting pornography in Canada.
Today, online pornography is not only much more extreme, but also much easier to obtain and view secretly than it once was. The intent of age-verification laws is to protect children from accessing pornography. But if such laws also make it more difficult for adults to access pornography, reducing overall consumption and addiction, or if they result in sites like Pornhub blocking access in Canada, that will be an added benefit.
Of course, parents need to understand the impact of pornography and how to protect their children from it as well. But parents cannot hold pornography companies accountable with the force of the law like governments can. While more can be done to combat pornography, Bill S-209 is an important place to start.
As this bill moves through the Senate, send an EasyMail to your senators, asking them to better protect children from pornography and to support Bill S-209.
Status: At Committee in the Senate.
Description: Makes it illegal to make pornography accessible to young persons online and requires pornography companies to verify the age of potential viewers of pornography.
Analysis: This bill strengthens the prohibition of minors’ access to pornography by creating fines of up to $250,000 for the first offence and $500,000 for subsequent offences. Pornography companies cannot claim that they did not know the age of the pornography consumer (e.g. plead ignorance) as a way to get out of the fine but must verify the age of any potential pornography consumer. For more about this bill, read Age Verification Bill Reintroduced in the Senate.
Words matter. What we choose to call something both shapes and is shaped by what we understand it to be. For example, should we call it euthanasia, assisted suicide, or ‘medical assistance in dying’? Should we call it reproductive care, abortion, or murder? Should we use the term prostitution or sex work? Each has different connotations and normative implications.
The importance of word choice is no less true when talking about other issues in Canada’s Criminal Code. That’s why Member of Parliament Mel Arnold introduced a private member’s bill to change the term ‘child pornography.’ Bill C-291 was introduced in the House of Commons in June 2022 and received Royal Assent on October 10, 2024. The bill amends the Criminal Code to change the term ‘child pornography’ to ‘child sexual abuse and exploitation material.’ Although a seemingly minor change, the bill was unanimously supported in both the House of Commons and the Senate.
Increasingly, pornography is considered acceptable in Canadian society. As multiple MPs and Senators pointed out in debates on the bill, the term ‘pornography’ indicates that what is being created or viewed is legal and consensual. Child pornography is neither legal nor consensual. In debate at 2nd reading, MP Mel Arnold stated, “What the Criminal Code currently calls ’child pornography’ is more severe than mere pornography because it involves children and cannot be consensual. It is exploitive and abusive, and the Criminal Code should clearly reflect these realities.” Similarly, in a recent court case out of British Columbia, Judge Gregory Koturbash noted that “[t]he phrase ‘child pornography’ dilutes the true meaning of what these images and videos represent … These are not actors. It is not consensual. These are images and videos of child sexual abuse.”
Other countries have made a similar change, which highlights that sexual abuse and exploitation take place every time child pornography is created, whether or not it is created by electronic or mechanical means. One Bloc Quebecois MP noted in debate, “Masquerading as a libertarian utopia, child pornography is actually a system in which humans exploit other humans. We need to tackle it head on.”
While a terminology change does not substantively change or strengthen the law, we are thankful to see MPs and Senators clearly understand and describe child pornography as exploitation and abuse. Child sexual abuse and exploitation material is rightly illegal in Canada, and the terminology surrounding it should be clear and the law actively enforced.
You can read more about ARPA’s policy recommendations in our respectfully submitted policy report on pornography.
What has been your worst nightmare? Maybe it was someone or something chasing you. Maybe it was being attacked. Maybe it was a loved one passing away.
Although they aren’t real, these nightmares can be terrifying in the short term. Perhaps they prevent you from going back to sleep or leave you anxious during your morning shower or your commute to work. But thankfully, these nightmares fade away quickly because they weren’t real. They didn’t actually happen.
That’s how nightmares – false imaginations of reality – work.
Unfortunately, we’ve entered an age where nightmares can come true.
Deepfake Pornography
We’ve now entered an era that some are already calling the AI (artificial intelligence) era. This technology, which can be used to create conversations, music, images, videos, and who knows what next, has immense promise to increase productivity and improve lives. But some are using it to create nightmares.
Imagine that someone takes a picture of your face and uses AI technology to create pornographic content. Nude photos. Sexual activity. That pornography might not actually depict your body, but your face is still on it. It may not be you, but it sure looks like you.
And now imagine that the creator publishes these intimate images on a pornography site like PornHub, posts them on a social media site like X (Twitter), or shares those images with your friends (or your enemies). The people that you pass on the street, in the classroom, or even in church look at you a little differently. You wonder why, until some embarrassed friend (or even worse, a gloating enemy) shares the photo or video with you.
You immediately feel engulfed in shame. You want to crawl into a hole and never show your face in public again. Adam and Eve were ashamed of their nakedness after the fall into sin, but they were naked only in the sight of one another. You’ve been exposed to who knows how many people. Tens? Hundreds? Thousands? Millions?
This might be a woman’s worst nightmare. (If you’re a man, think of your mother, your wife, or your daughter being at the center of this story.) It recently happened to Taylor Swift: deepfake nude images of her were shared on X (Twitter) and viewed by millions of people before they were taken down. The technology to create and disseminate such images exists now and is becoming cheaper and easier to use.
Governments Step Up
That’s why governments are stepping up to combat deepfake pornography. Last year we covered how British Columbia and Manitoba passed legislation to combat the non-consensual sharing of intimate images, but technology and the nature of pornography continue to change rapidly. The federal government’s recent Online Harms Bill, an otherwise terrible piece of legislation when it comes to free speech, contains some positive steps to combat pornography and intentionally includes deepfake materials in its anti-pornography provisions. This is a positive step at the federal level to combat such vile material.
The new Manitoba government has also introduced legislation to amend its previous Intimate Image Protection Act to prohibit the sharing of deepfake pornography. In introducing this legislation, the Manitoba Minister of Justice and Attorney General stated, “This bill will expand the definition of intimate image to capture images created or altered by electronic means, and update the title of the act. By expanding the definition, victims who have had computer-generated or altered intimate images distributed without their consent will have access to the civil remedies provided for under the act. These amendments will also act as a deterrent for would-be distributors of electronically altered or created intimate images. While other jurisdictions such as British Columbia, New Brunswick and Saskatchewan have brought forward similar legislation, these amendments to The Intimate Image Protection Act will make Manitoba a leader in protecting and supporting victims.”
This is a laudable step that every province should adopt if it does not already have legislation on the books to prohibit deepfake pornography. British Columbia, Saskatchewan, New Brunswick, and PEI already have legislation that not only forbids the sharing of non-consensual intimate images but also covers deepfake images. Provincial legislation in Alberta, Nova Scotia, and Newfoundland and Labrador has yet to grapple with the possibility of deepfake pornography. Ontario entirely lacks any provincial legislation dealing with the dissemination of non-consensual intimate images.
As the technology that is used to create deepfake materials becomes more widely available (more “democratized”), governments, law enforcement, and women in general will increasingly be playing whack-a-mole, trying to shut down amateur creators in their basements and professional players in the pornography industry who produce deepfake porn. In such an environment, governments will need every tool at their disposal to protect both women and men from the evils of deepfake pornography.
But Government Isn’t Enough
But at the end of the day, deepfake pornography isn’t the fault of the technology, of technology companies, or of governments. Many people are quick to insist that AI image- and video-generators are evil in and of themselves and should be banned by governments or have their development halted by tech companies. But the fault doesn’t lie in the technology itself. It is a problem that stems from evil human hearts.
As technology continues to push the frontiers of what is possible and release human beings from dependence on and the oversight of others, it will continue to reveal the depravity of the human heart. Six hundred years ago, the human heart was just as corrupt as it is today, but pornography was relatively rare, if only because any sexually explicit content had to be drawn, written, or sculpted by hand. Subsequent communication technologies – the printing press, the telegraph, the still camera, the video camera, editing software, and then the internet – made it possible to create pornography more easily and to disseminate it more widely. They also made the consumption of pornography more private. Once upon a time, people had to purchase a pornographic magazine from a store and suffer the judgement of the store clerk and anyone who saw them enter the building. Now anyone can access pornography anonymously online.
These technologies and these developments did not make human beings any more evil (or any more virtuous). They simply made it easier for people to satisfy the desires of their sinful heart and avoid the authority structures (e.g. parents, teachers, governments) that God has instituted, at least in part, to restrain evil.
Beyond government (and educational and parental) restriction, our society needs cultural transformation to combat pornography. Specifically, we need to promote a biblical perspective on human sexuality: the blessing it is within the bounds of marriage, and the sin it is outside of this covenant relationship. But beyond that, we need to share the gospel so that sinful human hearts are not merely restrained but redeemed and transformed.
For some further ponderings around the impact of AI in the world around us, check out the AI-themed March-April 2024 edition of Reformed Perspective.
ARPA Canada has a new policy report on pornography that seeks to answer the question: What can (and should) governments do about pornography?
The pornography industry is one of the most prominent drivers of sex trafficking. Pornography is also often used to groom women and children into further sexual exploitation. Porn sites often include videos and images of violent pornography and pornographic content featuring minors. Pornography is often uploaded without the consent of those depicted, and it is nearly impossible to have such content permanently removed from the internet.
Pornography is inherently dehumanizing, treating people (especially women and girls) as objects and seeing their worth in their ability to satisfy sexual desires. Pornography objectifies and degrades human beings made in the image of God. It is also forbidden by the seventh commandment, which encompasses looking at a woman lustfully (Matthew 5:28).
Pornography is so pervasive that the website Pornhub alone receives roughly 4 million unique user visits per day in Canada, or about 10% of the entire Canadian population. Pornography use is highly addictive, often destroying relationships and changing users’ attitudes and beliefs about sex. It is a scourge on our society and should be combatted.
Restricting Pornography
The Canadian Criminal Code does not explicitly mention pornography except in the context of child pornography. However, it does prohibit obscenity, defined as “any publication a dominant characteristic of which is the undue exploitation of sex, or of sex and any one or more of the following subjects, namely, crime, horror, cruelty and violence.” According to the law, much of the pornography that circulates online today would be considered obscene. But the law is poorly enforced.
There have been attempts in recent years to better restrict pornography in Canada. Bill C-270, introduced in the House of Commons in 2022, seeks to prohibit the creation of pornographic material for a commercial purpose without verifying the age and consent of those shown. It also seeks to create an offence for failing to remove videos or images for which consent has been withdrawn.
In 2021, Senator Julie Miville-Dechêne introduced Bill S-210, which seeks to prevent sexually explicit material from being made available to young people on the internet. It would do so by requiring pornography companies or internet service providers to reliably verify the age of potential users.
Other jurisdictions, including the United Kingdom, Australia, and France, have been implementing age-verification processes due to the high numbers of children who are being exposed to pornography. Canada should follow their lead.
Updated Policy Report
ARPA Canada recently published a revised and updated Respectfully Submitted policy report on pornography. In this report, we consider practical ways that the government can (and should) limit pornography in Canada. This report was first published in 2017 and has now been updated to reflect political and legal developments in Canada and other countries.
Canadian MPs and Senators are considering how they should restrict pornography, at least in certain areas. We encourage you to read through the report and connect with your representatives to encourage them to read it as well. Please contact us at [email protected] if you have any feedback, suggestions for improvement, or other questions on the report.
Status: No longer active due to prorogation of Parliament.
Description: Regulates a wide variety of online harms including child pornography, non-consensual pornography, hate speech, promoting genocide, and promoting anti-Semitism. This includes once again adding hate speech as a violation of human rights under the Canadian Human Rights Act. This offence (Section 13) was removed from the Human Rights Act by the previous Conservative government.
Analysis: The provisions that would restrict access to and dissemination of pornography are positive policies that ARPA has called for in our policy report on pornography. However, the addition of hate speech provisions to Canada’s Human Rights Act will likely lead to less freedom for Christians to proclaim the truth of God’s Word in the public square.
Articles: “One Step Forward, Two Steps Back in Online Harms Bill”
Action Items: