Course:LIBR548F/2010WT1/Censorship Content-control Software

From UBC Wiki

Introduction

The development and expansion of social media has given authors new ways to publish their works and make them accessible. Like their physical counterparts, these works can be censored. In the electronic realm, they can also be inadvertently censored by content-control software.

Content-control software is designed to restrict a user’s access to web content. It is commonly used to prevent users from viewing content that political, moral, or religious authorities find objectionable. Such software can restrict access to websites based on keywords, domain names, or the use of proxy hosts.
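The three restriction mechanisms above can be sketched in a few lines. The keyword list, domain blocklist, and proxy-host list below are illustrative assumptions, not drawn from any real filtering product:

```python
# A minimal sketch of how content-control software might decide whether to
# block a URL. All list contents are hypothetical examples.
from urllib.parse import urlparse

BLOCKED_KEYWORDS = {"obscene", "explicit"}      # matched anywhere in the URL or page text
BLOCKED_DOMAINS = {"example-blocked.com"}       # matched against the host name exactly
KNOWN_PROXY_HOSTS = {"open-proxy.example.net"}  # blocks attempts to bypass the filter

def is_blocked(url: str, page_text: str = "") -> bool:
    host = urlparse(url).hostname or ""
    if host in BLOCKED_DOMAINS or host in KNOWN_PROXY_HOSTS:
        return True
    haystack = (url + " " + page_text).lower()
    return any(keyword in haystack for keyword in BLOCKED_KEYWORDS)

print(is_blocked("http://example-blocked.com/page"))         # blocked: domain blocklist
print(is_blocked("http://example.org/", "explicit lyrics"))  # blocked: keyword match
print(is_blocked("http://example.org/recipes"))              # allowed
```

Real filtering products combine these checks with vendor-maintained category databases, but the substring test shown here is the core of keyword-based filtering, and it is what produces the over-blocking discussed below.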

American and Canadian Policies and Initiatives

In an effort to protect minors from accessing obscene and pornographic material on the internet, American and Canadian politicians have developed policies and initiatives to promote the use of content-control software.

The United States

The Communications Decency Act of 1996,[1] enacted as Title V of the Telecommunications Act of 1996,[2] was the first notable attempt by the U.S. government to regulate pornographic material on the internet. It imposed criminal sanctions on anyone who displayed “patently offensive”[3] material on any computer that could be viewed by a minor. Critics protested that the Act violated their First Amendment right to free speech because it extended to most online communications. The U.S. Supreme Court struck down its indecency provisions on the grounds that they extended to non-commercial speech; did not allow parents to determine for themselves what was suitable for their children; and failed to define the term “patently offensive,” which had no prior legal use.[4]

The Child Online Protection Act of 1998[5] marked a second attempt to regulate online pornographic content. It was designed to restrict minors’ access to materials deemed harmful to them. The law was ruled unconstitutional by the U.S. federal courts and was prevented from taking effect.[6]

The Children’s Internet Protection Act of 2000[7] represented a change in strategy: it required all publicly funded schools and libraries to purchase and use internet filters as a condition for receiving federal discounts on telecommunication and internet services. Unlike its predecessors, the Act was upheld by the U.S. Supreme Court because it regulated federal assistance rather than online speech.[8]

Canada

In comparison, the Canadian government has acted conservatively in matters of obscenity legislation, revising existing obscenity provisions to extend to electronic forms. To address the concerns of parents and teachers, in 2001 the government launched the Canadian Strategy to Promote Safe, Wise and Responsible Internet Use.[9] This later evolved into the National Strategy for the Protection of Children from Sexual Exploitation on the Internet.[10]

Selection or Censorship?

Government officials, advocacy groups, and private individuals have called for the use of content-control software on public computers to block sites, regulate use, and monitor online activity in an effort to protect minors from obscene material. They justify its use by likening it to the traditional selection process by which librarians determine what materials to include in or exclude from their collections.[11] Librarians do not keep pornographic materials on their shelves, the argument goes, and so should strive to keep them off public-access computer terminals.

Critics, however, have countered that such filters restrict access to, and thus censor, far more than obscene or pornographic websites.[12] Because these programs filter on instances of keywords, they indiscriminately block access to sites containing artworks, literary classics, and historical or political information, as well as social networking sites, blogs, videos, games, email, and other Web 2.0 applications.[13]
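The over-blocking the critics describe follows directly from how keyword matching works: the same substring test that catches objectionable pages also catches health information and ordinary place names. The keyword list and sample texts below are illustrative, not taken from any cited filtering product:

```python
# A sketch of why naive keyword matching over-blocks legitimate content.
# The keywords and sample texts are hypothetical illustrations.
BLOCKED_KEYWORDS = {"sex", "breast"}

def is_blocked(text: str) -> bool:
    lowered = text.lower()
    return any(keyword in lowered for keyword in BLOCKED_KEYWORDS)

samples = [
    "Breast cancer screening guidelines",  # health information
    "Tourist guide to Sussex and Essex",   # place names contain "sex"
    "Middlesex County Public Library",
]
for sample in samples:
    print(sample, "->", "BLOCKED" if is_blocked(sample) else "allowed")
# All three legitimate texts are blocked by the substring test.
```

Because the filter has no sense of context, every one of these innocuous texts trips the same rule that was meant to catch pornography, which is exactly the indiscriminate blocking the critics document.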

While it is debatable whether current works published on social media platforms are "books," a growing number of authors, bloggers, and Twitterers are publishing their electronic works in traditional book formats. Examples of this trend include Christian Lander’s Stuff White People Like: A Definitive Guide to the Unique Taste of Millions, which originated as a WordPress blog, and Kevin D. Hendricks’s Addition by Adoption: Kids, Causes & 140 Characters, which was compiled from Tweets. In their original form, these works are at risk of being censored by indiscriminate content filtering. Likewise, websites such as ReadPrint, which electronically publishes classics such as Uncle Tom's Cabin and The Canterbury Tales for educational purposes, are at risk of being censored.

Further Reading

Deibert, Ronald, John Palfrey, Rafal Rohozinski and Jonathan Zittrain, eds. Access Denied: The Practice and Policy of Global Internet Filtering. Cambridge, MA: The MIT Press, 2008.

Written from a variety of viewpoints, Access Denied examines the social, political, cultural, and legal contexts of internet filtering as it is applied in countries around the world.


Spinello, Richard A. CyberEthics: Morality and Law in Cyberspace. Sudbury, MA: Jones and Bartlett Publishers, 2003.

CyberEthics provides readers with a discussion of the moral problems and social costs that have resulted from the expanded use of communications and information networks.


Spinello, Richard A. and Herman T. Tavani, eds. Readings in CyberEthics. Sudbury, MA: Jones and Bartlett Publishers, 2004.

Readings in CyberEthics examines different ethical perspectives as they relate to online speech, property, privacy, and security.

Notes

  1. “Communications Decency Act,” Electronic Privacy Information Center, http://www.epic.org/free_speech/cda/cda/html (accessed September 21, 2010).
  2. Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat. 56 (1996).
  3. “Communications Decency Act,” Electronic Privacy Information Center.
  4. Reno v. American Civil Liberties Union, 521 U.S. 844 (1997).
  5. “Child Online Protection Act,” Electronic Privacy Information Center, http://epic.org/free_speech/censorship/copa.html (accessed September 22, 2010).
  6. “ACLU v. Mukasey,” Electronic Privacy Information Center, http://epic.org/free_speech/copa/ (accessed September 22, 2010).
  7. Children’s Internet Protection Act, Pub. L. No. 106-554 (2000).
  8. United States v. American Library Association, Inc., 539 U.S. 194 (2003).
  9. Lynne Casavant and James R. Robertson, “The Evolution of Pornography Law in Canada,” Publications List: Library of Parliament – Parliament Information and Research Service, http://www2.parl.gc.ca/Content/LOP/ResearchPublications/843-e.htm (accessed September 23, 2010).
  10. Canadian Centre for Child Protection, Inc., Cybertip.ca, http://www.cybertip.ca/app/en/ (accessed September 23, 2010).
  11. Hampton “Skip” Auld, “Do Internet Filters Infringe Upon Access to Material in Public Libraries?”, Public Libraries 44, no. 4 (2005), 197; Skip Auld, “Filtering Materials on the Internet Does Not Contradict the Value of Open Access to Material,” Public Libraries 44, no. 4 (Jul/Aug 2005), 196; David Burt, “In Defense of Filtering,” American Libraries 28, no. 7 (1997), 46-48; Will Manley, “Good Fences Make Good Libraries,” Booklist 98 (2001), 446; Michael Schuyler, “Filters: It’s Not About Porn, Stupid!,” Computers In Libraries 17 no. 9 (1997), 31-33.
  12. Michael Schuyler, “Filters Revisited,” Computers In Libraries 21 no. 6 (2001), 46-49; Carol Stanley and Jerry Stovall, "The Blocked Blog (or Websense and the Technical Colleges' Fight for Academic Freedom)," Georgia Library Quarterly 45, no. 1 (Spring 2008), 5-8; Art Wolinsky, "Mandating the Wrong Filters," Teacher Librarian 29, no. 1 (October 2001), 26-27.
  13. Marjorie Heins, Christina Cho, and Ariel Feldman, Internet Filters: A Public Policy Report, 2006, http://www.fepproject.org/policyreports/filters2.pdf (accessed September 22, 2010).