Market-driven Regulatory System

There are less restrictive means than government-established and government-enforced regulation, such as rating systems and filtering (or screening), to address the problem. A market-driven regulatory system combining Internet ratings with filtering software may provide the best way to avoid government interference while protecting children from harmful materials.

The foundation for voluntary rating lies in a technology called the Platform for Internet Content Selection (PICS), a set of common protocols that enables blocking software to associate a rating label with Internet content. PICS labels are based on existing rating systems, such as that of the Recreational Software Advisory Council (RSAC), and PICS allows websites to be rated either by content providers themselves or by independent third parties.
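To make the label format concrete, the following sketch (in Python) shows a label of the general PICS-1.1 shape and how blocking software might read its rating categories. The category letters follow the RSACi scheme (v = violence, s = sex, l = language, n = nudity); the rated URL and the specific values are invented for illustration.

    import re

    # A label of the general PICS-1.1 shape; the rated URL and the
    # category values here are hypothetical.
    label = ('(PICS-1.1 "http://www.rsac.org/ratingsv01.html" '
             'for "http://www.example.com/page.html" '
             'r (n 0 s 0 v 2 l 1))')

    # Extract the category/value pairs from the "r (...)" clause.
    pairs = re.search(r'r \(([^)]*)\)', label).group(1).split()
    ratings = {pairs[i]: int(pairs[i + 1]) for i in range(0, len(pairs), 2)}

    print(ratings)  # {'n': 0, 's': 0, 'v': 2, 'l': 1}

Blocking software then compares a set of ratings like this against the limits a user has chosen, as illustrated below.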

PICS allows for a proliferation of rating and screening systems that reflect diverse viewpoints and flexible selection criteria. By putting content control in users' hands, PICS addresses constitutional concerns: it lets Internet users tailor content control to their own requirements while avoiding, or at least limiting, legislative censorship, and it helps achieve an essential goal of quelling political pressure for such censorship (McGuire, 1999).
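In practice, "flexible selection criteria" can be as simple as a per-category threshold chosen by the user. The following sketch is a hypothetical illustration of that idea, not the logic of any particular product: a page is allowed only if every rated category falls within the user's chosen limits.

    # Hypothetical per-category limits chosen by a user (RSACi-style
    # categories: v = violence, s = sex, l = language, n = nudity).
    limits = {'v': 1, 's': 0, 'l': 2, 'n': 0}

    def allowed(ratings, limits):
        """True if every rated category is within the user's limits.
        Categories absent from the limits default to the strictest
        setting (0) in this sketch."""
        return all(value <= limits.get(category, 0)
                   for category, value in ratings.items())

    print(allowed({'n': 0, 's': 0, 'v': 2, 'l': 1}, limits))  # False (v = 2 > 1)
    print(allowed({'n': 0, 's': 0, 'v': 1, 'l': 1}, limits))  # True

Because PICS separates the label from the selection rule, two households can apply entirely different limits to the same labels, which is precisely the flexibility described above.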

By allowing parents to screen out material that they deem inappropriate for their children, such a system could render legislative efforts to censor the Internet moot. In addition to various rating systems, software companies can offer packages with different levels of content screening. A competitive market in screening software could thus be created, lessening concerns that a universal screening system would impose subjective value judgments on users (McGuire, 1999).
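The "different levels of content screening" such packages might offer can be pictured as preset profiles over the same rating categories. The profiles and limits below are invented purely for illustration; no real product is described.

    # Hypothetical preset screening levels a vendor might sell
    # (RSACi-style categories: v, s, l, n); the limits are made up.
    PROFILES = {
        'strict':   {'v': 0, 's': 0, 'l': 0, 'n': 0},
        'moderate': {'v': 1, 's': 0, 'l': 1, 'n': 0},
        'lenient':  {'v': 3, 's': 2, 'l': 3, 'n': 2},
    }

    def allowed(ratings, level):
        """Check a page's ratings against the limits of a chosen profile."""
        limits = PROFILES[level]
        return all(value <= limits.get(category, 0)
                   for category, value in ratings.items())

    page = {'v': 1, 's': 0, 'l': 1, 'n': 0}
    for level in PROFILES:
        print(level, allowed(page, level))  # strict: False; others: True

Competing vendors would differ precisely in how such profiles are drawn, which is what gives consumers a genuine choice among value systems.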

There are typically two ways of filtering: users can either subscribe to a service offered by their Internet service provider (ISP) or purchase a product that is installed on the network server (Holzberg, 1999).

Many web pages that contain sexually explicit or violent material are preceded by a warning message that appears on the user's computer screen. Online service providers have also monitored chat rooms to ensure that content is appropriate for children. Further, many ISPs, acting out of liability concerns as well as the desire to avoid unilateral regulation, have rated content by developing blacklists of sites that they determine to be obscene or otherwise offensive (McGuire, 1999).

Blacklists and monitoring by ISPs, however, are cumbersome technologies that raise serious concerns about overregulation and the imposition of someone else's morality on an unsuspecting viewer. Further, it is impossible for an individual ISP to review all the content offered on the web. While some providers have resorted to word searching and screening on the basis of objectionable phrases, such a regime is not functional in practice. Moreover, blacklists may not be effective because they ban only the address, not the content itself, and Internet technology allows another content provider or user to circumvent the restriction by simply placing the content at another address (McGuire, 1999).
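The address-versus-content weakness is easy to demonstrate. In the hypothetical sketch below, a blacklist blocks one host, but byte-for-byte identical content served from a different address passes untouched, because only the address is ever examined.

    from urllib.parse import urlparse

    # A made-up blacklist entry; real ISP blacklists work on the same
    # principle of matching addresses, not content.
    blacklist = {'www.offensive-example.com'}

    def blocked(url):
        """Block purely on the host portion of the URL."""
        return urlparse(url).hostname in blacklist

    print(blocked('http://www.offensive-example.com/page1.html'))  # True
    print(blocked('http://mirror.example.org/page1.html'))         # False
    # The second request succeeds even if it returns exactly the same
    # content: moving the material to a new address defeats the list.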

Filtering software, by contrast, allows parents or online users to control content by customizing the filter used when they access the Internet. Generally, such software can block access to the World Wide Web, newsgroups, and other online services, and can prohibit access on particular days of the week or at particular times of day (McGuire, 1999).
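A day-of-week and time-of-day restriction of the kind described is straightforward to sketch; the schedule below is invented for illustration and is not drawn from any actual product.

    from datetime import datetime, time

    # Hypothetical allowed windows per weekday (Monday = 0 ... Sunday = 6):
    # weekday evenings, with longer windows on the weekend.
    SCHEDULE = {
        0: (time(17, 0), time(20, 0)),
        1: (time(17, 0), time(20, 0)),
        2: (time(17, 0), time(20, 0)),
        3: (time(17, 0), time(20, 0)),
        4: (time(17, 0), time(21, 0)),
        5: (time(10, 0), time(21, 0)),
        6: (time(10, 0), time(20, 0)),
    }

    def access_allowed(moment=None):
        """True if the given (or current) moment falls in an allowed window."""
        moment = moment or datetime.now()
        window = SCHEDULE.get(moment.weekday())
        if window is None:
            return False  # no window defined for this day: block all day
        start, end = window
        return start <= moment.time() <= end

    print(access_allowed(datetime(2001, 11, 28, 18, 30)))  # Wednesday evening: True
    print(access_allowed(datetime(2001, 11, 28, 23, 0)))   # late night: False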

