
=========================================================================
    ________________          _______________         _______________
   /_______________/\        /_______________\       /\______________\
   \\\\\\\\\\\\\\\\\/        |||||||||||||||||      / ////////////////
    \\\\\________/\          |||||________\        / /////______\
     \\\\\\\\\\\\\/____      ||||||||||||||       / /////////////
      \\\\\___________/\     |||||               / ////
       \\\\\\\\\\\\\\\\/     |||||               \////   e c t o r

=========================================================================
EFFector       Vol. 10, No. 03       Feb. 28, 1997       editor@eff.org
A Publication of the Electronic Frontier Foundation       ISSN 1062-9424

IN THIS ISSUE:

EFF Online Filtration/Ratings/Labelling Public Interest Principles
Quote of the Day
What YOU Can Do
Administrivia

* See http://www.eff.org/hot.html for more information
on current EFF activities and online activism alerts! *

----------------------------------------------------------------------


Subject: EFF Online Filtration/Ratings/Labelling Public Interest Principles
---------------------------------------------------------------------------

ELECTRONIC FRONTIER FOUNDATION

PUBLIC INTEREST PRINCIPLES
FOR ONLINE FILTRATION, RATINGS AND LABELLING SYSTEMS


Public Discussion Draft Version 1.0b      Feb. 28, 1997

Please submit comments or questions to mech@eff.org, with "FILTER DRAFT"
in the subject line, by March 31, 1997 if possible. This draft should not
be redistributed beyond March 31, 1997. The latest version can be found at
http://www.eff.org/pub/Net_info/Tools/Ratings_filters/eff_filter.principles

This document is a DRAFT, and should not be quoted or paraphrased as a
final statement of position, policy or opinion.

If your organization wishes to endorse this document, please send a message
to that effect to mech@eff.org or fax: +1 415 436 9333.


INTRODUCTION
____________

As the Internet and other computer networking technologies become
increasingly intertwined with the daily lives of large numbers of people,
concerns are frequently raised about locating relevant online material
in a sea of data, preventing the exposure of minors to sexually explicit
expression, ensuring that paid online work time is spent productively,
and avoiding racist, sexist or otherwise offensive electronic messages.

A market in competing and complementary filtration solutions has arisen to
address these concerns, empowering the individual to manage the "firehose"
of information available in cyberspace - as well as to manage employee
online time on the job, or children's access to controversial information.
These tools range from email sorting utilities, through specially filtered
sites for children (that provide links only to pre-reviewed material), to
applications and services that track employee Web browsing. Soon, search
engines and "intelligent" agents may also incorporate aspects of
filtration or content labelling.

Even as these new technologies empower users, parents, and employers,
they pose unique conundrums, involving participant privacy, freedom of
expression, and intellectual property, among other issues. Many questions
are raised: "Who's watching and recording what?" "What happens to my
personal information when I send it to a filtering site?" "Who decides
whether a site is to be blocked by this filtering software I use?"

Although many benefits accrue to individual control over Internet content
at the receiving end, the technologies that make this possible also pose
several risks for users, on all sides.

The principal areas of concern are:

* protection of end-user privacy;

* ability of parents to understand, and to select in detail, what is
filtered;

* protection of intellectual property rights;

* maintenance of the integrity of information;

* viability of positive as well as negative filtration tools;

* prevention of a system of self-censorship;

* ability of content providers to challenge inappropriate blockage or
inaccurate ratings/labels.

These concerns may be addressed by applying core online principles of
trust, and more specific guidelines for filtration/ratings/labelling
policies.


Core Online Trust Principles
____________________________

EFF has developed a set of core principles for the implementation and
operation of rights-affecting networking technologies, necessary to
establish a base level of consumer and organizational trust in privacy,
security, and free flow of information online:

* Informed Consent Is Necessary

Consumers have the right to be informed about the privacy, security,
intellectual property and intellectual freedom consequences of an online
transaction or activity, BEFORE entering into one.

* There Is No Privacy Without Security

System security is inextricably linked with privacy - and protection of
intellectual property rights - in an online interaction.

* Standards Vary According to Context

No single narrow standard or policy, regarding free speech, privacy,
or security, is adequate for all situations, or for all participants.


Guidelines for Implementation of Internet Filters and Ratings/Labelling
_______________________________________________________________________

1) Users' Information Privacy - Disclosure and Opt-Out from Personal
Information Use and Re-Use

* The filtration provider must inform the user of what personally
identifiable information about the user is being kept and how that
information is used (including use by the filtration provider and/or by
any intermediary, such as an educational institution or employer), and
whether the information will be made available to other parties, in what
form, to whom, and for what purpose.

* Users must have the right to opt out of any outside third-party use of
personally identifiable information, and to restrict use and
redistribution of that information by such outside parties.

* Intermediaries must have the right to opt the intermediary and the user
out of outside-party use, and out of marketing use by the service
provider.
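
Example: to make the disclosure and opt-out rights above concrete, here
is a minimal sketch, in Python, of the kind of per-user privacy record a
filtration provider might keep. All field names are hypothetical
illustrations, not drawn from any existing product.

  # Hypothetical sketch: a per-user privacy record honoring the
  # opt-out rights described above. Field names are illustrative.
  from dataclasses import dataclass

  @dataclass
  class PrivacyPreferences:
      # What the provider has disclosed it collects about this user.
      collected_fields: tuple = ("name", "email", "sites_visited")
      # The user's opt-outs; defaults favor privacy.
      allow_third_party_use: bool = False     # outside-party use
      allow_redistribution: bool = False      # re-use by outsiders
      allow_provider_marketing: bool = False  # provider marketing
      # An intermediary (school, employer) may also opt the user out.
      intermediary_opt_out: bool = False

  def may_share_with_third_party(prefs):
      """Data leaves the provider only if neither the user nor the
      intermediary has opted out."""
      return (prefs.allow_third_party_use
              and not prefs.intermediary_opt_out)

  if __name__ == "__main__":
      prefs = PrivacyPreferences()
      print(may_share_with_third_party(prefs))  # False by default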


2) Children's Information Privacy - Protection of Identity and
Confidentiality of Minor Status

* The product or service should never reveal that the browsing/posting
user is a minor, nor reveal any personally identifiable information
publicly or to outside parties, without the intermediary's knowledge and
consent. A child's browsing or other preferences or habits should not
be made available to outside parties in a personally identifiable
manner at all.

* Private information such as address or phone number should
not be released to outside parties without the written and informed
consent of the parent.

Notes: Already, the US Congress, the Federal Trade Commission,
and other governmental bodies around the world are examining
possible regulatory measures to prevent marketing with personal
information about children, and to restrict the collection and
redistribution of such information. As with the "L-18" user
identification system proposed by the Dept. of Justice in the
Communications Decency Act trial - a proposal rejected by the
court - the "broadcasting" of an online child's age, or even their
status as a minor, may make it easier for abusive individuals to
target children.


3) Availability of Default Content & Filtration Criteria and Operational
Details

* An explanation of the filtering or rating criteria, and the values or
principles underlying them, must be accessible easily and without fee to
customers and content providers, in enough detail to make meaningful
choices.

* Customers must especially be informed whether the filtration
may block political/social discussion, news reportage, literature,
art, or scientific/reference works, as well as presumed targets (e.g.,
explicit images, private "chat" sessions, email, or advertising).
It must be clear whether blocking is based on topic, keywords, and/or
other distinctions, and how broadly it may reach.

* Customers must also be informed of the limitations of the
software/service - what it does NOT filter, what it cannot prevent - and
generally how the filtering works (with due respect for trade secrets &
proprietary information, of course).


4) Notice of Active Filtration and Tracking

* User tracking, such as "click-stream" information or "audit trails",
should be an option (if offered at all), not a default.

* If any tracking is enabled, and information on the user's browsing or
other Net use (including anything from a list of sites to a full-text
log, whether held on the customer's own system or by the filtration
service provider) is available for review by a parent, an intermediary
or an outside party, the user should be notified during use or sign-on
that their usage is being monitored and may be reviewed, and by whom.
This notice should come before any connection attempt or other online
activity is logged or processed, and may be shown more frequently to
give better notice.

* If the service or software does not provide such notice to the user,
then it also must not provide an on-site or off-site audit trail or other
form of log available to a parent or employer (nor to outside parties
without a court order). Audit trail or other tracking information must
never be made available to the public without the explicit written
permission of the user.

* If a site, session or document is blocked, some kind of notice should
appear explaining why, regardless of whether or not the session is being
logged/tracked.

Notes: As filters become more common both in the home and the workplace,
several concerns arise about "secret monitoring". Users of any age
deserve the same notification of loss of privacy online as they do when
their phone conversations are recorded. Children's and teens' physical
safety, even their lives, may be at stake in some cases.

Examples: Proper notice might consist of pop-up screens that tell the
user at the beginning of a session that their Net browsing is being
recorded, and that their parents or employers will have access to a list
of what sites or newsgroups the user has been reading. This reminder
might reappear every half-hour or so. On the other hand, a simple email
filter that sorts incoming messages into content-relevant mailboxes,
discarding any emails with profanities in the process, might give no
notice (other than logging what it had done). The level and detail of
notice should depend on the potential negative privacy impact on the user.
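
As a sketch of the email filter just described - one that sorts by
content, discards profanity, and logs its actions - consider the
following Python fragment. The word list, folder names and notice text
are invented for illustration; a real filter would need far more care.

  # Hypothetical sketch of the filter described above: it sorts
  # messages into content-relevant mailboxes, discards messages
  # containing listed profanities, and logs what it did. Because
  # the log is reviewable, the user is shown a notice first; the
  # notice wording here is illustrative only.
  PROFANITY = {"damn"}  # stand-in word list
  FOLDERS = {"invoice": "billing", "meeting": "work"}  # keyword -> box

  NOTICE = ("Note: incoming mail is filtered, and a log of filter "
            "actions may be reviewed by the account holder.")

  def route(subject, body, log):
      """Sort one message; return its mailbox, or None if discarded."""
      text = (subject + " " + body).lower()
      if PROFANITY.intersection(text.split()):
          log.append("discarded: " + subject)
          return None
      for keyword, mailbox in FOLDERS.items():
          if keyword in text:
              log.append("filed '" + subject + "' in " + mailbox)
              return mailbox
      log.append("filed '" + subject + "' in inbox")
      return "inbox"

  print(NOTICE)                 # notice comes before any logging
  actions = []
  route("Meeting agenda", "See you at ten.", actions)
  route("Hot deal", "A damn good offer!", actions)
  print("\n".join(actions))     # the reviewable log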


5) Customer Choice and Control

* The customer should be able to configure what is being filtered - for
example, by a user-friendly means of adjusting defaults for
filtration/ratings categories, by selectively adding or deleting
specific sites or keywords, by turning topics to filter for on or off,
or by swapping entire sets of filtration criteria.

* Customers should not be placed in the position of purchasing someone
else's morality or preferences for lack of ability to customize or make
meaningful choices. Instead, they need tools that help them filter
out material they do not find appropriate.

Notes: Systems based on the Platform for Internet Content Selection
(PICS) are already compliant with this principle, as PICS allows for
multiple ratings systems from which the user may select, provided that
more than one label bureau is available.
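
For readers unfamiliar with PICS: a label assigns numeric values in the
categories of a chosen rating service, and the user's software compares
those values against locally configured ceilings. The sketch below uses
the RSACi category letters (l, n, s, v) as an example and omits real
label parsing; it is an illustration, not a PICS implementation.

  # Simplified sketch of PICS-style selection. A real label is
  # parenthesized text such as:
  #   (PICS-1.1 "http://www.rsac.org/ratingsv01.html"
  #    labels for "http://www.example.com/" ratings (l 1 n 0 s 0 v 2))
  # Parsing that syntax is omitted here for brevity.

  # User-chosen ceilings for the RSACi categories: l = language,
  # n = nudity, s = sex, v = violence, each rated 0 (none) to 4.
  user_ceilings = {"l": 2, "n": 0, "s": 0, "v": 1}

  def permitted(label_ratings, ceilings):
      """Allow a document only if every rated category falls at or
      below the user's ceiling for that category."""
      return all(value <= ceilings.get(category, 0)
                 for category, value in label_ratings.items())

  print(permitted({"l": 1, "v": 1}, user_ceilings))  # True
  print(permitted({"v": 3}, user_ceilings))          # False

Swapping in a different rating service, or a stricter set of ceilings,
is then a matter of changing data, not software - which is what makes
meaningful customer choice possible.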


6) Appeal Process, Public Access, and Integrity of Personal Information

* Creators, moderators and/or owners of sites or other resources rated,
filtered or otherwise negatively impacted should have a means of appeal
within the organization doing the filtration, labelling or rating: to
review the appropriateness of the decision to block/filter that site,
to review the accuracy or breadth of a human-assigned label or
rating, and/or to review the actions of an automated filter or other
function that blocks that site or document. The filtering/rating party
should treat such concerns seriously and help to resolve conflicts when
possible.

* Additionally, providers have a responsibility to verify information.
Others must have a right to correct any wrong information about them,
and to have suggested corrections of general fact considered seriously.
One of the most serious problems inherent in the computerization of
records and other information is the wild propagation of errors once
they are introduced. Providers should have a well-thought-out, published
policy for dealing with such errors rapidly and fairly, with the benefit
of the doubt going to the person about whom the information may be
mistaken.

* The full results of such reviews of claims of errors or of mislabelling
or improper filtration should be made available to the filtered-out
party and to the public after the complaint is handled, and not covered
by non-disclosure or other restriction from consumer examination.
When possible, parties should seek arbitration rather than recourse to
legal machinery.


7) Intellectual Property and Integrity of Content

* Filtering, labelling and rating should not modify source material.
Filtered material should simply be blocked, or otherwise dealt with
intact, per customer preference, with any ratings or labels appearing in
frames, menu bars, headers, pop-up windows, or distinct and clearly
attributed lead-ins to the presented content.

Notes: "Four-letter words" should not be replaced by "****", and
proxy-like watchdog servers should not insert rating icons into the
HTML code or other content of rated materials. Such practices abuse the
material owner's copyright (in particular, the right to control the
production of derivative works), and open the filtration provider to
liability. Such alterations may also lead to incorrect reportage or
citation, false attributions of quoted material, misinterpretation, and
other problems.
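
Example: one way to honor this principle is to deliver the label
out-of-band. The PICS specification allows labels to be carried in a
PICS-Label HTTP response header, so the document itself is never
rewritten. Below is a minimal Python sketch; the label text, URLs and
port are invented for illustration.

  # Sketch: a label-aware server attaches the rating out-of-band,
  # in a response header, leaving the document bytes untouched.
  from http.server import BaseHTTPRequestHandler, HTTPServer

  PAGE = b"<html><body>The document, byte for byte.</body></html>"
  LABEL = ('(PICS-1.1 "http://www.rsac.org/ratingsv01.html" '
           'labels for "http://www.example.com/" '
           'ratings (l 0 n 0 s 0 v 0))')

  class LabelledHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          self.send_response(200)
          self.send_header("Content-Type", "text/html")
          self.send_header("PICS-Label", LABEL)  # label rides alongside
          self.end_headers()
          self.wfile.write(PAGE)                 # body never modified

  if __name__ == "__main__":
      HTTPServer(("localhost", 8080), LabelledHandler).serve_forever()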


8) Open Expression Without Self-Censorship

* Content control systems must not place a heavy burden on content
authors. In particular, a self-rating/labelling system must be
sufficiently simple to implement and use that it does not interfere
with content production, or result in self-censorship to avoid the toil
of labelling content (a sketch of lightweight self-labelling appears at
the end of this section). Under no circumstances should any such system
be imposed by governments, or by private-sector parties such as
Internet service providers, under government pressure.

* Self-labelling schemes logically apply only to comparatively static
documents such as web pages, not to content of a conversational nature,
such as live "chat" or postings to newsgroups and other forums of a
fluid nature. In such cases, the forum as a whole, not each post or
momentary expression in it, could be rated, labelled or filtered.

* Filtration and labelling schemes must be designed carefully, with an
eye to avoiding monopolization that can lead to chilling of free
expression or barriers to access for all but the influential or those
willing to comply with a particular labelling scheme.
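
Example: self-labelling a static page can be as light as one line of
markup. PICS labels may be embedded in an HTML META element; in this
Python sketch the label values and URLs are placeholders.

  # Sketch: embedding a self-assigned PICS label in a static HTML
  # page via a META element with http-equiv="PICS-Label".
  LABEL = ('(PICS-1.1 "http://www.rsac.org/ratingsv01.html" '
           'labels for "http://www.example.com/page.html" '
           'ratings (l 0 n 0 s 0 v 0))')

  def self_labelled(title, body):
      """Return a minimal HTML page that carries its own PICS label."""
      return ("<html><head><title>" + title + "</title>\n"
              "<meta http-equiv=\"PICS-Label\" content='" + LABEL +
              "'>\n</head><body>" + body + "</body></html>")

  print(self_labelled("Example", "One line of markup labels the page."))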


9) Positive as Well as Negative Filtration

* When feasible, content control services should make efforts not only
to block material offensive to their customers, but also to provide
active pointers to material these users will appreciate.

Notes: Though concerns about inappropriate material have sped up the
development of filtration and labelling technology, the initial seed, and
logical culmination, of such efforts is the search for a solution to a
much longer-standing problem: the difficulty of finding relevant
information in a staggeringly complex and vast flux of data. Working on
this larger problem simultaneously moves the Internet community away from
hype and fearmongering, helps the evolution of the Internet into a
user-friendly knowledge tool for everyone, and does something active and
constructive for everyone, as well as something passive for those for
whom the availability of inappropriate content remains a focus.


10) Contextual, Factual, Cultural Sensitivity

* Content control systems must, whenever possible, consider among their
rating/labelling/blocking criteria the context in which the material is
found, and whether it is presented as fact or fiction, textual or
graphical, advocacy or reportage, etc.

* Content control systems must take into account whenever possible the
literary, artistic, journalistic, educational or other value of the
material to be labelled, rated or blocked.

* Local standards should be taken into account, as mores and preferences
vary from culture to culture. A system implementing the values of a
particular subset of one culture may be rationally inapplicable on a
global scale, or even on a local scale elsewhere.

Notes: "Hell" in the context of a religious discussion is not very similar
to the more offensive use of such a term as an expletive. Similarly, if
the word "gay" or images of violent conflict appear in a news report, this
should probably not be filtered out by a system that blocks access to
"alternative lifestyle" or "violent" material, unless the customer
specifically requests that such material also be blocked, or the
filtering/rating system is intended to be, and is disclosed as, very
restrictive. Most users, including parents, draw a sharp distinction
between material that advocates or visually displays behavior they find
distasteful, and journalism or political discussion about topics in
general. There is a severe danger of misuse of parental empowerment
technology for entirely opposite ends, facilitated by censorship of
political, journalistic and other material under the rhetoric of "safety".
Already, several public libraries are having filtration software imposed
upon them by local governments with political agendas to restrict access
to information. The constitutionality of these actions is highly
questionable.
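
Example: the toy Python sketch below contrasts a naive keyword match
with one that consults a (hypothetical) genre tag before blocking. Real
systems would need far richer context signals, but the contrast
illustrates the principle.

  # Toy sketch of context sensitivity. The naive filter blocks any
  # page containing a listed word, ignoring context entirely; the
  # contextual variant declines to block reportage. The genre tag is
  # a hypothetical stand-in for whatever context signal is available.
  BLOCKED_WORDS = {"violence"}   # illustrative word list

  def naive_block(text):
      """Block any page containing a listed word, however used."""
      return any(word in text.lower() for word in BLOCKED_WORDS)

  def contextual_block(text, genre):
      """Decline to block reportage or reference works unless the
      customer has chosen a deliberately restrictive configuration."""
      if genre in ("news", "reference"):
          return False
      return naive_block(text)

  report = "Reportage: violence erupted at the border today."
  print(naive_block(report))               # True  - overbroad
  print(contextual_block(report, "news"))  # False - context respected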


11) Individual and Academic Self-Determination

* Government and semi-governmental entities must refrain from imposing a
requirement for self-ratings, assigning private-sector sites particular
labels, or mandating the use of filtration software. Any attempt to do
so is sheer censorship, consisting of forced silence, coerced speech,
denial of access, or restraint of publication.

* In particular, censorship of online access in libraries and other public
places must be avoided, and filtration must not be the default for
public Internet terminals any more than hiding of "mature" books may be
a default in public libraries. Public libraries must not reduce
adult patrons to reading online only what has passed filtration as
appropriate for children.

* Students' freedom of speech and press, and the rights of libraries and
library patrons, as formulated in statute, case law, constitutions and
UN treaty, apply fully in the context of online media, not simply paper
and vocal speech.

* The decision to use filtration of online material in the classroom or
children's reading room - and what to filter - must rest with the
teacher or librarian, with no more control by administrators than that
exercised over what paper handouts teachers may use in class or what
books may be checked out by children.

* That libraries can in some cases legally exercise content-based
discretion in what materials they make available does not in and of
itself constitute a reason to do so with online material, nor does it
imply a legislative or executive governmental prerogative to make
or influence those decisions. Likewise, that libraries may protect
valuable or fragile paper works by allowing their use only on special
request does not indicate that the reverse - "protecting" library
patrons from materials that may be offensive to some - is appropriate.
Libraries must not block online material, then require adults to ask for
a key, password or special permission to access it.

* The decisions of a teacher or librarian in this area, as in others,
should be based on their own criteria, with input from the community
where appropriate, and not controlled by the political priorities of
administrators, or of executive or legislative government. The role of
teachers and librarians is to provide access to information, knowledge
and critical thinking, not to act as online content police.

* Similarly, network service providers must not require users to rate or
label their own material or submit to the editorial control of others.
Government must not coerce or pressure service providers into providing
content control technology, or require users to participate in
content control systems.

* Removing users' materials from distribution or otherwise taking action
against users based on disagreement with how they self-rate is
indefensible, and logically incompatible with the notion of a
self-rating system.


12) Prevention of Centralization and National Filters

* Content filtration defaults must not be built into publicly available
hardware or operating systems, since market dominance by a particular
manufacturer, or adoption by governments, could virtually destroy the
free flow of information on the global Internet.

* Filtration service providers must take care not to put into place or
enable the creation of centralized storehouses of personally
identifiable user preferences or other transactional and private
information.

* No filtration, ratings, or other content control system should be
designed specifically for government usage to censor a populace. It is
insufficient justification that a government may have laws against
material that is legal in other parts of the world and accessible
online. Companies providing such technology to the public must not
design it to be intentionally easy to abuse in censoring the public,
and should consciously design their products or services to be difficult
to scale to such misuses.

Notes: As of recent revisions, PICS does NOT appear to be compliant
with this principle.


13) Consensus and Standards

* Designers and providers of content control technology are encouraged
to participate in the formulation of open-platform, public standards.

* In the case of proprietary solutions, care must be taken not to
undermine public standards, even in the name of extending them.

Notes: Open, participatory standardization efforts will increase
justified public trust in the technology and the online environment by
helping prevent monopolization, the institution of censorship-prone flawed
systems, intellectual property disputes that hold up market progress, and
many other problems.


14) Balance of Rights

* Providers must be mindful of the rights of the customer and user,
particularly privacy rights, but also of the content owner's copyright
and freedom of speech and press.

* Intermediaries must take into account the user's right to read and to
communicate, and to not have personal information revealed or used
without permission.

* The user needs to be aware of intermediaries' institutional or employer
rights, as well as the rights of other users, of content owners (e.g.,
copyright), and of the provider (e.g., to collect aggregate,
NON-identifiable statistical info without consent).

* Content owners must respect the fair use rights of users, and the
rights of users and customers to refuse to receive content they do not
want, as well as the rights of labellers and raters to honestly review,
comment on, describe, or block for their subscribers the material they
encounter online.

[end]

------------------------------


Subject: Quote of the Day
-------------------------

"Falsehoods not only disagree with truths, but usually quarrel among
themselves."
- Daniel Webster (1782-1852)

Find yourself wondering if your privacy and freedom of speech are safe
when bills to censor the Internet are swimming about in a sea of
surveillance legislation and anti-terrorism hysteria? Worried that in
the rush to make us secure from ourselves, our government
representatives may deprive us of our essential civil liberties?
Concerned that legislative efforts nominally to "protect children" will
actually censor all communications down to only content suitable for
the playground? Alarmed by commercial and religious organizations abusing
the judicial and legislative processes to stifle satire, dissent and
criticism?

Join EFF!
http://www.eff.org/join (or send any message to info@eff.org).

Even if you don't live in the U.S., the anti-Internet hysteria will soon
be visiting a legislative body near you. If it hasn't already.

------------------------------


Subject: What YOU Can Do
------------------------

* Keep an eye on your local legislature/parliament
All kinds of wacky censorious legislation are turning up at the US state
and non-US national levels. Don't let it sneak by you - or by the
online activism community. Without locals on the lookout, it's very
difficult for the Net civil liberties community to keep track of what's
happening locally as well as globally.


* Inform your corporate government affairs person or staff counsel,
if you have one. Keep them up to speed on developments you learn of,
and let your company's management know if you spot an issue that warrants
your company's involvement.


* Find out who your congresspersons are

Writing letters to, faxing, and phoning your representatives in Congress
is one very important strategy of activism, and an essential way of
making sure YOUR voice is heard on vital issues.

If you are having difficulty determining who your US legislators are,
try contacting your local League of Women Voters, who maintain a great
deal of legislator information, or consult the free ZIPPER service
that matches Zip Codes to Congressional districts with about 85%
accuracy at:
http://www.stardot.com/~lukeseem/zip.html

Computer Currents Interactive has provided Congress contact info, sorted
by who voted for and against the Communications Decency Act:
http://www.currents.net/congress.html (NB: Some of these folks have,
fortunately, been voted out of office.)


* Join EFF!

You *know* privacy, freedom of speech and ability to make your voice heard
in government are important. You have probably participated in our online
campaigns and forums. Have you become a member of EFF yet? The best way to
protect your online rights is to be fully informed and to make your
opinions heard. EFF members are informed and are making a difference. Join
EFF today!

For EFF membership info, send queries to membership@eff.org, or send any
message to info@eff.org for basic EFF info, and a membership form.

------------------------------


Administrivia
=============

EFFector is published by:

The Electronic Frontier Foundation
1550 Bryant St., Suite 725
San Francisco CA 94103 USA
+1 415 436 9333 (voice)
+1 415 436 9993 (fax)
Membership & donations: membership@eff.org
Legal services: ssteele@eff.org
General EFF, legal, policy or online resources queries: ask@eff.org

Editor: Stanton McCandlish, Program Director/Webmaster (mech@eff.org)

This newsletter is printed on 100% recycled electrons.

Reproduction of this publication in electronic media is encouraged. Signed
articles do not necessarily represent the views of EFF. To reproduce
signed articles individually, please contact the authors for their express
permission. Press releases and EFF announcements may be reproduced
individually at will.

To subscribe to EFFector via email, send a message with a body of
"subscribe effector-online" (without the quotes) to listserv@eff.org,
which will add you to a subscription list for EFFector.

Back issues are available at:
ftp.eff.org, /pub/EFF/Newsletters/EFFector/
gopher.eff.org, 1/EFF/Newsletters/EFFector
http://www.eff.org/pub/EFF/Newsletters/EFFector/

To get the latest issue, send any message to effector-reflector@eff.org (or
er@eff.org), and it will be mailed to you automagically. You can also get
the file "current" from the EFFector directory at the above sites at any
time for a copy of the current issue. HTML editions are available at:
http://www.eff.org/pub/EFF/Newsletters/EFFector/HTML/
at EFFweb.

------------------------------


End of EFFector Online v10 #03 Digest
*************************************