January 24, 2001
Magalie Roman Salas
Office of the Secretary
Federal Communications Commission
445 12th Street, S.W.
Washington, D.C. 20554
Dear Madam Secretary --
I am responding to the request for public comments made by the FCC with
regard to docket 96-45, a proposal related to the implementation of the
"Children's Internet Protection Act (CHIP [sic])," included as part of
the Consolidated Appropriations Act, 2001, which, among other things,
requires that public libraries and schools wishing to receive
discounted "e-rate" Internet access incorporate so-called
"filtering" software in order to restrict the type of content that may
be retrieved using library resources.
Below, I provide my comments on why the use of filtering software
is highly anti-democratic, unethical, and unconscionable in our
supposedly "enlightened" society. However, since the appropriateness
of content filtering itself is not at issue here, I will also provide
my recommendations for the implementation of this misguided policy.
Problems and limitations of content filtering:
- First, while content filtering may be appropriate in
a very limited subset of situations, federally mandated
content filtering for Internet access within libraries and
schools constitutes an indefensible federal over-reach into
local authority and local definitions of what constitutes, for
example, "obscenity," a definition which varies
substantially from one community to another. This attempt
by federal legislators to assume control over content
selection, a task which local librarians and school boards have
accomplished for years on their own with a high
degree of success (as measured by public satisfaction with
the content (books, newspapers, etc.) available in public
libraries), is highly inappropriate and misguided.
- Second, the law is exceedingly vague. For example, it requires that
libraries and schools wishing to obtain discounted Internet access
block material which is "harmful to minors," without clearly
defining what this means, other than to say, for example, that material
which "taken as a whole lacks serious literary, artistic, or
scientific value to minors" should be blocked. What is
completely lost here is the significance of context
in determining what may or may not be harmful to minors, and this
is one of the primary areas in which computer-based content
filtering is inadequate, as a piece of software cannot
adequately distinguish between appropriate and inappropriate
contexts.
- Third, filtering software typically works in one of the following
three ways, each of which has significant flaws:
- the software may block users from accessing any information whose
address (URL, domain name, IP address or block, or filename, for
example) does not specifically appear on a "white list" of
approved material;
- the software may allow access to all material whose address
(URL, domain name, IP address or block, or filename,
for example) is not specifically banned by a "black list"; or,
finally,
- the software may deny or allow access based upon the
presence, number, and/or proximity of certain "key" words
found by searching the text of the material.
The problems with the first method should be rather
self-evident: large quantities of "legitimate" material will
become unavailable to library patrons or school students. This
may happen for several reasons: the content may simply
not have been indexed by the author of the filtering software
and, hence, may never have had the opportunity to
receive an "approved" rating and be added to the "white
list"; the content at a given URL, IP address or block, or
domain name may have changed or been removed since the list was
created, thus invalidating any such list; finally, new content
may have been added or used to replace existing content, which
might change the status of a given URL, domain name, etc. from
"approved" to "unapproved" or vice versa. The amount of
content on the Internet is constantly increasing and changing
at a rapid rate, and this method is clearly unable to cope with
this rate of change, even with regular updates.
The second method also presents similar problems: much
so-called (by the software developer) "inappropriate" content
may still be available to users simply because it did not exist
at the time when the "blocked-material" list was created or
because the content at certain locations has changed from
"appropriate" to "inappropriate" since the list was
created. Similarly, "appropriate" content may be blocked
because it was erroneously added to the "blocked-material" list
or because it replaced content at a particular location which
was previously considered to be "inappropriate" by the
software developers.
The third method is by far the worst of the three and will
result in many "false-positive" and "false-negative" decisions.
By filtering on keywords without exercising judgment about
context (something only a human librarian or teacher can do reliably),
the software cannot distinguish, with any useful accuracy,
between what should and should not be filtered.
For example, software which filters out the string "XXX"
would likely prevent legitimate access to material which discusses
the sporting event "Super Bowl XXX." Similarly, filters based
on the word "breast" would fail to distinguish between material
discussing "breast cancer" and "breast pictures".
All of these methods fail to reliably take into account the
context in which the material to be allowed or denied is
sought. Further, the lists used by each method (keywords,
allowed content, and denied content) are often not made
available to the public by the software designers. This secrecy
gives software developers a significant opportunity to
incorporate a "hidden agenda" of certain political or moral
values without public knowledge, and it also makes it difficult or
impossible for librarians to determine whether a given piece of
software is capable of performing the appropriate function. It
also extends my argument that none of the filtering software
packages currently available, or likely to become
available in the foreseeable future, is capable of guaranteeing
that all "inappropriate" material will be blocked or
that no "appropriate" material will be accidentally
blocked, since none of this software can or does take context
into account. (A brief illustrative sketch following this list
of problems shows, in simplified form, how context-blind all
three approaches are.)
- Next, it must be noted that much filtering software which is
currently in use can be easily circumvented by users. This is
a widely-known fact which the developers of filtering software
are unlikely to publicize. Thus, whatever effectiveness such
filtering software might have had is significantly diminished
if its purpose can be easily defeated.
- Also, as I have mentioned above, it is important to realize that
local and not national standards have historically
been used to distinguish between, for example, what is "obscene"
and what is not. This applies to many aspects of government; for
example, some communities may permit the existence of strip clubs,
while others may ban them on the grounds of obscenity. Similarly,
some libraries may carry certain magazines or books that other
libraries have declined to carry because the local community would
consider them "obscene." Since it is unlikely that there will ever
be as many variations in filtering software implementations as there
are in community standards for obscenity, there will always be
problems with configuring filtering software to match a given
community's definition of obscenity. These problems are made
particularly acute
because, as noted above, most developers of filtering software do not
make public their lists of what their software actually filters
and why (which further allows these developers to impose a
hidden agenda of what their own definitions of "inappropriate"
or "obscene" might be with near-complete impunity as a result
of obscurity).
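
To make the context problem concrete, the following is a minimal
sketch, written in Python purely for illustration, of the three
filtering methods described above. All of the addresses, keywords, and
list entries below are invented for this example; none are taken from
any actual filtering product.

    # A minimal sketch (in Python, purely for illustration) of the three
    # filtering methods described above.  All addresses and keywords here
    # are invented; none are taken from any actual filtering product.

    APPROVED_SITES = {"library.example.org", "school.example.edu"}   # "white list"
    BANNED_SITES = {"blocked-site.example.com"}                      # "black list"
    BANNED_WORDS = {"xxx", "breast"}                                 # keyword list

    def whitelist_filter(host):
        # Method 1: allow only addresses that appear on the approved list.
        return host in APPROVED_SITES

    def blacklist_filter(host):
        # Method 2: allow everything except addresses on the banned list.
        return host not in BANNED_SITES

    def keyword_filter(text):
        # Method 3: block any page whose text contains a "key" word,
        # regardless of the context in which the word appears.
        lowered = text.lower()
        return not any(word in lowered for word in BANNED_WORDS)

    # The keyword method cannot tell these two documents apart:
    print(keyword_filter("A recap of Super Bowl XXX"))             # False: blocked
    print(keyword_filter("Early detection of breast cancer"))      # False: blocked

    # The white-list method blocks any legitimate site that was never
    # reviewed; the black-list method misses any site created after the
    # list was compiled:
    print(whitelist_filter("new-health-site.example.org"))         # False: blocked
    print(blacklist_filter("brand-new-site.example.net"))          # True: allowed

Even this toy version reproduces the "Super Bowl XXX" and "breast
cancer" failures described above, because nothing in any of the three
methods examines the context in which a word or address appears.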
In short, the filtering requirement is inappropriate, as it subverts
the local control of libraries and education, oversteps the bounds
of federal power, and, most importantly, requires the use of a
technology which is not appropriate in all areas and which is known
to be seriously flawed.
Further, the "monitoring" requirement for minors' use of the Internet
in schools is also a cause for concern. I will not address this in detail
here for reasons of brevity, but suffice it to say that such monitoring
is rarely necessary in school environments and that it may well end up
draining significant technological and personnel resources away from
areas in much more dire need of them.
Comments on implementation:
Here, I come to the real purpose of this letter. Given that the FCC
is charged with the enforcement of this law and that filtering software
must be implemented in public schools and libraries
despite its many flaws, the implementors (the schools and libraries
themselves) should be required to meet the following
conditions prior to being granted reduced-rate access:
- The following information about the filtering software in use
should be made part of every school's and library's FCC filings
and should also be prominently displayed in the library itself:
product name, version number, platform, and, where applicable,
the manufacturer's name and contact information, as well as the
date and version number of the filtering lists. If multiple
versions or pieces of software are used, they must be listed
separately. (A purely illustrative sketch of what such a
disclosure record might contain follows this list of requirements.)
- The entire contents of the filtering lists themselves and other
mechanisms used by the filtering software should be
made part of the FCC filing and should be made available to the
public as with the above information. Disclosure of this information
should be mandatory and users of software for which this information
is not available to the public should not be allowed to qualify for
the reduced-rate program. Further, the reasons for blocking each
item on the blocking list must be made public.
- Filtering lists must be modifiable by each school or
library in order to include or exclude material as deemed
appropriate by the librarians, teachers, or other appropriate
authorities. Any modifications to these lists should be made
public and made part of the FCC filing, as with the above information.
- The users of filtering software must be able to prove that
the filtering software and lists will be updated on a regular
basis and that any updates and changes will be reported to the
FCC and made public locally as well.
- Finally, the libraries and schools must provide assurance that
qualified staff members will be available to disable the filtering
software for non-minors for the "legitimate research purposes"
for which such disablement is permitted by the provisions of the
CHIP law.
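
As a purely illustrative sketch of what the disclosure record described
in the first three requirements might contain, the following example
(again in Python) shows one possible shape for such a filing. Every
product name, version, address, and reason shown here is invented; no
real product, library, or filing is being described.

    # A purely illustrative sketch of the disclosure record proposed above.
    # Every name, version, address, and reason is invented; no real
    # product, library, or filing is being described.

    disclosure = {
        "institution": "Example Public Library",
        "software": [
            {
                "product": "ExampleFilter",          # hypothetical product name
                "version": "2.1",
                "platform": "Windows NT 4.0",
                "manufacturer": "Example Software Co.",
                "contact": "info@example.com",
                "list_date": "2001-01-15",
                "list_version": "117",
            },
        ],
        # Second requirement: the full lists, with a stated reason for
        # each blocked item.
        "blocked_items": [
            {"address": "blocked-site.example.com",
             "reason": "stated reason for blocking goes here"},
        ],
        # Third requirement: any local modifications, also made public.
        "local_modifications": [
            {"action": "unblock",
             "address": "health-info.example.org",
             "reason": "judged appropriate by library staff"},
        ],
    }

Any machine-readable format would serve equally well; the point is
simply that the product information, the full lists with reasons, and
all local modifications be public and open to inspection.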
In short, I am strongly against a federal mandate for the use of filtering
software in schools and libraries. However, I do believe that there
are good and poor ways to implement its use, as I have presented
above. Please consider the issues raised here thoroughly when
determining the best possible method for implementation of this law.
If there are any questions or comments, I may be reached at the above
address as well as by telephone at 757-565-0894. Thank you for your
consideration of this important issue.
Sincerely,
Scott Norwood