Independence, Vol 31 No 1, May 2006
The practice called “keyword stuffing” involves invisibly adding pages and pages of the same word to a website in order to bias search-engine rankings. While search engines may be programmed to ignore such repeated keywords, hackers now embed the words invisibly in syntactically correct but meaningless sentences.3
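The discounting of crudely stuffed pages alluded to above can be sketched as a simple frequency test. The function name and the 30 percent threshold below are illustrative assumptions, not any search engine's actual rule:

```python
# Hypothetical sketch: flag a page when one word dominates its text,
# the crude signal that keyword stuffing originally relied upon.
import re
from collections import Counter

def keyword_repetition(text: str, threshold: float = 0.3) -> bool:
    """Return True if any single word exceeds `threshold` of all words."""
    words = re.findall(r"[a-z]+", text.lower())
    if not words:
        return False
    _, count = Counter(words).most_common(1)[0]
    return count / len(words) > threshold

print(keyword_repetition("casino " * 50 + "welcome to our site"))  # True
print(keyword_repetition("welcome to our school website"))         # False
```

A test like this is easily defeated, which is exactly the point above: stuffed words hidden inside syntactically correct but meaningless sentences keep word frequencies close to those of ordinary prose.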
Internet users are well aware of the scourge of “pop-up scripts”, whereby websites or commercial advertisements appear on their screens without warning. These devices are also the electronic hook by which a criminal predator can remain in touch with a computer: the more you try to delete them, the more they are designed to swarm around your computer, festering unseen behind the scenes.
The risk of the Internet is not just what content young net surfers may find: the content is, in fact, the bait. Young users become prey for those who know how to spy upon them, because the Internet is inherently an electronic conduit into the home.
Many young Internet users are savvy enough to bypass current filtering and blocking software, and the practice of tunnelling into supposedly secure domains is now routine.
Such acts of technology manipulation are predominantly legal, yet they render our children highly vulnerable to hidden, subtle but immensely powerful devices. We would achieve much for the online safety of our children if we were able to stop those who know how to reach them through their computers.
The risks of liability
The question arises as to whether the technology manipulations that require legislation to protect our young are also likely to increase the risk of legal liability for carers and teachers.
Duty of care in a school is the expectation
that the carers and supervisors of other
people’s children will act in the manner
of a reasonable and responsible parent
in matters to do with welfare and safety.
The age of the students, their abilities
and the context in which they are
operating must be taken into account.
Reasonable duty of care on an excursion
to the beach, for example, is different for
12-year-olds and 18-year-olds or for the
12-year-old who is a champion swimmer
and the 18-year-old who can’t swim.
When we allow students to enter the
world of cyberspace, their age, their
mental and emotional maturity, their
resilience and any information we have
about their intellectual and emotional
intelligence will impinge upon our duty-
of-care obligations and liabilities.
Claims of negligence will become a
possibility for schools and teachers within
the following three areas:
• Insufficient safeguards against hacking, especially in schools with the “gatekeeping” capacity of a campus-specific intranet, where the school can entirely control and monitor content, user access, and data transfer to and from the Internet.
• Harassment and cyber bullying, which appears to be on the increase, augmented by the curse of the networked mobile phone. I see no reason why a failure to prevent such bullying online is any less actionable than a failure to prevent it in the playground.
• Emotional and psychological health. Trauma from online sexual solicitation, and the links between depressive symptomatology and addictive Internet surfing (particularly in young males), must have duty-of-care implications for schools that lack proper supervision protocols to ensure that campus technology is being utilised appropriately.
The legislative position
Australian law, through its divided State
and Commonwealth provisions, largely
reflects my distinction between Internet
technology and content. Commonwealth
law applies to Internet host and service
providers, with take-down notices
regulated by the Australian Broadcasting
Authority. State law applies to content
and users, although some states can
prosecute providers. While Australian Internet law is seen by some as among the most restrictive of countries with similar political systems and cultures, there is also a belief that it is not making the Internet safer for children. My view is that this is because predators are free to manipulate Internet technology with criminal intent: the manipulations themselves are not illegal.
In the United States, in a case involving the
American Civil Liberties Union and other
free-speech advocates, the Supreme Court
seems to be suggesting that there may not
be a need for laws compelling website
operators to adjust their practices or
modify their offensive content if adequate
technological methods, such as blocking
and filtering, exist to protect minors.
I believe that is a dangerous position for
the law to take. Apart from once again
putting more faith in the absolutism of
technology than in human reflection and
moral discrimination, there are three
issues to consider here.
Firstly, to be secure, blocking and filtering
programs will increasingly need to utilise
artificial intelligence so that the program
can recognise the user’s motive for
wishing to access a site. In the continued
absence of this, and with the ever-present
threat of hackers, the future of gatekeeper-
type protection software seems limited.
Secondly, Internet-security experts
are already losing confidence in such
software. A report by the Kaiser Family
Foundation in the United States claims
that filtering systems failed 90 percent of
the time, frequently blocked legitimate
information, and that the most restrictive
filters were the very ones most likely to be
overcome by pornographic content.
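The over-blocking the Kaiser report describes follows directly from how naive word-list filters work. This minimal sketch, with a deliberately tiny and purely hypothetical blocklist, shows legitimate health information being caught alongside offensive content:

```python
# Minimal sketch of a word-list filter; the blocklist is a hypothetical example.
BLOCKED_WORDS = {"sex", "breast", "porn"}

def naive_filter(text: str) -> bool:
    """Return True if the page would be blocked."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return bool(words & BLOCKED_WORDS)

print(naive_filter("Information about breast cancer screening"))  # True: legitimate page over-blocked
print(naive_filter("Year 7 mathematics homework help"))           # False
```

Tightening such a list blocks more legitimate pages, while offensive sites need only misspell or obfuscate the listed words to slip through, which is consistent with the finding that the most restrictive filters were also the ones most easily defeated.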
Finally, relying upon blocking content
as a legal constraint effectively requires
a legal judgement as to the potentially
offensive nature of that content in order
for the block itself to be legal. In this
country there is no agreed definition as to
what constitutes objectionable content.