COZEN
O’CONNOR
ATTORNEYS
2005 International Insurance Seminar
BLOWING UP YOUR COMPANY
AND CASE BY ELECTRONIC RECORD AND DOCUMENT MALPRACTICE
WEDNESDAY, JUNE 22, 2005
MARRIOTT FINANCIAL CENTER
85 WEST STREET
NEW YORK, NEW YORK
© Copyright 2005 by Cozen O’Connor. All Rights Reserved.
COZEN
O’CONNOR
2005 International Insurance Seminar
TABLE OF CONTENTS
I. Speaker Profiles
II. The Fundamentals of Electronic Discovery - PowerPoint Presentation
written & presented by Thomas M. Jones, Esq.
III. Frequency of Identity Theft - PowerPoint Presentation
written by Robert W. Hammesfahr, Esq. and Keith E. Horton, Esq.
presented by Robert W. Hammesfahr, Esq.
IV. Choice Point Class Actions and More - PowerPoint Presentation
written by Robert W. Hammesfahr, Esq. and Keith E. Horton, Esq.
presented by Vincent P. Pozzuto, Esq.
V. HIPAA Enforcement and Liability - PowerPoint Presentation
written & presented by Katherine M. Layman, Esq.
VI. Errors and Omissions Insurance - PowerPoint Presentation
written & presented by Manny Cho of Carpenter Moore
VII. Coverage for Risk: Survey of Key Contract Language - PowerPoint Presentation
written by Robert W. Hammesfahr, Esq. and Keith E. Horton, Esq.
presented by Robert W. Hammesfahr, Esq.
VIII. Claims for Breach of Contract Versus Professional Errors and Omissions - PowerPoint Presentation
written & presented by Margaret A. Reetz, Esq.
IX. Hacking and Downstream Liability
written by Brian J. Walsh, Esq. and Erika L. Winkler
a.) exhibit - Computer Viruses and Civil Liability: A Conceptual Framework
b.) exhibit - Downstream Liability for Attack Relay and Amplification
c.) exhibit - Can Hacking Victims Be Held Legally Liable?
COZEN
O’CONNOR
ATTORNEYS
SPEAKER PROFILES
Atlanta
Charlotte
Cherry Hill
Chicago
Dallas
Denver
Houston
Las Vegas*
London
Los Angeles
New York Downtown
New York Midtown
Newark
Philadelphia
San Diego
San Francisco
Seattle
Trenton
Washington, DC
West Conshohocken
Wichita
Wilmington
*Affiliated with the Law Offices of J. Goldberg & D. Grossman.
These materials are intended to generally educate the participants on current legal issues. They are not intended to provide
legal advice. Accordingly, these materials should not be relied upon without seeking specific legal advice on matters discussed
herein.
Copyright © 2005 Cozen O’Connor. ALL RIGHTS RESERVED.
COZEN
O’CONNOR
ATTORNEYS
William P. Shelley
Member
Vice Chair, National Insurance Litigation Department
Philadelphia Office
(215) 665-4142
[email protected]
AREAS OF EXPERIENCE
- Alternative Dispute
Resolution
- Appellate Practice
- Commercial General Liability
- Excess & Surplus Lines
- Toxic & Other Mass Torts
EDUCATION
- J.D. Rutgers University
School of Law, 1979
B.A. Rutgers University, 1976
BAR ADMISSIONS
- Pennsylvania
- New Jersey
- New York
COURT ADMISSIONS
- United States Supreme Court
- New Jersey Supreme Court
- New York Supreme Court
United States District Court
for the District of New Jersey,
the Eastern District of
Pennsylvania, the Southern,
Eastern, Northern and Western
Districts of New York
- United States Court of
Appeals for the 3rd, 11th and
District of Columbia Circuits
MEMBERSHIPS
- American Bar Association
- New Jersey State Bar
Association
- Burlington County Bar
Association
Defense Research Institute
William Patrick Shelley serves as vice chair of the firm’s National
Insurance Litigation Department. His practice primarily focuses on complex
insurance coverage issues including general and professional liability and
on the coordination of the defense of mass tort claims.
Currently, Bill serves as counsel for insurers in many major asbestos
coverage cases as well as associated bankruptcy proceedings pending
around the country. He also serves as national counsel for Chubb Group on
direct action asbestos suits filed around the country. Bill also recently acted
as coordinating counsel for a major insurer on lead paint coverage claims.
Bill has authored two major articles on insurance coverage for toxic tort
claims that are frequently cited by courts: "Toxic Torts and the
Absolute Pollution Exclusion Revisited," Tort Trial & Insurance Practice
Law Journal, Vol. 30, No. 1 (Fall 2003), and "Application of the Absolute
Pollution Exclusion to Toxic Tort Claims: Will Courts Choose Policy
Construction or Deconstruction," Tort & Insurance Law Journal, Vol. 33,
No. 3 (Spring 1998). He is also the author of "Fundamentals of Insurance
Coverage Allocation," Mealey's Litigation Report: Insurance, Vol. 14, No. 9,
January 5, 2000. Most recently, Bill co-authored "Unraveling The Gordian
Knot Of Asymptomatic Asbestos Claimants: Statutory, Precedential And
Policy Reasons Why Unimpaired Asbestos Claimants Cannot Recover In
Bankruptcy," 3-10 Mealey's Asb. Bankr. Rep. 22 (2004). Bill appeared in
the August 2003 edition of Metropolitan Corporate Counsel in the article
titled "Cozen O'Connor: Using All The Tools To Meet Clients' Needs."
Bill's seminar presentations include: Mealey's Wall Street Forum: Asbestos
Conference (February 2005); American Conference Institute: E-Commerce
Coverage Claims (June 2001); American Conference Institute - Asbestos
Litigation: Co-Chair (October 2001); Mealey's Insurance Coverage
Advanced Allocation Seminar (February 1998); and Mealey's Insurance
Coverage 101: Co-Chair (November 2001).
Bill earned his bachelor of arts degree, with highest honors, at Rutgers
College in 1976 and his law degree at Rutgers School of Law in 1979. He
was admitted to practice in New Jersey in 1979, in Pennsylvania in 1984,
and in New York in 1989. He was selected a 2005 "Pennsylvania Super Lawyer"
by his peers, appearing in Philadelphia Magazine and Pennsylvania Super
Lawyers.
Craig Rhinehart
Professional Biography
Director for Compliance Markets and Products
Craig Rhinehart directs FileNet's Compliance Markets & Products and is a
veteran in the enterprise content management (ECM) industry with over 20 years
of experience in records management, content management, imaging and media
asset solutions as vendor, integrator, consultant and end-user.
Craig joined FileNet in 2003 to develop the vision and strategy for a new suite of
products to address the records management and legal compliance challenges
facing companies today. The first two products, FileNet Records Manager and
Email Manager, are evolutionary products that reduce risk and enable proof of
compliance while simultaneously generating business value and a strong ROI for
its customers.
Prior to joining FileNet, he was involved in IBM’s acquisition of records
management software company Tarian Software where he was Vice President of
Worldwide Marketing.
Craig has led industry research efforts to define, develop and prove ROI models
for both content and records management and is a requested speaker on a
variety of electronic records management topics.
Considered an expert in electronic records management systems and the US
Department of Defense 5015.2-STD certification program he currently serves as
an advisor/board member on the ARMA Electronic Records Initiative.
Over the years, he has helped CNN, NFL, Exxon Mobil, Disney, ABC News,
Verizon, ESPN, MCI, The Weather Channel, US Army, Honeywell, US Air Force
and others realize the benefits of records management and other ECM solutions.
COZEN
O’CONNOR
ATTORNEYS
Thomas M. Jones
Member
Co-Chair, Insurance Coverage Department
Seattle Office
(206) 224-1242
[email protected]
AREAS OF EXPERIENCE
- Advertising Liability
Agent/Broker Liability
Appellate Practice
- Arson & Fraud
- Bad Faith Litigation
Business Torts
- Class Actions, Multi-District
Litigation and Other
Consolidated Claims
- Commercial General Liability
- Construction Liability
- Crisis Management
- Directors’ & Officers’
Liability
Employment, Labor &
Employee Benefits
- Environmental Law
Excess & Surplus Lines
- Fidelity & Surety
- Financial Risk Transfer
- Medical Device & Drug
Litigation
- Personal Lines
- Products Liability
- Property Insurance
- Punitive Damages
Reinsurance
- Security & Premises Liability
- Technology & E-Commerce
- Toxic & Other Mass Torts
EDUCATION
- J.D. Oklahoma City
University School of Law,
1976
- B.A. Central State University,
1974
MEMBERSHIPS
- Seattle-King County Bar Association
- Washington State Bar
Association
- Oklahoma Bar Association
- American Bar Association
Defense Research Institute
- Washington Defense Trial
Lawyers Association
Thomas M. Jones, who joined Cozen O'Connor in January 1986, is a Member of the
Firm and serves as the Co-Chair of the Insurance Coverage Practice Department. Mr.
Jones' practice spans many areas of law, including Advertising Liability, Agent/Broker
Liability, Appellate Practice, Arson & Fraud, Bad Faith Litigation, Business Torts,
Class Actions, Multi-District Litigation and Other Consolidated Claims, Commercial
General Liability, Construction Liability, Crisis Management, Directors' & Officers'
Liability, Labor & Employment, Environmental Law, Excess & Surplus Lines, Fidelity
& Surety, Medical Device & Drug Litigation, Personal Lines, Products Liability,
Property Insurance, Punitive Damages, Reinsurance, Security & Premises Liability,
Technology & E-Commerce, and Toxic & Other Mass Torts.
Mr. Jones is a member of the Defense Research Institute, the Washington Defense Trial
Lawyers Association, the Washington State, Seattle-King County, American and
Oklahoma Bar Associations. He has acted as lead trial insurer counsel in some of the
highest profile insurance coverage cases in the country. Mr. Jones was also selected by
his peers as a "Super Lawyer" in Washington from 2000 - 2005.
Mr. Jones has also authored several published articles including "Insurance Issues for
the Insurer," (supplement) Washington Real Property Deskbook, Ch. 135, Washington
State Bar Association, 3d Edition, 2001; "An Introduction to Insurance Allocation
Issues in Multiple Trigger Cases," The Villanova Environmental Law Journal, Vol. 10,
Issue 1, 1999; "Intellectual Property Coverage," Insurance Coverage: An Analysis of
the Critical Issues, Continuing Legal Education Committee of the Washington State Bar
Association, 1999; "Claims for Advertising Injury Coverage: A Primer," Journal of
Insurance Coverage, Vol. 1, No. 4, Autumn 1998; "Washington State’s Insurance
Regulation for Environmental Claims: An Overview of Key Provisions and Legal
Issues," Environmental Claims Journal, Vol. 9, No. 3, Spring 1997; and "Reinsurance
Issues Arising from the Settlement of Complex Claims," Insurance Litigation Reporter,
Vol. 17, #12, 590, 1995.
Mr. Jones received his Bachelor of Arts degree from Central State University in 1974
and earned his law degree at Oklahoma City University School of Law in 1976. Mr.
Jones was admitted to practice in Oklahoma in 1977 and in Washington in 1983, in all
U.S. District Courts in Washington and Oklahoma, and in the 9th and 10th Circuit
Courts of Appeals.
COZEN
O’CONNOR
ATTORNEYS
Robert W. Hammesfahr
Member
Chair, International Insurance Practice Group
Chicago Office
(312) 382-3101
[email protected]
AREAS OF EXPERIENCE
Advertising Liability/Personal
Injury
Bad Faith Litigation
Business Torts
Class Actions, Multi-District
Litigation & Other
Consolidated Claims
Commercial General Liability
Directors’ & Officers’
Liability
Excess & Surplus Lines
Professional Liability
Punitive Damages
Reinsurance
Technology & E-Commerce
Toxic & Other Mass Torts
EDUCATION
J.D. Northwestern University
School of Law, 1978
B.A. Colgate University, 1975
MEMBERSHIPS
Defense Research Institute
Chicago Bar Association
American Bar Association
PLUS
PUBLICATIONS
Co-author, The Law of
Reinsurance Claims
Co-author, @Risk - Internet
and E-Commerce Insurance
and Reinsurance Issues
Robert Hammesfahr is a Member of the Firm and Chair of Cozen O’Connor’s
International Insurance Practice Group. He has more than 20 years of experience in
litigating and counseling a broad spectrum of clients involved in excess liability,
coverage, and reinsurance cases. He has represented insurers and reinsurers in
connection with containing major litigation threats and defended against coverage
claims arising from mass tort, pollution and latent injury, technology, employment
practices and professional indemnity claims. This work has included analysis of a
wide variety of insurance policies and reinsurance contracts, advice on reservation
of rights and defenses and defense of claims, numerous negotiations of complex
liability claims for excess insurers, multi-party coverage disputes for direct
insurers and disputes involving reinsurance and retrocessions.
Mr. Hammesfahr's litigation and arbitration experience includes:
- Coordination of health hazard coverage litigation for a large number of insurers
- Analysis of reinsurance issues in connection with the formation of a multibillion-dollar financial reinsurer
- Work on disputes involving allocation issues in connection with numerous cedents and contracts
- Monitoring of technology and IP exposures for technology errors and omissions insurers
- Appearance in over 50 major coverage and bad-faith cases and over 100 reinsurance disputes
Mr. Hammesfahr's counseling work includes:
- Development of employment practices liability insurance policies
- Development of policy wordings for high-tech policyholders
- Advice on insurance risk securitization and alternative risk transfer
- Auditing and reserving advice for excess insurers and reinsurers
Mr. Hammesfahr is the author of leading treatises on punitive damages, reinsurance,
and technology insurance and reinsurance issues, as well as over 30 articles. Prior
to joining Cozen O'Connor, Mr. Hammesfahr was Chairman of Blatt,
Hammesfahr and Eaton and a partner at Peterson & Ross. He was voted an Illinois
"Super Lawyer" by bar association peers, as reported by Chicago Magazine in
2005.
COZEN
O’CONNOR
ATTORNEYS
Traci M. Ribeiro
Member
Philadelphia Office
(215) 665-6976
[email protected]
AREAS OF EXPERIENCE
Advertising Liability and
Personal Injury
Bad Faith Litigation
E-Commerce, Internet and
Cyber-Peril Insurance Law
Insurance Corporate and
Regulatory
EDUCATION
J.D., American University Washington College of Law,
1995
- B.A., Hofstra University, 1992
MEMBERSHIPS
- Philadelphia Bar Association
- Pennsylvania Bar Association
- American Bar Association
Phi Beta Kappa
TIPS Fidelity & Surety Law
Committee
PUBLICATIONS
- "Insurance Laws of Eastern
Europe," American Bar
Association, 1994
Traci M. Ribeiro joined Cozen O’Connor in January, 2001 and practices
with the insurance litigation group. She focuses her practice on insurance
coverage, insurance corporate and regulatory, and e-commerce issues. Traci
represents insurers in complex insurance coverage litigation in state and
federal courts in both the first and third party liability context. Recently,
Traci has been involved in complex mediations involving the defense of
mass tort claims. She also provides counsel to insurers with respect to state
and federal regulatory issues, and is a frequent lecturer on e-commerce
liabilities under property and liability insurance policies.
Traci joined Cozen O’Connor from Wolf, Block, Schorr and Solis-Cohen
LLP where she practiced in the complex liability, surety and fidelity
practice in the firm’s litigation department.
From 1997-1998 Traci served as an attorney for American International
Group, Inc. (AIG). In her post, she provided legal counsel to AIG member
companies including National Union, American Home, New Hampshire,
and American International Underwriters on all regulatory aspects of their
professional liability divisions as well as several other product lines.
Prior to joining AIG, Traci served as the international policy analyst for the
National Association of Insurance Commissioners, where she provided
legal counsel to the International Association of Insurance Supervisors and
advised the United States Trade Representative Office (USTR) and state
insurance regulators with respect to international trade agreements' effect on
state insurance laws.
Traci received her bachelor of arts degree from Hofstra University in 1992.
In 1995, she received her law degree from American University's
Washington College of Law. She is a member of Phi Beta Kappa and the
Philadelphia, Pennsylvania and American Bar Associations. Traci is
admitted to practice in New York and Pennsylvania.
COZEN
O’CONNOR
ATTORNEYS
Vincent P. Pozzuto
New York Office
(212) 908-1284
[email protected]
AREAS OF EXPERIENCE
Casualty & Property Defense
Construction Claims
Professional Liability
EDUCATION
J.D. Brooklyn Law School.
1995
B.A. Fordham University,
1992
Vincent P. Pozzuto is an associate in the New York office. His practice primarily
involves defending casualty cases, including premises liability claims, toxic
torts, construction accidents, products liability claims, ground surface accidents,
and professional negligence claims. He also has significant experience
defending brokers and financial institutions in NASD arbitrations.
Mr. Pozzuto received his bachelor of arts degree from Fordham University in 1992
and earned his law degree at Brooklyn Law School in 1995. He is admitted to
practice in New Jersey and New York.
Prior to joining the Firm in May 2000, Mr. Pozzuto was an associate with the firm
Costello, Shea & Gaffney, LLP.
COZEN
O’CONNOR
ATTORNEYS
Katherine M. Layman
Member
Philadelphia Office
(215) 665-2746
[email protected]
AREAS OF EXPERIENCE
Health Law
EDUCATION
- J.D. Temple University School
of Law, cum laude, 1993
B.A. University of Michigan,
with distinction, 1971
MEMBERSHIPS
Pennsylvania Bar Association,
Health Law Section
American Health Lawyers
Association, Health Law
Section
National Health Lawyers
Association
Pennsylvania Society for
Healthcare Attorneys
Pennsylvania Health Care
Association’s Lawyers in
Long Term Care Specialty
Council
Member, Board of Directors,
American Red Cross Blood
Services, Penn-Jersey Region
Katherine M. Layman, a Member of the Firm, practices in the Health Law
Department where she handles a variety of litigation, regulatory, and transactional
matters. She has wide-ranging experience in staff privileges issues and litigation,
survey and compliance issues for long term care providers, licensure issues, fraud
and abuse, clinical laboratory and pharmacy issues, HIPAA and privacy, and
regulatory compliance in the operation of Medicare, Medicaid, and other third
party reimbursement programs. She has co-authored a number of articles
including "Fraud and Abuse Initiatives and Medicaid and Medicare Compliance in
the Long Term Care Sector," "Guidance for Handling Surveyors and Government
Investigators," and "Nursing Homes Under Attack." She has spoken widely on
HIPAA issues and has written several columns for Medical Economics concerning
"HIPAA: Frequently Asked Questions." Ms. Layman has also made several
presentations to the Pennsylvania Health Care Association on survey-related
issues.
Ms. Layman earned her Bachelor of Arts degree, with distinction, from the
University of Michigan in 1971. She earned her law degree, cum laude, from
Temple University School of Law in 1993, where she served on the Temple Law
Review. Upon graduation from law school, she served as a law clerk to the
Honorable James M. Kelly, of the United States District Court for the Eastern
District of Pennsylvania. She is admitted to practice in Pennsylvania and to the
Supreme Court of Pennsylvania and the United States District Court for the
Eastern District of Pennsylvania.
Ms. Layman is a member of the Health Law Sections of the Pennsylvania Bar
Association and the American Bar Association, as well as the National Health
Lawyers Association. Ms. Layman serves on the Pennsylvania Health Care
Association’s Lawyers in Long Term Care Specialty Council and she served on the
Act 142 Advisory Committee to the Pennsylvania Department of Public Welfare
Bureau of Hearings and Appeals. She is a member of the Board of Directors of the
American Red Cross Blood Services, Penn-Jersey Region.
Manny Cho, Senior Broker, E&O Division Manager
As the manager of the E&O Group, Manny works with industry-leading technology
and financial services companies in the acquisition of Professional Liability, Media
Liability, Intellectual Property and Network Security / Loss of Income products.
Before joining Carpenter Moore, Manny worked as the Regional Technology Manager
for AIG (American International Group). Manny was instrumental in establishing the
technology insurance practice for AIG in California and the Pacific Northwest. Manny
was actively involved in the development, sales and marketing of AIG’s Professional
Liability and Network Security (netAdvantage) products.
The majority of Manny's career was spent with the Chubb Group of Insurance
Companies, where he held various positions. Prior to his departure, Manny was the
manager of their Technology Practice in Pleasanton, California.
Manny holds a B.S. in Finance from the University of Illinois.
COZEN
O’CONNOR
ATTORNEYS
Margaret A. Reetz
Associate
Chicago Office
(312) 382-3171
[email protected]
AREAS OF EXPERIENCE
Insurance Coverage
Reinsurance Coverage
Insurance Defense
Premises, Product, Medical
Malpractice, Errors &
Omission Liability
Mass Tort Litigation
EDUCATION
J.D. DePaul University, 1986
B.A. University of Illinois,
Urbana-Champaign, 1983
MEMBERSHIPS
American Bar Association
Illinois State Bar Association
Chicago Bar Association
The State Bar of California
The New York State Bar
Margaret Reetz joined Cozen O’Connor’s Chicago office in August 2001.
She has over 15 years of experience advising clients in direct defense and
insurance related matters. Peggy focuses her practice on insurance,
reinsurance and e-commerce matters.
Peggy’s experience includes litigating and counseling a broad spectrum of
clients in direct defense, excess liability, coverage, and reinsurance cases.
She has represented insurers and reinsurers in connection with containing
major litigation threats and defended against coverage claims arising from
pollution, product liability, business interruption and technology claims.
This includes extensive analysis of insurance policies and reinsurance
contracts, negotiations of complex liability claims for excess carriers and
reinsurers, multi-party coverage disputes for direct insurers and disputes
involving reinsurance and retrocessions. Peggy is a co-author of
Reinsurance Claims (Reactions Publications, 2004).
Immediately prior to joining Cozen O’Connor, Peggy was a Manager in the
Claims Advisory Group of Ernst & Young. In that capacity, she advised
self-insured, insurer and reinsurance entities as to their business practices
and claim-related issues.
From 1987 to 1999, Peggy practiced in both California and New York with
the law firm of Mendes & Mount, where she was a Non-Equity Partner.
COZEN
O’CONNOR
ATTORNEYS
THE FUNDAMENTALS OF ELECTRONIC DISCOVERY
POWERPOINT PRESENTATION
written & presented by
Thomas M. Jones, Esq.
[email protected]
COZEN O’CONNOR
1201 Third Avenue
Washington Mutual Tower, Suite 5200
Seattle, WA 98101
(206) 340-1000 or (800) 423-1950
www.cozen.com
Atlanta
Charlotte
Cherry Hill
Chicago
Dallas
Denver
Houston
Las Vegas*
London
Los Angeles
New York Downtown
New York Midtown
Newark
Philadelphia
San Diego
San Francisco
Seattle
Toronto
Trenton
Washington, DC
West Conshohocken
Wichita
Wilmington
*Affiliated with the Law Offices of J. Goldberg & D. Grossman.
The Fundamentals of Electronic Discovery
Thomas M. Jones, Esq.
Cozen O'Connor
Seattle
Sample Interrogatories
1. Identify all email systems in use, including but not limited to the following:
a) List all email software and versions presently used by you and the dates of use;
b) Identify all hardware that has been used or is currently in use as a server for the email system, including its name;
c) Identify the specific type of hardware that was used as terminals into the email system (including home PCs, laptops, desktops, cell phones, personal digital assistants ("PDAs"), etc.) and its current location;
d) State how many users there have been on each email system (delineate between past and current users);
Sample Interrogatories (cont.)
e) State whether the email is encrypted in any way and list passwords for all users;
f) Identify all users known to you who have generated email related to the subject matter of this litigation;
g) Identify all email known to you (including creation date, recipient(s) and sender) that relate to, reference or are relevant to the subject matter of this litigation.
Sample Interrogatories (cont.)
2. Identify and describe each computer that has been, or is currently, in use by you or your employees (including desktop computers, PDAs, portable, laptop and notebook computers, cell phones, etc.), including but not limited to the following:
a) Computer type, brand and model number;
b) Computers that have been re-formatted, had the operating system reinstalled or have been overwritten, and identify the date of each event;
c) The current location of each computer identified in your response to this interrogatory;
d) The brand and version of all software, including operating system, private and custom-developed applications, commercial applications and shareware for each computer identified;
Sample Interrogatories (cont.)
e) The communications and connectivity for each computer, including but not limited to terminal-to-mainframe emulation, data download and/or upload capability to mainframe, and computer-to-computer connections via network, modem and/or direct connection;
f) All computers that have been used to store, receive or generate data related to the subject matter of this litigation.
Sample Interrogatories (cont.)
3. As to each computer network, identify the following:
a) Brand and version number of the network operating system currently or previously in use (include dates of all upgrades);
b) Quantity and configuration of all network servers and workstations;
c) Person(s) (past and present, including dates) responsible for the ongoing operations, maintenance, expansion, archiving and upkeep of the network;
d) Brand name and version number of all applications and other software residing on each network in use, including but not limited to electronic mail and applications.
Sample Interrogatories (cont.)
4. Describe in detail all inter-connectivity between the computer system at [opposing party] in [office location] and the computer system at [opposing party #2] in [office location #2], including a description of the following:
a) All possible ways in which electronic data is shared between locations;
b) The method of transmission;
c) The type(s) of data transferred;
d) The names of all individuals possessing the capability for such transfer, including a list of names of authorized outside users of [opposing party's] electronic mail system;
e) The individual responsible for supervising inter-connectivity.
Sample Interrogatories (cont.)
5. As to data backups performed on all computer systems currently or previously in use, identify the following:
a) All procedures and devices used to back up the software and the data, including but not limited to name(s) of backup software used, the frequency of the backup process, and type of tape backup drives, including name and version number, and type of media (i.e., DLT, 4mm, 8mm, AIT). State the capacity (bytes) and total amount of information (gigabytes) stored on each tape;
b) Describe the tape or backup rotation and explain how backup data is maintained, and state whether the backups are full or incremental (attach a copy of all rotation schedules);
Sample Interrogatories (cont.)
c) State whether backup storage media is kept off-site or on-site. Include the location of each backup, and a description of the process for archiving and retrieving on-site media;
d) The individual(s) who conducts the backup and the individual who supervises this process;
e) Provide a detailed list of all backup sets, regardless of the magnetic media on which they reside, showing current location, custodian, date of backup, a description of backup content, and a full inventory of all archives.
Sample Interrogatories (cont.)
6. Identify all extra-routine backups applicable for any servers identified in response to these interrogatories, such as quarterly archival backup, yearly backup, etc., and identify the current location of any such backups.
Sample Interrogatories (cont.)
7. For any server, workstation, laptop or home PC that has been "wiped clean," defragmented, or reformatted such that you claim that the information on the hard drive is permanently destroyed, identify the following:
a) The date on which each drive was wiped, reformatted or defragmented;
b) The method or program used (e.g., WipeDisk, WipeFile, BurnIt, Data Eraser, etc.);
Sample Interrogatories (cont.)
8. Identify and attach any and all versions of document/data retention policies used by you and identify documents or classes of documents that were subject to scheduled destruction. Attach copies of document destruction inventories/logs/schedules containing documents relevant to this action. Attach a copy of any disaster recovery plan. Also state:
a) The date, if any, of the suspension of this policy in toto or of any aspect of said policy in response to this litigation;
b) A description by topic, creation date, user or bytes of any and all data that has been deleted or in any way destroyed after the commencement of this litigation. State whether the deletion or destruction of any data pursuant to said data retention policy occurred through automation or by user action;
c) Whether any company-wide instruction regarding the suspension of said data retention/destruction policy occurred after or related to the commencement of this litigation and, if so, identify the individual responsible for enforcing said suspension.
Sample Interrogatories (cont.)
9. Identify any users who had backup systems in their PCs and describe the nature of the backup.
Sample Interrogatories (cont.)
10. Identify the person(s) responsible for maintaining any schedule of redeployment or circulation of existing equipment and describe the system or process for redeployment.
Sample Interrogatories (cont.)
11. Identify any data that has been deleted,
physically destroyed, discarded, damaged
(physically or logically), or overwritten,
whether pursuant to a document retention
policy or otherwise, since the commencement
of this litigation. Specifically, identify those
documents that relate to or reference the
subject matter of the above-referenced
litigation.
Sample Interrogatories (cont.)
12. Identify any user who has downloaded any files
in excess of ten (10) megabytes on any
computer identified above since the
commencement of this litigation.
Sample Interrogatories (cont.)
13. Identify and describe all backup tapes in your possession, including:
a) Types and number of tapes in your possession (such as DLT, AIT, Mammoth, 4mm, 8mm);
b) Capacity (bytes) and total amount of information (gigabytes) stored on each tape;
c) All tapes that have been re-initialized or overwritten since commencement of this litigation, and state the date of said occurrence.
Planning Electronic Discovery
Every sound litigation plan should include a
strategy for responding to discovery requests.
The strategy can be broken roughly into five
categories:
• Data Preservation
• Data Collection
• Data Review
• Data Protection
• Data Production
Proposed Amendments
to the F.R.C.P.
• On August 10, 2004, the Standing Committee
on Rules of Practice and Procedure approved
for publication and public comment several
proposed amendments to the Federal Civil
Rules that specifically address electronic
discovery. A copy of the proposed
amendments, and the Committee Notes, can
be found at
http://www.uscourts.gov/rules/comment2005/CVAug04.pdf
Proposed Amendments
to the F.R.C.P.
(cont.)
• The public had until February 15, 2005 to
comment to the Secretary to the Standing
Committee regarding the proposed
amendments, by submitting comments in
writing, or by testifying at one of three public
meetings which were held at various dates
prior to the February 15 deadline. The
earliest the proposed rules may go into effect
is December 1, 2006.
Federal Court
E-Discovery Guidelines
• At least two federal district courts have adopted electronic
discovery guidelines or standards to be observed by litigants
appearing in their courts:
• U.S. District Court for the District of Delaware, Default Standards
for Discovery of Electronic Documents ("E-Discovery"), available
at http://www.ded.uscourts.gov/Announce/HotPages21.htm
• U.S. District Court for the District of Kansas, Electronic
Discovery Guidelines, available at
http://www.ksd.uscourts.gov/attorney/electronicdiscoveryguidelines.pdf
• See http://www.ediscoverylaw.com/cal-resoun-ms-htm for links to
the Delaware and Kansas guidelines.
Many Organizations are
Unprepared for
Electronic Discovery
• Although awareness of electronic discovery issues is becoming
more widespread, many organizations are nonetheless ill-prepared
for the possibility of electronic information being used
in litigation. A 2000 survey conducted at the American Bar
Association Section of Litigation 2000 Annual Meeting showed:
• 82 percent of respondents reported that their clients do not
have an established protocol for handling electronic discovery
requests.
• 60 percent of respondents said that in 30-60 percent of their
cases involving electronic discovery their clients were not
aware that electronic information could later become evidence.
Many Organizations are
Unprepared for
Electronic Discovery (cont.)
• Most recently, a 2003 survey of records
management professionals elicited a number
of similarly troubling revelations:
• 47 percent of the organizations represented
did not include electronic records in their
retention schedules.
• 58 percent of the respondents' organizations
do not have any formal email retention policy.
• In its revised August 3, 2004 report to the Standing Committee,
the Advisory Committee stressed the importance of addressing
issues related to electronic discovery now:
• Case law is emerging, but it is not consistent, and discovery
disputes are rarely the subject of appellate review.
• The uncertainties and problems lawyers, litigants, and judges
face in handling electronic discovery under the present federal
discovery rules are reflected in the growing demand for
additional rules in this area. At least four United States district
courts have adopted local rules to address electronic discovery,
and many more are under consideration. Two states have, and
more are considering, court rules specifically addressing these
issues.
The proposed amendments cover five related areas, some aspects
of which are described in more detail below:
• a) early attention to issues relating to electronic discovery, including
the form of production, preservation of electronically stored
information, and problems of reviewing electronically stored
information for privilege;
• b) discovery of electronically stored information that is not reasonably
accessible;
• c) the assertion of privilege after production;
• d) the application of Rules 33 and 34 to electronically stored
information; and
• e) a limit on sanctions under Rule 37 for the loss of electronically
stored information as a result of the routine operation of computer
systems.
DATA PROTECTION
Discovery of Electronic
Documents
In examining the current treatment of electronic discovery in
the courts, it is necessary to consider four issues:
• The extent to which existing discovery rules apply to
electronic discovery.
• The extent to which the courts are willing to protect
parties from burdensome or expensive electronic
discovery.
• The extent to which the courts are willing to shift the
cost of electronic discovery from the responding party
to the requesting party.
• The extent to which, in resolving the three prior
issues, courts treat electronic discovery differently
from traditional discovery.
Discovery (cont.)
• It is axiomatic that electronically stored information is
discoverable under Rule 34 of the Federal Rules of Civil
Procedure if it otherwise meets the relevancy standard
prescribed by the rules.
• Rules 26(b) and 34 of the Federal Rules of Civil Procedure
instruct that computer-stored information is discoverable
under the same rules that pertain to tangible, written
materials.
• Although Rule 26(c) allows a court to issue protective orders
against oppressive or harassing discovery, and Rule 26(b)(2)
directs the court to prevent or control unduly burdensome
discovery, neither provision provides the court with
substantial guidance as to the meaning of those phrases.
SCOPE OF DISCOVERY
• For purposes of determining the appropriate scope of
discovery beyond the initial disclosure requirements, the
relevant inquiry is whether the request for electronic
discovery is "reasonably calculated to lead to the discovery of
admissible evidence". Rule 26(b)(1).
• An electronic discovery request must typically specify the
various electronic sources the requesting party seeks to
examine in discovery.
• One influential opinion in this area has suggested that "a test
run" or "sampling" procedure can be a helpful solution. See
McPeek v. Ashcroft, 202 F.R.D. 31, 34 (D.D.C. 2001).
Judicial Protection Against
Burdensome Electronic Discovery
• Generally, courts have held that inconvenience and expense are not
valid reasons for the denial of electronic discovery.
• Courts have applied this reasoning where the responding party must
bear the additional expense of translating electronic data in a useable
form. Typically, courts have relied on a "reasonableness" standard.
• In several instances, however, courts have protected parties against
burdensome electronic discovery. See Playboy Enterprises, Inc. v.
Welles, 60 F.Supp.2d 1050 (S.D. Cal. 1999).
• In the Playboy Enterprises case, a district court held that in permitting
discovery of electronically stored data, the producing party must be
"protected against undue burden and expense and/or invasion of
privileged matter." The court appointed a neutral computer expert to
serve as an officer of the court and create a "mirror image" of
defendant's hard drive. The court allowed defense counsel to view the
recovered documents and to produce only those documents that were
responsive and relevant.
Other Objections to
Production of Electronic Data
• Attorney-client privilege
• The overriding principle in considering the application of
the attorney-client privilege is whether or not the client
was seeking a legal opinion or legal services with respect
to the communication at issue. If so, those legal opinions
and legal services are what is protected by the privilege.
• In determining the confidentiality of communication, it is
the intent of the client that controls.
• The communication will usually be deemed confidential
where the client has a reasonable expectation of privacy
and confidentiality.
• However, both intentional and inadvertent disclosures
have been deemed to waive the privilege.
Work Product Doctrine
The work product doctrine has been codified in Federal Rule of
Civil Procedure 26(b)(3). It protects research, analysis, legal
theories, mental impressions, and notes and memoranda
prepared in "anticipation of litigation or for trial" from disclosure
to opposing counsel.
• The question of whether a document is subject to the work
product privilege is: was that document prepared by or for
the party (by an attorney or otherwise) in anticipation of
litigation?
• Like the attorney-client privilege, the work product protection
can be waived by disclosure to any party other than one with
a common interest in the subject matter, by disclosure to a
government agency, or through deposition testimony to the
extent that it is used to refresh a witness's recollection.
How the Privilege Applies to
Electronic Documents
• Electronic information and documents are subject to the same
protection as traditional documents, including the attorney-client
privilege and the work product doctrine. Insurers should be
mindful of these traditional protections when creating electronic
documents, and endeavor to create and protect appropriate
privilege accordingly.
• For example, e-mail and other communications to and from counsel
should be clearly marked as privileged.
• Litigation databases created by and with the input from counsel
should be clearly marked as protected work product.
• Documents relevant to litigation should be carefully identified and
organized. This will prevent a later need to perform a system-wide
search to recover documents and information responsive to an
adverse party's discovery requests.
• Clear labeling and organization lessens the chances of
inadvertent production.
Privilege (cont.)
Does the Internet offer the requisite level of objectively
reasonable expectations of privacy, such that
communications sent via e-mail over this network will
maintain their privilege? As long as attorneys and clients
have objectively reasonable expectations of privacy in
their communication, the communication should be
protected.
• In fact, the Electronic Communications Privacy Act of
1986 makes unauthorized interception of e-mail
messages a federal crime.
E-mail Solutions
The following provisions should be included in any e-mail use
policy in the company in consideration of privilege issues:
• The e-mail system is the property of the employer.
• E-mail correspondence is to be kept confidential by the
employee.
• E-mail message recipient lists must be thoroughly
reviewed by the composer for accuracy before being sent.
• Employees should archive important messages by subject
and delete groups when no longer needed.
• If a company's e-mail has a two-tiered delete function,
employees must perform a final deletion of all previously
deleted messages on a regularly scheduled basis.
Messages that need to be saved should be archived.
DATA COLLECTION
An initial step in both the collection and
the preservation process is to determine
the scope and sources of electronic
documents being requested. One of the
first questions you should consider is the
types of documents which are responsive
to the request.
TYPES OF DATA
Data falls into roughly three categories: primary, secondary, and
tertiary.
Primary data is comprised of email and other "active data" such as
word processing files, spreadsheets, presentations, and databases.
Active data can be thought of as everything electronic that is currently
available for use - the documents a company keeps readily available
and accessible on hard drives, servers, and so on.
Secondary data consists of less accessible data such as system backup
data and archival or legacy data. This data is generally kept for
historical reference, and is often difficult or expensive to retrieve.
Because the cost of backup tape restoration, retrieval, and translation
may be very high, the burden and expense involved in the recovery
process may outweigh the probative value of the material to be
recovered.
TYPES OF DATA (cont.)
Tertiary data includes data that exists despite no active effort
to maintain or save it. The most common example of
tertiary data is remnants of files that were either never
saved, or were actively deleted. Deleting an electronic
document merely renames the file, and marks the file space
as being available for overwriting if that particular space on
the hard drive is needed in the future.
Because recovery of tertiary data generally requires the
assistance of a forensics expert, a party who demands
deleted material will often be required to absorb some or all
of the cost. An obvious exception would be in instances
where the request is necessary due to willful violation of a
preservation order.
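The mark-as-available behavior described above can be illustrated with a toy model. This is a deliberately simplified sketch for illustration only, not how any particular file system is implemented:

```python
# Toy model of why "deleted" files remain recoverable: deletion only
# removes the directory entry and marks the block free; the bytes
# themselves stay on "disk" until that space is actually reused.
class ToyDisk:
    def __init__(self):
        self.blocks = {}      # block number -> raw bytes still on "disk"
        self.directory = {}   # filename -> block number
        self.free = []        # blocks available for reuse

    def write(self, name, data):
        block = len(self.blocks)
        self.blocks[block] = data
        self.directory[name] = block

    def delete(self, name):
        # Drop the directory entry and mark the block free,
        # but do NOT erase the underlying data.
        block = self.directory.pop(name)
        self.free.append(block)

    def forensic_scan(self):
        # A forensics expert can still read unallocated blocks.
        return [self.blocks[b] for b in self.free]

disk = ToyDisk()
disk.write("memo.txt", b"smoking gun")
disk.delete("memo.txt")
print(disk.forensic_scan())  # the "deleted" data is still present
```

Until a later write reuses the freed block, the original bytes survive, which is exactly why recovery of "deleted" material is possible at all.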
LOCATING POTENTIALLY
RESPONSIVE DOCUMENTS
• After you have identified the types of documents
that may be relevant, you need to identify where
those documents are located. This is a two-fold
process.
• First is the matter of identifying the potential
custodians, or who has the documents.
• Second is the question of the actual, physical
disposition of the documents - i.e., where,
physically, do the documents reside?
LOCATING POTENTIALLY
RESPONSIVE DOCUMENTS (cont.)
• Once you have made initial determinations regarding the nature
and location of potentially relevant documents, you should
consider whether you have a need for forensic or other expert
advice. This is often unnecessary because, in the majority of
cases, the parties can find everything they need responsive to
the request in easily accessible locations.
• One standard exception is in cases where there is crucial archaic
or deleted data that needs to be recovered.
• Another situation that may warrant retaining an expert is where
you intend to object to discovery requests on the basis of undue
burden or costs.
LOCATING POTENTIALLY
RESPONSIVE DOCUMENTS (cont.)
The next step in document collection is to formulate a collection protocol.
There are a few key points to keep in mind.
One is to incorporate custodian interviews as part of the collection
process, if possible. Custodian interviews can be used to ensure all
sources and custodians of relevant documents have been identified.
They can also make the collection more efficient, by focusing your
collection efforts.
Another aspect of establishing a collection protocol is to include a
description of the guidelines and procedures followed in collecting the
documents. On a technical level, it is important to make sure that the
party conducting the collection understands the need to maintain the integrity
of the electronic data.
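One common way to preserve the integrity of collected data is to record a cryptographic hash of each file at collection time; re-hashing a copy later and comparing it against the recorded value reveals any alteration. A minimal sketch in Python (the file name and manifest workflow are illustrative assumptions, not a prescribed collection protocol):

```python
import hashlib

def fingerprint(path):
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Simulate a collected file, then record its hash in a manifest.
with open("collected.txt", "wb") as f:
    f.write(b"responsive document")

manifest = {"collected.txt": fingerprint("collected.txt")}

# Re-hashing later and comparing against the manifest detects
# whether the copy was altered after collection.
assert fingerprint("collected.txt") == manifest["collected.txt"]
```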
LOCATING POTENTIALLY
RESPONSIVE DOCUMENTS (cont.)
• Electronic documents are easily alterable. To avoid
potential claims of evidence spoliation, be aware of the
many ways that electronic documents may be altered.
Turning on a computer system; using automatic update
fields; recycling backup tapes; system maintenance
activities; saving new data; or installing new software may
all inadvertently cause documents to be altered.
• The format in which you will collect the documents, as
well as the step-by-step procedures used in the collection,
should be included in your collection guidelines.
DATA REVIEW
Production of Electronic Documents - Reviewing and Protecting Client Documents
• In a typical case, the most time-consuming and expensive
aspect of electronic discovery is the review of documents.
Review is crucial for two main purposes: to identify the specific
documents responsive to the opponent's request for production,
and to protect a client's documents.
As part of establishing your plan of attack for the review, you
will want to consider the volume and complexity of the review.
Where discovery threatens to be prohibitively expensive, you
will want to try to control discovery costs by limiting review
where possible. There are several steps you can take to limit
review: filing objections to overbroad requests; negotiating
scope with opposing counsel (e.g., identifying a limited number of
custodians, agreeing on search terms to narrow the universe of
documents to be reviewed, etc.); or moving for a protective
order if necessary.
The Manner in Which You
Conduct the Review Will Also
Affect Costs
Whenever possible, take advantage of technology to make your review
more efficient, and to locate responsive documents more quickly.
More advanced technologies include the use of duplicate suppression
and document mapping. Duplicate suppression technologies can be
used to identify and suppress up front the duplicative documents
existing within the documents you've selected. This technology is
especially useful when applied to e-mail, because of the highly
repetitive nature and typically wide distribution of e-mail.
Document mapping is an example of cutting-edge technology that
allows an attorney to streamline review by organizing documents
graphically. By grouping similar documents together graphically,
document mapping technologies facilitate a quick and efficient
review of the documents.
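Duplicate suppression of the kind described above typically rests on fingerprinting: identical document bodies produce identical hashes, so each body need be reviewed only once. A minimal sketch (the use of SHA-1 and plain strings here is illustrative; actual review platforms vary):

```python
import hashlib

def suppress_duplicates(documents):
    """Keep only the first copy of each distinct document body.

    Duplicate-suppression tools commonly fingerprint files with a
    hash; identical bodies hash identically, so later copies can be
    dropped from the review set.
    """
    seen = set()
    unique = []
    for doc in documents:
        digest = hashlib.sha1(doc.encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

# A widely distributed e-mail appears only once in the review set:
mailboxes = ["Q3 forecast attached", "Re: lunch?", "Q3 forecast attached"]
print(suppress_duplicates(mailboxes))  # ['Q3 forecast attached', 'Re: lunch?']
```

This is why the technique pays off most for e-mail: the same message, sent to dozens of custodians, collapses to a single document for review.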
DATA PRODUCTION
Production
The final aspect of your strategy for responding to
electronic discovery requests is production.
• In addition to reaching an agreement regarding the
schedule and deadlines for production, there is the issue of the
format in which the documents will be produced.
• Format type:
• hardcopy; or
• electronic format
If production in electronic format is considered, you will need
to decide between various electronic formats, such as native
format, or electronically imaged formats (e.g., PDF or TIFF
images).
• On-site inspections (protocols)
Format
• Another advantage of electronic format over hardcopy is
that loading documents into a litigation support database
typically requires that the documents be in electronic
format. By agreeing to production in electronic format,
parties can avoid the considerable expense of scanning
and imaging required to convert hard copy back to
electronic form.
Cost-Shifting
Of potentially enormous importance to control the costs of
discovery is the issue of cost allocation. Will a party
required to respond to a discovery request be forced to
bear the full cost of preparing this response, and, if not,
on what basis can cost be allocated?
Rules 26(c) and 26(b)(2) empower a court to shift costs
where it deems it necessary. Two cases represent the
seminal decisions to date: Rowe Entertainment v. William
Morris, and Zubulake v. UBS.
U.S. District Court Judge Shira A. Scheindlin of the
Southern District of New York released a set of guidelines
for splitting e-discovery costs. Then, on July 24, 2003,
she issued a ruling applying those guidelines to the case
at issue.
Cost-Shifting (cont.) - Zubulake I
In Zubulake v. UBS Warburg, LLC, et al. ("Zubulake I"), Judge
Scheindlin rejected the idea that the requesting party should
always pay for the restoration.
Instead, she enumerated seven elements to consider when
allocating such costs:
• the extent to which the request is specifically tailored to discover
relevant information;
• the availability of such information from other sources;
• the total cost of production, compared to the amount in
controversy;
• the total cost of production, compared to the resources available to
each party;
• the relative ability of each party to control costs, and its incentives
to do so;
• the importance of issues at stake in the litigation; and
• the relative benefits to the parties of obtaining the information.
Zubulake I (cont.)
• Judge Scheindlin emphasized these elements provided
guidance only. She held that "when evaluating
cost-shifting, the central question must be, does the request
impose an 'undue burden or expense' on the responding
party?"
• Judge Scheindlin then ordered UBS to restore the
information on five of the 94 backup tapes, and to
present the court with a more accurate cost estimate.
Zubulake II
• UBS came back with a figure of $273,649.39 -
$165,954.67 for restoring and searching the tapes, and
another $107,694.72 for attorneys and paralegals to
review the documents. In her July decision, Judge
Scheindlin addressed these two items separately.
• First, she ruled that Zubulake should pay for 1/4 of the
restoration costs.
• The attorney review time, however, was solely UBS's
responsibility. Judge Scheindlin held that: "the
responding party should always bear the costs of
reviewing and producing electronic data once it has been
converted to an accessible form."
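The split described above can be checked with a little arithmetic. The amounts are taken from the slide; the computed shares are simply the mechanical result of the court's 1/4 allocation of restoration costs:

```python
restoration = 165_954.67   # restoring and searching the backup tapes
review = 107_694.72        # attorney and paralegal review time

total = restoration + review           # the $273,649.39 figure UBS quoted
zubulake_share = restoration / 4       # requesting party pays 1/4 of restoration
ubs_share = total - zubulake_share     # UBS bears the rest, including all review

print(f"total quoted:  ${total:,.2f}")          # ≈ $273,649.39
print(f"Zubulake pays: ${zubulake_share:,.2f}")
print(f"UBS pays:      ${ubs_share:,.2f}")
```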
Requesting Electronic Production
from the Opposition
• Just as every sound litigation plan must include a strategy
for responding to discovery requests, it also requires
similar attention be given to how you will obtain the
documents and information you need from the opposing
party.
• Much of the information needed regarding the custodians
and locations of documents should be disclosed by the
other party during Rule 26(a)(1) disclosures.
• You can also address electronic discovery issues during
the Rule 26(f) meet and confer.
• Also, you can conduct a Rule 30(b)(6) deposition to
uncover important information about electronic evidence
controlled by your opponent.
DATA PRESERVATION
Preservation and Recovery of Relevant
Evidence
• A first step in making discovery requests is to ensure that your
opponent is on notice of the need to preserve all potentially
relevant documents. One very concrete action you can take is
to send a preservation of evidence letter to your adversary.
• There are several elements to include in your preservation
letter:
• Identify the individuals, by name or by position within the
organization, who may possess relevant electronic evidence.
• Describe the types of evidence to be preserved, both in terms of
subject matter and of possible locations of evidence.
• Finally, ask that the evidence be located immediately and
preserved.
• Preservation of evidence letters can reduce the risk that your
opposing party will destroy relevant documents.
Preservation and Recovery of
Evidence (cont.)
Where willful destruction of evidence is a real risk in your
case, be prepared to take action beyond issuing the
preservation of evidence letter.
If you have cause to think that the opposing party is apt
to alter or destroy relevant electronic evidence, it may be
prudent to obtain an order to preserve evidence, and an
order permitting the seizure of computers and storage
media.
If you can show the risk of destruction is particularly high,
i.e., showing facts demonstrating that the adverse party has the
opportunity to conceal or destroy evidence, and
demonstrating that the party is likely to take the
opportunity for deceptive conduct, ex parte relief
may be possible.
Preservation and Recovery of
Evidence (cont.)
• Despite the precautions you take, you may discover that
relevant evidence has been altered or deleted, either
innocently or maliciously.
• Where the altered or deleted files are likely to contain
information that is both relevant and probative, you may
want to consult with a computer forensics expert to
recover the missing evidence.
• Even if the court allows the recovery, the expense will
often be borne by the requesting party.
• Where the destruction is malicious, the offending party is
more likely to be held responsible for the expense of
recovering data using forensics.
Two-Way Street Rule
• A final word of caution regarding making discovery
requests is to emphasize the "Two-Way Street" rule.
• Think of it as the golden rule: what you do unto others
will likely be done to you. Be careful in making requests
or demands that you would not want to have made upon
you, or with which it would be difficult or impossible for
you to comply.
The Duty to Preserve
Electronic Information
A person or entity has a duty to
preserve electronic information it knows
or reasonably should know is or will be
discoverable in pending or reasonably
foreseeable litigation.
The growing trend is for courts to
attach the obligation to preserve
documents earlier than the filing of a
complaint.
The Duty to Preserve is Broad
The duty to preserve evidence is as broad as the
concomitant duty to produce evidence.
While a party need not preserve every document
in its possession, it must preserve documents
and electronic information it knows or reasonably
should know are relevant, likely to lead to the
discovery of relevant evidence, or are reasonably
likely to be or have been requested during
discovery.
The prudent course of action is to preserve
as much data and information as is feasible
once the duty to preserve attaches.
Affirmative Obligation to Take
Effective Steps to Prevent
Destruction of Evidence
The duty to preserve information imposes an
obligation on senior management and their counsel
to take effective affirmative steps to preserve the
information and prohibit unauthorized destruction of
the documents or information.
The duty to preserve electronic data requires more
than just preventing the intentional deletion of
documents and data. It also may require the party
to preserve backup tapes containing relevant
information, and the duty may require a party to
preserve residual deleted data.
Sarbanes-Oxley Act
The passage of the Sarbanes-Oxley Act in
July 2002 created duties for publicly traded
corporations to protect and retain certain
electronic information.
Sanctions
• Courts have the authority to sanction the
improper destruction or spoliation of electronic
documents and evidence. Spoliation is the
intentional destruction, mutilation, significant
alteration, or concealment of discoverable
information, or the failure to preserve
property for another's use as evidence, in
pending or reasonably foreseeable litigation.
• To be actionable, spoliation also must prejudice
or otherwise damage the right of a party to bring
an action.
Sanctions (cont.)
A federal court's power to sanction spoliation
stems from Rule 37 of the Federal Rules of Civil
Procedure, and the court's inherent powers.
Courts have available a relatively wide variety of
sanctions for spoliation, i.e., breach of the duty
to preserve. This relief ranges from a
presumption that the destroyed evidence would
help the case of the opposing party, to the
exclusion of other evidence, to dismissal of the
action (or default judgment) in more egregious
cases. Monetary sanctions may be available
against the spoliator as well.
Sanctions (cont.)
¯ In imposing sanctions, a court will generally
engage in the following analysis. Courts will first
assess (1) the fault or culpability of the spoliator,
and (2) the prejudice to the opposing party.
After evaluating fault and prejudice, courts apply
a proportionality analysis to determine the
appropriate sanction that will be the least harsh
effective sanction. Obviously, the greater the
culpability and prejudice, the more harsh the
sanction.
Culpability
There is a division among and between the
circuit courts regarding the level of culpability
necessary for the imposition of sanctions for
spoliation. Some courts hold that this question
has been resolved by the U.S. Supreme Court,
but the courts have not reached a consistent
application of this standard. Other circuits
appear to require a showing of bad faith to
justify the imposition of sanctions for spoliation
under the courts’ inherent powers, but the courts
differ on the meaning of "bad faith" and whether
it is required. Courts within the same circuit
even differ on this question.
Prejudice
¯ A court will not sanction spoliation unless there is
some prejudice to the opposing party arising
from the spoliation. Thus, where copies or
cumulative evidence is destroyed, sanctions are
not warranted. On the other hand, where the
evidence is central to the case, dismissal or its
equivalent may be appropriate.
Proportionality
Courts generally must apply the least harsh
sanction to properly address spoliation. A court
must weigh the culpability of the party
responsible for the spoliation and consider the
prejudice to the opposing party. The court must
then determine what sanctions are appropriate
and what is the least harsh appropriate sanction.
In other words, the punishment must fit the
crime, and it should be the least severe effective
punishment.
Proportionality (cont.)
¯ The sanctions imposed should serve one or more remedial
purposes: punishment, accurate fact finding (remedying
evidentiary imbalance caused by the spoliation), and
compensation. Thus if the conduct is egregious, the
punishment will be severe. If the conduct is less
egregious but creates an evidentiary imbalance, the court
will use presumptions and the exclusion of other evidence
to cure the imbalance.
¯ Monetary sanctions are also available to punish. On the
other hand, monetary sanctions are sometimes imposed
to cover the cost of rectifying the situation caused by the
destruction of evidence, rather than as a punishment.
Tort of Spoliation
Finally, there is the possibility that an aggrieved party
could bring an action for the tort of spoliation. The
tort is not widely recognized, however, as most
courts that have considered the issue have refused to
recognize the tort. Nonetheless, the few jurisdictions
that have recognized the tort of spoliation permit a
party to recover damages against a third party that
has failed to preserve evidence.
The majority view, however, is that the tort of
spoliation suffers from too many infirmities to be a
viable cause of action.
Creating Valid, Effective
Data Retention Policies
1. Systematically develop data retention
policies.
2. Address all data files - electronic records.
3. Address all media, including microfilm and
machine-readable computer records.
4. Obtain written acknowledgement & approval
from all personnel who will be subject to or
affected by proposed data retention policies,
procedures & destruction schedules.
Creating Valid, Effective
Data Retention Policies
5. Systematically destroy data according to established data
destruction procedures & schedules.
6. Strictly control, carefully manage & regularly audit general
compliance with data retention policies.
7. Suspend the scheduled destruction of all potentially
relevant data whenever litigation, government investigation
or audit is pending or imminent.
8. Maintain documentation regarding the creation &
implementation of the data retention policies.
Requirements Imposed by Effective
Data Retention Policies
1. Data is maintained according to applicable statutes &
regulations, or otherwise preserved only as long as
necessary, as specified by data destruction schedules.
2. Data necessary to the general conduct of business are
systematically filed for ready accessibility, as required.
3. Data permanently maintained by legal or business
requirement are catalogued & preserved on electronic
media affording economical storage & easy access.
Requirements Imposed by Effective
Data Retention Policies
4. Regarding data potentially relevant to pending or
foreseeable litigation or investigation, a mechanism
triggered by policy immediately suspends compliance with
data retention procedures & destruction schedules,
enabling the prompt identification, isolation &
preservation of such data.
5. All other data is destroyed.
6. However, any uncertainty regarding compliance with
data retention policies must be resolved by retention.
The Critical Element of All Electronic Data
Retention Policies
An administrative mechanism must be established to
assure the IMMEDIATE suspension of scheduled data
destruction, when it is determined that specific data
may be relevant to a pending or foreseeable lawsuit
or government investigation.
Absent the existence & exercise of this mechanism,
data retention policies will not insulate against
judicial sanction for spoliation of evidence, based on
a routinely scheduled destruction of electronic data.
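In software terms, the suspension mechanism described above is a legal-hold check that runs before any scheduled destruction. A minimal sketch (the record structure and topic-based matching are illustrative assumptions, not any particular records-management product):

```python
from dataclasses import dataclass, field

@dataclass
class RetentionManager:
    # Matters under legal hold; destruction is suspended for matching data.
    holds: set = field(default_factory=set)

    def place_hold(self, topic):
        """Invoked the moment litigation is pending or foreseeable."""
        self.holds.add(topic)

    def scheduled_destruction(self, records):
        """Destroy only records not covered by a hold; retain the rest.

        Consistent with the policy above, any record whose topic is
        under hold is kept - uncertainty is resolved by retention.
        """
        retained = [r for r in records if r["topic"] in self.holds]
        destroyed = [r for r in records if r["topic"] not in self.holds]
        return retained, destroyed

mgr = RetentionManager()
mgr.place_hold("acme-litigation")
records = [{"id": 1, "topic": "acme-litigation"},
           {"id": 2, "topic": "routine"}]
retained, destroyed = mgr.scheduled_destruction(records)
print([r["id"] for r in retained])   # held record survives the purge
print([r["id"] for r in destroyed])  # only unheld data is destroyed
```

The point of the sketch is ordering: the hold check must run before, and override, the routine destruction schedule.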
COZEN
O’CONNOR
ATTORNEYS
FREQUENCY OF IDENTITY THEFT
POWERPOINT PRESENTATION
written by
Robert W. Hammesfahr, Esq. and Keith E. Horton, Esq.
presented by
Robert W. Hammesfahr, Esq.
rhammesfah [email protected]
COZEN O’CONNOR
Suite 1500, 222 South Riverside Plaza
Chicago, IL 60606
(312) 382-3100 or (877) 992-6036
www.cozen.com
Atlanta
Charlotte
Cherry Hill
Chicago
Dallas
Denver
Houston
Las Vegas*
London
Los Angeles
New York Downtown
New York Midtown
Newark
Philadelphia
San Diego
San Francisco
Seattle
Toronto
Trenton
Washington, DC
West Conshohocken
Wichita
Wilmington
*Affiliated with the Law Offices of J. Goldberg & D. Grossman.
These materials are intended to generally educate the participants on current legal issues. They are not intended to provide legal advice.
Accordingly, these materials should not be relied upon without seeking specific legal advice on matters discussed herein.
Copyright © 2005 Cozen O’Connor. ALL RIGHTS RESERVED.
Blowing Up Your Company and Case by Electronic
Record and Document Malpractice
Frequency of Identity Theft
June 22, 2005
Presenter:
Robert W. Hammesfahr, Chicago
COZEN
O’CONNOR.
ATTORNEYS
Frequency of Identity Theft
What is identity theft?
"Identity theft occurs when someone uses your personal
information, such as your name, Social Security number,
credit card number or other identifying information,
without your permission to commit fraud or other crimes."
Frequency of Identity Theft
¯ Identity theft is not limited to financial records
¯ Reported cases of identity theft include:
¯ Medical/dental records
¯ Employment Records
¯ Driver's license information
¯ Photographs
¯ Travel records
¯ Instant messaging/online usage
¯ "entertainment activities"
¯ Utility/telephone records
Frequency of Identity Theft
¯ First Amendment Right to Privacy
Four main types of privacy rights have been recognized:
- Unreasonable intrusion upon the seclusion of another
- False light
- Disclosure of private facts
- Appropriation of a person's identity
Frequency of Identity Theft
¯ Standard of Recovery
In order to recover for an invasion of privacy offense, the
defendant's conduct must be "highly offensive to a
reasonable person."
Frequency of Identity Theft
Frequency of Identity Theft
¯ What is private?
¯ "There are virtually no online activities or services that
guarantee absolute privacy."1
¯ "Privacy is not something that I’m merely entitled to, it’s
an absolute prerequisite."2
Frequency of Identity Theft
¯ Number of Reported Identity Theft Cases
- Between January and December 2004, Consumer Sentinel, the
complaint database developed and maintained by the FTC,
received over 635,000 consumer fraud and identity theft
complaints. Consumers reported losses from fraud of more than
$547 million.
- Credit card fraud (28%) was the most common form of reported
identity theft, followed by phone or utilities fraud (19%), bank fraud
(18%), and employment fraud (13%).
COZEN
O’CONNOR
ATTORNEYS
CHOICEPOINT CLASS ACTIONS AND MORE
POWERPOINT PRESENTATION
written by
Robert W. Hammesfahr, Esq. and Keith E. Horton, Esq.
presented by
Vincent P. Pozzuto, Esq.
[email protected]
COZEN O’CONNOR
16th Floor, 45 Broadway
New York, NY 10006
(212) 509-9400 or (800) 437-7040
www.cozen.com
Blowing Up Your Company and Case by Electronic
Record and Document Malpractice
ChoicePoint Class Actions & More
June 22, 2005
Presenter:
Vincent Pozzuto, New York, NY
COZEN
O’CONNOR.
ATTORNEYS
ChoicePoint Class Actions & More
ChoicePoint "Illegal Data Access" Class Action Lawsuits
¯ Facts
¯ Provider of identification and credential verification services
¯ 10/04: Discovered identity theft of data of up to 145,000 individuals
¯ Names, Addresses, SSNs, Credit Reports, etc.
¯ 02/05: Notified consumers (delay requested by law enforcement)
¯ 10/04 to 02/05: Certain officers of the Company sold ChoicePoint
common stock while in possession of nonpublic information
¯ Government agencies and consumers investigate and sue
ChoicePoint Class Actions & More
ChoicePoint "Illegal Data Access" Class Action Lawsuits
¯ More than 20 class action lawsuits have been filed to date
¯ Several Class Action Lawsuits consolidated in April 2005,
including the lead case, Harrington v. ChoicePoint, CV05-1294
¯ Goldberg v. ChoicePoint, Inc., CV05-2016 (C.D. Cal.)
¯ Harrington v. ChoicePoint, CV05-1294 (C.D. Cal.)
¯ Salladay v. ChoicePoint, CV05-1683 (C.D. Cal.)
¯ Cloy v. ChoicePoint, CV05-1993 (C.D. Cal.)
ChoicePoint Class Actions & More
Case: Harrington v. ChoicePoint
Causes of Action: Fair Credit Reporting Act (FCRA)
Remedy sought
ChoicePoint Class Actions & More
¯ More Pending/Potential Lawsuits
¯ California Dept. of Social Services
¯ Security breach exposes SSNs and contact info of 1.4 million
providers and clients (Oct. 2004)
¯ Bank of America
¯ Data on 1.2 million federal employees stolen (Disclosed Feb.
2005)
¯ CitiFinancial (Citigroup)
¯ Lost in shipment a box of computer tapes with private account
information for 3.9 million customers (June 2005)
¯ BJ's Wholesale Club
¯ $13 mil. in customers' claims for failure to encrypt data
transmissions. Settlement reached with FTC (June 2005)
¯ Lexis-Nexis
¯ Intruders misappropriate passwords and records on 32,000
people in the U.S. (Disclosed March 2005)
COZEN
O’CONNOR
ATTORNEYS
HIPAA ENFORCEMENT AND LIABILITY
POWERPOINT PRESENTATION
written & presented by
Katherine M. Layman, Esq.
COZEN O’CONNOR
1900 Market Street
Philadelphia, PA 19103
(215) 665-2000 or (800) 523-2900
www.cozen.com
COZEN
O’CONNOR
ATTORNEYS
Insurance Coverage Seminar
HIPAA Enforcement
and Liability
June 22, 2005
Katherine M. Layman
klayman@cozen.com
215-665-2746
COZEN
O’CONNOR
OCR & Enforcement
¯ HIPAA Privacy
- Enforcement is carried out by Office for Civil Rights (OCR)
- 11,280 complaints filed as of April 2005
¯ 61% closed
¯ 168 to DOJ for investigation
¯ HIPAA Security - Enforcement by CMS
OCR Enforcement
Example
¯ Small rural clinic
¯ Fired employee a whistleblower
¯ Clinic did not cooperate --> resulted in a 2-day
on-site inspection
Proposed Enforcement Rule
(April 18, 2005)
¯ Expands application to all administrative
simplification rules
¯ HHS Philosophy: Voluntary Compliance
¯ Should be final by September 2005
Proposed Enforcement Rule
Business Associates
¯ A covered entity that complies with rules
regarding BA Agreements will not be held
liable for BA’s violation of the rules
Proposed Enforcement Rule
Affirmative Defenses
¯ Violation is a criminal offense --> DOJ
¯ Lack of knowledge
- CE must demonstrate it had measures in place to
identify and follow up on violations
¯ Violation is due to "reasonable cause" and not
willful neglect
HIPAA Civil Penalties
¯ $100 per violation
¯ $25,000 maximum fine per year,
per person, for like violations
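As a back-of-the-envelope illustration of the figures on this slide ($100 per violation, capped at $25,000 per year, per person, for like violations), the annual cap is reached at 250 identical violations. A hypothetical sketch, not legal advice; the function name and example counts are assumptions:

```python
def hipaa_civil_penalty(num_like_violations: int) -> int:
    """Civil penalty under the 2005 figures shown on the slide:
    $100 per violation, capped at $25,000 per year, per person,
    for violations of the same requirement."""
    PER_VIOLATION = 100
    ANNUAL_CAP = 25_000
    return min(num_like_violations * PER_VIOLATION, ANNUAL_CAP)


print(hipaa_civil_penalty(30))    # 3000
print(hipaa_civil_penalty(500))   # 25000 -- cap reached at 250 violations
```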
HIPAA Criminal Penalties
- $50,000 fine and/or
- 1 year in prison
False Pretenses:
- $100,000 fine and/or
- 5 years in prison
Really Bad Intent (e.g., commercial gain, malicious harm):
- $250,000 fine and/or
- 10 years in prison
First HIPAA Conviction
U.S. v. Gibson (W.D. Wash. Aug. 2004)
¯ Seattle Cancer Care Alliance phlebotomist obtained the SSN and other
identifying information of a patient
¯ Used the information to obtain fraudulent credit cards; ran up charges
of about $9,000
¯ The patient conducted his own investigation when he received notices of
card issuance and collection agency calls for nonpayment of bills; police and credit card companies would not investigate.
First HIPAA Conviction
(Cont’d)
¯ Calling the crime one of the "most deplorable I've
witnessed in 15 years on the bench," U.S. District Judge
Ricardo S. Martinez sentenced the health-care worker to
16 months in prison after a guilty plea.
¯ Charged for "personal gain"
¯ NOTE: First case prosecuted under HIPAA focused on
an individual and not a CE. Will this happen again?
Criminal Enforcement
June 1, 2005 DOJ Opinion
¯ Penalties for criminal violation of HIPAA apply to
covered entities - not employees.
¯ "If CE is not an individual, general principles of
corporate criminal liability will determine the
entity’s liability and that of individuals within the
entity, including directors, officers and
employees."
Criminal Enforcement
(Cont’d)
¯ Conduct of these individuals may be
prosecuted under principles of aiding and
abetting liability or conspiracy liability.
Potential Causes of Action
Common law invasion of privacy
Computer invasion of privacy (e.g., Virginia)
Malpractice: breach of confidentiality
Breach of contract
FTC: Unfair or deceptive practices (Section 5(a))
¯ Eli Lilly disclosure of names of Prozac users
¯ Case settled - no fines
¯ Attorneys fees?
Wire Fraud
ERRORS AND OMISSIONS INSURANCE
POWERPOINT PRESENTATION
written & presented by
Manny Cho
Senior Broker
E & O Division Manager
Carpenter Moore
COZEN
O’CONNOR
ATTORNEYS
COVERAGE FOR RISK: SURVEY OF KEY CONTRACT LANGUAGE
POWERPOINT PRESENTATION
written by
Robert W. Hammesfahr, Esq. and Keith E. Horton, Esq.
presented by
Robert W. Hammesfahr, Esq.
[email protected]
COZEN O’CONNOR
Suite 1500, 222 South Riverside Plaza
Chicago, IL 60606
(312) 382-3100 or (877) 992-6036
www.cozen.com
Blowing Up Your Company and Case by
Electronic Record and Document Malpractice
Coverage for Risk: Survey of Key Contract Language
June 22, 2005
Presenter:
Robert W. Hammesfahr, Chicago
COZEN
O’CONNOR.
ATTORNEYS
Survey of Key Contract Language
The Scope of Coverage
¯ Professional Services
¯ Hardware
¯ Software
¯ Consulting
¯ Media
¯ Services are generally designated within the policy
¯ Actual or Alleged Failure to Perform Services
Survey of Key Contract Language
¯ Definitions
¯ Network Operations Security
¯ E&O
¯ Media
¯ Cyber-Extortion
Survey of Key Contract Language
¯ Network Operations Security
Network Operations Security means those activities performed by the
Insured, or others on the Insured's behalf, to ensure against
Unauthorized Access to and the Unauthorized Use of the Insured's
Computer System.
Survey of Key Contract Language
Technology and Internet Errors & Omissions Liability
Coverage
The Company will pay on behalf of the Insured all sums in excess of the
Deductible that the Insured shall become legally obligated to pay as
Damages and Claims Expenses because of a Claim first made against the
Insured and reported to the Company during the Policy Period by reason of a
Wrongful Act committed on or subsequent to the Retroactive Date specified
in Item X of the Declarations and before the end of the Policy Period.
Survey of Key Contract Language
Media
¯ Media Services means:
a) the gathering, collection or mining of Media Material for inclusion
in any Media Communication; or
b) the publication, dissemination or release of Media Material in any
Media Communication in the ordinary course of the Insureds'
business.
¯ Media Communication means any communication of Media Material
by way of media, regardless of the nature or form of such
communication and regardless of any media used.
¯ Media Material means material of any form or nature whatever,
including but not limited to words, data, computer code, images,
graphics and music.
Survey of Key Contract Language
¯ Cyber-Extortion
Extortion claim means any claim in the form of a threat or connected
Survey of Key Contract Language
¯ Key Exclusions
¯ Prior Acts
¯ Breach of Contract
¯ Damages
¯ Proscribed Activities
¯ Intentional Acts
Survey of Key Contract Language
¯ Prior Acts
"We shall not be liable for any damages or claims expenses directly or
indirectly arising out of or in any way attributable to... any claim or
circumstance arising from any wrongful act prior to the retroactive
date of this Policy or where you knew or could reasonably have
foreseen such wrongful act may be the basis of a claim."
Survey of Key Contract Language
¯ Breach of Contract
"We shall not be liable for any damages or claims expenses directly or
indirectly arising out of or in any way attributable to... any liability
assumed under any contract or agreement including any breach of
express warranty or guarantee, except and to the extent you would have
been liable in the absence of such contract or agreement."
(ACE London Safeonline Policy Wording for Safe Enterprise)
Liability assumed under contract or agreement, written or oral; exception
for liability the insured would have had even in the absence of such
contract/agreement.
(ACE Computer & Technology Products & Services Professional
Liability)
Survey of Key Contract Language
Damages
¯ Damages means a compensatory monetary judgment, award
or settlement, other than:
(a) your future royalties or future profits, restitution, disgorgement of
profits, or the costs of complying with orders granting injunctive
relief;
(b) return or offset of fees, charges, or commissions for goods or
services already provided or contracted to be provided;
(c) punitive or exemplary (unless insurable by law), treble or other
damages that are assessed in part to punish the defendant or to
deter others;
(d) damages pursuant to federal, state or local statutory law other than
compensatory;
(e) any amounts owed under any express or implied contract; and
(f) any amounts for which you are not liable, or for which there is no
legal recourse against you.
Survey of Key Contract Language
¯ Proscribed Activities
¯ We shall not be liable for any damages or claims expenses directly or
indirectly arising out of or in any way attributable to... gambling,
pornography, or the sale or provision of prohibited, restricted or
regulated items including but not limited to alcoholic beverages, firearms,
tobacco, or drugs.
Survey of Key Contract Language
¯ Intentional Acts
"[W]e will not cover claims of loss... alleging or arising out of a
dishonest, fraudulent, criminal or malicious act, error or omission, or any
intentional or knowing violation of the law, or gaining of any profit or
advantage to which you are not legally entitled."
Conclusion
Do organizations buy insurance to help manage cybersecurity risks?
COZEN
O’CONNOR
ATTORNEYS
CLAIMS FOR BREACH OF CONTRACT VERSUS PROFESSIONAL ERRORS AND OMISSIONS
POWERPOINT PRESENTATION
written & presented by
Margaret A. Reetz, Esq.
[email protected]
COZEN O’CONNOR
Suite 1500, 222 South Riverside Plaza
Chicago, IL 60606
(312) 382-3100 or (877) 992-6036
www.cozen.com
Claims for Breach of Contract Versus Professional
Errors and Omissions
Your Parents Never Heard of or Worried about
These Types of Claims
New Types of Claims
Errors and omissions in design or use of technology, internet, website
Theft or misuse of data
New causes of action: cyber piracy, cyber squatting, domain name theft, cyber stalking,
identity theft, etc.
Breach of Contract, Professional E&O, Unfair
Business Practices
¯ Emerging Laws/Trends in Litigation
The rise of the unfair business practice claims
Breach of contract claims
New statutes and regulations
New tort theories
New Claimants/Plaintiff Law Firms
Claims for Breach of Contracts Caused by Professional
Errors and Omissions
¯ Case Examples
Claims for Breach of Contracts Caused by Professional
Errors and Omissions
¯ Case Examples
¯ ABC Manufacturing Company:
Breach of Contract Versus Professional
Errors and Omissions
¯ Allegations include:
- Breach of Contract
- Breach of Warranty(ies)
- Fraud
- Negligence
- Negligent misrepresentations
- Negligent failure to perform
- Advice
- Unjust Enrichment
Licensing and Other Agreements
¯ Loss control starts with the controlling agreements:
¯ licensing agreement;
¯ indemnification in vendor agreements;
¯ web site terms of use;
¯ privacy statements;
¯ terms of sale of goods, clickware and shrinkware.
Professional E&O
¯ New Policies
Third Party - CGL with Advertising Liability and
Personal Injury to CGL with no AL/PI
Specialty - Tech E&O and misc. E&O:
media and content, copyright, trademark
and other IP, cyber extortion, privacy,
public relations, and compliance
First Party - property damage due to physical injury
plus perhaps valuable papers including
electronic storage
Cyber First Party - damage to data and electronic storage
Breach of Contract versus Professional Errors and
Omissions
¯ The Policy is a pay-on-behalf-of wording that pays as a result of any claim first
made and reported to the insurers in writing during the policy period
¯ A "wrongful act" is defined as alleged breaches of duty or neglect or omission
Breach of Contract]Professional Errors and Omissions
Breach of Contract versus Professional Errors and
Omissions
Breach of Contract versus Professional Errors and
Omissions
¯ Coverage issue(s):
Breach of Contract versus Professional Errors and
Omissions
Case Law Emerging
Compare with CGL Policies:
American Family Mutual Insurance Co. v. American Girl Inc., f/k/a
Pleasant Company Inc., The Renschler Company Inc. v. West American
Insurance Co., et al., 673 N.W.2d 65, 80 (Wis. 2004)
Renschler was hired by American Girl Inc. to work on the design and
construction of a warehouse. Renschler subcontracted with a soils
engineer to analyze the soil conditions at the site. Pursuant to the
engineer's recommendation that the soil was poor and should be
prepared, Renschler proceeded with "surcharging," or preparing the site
by placing heavy fill on it to compress the soil. After the building was
completed, significant sinking of the foundation occurred, causing serious
physical damage to the building.
The Wisc. Supreme Court:
¯ The contractual liability exclusion does not exclude coverage for all
breach of contract liability, the majority held.
¯ It applies only where the insured has contractually assumed the
liability of a third party, as in indemnity or hold-harmless agreements,
which are not present in this case, it held.
Privacy versus Seclusion Claims
Privacy/Seclusion - Statutes and Law
¯ Electronic Communications Privacy Act (ECPA)
¯ Fair Credit Reporting Act (FCRA)/Fair and Accurate
Credit Transactions Act (FACT)
- Permits consumers to obtain free credit report
- Lenders and credit agencies must implement
procedures to identify identity theft
¯ Telephone Consumer Protection Act (TCPA)
¯ Federal Trade Commission
¯ Common law invasion of privacy
Privacy/Seclusion Claims
¯ Unsolicited email, phone calls and faxes
¯ Disclosure of personal information (finances,
medical records, etc.)
Claims Seeking Damages for Privacy and
Seclusion Violations
¯ What does it mean to violate
someone's right of privacy?
¯ The right of privacy is invaded
by:
¯ unreasonable intrusion
upon the seclusion of
another
¯ appropriation of another's
name or likeness
¯ unreasonable publicity
given to another's private
life
¯ publicity that unreasonably
places another in a false
light before the public
Privacy/Seclusion Claims
Private causes of action are allowed under the TCPA.
"A person who has received more than one telephone call
within any 12-month period by or on behalf of the same entity
in violation of the regulations prescribed under this subsection
may, if otherwise permitted by the laws or rules of court of a
State, bring in an appropriate court of that State--(A) an action
based on a violation of the regulations prescribed under this
subsection to enjoin such violation, (B) an action to recover for
actual monetary loss from such a violation, or to receive up to
$500 in damages for each such violation, whichever is greater,
or (C) both such actions."
Treble damages may be awarded for willful violations.
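The recovery described in the quoted provision (the greater of actual monetary loss or $500 per violation, with up to treble damages for willful violations) can be sketched as follows. The function name and example figures are illustrative only, not legal advice:

```python
def tcpa_damages(actual_loss: float, violations: int,
                 willful: bool = False) -> float:
    """Statutory recovery as described in the slide: the greater of
    actual monetary loss or $500 per violation, with up to treble
    damages available for willful violations."""
    base = max(actual_loss, 500 * violations)
    return 3 * base if willful else base


print(tcpa_damages(actual_loss=120, violations=4))          # 2000
print(tcpa_damages(actual_loss=0, violations=2, willful=True))  # 3000
```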
Privacy/Seclusion Claims
¯ Coverage cases under TCPA:
¯ Unsolicited fax advertisements are covered under advertising
injury provisions
¯ Cases include:
- Hooters of Augusta v. American Global Insurance
Company and Zurich Insurance Company
Court: a layman understands his right to be left alone to
include being left alone at work by advertisers sending faxes
within the meaning of AGIC's policy, and that Nicholson
(claimant) suffered an invasion of privacy;
- Universal Underwriters Ins. Co. v. Automotive Network, Inc.
- Park University Enterprises, Inc. v. American Casualty
Company of Reading, Pa.
Privacy/Seclusion Claims
Insurers' alleged TCPA violations (which deals with the ...)
Recovery of Defense Fees - Reservation of
Rights Letters
¯ Recent Illinois Supreme Court Decision: General Agents v. Midwest
Sporting.
¯ Insurer reserved rights but provided a defense to the insured, who was
sued by the City of Chicago and Cook County over inappropriate gun
sales.
¯ Insurer sent an ROR letter dated Dec. 3, 1998 but agreed to defend.
¯ Insurer filed a DJ action on Oct. 28, 1999, seeking a declaration that it did
Rejecting the majority trend (Cal., Colo., Fla., La., Minn.) that allows for
parties."
COZEN
O’CONNOR
ATTORNEYS
HACKING AND DOWNSTREAM LIABILITY
written by
Brian J. Walsh, Esq.
COZEN O’CONNOR
16th Floor, 45 Broadway
New York, NY 10006
(212) 509-9400 or (800) 437-7040
www.cozen.com
I. Introduction
Each year, computer hackers are getting more destructive: "Total damage [in 2004] was at least
$17.5 billion, a record -- and 30% higher than 2003, according to research firm Computer
Economics Inc."1 Civil suits asserting liability of innocent middlemen are likely to increase as a
result of this rise in computer hacking. As hackers are often difficult to track down, judgment-proof, or both, the ultimate victims of computer hacking will increasingly look to downstream
liability as a way to hold "innocent" parties with deep pockets responsible for the resulting
damage.2
II. Scenarios for Trouble
A. Possible Scenarios
Hacking is a widespread and diverse phenomenon. All of the following scenarios can and have
occurred.
¯
A hacker disables a website operated by a large brokerage firm so that its customers
cannot trade for several hours. On that day, the stock market is volatile, and a class of
customers suffers financial losses.
¯
Hackers target a web-based banking service gaining access to usernames and passwords,
putting individual bank accounts at risk.
¯ A hacker targets an employee who works from home, accessing her employer's internal
network over the Internet. The theft and subsequent dissemination of confidential client
or third-party information, such as trade secrets, leads to large-scale damage.
¯ A hacker emails a virus to an employee of a company through an attachment. The
employee unintentionally forwards the infected email attachment to her friend who works
at a hospital. The virus infects the hospital's internal server and dangerously alters patient
files, causing harm or death.
¯ Hackers gain control of unsuspecting users' computers and use those machines to flood a
targeted site or service with junk messages, overwhelming the site, thereby making it
inaccessible to legitimate customers.
B. Potential Defendants
¯ The hacker, if you can find him.
¯ Internet service providers (ISPs) that failed to properly secure their networks.
1 Brian Grow, Hacker Hunters, Business Week, May 30, 2005, at 74.
2 W. Reid Wittliff, Computer Hacking and Liability Issues: When Does Liability Attach?, at
http://www.gdhm.com/pdf/wrw-hack article.pdf.
¯ Companies with computers used as "bounce sites" or as "zombies" to launch attacks.3
¯ Companies that hired a known hacker and gave him or her access to high bandwidth and
a computer.
III. Who Pays?
A. The Innocent Middleman?
1. At present, there is no liability for an innocent middleman.
If the middleman had no knowledge of the attack, did not aid in the attack, and had adequate
security systems in place to prevent hackers from targeting the network, courts are not likely to
hold the middleman responsible. Our society looks to punish the bad actor, and the innocent
middleman does not fit that role. However, whether the middleman is, in fact, innocent, will be a
fact-specific inquiry.
2. If there is no physical damage, the economic loss doctrine may bar recovery.
Case law distinguishes physical harm, which includes property damage and bodily injury, from
economic harm. If, somehow, the middleman is not "innocent" and is, in fact, negligent in some
way, it still may not be liable for damages to a third party under the economic loss doctrine.
Under the doctrine, a tort claim will not succeed if economic damages are the only injury to the
plaintiff. Notwithstanding special exceptions, a plaintiff can generally recover in negligence
actions for actual physical damage to personal or real property and personal injury, but not for
purely financial damages. However, if the injured party suffers bodily injury or property damage,
such as corrupted or erased data, this rule may not apply depending on case law in the relevant
jurisdiction.
B. The Hacker?
If the bad actor is a known hacker, federal and state laws mandate criminal prosecution.
However, it is not easy to find the bad actor. Hackers can launch attacks from anywhere in the
world. Even if hackers are domestic and nearby, hackers may be able to avoid detection by
erasing any sign of their invasions. Furthermore, hackers are often judgment-proof: either they
do not have any assets, or they have them well hidden. Therefore, the victims are likely to seek
joint and several liability for the middleman who arguably exposed them to the risk.
C. The Injured Party?
At present, the hacking victim is paying for the damage caused by online hacking, allowing
companies and consumers to unfairly assume the losses.
3 A computer is used as a "bounce" site when it allows the hacker's connection to "bounce" off
its server to another machine. A system is considered a "zombie" when a hacker programs it to
perform an illicit task without the knowledge of its system operator.
IV. Proposals
A. Theories of Liability for the "Innocent" Middleman
I
i
I
I
I
I
I
I
I
I
I
!
I
I
I
The damage resulting from computer hacking can consist of immediate financial loss, damage to
reputation, and consumer distrust. As lawsuits increase, plaintiffs may rely on various legal
theories in an attempt to recover against a middleman, especially if the true ’bad actor’ is not an
ideal defendant.
1. Negligence
The core of the downstream liability issue is negligence. Common-law negligence requires that
four elements be satisfied for a successful claim: duty, breach of duty, causation, and damages. The
threshold question of whether a duty exists between the middleman and the injured third party
may be difficult to answer.
The well settled "no duty" doctrine, which holds that there is no duty to protect another from the
criminal acts of a third party, if strictly applied, would mean that if a company’s unsecured
computers were hacked and used as a mechanism to launch attacks against other systems, the
company would not be liable to third parties, even if it was without any security systems in
place. Such a rule, however, is not unconditional. Courts have recognized certain situations in
which a duty to protect a party from the criminal acts of another may arise:
(1) a property owner who maintains control over the property owes a duty to exercise
reasonable care to maintain premises in safe condition, including those precautions to
protect from foreseeable criminal acts of third parties; 4
(2) a person with a special relationship with a third party may owe a duty to control that
party's conduct;5 and
(3) a person who has created a dangerous situation owes a duty to prevent harm to others
because of the situation that person created.6
Situations (2) and (3) are unlikely to be on point for cases of downstream "innocent middleman"
liability. Situation (1), however, regarding typical premises liability, is closely analogous to a
negligence claim arising from hacking. An invitee harmed on a property owner’s property is in a
similar situation to a company’s network attacked by hackers while hosted by an ISP. The ISP
has control over the online system and is responsible for protecting foreseeable harm. What is
foreseeable is, of course, fact-specific.
Assuming a recognizable duty, courts will look to see whether the middleman has breached the
duty. At present, there is no universally accepted standard of care to apply. General tort law
defines the duty as the actions taken by a reasonable and prudent person to prevent unreasonable
4 See Newell v. Swiss Reassurance Company, Inc., 580 N.Y.S.2d 361 (Sup. Ct. 1992);
Timberwalk Apartments, Partners, Inc. v. Cain, 972 S.W.2d 749, 756 (Tex. 1998).
5 See Tarasoff v. Regents of University of California, 17 Cal. 3d 425; 551 P.2d 334 (1976).
6 See Medina v. City and County of Denver, 960 F.2d 1493 (10th Cir. 1992).
risks of harm. Therefore, to ascertain breach, courts will look at several factors, such as how the
middleman company could have prevented the loss.
Should a court apply the cost-benefit approach,7 under which unreasonable risks are those that
the company could cost-effectively eliminate, it would weigh the cost of untaken measures
against the value of reducing all foreseeable risks, not just the risk that caused the damage at
issue.8 In the hacking context, if only unreasonably expensive technology could eradicate an
unknown virus, courts would probably consider the risk unavoidable, as due care could not have
prevented it, and the defendant will escape liability.9
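The cost-benefit weighing described above is commonly associated with Judge Learned Hand's formula from the Carroll Towing case cited in note 8: breach is indicated when the burden B of the untaken precaution is less than the probability P of the harm multiplied by the gravity L of the resulting loss. A minimal sketch, using purely hypothetical numbers chosen for illustration:

```python
def hand_formula_breach(burden, probability, loss):
    """Learned Hand formula: a precaution was unreasonably omitted
    when its burden B is less than the expected loss it would have
    averted, P * L."""
    return burden < probability * loss

# Hypothetical figures: a $10,000 security upgrade weighed against a
# 5% chance of $1,000,000 in downstream damage (P * L = $50,000).
print(hand_formula_breach(10_000, 0.05, 1_000_000))  # True: B < P * L
```

On these assumed numbers the precaution is cheap relative to the expected loss, so failing to take it would suggest breach; reverse the figures and no breach is indicated.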
Once the plaintiff has established a breach of duty, he must then demonstrate causation in one of
two ways. Cause-in-fact requires the plaintiff to prove that "but for" the middleman's failure to
take the reasonable precaution, the harm would not have occurred. For instance, if the security
administrator forgot to scan the system for viruses, but the scan would not have recognized the
virus regardless, the untaken precaution would not have prevented the damage.10 Proximate cause
requires that the middleman's negligence be sufficiently related to foreseeable damage.
However, if an intervening or superseding cause comes between that negligence and the injured
plaintiff, liability may be reduced or eliminated. For example, if a hacker gains access to a
network because a blackout disables the security system, the company did not proximately cause
the damage.
If the plaintiff satisfies the aforementioned elements, the economic loss rule, supra, may still bar
recovery.
2. Negligent hiring or supervision
Because this claim is only likely to arise if there is an identifiable hacker who is an employee of
a company with deep pockets, a lengthy discussion on this theory for recovery is beyond the
scope of this paper; however, it is worth mentioning.
As it is unlikely that the hacker’s employer authorized the hacking, a claim for respondeat
superior would probably fail. However, a plaintiff may be able to proceed under a negligent
supervision or negligent hiring theory. Employers have a duty to adequately hire, train and
supervise their staff members. If a plaintiff can demonstrate that the employer failed to take
reasonable steps to protect third parties from misconduct of employees, a Court may impose
liability on the company. Likewise, if a company knows or should have known that it hired an
employee with a propensity to hack computers and provides such employee full access to the
Internet, a court may find the company responsible for the results of the employee’s hacking.
7 Alternatively, the courts may look at industry custom or do a risk-utility test, which balances
the utility of the conduct against the likelihood and extent of harm.
8 See U.S. v. Carroll Towing Co., 159 F.2d 169 (2d Cir. 1947).
9 Id.
10 Meiring de Villiers, Computer Viruses and Civil Liability, 40 Tort & Ins. L.J. 123 (2004).
11 "A claim based on negligent hiring and supervision requires a showing that defendants knew
of the employee’s propensity to [commit the alleged acts] or that defendants should have known
Additionally, an injured party may bring a negligent hiring claim if the middleman company
hired a network security administrator who did not properly secure the system. The employer
could be held liable for hiring an unqualified employee if the employee's failure to institute
security mechanisms caused the system's vulnerability.
3. Breach of contract
Common Law breach of contract might apply if parties have contracted to provide and receive
data storage or processing services. Often, an online business enters into a contract with an ISP
for Internet service. The breadth of the contract terms determines the extent and scope of any
action. However, such a claim is not likely to succeed in the case of security breaches
involving individuals or third parties. Because courts tend to adhere to a privity-of-contract
requirement, a victim of a hacker attack launched from a third party's unsecured computer
system would have no claim against the third party absent a contractual relationship. For this
reason, the breach of contract claims with the best chance of success would be those by
consumers or businesses against companies that promised specific hacker protection in their
contracts, or whose contracts identify steps to be taken should a hacker compromise the system.
4. Strict products liability does not apply
The doctrine of strict products liability renders manufacturers of defective products liable to any
person injured as a result of the defect, regardless of privity, foreseeability, or due care, if the
defect was a substantial factor in causing the injury.12 However, in a situation questioning the
liability of an innocent middleman, like an ISP, the middleman is not passing any tangible good
through the stream of commerce. Rather, the ISP is providing a service, and it has not placed
tangible items, such as the computer, into the stream of commerce. The doctrine of strict
products liability does not apply to services.13
B. What does the future hold? Proposals and changes in law and policy.
1. Should the innocent middleman bear the costs?
As it stands now, the injured party foots the bill for the damage caused by hacking, arguably
creating an unfair burden on the victim. Therefore, many have suggested that because
"middleman" companies pay a relatively low cost to implement a security standard, compared to
of such propensity had they conducted an adequate hiring procedure." Honohan v. Martin's
Food of South Burlington, 679 N.Y.S.2d 478, 479 (Sup. Ct. 1998) (quoting Ray v. County of
Delaware, N.Y.S.2d 808, 809 (Sup. Ct. 1997)).
12 Vaniderstine v. Lane Pipe Corp., et al., 455 N.Y.S.2d 450 (1982).
13 See id. (holding that the erection of a highway guardrail was a service, and therefore the
county could not be liable under strict liability for injuries sustained when a passenger struck
a defective guardrail); see also Simone v. L.I. Jewish Hillside Medical Center, 364 N.Y.S.2d 714
(1975) (holding that the concept of strict products liability is inapplicable to the furnishing of
services, such as blood transfusions).
the potentially tremendous cost to society, the middleman should take on this burden. When
applied to ISP middlemen, the increased security would lower consumer costs for Internet use.
ISPs can employ security standards more cheaply than other parties can. The difficulty, however,
is the lack of a designated standard.
2. Should there be a universal standard of care?
Establishing a standard of care is problematic. Companies have differing security needs,
depending on size, field, value of data, and other factors.14 Additionally, uniformity may promote
hacking by establishing minimum security measures for hackers to understand and surpass. In
2001, several administrative agencies, including the FDIC and the Department of the Treasury,
created and issued security guidelines for financial institutions under the 1999 Gramm-Leach-Bliley
Act, requiring systematic monitoring, employee training, and encryption of customer
information.15 Similarly, proposals under the Health Insurance Portability and Accountability
Act of 1996 called for parallel measures among health care providers and insurance companies.16
Whether or not there should be a standard of care pertaining to computer security, if occurrences
of computer hacking steadily rise, courts or legislatures will more than likely develop a universal
standard to apply to these cases.
V.
Conclusion
Presently, there is no downstream liability for innocent middlemen for damage caused by
computer hacking. The difficulty of proving a prima facie case of negligence is a tough burden
for plaintiffs to overcome. However, we can expect successful claims against middlemen to
increase in the next several years.
Whether through regulations and statutes or through the common law of different jurisdictions,
someone will find a way to reach these deep pockets. We now have the technology and
sophistication to trace causation, making it easier to determine who is liable. Further, as
damages rise, pressure from injured parties and legislatures is likely to result in some sort of
policy change. With a reported 30% increase in damages from 2003 to 2004, and an unknown
total cost for 2005, downstream liability from computer hacking is likely to be a landmark issue
on dockets in the near future. As the law develops, it is our job as attorneys and industry
professionals to keep track of statutory proposals and ensure that our voices are heard
on this issue.
14 Mary M. Calkins, They Shoot Trojan Horses, Don't They? An Economic Analysis of
Anti-hacking Regulatory Models, 89 Geo. L.J. 171, 214-15 (2000).
15 See 15 U.S.C. § 6801 et seq. (2001); 12 C.F.R. § 30; 12 C.F.R. § 208; 12 C.F.R. § 225.
16 See 63 Fed. Reg. 43,241, 43,241-77 (1998).
COMPUTER VIRUSES AND CIVIL LIABILITY:
A CONCEPTUAL FRAMEWORK
Meiring de Villiers
This article analyzes a negligence cause of action for inadvertent
transmission of a computer virus. It provides an introduction
to the principles of operation and detection of viruses and analyzes
the elements of negligence liability in the context of virus
infection. A final section discusses and analyzes litigation
complications that are a direct result of the dynamic and unique
nature of virus and virus detection technology.
I. INTRODUCTION
The Internet and modern communications technology have stimulated unprecedented advances in electronic communication, commerce, and information access. These technologies also have dramatically increased the
vulnerability of computer networks to hazards, such as malevolent software
and rogue programs that are capable of spreading rapidly and causing widespread and substantial damage to electronic data and programs.1 The most
1. KEN DUNHAM, BIGELOW'S VIRUS TROUBLESHOOTING POCKET REFERENCE xix-xxiii (2000)
("Current Threat of Viruses" and "Interpreting the Threat."); Jeffrey O. Kephart et al., Blueprint for a Computer Immune System, IBM Thomas J. Watson Research Center Report, at 1
(originally presented at Virus Bulletin International Conference in San Francisco, California
(Oct. 1-3, 1997)), available at http://www.research.ibm.com/antivirus/SciPapers/Kephart/VB97
("There is legitimate concern that, within the next few years, the Internet will provide a fertile
medium for new breeds of computer viruses capable of spreading orders of magnitude faster
than today's viruses ... [T]he explosive growth of the Internet and the rapid emergence of
applications that disregard the traditional boundaries between computers threaten to increase
the global spread rate of computer viruses by several orders of magnitude."); How Fast a Virus
Can Spread, in PHILIP FITES ET AL., THE COMPUTER VIRUS CRISIS 21 (2d ed. 1992); Carey
Nachenberg, Future Imperfect, VIRUS BULL. (Aug. 1997) ("With the ubiquitous nature of the
Internet, new viruses can be made widely accessible within minutes."); BIZREPORT NEWS,
Meiring de Villiers (mdv@unsw.edu.au) is John Landerer Faculty Fellow at the University of New South Wales School of Law in Sydney, Australia.
Tort Trial & Insurance Practice Law Journal, Fall 2004 (40:1)
notorious of these rogue programs is the so-called computer virus, a program capable of attaching itself to a host program, cloning itself, and
spreading the cloned copies to other host programs, analogously to a biological virus. In addition to replicating and spreading, many viruses are also
capable of harm, such as information theft and corruption of electronic
data. This article focuses on the computer virus and its legal impact.
A collateral effect of the proliferation of malevolent software is exposure
to legal liability, not only for the virus author and the intentional transmitter of a virus, but also for one who inadvertently transmits a virus. An
example of the latter would be someone who unwittingly forwards an infected e-mail attachment. A civil action against an inadvertent transmitter
would most likely be pursued under a negligence theory, the most widely
used theory of liability in the law of torts.2
Negligence is a breach of the duty not to impose an unreasonable risk
on society. It applies to any risk that can be characterized as unreasonable,
including the risks associated with malevolent software.3 A victim of a virus
attack may therefore bring legal action under a negligence theory against
anyone who failed to take reasonable care to eliminate or reduce the risk
of virus infection.
Potential defendants in a virus case include such individuals as commercial software providers who sell infected products; entities involved in software distribution, such as website operators and participants in shareware
arrangements; and individuals who transmit infected e-mail attachments.
The system operator in a workplace who becomes aware that an internal
network is infected with a virus may have a duty to external e-mail recipients to reduce or eliminate the risk of infection. This can be accomplished
by advising internal e-mail users, blocking all external e-mail traffic, or
including warnings with outgoing e-mail, until the system has been disinfected with reasonable certainty.4
Sept. 12, 2003 (reporting that five to fifteen new viruses are released on the Internet daily),
at http://www.bizreport.com/print.php?art_id=4917. For those interested in pursuing the
scientific aspect further, IBM's website at http://www.research.ibm.com/antivirus/SciPapers.htm
provides hyperlinks to numerous papers on viruses, including many cited in this article.
2. See, e.g., James A. Henderson, Why Negligence Law Dominates Tort, 50 UCLA L. REV.
377 (2003). See also Gary T. Schwartz, The Vitality of Negligence and the Ethics of Strict Liability,
15 GA. L. REV. 963 (1981); Gary T. Schwartz, The Beginning and the Possible End of Modern
American Tort Law, 26 GA. L. REV. 601 (1992).
3. PROSSER AND KEETON ON THE LAW OF TORTS § 31 (5th ed. 1984); RESTATEMENT (SECOND)
OF TORTS § 282 (1965) (describing negligence as conduct "which falls below the standard
established by law for the protection of others against unreasonable risk of harm"); DAN B.
DOBBS, THE LAW OF TORTS 258 (the plaintiff can assert that any conduct counts as negligence).
4. CLIVE GRINGRAS, THE LAWS OF THE INTERNET 61, 62 (1997). An English court held that
a defendant who stored biological viruses had a duty to cattle owners who would be affected
by the spread of the virus. Weller and Co. v. Foot and Mouth Disease Research Institute, 3 All
E.R. 560, 570 (1965) ("[T]he defendant's duty to take care to avoid the escape of the virus
was due to the foreseeable fact that the virus might infect cattle in the neighborhood and
To pursue a successful negligence cause of action, a victim of viral infection must prove that (1) the defendant had a duty to the plaintiff to take
reasonable care to avoid the infection, (2) there was a breach of that duty,
(3) the breach was the actual and legal cause of the plaintiff's loss, and
(4) the breach resulted in actual harm.
Technology plays a crucial role in a negligence analysis involving virus
infection. Courts require a plaintiff to prove breach of duty in a negligence
action by identifying an untaken precaution and showing that the precaution would have yielded greater benefits in accident reduction than its cost.
Such a cost-benefit analysis requires a familiarity with the technology as
well as economics of viruses and virus detection.
Section II of this article reviews the principles of computer viruses and
virus detection technology. Section III presents an analytical framework
for the evaluation of a negligence cause of action in a virus context, including an analysis of legal and economic aspects of damages due to computer virus infection.
The dynamic nature of virus technology may complicate proof of
negligence liability. The central element of a negligence plaintiff’s litigation strategy is the cost-effective untaken precaution. Failure to take a
particular precaution may constitute breach, but the claim nevertheless
may fail on proximate cause grounds if, for instance, the virus evolved
unpredictably and caused an unforeseeable type of harm. An alternative
precaution may pass the actual and proximate cause hurdles but would
likely not be cost-effective, and therefore fail the breach-of-duty element.
Such interaction between the dynamic and volatile nature of virus technology and the legal principles of negligence may create a Catch-22 situation that leaves the virus victim without legal recourse. Section IV analyzes and discusses these and other complications to litigation strategy. A
final section discusses and concludes.
II.
OPERATION AND STRUCTURE OF COMPUTER VIRUSES
A. Background
Malevolent software is intended to cause damage to or disrupt the operation of a computer system. The most common of these rogue programs is
the computer virus. Other forms of malicious software include so-called
logic bombs, worms, Trojan horses, and trap doors.5
cause them to die. The duty is accordingly owed to the owners of cattle in the neighborhood
...."). Bulletin Boards, which allow downloading and uploading of software, are particularly
vulnerable to computer virus infection due to the sheer quantity of transactions performed
through Bulletin Board Systems. See, e.g., FITES ET AL., supra note 1, at 60.
5. See, e.g., DOROTHY E. DENNING & PETER J. DENNING, INTERNET BESIEGED 75-78 (1998).
The term "virus," Latin for "poison," was first formally defined by Dr.
Fred Cohen in 1983,6 even though the concept goes back to John von
Neumann's studies of self-replicating mathematical automata in the 1940s.7
Dr. Cohen describes a computer virus as a series of instructions (in other
words, a program) that (i) infects other computer programs and systems
by attaching itself to a host program in the target system, (ii) executes when
the host is executed, and (iii) spreads by cloning itself, or part of itself, and
attaching the copies to other host programs on the system or network. In
addition, many viruses have a so-called payload capable of harmful side effects, such as data corruption.8
A virus may infect a computer or a network through several possible
points of entry, including via an infected file downloaded from the Internet,
through Web browsing, via an infected e-mail attachment, or even through
infected commercial shrink-wrapped software.9 The recent trend in virus
transmission has been a decrease in infected diskettes and an increase in
infection through e-mail attachments. In a 1996 national survey, for instance, approximately 9 percent of respondents listed e-mail attachments
as the means of infection of their most recent virus incident, while 71
percent put the blame on infected diskettes. In 2003, the corresponding
numbers were 88 percent for e-mail attachments and zero for diskettes.10
As the definition suggests, computer viruses consist of three basic modules or mechanisms, namely an infection mechanism, a payload trigger, and
the payload. The infection mechanism allows the virus to replicate and
6. Fred Cohen, Computer Viruses (1985) (unpublished Ph.D. dissertation, University of
Southern California) (on file with the University of Southern California library).
7. Jeffrey O. Kephart et al., Fighting Computer Viruses, SCI. AM., Nov. 1997, at 55. Dr.
Gregory Benford published the idea of a computer virus as "unwanted code." Benford apparently wrote actual "viral" code, capable of replication. DENNING & DENNING, supra note 5,
at 74.
8. JOHN MCAFEE & COLIN HAYNES, COMPUTER VIRUSES, WORMS, DATA DIDDLERS, KILLER
PROGRAMS, AND OTHER THREATS TO YOUR SYSTEM 26; FREDERICK B. COHEN, A SHORT COURSE
ON COMPUTER VIRUSES 1-2 (2d ed. 1994). In his Ph.D. dissertation, Dr. Cohen defined a
virus simply as any program capable of self-reproduction. This definition appears overly general. A literal interpretation of the definition would classify even programs such as compilers
and editors as viral. DENNING & DENNING, supra note 5, at 75.
9. There are three mechanisms through which a virus can infect a program. A virus may
attach itself to its host as a shell, as an add-on, or as intrusive code. A shell virus forms a shell
around the host code so that the latter effectively becomes an internal subroutine of the virus.
The host program is replaced by a functionally equivalent program that includes the virus.
The virus executes first and then allows the host code to begin executing. Boot program
viruses are typically shell viruses. Most viruses are of the add-on variety. They become part
of the host by appending their code to the host code, without altering the host code. The
viral code alters the order of execution, by executing itself first and then the host code. Macro
viruses are typically add-on viruses. Intrusive viruses, in contrast, overwrite some or all of the
host code, replacing it with their own code. See, e.g., DENNING & DENNING, supra note 5, at
81; FITES ET AL., supra note 1, at 73-75.
10. INST. FOR COMPUTER SEC. & ADMIN., ICSA LABS 9TH ANNUAL COMPUTER VIRUS
PREVALENCE SURVEY 2003, Table 10, at 14, available at http://www.icslabs.com/2003avpsurvey/
index.shml.
spread, analogously to a biological virus. This is the most salient property
of a computer virus.11 The infection module first searches for an appropriate executable host program to infect. It then installs a copy of the virus
into the host, provided the host has not yet been infected.
When the host program executes, the virus is also executed. Upon execution, the virus typically performs the following sequence of actions. It
replicates (clones) by copying itself to other executable programs on the
computer.12 During execution, the virus program also checks whether a
triggering condition is satisfied. When the condition is satisfied, the virus
executes its harmful component, the so-called payload module. Triggering
events come in a variety of forms, such as a certain number of infections,
Michelangelo's birthday, or the occurrence of a particular date. The Friday-the-13th virus, for instance, only activates its payload on dates with the
cursed designation.13
Execution of the payload may produce harmful side effects, such as destruction or corruption of data in spreadsheets, word processing documents, and databases, and theft of passwords.14 Some effects are particularly
pernicious because they are subtle and undetectable until substantial harm
has been done: transposing numbers, moving decimal places, stealing passwords and other sensitive information.15 Payloads are not necessarily destructive and may involve no more than displaying a humorous message.16
Some virus strains do not destroy or corrupt information but consume
valuable computing resources.17
11. ROGUE PROGRAMS: VIRUSES, WORMS, TROJAN HORSES 247 (Lance J. Hoffman ed., 1990)
("The ability to propagate is essential to a virus program."); DENNING & DENNING, supra
note 5, at 73-75.
12. Potential target hosts include application and system programs and the master boot
record of the hard disks or floppy disks in the computer.
13. See, e.g., Eric J. Sinrod & William P. Reilly, Cyber Crimes: A Practical Approach to the
Application of Federal Computer Crime Laws, 16 SANTA CLARA COMPUTER & HIGH TECH. L.J.
177, 217 n.176 (2000).
14. JAN HRUSKA, COMPUTER VIRUSES AND ANTI-VIRUS WARFARE 17, 17-18 (1990) (In addition to self-replicating code, viruses often also contain a payload. The payload is capable of
producing malicious side effects.). See also COHEN, supra note 8, at 8-15 (examples of malignant viruses and what they do); MCAFEE & HAYNES, supra note 8, at 61.
15. MCAFEE & HAYNES, supra note 8, at 61.
16. Sinrod & Reilly, supra note 13, at 218 (describing the W95.LoveSong.998 virus, designed to trigger a love song on a particular date).
17. Viruses can cause economic losses, e.g., by filling up available memory space, slowing
down the execution of important programs, locking keyboards, adding messages to printer
output, and effectively disabling a computer system by altering its boot sector. The Melissa
virus, for instance, mailed copies of itself to everyone in the victim’s e-mail address book,
resulting in clogged e-mail servers and even system crashes. See, e.g., FITES ET AL., supra note 1,
at 23-24 ("The Christmas card [virus] stopped a major international mail system just by
filling up all available storage capacity."); Sinrod & Reilly, supra note 13, at 218 (describing
the Melissa virus).
See Section III(D), infra, for an analysis of damages from computer virus infection. For
examples of benign viruses and how they operate, see, e.g., COHEN, supra note 8, at 15-21.
It was once believed that viruses could not be transmitted by data files
such as e-mail attachments. Viruses such as the infamous Melissa taught
us otherwise. Melissa typically arrived in the e-mail inbox of its victim
disguised as an e-mail message with a Microsoft Word attachment. When
the recipient opened the attachment, Melissa executed. First, it checked
whether the recipient had the Microsoft Outlook e-mail program on its
computer. If so, Melissa would mail a copy of itself to the first fifty names
in Outlook’s address book, creating the appearance to the fifty new recipients that the infected person had sent them a personal e-mail message.
Melissa would then repeat the process with each of the fifty recipients of
the infected e-mail message (provided they had Outlook) by automatically
transmitting clones of itself to fifty more people.18 A Melissa attack frequently escalated and resulted in clogged e-mail servers and system crashes.
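The escalation described above is geometric: each generation of infected machines mails fifty new copies. A toy calculation (assuming, unrealistically, that every recipient has Outlook, opens the attachment, and that address books never overlap) shows why mail servers clogged so quickly:

```python
def messages_sent(fanout=50, generations=4):
    """Cumulative messages sent after a number of generations of a
    worm that mails itself to `fanout` addresses per infection."""
    total, infected = 0, 1
    for _ in range(generations):
        sent = infected * fanout  # each infected machine mails out copies
        total += sent
        infected = sent           # every recipient becomes a sender
    return total

print(messages_sent(generations=1))  # 50
print(messages_sent(generations=4))  # 6377550
```

Four generations of uninterrupted spread yield over six million messages; real-world overlap between address books slows this, but the order of magnitude explains the server crashes.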
B. Technical Antivirus Defenses
Antivirus technology comes in two broad categories: virus-specific and generic. Virus-specific technology, such as signature scanners, detects known
viruses by identifying patterns that are unique to each virus strain. These
"identifying patterns" are analogous to human fingerprints. Generic technology detects the presence of a virus by recognizing generic viruslike
behavior, usually without identifying the particular strain.
A virus-specific scanner typically makes a specific announcement, such
as that "the operating system is infected with (say) the Cascade virus," while
its generic counterpart may simply say, "the operating system is (or may
be) infected with an (unidentified) virus." Virus-specific technology is more
accurate and produces fewer false positives, but generic technology is better
at detecting unknown viruses. Heuristic techniques combine virus-specific
scanning with generic detection, providing a significantly broadened range
of detection.
Technical antivirus defenses come in four varieties, namely scanners,
activity monitors, integrity checkers, and heuristic techniques.19 Scanners
are virus-specific, while activity monitors and integrity checkers are generic. Activity monitors look out for suspicious, viruslike activity in the
computer. Integrity checkers sound an alarm when they detect suspicious
modifications to computer files.
1. Scanners
Scanners are the most widely used antivirus defense. A scanner reads executable files and searches for known virus patterns. These patterns, or "signatures," are the most reliable technical indicator of the presence of a virus
in a computer system. A virus signature consists of patterns of hexadecimal
digits embedded in the viral code that are unique to the strain.20 These
signatures are created by human experts, such as researchers at IBM's High
Integrity Computing Laboratory, who scrutinize viral code and extract sections of code with unusual patterns. The selected byte patterns then constitute the signature of the virus.21 The scanner announces a match with
its database of known viral signatures as a possible virus.
18. DAVID HARLEY ET AL., VIRUSES REVEALED: UNDERSTAND AND COUNTER MALICIOUS
SOFTWARE 406-10 (2001).
19. See, e.g., DENNING & DENNING, supra note 5, at 90-93; DUNHAM, supra note 1, at 78-83, 102-08.
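The signature-scanning approach just described can be sketched in a few lines. The byte patterns below are invented for illustration (they are not real virus signatures), and a production scanner uses far more efficient multi-pattern search than a simple substring test:

```python
# Hypothetical signature database: strain name -> unique byte pattern,
# expressed as hexadecimal digits as in the text.
SIGNATURES = {
    "Demo.A": bytes.fromhex("deadbeef41424344"),
    "Demo.B": bytes.fromhex("cafebabe90909090"),
}

def scan(executable: bytes) -> list:
    """Report every known strain whose signature occurs in the file,
    mimicking a virus-specific scanner."""
    return [name for name, sig in SIGNATURES.items() if sig in executable]

clean = bytes(64)
infected = bytes(16) + bytes.fromhex("deadbeef41424344") + bytes(16)
print(scan(clean))     # []
print(scan(infected))  # ['Demo.A']
```

Note that the scanner names the exact strain it finds, which is the virus-specific precision the text contrasts with generic detection; it also finds nothing in a file infected by a strain absent from its database.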
The virus signature pattern is selected to be a reliable indicator of the
presence of a virus. An ideal virus signature gives neither false negatives
nor false positives.22 In other words, it should ideally always identify the
virus when present and never give a false alarm when it is not.23 The IBM
High Integrity Computing Laboratory has developed an optimal statistical
signature extraction technique that examines all sections of code in a virus
and selects the byte strings that minimize the incidence of false positives
and negatives.24
Scanners are easy to use, but they are limited to detecting known virus
signatures. A scanner’s signature database has to be continually updated, a
burdensome requirement in an environment where new viruses appear rapidly. Use of scanners is further complicated by the occurrence of false positives. This occurs when a viral pattern in the database matches code that
is in reality a harmless component of otherwise legitimate data. A short
and simple signature pattern will be found too often in innocent software
and produce many false positives. Viruses with longer and more complex
patterns will less often give a false positive, but at the expense of more false
negatives.25 Finally, as the number of known viruses grows, the scanning
process will inevitably slow down as a larger set of possibilities has to be
evaluated.26
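The tradeoff between short and long signatures noted above can be made concrete with a back-of-the-envelope estimate. Under the crude assumption that innocent software looks like uniformly random bytes, the expected number of chance matches of a k-byte signature in an N-byte corpus is roughly N / 256^k:

```python
def expected_false_matches(sig_len_bytes, corpus_bytes):
    """Expected chance occurrences of a k-byte pattern in N bytes of
    'innocent' code, modeled crudely as uniform random bytes."""
    return corpus_bytes / (256 ** sig_len_bytes)

# Over 1 GB of software, a 4-byte signature produces roughly 0.23
# chance matches, while an 8-byte signature is astronomically
# unlikely to match innocent code by accident.
print(expected_false_matches(4, 10**9))
print(expected_false_matches(8, 10**9))
```

Real code is far from uniformly random, so actual false-positive rates are higher, but the estimate captures why longer signatures reduce false positives at the cost of being harder to find in every variant of a strain.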
20. HRUSKA, supra note 14, at 42.
21. Jeffrey O. Kephart et al., Automatic Extraction of Computer Virus Signatures, in PROCEEDINGS OF THE 4TH VIRUS BULLETIN INTERNATIONAL CONFERENCE 179-94 (R. Ford ed., 1994), available at http://www.research.ibm.com/antivirus/SciPapers/Kephart/VB94/vb94.html,
at 2.
22. A false positive is an erroneous report of the activity or presence of a virus where there
is none. A false negative is the failure to report the presence of a virus when a virus is in fact
present.
23. HRUSKA, supra note 14, at 42. For short descriptions and hexadecimal patterns of selected known viruses, see id. at 43-52; Kephart et al., supra note 1, at 11 ("[A] signature
extractor must select a virus signature carefully to avoid both false negatives and false positives.
That is, the signature must be found in every instance of the virus, and must almost never
occur in uninfected programs."). False positives have reportedly triggered a lawsuit by a
software vendor, who felt falsely accused, against an antivirus software vendor. Id.
24. Kephart et al., supra note 21, at 179-94.
25. DUNHAM, supra note 1, at 78-83; Kephart et al., supra note 7. See also Sandeep Kumar
& Eugene H. Spafford, A Generic Virus Scanner in C++, Technical Report CSD-TR-92062, Dep't of Computer Science, Purdue University, at 6-8, available at ftp://ftp.cerias.
purdue.edu/pub/papers/sandeep-kumar/kumar-spaf-scanner.pdf.
26. See, e.g., Pete Lindstrom, The Hidden Costs of Virus Protection, SPIRE RES. REV. 5 (June
130
Tort Trial & Insurance Practice Law Journal, Fall 2004 (40:1)
2. Activity Monitors
Activity monitors are resident programs that monitor activities in the computer for behavior commonly associated with viruses. Suspicious activities
include operations such as attempts to rewrite the boot sector, format a
disk, or modify parts of main memory. When suspicious activity is detected,
the monitor may simply halt execution and issue a warning to alert the
user, or take definite action to neutralize the activity.27 Activity monitors, unlike scanners, do not need to know the signature of a virus to detect it. They work for all viruses, known as well as unknown; their function is to recognize suspicious behavior, regardless of the identity of the culprit.
The greatest strength of activity monitors is their ability to detect unknown virus strains, but they also have significant weaknesses. They can
only detect viruses that are actually being executed, possibly after substantial harm has been done. A virus, furthermore, may become activated before the monitor code and thus escape detection until well after execution.
A virus also may be programmed to alter monitor code on machines that
do not have protection against such modification. A further disadvantage
of activity monitors is the lack of unambiguous and foolproof rules governing what constitutes suspicious activity. This may result in false alarms
when legitimate activities resemble viruslike behavior. Recurrent false alarms
ultimately may lead users to ignore warnings from the monitor. Conversely,
not all illegitimate activity may be recognized as such, leading to false
negatives.28
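The monitoring logic described above can be sketched as follows. The operation names and the rule set are invented for illustration; a real monitor hooks interrupts or system calls rather than reading a prepared list:

```python
# Toy activity monitor: watch a stream of operations and halt at the
# first one matching a rule for virus-like behavior.
SUSPICIOUS = {"write_boot_sector", "format_disk", "patch_resident_memory"}

def monitor(operations):
    """Run operations in order; stop and warn at the first suspicious one.
    Returns (operations performed, warning or None)."""
    performed = []
    for op in operations:
        if op in SUSPICIOUS:
            return performed, f"WARNING: suspicious operation blocked: {op}"
        performed.append(op)
    return performed, None

done, warning = monitor(["open_file", "read_file", "write_boot_sector", "exit"])
print(done)     # ['open_file', 'read_file']
print(warning)  # WARNING: suspicious operation blocked: write_boot_sector
```

Note that a legitimate disk utility formatting a floppy would trigger the same rule, which is the false-alarm weakness the text identifies.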
3. Integrity Checkers
Integrity checkers look for unauthorized changes in system areas and files.
The typical integrity checker is a program that generates a code, known as
a checksum, for files that are to be protected from viral infection. A file
checksum, for instance, may be some arithmetic calculation based on the
total number of bytes in the file, the numerical value of the file size, and
the creation date. The checksum effectively operates as a signature of the
file. These checksums are periodically recomputed and compared to the
original checksum. Tampering with a file will change its checksum. Hence,
if the recomputed values do not match the original checksum, the file has
presumably been modified since the previous check and a warning is issued.
2003) ("In this day of 80,000+ known viruses and frequent discovery of new ones, the size of the signature file can be large, particularly if the updates are sent out as cumulative ones. Large updates can clog the network pipelines ... and reduce the frequency that an administrator will push them out to the end users.").
27. Kumar & Spafford, supra note 25, at 3-4.
28. HRUSKA, supra note 14, at 75.
Computer Viruses and Civil Liability
131
Since viruses modify and change the contents of the files they infect, a
change in the checksum may be a sign of viral infection.29
The advantage of integrity checking is that it detects most instances of
viral infection, as infection must alter the file being infected. The main
drawback is that it tends to generate many false alarms, as a file can change
for legitimate reasons unrelated to virus infection.30 On some systems, for
instance, files change whenever they are executed. A relatively large number of false alarms may trigger compliance lapses, as users may ignore
warnings or simply not use the utility. Integrity checking works best on
static files, such as system utilities, but is, of course, inadequate for files
that naturally change frequently, such as Word documents.
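The baseline-and-recheck cycle described above can be sketched as follows. The text's simple arithmetic checksum is easy for a virus to preserve, so this sketch substitutes a SHA-256 digest; the workflow (record a checksum, recompute it later, compare) is the same either way, and the file name and contents are invented:

```python
import hashlib

def checksum(contents: bytes) -> str:
    """A signature of a file's contents; any modification changes it."""
    return hashlib.sha256(contents).hexdigest()

# Baseline pass: record a checksum for each protected (static) file.
baseline = {"utility.exe": checksum(b"original program bytes")}

def recheck(name: str, contents: bytes) -> bool:
    """True if the file still matches its recorded baseline checksum."""
    return checksum(contents) == baseline[name]

print(recheck("utility.exe", b"original program bytes"))                 # True
print(recheck("utility.exe", b"original program bytes" + b"<payload>"))  # False
```

A file that legitimately changes between checks fails the comparison just as an infected one does, which is why the technique suits static files best.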
4. Heuristic Detection Methods
A fourth category of virus detectors uses heuristic detection methods. Heuristic rules are rules that solve complex problems fairly well and fairly
quickly, but less than perfectly. Virus detection is an example of a complex
problem that is amenable to heuristic solution. It has been proven mathematically that it is impossible to write a program that is capable of determining with 100 percent accuracy whether a particular program is infected
with a virus, from the set of all possible viruses, known as well as unknown.31 Heuristic virus detection methods accept such limitations and
attempt to achieve a solution, namely a detection rate that is acceptable,
albeit below the (unachievable) perfect rate.
Heuristic virus detection methods examine executable code and scrutinize its structure, logic, and instructions for evidence of viruslike behavior.
Based on this examination, the program makes an assessment of the likelihood that the scrutinized program is a virus, by tallying up a score. Instructions to send an e-mail message with an attachment to everyone in an
address book, for instance, would add significantly to the score. Other
high-scoring routines include capabilities to replicate, hide from detection,
and execute some kind of payload. When a certain threshold score is
reached, the code is classified as malevolent and the user so notified.
The assessment is necessarily less than perfect and occasionally provides
false positives and negatives. Many legitimate programs, including even
29. FITES ET AL., supra note 1, at 69-76 (Figures 5.2-5.5); DUNHAM, supra note 1, at 79. See also Kumar & Spafford, supra note 25, at 5-6.
30. FITES ET AL., supra note 1, at 125.
31. Diomidis Spinellis, Reliable Identification of Bounded-Length Viruses Is NP-Complete, 49:1 IEEE TRANSACTIONS ON INFORMATION THEORY 280, 282 (Jan. 2003) (stating that theoretically perfect detection is in the general case undecidable, and for known viruses, NP-complete); Nachenberg, supra note 1; see also Francisco Fernandez, Heuristic Engines, in PROCEEDINGS OF THE 11TH INTERNATIONAL VIRUS BULLETIN CONFERENCE 407-44 (Sept. 2001); David M. Chess & Steve R. White, An Undetectable Computer Virus, at http://www.research.ibm.com/antivirus/SciPapers/VB2000DC.htm.
some antivirus programs, perform operations that resemble viruslike behavior.32 Nevertheless, state-of-the-art heuristic scanners typically achieve a 70 percent to 80 percent success rate at detecting unknown viruses.33
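The score-and-threshold logic described above can be sketched as follows. The feature names, weights, and threshold are invented for illustration; a real heuristic engine scores disassembled code structure, not a prepared feature list:

```python
# Toy heuristic classifier: tally a suspicion score over observed
# capabilities and flag the code once a threshold is crossed.
WEIGHTS = {
    "mails_itself_to_address_book": 50,  # high-scoring routine, per the text
    "self_replication": 40,
    "hides_from_detection": 30,
    "payload_trigger": 20,
}
THRESHOLD = 60

def heuristic_score(features: set[str]) -> int:
    """Sum the weights of the suspicious capabilities observed."""
    return sum(WEIGHTS.get(f, 0) for f in features)

def classify(features: set[str]) -> str:
    """Classify code as malevolent once its score reaches the threshold."""
    return "malevolent" if heuristic_score(features) >= THRESHOLD else "clean"

print(classify({"mails_itself_to_address_book", "self_replication"}))  # malevolent
print(classify({"payload_trigger"}))                                   # clean
```

Setting the threshold trades false positives against false negatives, which is why the assessment is necessarily imperfect.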
A heuristic scanner typically operates in two phases. The scanning algorithm first narrows the search by, for instance, identifying the location
most likely to contain a virus. It then analyzes the code from that location
to determine its likely behavior upon execution. A static heuristic scanner,
for instance, compares the code from the most likely location to a database
of byte sequences commonly associated with viruslike behavior.34 The algorithm then decides whether to classify the code as viral.35
A dynamic heuristic scanner uses central processing unit (CPU)36 emulation. It typically loads suspect code into a virtual computer, emulates its
execution, and observes its behavior. Because it is only a virtual computer,
viruslike behavior can safely be observed in what is essentially a laboratory
setting, with no need to be concerned about real damage. The program is
monitored for suspicious behavior while it runs.37
Although dynamic heuristics can be time-consuming due to the relatively slow CPU emulation process, they are sometimes superior to static
heuristics. This will be the case when the suspect code (i) is obscure and
not easily recognizable as viral in its static state but (ii) clearly reveals its
viral nature in a dynamic state.
A major advantage of heuristic scanning is its ability to detect viruses,
including unknown strains, before they execute and cause damage. Other
generic antivirus technologies, such as behavior monitoring and integrity
checking, can only detect and eliminate a virus after exhibition of suspicious
behavior, usually after execution. Heuristic scanning is also capable of detecting novel and unknown virus strains, the signatures of which have not
yet been catalogued. Such strains cannot be detected by conventional scanners, which only recognize known signatures. Heuristic scanners are capable of detecting even polymorphic viruses, a complex virus family whose members complicate detection by changing their signatures from infection to infection.38
32. Fernandez, supra note 31, at 409 ("Many genuine programs use sequences of instructions that resemble those used by viruses. Programs that use low-level disk access methods,
TSRs, encryption utilities, and even anti-virus packages can all, at times, carry out tasks that
are performed by viruses.").
33. Nachenberg, supra note 1, at 7.
34. Certain byte sequences, for instance, are associated with decryption loops that unscramble a polymorphic virus when an infected routine is executed. If the scanner finds a match (e.g., it detects the presence of a decryption loop typical of a polymorphic virus), it catalogues this behavior.
35. Kumar & Spafford, supra note 25, at 4-5 ("Detection by static analysis/policy
adherence.").
36. The CPU, or central processing unit, of a computer is responsible for data processing and computation. See, e.g., HRUSKA, supra note 14, at 115; D. BENDER, COMPUTER LAW: EVIDENCE AND PROCEDURE § 2.02, at 2-7, 9 (1982).
37. Kumar & Spafford, supra note 25, at 4.
38. Polymorphic viruses have the ability to "mutate" by varying the code sequences written
The explosive growth in new virus strains has made reliable detection and
identification of individual strains very costly, making heuristics more important and increasingly prevalent.39 Commercial heuristic scanners include
IBM's AntiVirus boot scanner and Symantec's Bloodhound technology.
We now turn to a formal analysis of negligence in a virus context.
III. VIRUS INFECTION AS NEGLIGENCE CAUSE OF ACTION
A product, a service, or conduct cannot and does not have to be perfectly
safe to avoid liability. Society does not benefit from products that are excessively safe, such as bugfree software and automobiles built like armored
cars and limited to top speeds of twenty miles per hour. Even if bugfree
software were feasible, the resources consumed in achieving it would make
the product prohibitively expensive when it is finally released, and also
likely obsolete.
Society does not benefit from products that are too risky either. Society
benefits most from an optimal level of safety.40 In this section, we explore
the legal meaning of these concepts and the closely related question: how
safe does a product, including an intangible such as a computer program,
have to be to avoid liability?
Any risk in principle can be reduced or eliminated, at a cost. For many
risks, this cost exceeds the benefit of the risk reduction. We call such risks
"unavoidable." Risks that, on the other hand, can be reduced at a cost less
than the benefit of the reduction are called "avoidable." Unavoidable risks
provide a net benefit to society and, as a matter of public policy, should
not be eliminated. The converse is true in the case of avoidable risks.
The law of negligence recognizes this distinction and limits liability to
harm caused by avoidable risks. The primary legal meaning of the term
negligence is conduct that is unreasonably risky; in other words, conduct
that imposes an avoidable risk on society.41
to target files. To detect such viruses requires a more complex algorithm than simple pattern
matching. See, e.g., DENNING & DENNING, supra note 5, at 89.
39. Nachenberg, supra note 1, at 9.
40. BENDER, supra note 36, at 8-41 to 8-42 n.108; C. CHO, AN INTRODUCTION TO SOFTWARE QUALITY CONTROL 4, at 12-13 (1980) (a software provider is under a duty to invest resources in program debugging only up to the point where the cost of additional debugging would outweigh the benefits of further error reduction); Thomas G. Wolpert, Product Liability and Software Implicated in Personal Injury, DEF. COUNS. J. 519, 523 (Oct. 1993) ("By the time a product is completely debugged, or nearly so, most likely it is obsolete."). See also IVARS PETERSON, FATAL DEFECT 166 (1995) ("We live in an imperfect world ... Absolute safety, if attainable, would ... cost more than it's worth.").
41. PROSSER AND KEETON ON THE LAW OF TORTS, supra note 3, § 31; DOBBS, supra note 3, at 275 ("Negligence is conduct that creates or fails to avoid unreasonable risks of foreseeable harm to others."). The term also refers to the cause of action, namely the legal rules and procedures that govern a negligence lawsuit. Id. at 269.
The remainder of this section discusses and analyzes the legal principles
that define the dividing line between avoidable and unavoidable risks, and
applies the principles in the context of computer virus infection.42
The plaintiff in a negligence action has to prove the following elements
to establish his or her claim.
1. A legal duty on the part of the defendant not to expose the plaintiff to
unreasonable risks.
2. A breach of the duty; namely, a failure on the part of the defendant to
conform to the norm of reasonableness.
3. A causal connection between defendant’s conduct and plaintiff’s harm. This
element includes actual as well as proximate cause. Defendant’s negligence
is the actual cause of the plaintiff’s harm if, but for the negligence, the harm
would not have occurred. Proximate causation means that the defendant’s
conduct must be reasonably closely related to the plaintiff’s harm.
4. Actual damage resulting from the defendant’s negligence.
We now turn to an analysis of these elements in a computer virus context.
A. Duty
The first step in a negligence analysis considers whether the defendant had
a duty to the plaintiff to act with due care or, conversely, whether the
plaintiff is entitled to protection against the defendant’s conduct.43 But how
and where do we draw the line that divides the plaintiffs who are entitled
to such protection from those who are not? Professor Richard Epstein
phrases the rhetorical question, "[w]ho, then, in law, is my neighbor?" He
finds an answer in Donoghue v. Stevenson: My neighbors are "persons who
are so closely and directly affected by my act that I ought reasonably to
have them in contemplation as being so affected when I am directing my
mind to the acts or omissions which are called in question."44
The courts frequently analyze the duty issue as a matter of public policy.
A defendant has a duty to the plaintiff if a balancing of policy considerations dictates that the plaintiff is entitled to legal protection against the defendant's conduct.45 The policy benchmark is based on fairness under the
42. Liability for intentional transmission of a virus is governed by criminal law. A software
provider who intentionally transmits a computer virus with the purpose of stealing, destroying, or corrupting data in the computer of his competitor may be prosecuted under criminal
statutes such as the Computer Fraud and Abuse Act, 18 U.S.C. § 1030. This act is the principal
federal statute governing computer-related abuses, such as transmission of harmful code.
43. PROSSER AND KEETON ON THE LAW OF TORTS, supra note 3, at 357 n.14.
44. Donoghue v. Stevenson, [1932] App. Cas. 562, 580 (H.L. Scot. 1932) (cited in RICHARD A. EPSTEIN, SIMPLE RULES FOR A COMPLEX WORLD 196 (1995)).
45. Brennen v. City of Eugene, 591 P.2d 719 (Or. 1979); Bigbee v. Pac. Tel. & Tel. Co.,
183 Cal. Rptr. 535 (Ct. App. 1982); PROSSER AND KEETON ON THE LAW OF TORTS, supra
note 3, at 358 ("[D]uty is not sacrosanct in itself, but is only an expression of the sum total
of those considerations of policy which lead the law to say that the defendant is entitled to
protection.").
contemporary standards of a reasonable person.46 Prosser succinctly summarizes, "[n]o better general statement can be made than that the courts
will find a duty where, in general, reasonable persons would recognize it
and agree that it exists."47
In fleshing out the reasonable person policy benchmark of duty, courts
consider factors such as the relationship between the parties, the nature of
the risk, the opportunity and ability to take care, the public interest,48 and
whether the defendant created the risk that caused the loss.49
Courts are more likely to recognize a duty in cases where the defendant
possesses a "special relationship" with the plaintiff.50 A common carrier,
for instance, has a duty to aid a passenger in trouble, an innkeeper to aid
a guest, and an employer to aid an employee injured or endangered in the
course of his employment.51 The law does not, however, impose a general
duty to aid another human being who is in grave, even mortal, danger. A
champion swimmer, for instance, is not required to help a child drowning
before his eyes, nor is anyone required to warn someone about to stick his
hand into a milling machine.52
Given the high level of awareness and publicity surrounding virus attacks
and computer security, courts are likely to find that software providers and
distributors generally do have a duty not to impose an unreasonable risk
of viral infection on those foreseeably affected.53 A software provider, for
instance, who invites customers to download a software product from a
commercial website creates a risk that the software may contain a virus.
46. Casebolt v. Cowan, 829 P.2d 352, 356 (Colo. 1992) ("The question whether a duty
should be imposed in a particular case is essentially one of fairness under contemporary
standards--whether reasonable persons would recognize a duty and agree that it exists."). See also Hopkins v. Fox & Lazo Realtors, 625 A.2d 1110 (N.J. 1993) ("Whether a person owes a
duty of reasonable care toward another turns on whether the imposition of such a duty satisfies
an abiding sense of basic fairness under all of the circumstances in light of considerations of
public policy.").
47. PROSSER AND KEETON ON THE LAW OF TORTS, supra note 3, at 359.
48. Hopkins, 625 A.2d at 1110.
49. Weirum v. RKO Gen., Inc., 15 Cal. 3d 40, 46 (1975).
50. Lopez v. S. Cal. Rapid Transit Dist., 710 P.2d 907, 911 (Cal. 1985); see also Tarasoff v. Regents of Univ. of Cal., 551 P.2d 334, 342 (Cal. 1976).
51. PROSSER AND KEETON ON THE LAW OF TORTS, supra note 3, at 376, 377 nn.32-42.
52. Handiboe v. McCarthy, 151 S.E.2d 905 (Ga. Ct. App. 1966) (no duty to rescue child
drowning in swimming pool); Chastain v. Fuqua Indus., Inc., 275 S.E.2d 679 (Ga. Ct. App.
1980) (no duty to warn child about dangerous defect in lawn mower).
53. FITES ET AL., supra note 1, at 141, 142 (Bulletin Board System operators provide a forum for exchange of information, data, and software. Hence, a BBS operator may have a duty to screen uploaded software for malicious components or, at least, warn users to use caution in using downloaded software.); Palsgraf v. Long Island R.R. Co., 162 N.E. 99 (N.Y. 1928) (establishing the precedent that a duty is extended only to those foreseeably affected). See also David L. Gripman, The Doors Are Locked but the Thieves and Vandals Are Still Getting In: A Proposal in Tort to Alleviate Corporate America's Cybercrime Problem, 16 J. MARSHALL J. COMPUTER & INFO. L. 167, 170 (1997).
Everyone who downloads the software is within the scope of the risk of
virus infection and may have a cause of action if harmed by a virus.
B. Breach
"Breach of duty" refers to a violation of the duty to avoid unreasonable
risks of harm to others. The legal standard of reasonableness against which
the defendant’s conduct is to be measured is known as the "reasonable
person" standard. The reasonable person standard imposes on all people
the duty to "exercise the care that would be exercised by a reasonable and
prudent person under the same or similar circumstances to avoid or minimize reasonably foreseeable risks of harms to others."54
Courts have interpreted the reasonable person standard in three broad
ways.55 First, the reasonable person is endowed with characteristics, such
as a certain level of knowledge and ability. The reasonable person has shortcomings that the community would tolerate but is otherwise a model of
propriety and personifies the community ideal of appropriate behavior. He
is allowed to forget occasionally, for instance, but is presumed never to do
something "unreasonable" such as crossing the street on a red light at a busy intersection.56 The defendant's conduct is then compared to that
which can be expected from this hypothetical reasonable person. The defendant is considered to be in breach of her duty of due care if her conduct
does not measure up to this standard.
Under a second interpretation of the reasonable person standard, a court
may adopt rules of conduct, the violation of which is considered prima
facie negligence. Violation of a statute, such as a speed limit, is an example
of prima facie negligence.
Finally, courts define the reasonableness of a risk in terms of a balance of its costs and benefits.57 Under the cost-benefit approach, avoidable risks
that can be eliminated cost-effectively are considered unreasonable. Failure
to eliminate or reduce such risks constitutes a breach of duty. When harm
results from an unavoidable risk, on the other hand, the defendant escapes
liability.58
Professor Henry Terry appears to have been the first to define reasonableness of conduct in terms of a cost-benefit balancing.59 This approach
is an analytical embodiment of the reasonable person standard, and has
54. O.W. HOLMES, THE COMMON LAW (1881) (the negligence standard is objective, "based
on the abilities of a reasonable person, and not the actual abilities of individuals").
55. See generally DOBBS, supra note 3, at 279.
56. PROSSER AND KEETON ON THE LAW OF TORTS, supra note 3, § 32.
57. DOBBS, supra note 3, at 279.
58. PROSSER AND KEETON ON THE LAW OF TORTS, supra note 3, § 29 ("[A]n accident is
considered unavoidable or inevitable at law if it was not proximately caused by the negligence
of any party to the action, or to the accident.").
59. Henry Terry, Negligence, 29 HARV. L. REV. 40 (1915).
become part of mainstream negligence analysis.60 In fact, this is how courts actually decide negligence cases.61 Cost-benefit balancing applies naturally in a virus context, and the availability of cost-benefit models of viruses and antivirus defenses in the computer security literature makes it logical and feasible.62
Courts apply the cost-benefit approach in a negligence case by focusing
on precautions the defendant could have taken but did not.63 The courts impose on the negligence plaintiff the burden to specify an untaken precaution that would have prevented the accident, if taken. The defendant will
then be considered negligent if the benefits of risk reduction provided by
the pleaded precaution exceed its cost.64
The role of the untaken precaution in negligence law is well illustrated
in Cooley v. Public Service Co.65 In Cooley, the plaintiff suffered harm from a
loud noise over a telephone wire. She suggested two untaken precautions
that would have prevented the harm, namely (i) a strategically positioned
wire mesh basket and (ii) insulating the wires. The court ruled that neither
untaken precaution constituted a breach of duty. Both precautions would
have increased the risk of electrocution to passersby sufficiently to outweigh the benefits in harm reduction.
In a negligence case, more than one untaken precaution may have greater
benefits than costs, and the plaintiff may allege several precautions in the
alternative. The court may base a finding of negligence on one or more of
the pleaded untaken precautions.66 The Cooley court noted that there may
60. DOBBS, supra note 3, at 267.
61. Mark F. Grady, Untaken Precautions, 18 J. LEGAL STUD. 139 (1989) (courts actually
decide negligence cases by balancing the costs and benefits of the untaken precaution).
62. See, e.g., Fred Cohen, A Cost Analysis of Typical Computer Viruses and Defenses, in COMPUTERS & SEC. 10 (1991).
63. Grady, supra note 61, at 139. The "untaken precautions" approach is how courts actually decide negligence cases. The positive economic theory of breach of duty posits that negligence law aims to minimize social cost. Under this theory, a software provider would escape liability by taking the cost-minimizing amount of precaution. The global social cost-minimization approach is a theoretical idealization, while the untaken precautions approach is a more realistic description of how courts actually determine negligence.
The seminal articles on the positive economic theory of negligence include John Brown,
Toward an Economic Theory of Liability, 2 J. LEGAL STUD. 323 (1973); W. Landes & R. Posner, A Theory of Negligence, 1 J. LEGAL STUD. 29 (1972); S. Shavell, Strict Liability versus Negligence, 9 J. LEGAL STUD. 1 (1980).
64. Grady, supra note 61, at 139, 143 (1989) (the courts "take the plaintiff’s allegations of
the untaken precautions of the defendant and ask, in light of the precautions that had been
taken, whether some particular precaution promised benefits (in accident reduction) greater
than its associated costs"); Delisi v. St. Luke’s Episcopal-Presbyterian Hosp., Inc., 701 S.W.2d
170 (Mo. Ct. App. 1985) (plaintiff had to prove physician’s breach of duty by specifying the
antibiotic he should have been given).
65. 10 A.2d 673 (N.H. 1940).
66. In Bolton v. Stone, [1951] App. Cas. 850 H.L., the plaintiff was hit by a cricket ball
and pleaded three untaken precautions, namely failure to erect a sufficient fence, failure to
place the cricket pitch further from the road, and failure to prevent cricket balls from falling
into the road.
exist a cost-effective precaution, other than the ones actually pleaded, that
would have satisfied the breach requirement. It is, however, the plaintiff’s
burden to identify and plead such a precaution, if indeed it exists.
The cost-benefit approach was first formally adopted by the courts in
a decision by Judge Learned Hand, in United States v. Carroll Towing Co.67
In Carroll Towing, a barge broke loose and caused an accident. The accident
could have been avoided if, for instance, the owner of the barge had had
an employee on board who could have prevented the barge from breaking
away. According to Judge Hand, "the owner's duty ... to provide against resulting injuries is a function of three variables: (1) The probability that [the barge] will break away; (2) the gravity of the resulting injury, if she does; [and] (3) the burden of adequate precautions."68
Denoting the burden of precaution by B, the amount of harm by L, and the probability of harm by P, Judge Hand provided his celebrated formula: liability would be imposed if B is less than the product of P and L; in other words, when the burden of precaution is less than the expected damages avoided.69
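The formula reduces to a one-line predicate. The figures in the usage lines are invented for illustration; they are not the actual amounts at issue in Carroll Towing:

```python
def breach_under_hand_formula(b: float, p: float, loss: float) -> bool:
    """Hand formula: a breach of duty exists if the burden B of the untaken
    precaution is less than the expected harm avoided, P * L."""
    return b < p * loss

# A bargee costing $50/day against a 1% daily chance of a $10,000 accident:
print(breach_under_hand_formula(50, 0.01, 10_000))   # True: 50 < 100, breach
print(breach_under_hand_formula(150, 0.01, 10_000))  # False: 150 > 100, no breach
```

The comparison captures the avoidable/unavoidable distinction drawn earlier: a risk is avoidable, and its realization negligent, precisely when B < P x L.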
The negligence calculus weighs the cost of an untaken precaution against
the value of the reduction in all foreseeable risks that the precaution would
have achieved, not just the risk that actually materialized.70 In Judge Hand's
assessment, the benefit of the reduction in all foreseeable risks that would
have resulted from having a bargee on board exceeded the cost of the
67. 159 F.2d 169 (2d Cir. 1947).
68. Judge Hand summarized the principles of negligence in Carroll Towing: "Since there are occasions when every vessel will break away ... and ... become a menace to those about her, the owner's duty ... to provide against resulting injuries is a function of three variables: (1) The probability that she will break away; (2) the gravity of the resulting injury if she does; (3) the burden of adequate precautions." Denoting the probability by P, the injury by L, and the burden by B, liability depends on whether B is less than P times L. Id. at 173.
69. See also Indiana Consol. Ins. Co. v. Mathew, 402 N.E.2d 1000 (Ind. Ct. App. 1980)
(court discussed the factors involved in negligence analysis, without formally quantifying
them, to reach decision that defendant’s action was reasonable).
70. See, e.g., RESTATEMENT (SECOND) OF TORTS § 281(b), cmt. e (1965): "Conduct is negligent because it tends to subject the interests of another to an unreasonable risk of harm.
Such a risk may be made up of a number of different hazards, which frequently are of a more
or less definite character. The actor’s negligence lies in subjecting the other to the aggregate
of such hazards."
See also In re Polemis & Furness, Withy & Co., [1921] 3 K.B. 560 (C.A.). In Polemis, the
defendant’s workman dropped a plank into the hold of a ship, causing a spark that caused an
explosion of gasoline vapor. The resultant fire destroyed the ship and its cargo. The arbitrators
found that the fire was an unforeseeable consequence of the workman’s act but that there was
nevertheless a breach of duty. The key to the finding of negligence is the fact that courts base
their analysis of untaken precautions on a balancing of all foreseeable risks (not just the risk
that materialized) against the cost of the untaken precaution. In finding for the plaintiff in
Polemis, Lord Justice Scrutton stated, "[i]n the present case it was negligent in discharging
cargo to knock down the planks of the temporary staging, for they might easily cause some
damage either to workmen, or cargo, or the ship [by denting it]." Id. at 577.
bargee. The barge owner therefore breached his duty of due care by failing
to have a bargee on board.
Like general errors, virus strains can be classified as avoidable or unavoidable. The transmission of a virus strain that a reasonably careful provider would detect and eliminate is an avoidable strain; an unavoidable
strain is one that even due care would not have prevented. An example of
an unavoidable virus is an unknown, complex strain that could only be
detected and eliminated at unreasonably high cost, by, for instance, implementing expensive and sophisticated scanning techniques based on artificial intelligence technology. If the computing environment is such that the
stakes are not particularly high, it may not be cost-effective to acquire and
implement the expensive technology required to detect such a complex
virus.
The universe of all virus strains therefore can be divided into an avoidable and an unavoidable subset, as illustrated in the following diagram.
[Diagram: the set of all virus strains, divided into an avoidable subset and an unavoidable subset.]
The following numerical example illustrates application of the cost-benefit principle to prove breach of duty in a virus context. A hypothetical commercial software provider uses a signature scanner71 to scan for viruses
in her software products. A virus escapes detection and finds its way into
a product sold to a customer. The virus causes harm in the computer system
of the customer. The culprit virus is a novel strain that has been documented
fairly recently for the first time. It was not detected because its signature was
not included in the database of the software provider’s scanner.
The customer contemplates a negligence lawsuit. She must prove the
defendant software provider’s breach of duty by showing that the defendant
71. See Section II.B, Technical Antivirus Defenses, supra, for a discussion of technologies
such as signature scanners.
could have used an alternative cost-effective precaution that would have
avoided the virus.
The plaintiff has several pleading options. Potential untaken precautions
include more frequent updating of the signature database, or perhaps use
of a generic scanner that does not depend on an updated database. Each
option has its own set of costs and benefits that have to be tallied to evaluate
its cost-effectiveness in order to establish liability.
Consider, for instance, the plaintiff’s pleading that the software provider
should have updated the signature database of her scanner more frequently.
This incremental precaution (based on the numbers in this stylized example) is efficient: it would add three cents to the firm's
average cost of production but would reduce the expected accident loss by
eight cents. The numerical data for the example are summarized in Table 1,
below.72
Table 1
Behavior of firm | Firm's cost of production per unit | Probability of infection | Loss if infection | Expected loss | Full cost per unit
Current          | 40 cents                           | 1/100,000                | $10,000           | 10 cents      | 50 cents
Proposed         | 43 cents                           | 1/500,000                | $10,000           | 2 cents       | 45 cents
The first column lists the defendant’s alternative precautions, namely
scanning at the current rate and scanning at the proposed increased rate,
respectively. The second column lists the total production cost per unit of
software for each precaution option. The third column lists the probabilities of virus transmission corresponding to the respective precautions; the
fourth, the loss if an infection occurs; the fifth, the expected losses from a virus attack; and the final column, the full
cost per unit of software product, namely production plus expected accident costs. We assume that a virus attack will result in expected damages
of $10,000.
With the software provider’s current level of precaution, the production
cost per unit is forty cents, the chance of an infection is 1/100,000, and the
loss if an infection occurs is $10,000. The expected accident loss per unit
therefore is ten cents (1/100,000 × $10,000), and the total cost per unit
of software is fifty cents. If, on the other hand, the software provider implemented the proposed precaution pleaded by the plaintiff, the production
cost would be forty-three cents, the probability of infection would decline
72. Based on an example in A.M. POLINSKY, INTRODUCTION TO LAW AND ECONOMICS 98 (Table 11) (1983).
Computer Viruses and Civil Liability
to 1/500,000, and the expected loss would be two cents, giving a total cost
per software unit of forty-five cents.
Given this information, it is clear that the untaken precaution is efficient,
and the plaintiff would prevail on the issue of breach. Although increasing
the frequency of signature database updating to the level suggested by the
plaintiff would increase production costs by three cents per unit, it lowers
expected accident losses by eight cents.
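The arithmetic behind this comparison can be sketched in a few lines. This is a minimal illustration using the stylized per-unit figures from Table 1; the function and variable names are ours, not drawn from any authority cited here.

```python
# Compare the full per-unit cost of the two precaution levels in Table 1.
# An untaken precaution supports a finding of breach when its added
# production cost is less than the reduction in expected accident loss.

def full_cost(production_cost, p_infection, loss_if_infection):
    """Return (full cost, expected accident loss) per unit of software."""
    expected_loss = p_infection * loss_if_infection
    return production_cost + expected_loss, expected_loss

# Current precaution: signature database updated at the existing rate.
current_full, current_exp = full_cost(0.40, 1 / 100_000, 10_000)
# Proposed precaution: signature database updated more frequently.
proposed_full, proposed_exp = full_cost(0.43, 1 / 500_000, 10_000)

print(f"current:  expected loss ${current_exp:.2f}, full cost ${current_full:.2f}")
print(f"proposed: expected loss ${proposed_exp:.2f}, full cost ${proposed_full:.2f}")

# Three cents of extra production cost buys eight cents of risk reduction,
# so the proposed precaution is cost-effective and breach is established.
assert (0.43 - 0.40) < (current_exp - proposed_exp)
```

The same comparison generalizes to any pleaded untaken precaution: tally its incremental cost against the expected accident loss it would have avoided.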
C. Cause in Fact
A plaintiff must show that the defendant's negligence was the cause in fact
of the plaintiff's harm. Courts usually employ the "but-for" test to determine cause in fact. Under this test, the defendant's failure to take a precaution
is the cause in fact of the harm if taking the precaution would have avoided the
harm. In other words, but for the defendant's omission, the harm would not have
occurred.
A plaintiff may fail the but-for test if she pleads the "wrong" untaken
precaution. Suppose, for example, that a product manufacturer negligently
fails to put a warning about a product hazard in the owner’s manual. A user
of the product is subsequently injured because of the hazard. If the injured
plaintiff admitted he had never read the manual, the manufacturer’s negligent failure to warn would not be a but-for cause of the customer’s injury.
An unread warning would not have been helpful to the user.73
The but-for principle applies similarly in a virus context. Due care may
dictate that a virus scanner signature database be updated once a month.
If the defendant admits, or discovery shows, that he skipped a month,
breach is easily established. If, however, the virus strain is a sufficiently
novel variety, its signature would not have been included even in the
skipped update. A scanner with a database updated at the due care level
would still not have detected the particular strain that caused the harm.
Failure to take this precaution constitutes breach of duty but is not an actual
cause of the infection.
This hypothetical is illustrated in Figure 2, a timeline of events. The
"dot" symbols (o) represent the defendant’s actual frequency of signature
73. DOBBS, supra note 3, at 410. See also McDowall v. Great W. Ry., [1903] 2 K.B. 331
(C.A.), rev'g [1902] 1 K.B. 618 (An improperly secured railcar became loose and injured the
plaintiffs. The court held that failure to secure the car behind its catchpoint constituted
negligence, but that the precaution would not have prevented the plaintiff's injuries, as evidence suggested that the intervening actors were determined to set the car free. The cause-in-fact requirement
was therefore not met: failure to take the pleaded untaken precaution constituted negligence but was not the cause in fact of the accident, and the negligence action properly failed.).
database updating. Each dot represents an update. The "cross" (x) symbols
represent the plaintiff’s proposed frequency, the untaken precaution.
Figure 2
[Timeline: ● marks the defendant's actual signature-database updates; × marks the plaintiff's proposed, more frequent updates. The new virus strain comes into existence after an update, infects the plaintiff's system, and only a later update first incorporates the new strain's signature.]
In this illustration, failure to undertake the plaintiff’s pleaded untaken
precaution is not the actual cause of the harm. As illustrated, the new virus
strain appeared after an update, infected the plaintiff’s system, and caused
harm before the next proposed update. The update prior to the virus’s
appearance would not have contained its signature, and the subsequent
update was too late. The culprit virus therefore could not have been detected, even with plaintiff’s proposed superior precaution, just as the unread manual, in the previous example, would not have prevented the plaintiff’s harm. The pleaded untaken precaution therefore fails on actual cause
grounds, even though failing to take it does constitute a breach of duty.
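The timeline logic can be captured in a short sketch. The days and schedules below are hypothetical, and the model simplifies by assuming a strain's signature becomes available for inclusion as soon as the strain exists.

```python
# But-for test on the Figure 2 timeline: would the plaintiff's proposed
# update schedule have caught the virus before it struck?
# All times are in days and are purely illustrative.

def precaution_would_have_helped(update_times, strain_appears, infection_time):
    """True only if some update after the strain came into existence
    would have occurred before the infection."""
    return any(strain_appears < t < infection_time for t in update_times)

# Plaintiff's proposed schedule: an update every 10 days.
proposed_updates = range(0, 100, 10)

# The strain appears on day 12 and infects the plaintiff on day 17,
# before the day-20 update could have delivered its signature.
print(precaution_would_have_helped(proposed_updates, 12, 17))  # False: no cause in fact

# Had the infection occurred on day 25 instead, the day-20 update
# would have intervened, and cause in fact would be satisfied.
print(precaution_would_have_helped(proposed_updates, 12, 25))  # True
```

As in the text, the first scenario shows a pleaded precaution that fails the but-for test even though skipping updates was a breach of duty.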
D. Proximate Cause
The plaintiff in a negligence action has to prove that the defendant's breach
was not only the cause in fact but also the proximate, or legal, cause of the
plaintiff’s harm. The proximate cause requirement limits liability to cases
where the defendant’s conduct is "reasonably related" to the plaintiff’s
harm.74 Proximate cause may be absent, for instance, if the accident was
74. Proximate cause limitations on liability are imposed where, as a matter of principle,
policy, and practicality, the court believes liability is inappropriate. See, e.g., the dissenting
opinion of Judge Andrews in Palsgraf v. Long Island R.R., 248 N.Y. 339, 352, 162 N.E. 99,
103 (1928): "What we do mean by the word 'proximate' is that, because of convenience, of
public policy, of a rough sense of justice, the law arbitrarily declines to trace a series of events
beyond a certain point. This is not logic. It is practical politics."
due to the unforeseeable and independent intervention of a second tortfeasor. Absent proximate cause, the first tortfeasor would escape liability
even if his breach and actual causation have been clearly demonstrated.
A crisp formulation of the proximate cause requirement is that the realized harm must be within the scope of risk foreseeably created by the
defendant, and the plaintiff must belong to the class of persons foreseeably
put at risk by the defendant's conduct.75
Proximate cause applies to two broad categories of cases, namely those
involving (i) multiple risks and (ii) concurrent efficient causes.76 A multiple-risks case typically involves two risks, both of which would have been reduced by the defendant's untaken precaution. The first is the primary risk,
which was clearly foreseeable to a reasonable person, and the second an
ancillary risk, which would not have been reasonably foreseeable. Suppose,
for instance, a surgeon performs a vasectomy negligently and a child is
born. The child grows up and sets fire to a house. The owner of the house
sues the doctor for negligence. This is clearly a multiple-risks case. The
primary risk consists of foreseeable medical complications due to the incompetent vasectomy, including an unwanted pregnancy. The ancillary risk
is the (unforeseeable) risk that the conceived child may grow up to be a
criminal.77 The proximate cause issue is whether the defendant should be
held liable for the harm due to the ancillary risk.
A concurrent-efficient-causes case involves multiple causes, all of which
are actual causes of the same harm.78 In a typical concurrent-efficient-causes case, an original wrongdoer and a subsequent intervening party are
both responsible for the plaintiff's harm. Suppose, for instance, a technician negligently fails to fasten the wheels of plaintiff's car properly. A wheel
comes off, leaving the plaintiff stranded on a busy highway. The stranded
plaintiff is subsequently struck by a passing driver who failed to pay attention. The technician and the inattentive driver were both negligent and are
both concurrent efficient causes of the plaintiff’s harm. The proximate
cause issue is whether the second tortfeasor’s act should cut off the liability
of the first.
Proximate cause is a dualism consisting of two separate doctrines or tests.
One doctrine applies to multiple-risks cases and the other to concurrent-efficient-causes cases. When both situations, multiple risks as well as concurrent efficient causes, are present in the same case, both proximate cause
75. DOBBS, supra note 3, at 444. See also Sinram v. Pennsylvania R.R. Co., 61 F.2d 767, 771
(2d Cir. 1932) (L. Hand, J.) ("[T]he usual test is ... whether the damage could be foreseen
by the actor when he acted; not indeed the precise train of events, but similar damage to the
same class of persons.").
76. Grady, supra note 61, at 296 ("Proximate cause is a dualism.").
77. Based on a hypothetical in DOBBS, supra note 3, at 444.
78. Grady, supra note 61, at 299.
doctrines apply and the requirements for both have to be satisfied for proximate cause to exist.79
The reasonable foresight doctrine applies to cases of multiple risks,
where a primary and ancillary risk both caused the plaintiff’s harm. This
doctrine establishes the conditions under which the tortfeasor who created
the primary risk will be held liable for actual harm that has resulted from
the ancillary risk. The bungled vasectomy is a typical reasonable foresight
case. The reasonable foresight doctrine determines whether the surgeon
would be held liable for damage caused by the ancillary risk, namely the
risk that an unwanted pregnancy may produce a future criminal.
The direct consequences doctrine of proximate cause applies to cases
involving multiple efficient causes. The doctrine examines concurrent causes
to determine whether the person responsible for the second cause has cut
off the liability of the person responsible for the first cause. The "loose
wheel" case is a typical direct consequences case. The direct consequences
doctrine would determine whether the intervening tortfeasor (the inattentive driver who struck the stranded plaintiff) would cut off the liability of
the original tortfeasor (the negligent automobile technician). Some accidents involve purely multiple risks, while others involve purely concurrent
causes. In some cases, however, both doctrines apply.
Application of the two proximate cause doctrines is greatly simplified
and clarified when we divide the cases to which they apply into distinct
paradigms. We now turn to an analysis of the paradigms within each
doctrine.
1. Paradigms in Direct Consequences Doctrine
The direct consequences doctrine is divided into five paradigms, namely
(i) no intervening tort, (ii) encourage free radicals, (iii) dependent compliance error, (iv) no corrective precaution, and (v) independent intervening
tort.80
The no intervening tort paradigm is the default paradigm. It preserves
proximate cause if no tort by anyone else has intervened between the original defendant’s negligence and the plaintiff’s harm, as long as the type of
harm was foreseeable. In this paradigm, the original tortfeasor is not only
the direct cause of the harm but also the only wrongdoer. A speeding and
unobservant driver who strikes a pedestrian walking carefully in a crosswalk
is a clear example of a case within the no intervening tort paradigm.
Under the encourage free radicals paradigm, proximate cause is preserved if the defendant's wrongdoing created a tempting opportunity for
judgment-proof people. Proximate cause is preserved under the dependent
79. Id. at 298.
80. Id. at 301-21.
compliance error paradigm if the defendant's wrongdoing has increased
the likelihood that the victim will be harmed by someone else's inadvertent
negligence. Proximate cause is broken under the no corrective precaution
paradigm if a third party with an opportunity and duty to prevent the
plaintiff's harm intentionally fails to do so. Paradigm (v) cuts off the original tortfeasor's liability if an independent intervening tort caused the plaintiff's harm.
Encourage free radicals and dependent compliance error are the most
interesting and relevant paradigms in a computer virus context. We now
turn to a detailed analysis of these paradigms.
a. Encourage Free Radicals
Negligence law is the most basic form of safety regulation, but it is an
ineffective deterrent against defendants who are shielded from liability by
anonymity, insufficient assets, lack of mental capacity, or lack of good judgment. Such trouble-prone individuals are termed "free radicals" because of
their tendency to bond with trouble. Examples of free radicals include
children, anonymous crowds, criminals, mentally incompetent individuals,
hackers, and computer virus authors.81 The deterrence rationale of negligence law would be defeated if responsible people who foreseeably encourage free radicals to be negligent were allowed to escape judgment by
shifting liability to the latter. Common law negligence rules therefore preserve the liability of the responsible individuals.82
Satcher v. James H. Drew Shows, Inc.83 illustrates the free radicals paradigm. In Satcher, the plaintiff bought a ticket for a ride on the bumper cars
in an amusement park. A group of mental patients on an excursion joined
the plaintiff's group. When the ride started, the patients converged on the
plaintiff and repeatedly crashed into her from all angles, injuring her
neck permanently. The plaintiff filed suit, alleging that the defendant
owner and operator of the ride had been negligent in allowing the patients
to target and injure her. The appellate court reversed the trial court's decision for the defendant on the grounds that the defendant had encouraged
free radicals.
Another free radicals case is presented by Weirum v. RKO General, Inc.84
The defendant radio station broadcast a contest in which a disk jockey
would drive throughout Los Angeles. He would stop occasionally and announce his location on the radio. Teenagers would race to meet the disk
jockey, and he would give a prize to the first one who reached him.
81. Id. at 306-12.
82. Id. at 308.
83. 177 S.E.2d 846 (Ga. Ct. App. 1970).
84. 539 P.2d 36 (Cal. 1975).
Eventually, two racing teenagers were involved in a road accident, killing the
plaintiff's decedent. There were two concurrent efficient causes of the accident, namely the organizers of the contest and the reckless teenage drivers. The radio station negligently encouraged the free radical teenagers to
drive recklessly. The wrongdoing of the teenagers therefore did not cut off
the defendant radio station’s liability. The defendant radio station was held
jointly liable with the teens and, as the deeper pocket, likely paid most of
the damages.
(i) Limitations on Liability for Encouraging Free Radicals--The defendant
will not be liable for encouraging free radicals unless she did so negligently.85 This implies that, to hold the defendant liable, the behavior of the
free radicals must have been foreseeable ex ante, their actions must not have
gone far beyond the encouragement, and the opportunity created for them
must have been relatively scarce.
The defendant’s act of encouragement would not amount to negligence
unless the behavior of the free radicals was ex ante reasonably foreseeable.
The defendant would not be liable for the actions of the free radicals if
either they acted independently of the defendant’s actions or their behavior
went far beyond the defendant's encouragement. In Weirum, for instance,
it must have appeared reasonably probable to the radio station that its
contest would induce the kind of behavior that ultimately led to the accident, in order to hold the station liable. If one of the contestants had shot
another in order to gain an advantage, the radio station would probably
have escaped liability.86
If, besides the opportunity created by the defendant, several alternative
opportunities were available to the free radical to cause the same or similar
harm, the defendant’s encouragement likely did not significantly increase
the probability of the harm. The defendant therefore may escape liability
if the opportunity created for the free radicals is not particularly scarce. A
person flashing a wad of $100 bills would probably not be liable for the
harm caused by a fleeing thief who runs into and injures someone. Because
of the availability to the thief of many other similar opportunities, the flash
of money did not increase the likelihood of the type of harm that occurred.
If the person had not flashed the money, a determined thief would have
found another opportunity.87
The person encouraged by the defendant may be a responsible citizen
and not a free radical at all. In such a case, the defendant would escape
85. Grady, supra note 61, at 309 ("The pattern of EFR cases indicates that a defendant will
not be liable for free radical depredations unless it negligently encouraged them.").
86. Id. at 308.
87. Id. at 310 ("The defendant, in order to be liable, must negligently provide some special
encouragement of wrongdoing that does not exist in the normal background of incitements
and opportunities.").
liability. If Bill Gates had responded to the Weirum radio broadcast by
racing to collect the prize, his intervening conduct would have almost certainly cut off the defendant’s liability. Likewise, in the unlikely event that
Bill Gates would use a virus kit to create a virus that exploits a weakness
in Windows, the creator of the kit would escape liability. If, however, a free
radical, such as a judgment-proof hacker, did the same, proximate causality
would likely not be broken.
(ii) Encouragement of Virus Authors--Virus authors, as the originators of
dangerous malevolent software, are, directly or indirectly, responsible for
the harm caused by their creations. As such, they are always potential targets of lawsuits related to the harm. Virus authors often receive technical
assistance, such as access to virus kits on the Internet that allow creation
of custom viruses. Such virus tool kits, which enable people who have no
knowledge of viruses to create their own, are commonly available on the
Internet. Some of these kits are very user-friendly, with pull-down menus
and online help available. Such a kit was used, for instance, to create the
infamous Kournikova virus.88
Although a virus kit is useful to someone who lacks technical proficiency,
it is not particularly helpful to a technically skilled person. A skilled and
determined virus author would not wait for a kit to appear on the Internet,
just as a determined thief would not wait for someone to flash a wad of
$100 bills before acting. The creator of a virus kit may escape liability if a
technically competent person downloaded and used the kit to create a virus.
Even if the technically competent virus author were a judgment-proof free
radical, the fact that the kit did not provide a means or encouragement
beyond resources already available to the author cuts off liability of the
original creator of the kit.
Virus authors also get assistance and inspiration from existing viruses
that can be easily copied and modified. Once an original virus is created,
altered versions are usually much easier to create than the original. Such
altered versions may have capabilities that make them more pernicious than
the original.89
88. See, e.g., http://www.cknow.com/vtutor/vtpolymorphic.htm; Sarah Gordon, Virus Writers:
The End of Innocence, IBM White Paper, http://www.research.ibm.com/antivirus/SciPapers/
VB2000SG.htm (reporting the existence on the Internet of several sites with viruses in executable or source code form, available for download).
89. See, e.g., Jay Lyman, Authorities Investigate Romanian Virus Writer, at http://www.linux
insider.com/perl/story/31500.html ("The amazing side of this peculiar situation is that two
people are to stand trial for having modified original code of MSBlast.A (the first blaster
worm), but the creator of the worm is still out there... Antivirus specialists concur in saying
that such altered versions are not as difficult to create as the original."). The possibility of
variants of well-known viruses has caused concern. Id. ("A senior official at the [FBI] told
TechNewsWorld that there is concern about variants and the implications of additional virus
writers.").
A virus named NewLove, for instance, was a more destructive variant of the LoveLetter virus. NewLove was polymorphic, which
made its detection more difficult than LoveLetter’s. It also overwrote files
on the hard disk that were not in use at the time of infection. Due to a
(fortunate) programming error, NewLove could not spread as widely as
LoveLetter, but it was much more destructive in computers to which it did
spread.90
Virus authors are also encouraged and helped by a variety of network
security flaws that allow and facilitate the transmission of viruses. The
Blaster worm, for instance, exploited a security flaw in Microsoft's Windows operating system to invade and crash computers.91
In practice, it is often easier to track down individuals who created opportunities for virus authors than the authors themselves. Virus kits are
often posted on an identifiable Web page on the Internet and security flaws
can be traced to the manufacturer, as in the case of the Microsoft Windows
flaw. If virus authors are free radicals, individuals who create these opportunities for them would likely be the proximate cause of the harm. If they
are not free radicals, their wrongdoing may be considered an independent
intervening tort and, as such, will cut off liability of the encouragers.
(iii) Are Virus Authors Free Radicals?--Virus authors have properties
commonly associated with free radicals. They are often judgment-proof
and shielded by the anonymity of cyberspace. Virus authors are also increasingly turning to organized crime. Furthermore, virus attacks are underreported and underprosecuted, and the probability of catching a hacker
or virus author is comparatively low. Virus authors appear undeterred by
the threat of legal liability and often seem unconcerned about the problems
caused by their creations. All these factors are consistent with a free radical
profile.
The anonymity of the Internet is often exploited by cybercriminals. This
complicates the task of detecting computer crimes and tracking down offenders. It also makes it harder to obtain evidence against a wrongdoer
such as a virus author.92
90. K. Zetter, When Love Came to Town: A Virus Investigation, PC WORLD, Apr. 18, 2004,
available at http://www.pcworld.com/news/article/0,aid,33392,00.asp.
91. Danny Penman, Microsoft Monoculture Allows Virus Spread, NEW SCIENTIST ONLINE
NEWS, Sept. 25, 2003 ("[V]irus writers exploit human vulnerabilities as much as security
flaws.").
92. Gordon, supra note 88 ("[T]racing a virus author is extremely difficult if the virus writer
takes adequate precautions against a possible investigation."); Ian C. Ballon, Alternative Corporate Responses to Internet Data Theft, 471 PLI/Pat. 737, 739 (1997); M. Calkins, They Shoot
Trojan Horses, Don't They? An Economic Analysis of Anti-Hacking Regulatory Models, 89 GEO.
L.J. 171 (2000).
Cyberspace provides the technology and opportunity to a skilled operator to assume different identities, erase digital footprints, and transfer incriminating evidence electronically to innocent computers, often without leaving a trace.93 Suppose, for instance, a virus were
transmitted from the e-mail account of someone named Jill Smith and a
copy of an identical virus were tracked down in the same account. This
may look like a smoking gun but would likely not prove by a preponderance
of the evidence that Jill is the actual culprit. Someone may have hacked
into the Smith account, used it to launch a virus, and stored incriminating
files in the account.94
In several cases, cyber rogues were apprehended because of their recklessness or vanity. In May 2000, a virus named LoveLetter was released
into cyberspace. The virus first appeared in computers in Europe and Asia,
hitting the European offices of Lucent Technologies, Credit Suisse, and
the German subsidiary of Microsoft.95
When recipients clicked on the attachment in which it arrived, the virus
sent copies of itself, via Microsoft Outlook, to everyone in the user's address book. It would then contact one of four Web pages hosted on Sky
Internet, an Internet service provider (ISP) located in the Philippines, from
which the virus downloaded a Trojan horse. The Trojan horse then collected valuable usernames and passwords stored on the user's system and
sent them to a rogue e-mail address in the Philippines.96
Investigators tracked the origin of the LoveLetter virus by examining
the log files of the ISP that hosted the Web pages from where the Trojan
horse was auto-downloaded. Investigators were able to pierce the anonymity of cyberspace, in part because of clues revealed by the perpetrator, perhaps out of vanity, such as a signature in the virus code.97
93. See, e.g., Ted Bridis, Microsoft Offers Huge Cash Rewards for Catching Virus Writers, at
http://www.securityfocus.com/news/7371 ("Police around the world have been frustrated in
their efforts to trace some of the most damaging attacks across the Internet. Hackers easily
can erase their digital footprints, crisscross electronic borders and falsify trails to point at
innocent computers.").
94. M.D. Rasch, Criminal Law and the Internet, in THE INTERNET AND BUSINESS: A LAWYER'S
GUIDE TO THE EMERGING LEGAL ISSUES (Computer Law Ass'n). Online version is available at
http://www.cla.org/RuhBook/chp11.htm. See also BIZREPORT NEWS, Sept. 12, 2003 ("There
are many ways for virus writers to disguise themselves, including spreading the programs
through unwittingly infected e-mail accounts. The anonymity of the Internet allows you to
use any vulnerable machine to launder your identity.").
95. The virus was written in Visual Basic code, the most common language for virus code,
characterized by a ".vbs" extension. Many users did not notice the .vbs extension
because the Windows default setting hides file extensions.
96. Zetter, supra note 90.
97. Investigators traced the origin of the posting to a prepaid account at Supernet, another
ISP in the Philippines. The LoveLetter virus was launched from two e-mail accounts, but
the prepaid account would have allowed the virus author to remain anonymous if he had not
provided additional incriminating evidence to investigators. The perpetrator was eventually
tracked down, in part because, perhaps out of vanity, he left a signature in the virus code.
The signature consisted of his name, e-mail address, membership in an identifiable small
programmer’s group, and hometown (Manila). The perpetrator also used his own home computer to launch the virus and dialed the ISP using his home telephone. This allowed the ISP
to determine the telephone number from its call-in log files.
The anonymity of cyberspace has enabled virus authors to graduate from
cyber-vandalism to organized crime. Virus writers are increasingly cooperating with spammers and hackers to create viruses to hack into computers
to steal confidential information, often hiding their identity by spoofing
the identity of the legitimate owner. Spammers are using viruses, for instance, to mass-distribute junk mail, by sending out viruses to take over
computers and e-mail accounts and using them to mass-distribute spam
messages.98 The owner of the hijacked computer usually does not know it
has been hijacked, although there are often subtle indications, such as
a slower Internet connection.99
To further enhance his anonymity, the spammer may use a remailer, i.e.,
a server that forwards electronic mail to network addresses on behalf of an
original sender, who remains unknown. A remailer delivers the e-mail message without its original header, thus hiding the identity of the original
sender from the recipient. This ensures almost total anonymity for the
spammer.100
Virus authors appear to be undeterred by the threat of legal action. In a
leading study on the subject, Dr. Sarah Gordon examined the correlation
between the number of new viruses in the wild and high-profile prosecutions of virus authors as a measure of the deterrence value of prosecution.
Dr. Gordon reports that high-profile prosecutions have had limited deterrent effect.101
98. The virus named "Sobig F," for instance, is programmed to turn a computer into a
host that sends out spam e-mail messages, often without the knowledge of the owner. It is
widely believed that half a million copies of the virus named AVF were sent by a spammer.
Unlike Melissa, the AVF virus does not mail copies of itself out to everyone in the infected
computer's address book. Instead, AVF makes the infected computer an intermediary by
opening a backdoor in the infected machine through which spammers can distribute their
junk mail.
99. Spam Virus Hijacks Computers, BBC NEWS, at http://news.bbc.co.uk/1/hi/technology/
3172967.stm; Jo Twist, Why People Write Computer Viruses, BBC NEWS, at http://news.bbc.
co.uk/1/hi/technology/3172967.stm.
100. Spammers and Viruses Unite, BBC NEWS, at http://news.bbc.co.uk/1/hi/technology/
2988209.stm (describing the hijacking program called Proxy-Guzu, which would typically arrive
as a spam message with an attachment. Opening the attachment triggers it to forward information about the hijacked account to a Hotmail account. This information then enables a
would-be spammer to route mail through the hijacked computer. The source of this spam
would be very hard if not impossible to trace, especially if the spammer and the sender of the
hijacking program employed anonymity-preserving techniques, such as a remailer.). See also
Lyman, supra note 89 (referring to "the difficulty of tracking down virus writers, particularly
when they are skilled enough to cover their digital tracks, [so that] few offenders are ever
caught").
101. Gordon, supra note 88 (finding no evidence that such prosecutions have alleviated
the virus problem, as measured by the rate of creation of new viruses in the wild subsequent
to high-profile prosecutions). See also R. Lemos, ’Tis the Season for Computer Viruses (1999),
at http://www.zdnet.co.uk/news/1999/49/ns-12098.html. It is well known that even after the author of the Melissa virus had been apprehended (and was expected to be sentenced to a multiyear prison term), new viruses continued to appear on the Internet at an increasing rate.
Computer Viruses and Civil Liability
151
Dr. Gordon’s conclusions were corroborated by another survey she undertook, in which virus authors and antivirus researchers were asked whether
the arrest and prospective sentencing of the Melissa author would have any
impact on the virus-writing community. All virus authors interviewed stated
that there would be no impact, immediate or long-term, while the antivirus
researchers were evenly split on the question. These results are consistent
with those of comparable surveys by other researchers.102
For example, a subsequent survey suggests that new laws will result in
more viruses than before. According to the survey results, a majority of
virus authors would either be unaffected or actually encouraged by antivirus legislation. A number of them claimed that criminalization of virus
writing would actually encourage them to create viruses, perhaps as a form
of protest or civil disobedience.103
Laws against virus creation cannot be effective unless virus incidents are reported and perpetrators prosecuted. There is evidence that virus crimes are seriously underreported and, as a consequence, underprosecuted.104 Commenting on the ineffectiveness of the law in combating computer viruses, Grable writes, "[b]oth the federal and New York state criminal statutes aimed at virus terror are ineffective because ... [t]he combination of the lack of reporting plus the inherent difficulties in apprehending virus creators leads to the present situation: unseen and unpunished virus originators doing their damages unencumbered and unafraid."105
b. Dependent Compliance Error
The dependent compliance error paradigm applies where a defendant has
exposed the plaintiff to the compliance error--relatively innocent, inadvertent negligence--of a third party. It preserves the liability of the original
defendant when the compliance error results in injury to the plaintiff.
102. Gordon, supra note 88 (reference to a survey by A. Briney).
103. Id. (reference to DefCon survey).
104. Id. ("Minnesota statute §§ 609.87 to .89 presents an amendment which clearly defines a destructive computer program, and which designates a maximum (prison term of) ten years; however, no cases have been reported. Should we conclude there are no virus problems in Minnesota?"). See also Michael K. Block & Joseph G. Sidak, The Cost of Antitrust Deterrence: Why Not Hang a Price-Fixer Now and Then?, 68 GEO. L.J. 1131, 1131-32 (1980); Stevan D. Mitchell & Elizabeth A. Banker, Private Intrusion Response, 11 HARV. J.L. & TECH. 699, 704 (1998).
105. Gordon, supra note 88 (quoting J. Grable, Treating Smallpox with Leeches: Criminal Culpability of Virus Writers and Better Ways to Beat Them at Their Own Game, 24 COMPUTERS & LAW (Spring 1996)). See also id. ("[G]iven the small number of virus writers who have been arrested and tried ... this lack of arrests is one of the primary indicators used by some to argue that laws are not a good deterrent."); Virus Writers Difficult to Find in Cyberspace, BIZREPORT NEWS (Sept. 2003) (reporting that it took eighteen days to track down the author of the Blaster worm, even though the author left a clear trail behind, including his alias stitched into the virus code, and references to a website registered in his name), available at http://www.bizreport.com/print.php?art_id=4917.
Tort Trial & Insurance Practice Law Journal, Fall 2004 (40:1)
In Hairston v. Alexander Tank and Equipment Co.,106 a technician negligently failed to fasten the wheels of plaintiff's car properly. A wheel came
off, leaving the plaintiff stranded on a busy highway. The stranded plaintiff
was subsequently struck by a passing driver whose attention had inadvertently lapsed. Liability of the original tortfeasor, the auto technician, was
preserved, because he had put the plaintiff in a situation where he was
exposed to a high likelihood of harm due to the compliance error of the
inattentive driver.
This principle is particularly applicable to computer security. Consider, for instance, a computer security breach in which a flaw, such as a buffer overflow, allows a virus to penetrate a network.107 The security apparatus of the network fails to detect and eliminate the virus, and the virus causes considerable harm to one or more computers in the network.
In situations such as this, the security lapse that allowed the virus into
the system is foreseeable and likely due to a compliance error. The person
responsible for the buffer overflow in the software, however, provided the
opportunity, and thus exposed the users of the network to the security
compliance error. Under the dependent compliance error paradigm, therefore, the liability of the person responsible for the buffer overflow will not
be cut off, in spite of the intervention of the subsequent security lapse.
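The mechanics of the underlying flaw can be sketched in miniature. The following Python fragment is a simulation only, not exploit code; the memory layout, sizes, and function names are invented for illustration. It models an eight-byte buffer adjacent to other legitimate data and shows how an unchecked copy lets excess input overwrite the neighboring region, while an inexpensive bounds check prevents it:

```python
# Simulated flat memory: an 8-byte buffer followed by a 4-byte slot of
# adjacent legitimate data, mimicking neighboring regions on a real stack.
memory = bytearray(12)
BUF_START, BUF_SIZE = 0, 8
RET_START = 8

def write_unchecked(data: bytes) -> None:
    """Copy without a bounds check -- the programming flaw."""
    memory[BUF_START:BUF_START + len(data)] = data

def write_checked(data: bytes) -> None:
    """Copy with a bounds check -- the inexpensive precaution."""
    if len(data) > BUF_SIZE:
        raise ValueError("input exceeds buffer capacity")
    memory[BUF_START:BUF_START + len(data)] = data

memory[RET_START:] = b"RET0"          # legitimate adjacent data
write_unchecked(b"A" * 8 + b"EVIL")   # 12 bytes into an 8-byte buffer
print(memory[RET_START:])             # -> bytearray(b'EVIL'): adjacent data overwritten
```

The omitted bounds check in `write_unchecked` is precisely the kind of low-cost precaution whose absence the negligence analysis in this section targets.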
The schematic diagram, below, summarizes the arguments in this section. It applies to a typical computing environment, such as the computer
network in the preceding (buffer overflow) example. The rectangle, V, represents the entire universe of virus strains. The virus universe consists of
106. 311 S.E.2d 559 (N.C. 1984).
107. A buffer is a contiguous piece of memory, usually dedicated to temporary storage of
data. A buffer overflow occurs when a program tries to store more data in a buffer than it has
the capacity for. The extra information overflows into adjacent buffers, overwriting or corrupting the legitimate data in the adjacent buffers. A buffer overflow has been described as
"very much like pouring ten ounces of water in a glass designed to hold eight ounces. Obviously, when this happens, the water overflows the rim of the glass, spilling out somewhere
and creating a mess. Here, the glass represents the buffer and the water represents application or user data." Mark E. Donaldson, Inside the Buffer Overflow Attack: Mechanism, Method
and Prevention, SANS INSTITUTE 2002 WHITE PAPER, available at http://www.sans.org/rr/whitepapers/securecode/386.php. System Administration, Audit, Network and Security (SANS)
was founded in 1989 as a cooperative research and education organization, specializing in
computer security training and education. Buffer overflow is an increasingly common computer security attack on the integrity of data. The overflowing data, for instance, may contain
code designed to trigger specific actions, such as modify data or disclose confidential information. Buffer overflows are often made possible because of poor programming practices. An
attacker exploits a buffer overflow by placing executable code in a buffer’s overflowing area.
The attacker then overwrites the return address to point back to the buffer and execute the
planted overflow code. A programming flaw in Microsoft Outlook, for instance, made it
vulnerable to a buffer overflow attack. An attacker could invade a target computer and overflow a target area with extraneous data, simply by sending an appropriately coded e-mail
message. This allowed the attacker to execute any code he desired on the recipient’s computer,
including viral code. Microsoft has since created a patch to eliminate the vulnerability.
avoidable viruses (virus strains that could be detected and eliminated at a cost less than their expected harm) and unavoidable viruses. In the diagram, the avoidable set is represented by the larger ellipse inside the rectangle, labeled V*, and the unavoidable set by the white area inside the rectangle but outside the ellipse, labeled V-V*.
[Diagram: a rectangle labeled "All Virus Strains, V" contains a larger ellipse, the "Avoidable set, V*"; the area inside the rectangle but outside the ellipse is the "Unavoidable set, V-V*." Within V* lies a smaller ellipse, the "Set actually avoided, V'"; the remaining grey area of V* is the "Set negligently transmitted, V*-V'."]
The innermost, smaller, and darker ellipse, V', represents the possibility that an avoidable virus nevertheless may be transmitted into the computing environment. In the absence of negligence, no strain in V* will be transmitted. In the event of negligence of a party, such as a security flaw in a computer system or failure to use reasonable antivirus precautions, some strains in V* could enter the system, and only a subset of V* will be avoided. V' represents the subset that will be avoided, and the rest of V*, the grey area, denoted (V*-V'), represents the strains in V* that may enter the system due to the negligence. Virus strains in (V*-V'), as a subset of V*, should be detected if due care were taken. They will not be detected, however, because they are outside of V'.
The remainder of this section argues that the set of negligently transmitted viruses, represented by (V*-V'), is large relative to the set of unavoidable viruses, represented by (V-V*). The outer boundary of (V*-V') is defined by V*, and the inner boundary by V'. The larger V* (the "further out" the outer boundary) and the smaller V' (the "further in" the inner boundary), the larger (V*-V'). We show in this subsection that V* is large relative to V and V' is small relative to V*, resulting in a large (V*-V'). A virus attack therefore likely involves negligence.
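The relationships among these sets reduce to ordinary set arithmetic. The following Python sketch mirrors the diagram's labels; the strain names and set sizes are hypothetical, chosen only to make the quantities concrete:

```python
# Hypothetical universe of virus strains, mirroring the diagram's labels.
V       = {"s1", "s2", "s3", "s4", "s5", "s6"}   # all strains
V_star  = {"s1", "s2", "s3", "s4"}               # avoidable with due care (V*)
V_prime = {"s1", "s2"}                           # actually avoided (V')

unavoidable = V - V_star          # V - V*: cannot cost-effectively be stopped
negligent   = V_star - V_prime    # V* - V': enter the system only through negligence
transmitted = unavoidable | negligent

# Fraction of transmitted strains attributable to negligence: the larger
# V* and the smaller V', the closer this ratio gets to 1.
p_negligence = len(negligent) / len(transmitted)
print(sorted(negligent), round(p_negligence, 2))   # -> ['s3', 's4'] 0.5
```

The article's argument is that in practice V* is large and V' small, so the computed ratio approaches one and a given infection is, more likely than not, negligently transmitted.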
This explains why, in the previous buffer overflow example, courts would
likely preserve the liability of an individual whose negligence was responsible for a buffer overflow in a computer system. The buffer overflow allowed a virus to enter the system and exposed users of the network to a
compliance error by the network security administrator. The security person’s compliance error, namely failure to detect and eliminate the virus,
allowed the virus to remain in the system and wreak havoc.
This conclusion remains valid, by a preponderance of the evidence, even in cases where the culprit virus cannot be reliably identified as avoidable or unavoidable. The reason is that most viruses are avoidable and their presence likely attributable to a compliance error. The key factors that drive this theory are that V* is large and V' small.
(i) V* Is Large--The Learned Hand formula, B ≥ P × L, dictates that, to avoid liability, investment in antivirus precautions (B) should at least equal the expected harm avoided (P × L). In this subsection, we argue that V* is large for the following reasons. P × L is relatively large, so that the legally mandated precaution level, B, must be large. The efficiency and economy of antivirus technology indicate that a substantial investment in precautions will result in a large avoidable set, V*.
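The Hand comparison reduces to a single inequality. In this minimal sketch, the $100,000 figure echoes the average disaster recovery cost reported by the 2003 ICSA survey later in this section, while the probability and burden values are assumed purely for illustration:

```python
def negligent_omission(B: float, P: float, L: float) -> bool:
    """Learned Hand test: omitting a precaution whose burden B is less
    than the expected harm P * L it would avoid constitutes negligence."""
    return B < P * L

# Illustrative figures: L = $100,000 (average disaster recovery cost per
# the 2003 ICSA survey); P = 0.4 and the B values are assumed.
print(negligent_omission(B=5_000, P=0.4, L=100_000))    # -> True: B < $40,000
print(negligent_omission(B=60_000, P=0.4, L=100_000))   # -> False: B > $40,000
```

The subsections that follow argue that P and L are both large, which drives the right-hand side of the inequality, and hence the mandated precaution level B, upward.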
(ii) P × L Is Large--Expected harm from infection, P × L, is large,
because the probability of virus infection, P, and the harm associated with
virus infection, L, are both large. P is large because of the substantial and
increasing prevalence of computer viruses on the Internet and in computer
networks. L is large because of the unique nature and unusual destructive
potential of viruses, both in an absolute sense, as well as compared to general computer security hazards.
(iii) P Is Large--Virus prevalence is substantial and increasing.108 According to the influential 2003 ICSA survey, 88 percent of respondents perceived a worsening of the virus problem.109 Virus prevalence statistics in the survey support the pessimistic response. The following graph, constructed from data in the ICSA Survey, illustrates the trend of an increasing virus infection rate.
108. See, e.g., ICSA LABS 9TH ANNUAL COMPUTER VIRUS PREVALENCE SURVEY 2003, supra note 10, at 23 ("There is little doubt that the global virus problem is worsening. After a somewhat quiet year in 2002, 2003 arrived with a vengeance. Beginning with the Slammer worm in January, to Mimail and its many variants in December, we have seen one of the most eventful years ever for computer viruses. For the 8th year in a row, virus infections, virus disasters and recovery costs increased.").
109. Qualified respondents to the survey work for companies and government agencies
with more than 500 PCs, two or more local area networks (LANs), and at least two remote
connections.
[Graph: virus infection rate by year, 1996-2003, constructed from ICSA Survey data, showing a rising trend.]
The high and increasing infection rate, which is a direct proxy for the
probability that any particular network will be hit by a virus attack during
a given time interval, suggests a high value for P in the Learned Hand
formula.
(iv) L Is Large--The expected harm associated with virus infection is
significant, both in an absolute sense, as well as relative to general computer
security hazards and hardware and software errors. The greater inherent
danger of viruses is due to the generality, scope of harm, persistence, growing payload severity, and advances in the spreading mechanism of the virus threat.110
A traditional computer security breach is usually related to a particular identifiable weakness, such as a security flaw that allows unauthorized access to a hacker. Viral infection is a more general and more volatile security threat, which makes it harder to plan a comprehensive preventive strategy. A virus can enter the system or network in multiple ways, and any and every program or data file is a potential target. It can be programmed to carry virtually any conceivable resource-dissipating or destructive function, and to attach it to any part of a system or network.111
110. See generally COHEN, supra note 8, at 24-27; INST. FOR COMPUTER SEC. & ADMIN., ICSA LABS 6TH ANNUAL COMPUTER VIRUS PREVALENCE SURVEY 2000. For a detailed analysis and discussion of the nature and origins of the unusual danger level associated with virus infection, see Meiring de Villiers, Virus ex Machina Res Ipsa Loquitur, 1 STAN. TECH. L. REV., Section V.C (2003).
111. COHEN, supra note 8, at 24 ("The virus spreads without violating any typical protection policy, while it carries any desired attack code to the point of attack. You can think of it as a missile, a general purpose delivery system that can have any warhead you want to put on it. So a virus is a very general means for spreading an attack throughout an entire computer system or network.").
The chameleonlike evolution of virus technology poses unique challenges to virus detection and elimination efforts. The shape and form of
viral attacks evolve continuously, as evidenced by the appearance of a progression of stealth, polymorphic, macro, and e-mail viruses. Advances in
computer technology continuously open up new opportunities for virus
writers to exploit. Malevolent software exploiting e-mail technology is a
prime example. Conventional wisdom once held that it was impossible to become infected by a virus simply by reading an e-mail message. This wisdom was promptly shattered by advances in virus technology designed to exploit the unique characteristics, as well as obscure weaknesses and little-known flaws, of new technologies. A family of viruses that exploited a weakness in JavaScript technology, for instance, was programmed to infect e-mail attachments and, when the e-mail message was read, automatically compromise the computer system, without the user ever actually opening the attachment.112
The release of a computer virus has been likened to opening a bag of
feathers on a tall building on a windy day. The Scores virus, for instance, was created to target a large company, EDS, but ended up attacking several U.S. government agencies, including NASA and the Environmental Protection Agency.113
The scope of potential harm caused by computer viruses is unprecedented. In a typical conventional security breach, a hacker may access an
account, obtain confidential data, and perhaps corrupt or destroy it. The
damage could, of course, be substantial, but it is nevertheless limited to
the value of the data and contained within the system or network hacked
into. If, instead, a hacker accessed an account by releasing a virus into the
system, the virus may spread across computers and networks, even to those
not physically connected to the originally infected system.114 Whereas the
112. ROGER A. GRIMES, MALICIOUS MOBILE CODE 394 (2001). JavaScript is a language developed by Netscape in collaboration with Sun Microsystems to increase interactivity and control on Internet Web pages, including the capability to manipulate browser windows. The JavaScript e-mail worm, JS.KAK, which appeared at the end of 1999, exploited an obscure Internet Explorer security flaw to disrupt computer systems and destroy data. It infects e-mail attachments and, when the e-mail message is opened, automatically compromises the computer system, without having the user open the attachment. A related, but less-well-known and shorter-lived e-mail virus, the so-called BubbleBoy, exploited a security hole in the Auto-Preview feature in Microsoft Outlook to send a copy of itself to every listing on the user's address list. BubbleBoy was one of the first attachment-resident viruses that did not require the user to open the attachment in order to do its harm.
113. A. Bissett & G. Shipton, Some Human Dimensions of Computer Virus Creation and Infection, 52 INT. J. HUMAN-COMPUTER STUDIES 899, 903 (2000); E.L. LEISS, SOFTWARE UNDER SIEGE (1990).
114. See, e.g., Robin A. Brooke, Deterring the Spread of Viruses Online: Can Tort Law Tighten the "Net"?, 17 REV. LITIG. 343, 361 ("The market now provides enough statistics indicating both high risk and potentially widespread damage from virus attacks, while either programming prevention or off-the-shelf capabilities to detect viruses may impose a proportionally
conventional hacker can destroy data worth, say, an amount D, releasing a
virus to do the same job can cause this harm several times over by spreading
into N systems, causing damage of magnitude N × D, where N can be
very large. Although the two types of security breaches do similar damage
in a particular computer, the virus’s greater inherent danger is that it can
multiply and repeat the destruction several times over.115
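The N × D multiplier can be made concrete with a toy propagation model. In this Python sketch, the replication factor, number of periods, and per-machine damage figure are all assumed values for illustration, not data from the article:

```python
def infected_after(periods: int, r: int) -> int:
    """Machines infected after a number of propagation periods if each
    infected machine passes the virus to r new machines per period."""
    n = 1
    for _ in range(periods):
        n += n * r          # geometric growth: n grows by a factor (1 + r)
    return n

D = 10_000                  # assumed damage per machine, in dollars
N = infected_after(10, 2)   # ten propagation periods, two copies per machine
print(N, N * D)             # -> 59049 590490000: total harm is N x D
```

A conventional breach of the same payload would destroy data worth D in one system; the geometric spread is what turns D into N × D.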
Dr. Fred Cohen provides a dramatic illustration: "Sitting at my Unix-based computer in Hudson, Ohio, I could launch a virus and reasonably expect it to spread through 40% of the Unix-based computers in the world in a matter of days. That's dramatically different from what we were dealing with before viruses."116
A worm, the so-called Morris Worm, designed and released by a Cornell
University student, effectively shut down the Internet and other networks
connected to it.117 It was not designed to damage any data, but conservative estimates of the loss in computer resources and availability range between $10 million and $25 million.118
Dr. Cohen’s statement was published more than a decade ago. Today,
viruses spread much faster, and there is every indication that virus transmission will continue to accelerate. The 2003 ICSA report remarks, for
instance, that whereas it took the early file viruses months to years to spread
widely, subsequent macro viruses took weeks to months, mass mailers took
smaller burden."); id. at 348 ("Widespread proliferation of a virus originally undetectable becomes compounded very quickly. Independent actors along the transmission chain can be unaware of malevolent software residing in their computer, network, files, or disks, even if they use virus protection software, because the software may not sufficiently detect more sophisticated code."). See also ALLAN LUNDELL, VIRUS! vii (1989) ("Most mainframe computers can be successfully subverted within an hour. Huge international networks with thousands of computers can be opened up to an illicit intruder within days." (quoting Dr. Fred Cohen)); HRUSKA, supra note 14, at 13 ("[N]ew viruses are highly destructive, programmed to format hard disks, destroy and corrupt data. As viral infections become more and more widespread, the danger of damage to data is increasing at an alarming pace."); id. at 14 ("The virus danger is here to stay. In the USA, the Far East and Africa it has already reached epidemic proportions ... In just three months in the Spring of 1989, the number of separately identifiable viruses increased from seven to seventeen.").
115. DUNHAM, supra note 1, at xx ("Just one virus infection can erase the contents of a drive, corrupt important files, or shut down a network.").
116. COHEN, supra note 8, at 25. See also GRINGRAS, supra note 4, at 58 ("A computer harboring a virus can, in a matter of hours, spread across continents, damaging data and programs without reprieve."). See also Bradley S. Davis, It's Virus Season Again, Has Your Computer Been Vaccinated? A Survey of Computer Crime Legislation as a Response to Malevolent Software, 72 WASH. U. L.Q. 379, 437 and accompanying text ("[A] user whose computer was infected could connect to an international network such as the Internet and upload a file onto the network that contained a strain of malevolent software. If the software was not detected by a scanning system ... on the host computer, infection could spread throughout the Internet through this simple exchange of data."); How Fast a Virus Can Spread, supra note 1, at 21.
117. For an account of the "Internet Worm Incident," see, e.g., ROGUE PROGRAMS, supra note 11, at 203.
118. FITES ET AL., supra note 1, at 51-52.
days, Code Red took approximately twelve hours, and Klez spread around
the world in two and one-half hours.119
A third major distinction that makes viruses more dangerous than general security hazards is their persistence. A virus can never really be entirely
eliminated from a system. Generally, when a programming error or security flaw is rectified, the specific problem can be considered eliminated
from the system. In the case of viruses, however, one can never be sure
that a particular virus is gone for good. An infected program may be deleted
and restored from a backup, but the backup may have been made after the
backed-up program was infected and, hence, contain a copy of the virus.
Restoring the program will then also restore the virus. This may happen,
for instance, in the case of a virus that lies dormant for a while. During its
dormancy, periodic backups also will back up the virus. When the virus
becomes active, deleting the infected program and restoring it from the
backup will only repeat the cycle22° Even if the backup is not contaminated,
any user of the system with an infected floppy disk or contaminated e-mail
could reintroduce the virus into the disinfected system22
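The backup-restore cycle described above is easy to model. In this Python sketch, the file name and state labels are invented for illustration:

```python
# Model a system as a mapping from file name to file state.
system = {"ledger.xls": "clean"}

system["ledger.xls"] = "infected-dormant"   # dormant virus slips in unnoticed
backup = dict(system)                       # periodic backup taken during dormancy

# The virus activates; the administrator deletes the infected file
# and restores it from the most recent backup.
system["ledger.xls"] = "deleted"
system["ledger.xls"] = backup["ledger.xls"]

print(system["ledger.xls"])   # -> infected-dormant: restoring repeats the cycle
```

Because the backup was taken while the virus lay dormant, the restore operation faithfully reinstates the infection along with the data.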
Many virus strains tend to survive successive generations of software. Replacing an old, infected spreadsheet program with a new and clean version will temporarily eliminate the virus, but the new version will not be immune to the particular virus. If the virus makes its way back, perhaps via an e-mail attachment, it will eventually reinfect the new program.122
119. ICSA LABS 9TH ANNUAL COMPUTER VIRUS PREVALENCE SURVEY 2003, supra note 10, at 25.
120. Shane Coursen, How Much Is That Virus in the Window, VIRUS BULL. 15 (1996) (describing a common virus named Ripper that slowly modifies data while the data are being archived, resulting in corrupted backups); DUNHAM, supra note 1, at 129-30.
121. Brooke, supra note 114, at 362 n.95 ("It is likely impossible to eradicate viruses completely. Simply disinfecting a computer system could cost a staggering amount. In 1990, computer infection in the United States alone was estimated to be one percent, or about 500,000 computers ... Unfortunately, even having a virus removed provides no guarantee of safety from further virus harm. In the United States, 90 percent of all infected users experience re-infection within 30 days of having the original virus removed."); Coursen, supra note 120, at 13 ("[T]he fix must be implemented in such a way that it is all-encompassing and simultaneous across infected sites. Tending to one site and neglecting another will surely allow a persistent virus to work its way back again."); id. at 16 ("Cleaning your program of a virus does not guarantee that it will not come by for another visit. Just one leftover diskette or program can have a snowball effect and start another virus outbreak. Within a matter of hours, the entire business could be under siege again. Any time spent cleaning up from the initial infection or outbreak can easily be lost in those few hours. The complete virus recovery process would have to be repeated.").
122. See, e.g., COHEN, supra note 8, at 27 ("Eventually you probably change every piece of software in your computer system, but the virus may still persist. When you go from DOS 2.01 to DOS 2.3, to 3.0, to 3.1 to 3.2 to 4.0 to 4.1 to 5.0 to 6.0 to OS/2, the same viruses that worked on DOS 2.01 almost certainly work on each of these updated operating systems. In fact, if you wrote a computer virus for the IBM 360 in 1965, chance[s] are it would run on every IBM-compatible mainframe computer today, because these computers are upwardly compatible."). Some viruses do become extinct over time, however. See, e.g., DUNHAM, supra
Converting infected documents to a later version often also automatically converts the virus to one compatible with the new format.123
The latest Internet worms and mass mail viruses have more staying power--they remain virulent longer and spawn more variants. When infections do occur, it takes longer and costs more to disinfect systems and recover from virus attacks.124
The 2003 ICSA Survey reports an increase not only in prevalence of
virus attacks but also in the severity of disasters. The survey defines a "virus
disaster" as "25 or more PCs infected at the same time with the same virus, or a virus incident causing significant damage or monetary loss to an organization."125 In the 2002 ICSA survey, eighty respondents reported a disaster, while the 2003 survey reported ninety-two disasters. Average disaster recovery time increased slightly in 2003 over 2002. Recovery costs, however, increased significantly, by 23 percent, from a 2002 average of $81,000 to $100,000 in 2003.126 The survey also reports a growth in severity of virus payloads and consequences of infection, as well as changes in attack vectors (modes of distribution), the latter exacerbating the volatility and unpredictability of the virus threat.127
The high danger rate associated with computer viruses makes them a potentially potent and destructive tool for a perpetrator of terrorism, industrial espionage, and white-collar crime.128 U.S. security agencies are reportedly investigating seriously the use of malicious software as a strategic weapon,129 and the Pentagon established a SWAT team, administered by the Computer Emergency Response Team Coordination Center, to combat destructive programs, such as the Morris Worm.130
note 1, at xxi ("[M]any older Macintosh viruses do not function correctly on System 7.0 or later. On PCs, many DOS file-infecting viruses are no longer as functional or successful in the Windows operating system. Still, older viruses continue to work on older operating systems and remain a threat for users of older systems.").
123. Bissett & Shipton, supra note 113, at 899, 902.
124. ICSA LABS 9TH ANNUAL COMPUTER VIRUS PREVALENCE SURVEY 2003, supra note 10, at 24.
125. Id. at 1.
126. "For the eighth year in a row, our survey respondents report that viruses are not only
more prevalent in their organizations, but are also more destructive, caused more real damage
to data and systems, and are more costly than in past years. This despite increases in their
use of antivirus products, improved updating and upgrading, better management of antivirus
systems. Corporations are also spending more time, energy, and dollars in purchasing, installing, and maintaining antivirus products without achieving their desired results." Id.
127. Id. at 6.
128. FITES ET AL., supra note 1, at 50-53 (describing the use of viruses to perpetrate acts of sabotage, terrorism, and industrial espionage); COHEN, supra note 8, at 151-52; Clifford Stoll, Stalking the Wily Hacker, 31 COMMS. ACM 484 (1988).
129. Jay Peterzell, Spying and Sabotage by Computer, TIME, Mar. 20, 1989, at 25 (cited in ROGUE PROGRAMS, supra note 11, at 92 n.134).
130. ROGUE PROGRAMS, supra note 11, at 92 n.133.
In summary, the expected harm from a virus attack, P × L, is relatively large. Applying the Learned Hand formula, it follows that the legally mandated precaution level, B, must be large. We now argue that a large B implies a large avoidable set, V*. The essence of the argument is that a large avoidable set, V*, is (i) technologically feasible and (ii) legally mandated.
(v) A Large V* Is Technologically Feasible--Antivirus software became available soon after the first appearance of computer viruses and has become increasingly sophisticated and effective, in response to parallel advances in virus technology. Although it is impossible to identify the presence of a virus with 100 percent reliability,131 state-of-the-art technology has achieved close to a perfect detection rate of known viruses, and a detection rate of unknown virus strains perhaps as high as 80 percent and growing. State-of-the-art heuristic virus scanners, for instance, are capable of detecting at least 70 to 80 percent of unknown viruses.132
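The two detection approaches contrasted here, signature scanning of known strains and heuristic flagging of unknown ones, can be sketched side by side. In this Python fragment the signatures and heuristic markers are invented stand-ins; real scanners use far richer signature databases, emulation, and behavioral analysis:

```python
# Hypothetical byte signatures of known strains (illustrative only).
SIGNATURES = {
    "melissa-like": b"\xde\xad\xbe\xef",
    "klez-like":    b"\xca\xfe\xba\xbe",
}

def scan_known(code: bytes) -> list:
    """Signature scanning: near-perfect on known strains, blind to new ones."""
    return [name for name, sig in SIGNATURES.items() if sig in code]

def scan_heuristic(code: bytes) -> bool:
    """Crude heuristic: flag code containing suspicious self-replication
    markers, catching some strains no signature yet covers."""
    suspicious = [b"copy_self", b"mass_mail"]
    return any(marker in code for marker in suspicious)

sample = b"\x90\x90" + b"\xde\xad\xbe\xef" + b"copy_self"
print(scan_known(sample))      # -> ['melissa-like']
print(scan_heuristic(sample))  # -> True
```

The division of labor mirrors the detection rates quoted in the text: signature matching approaches 100 percent on cataloged strains, while heuristics trade some false positives for coverage of strains not yet cataloged.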
Organizations such as Virus Bulletin, West Coast Labs, and others periodically publish evaluations of commercial antivirus products. Virus Bulletin,133 an industry leader, uses a recently updated database of virus strains to test antivirus software for its so-called 100 Percent Award. Products receive this award if they successfully detect all the strains in the database, suggesting that they are capable of detecting virtually all known strains. Antivirus software that has consistently made this grade includes products such as Norton AntiVirus, Sophos Anti-Virus, and VirusScan.134
West Coast Labs135 evaluates antivirus software for its ability to detect as well as eliminate viruses. Products such as Norton AntiVirus, VirusScan, and F-Secure, among others, have recently been certified for their ability to detect and eliminate 100 percent of known virus strains.136 Other organizations, such as the Virus Test Center at the University of Hamburg, regularly test antivirus software and publish their results, including a list of software with a 100 percent detection rate.137
Some of the most effective antivirus programs are available free of
charge, at least for private users. Free software includes products such as
VirusScan, which made Virus Bulletin’s 100 Percent Award list and received similar honors from West Coast Labs. Norton AntiVirus, an antivirus product that has been similarly honored and that offers additional
131. Spinellis, supra note 31, at 280, 282 (stating that theoretically perfect detection is in the general case undecidable, and for known viruses, NP-complete).
132. Nachenberg, supra note 1, at 7; Fernandez, supra note 31; Alex Shipp, Heuristic Detection of Viruses Within e-Mail, in PROCEEDINGS 11TH ANNUAL VIRUS BULLETIN CONFERENCE, Sept. 2001.
133. See http://www.virusbtn.com.
134. DUNHAM, supra note 1, at 150-51 (Table 6.3).
135. See http://www.check-mark.com.
136. DUNHAM, supra note 1, at 154 (Table 6.6).
137. See http://agn-www.informatik.uni-hamburg.de/vtc/naveng.htm.
Computer Viruses and Civil Liability
features such as a user-friendly interface, powerful scan scheduling options,
heuristic technology for the detection of unknown strains, and SafeZone
quarantine protection, is available at modest cost at the time of writing.[138]
A high detection rate is not limited to known virus strains. State-of-the-art heuristic scanners, such as Symantec's Bloodhound technology and
IBM’s AntiVirus boot scanner, are capable of detecting 70 to 80 percent
of unknown viruses.[139] Heuristic technology is relatively inexpensive. Symantec's Bloodhound technology, for instance, is incorporated in the Norton AntiVirus product.[140]
The technological trend is towards greater sophistication and effectiveness and an increasing detection rate. IBM, for instance, a major center of
virus research, has been awarded a patent for an innovative automatic virus
detection system based on neural network technology.[141] The system uses
artificial intelligence techniques that mimic the functioning of the human
brain to enable it to identify previously unknown virus strains. The neural
network is shown examples of infected and uninfected code (e.g., viral and
uninfected boot sector samples) and learns to detect suspicious code. Care
was taken to minimize the occurrence of false alarms. The system reportedly captured 75 percent of the new boot sector viruses that emerged after its release, with only two reported false positives. Subsequent updates
of the product were designed to eliminate false positives of the kind that
occurred.
Ambitious research programs are under way that augur well for an even
greater detection rate. The inventors of the IBM neural network technology view it as a precursor to an immune system for cyberspace that operates
analogously to the human immune system. This envisioned cyber immune
system will operate through the Internet to "inoculate" users globally to a
virus within minutes of its initial detection.[142]
(vi) A Large V* Is Legally Mandated--Sophisticated antivirus technology
makes a large V* feasible.[143] V* is a legal concept, though, and encompasses
more than technological feasibility. V* is, by definition, the set of virus
138. At the time of writing (2004), the latest version of Symantec's Norton AntiVirus was available for less than $200. See also DUNHAM, supra note 1, at 158-59.
139. See discussion of heuristic detection technologies in Section II.B.4, supra.
140. See also http://www.symantec.com/nav/nav_mac/; DUNHAM, supra note 1, at 158-59.
141. Gerald Tesauro et al., Neural Networks for Computer Virus Recognition, 11:4 IEEE EXPERT 5-6 (Aug. 1996). See also Press Release, IBM, IBM Awarded Patent for Neural Network Technology, available at http://www.av.ibm.com/BreakingNews/Newsroom/97-10-27/.
142. J.O. Kephart et al., Computers and Epidemiology, 30:5 IEEE SPECTRUM 20-26 (May 1993).
143. A scanner with a properly updated signature database can detect close to 100 percent of known virus strains. Heuristic scanners, such as Symantec's Bloodhound technology, can detect 70 to 80 percent of unknown viruses. IBM's neural network virus detection technology can capture 75 percent of new boot sector viruses. Innovative research promises that the trend toward "perfect" virus detection and elimination will continue and perhaps accelerate.
Tort Trial & Insurance Practice Law Journal, Fall 2004 (40:1)
strains whose elimination is both technologically feasible as well as cost-effective. This subsection draws on the economics of virus precaution to show that a large V* is not only technologically feasible but also cost-effective, hence within the scope of due care.
The Learned Hand formula, B ≥ P × L, dictates that, to avoid liability,
investment in antivirus precautions, B, should at least equal the expected
harm avoided, P × L. We have argued that the high danger level associated
with virus attacks (L), as well as a significant and increasing probability of
a virus attack (P), mandates a high investment in antivirus technology. We now explore estimates of the numerical value of P × L (and thus of B) and obtain a quantitative estimate of the proportion of all virus strains avoidable by the Learned Hand efficient level of precaution. This proportion is a
direct estimate of the relative size of V*.
The ICSA survey reports that 92 of 300 respondents experienced at least
one incidence of a virus disaster over the one-year survey period, with an
average recovery cost of $100,000.[144] The survey states that the recovery
cost figure likely underestimates the true cost by a factor of seven or eight,
when considering direct as well as indirect costs.[145] An adjusted recovery
cost figure per disaster, therefore, in reality, may be closer to $700,000 to
$800,000. In addition to disasters, the survey data also show an average of
108 "ordinary" virus infections per month, per site.
If we take the recovery costs of a disaster to be $750,000 and 92/300 as
the probability that a particular site will experience a disaster in a given
year, then the ex ante expected annual monetary loss from a disaster is
$230,000. This is a conservative estimate. It assumes, for instance, that each
of the respondents who reported experiencing at least one disaster during
the survey year did experience only one disaster. It also does not include
the cost associated with ordinary infections (not disasters), which are much
more numerous than disasters and also capable of significant damage.
A conservative estimate of the annual expected harm to an institution
from virus attacks amounting to a disaster is $230,000. This corresponds
to the term P × L in the Learned Hand formula and has to be balanced
by the same amount of precaution, B, to avoid liability. How much protection does $230,000 buy? A recent competitive analysis of leading antivirus vendors shows that Symantec’s premium antivirus product, the Symantec AntiVirus Enterprise edition, is available at a fee of approximately
$700,000 for a four-year/5,000-seat license with premium support. A similar product, Sophos Corporate Connect Plus, is available for $156,250,
144. The survey defines a "virus disaster" as "25 or more PCs infected at the same time
with the same virus, or a virus incident causing significant damage or monetary loss to an
organization." ICSA LABS 9TH ANNUAL COMPUTER VIRUS PREVALENCE SURVEY 2003, supra
note 10, at 1.
145. Id. at 2.
under similar terms.[146] Both Symantec and Sophos are recipients of Virus Bulletin's 100 Percent Award. Products receive this award if they successfully detect all the strains in a database compiled by Virus Bulletin, suggesting that they are capable of detecting virtually all known strains.[147]
These products also contain heuristic algorithms that enable them to detect
more than 80 percent of unknown virus strains.
Assuming, conservatively, that the Sophos and Symantec products are
capable of preventing 80 percent of disasters,[148] then an investment of between $39,000 (Sophos) and $175,000 (Symantec) in antivirus precautions will prevent expected damage amounting to 0.8 × $230,000 = $184,000.
Both antivirus products are cost-effective, and therefore within the scope
of due care.
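The cost-benefit comparison can be sketched as follows. The annual figures of roughly $39,000 and $175,000 are assumed here to be the quoted four-year license fees divided by four; that annualization is an inference from the text, not stated in the survey.

```python
# Learned Hand check: a precaution is within due care if its cost B is
# less than the expected harm it avoids (here, 80% of the $230,000
# expected annual loss from virus disasters).
expected_annual_loss = 230_000
harm_avoided = 0.8 * expected_annual_loss        # $184,000

# Quoted four-year/5,000-seat license fees, annualized (assumed: fee / 4).
annual_cost = {"Sophos": 156_250 / 4, "Symantec": 700_000 / 4}

for vendor, B in annual_cost.items():
    verdict = "cost-effective" if B < harm_avoided else "not cost-effective"
    print(f"{vendor}: B = ${B:,.0f} vs. harm avoided ${harm_avoided:,.0f} -> {verdict}")
```

Both annualized costs fall below the $184,000 in expected harm avoided, which is why the text concludes that both products lie within the scope of due care.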
The detection of most viruses is not only technologically feasible but
also cost-effective. Most virus strains belong to V*. In fact, at least 80
percent, perhaps in excess of 90 percent, of all strains, known as well as
unknown, belong to V*. Having established that V* is large, we now argue
that V' is small.
(vii) V' Is Small--The diagram, below, represents the avoidable and unavoidable virus strains associated with a typical computing environment.
V* represents the avoidable set, as previously defined, and V' represents
the set of viruses that will actually be prevented.
[Diagram: nested sets. The outer set is all virus strains, V; within it lies V* (the avoidable set); within V* lies V' (the actually avoided set).]
V' is smaller than V*, because a rational, profit-maximizing defendant,
such as a software provider, has an economic incentive to fall short of the
146. Total Cost of Ownership: A Comparison of Anti-Virus Software, SOPHOS WHITE PAPER, available at http://www.sophos.com/link/reportcio.
147. DUNHAM, supra note 1, at 150-51 (Table 6.3).
148. The 80 percent figure is a conservative estimate. The technology we discuss is capable of detecting and eliminating at least 80 percent of unknown viruses and virtually 100 percent of known ones.
legal standard of due care, resulting in the transmission of some virus
strains in V*. The grey area, between V' and V*, represents the viruses that should be prevented, because they belong to V*, but will not be, because
of the precautionary lapse. The precautionary lapse is likely due to an
inadvertent compliance error.
c. Compliance Error
In order to understand the nature and origin of a compliance error, we
distinguish between durable and nondurable precautions against harm. A
durable precaution typically has a long service life, once it is installed.
Use of a durable precaution must usually be complemented by shorter-lived, nondurable precautions, which have to be repeated more frequently
than durable precautions. A medical example illustrates the distinction
between durable and nondurable precautions. A kidney dialysis machine
is a typical durable precaution. A dialysis machine has a long service life
once it is installed, but it cannot function properly without complementary nondurable precautions, such as regular monitoring of the hemodialytic solution.[149]
Antivirus precautions consist of a durable as well as a nondurable component. Durable precautions, such as a virus scanner and signature database, must be complemented by nondurable precautions, such as regularly updating and maintaining the signature database and monitoring the output of the scanner.[150] A "compliance error" is defined as a deviation from perfect compliance with the (Learned Hand) nondurable precaution rate.[151]
A compliance error is efficient, even though the courts equate it to negligence. A rational, profit-maximizing entity such as a commercial software
provider will systematically fail to comply with the legally required nondurable antivirus precaution rate.
(i) Compliance Error Is Rational--Results in the law and economics literature predict that there will be no negligent behavior under a negligence
rule of liability, in the absence of errors about legal standards, when precaution is not random and when private parties have identical precaution
costs.[152] It seems, therefore, that the frequent occurrence of negligence in
society must be explained in terms of nonuniform precaution costs, or
errors by courts and private parties about the relevant legal standards, or
that precaution has a random or stochastic component.
149. Mark F. Grady, Why Are People Negligent? Technology, Nondurable Precautions, and the Medical Malpractice Explosion, 82 NW. U. L. REV. 293, 299 (1988).
150. A scanner reads software code and searches for known virus patterns that match any
of the viral patterns in its database. See Section II.B, supra, for a review of virus detection
technologies.
151. Mark F. Grady, Res Ipsa Loquitur and Compliance Error, 142 U. PA. L. REV. 887 (1994).
152. Id. at 889-91.
Dean Mark Grady has argued that none of these theories explains the
prevalence of negligence entirely satisfactorily. Grady has proposed a theory according to which there is a pocket of strict liability within the negligence rule. According to the theory, a rational injurer may find an occasional precautionary lapse economically efficient and thus preferable to
perfectly consistent compliance with the legal standard of due care. The
frequency of such lapses will increase as the due care standard becomes
more burdensome. The occasional lapse is rational and profit maximizing,
as we argue below, but will nevertheless be classified as negligence by the
courts, because of the courts’ inability to distinguish between efficient and
inefficient lapses.
The level of investment in durable and nondurable antivirus precautions
required by negligence law is determined according to the Learned Hand
formula.[153] Scanners, for instance, come in a variety of degrees of sophistication (and cost), ranging from basic systems that detect only known
strains, to heuristic artificial intelligence-based systems capable of detecting polymorphic viruses and even unknown strains. The optimal Learned
Hand level of investment in scanning technology would be determined by
balancing the cost of acquiring and operating the technology against the
expected harm avoided. The optimal nondurable precaution level, such as
frequency of viral database updating, is determined similarly.
The courts require perfectly consistent compliance with the Learned
Hand precautions to avoid a finding of negligence. If, for instance, the
courts require a viral signature database to be updated twice daily, then
even one deviation, such as one skipped update over, say, a two-year period,
would be considered negligent.[154] When the courts apply the Learned
Hand formula to determine an efficient precaution level and rate, the calculation weighs the costs and benefits of the precaution each time it is performed but ignores the cost of consistently performing it over time. Consider a numerical example. Suppose the cost of a daily update is $10, and
the marginal benefit of the update is $11. Failure to perform even one such
update would be viewed as negligence by the courts. Over, say, 300 days,
153. See Section II.B, supra, on breach of duty.
154. In Kehoe v. Central Park Amusement Co., 52 F.2d 916 (3d Cir. 1931), an amusement park employee had to apply a brake to control the speed of the car each time the rollercoaster came around. When he failed to do so once, the car left the track. The court held that the compliance error by itself constituted negligence, i.e., the court required perfect compliance and considered anything less as negligence. Id. at 917 ("If the brake was not applied to check the speed as the car approached . . . it was clear negligence itself."). For other cases, see Grady, supra note 151, at 901. In Mackey v. Allen, 396 S.W.2d 55 (Ky. 1965), plaintiff opened a
"wrong" exterior door of a building and fell into a dark storage basement. The court held
the owner of the building liable for failing to lock the door. But see Myers v. Beem, 712 P.2d
1092 (Colo. Ct. App. 1985) (an action brought against an attorney for legal malpractice,
holding that lawyers are not required to be infallible).
the courts expect 300 updates, because each of those updates, by itself, is
Learned Hand efficient. However, the courts do not consider the cost of
consistency, i.e., of never forgetting or lapsing inadvertently. Human nature
is such that over a 300-day period, the person in charge of updating will
occasionally inadvertently fail to implement an update.
Human nature, being what it is, dictates that perfection is (perhaps infinitely) expensive.[155] Perfect consistency, i.e., ensuring that 300 updates
will actually be achieved over 300 days, would require additional measures,
such as installing a monitoring device alerting the operator to a lapse, or
perhaps additional human supervision, all of which are costly. Even assuming (heroically) that such measures would assure consistency, their cost may
nevertheless be prohibitive to a rational software provider. Suppose, for
instance, that such a measure would add an additional $2 to the cost of an
update. The marginal cost of an update ($12) is now more than the marginal benefit ($11). Hence, perfect consistency is not in society’s interest.
An occasional lapse is also reasonable from the viewpoint of the software
provider: The marginal cost of perfect consistency is greater than the marginal increase in liability exposure due to efficient negligence. The courts
nonetheless would consider such an efficient lapse to be negligence. Courts
act as if they ignore the additional cost of $2 to achieve perfect consistency.
Efficient lapses can be expected to become more likely and more frequent,
the more demanding and difficult the Learned Hand nondurable precaution rate, i.e., the more expensive perfect consistency becomes.[156]
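The numerical example above reduces to a simple marginal comparison; the $10, $11, and $2 figures are the hypothetical values from the text.

```python
# Why an occasional lapse is efficient: each update, taken alone, passes
# the Learned Hand cost-benefit test, but an update performed under a
# regime that guarantees perfect consistency does not.
marginal_benefit = 11    # expected harm avoided by one signature update
update_cost = 10         # cost of performing one update
consistency_cost = 2     # extra cost (monitoring, supervision) to guarantee no lapse

# Each individual update is Learned Hand efficient:
assert update_cost < marginal_benefit                       # $10 < $11

# But guaranteeing that no update is ever missed is not:
assert update_cost + consistency_cost > marginal_benefit    # $12 > $11

print("Each update is worth doing, but perfect consistency costs more than it saves.")
```

This is the pocket of strict liability the text describes: the courts count only the $10-versus-$11 comparison and treat any missed update as negligence, while the rational provider also counts the $2 cost of never lapsing.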
A major reason for the courts’ insistence on perfect compliance, in spite
of the inefficiency of such perfection, is that it is impossible or expensive
to determine whether any given deviation from perfect compliance is efficient. Who can judge, for instance, whether a software provider or website operator's mistake or momentary inattentiveness was an economic or
uneconomic lapse? Courts, therefore, do not acknowledge efficient noncompliance where it is difficult to distinguish between efficient and inefficient noncompliance.
155. See, e.g., PETERSON, supra note 40, at 194 ("Even under the best of circumstances, our
brains don’t function perfectly. We do forget. We can be fooled. We make mistakes. Although
complete failures rarely occur, neural systems often suffer local faults.").
156. The policy rationale behind the courts’ insistence on perfect compliance was expressed by Lord Denning in Froom v. Butcher, 3 All E.R. 520, 527 (C.A. 1975) ("The case
for wearing seat belts is so strong that I do not think the law can admit forgetfulness as an
excuse. If it were, everyone would say: ’Oh, I forgot.’"). Instead of incurring the considerable
measurement cost to distinguish between efficient and inefficient failures to comply, courts
simply equate any and all noncompliance to negligence. See also Grady, supra note 151, at
906; W. LANDES & R. POSNER, THE ECONOMIC STRUCTURE OF TORT LAW 73 (1987). Courts
tend to be forgiving, however, where the cost of ascertaining the efficiency of noncompliance
is low or zero. In cases where the deviation is demonstrably efficient or unavoidable, such as
an accident resulting from a defendant’s (provable) temporary physical incapacitation, courts
have not imposed liability. See, e.g., cases cited in Grady, supra note 151, at 887 n.26. See also
We argue that an efficient lapse, a compliance error, in antivirus precautions is particularly likely, due to the nature of the technology and economics of viruses and virus detection.
(ii) Virus Transmission Likely Involves Compliance Error--Negligence in
antivirus precautions can occur in two ways, namely durable precautions
below the Learned Hand level and compliance errors.
A formal economic analysis of compliance error in the context of virus
prevention has shown that a rational software provider will invest in durable antivirus precautions at the due care level required by negligence law.
However, the provider will invest in nondurable precautions at a level below the due care level. It is cheaper to the provider to spend less on nondurable precautions and risk liability exposure, rather than incurring the
even higher cost of achieving perfectly consistent compliance with the legally imposed due care standard.[157]
Rational agents therefore will not fail in durable precautions but will
likely commit compliance errors. Investing in durable precautions up to
the efficient Learned Hand level is profit-maximizing because such investment reduces the provider’s liability exposure by more than it costs. A
compliance error is efficient due to the high cost of perfect consistency,
hence, likewise profit-maximizing. Most negligent behavior on the part of
rational, profit-maximizing software and service providers, therefore, will
be the result of compliance errors.
We now argue that virus prevention technology is particularly susceptible to compliance error. Compliance error has a high likelihood where
precautions are characterized by a high durable level, complemented by
high levels and intense rates of nondurable precautions. These conditions
make it harder to achieve perfectly consistent compliance with the due care
standard and characterize virus prevention technology.
(iii) Antivirus Precautions Consist of Durable Precautions Complemented by a Significant Nondurable Component--Technical defenses against computer viruses consist of a durable precaution, complemented by essential nondurable precautions.[158] Durable antivirus precautions come in four main
categories, namely pattern scanners, activity monitors, integrity monitors,
Ballew v. Aiello, 422 S.W.2d 396 (Mo. Ct. App. 1967) (finding defendant not liable for negligence because he was half asleep at the time he was allegedly negligent); Grady, supra note
151, at 887 n.59 ("For faints and other slips, it is possible for courts to judge whether they
should have been avoided. Indeed, courts’ measurement of unusual slips reintroduces the
negligence component back into the negligence rule.").
157. See de Villiers, supra note 110 (mathematical analysis of compliance error in virus
context). See generally Grady, supra note 151 (seminal article on compliance error).
158. Cohen emphasizes the importance of nondurable precautions in an antiviral strategy:
"Suppose we want to protect our house from water damage. It doesn’t matter how good a
roof we buy ... We have to maintain the roof to keep the water out. It’s the same with
protecting information systems." COHEN, supra note 8, at 148.
and heuristic scanners.[159] The durable precautions are complemented by
nondurable precautions. An activity monitor, for instance, halts execution
or issues a warning when it senses viruslike behavior. This requires nondurable precautions in the form of human intervention, consisting of observation and interpretation of monitor alerts and an appropriate response.
Virus scanners operate by searching for virus patterns in executable code
and alerting the user when an observed pattern matches a virus signature
stored in a signature database. Nondurable precautions complementary to
a scanner include regular maintenance and updating of the virus signature
databases, monitoring scanner output, and responding to a pattern match.
An inadequately maintained signature database would reduce the effectiveness of a scanner, and virus alarms are worthless if ignored.
Several factors make compliance burdensome. Integrity checkers and
heuristic scanners produce fewer false negatives but far more false positives
than regular scanners. A large number of false positives make compliance
more burdensome and efficient lapses more likely. False positives tend to
diminish the effectiveness of the antivirus strategy, perhaps to the point of
undermining confidence in the precaution. If the probability of a false
alarm were high enough, it may be rational and efficient for a human
operator to ignore some alarms. An ignored alarm may turn out to be real
and result in the transmission of a virus. If the Learned Hand precautionary
level required attention to all alerts, the courts would view such a lapse as
negligence, even if the compliance error were efficient from the viewpoint
of the human operator.
Scanners require a frequently updated viral pattern database, as new viruses are discovered at a high rate.[160] By the Learned Hand formula, the
high danger rate associated with viral infection imposes a demanding nondurable precaution rate, such as a high database updating frequency and
diligent monitoring of and responding to all alarms, regardless of the frequency of prior false alarms. Some critical applications may require virtually continuous updating, incorporating new virus strains in real time, as
they are discovered.
159. See Section II.B, "Technical Antivirus Defenses," supra.
160. IBM’s High Integrity Computing Laboratory reported, for instance, that by June
1991, new signatures were added to their collection at the rate of 0.6 per day. By June 1994,
this rate had quadrupled to 2.4 per day and has since quadrupled yet again to more than 10
a day. Kephart et al., supra note 21, at 179-94. See also Steve R. White et al., Anatomy of a Commercial-Grade Immune System, IBM Thomas J. Watson Research Center research paper, available at http://www.av.ibm.com/ScientificPapers/White/Anatomy/anatomy.html (in the late 1990s, new viruses were discovered at the rate of eight to ten per day); DUNHAM, supra note 1, at xix ("[A]n estimated 5 to 10 new viruses are discovered daily, and this number is increasing over time."); Jennifer Sullivan, IBM Takes Macro Viruses to the Cleaners, WIRED NEWS (Dec. 4, 1997) ("It is estimated that 10 to 15 new Word macro viruses . . . are discovered each day.").
This discussion of antivirus precautions suggests that they consist of a
high durable component, complemented by high rates and intense levels
of nondurable precautions. The result is a high likelihood of a compliance
error. The higher and more intense the rate of precaution, the more burdensome, and hence more costly, perfect compliance becomes, and the greater the likelihood of a compliance error.[161]
d. Conclusion
Most virus strains are avoidable, which implies that most cases of virus
infection involve negligence. Furthermore, most cases of virus infection
governed by the negligence rule involve a compliance error. When a virus
penetrates a network and causes harm, failure to detect it in time is therefore likely due to a compliance error. Liability of the individual who exposed network users to the compliance error will likely be preserved under
the dependent compliance error paradigm.
This conclusion remains valid, by a preponderance of the evidence, even
in cases where the culprit virus cannot be reliably identified as avoidable
or unavoidable. Even when the virus is not identifiable,[162] it is likely avoidable and likely involves a compliance error.
2. Paradigms in Reasonable Foresight Doctrine
The reasonable foresight doctrine governs multiple risks cases. The doctrine includes five mutually exclusive paradigms, namely (i) minimal systematic relationship, (ii) reasonably foreseeable harm, (iii) reasonable ignorance of the relationship, (iv) correlated losses, and (v) adverse selection.~63
Under the minimal systematic relationship paradigm, an inadvertently
negligent tortfeasor would not be held liable for coincidental harm that
results from his or her negligence. To illustrate this paradigm, suppose a
hypothetical defendant negligently exceeds the speed limit and arrives at a
spot just in time to be struck by a falling tree. Although an injured passenger plaintiff may argue credibly that falling trees are foreseeable, the (coincidental) accident is likely outside the scope of risk created by the defendant’s speeding. The defendant’s speeding created risks of traffic accidents,
but it neither created the risk of the falling tree nor increased the probability of its occurrence. The accident was therefore not within the scope
of the risk created by the defendant’s conduct, and liability fails on proximate cause grounds. It is coincidental and not systematically related to the
defendant’s negligence.
161. See de Villiers, supra note 110, ¶¶ 8-14 (describing possible complications in identifying the exact virus strain responsible for certain harm).
162. Id.
163. Mark F. Grady, Proximate Cause Decoded, 50 UCLA L. REV. 293, 322 (2002).
Suppose, on the other hand, that the tree had fallen in front of the
speeding driver and the car crashed into it. If it can be shown that the
impact could have been avoided had the driver traveled at a reasonable
speed, then the speeding driver’s negligence may have been a proximate
cause of the accident. Failure to stop with a short reaction time is a foreseeable risk of, and systematically related to, speeding.’"~
The reasonably foreseeable harm paradigm, described as the default paradigm under the reasonable foresight doctrine, imposes liability where an
ex ante known systematic relationship exists between the defendant's negligence and the plaintiff's harm.[165] In O'Malley v. Laurel Line Bus Co.,[166] for
instance, the defendant’s bus driver let a passenger off in the middle of a
street, instead of at the regular bus stop. It was a dark and stormy night so
that the passenger did not realize where he was being let off. The court
held the defendant liable for injuries sustained when the passenger was
struck by a car. Letting people off in the middle of a street under such
conditions that they cannot ascertain the risks of dangerous traffic does
have a foreseeable systematic relationship to their being struck by a car.
Under the reasonable ignorance of the relationship paradigm, proximate
causality is broken when, even though ex post there is clearly a systematic
relationship between the defendant’s untaken precaution and the harm,
scientists would not have predicted the relationship ex ante. This paradigm
is particularly relevant in a virus context, where scientific and technological
state of the art evolves rapidly and often unpredictably.[167]
The issue of ex ante scientific knowledge is illustrated in the following
classic case, known as the "Wagon Mound."[168] A ship was anchored in Sydney Harbour. It negligently discharged oil into the water, but
there was no apparent fire hazard, because the oil was of a type that required extremely high heat to ignite. Some debris, with a piece of cotton
attached to it, floated on the water under the oil layer. The debris was
covered by the oil and invisible to any observer. A welder’s torch set off
sparks that struck the cotton. The cotton smoldered for a while and eventually acquired sufficient heat to ignite the oil, causing a fire that burned
down the dock. The dock owner sued the owner of the ship for damages
under a negligence theory.
The oil spill created several risks, including hazards associated with water
pollution and fire. The fire hazard was unforeseeable, because of the nature
!
164. Berry v. Borough of Sugar Notch, 191 Pa. 345 (1899); see also Grady, supra note 163,
at 324.
165. Grady, supra note 163, at 326.
166. 166 A. 868 (Pa. 1933).
167. See Section IV.B, "Breach and Actual Cause Satisfied, but Proximate Cause Failed," infra, for a discussion and example of the role of reasonable ignorance of the relationship in a virus context.
168. Overseas Tankship (U.K.), Ltd. v. Morts Dock & Eng'g Co., Ltd. (The Wagon Mound), [1961] A.C. 388 (P.C. 1961).
of the oil and the fact that the debris and cotton were out of sight. The
risk of pollution was foreseeable but did not cause the harm.
The court accepted the testimony of a distinguished scientist who testified that the defendants could not reasonably have foreseen that the particular kind of oil would be flammable when spread on water.[169] The Privy
Council therefore properly denied liability, and the suit failed on proximate
cause grounds, namely reasonable ex ante ignorance of the relationship
between defendant's untaken precaution and the harm.[170]
The correlated losses/moral hazard and adverse selection paradigms are
mainly of historical interest, although they are based on sound public policy
arguments that may be applicable in negligence cases.[171] The New York
fire rule, which only permits recovery by the owner of the first property
to which a fire spread, is a classic example of denial of liability under the
correlated losses paradigm.[172] The adverse selection paradigm denies liability where, due to a heterogeneity of risks, the plaintiff would have received a better insurance bargain than others.[173]
The final element of a negligence cause of action is actual damages, to
which we now turn.
169. Id. at 413 ("The raison d'etre of furnace oil is, of course, that it shall burn, but I find
the [appellants] did not know and could not reasonably be expected to have known that it was
capable of being set afire when spread on water.").
170. See also Doughty v. Turner Mfg. Co., [1964] 1 Q.B. 518 (C.A.), a case where proximate
causality also turned on the scientific state of the art. In Doughty, a worker negligently knocked
the cover of a vat containing molten sodium cyanide into the molten liquid in the vat. The
plaintiffs were injured when a chemical reaction between the molten sodium cyanide and the
cover, which was made of a combination of asbestos and cement known as sindayo, caused
an eruption that resulted in injuries to the plaintiffs. The risk that the cover might splash the
molten liquid onto someone was known and foreseeable, but the chemical reaction that actually caused the harm was unknown and unpredictable at the time of the accident. Scientists
later demonstrated that at sufficiently high temperatures the sindayo compound underwent
a chemical change that created steam, which in turn caused the eruption that injured the
plaintiff. None of this was known at the time of the accident. The court therefore held for
the defendant, reasoning that the defendant was reasonably ignorant of the chemical reaction that
caused the injuries. Id. at 520, 525. The defendant escaped liability under the reasonable
ignorance paradigm.
171. Grady, supra note 163, at 330-31.
172. See, e.g., Homac Corp. v. Sun Oil Co., 180 N.E. 172 (N.Y. 1932); Ryan v. N.Y. Cent.
R.R., 35 N.Y. 209 (1866) (Defendant negligently ignited its own woodshed, from which the
fire spread to the plaintiff's house. The court denied liability, reasoning that first-party insurance by homeowners would be more efficient than imposing unlimited liability on a defendant for mass fires caused by its own inadvertent negligence. Such liability would constitute
a "punishment quite beyond the offence committed." Id. at 216-17). The fire rule seems to
have been limited to New York. Other courts have allowed recovery even when fire spread
over great distances and over obstacles. See, e.g., Cox v. Pa. R.R., 71 A. 250 (N.J. 1908)
(recovery allowed for damage from fire that had spread beyond several buildings from its
origin before destroying the plaintiff’s building). Even in New York, the doctrine was not
always followed. See, e.g., Webb v. Rome, Watertown & Ogdensburgh R.R. Co., 49 N.Y. 420
(1872). Consistent with the "extent of harm" rule, it may apply to secondary victims of virus
infection. See also PROSSER & KEETON ON THE LAW OF TORTS, supra note 3, at 282-83 (Time
& Space).
173. Grady, supra note 163, at 331.
Tort Trial & Insurance Practice Law Journal, Fall 2004 (40:1)
E. Damages
Damage resulting from virus infection can be classified into two broad
categories: pre-infection and post-infection damages.174 Pre-infection damages include the cost of detecting, tracing, identifying, and removing a virus
before it enters the system or network. Typical expenses include personnel
and managerial expenditures associated with the implementation and maintenance of software designed to detect a virus automatically at the point of
entry as well as expenses for tracing the source of the virus, advising the
source, logging the incident, and communicating with the owner of the
system on which the incident occurred.
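The automatic point-of-entry detection described above can be sketched as a simple signature scan. This is a minimal illustration under assumed details, not a real product: the signature names and byte patterns are invented for the example.

```python
# Minimal sketch of automatic point-of-entry signature scanning. The
# signature names and byte patterns are invented for illustration.

SIGNATURE_DB = {
    "ExampleVirus.A": b"\xde\xad\xbe\xef",
    "ExampleVirus.B": b"\x90\x90\xcc\xcc",
}

def scan(data: bytes) -> list[str]:
    """Return the names of all known signatures found in the incoming data."""
    return [name for name, sig in SIGNATURE_DB.items() if sig in data]

incoming_file = b"MZ\x00\x00" + b"\xde\xad\xbe\xef" + b"\x00"
hits = scan(incoming_file)
if hits:
    # In practice this is where the incident would be logged, the source
    # traced and advised, and the system owner notified.
    print("infected:", hits)
```

The personnel and managerial expenses the text describes attach to everything around this loop: maintaining the signature database and acting on each hit.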
Post-infection damages can be classified into two main categories:
(i) impact of the presence of a virus on the computing environment, before
execution of the payload, and (ii) damage caused by execution of the payload.
Viruses modify the computing environment when they install their code
on a host program and overwrite or displace legitimate code. Partly overwritten systems programs may become dysfunctional. Corrupted boot sector code, for instance, may prevent an infected computer system from booting, and garbled spreadsheet formulas may make the program virtually
unusable. Theft of resources, such as clock cycles, may slow down processes
and, in the case of time-critical processes, cause them to behave unpredictably. Macro viruses, for instance, often disable menu options of Microsoft Word. Viral invasion of space in main memory and on the hard disk
may result in impaired performance and disablement of some programs,
including time-critical processes and resource-intensive software. In the
absence of virus detection software, these modifications are often unobservable until execution of the payload.175 These viral actions nevertheless
cause actual damage, by dissipating valuable computing resources and disabling or disrupting commercially valuable computer functions.
Virus attacks have effects beyond the money and other resources required to recover from the attacks. In a survey of organizational effects of
virus encounters, participants were asked about the organizational effects
of virus incidents on their company or working group. The following table
is a partial list of their greatest concerns, with the percentage of respondents reporting each effect.176
174. David Harley, Nine Tenths of the Iceberg, VIRUS BULL. 12 (Oct. 1999).
175. Id. at 13 ("General incompatibility/de-stabilization issues can manifest themselves in
several ways. System software/applications/utilities display unpredictable behavior due to conflicts with unauthorized memory-resident software. Symptoms include protection errors, parity errors, performance degradation, loss of access to volumes normally mounted and unavailability of data or applications.").
176. ICSA LABS 9TH ANNUAL COMPUTER VIRUS PREVALENCE SURVEY 2003, supra note 10,
at 13 (Table 9).
Response                     Percentage
Loss of productivity         76%
Unavailability of PC         67%
Corrupted files              58%
Loss of access to data       50%
Loss of data                 47%
Damage from execution of the virus payload comes in three categories:
loss of availability, integrity, and confidentiality of electronic information.177 Attacks on availability include renaming, deletion, and encryption
of files. Attacks on integrity include modification and corruption of data
and files, including garbling of spreadsheet formulas and destruction of
irreplaceable information. Attacks on confidentiality include security compromises, such as capturing and forwarding of passwords, e-mail addresses,
and other confidential files and information.
The ICSA 2003 survey on computer virus prevalence provides numerical
estimates of the effects of virus attacks. The survey defines a "virus disaster"
as "25 or more PCs infected at the same time with the same virus, or a
virus incident causing significant damage or monetary loss to an organization."178 Ninety-two participants in the survey reported disasters with
average server downtime of seventeen hours.179 Respondents also were
asked how many person-days were lost during the virus disaster that struck
their company. The median time for full recovery was eleven person-days,
and the average was twenty-four person-days. The average dollar cost per
disaster, including employee downtime, overtime to recover, data and information loss, lost opportunities, etc., was in excess of $99,000.180
Consequential, or secondary, damage is defined as (i) damage (both pre- and post-infection) due to secondary infection, namely damage to other
computer systems to which the virus spreads; (ii) damage due to an inappropriate response, such as unnecessarily destroying infected files that
could be cheaply disinfected and restored; (iii) psychological damage, such
as loss of employee morale and opportunities lost due to a sense of insecurity, bad publicity, and loss of reputation and credibility; (iv) the cost of
cleanup and disinfection, the cost of restoration of the computer system
and impaired data, and expenses related to upgrading computer security;
(v) legal risks, such as exposure to civil and criminal liability; and (vi) punitive
177. Harley, supra note 174, at 13.
178. ICSA LABS 9TH ANNUAL COMPUTER VIRUS PREVALENCE SURVEY 2003, supra note 10,
at 1.
179. Id. at 10.
180. Id. at 13.
action from parties with whom the victim had breached a contractual
agreement.181
Certain viruses attempt to conceal their presence on the computer system. Such concealment action may itself cause damage to the computing
environment, independently of any harmful effect from execution of a payload. A virus may, for instance, attempt to thwart attempts to track it down
by looking out for attempts to read the areas it occupies in memory and
crashing the system in order to shake its "pursuer."
No viruses have been known to cause direct damage to hardware (at least
at the time of writing), and losses are usually limited to destruction of data
and related direct and indirect costs. A virus may cause indirect physical
harm to hardware. Certain viruses are, for instance, capable of impairing
the operation of a computer by writing garbage to a computer chip. It is
often cheaper to repair the damage by discarding the entire motherboard
than to replace a soldered chip.182
A negligence theory of liability would be irrelevant if no damages were
recoverable. A doctrine in tort law, the so-called economic loss rule, appears to significantly limit recovery for damages caused by virus infection.
The doctrine denies a defendant’s liability for pure economic loss, namely
loss not based on physical harm to person or property. In a related article,
we argue that damages related to viral infection, including pure economic
losses such as data corruption, are likely to be recoverable, the economic
loss rule notwithstanding, because (i) a virus may cause physical harm due
to the malfunction of a computer system, in applications such as medical
systems and aviation; (ii) a minority of jurisdictions have relaxed the rule
against recovery for pure economic loss; and (iii) an increasing number,
perhaps a majority, of jurisdictions recognize electronic information as legally protected property.183
IV. LITIGATION COMPLICATIONS
The unique and dynamic nature of virus technology may complicate a
plaintiff’s litigation strategy. To succeed in a negligence action, the plaintiff
has to plead an untaken precaution that simultaneously satisfies the
181. HARLEY ET AL., supra note 18, at 97-100; DUNHAM, supra note 1, at 7 (a user who
receives a virus warning "may shut off the computer incorrectly, potentially damaging files,
the operating system, or even hardware components like the hard drive"). See also ICSA LABS
6TH ANNUAL COMPUTER VIRUS PREVALENCE SURVEY 2000, supra note 110, at 31 (Table 16) (22
percent of respondents named loss of user confidence as a significant effect of a virus
encounter).
182. HARLEY ET AL., supra note 18, at 100. See also Bissett & Shipton, supra note 113, at
899, 903 (describing the CIH virus, which overwrites memory, necessitating replacement of
the memory chip).
183. See de Villiers, supra note 110, § VI.B (economic loss rule).
requirements of breach of duty as well as actual and proximate cause. In other
words, the untaken precaution must be cost-effective and capable of preventing the harm if taken, and failure to take it must be reasonably related
to actual harm.
In a given case there may exist precautions that clearly satisfy at least
one, perhaps several, of the elements but no precaution that simultaneously
satisfies all the elements of a negligence action. Modifying the pleading
strategy by selecting an alternative precaution may fill the gap but leave
yet a different subset of elements unsatisfied.
Antivirus technology is varied and sophisticated, reflecting antivirus researchers’ response to the equally volatile and sophisticated nature of the
virus threat, and a plaintiff usually has a rich array of untaken precautions
to choose from. There may nevertheless, in many cases, exist no choice
that simultaneously satisfies all the elements necessary to build a negligence
case. Such a Catch-22 dilemma can, of course, arise in any negligence case,
but it is especially likely in virus cases, as we show in this section.184
A. Breach Satisfied but Actual Cause Failed
A plaintiff will prevail on the issue of breach if her pleaded untaken precaution is cost-effective. Breach can often be proved quite easily in a virus
context, by pleading a trivial precautionary lapse with negligible marginal
benefit but an even smaller cost, hence efficient. Suppose a software provider
who signs up for fifty-two signature database updates per year is offered
four free updates. The software provider opts not to use some or all of the
free updates. The marginal cost, therefore, of increasing the updating frequency from fifty-two to, say, fifty-three times per year is approximately
zero so that the fifty-third update is almost certainly efficient. However,
the more trivial the lapse, the harder it is, generally, to establish actual and
proximate causality. The fifty-third update, although efficient, is unlikely
to make a significant practical difference in computer security. Failure to
implement the fifty-third update will likely fail the but-for test of actual
causality of a virus attack.
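The breach analysis above is an application of the Learned Hand cost-benefit formula: an untaken precaution is a breach where its burden B is less than the expected harm P × L it would have averted. A minimal sketch, with invented numbers, shows why the free fifty-third update makes breach trivially easy to prove while a costlier precaution may not be:

```python
# Hand formula sketch: omitting a precaution is a breach of duty when its
# burden B is less than the expected harm P * L it would have averted.
# All numbers below are invented for illustration.

def is_breach(burden: float, p_harm: float, loss: float) -> bool:
    """B < P * L: the precaution was cost-effective, so omitting it is breach."""
    return burden < p_harm * loss

# Fifty-third signature update: it is free, so B ~ 0 and breach follows
# from even a tiny expected benefit.
assert is_breach(burden=0.0, p_harm=1e-6, loss=100_000.0)

# A costly precaution in an ill-suited environment may fail the same test,
# even if it would in fact have caught the virus (actual cause satisfied).
assert not is_breach(burden=500.0, p_harm=1e-6, loss=100_000.0)
```

Causation is analyzed separately: the efficient fifty-third update still fails the but-for test if it would not in fact have detected the virus.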
Although the fifty-third update will likely fail the but-for test, there is
ample scope for the plaintiff to rethink her pleading choice. The rich array
of available antivirus precautions virtually ensures the existence of an alternative precaution that would have prevented the virus, and therefore
satisfies actual causality. A generic technology, such as an activity monitor,
for instance, does not need an updated signature database to detect a novel
virus strain. The virus that the fifty-third update failed to detect would
therefore likely have been snared by an activity monitor. Failure to use an
184. Grady, supra note 61, at 139.
I
I
I
I
I
I
I
I
I
|
!
!
!
!
!
!
!
176
Tort Trial & Insurance Practice Law Journal, Fall 2004 (4&1)
activity monitor will be an actual cause of the virus infection. It may, however, not be cost-effective and hence fail the breach requirement.
Generic virus detectors, such as activity monitors, are very efficient in
certain computing environments and quite inefficient and resource-consuming in others. The particular environment in which the virus caused
the harm may be of the latter kind. The costs of the activity monitor may
outweigh its benefits, so that failure to use it does not constitute a breach
of duty, even though such failure is the actual cause of the virus harm.
Several factors may diminish the cost-effectiveness of an activity monitor
in a particular computing environment. Activity monitors do not perform
well with viruses that become activated before the monitor code and escape
detection until after they have executed and done their harm. Activity
monitors are also ineffective against viruses that are programmed to interfere with the operation of activity monitors. Certain virus strains, for instance, are programmed to sabotage the operation of activity monitors by
altering or corrupting monitor code. Some, but not all, machines and networks have protection against such modification. A further drawback of
activity monitors is that they can only detect viruses that are actually being
executed, which may be a significant detriment in sensitive applications
where a virus can wreak havoc before being caught by an activity monitor.
A further disadvantage of activity monitors is the lack of unambiguous
and foolproof rules governing what constitutes "suspicious" activity. This
may result in false positive alarms when legitimate activities resemble viruslike behavior and false negative alarms when illegitimate activity is not
recognized as such. The vulnerability of activity monitors to false alarms
makes them relatively costly.185 A high cost of dealing with false negatives
and positives may outweigh the benefit provided by activity monitors in a
particular environment. An activity monitor may therefore not be cost-effective because of any or all of these factors, even though it may have
been technically capable of detecting the culprit virus.
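The false-alarm problem described above can be illustrated with a toy rule-based monitor. The rule set and action names are hypothetical; real activity monitors apply far richer heuristics:

```python
# Toy rule-based activity monitor. The rules and action names below are
# hypothetical illustrations, not a real monitor's rule set.

SUSPICIOUS_ACTIONS = {"write_boot_sector", "patch_interrupt_table", "self_modify"}

def monitor_flags(actions: list[str]) -> bool:
    """Flag a program whose observed runtime actions match any rule."""
    return any(action in SUSPICIOUS_ACTIONS for action in actions)

# False positive: a legitimate disk utility also writes the boot sector.
assert monitor_flags(["write_boot_sector"])

# False negative: a virus whose actions match no rule goes unnoticed.
assert not monitor_flags(["read_file", "send_mail"])
```

The cost of triaging such false positives and the exposure left by false negatives are exactly the factors that can make the monitor fail the cost-benefit test above.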
B. Breach and Actual Cause Satisfied, but Proximate Cause Failed
The rapid and often unpredictable development of virus technology introduces an element of unforeseeability into the behavior of viruses. New virus
creations often have the explicit goal of making detection harder and more
expensive.186 Innovations, undoubtedly designed with this goal in mind,
185. The technology is programmed to make a judgment call as to what constitutes "suspicious behavior." There are, however, no clear and foolproof rules governing what constitutes
suspicious activity. False alarms may consequently occur when legitimate activities resemble
viruslike behavior. Recurrent false alarms may ultimately lead users to ignore warnings from
the monitor. Conversely, not all "illegitimate" activity may be recognized as such, leading to
false negatives.
186. See, e.g., Spinellis, supra note 31, at 280 ("Even early academic examples of viral code
were cleverly engineered to hinder the detection of the virus."). See also Ken L. Thompson,
Reflections on Trusting Trust, 27:8 COMM. ACM 761-63 (Aug. 1984).
include stealth viruses,187 polymorphic viruses, and metamorphic viruses.188
As a consequence, some virus strains are capable of transforming into a
shape and causing a type of harm very different from what was ex ante
foreseeable.
These and other unpredictable aspects of viruses may cause a negligence
action to fail on proximate cause grounds, where foreseeability is an issue.
In a particular virus incident, an ex post obvious systematic relationship
may exist between the evolved virus and the harm it has caused. If, however,
computer scientists could not ex ante foresee or predict this dynamic relationship, proximate cause may be broken and defendant’s liability cut off.
The following example illustrates this complication. Viruses can be
roughly divided into two groups: those with a destructive payload and those
without a payload, or with a relatively harmless payload, such as display of
a humorous message. For the purposes of this example, we refer to the two
types as "harmful" and "harmless" viruses, respectively.189
Suppose a hypothetical software provider decides not to scan for "harmless" viruses, perhaps to increase scanning speed and reduce costs, or because of a perceived low likelihood of exposure to liability and damages.
The provider purchases only signatures of new viruses that are known to
be harmful, at the time, for inclusion in his scanner database. The software
provider then sells a software product containing a harmless virus strain
that, by design, was not detected. This virus infects the computer network
of the purchaser of the infected program.
The virus happens to be a metamorphic virus,190 a type of virus capable
of mutating into a totally different virus species. In fact, it mutates into a
strain with a malicious payload capable of destroying data. The mutated
strain, now transformed into a harmful virus, erases the hard disk of its
host computer. The purchaser of the infected software contemplates a lawsuit against the vendor on a negligence theory.
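The vendor's scanning policy in this example can be sketched as follows. The byte patterns are invented; the point is only that a database deliberately limited to known-harmful strains passes the harmless strain by design:

```python
# Sketch of the hypothetical vendor's scanning policy: only signatures of
# strains known to be harmful are purchased, so a "harmless" strain passes
# the scan by design. All byte patterns are invented for illustration.

HARMFUL_SIGNATURES = {b"\x66\x6f\x6f"}   # harmless-strain signatures omitted
HARMLESS_SIGNATURE = b"\x62\x61\x72"     # known, but deliberately not scanned for

def passes_scan(data: bytes) -> bool:
    """True if the product ships: no known-harmful signature is present."""
    return not any(sig in data for sig in HARMFUL_SIGNATURES)

shipped_product = b"..." + HARMLESS_SIGNATURE + b"..."
assert passes_scan(shipped_product)   # the harmless strain ships undetected

# If the strain is metamorphic, it can later rewrite itself into a harmful
# form that no ex ante signature choice would have anticipated.
```

The ex post link between the untaken precaution (scanning for harmless viruses) and the harm depends on exactly this mutation, which, as the text argues, was unforeseeable ex ante.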
187. Stealth virus strains are designed to evade detection by assuming the appearance of
legitimate code when a scanner approaches. See, e.g., Kumar & Spafford, supra note 25; see
also DAVID FERBRACHE, A PATHOLOGY OF COMPUTER VIRUSES (1992), for a description of stealth
viruses.
188. Polymorphic viruses change their signature from infection to infection, making them
harder to detect. Metamorphic viruses are capable of changing not only their identity but
also their entire nature and function. See, e.g., Carey Nachenberg, Understanding and Managing Polymorphic Viruses, THE SYMANTEC ENTERPRISE PAPERS, Volume 30. See also Spinellis,
supra note 31, at 280 ("Viruses that employ these techniques, such as W32/Simile[,] can be
very difficult to identify.").
189. Bissett & Shipton, supra note 113, at 899, 903 ("Viruses may be classified as destructive
or nondestructive in their primary effect. The least destructive ... simply print a ... message
and then erase themselves .... Destructive effects include halting a legitimate program. More
destructive viruses erase or corrupt data or programs belonging to legitimate users of the
computer.").
190. Metamorphic viruses are capable of changing not only their identity but their very
nature. See, e.g., Nachenberg, supra note 188.
The plaintiff could easily prove breach of duty by arguing that the trivial
marginal cost to the software provider of scanning for "harmless" viruses
is outweighed by the foreseeable harm from such viruses in the form of
consumption of computing and personnel resources. Defendant, on the
other hand, could credibly argue that proximate causality should be broken
under the reasonable ignorance of the relationship paradigm. Although it
is clear after the incident that a systematic relationship existed between the
harm and the defendant’s untaken precaution (failure to scan for harmless
viruses), computer scientists were nevertheless unaware of this systematic
relationship ex ante. This systematic relationship originates from the ability
of harmless viruses to transform into harmful ones, which depends on the
existence and feasibility of metamorphic virus technology. This technology
was unknown ex ante, even to scientists.
C. Attempt to Fix Proximate Causality Fails Breach Test
The plaintiff in the foregoing metamorphic virus example may attempt to
fix the proximate causality problem by rethinking his pleaded untaken precaution. Once the first harmless virus has morphed into a destructive one,
the provider of the infected software can prevent further carnage by recalling all his previously sold software products and rescanning them for
all viruses, harmful as well as harmless. A plaintiff therefore may plead that
the defendant, once the infection came to his or her attention, could have
taken this action. Failure to recall will be the proximate cause of any further
(now foreseeable) harm from this type of virus, under the no intervening
tort paradigm, or perhaps the reasonably foreseeable harm paradigm. Failure to recall is, of course, also an actual cause of all further harm caused
by the virus.
The plaintiff nevertheless may still find him- or herself stuck in a legal
Catch-22. Empirical studies on the economic impact of product recall
strongly suggest that product recalls are very costly.191 In cases where human lives are not at stake, as is usually the case with ordinary commercial
software, product recall may very likely not be cost-effective and failing to
undertake it would not be a breach of duty. The plaintiff who pleads product recall as an untaken precaution will likely be able to prove actual and
proximate causality but, this time, fail on breach.
V. CONCLUSION
This article analyzes the elements of a negligence cause of action for inadvertent transmission of a computer virus. The analysis emphasizes the
191. See, e.g., Paul H. Rubin et al., Risky Products, Risky Stocks, 12 REGULATION 1, which
provides empirical evidence of the costs associated with a product recall and states that "[o]n
the basis of this research, we conclude that product recalls are very costly, resulting in large
drops in the stock prices of affected firms.... [T]he health and safety benefits to consumers
may not be worth the cost."
importance of an understanding of virus and virus detection technology, as
well as the economics of virus prevention, in negligence analysis. The classic principles of negligence apply to a virus case, but a plaintiff’s case may
be significantly complicated by the unique and dynamic nature of the technologies involved.
Downstream Liability for
Attack Relay and Amplification
[This article is an adaptation of a talk delivered at the RSA Conference 2002 in San Jose,
California. All contents Copyright 2002 - Carnegie Mellon University, Pennsylvania
State Police, and White Wolf Security.]
Disclaimer
Points of view or opinions expressed in this presentation do not necessarily represent the
official position or policies of the Pennsylvania State Police, Carnegie Mellon University,
White Wolf Security, or RSA.
Who are the authors?
• Scott C. Zimmerman, CISSP, is a Research Associate at the Software Engineering
Institute, Carnegie Mellon University.
• Ron Plesco, Esquire, is the Director of Policy for the Pennsylvania State Police.
• Tim Rosenberg, Esquire, is the President and CEO of White Wolf Security
(www.whitewolfsecurity.com).
The Scenario
To demonstrate the concepts involved, we will use a simple and hypothetical scenario in
which four distinct entities are involved:
1. The first entity is Jane G. Jane is a network security administrator in the United
Kingdom. She works for a company that does approximately US$200M in
business per year. Her yearly salary is US$55,000.
2. The second entity is MegaCorp's web server, a non-mission-critical machine
accessible from the Internet. MegaCorp is a US$10.4 billion/year public
company. The server is hosted internally, and is physically located at MegaCorp's
facility in Iowa. MegaCorp exercises complete control over all aspects of the web
server.
3. The third entity is a web server that belongs to a non-profit research hospital in
the state of Washington.
4. The last entity is Mr. Big Star, who receives medical treatment at the research
hospital.
While accessing the Internet at work, Jane finds a six-month-old vulnerability in
MegaCorp's web server. Exploiting this vulnerability, Jane is able to gain privileged
access to the system. From MegaCorp's system, Jane then discovers a month-old
vulnerability on the hospital system located in Washington state. She is able to exploit
this as well and gains privileged access to the hospital server. Once Jane is a privileged
user on the hospital’s system, she is able to penetrate more deeply into the hospital’s
network wherein she finds a database server containing sensitive patient records. While
browsing the database, Jane G. stumbles on Mr. Big Star’s file and decides to download a
copy.
Having finished her shift at work, Jane G. installs a Denial of Service attack tool on the
MegaCorp server. She begins an attack against the hospital’s web server to throw the
administrators off her trail. She goes home and posts Mr. B. Star’s file to a web site in
Canada and sends it to her friends on IRC.
The chain of entities looks like this:
Jane G.'s desktop -> MegaCorp's web server -> Hospital server in Washington
Parties Involved - Legal Issues
Before we can discuss the various legal theories under which the suits can be brought, we
must first articulate which parties are involved in the case. The plaintiff is the person or
entity that was harmed by the act and is seeking restitution. The defendant is the person
or entity accused of committing the act. In this scenario, potential plaintiffs include
• MegaCorp
• The hospital
• Mr. Big Star
Note that this is not an exhaustive list, as we are focusing on the specific group of directly
harmed individuals. Potential defendants include
• Jane G.
• MegaCorp
• The hospital
It may seem strange that the hospital, for example, may be both a plaintiff and a
defendant, but in this case the hospital may seek damages from MegaCorp, and Mr. Big
Star may seek damages from the hospital. Unfortunately, events such as these are akin to
a multiple vehicle accident. We are presented with a large number of parties who have
been harmed, none of whom is exactly sure what happened. What will happen in both the
multiple-site attack and the car accident is that all parties even remotely associated with the
incident will be listed as possible defendants and/or plaintiffs. Once the case lands in
court, it is up to the jury and the legal system to decide who did what to whom, who will
pay, and how much.
The Legal Theories
We now have a series of possible parties to the case. The next portion of the analysis is
identification of the legal theories under which the parties might be sued. This is a
difficult process as the law is a very specific creature. For the purposes of the next
section, we are going to focus on downstream liability. The crux of the downstream
liability issue is negligence. Negligence consists of four parts: duty, breach, causation,
and damages. We will approach each of these separately. Keep in mind that, in the real
world, separation of these items is extremely difficult as they are all closely linked
together.
Duty is simply defined as a prudent person’s obligation to use reasonable care. A more
detailed definition can be found in Prosser, Wade, and Schwartz’s Cases and Materials on
Torts: "requiring the actor to conform to a certain standard of conduct, for the protection
of others against unreasonable risks". To use an automotive analogy, a driver has the
duty to ensure her vehicle has fully functioning brakes and lights, good tread on the tires,
and so forth. Furthermore, the driver of the vehicle has the duty to operate her car with
reasonable care and not to drive recklessly. One of the most difficult aspects of showing
negligence is this: is there a clearly defined duty? In other words, regarding downstream
liability, does an owner of IT assets on the Internet have a duty to keep his systems secure
and not to allow them to be used to hurt another? We believe the answer to this question is a
resounding yes.
Assume for now that the duty exists; showing negligence means there must be a breach.
For a breach to occur, the plaintiff must show that the defendant failed to perform her
duty. In the worst case, the defendant did nothing at all to address network security
issues. In the less extreme case, the defendant could simply have failed to perform her
duty to the appropriate standard. Either will suffice to show a breach in the duty, as long
as the remainder of the requirements are met.
Causation means that the aforementioned breach caused the damages in the incident. In
this case, you will have to show what each of the parties did (or didn’t do) which led to
some real damages. It is imperative for the plaintiff to directly link the breach in duty to
very specific damages, and show that the damages would not have been incurred
but for the breach.
In order for damages to be awarded, something has to be harmed. Damages are broken
down into three types:
• Nominal - just enough to say 'you won'
• Compensatory - repayment for actual and real damages
• Punitive - an amount above compensatory to punish the defendant and make an
example so as to deter similar conduct in the future
In our scenario, disclosure of Mr. Big Star’s medical condition leads to termination of
contract negotiations for a US$15M lead role. This dollar figure defines the damages
caused to Mr. Big Star by Jane G. through MegaCorp and the hospital. In some cases,
the damages may not be as visible. Revenue lost through a disabled e-commerce site can
be quantified, but what about loss of consumer trust?
What role does Jane G.'s employer play in the event? Her employer provided the
computer and Internet connection used to perpetrate the act. The legal world has created a
theory of vicarious liability in this case, known as Respondeat Superior. Under this
theory, the harmed plaintiffs may be able to sue Jane’s employer for compensation. This
is beneficial from the plaintiff’s perspective as the employer typically has more financial
resources than the employee. Under the theory of Respondeat Superior, an employer
could be held vicariously liable for its employee’s actions:
• Where an employee is acting within the scope of employment and doing
something in the furtherance of his work; and
• The employer is or should be exercising some control; then
• The employer will be liable for the negligent acts of the employee.
Jane G. is a network security administrator, and she conducted the attacks while at work,
using her employer’s resources. If her employer has published policies in place, and
enforces them regularly, it will be difficult to hold Jane's employer vicariously liable. To
make this determination, one will have to look at the employer's practices and internal
policies.
Jane G.’s employer may also have been negligent in its hiring practices (though we did
not directly address Jane’s background or character). If an employer hires a network
security administrator who has a questionable background, one of two things probably
happened:
• The employer did not conduct a thorough background check.
• The employer did conduct a background check but ignored the findings.
A similar situation would be that of a doctor who has committed malpractice at - and was
dismissed from - his last three positions. Hospital #4 hires him without conducting a
thorough background check, and the doctor commits malpractice yet again. The hospital
would then be liable for negligent hiring.
Keep in mind that negligence, vicarious liability, and negligent hiring all assume that a
duty exists. Herein lies the difficulty: what is the due standard of care in a given
situation? What are the accepted best practices? What, exactly, should MegaCorp have
done to avoid being used as a conduit to the hospital intrusion? In general the duty is
defined as the actions taken by "a reasonable and prudent person". Unfortunately this
definition provides a wide range of possibilities: one person’s "reasonable" and
"prudent" is another person’s "overkill" and yet another person’s "insufficient". The
problem often becomes the need to discover what these terms mean in a given trade or
industry. However, a caveat applies: the tendency of an industry to be generally
negligent in its practices does not mean that the court will - or should - use these practices
as the de facto standard. Since our scenario deals with network security, the focus areas
here will be architecture, patches, and personnel.
Architecture
One of the most widely-deployed network security measures is the firewall. In broad
terms, this is a system that resides between the corporate network and the rest of the
Internet, filtering traffic according to its configuration. Ten to fifteen years ago, firewalls
were strange and almost unheard-of beasts. However, times have changed, and any
organization that does not protect its network with a firewall is likely to be greeted with
incredulity and dismay.
The Distributed Denial-of-Service attacks that affected prominent web sites in 2000 and
2001 contained thousands upon thousands of spoofed packets. Spoofed packets can be
generated by freely available software tools, and contain an invalid or incorrect source
address; the source address is not important as the flooding is meant to be a one-way
communication. The DDoS attacks were made possible by the almost nonexistent use of
egress filtering by network-connected entities. Egress filtering is a simple concept:
examine packets as they leave the corporate network to ensure no inappropriate or
malicious traffic escapes into the world. For example, spoofed packets should not be
allowed to leave the network because they do not bear a valid source address.
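To make the concept concrete, the filtering decision at the heart of egress filtering can be sketched in a few lines of Python. The address block and function name below are hypothetical, chosen purely for illustration (203.0.113.0/24 is a reserved documentation prefix, not any real organization's network):

```python
import ipaddress

# Hypothetical address block assigned to the organization.
# 203.0.113.0/24 is a documentation prefix, used here for illustration only.
OUR_PREFIX = ipaddress.ip_network("203.0.113.0/24")

def permit_outbound(source_address: str) -> bool:
    """Egress filter: pass an outbound packet only if its source address is ours.

    A spoofed packet, by definition, bears a source address outside the
    organization's own block, so it is dropped at the network border.
    """
    return ipaddress.ip_address(source_address) in OUR_PREFIX

# A packet originating inside the network is allowed out...
print(permit_outbound("203.0.113.42"))   # True
# ...while a packet with a spoofed source address never leaves the perimeter.
print(permit_outbound("198.51.100.7"))   # False
```

In practice this same rule is expressed as a static entry on a border router or firewall rather than in application code, but the decision it encodes is exactly this simple.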
We would argue that an organization which owns/operates a connection to the Internet
and does not filter traffic is already in breach of its duty to protect its assets from misuse
and abuse. The first two elements of a negligent cause of action have been met. All that
is missing is a hacker to come in and use the organization’s resources to hurt another.
That incident will provide the causation and damages.
Patches
As Mr. Bruce Schneier has stated, the cycle of developing buggy software and then
rushing to develop patches does not work. However, until the software development
process becomes as rigorous and precise as, for example, engine manufacturing, the patch
treadmill is the best the industry can offer. Working within this constraint, there is a
great deal of debate over the process of obtaining and installing necessary patches for
applications and operating systems. On one side are the proponents who feel that all
patches should be applied immediately. On the other side are those who cite any number
of patches in recent years that fixed one problem but created three more, and so they feel
that patching should be deferred until the patch is deemed safe and stable. Regardless of
which side of the ’patch war’ you take, installing patches is one of the best things an
organization can do to protect itself against automated attacks.
Personnel
The personnel issue is a sticky - and expensive - wicket for most organizations. System
and network administrators are often overworked because their employers cannot or will
not hire additional personnel. In this situation, the system administrators must prioritize
their tasks, and simply keeping everything running may fill 100% of their time. How
many system administrators are enough? There is no clear formula like "one SA for
every fifty accountants", so the needs and structure of the organization must be used to
determine a suitable staffing level. In most cases, however, having only one person to
cover any particular task is not a good idea: if only one person is on staff, what if he
becomes ill or goes on vacation? Has the organization made arrangements to provide
coverage for this employee’s duties? Beyond the number of personnel, the roles of the
individuals are quite important. Can any named defendant identify who exactly is
responsible for security? Is this role documented?
This brings us to the topic of due diligence. In the area of network security, as
everywhere else, due diligence is not a fixed point: it is a sliding scale. There is no
magical line separating negligent from responsible, where an incremental move in a
certain direction will cause a state change. Here are some clear-cut examples to
demonstrate both sides:
• Negligent: a default operating system installation, with no firewall or patches, on
a T1
• Responsible: a hardened operating system with post-installation changes, behind
a robust firewall
Scott’s Assessment of Due Diligence
This section is so named because the position taken in this section is Scott’s; he is not
speaking for any other personnel or organizations.
This section currently applies only to businesses, although it may eventually apply to
individuals. It defines a minimum standard of conduct for a very important reason:
placing a system on the Internet, where it can potentially affect the systems of others,
entails a certain level of organizational responsibility.
Due Diligence Statement 1 of 2
Installation of security-related patches, when potential exists to harm a third party:
These patches should be installed no later than ten (10) calendar days after release
of the patch by the vendor.
Many individuals will think that this interval is too short or (probably) far too short.
(There is at least one person who thinks it is too long.) Many of the reasons given for this
include the fact that there are simply not enough personnel to handle the work. However,
going back to the issue of organizational responsibility, the owner of the network has a
duty to make sure the network is as safe as it can reasonably be made. This duty includes
having access to the resources - i.e. personnel and equipment - needed to test and apply
patches in a timely fashion.
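The ten-day window above reduces to simple date arithmetic. A minimal sketch, assuming only the hypothetical policy stated in the due diligence statement (the function name and the dates in the examples are ours, for illustration):

```python
from datetime import date, timedelta

# Hypothetical policy window drawn from the due diligence statement above.
PATCH_WINDOW = timedelta(days=10)

def patch_overdue(released: date, today: date) -> bool:
    """True if more than ten calendar days have passed since vendor release."""
    return today - released > PATCH_WINDOW

# Released June 1, checked June 9: still within the window.
print(patch_overdue(date(2005, 6, 1), date(2005, 6, 9)))    # False
# Checked June 12: eleven days out, so the policy has been breached.
print(patch_overdue(date(2005, 6, 1), date(2005, 6, 12)))   # True
```

A real patch-management process would of course track many patches against many systems, but the compliance test for each one is exactly this comparison.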
Due Diligence Statement 2 of 2
Egress filtering should be enabled on the network perimeter.
As mentioned earlier, there is no legitimate business purpose for spoofed packets, and a
simple set of rules on the firewall or border router can block this traffic before it affects
someone else. These rules could likely remain static and still do the job, which is as
close as anything can get to "set it and forget it" in this arena.
This article has covered negligence and due diligence, but what happens if an
organization is negligent? The results of negligence can vary widely:
• No incident occurs - business as usual
• Mild incident occurs - inconvenience
• Serious incident occurs - substantial financial damage
• Most serious incident occurs - life is lost
The DDoS attacks would be classified by most folks as a serious incident; eBay, CNN,
and Yahoo! would almost certainly agree. However, a broad application of egress
filtering could have mitigated the damage.
What about sites with sensitive information?
The value of information is generally subjective. If a company's trade secret - plans for
a new and improved Super-Widget, for example - were stolen or corrupted, the company
would have a difficult time quantifying the amount of loss: no one can predict exactly
how much money would have been made through the sales of the new product.
What about sites with large amounts of bandwidth available?
Sites with large amounts of available bandwidth - or "big pipes" - are often targets of
attacks because the fast network connection can facilitate a number of nefarious
activities. The potential for damage can be more easily reckoned in this case: an OC-3
can flood a T-1, but not vice-versa. One may argue the point that sites with big pipes
have a slightly greater responsibility to secure their networks, similar to the way that a
tractor-trailer driver needs to pay more attention to the function and condition of his
brakes than a person on a bicycle: if the tractor-trailer goes out of control, the potential
for damage is much greater.
What about sites that offer Service Level Agreements (SLA)?
Any reasonable SLA must account for the fact that the systems require maintenance.
One way around downtime is to have a load-balancing cluster of machines, and take
down one at a time to install patches and so forth. The choice here is either to allocate a
small amount of time for maintenance now, or to allocate a potentially much larger
amount of time later when something untoward happens, be it an intrusion or a software
bug that corrupts database tables.
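The rolling-maintenance approach described above can be sketched as follows. The node names and the drain/patch/restore steps are illustrative stand-ins, not a real load balancer API; the point is simply that at most one node is ever out of rotation, so the service never goes fully dark:

```python
# Hypothetical rolling-maintenance loop for a load-balanced cluster:
# take one node out of rotation at a time, patch it, and restore it.

cluster = ["web-1", "web-2", "web-3"]   # illustrative node names
in_rotation = set(cluster)              # nodes currently serving traffic
patched = []

for node in cluster:
    in_rotation.remove(node)            # drain: stop sending traffic here
    # The SLA invariant: at most one node is ever out of service.
    assert len(in_rotation) >= len(cluster) - 1
    patched.append(node)                # stand-in for the actual patch step
    in_rotation.add(node)               # return the node to service

print(patched)           # ['web-1', 'web-2', 'web-3']
print(len(in_rotation))  # 3 -- full cluster back in rotation
```

The trade-off the text describes is visible here: the loop costs a little planned capacity now, in exchange for avoiding a much larger unplanned outage later.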
Back to the Group Presentation
Questions to ponder:
• Should the plaintiffs go after the ISPs? Why or why not?
• Does anything change if Jane G.'s employer is an ISP?
• Evaluate the potential for damages; how much prevention could this amount have
purchased?
Conclusion
Case law is just starting on these issues; to date no far-reaching precedents have been set.
Most organizations will want to avoid being on either side of such a landmark case.
Please use this article to speak to your in-house counsel or other legal professional in
order to dedicate more resources to the cause.
August 24, 2001
Can Hacking Victims Be Held Legally Liable?
By CARL S. KAPLAN
Suppose, Margaret Jane Radin of Stanford Law School wrote recently, that
a Web site operated by a securities brokerage suffers a crippling attack by
hackers. The ability of its customers to conduct trades is hampered for several
hours, or even blocked entirely. Imagine, too, that on the day of the attack the
stock market is volatile, and that many customers are trying unsuccessfully to
buy or sell stocks in a flash.
Of course, hackers are easy to blame. But what about the companies that
investors rely on to make trades? Are the brokerage firms and their network
providers -- which failed to prevent the attack that harmed the site -- vulnerable
to a second onslaught: a nasty lawsuit from unhappy clients who lost money as a
result of the shutdown?
Professor Radin isn't the only legal thinker posing this question. Another paper,
co-authored by two partners and a legal assistant at a major law firm, also
considers whether companies that fail to take reasonable steps to protect their
computer systems from malicious attacks or internal malfunctions are sitting
ducks for lawsuits.
So far, lawyers say, the answer is unclear. There have been no reported court
decisions discussing the issue of a company’s liability for a hacker attack,
according to Radin, an authority on intellectual property, electronic commerce
and Internet law. But lawsuits in the near future are highly likely, she said.
In her paper, professor Radin examined the possible legal fallout from a
"distributed denial of service" attack. This is a particularly troublesome form of
digital mischief whereby hackers gain control of unsuspecting users’ computers
and use those distributed machines to flood a targeted site or service with junk
messages, overwhelming the site and causing it to be inaccessible to legitimate
customers. (Her study, "Distributed Denial of Service Attacks: Who Pays?"
commissioned by Mazu Networks, Inc., a Cambridge, Mass.-based security
company, is available on the company’s site.)
Radin concluded that there is a "significant risk" that in the near future targeted
Web sites will be held liable to their customers for harm arising from
distributed denial of service attacks. In addition, she reckoned that there is
another "significant risk" that the computer network companies that carry the
hackers’ attack messages -- such as ISPs and backbone network providers -- will
be held accountable to the targeted Web sites, and perhaps to the sites’
customers.
http://www.nytimes.com/2001/08/24/technology/24CYBERLAW.html?ex=1118548800&... 6/10/2005
In the second paper, members of the cyberlaw practice group of Sidley Austin
Brown & Wood, a national law firm, considered the growing legal danger faced
by online service providers who suffer security breaches or the internal glitches
that can compromise their customers' information.
The study, "Liability for Computer Glitches and Online Security Lapses," by
Alan Charles Raul and Frank R. Volpe, partners at the firm, and Gabriel S.
Meyer, a summer associate and J.D. candidate at Cornell University, was
published earlier this month in a Bureau of National Affairs newsletter on
electronic-commerce and will be available shortly on the firm’s Web site. It
concludes that e-commerce players must "demonstrate [a] willingness and
ability to implement aggressive security measures" if they wish to stave off
security breaches, avoid government intervention and escape, or at least limit,
damages in a lawsuit.
Professor Radin, director of Stanford’s Program on Law, Science and
Technology, said in a telephone interview that companies need to begin taking
seriously their potential legal liability for computer hacks. The vulnerability of
businesses to distributed denial of service assaults is staggering, she said, citing
a survey which found that more than one-third of respondents had experienced
denial of service attacks. That figure, from the 2001 Computer Crime and
Security Survey, conducted by the San Francisco-based Computer Security
Institute, may be the tip of the iceberg because companies, fearful of bad
publicity, often under-report attacks. Direct losses from denial of service attacks
on Yahoo, eBay and others in February of last year have been estimated at $1.2
billion by the Yankee Group, a consulting company.
"E-commerce is not going to take off if customers fear it won’t work in a
pinch," Radin said.
Moreover, said Radin, federal and state laws aimed at individual hackers have
shortcomings: Hackers are hard to trace and even when detected, are unlikely to
have the deep pockets coveted by victims and their lawyers.
In the brokerage Web site attack scenario, a customer or a class of customers
that suffered financial losses would sue the brokerage firm for damages,
according to Radin. The firm, in its defense, might point to a section of its
Terms of Service agreement with its customers. That fine print, no doubt,
would have a clause clearing itself of liability.
But whether that defense would prevail is not clear, said Radin, particularly if a
court finds the contract’s terms to be oppressive or overly weighted toward the
company, or if the contract’s validity is in question due to questions over proper
customer consent.
Also vulnerable to a negligence claim would be the network service providers
and hosting companies, said Radin. There would be no contract defense for
these companies to fall back on with respect to the broker’s individual
customers for the simple reason that there is no contract between them. On the
other hand, the potential legal warfare between the brokerage and the network
providers would likely proceed under the terms of their business contracts.
To determine whether the corporate defendants are negligent, courts will look at
how any losses could have been prevented. "A court is going to say it is
negligent of you not to implement preventative measures if they are reasonably
effective and affordable," said Radin.
A jury will have to decide, in fact, if the company could have taken
preventative measures, said Radin. Trials will, therefore, be a battle of expert
witnesses, she predicted. But, she added: "I think as technology increases-- as
easy fixes become available -- it’s more likely that courts will be
unsympathetic" to companies that have not done their utmost to block hacker
invasions. That is particularly true with respect to the Internet service providers
which are in the best position to take system-wide precautions, she said.
Meanwhile, Raul of Sidley Austin, which represents major communication
companies and firms doing business online, said that his clients "either are, or
ought to be" worried about their legal liability for malicious hacks or
inadvertent glitches.
In his firm’s paper, Raul and his colleagues said that companies can seek to
manage their legal risks by adopting state-of-the-art security measures
suggested by industry groups and supporting federal laws aimed at
strengthening data security in the health and financial fields.
"Does a company have controls in place to prevent unauthorized access and
careless release of data," asked Raul. "Is the company training employees in
information security?" Is it constantly assessing its vulnerability to intrusions or
glitches? The answers are important because an aggressive plaintiff's lawyer is
sure to ask: who was the person or unit responsible for data security? If the
defendant offers a weak response, said Raul, it will look "really bad."
Copyright 2002 The New York Times Company
PRINCIPAL OFFICE: PHILADELPHIA
1900 Market Street
Philadelphia, PA 19103-3508
Tel: 215.665.2000 or 800.523.2900
Fax: 215.665.2013
For general information please contact:
Joseph A. Gerber, Esq.
LAS VEGAS*
601 South Rancho, Suite 20
Las Vegas, NV 89106
Tel: 800.782.3366
Contact: Joseph Goldberg, Esq.
*Affiliated with the law offices of J. Goldberg
and D. Grossman.
SEATTLE
Suite 5200, Washington Mutual Tower
1201 Third Avenue
Seattle, WA 98101-3071
Tel: 206.340.1000 or 800.423.1950
Fax: 206.621.8783
Contact: Daniel Theveny, Esq.
ATLANTA
Suite 2200, SunTrust Plaza
303 Peachtree Street, NE
Atlanta, GA 30308-3264
Tel: 404.572.2000 or 800.890.1393
Fax: 404.572.2199
Contact: Samuel S. Woodhouse, III, Esq.
LOS ANGELES
Suite 2850, 777 South Figueroa Street
Los Angeles, CA 90017-5800
Tel: 213.892.7900 or 800.563.1027
Fax: 213.892.7999
Contact: Mark S. Roth, Esq.
TRENTON
144-B West State Street
Trenton, NJ 08608
Tel: 609.989.8620
Contact: Jeffrey L. Nash, Esq.
CHARLOTTE
Suite 2100, 301 South College Street
One Wachovia Center
Charlotte, NC 28202-6037
Tel: 704.376.3400 or 800.762.3575
Fax: 704.334.3351
Contact: Jay M. Goldstein, Esq.
LONDON
9th Floor, Fountain House, 130 Fenchurch Street
London, UK
EC3M 5DJ
Tel: 011.44.20.7864.2000
Fax: 011.44.20.7864.2013
Contact: Richard F. Allen, Esq.
CHERRY HILL
Suite 300, LibertyView
457 Haddonfield Road, P.O. Box 5459
Cherry Hill, NJ 08002-2220
Tel: 856.910.5000 or 800.989.0499
Fax: 856.910.5075
Contact: Thomas McKay, III, Esq.
NEW YORK
45 Broadway Atrium, Suite 1600
New York, NY 10006-3792
Tel: 212.509.9400 or 800.437.7040
Fax: 212.509.9492
Contact: Michael J. Sommi, Esq.
CHICAGO
Suite 1500, 222 South Riverside Plaza
Chicago, IL 60606-6000
Tel: 312.382.3100 or 877.992.6036
Fax: 312.382.8910
Contact: James I. Tarman, Esq.
909 Third Avenue
New York, NY 10022
Tel: 212.509.9400 or 800.437.7040
Fax: 212.207.4938
Contact: Michael J. Sommi, Esq.
DALLAS
2300 Bank One Center, 1717 Main Street
Dallas, TX 75201-7335
Tel: 214.462.3000 or 800.448.1207
Fax: 214.462.3299
Contact: Lawrence T. Bowman, Esq.
NEWARK
Suite 1900, One Newark Center
1085 Raymond Boulevard
Newark, NJ 07102-5211
Tel: 973.286.1200 or 888.200.9521
Fax: 973.242.2121
Contact: Kevin M. Haas, Esq.
DENVER
707 17th Street, Suite 3100
Denver, CO 80202
Tel: 720.479.3900 or 877.467.0305
Fax: 720.479.3890
Contact: Brad W. Breslau, Esq.
SAN DIEGO
Suite 1610, 501 West Broadway
San Diego, CA 92101-3536
Tel: 619.234.1700 or 800.782.3366
Fax: 619.234.7831
Contact: Joann Setleck, Esq.
HOUSTON
One Houston Center
1221 McKinney, Suite 2900
Houston, TX 77010
Tel.: 832.214.3900 or 800.448.8502
Fax: 832.214.3905
Contact: Joseph A. Ziemianski, Esq.
SAN FRANCISCO
Suite 2400, 425 California Street
San Francisco, CA 94104-2215
Tel: 415.617.6100 or 800.818.0165
Fax: 415.617.6101
Contact: Forrest Booth, Esq.
TORONTO
One Queen Street East
Suite 2000
Toronto, Ontario M5C 2W5
Tel: 416.361.3200 or 888.727.9948
Fax: 416.361.1405
Contact: Sheila McKinlay, Esq.
WASHINGTON, DC
Suite 500, 1667 K Street, NW
Washington, DC 20006-1605
Tel: 202.912.4800 or 800.540.1355
Fax: 202.912.4830
Contact: Barry Boss, Esq.
WEST CONSHOHOCKEN
Suite 400, 200 Four Falls Corporate Center
P.O. Box 800
West Conshohocken, PA 19428-0800
Tel: 610.941.5400 or 800.379.0695
Fax: 610.941.0711
Contact: Ross Weiss, Esq.
WICHITA
New England Financial Building
8415 E. 21st Street North, Suite 220
Wichita, KS 67206-2909
Tel: 316.609.3380 or 866.698.0073
Fax: 316.634.3837
Contact: Kenneth R. Lang, Esq.
WILMINGTON
Suite 1400, Chase Manhattan Centre
1201 North Market Street
Wilmington, DE 19801-1147
Tel: 302.295.2000 or 888.207.2440
Fax: 302.295.2013
Contact: Mark E. Felger, Esq.
PLEASE CONTACT ANY OF OUR OFFICES FOR ADDITIONAL INFORMATION
OR VISIT US ONLINE AT WWW.COZEN.COM