Document 6570792
Transcription
AAFS at IAFS: Future Directions in Forensic Science
Thursday, October 16, 2014

Session I

8:25 a.m. - 8:30 a.m.: Introduction (Daniel A. Martell, PhD, President, American Academy of Forensic Sciences)

Forensic Science and Emerging Legal Policy
Moderator: Daniel A. Martell, PhD

8:30 a.m. - 8:45 a.m.: Forensic Policy in the U.S. (Victor W. Weedn, MD, JD*)
8:45 a.m. - 9:15 a.m.: Delivery of Forensic Sciences in North America: Two Conceptual Models (Douglas Lucas, MSc, DSc*, Barry A.J. Fisher, MS, MBA*)
9:15 a.m. - 9:30 a.m.: The Future Relationship of Law and Forensic Science: A Collaborative Model (Carol Henderson, JD*)
9:30 a.m. - 9:45 a.m.: Error Reporting: Replacing Blame with Solutions (Steven D. Benjamin, JD*, Betty Layne DesPortes, JD, MS*)
9:45 a.m. - 10:00 a.m.: An Attorney's View of Bite Mark Admissibility in United States Courts (Stuart A. Caplan, DDS, MS*, Howard Kaufman, MD, JD, MPH)
10:00 a.m. - 10:15 a.m.: The NAS Report: A Path Forward for Strengthening Forensic Science in the World? (Duarte N. Vieira, PhD, MD*)

Terrorism and Crimes Against Humanity

10:15 a.m. - 10:30 a.m.: Evaluating a Bosnian War Criminal for the World Court in the Hague (Daniel A. Martell, PhD*)
10:30 a.m. - 10:45 a.m.: Break

Advances in Crime Scene Investigation

10:45 a.m. - 11:00 a.m.: Crime Scene Reconstruction (Henry C. Lee, PhD*)
11:00 a.m. - 11:15 a.m.: The Examination of Biological Forensic Evidence on Exhibits and the Scientific Method (Jane Moira Taupin, MA*)
11:15 a.m. - 11:30 a.m.: Likelihood Ratios in Sub-Optimal DNA Profiles (Jane Moira Taupin, MA*)
11:30 a.m. - 11:45 a.m.: Cold Season Forensic Entomology and Bacteriology (Lavinia Iancu, MS*, Cristina Purcarea, PhD)
11:45 a.m. - 12:00 p.m.: An Unusual Case of Commotio Cordis Resulting from a Side-Impact Airbag Deployment (Michael D. Freeman, MedDr, PhD, MPH, Paul Cahn, MD, MS*)
12:00 p.m. - 12:15 p.m.: Injuries Arising From Glass Drinking Vessels Used in Stabbing and Slashing Attacks (S.V. Hainsworth, PhD*, R. Pitchford, R.W. Earp, S.J. Hamilton, G.N. Rutty, MD)
12:15 p.m. - 2:00 p.m.: Lunch

Multidisciplinary Perspectives in Forensic Identification
Moderator: Betty Layne DesPortes, JD, MS

2:00 p.m. - 2:15 p.m.: Fingernail Biometric Identification (Henry C. Lee, PhD*)
2:15 p.m. - 2:30 p.m.: Discriminant Function Analysis as Applied to Mandibular Metrics and Morphology to Assess Population Affinity in Asia (Gregory E. Berg, PhD*, Jennie R.R. Jin, PhD)
2:30 p.m. - 2:45 p.m.: Altered Age Estimations in Populations With Primary IGF1-D (Joan Fox, DDS*)
2:45 p.m. - 3:00 p.m.: Application of Stable Isotope Forensics for Determining Geographic Origin of Unknown Human Remains from Asia (Gregory E. Berg, PhD*, Eric J. Bartelink, PhD, Lee Suhwan, MA)
3:00 p.m. - 3:15 p.m.: Reliability and Confidence of Fingerprint Features Selection (Shiquan Liu, MS*, Luo Yaping, Glenn M. Langenburg, MS, PhD)
3:15 p.m. - 3:30 p.m.: Bitemark Analysis in Hungary as a Result of Aligned Education, Cooperative Learning, and International Collaboration in Forensic Dentistry (Armin A. Farid, DDS*)
3:30 p.m. - 3:45 p.m.: The Impact of Modified Extraction Methods on the Recovery of DNA From Skeletonized Remains Returned From the DPRK – Is There Regional Variability? (S.M. Edson, MS, S.R. Ah Sam, MS*)
3:45 p.m. - 4:00 p.m.: Break

Multidisciplinary Perspectives in Post-Mortem Interval Estimation

4:00 p.m. - 4:15 p.m.: The Effect of Soft Tissue on Exposure Temperature Prediction from Burnt Bone (Sarah Ellingham, MS*, Tim Thompson, PhD, Meez Islam, PhD, Gillian Taylor)
4:15 p.m. - 4:30 p.m.: Assessing DNA Quality, Quantity, and Inhibition Using a Highly Sensitive Multiplex Quantification System for Forensic Samples (Jesse Ramirez, BS, Gina Pineda, MS, Anne Montgomery, MS, Robyn Thompson, MS, Sudhir Sinha, PhD, Ryan Yee, BS, Zach Goecker, BS, Steven Lee, PhD*)

Expert Witnesses

4:30 p.m. - 4:45 p.m.: A Reality Show: You are Going to be an Expert Witness (Haskell M. Pitluck, JD*)
4:45 p.m. - 5:00 p.m.: A Little Lesson in Logic (Thomas W. Young, MD*)

Session II

Advances in Forensic Behavioral Science
Moderator: Victor W. Weedn, MD, JD

2:00 p.m. - 2:15 p.m.: Global Perspectives on Contemporaneous Testamentary Capacity Evaluations (Daniel A. Martell, PhD*)
2:15 p.m. - 2:30 p.m.: Forensic Linguistics (Carole E. Chaski, PhD*)
2:30 p.m. - 2:45 p.m.: Diminishing the Death Penalty (John L. Young, MD*)
2:45 p.m. - 3:00 p.m.: Detecting Malingering with the Autobiographical Implicit Association Test (Laura Muscatello*, Anabella Alice Pozzoli)

Advances in Forensic Toxicology

3:00 p.m. - 3:15 p.m.: Technologic Advances on Chemical Identification Standards (Victor W. Weedn, MD, JD*)
3:15 p.m. - 3:30 p.m.: Trends in Licit and Illicit Drug-Related Deaths in Florida 2001 to 2012 (Dayong Lee, PhD*, Chris Delcher, MS, Mildred M. Maldonado-Molina, PhD, Lindsay A. Bazydlo, PhD, Bruce A. Goldberger, PhD)
3:30 p.m. - 3:45 p.m.: Break

Advances in Digital and Multimedia Evidence

3:45 p.m. - 4:00 p.m.: Challenges and Opportunities in Forensic Multimedia Evidence (Zeno J. Geradts, PhD*)
4:00 p.m. - 4:15 p.m.: The Application of Specialized Photography (Michael E. Gorn, MS*)
4:15 p.m. - 4:30 p.m.: A Preliminary Study on the Individuality of Monochromic Laser Printers (Ning Liu, MA*, George Chiu, Chuntao Chen, Daozhong Lv)

BEGIN SESSION I

Forensic Science Policy in the United States
Victor W. Weedn, MD, JD
George Washington University, Washington, DC, United States
Calls for forensic science reforms culminated in the 2009 National Academy of Sciences (NAS) Report, Strengthening Forensic Science in the United States: A Path Forward. The executive and legislative branches of the U.S. federal government have responded. The Senate Judiciary Committee has been working on legislation with the input of the forensic science community for the past few years and, at the time of this writing, is anticipated to introduce a bill that would establish a new Office of Forensic Science (OFS) within the Department of Justice. A Forensic Science Board (FSB) within the OFS would oversee a set of forensic science discipline-specific committees. If passed, this would constitute the first general regulation of forensic science in the U.S. Meanwhile, the White House established a Subcommittee on Forensic Science that oversaw a series of Interagency Working Groups. After significant delay, the products of these groups are being published. A National Commission on Forensic Science was established and first met in February 2014. This body is thought to be a temporary policy study group. The National Institute of Standards and Technology (NIST), which is in the U.S. Department of Commerce, has proposed an Organization of Scientific Area Committees (OSAC) to replace most Scientific Working Groups, which had previously been funded by the Federal Bureau of Investigation but no longer are. A Forensic Science Standards Board (FSSB) will oversee Scientific Area Committees (SACs) that will in turn oversee sets of subcommittees. NIST intends that the OSAC Code of Practice will be established and accepted as a national consensus standard.
However, NIST hopes to hand off the OSAC to a professional association in three to five years so that it would truly be a creature of the forensic science community, but funding in the out years appears to be problematic. Regardless, it is clear there is great motion and change afoot in forensic science policy in the U.S. Keywords: FORENSIC SCIENCE REFORMS; OFFICE OF FORENSIC SCIENCE; FORENSIC SCIENCE BOARD

Delivery of Forensic Sciences in North America: Two Conceptual Models
Douglas Lucas, MSc, DSc1, Barry Fisher, MS, MBA2; IAFS Past Presidents
1Burlington, Ontario, Canada; 2Indio, CA, United States
In 2009, the National Academy of Sciences (NAS) in the United States issued a report about the state of the forensic sciences in the United States following a wide-ranging, three-year study by a specially appointed committee.1 One of the issues studied was "The Structure and Operation of Forensic Science Laboratories," for which the committee found "great disparities among existing operations," "extreme disaggregation," "the fragmented nature of the forensic science community," and the need to "minimize the current fragmentation and inconsistent practices" and to "upgrade the systems and organizational structures." One of the NAS recommendations was "…removing all public laboratories and facilities from the administrative control of law enforcement agencies or prosecutors' offices." Following a meeting convened by the Laura and John Arnold Foundation of Houston, Texas, to discuss projects that a private foundation might sponsor to attempt to make "transformative change" in the forensic sciences in the United States, one of the proposals was the conceptualization of an "ideal" forensic science delivery system that would be capable of providing uniformly high-quality services in all the required forensic disciplines to all appropriate clients in a specific jurisdiction and at a realistic cost.1 A small group of highly experienced forensic scientists has attempted to do this with only limited success; "limited" in part because of a lack of consensus on whether a law enforcement agency is the appropriate place for such a system to reside. This session will consist of two presentations that will discuss some of the issues considered. One will present the arguments in support of a law enforcement agency as the proper residence, and the other will propose some alternatives. Reference: 1. Laura and John Arnold Foundation. www.arnoldfoundation.org Keywords: NAS REPORT; FORENSIC SCIENCE; LAW ENFORCEMENT AGENCY

The Future Relationship of Law and Forensic Science: A Collaborative Model
Carol Henderson, JD
Stetson University, Gulfport, FL, United States
The relationship between law and science was aptly described by Harvard professor Sheila Jasanoff as both an essential alliance and a reluctant embrace. Much has changed in forensic science and law and their relationship since she published that statement in Science at the Bar in 1995. There has been a long-standing adversarial relationship between forensic science and the law. This presentation will discuss why a collaborative model is advantageous to both disciplines and the justice system.
In the United States, there has been some movement toward a more collaborative model between the science, technology, and law disciplines as evidenced by the work of the American Bar Association’s Section of Science & Technology Law, the American Academy of Forensic Sciences, the White House Office of Science and Technology Subcommittee on Forensic Science, the National Commission on Forensic Sciences, the National Institute of Standards and Technology’s (NIST) Organization of Scientific Area Committees, and other governmental and professional associations. There is also much movement in this area by international associations and governmental agencies worldwide. While all these efforts are moving forward, we cannot ignore the fact that we are all facing dwindling resources and pleas for more practical solutions to ongoing problems. The nexus of science and technology is shifting where innovation and invention take place. Law has a role in this interplay. For example, while great advances have been made in the science of digital evidence, the admissibility of the many forms of such evidence is still being debated. The role of the law may come into play when there is a new application of well-known techniques, such as the expertise of forensic scientists being sought by those who are not the usual consumers of such services. An example can be seen in today’s art community, which increasingly looks to science to solve questions of authenticity and provenance. There is a continuing need for greater communication, interdisciplinary training, and collaborative research between the legal and scientific disciplines. A greater effort to share knowledge and deliver accurate information quickly and efficiently to both communities must be made. Our collective knowledge must be organized in ways that are useful to practical and theoretical scholars and which allow access to many. Innovation is required; creativity needs to be combined with analytical rigor to move our fields forward in the laboratory as well as the courtroom to ensure justice. Industry must also be involved. We are certain to have continuing challenges in the fields of science and law, but these challenges can be more readily overcome by a collaborative, not adversarial, approach. Keywords: LAW; FORENSIC SCIENCE; KNOWLEDGE DELIVERY Error Reporting: Replacing Blame With Solutions Steven D. Benjamin, JD, Betty Layne DesPortes, JD, MS Benjamin & DesPortes, PC, Richmond, VA, United States This presentation will discuss the goals and types of error-reporting systems that can be used in forensic science laboratories. Discussion will focus on whether the “culture of safety” dynamic supporting such error-reporting systems in other fields can be transformed into a “culture of science” dynamic to support error reporting systems in the forensic science community. The obstacles to using the nonpunitive systems (which provide anonymity for reporters) in which results are introduced in legal proceedings will also be discussed. The forensic science community is united in its commitment to perform good science and to get correct results. We accept that error occurs; all humans make mistakes. However, we strive to remove error from our procedures to improve forensic science services because accurate results are needed. In many other professional fields, the trend has been toward nonfault or nonpunitive error reporting to maximize data collection on errors. These fields include aviation, air traffic control, and various medical services. 
The stated goal in those fields has been to promote a “culture of safety” that seeks to increase reporting to prevent recurrences of errors and to generate information that could expose greater dangers. These systems shield reporters from any adverse or punitive consequences for reporting the mistakes and for making the mistakes. The systems apply to unintentional errors only; intentional errors can still lead to adverse consequences. Safety advocates have long touted the benefits of nonpunitive systems to increase information that will lead to systematic improvements in procedures and the overall quality of services. With the provision of forensic science services, maximizing the information about errors would seem to be a worthwhile endeavor. However, is implementing nonpunitive reporting systems the right step for forensic science facilities and can a nonpunitive reporting system work in forensic science laboratories? Keywords: ERROR REPORTING SYSTEMS; NONPUNITIVE SYSTEM; CULTURE OF SAFETY An Attorney’s View of Bitemark Admissibility in United States Courts 1 Stuart A. Caplan, DDS1, Howard Kaufman, MD2 University of Tennessee School of Dentistry, Memphis, TN, United States; 2Boca Raton, FL, United States Admissibility of bitemark (BM) evidence requires an understanding of the rules of evidence. In the United States, issues of fact are determined in federal district courts by judges and in lower courts by juries or judges. Admissibility of scientific evidence about facts is governed by the Federal Rules of Evidence and by the Supreme Court case of Daubert and its progeny. Most states follow federal law. Trial courts allow testimony of scientific experts to explicate scientific issues. Such experts are presumed to have unique understanding of scientific facts; therefore, not only can they testify to such facts but also express their opinion on issues such facts raise. However, cases are based on disagreement between experts for each side. Reasons for such disagreements vary. The experts may interpret or question facts differently, have unconscious personal biases, construct interpretations to suit the lawyers that employ them, or the issue may not be settled. The question is how judges and juries can resolve such disagreements. The initial problem is what evidence to allow into the discussion. In the Supreme Court case of Daubert (1993), the Court established the current standard for expert evidence to be admitted. It must be “relevant” and it must be “reliable,” that is, based on “scientific knowledge.” Rule 702, amended after these cases, now provides that testimony must be helpful and must: (1) be based on sufficient facts or data; (2) be the product of reliable principles and methods; and, (3) apply the principles and methods to the facts. Judges are the “gatekeepers” who determine who is an expert, that to which they can testify, and what testimony is relevant and accurate. The problem is that, in general, judges do not have scientific training. Judges are permitted to obtain their own experts; however, this essentially means this expert, rather than the judge, acts as the gatekeeper, and this is not appropriate. Many innovations have been proposed. Judges could be trained in science. Such judges could be dispersed among courts to hear cases with scientific issues, or special “science courts” could be established for cases with scientific issues. Courts could compile a list of experts for parties to use. Penalties could be imposed for improper testimony. 
Already, experts are being expelled from professional organizations for improper testimony, which compromises their credibility, although some have sued for reinstatement. BM evidence is problematic since skin can be deformed or even torn during the bite, and secondary changes such as inflammation and fibrosis may distort the marks. However, BM evidence can, at times, be reliable. BM evidence consistently revealed the unique tooth arrangement of Ted Bundy and confirmed his identification. Arch width measurement may indicate whether the biter was an adult or a child. It is necessary to establish protocols for gathering and analyzing BM evidence and the circumstances in which BM evidence will satisfy the standards for admissibility. Keywords: BITEMARK EVIDENCE; DAUBERT; EXPERT EVIDENCE

The NAS Report: A Path Forward for Strengthening Forensic Science in the World?
Duarte N. Vieira, PhD, MD
Full Professor of Forensic Medicine and Forensic Sciences, University of Coimbra; President of the European Council of Legal Medicine; Chairman of the Scientific Advisory Board of the Prosecutor of the International Criminal Court; Coimbra, Portugal
Nobody today doubts the enormous potential of the forensic sciences and their fundamental importance for the proper administration of justice. Nobody will contest that, however good the laws, the courts, or the quality of a country's magistrates, justice can reach levels of excellence only if it can count on forensic services that provide highly qualified scientific support for judicial decisions. The report of the U.S. National Academy of Sciences, published in 2009 and entitled "Strengthening Forensic Science in the United States: A Path Forward," provided valuable insights and enumerated a set of recommendations that could serve as the basis for substantial forensic development and improvement in that country. Although aimed at the American reality, many of the points made in this report, as well as most of its recommendations, have universal application. Many of them are fully applicable in other geographic areas, indeed on all continents. Interestingly, however, the NAS report remains unknown to many forensic practitioners around the world, and even to many of those chiefly responsible for public forensic expert services in many countries. It is undisputed that very significant steps have been taken in many regions of the world to improve the quality of forensic expert services. But it is equally indisputable that much, very much, remains to be done. The world is a mosaic of different realities, with enormous differences at several levels: economic, social, cultural, political, legal, geographical, and so on. These different realities also exist at the level of medico-legal and forensic systems and organizations in many countries, and sometimes even within the same country. This presentation addresses the need to envisage the adoption of various recommendations of the NAS Report on a global scale and how doing so can contribute to a real improvement in the administration of justice worldwide. Keywords: NAS REPORT; FORENSIC SYSTEMS; FORENSIC STRENGTHENING

Evaluating a Bosnian War Criminal for the World Court in the Hague
Daniel A. Martell, PhD
Park Dietz & Associates, Newport Beach, CA, United States
This presentation sets forth a first-hand account of an assignment from the United Nations' International Court of Justice in The Hague to evaluate the competency of a Bosnian war criminal to participate in an appeal of his case.
The history and role of the International Criminal Tribunal for the former Yugoslavia (ICTY) in this context will also be described. An orientation to the 1995 “ethnic cleansing” and genocide in the town of Srebrenica in Bosnia and Herzegovina will be provided. This mass murder involved the killing of more than 8,000 Bosnian Muslim men and boys by members of the Army of the Serbian Republic and the mass expulsion of another 25,000–30,000 Bosnian Muslim civilians. In addition, according to the ICTY indictment, victims endured unlawful confinement, murder, rape, sexual assault, torture, beating, robbery, and inhumane treatment; the targeting of political leaders, intellectuals, and professionals; the unlawful deportation and transfer of civilians; the unlawful shelling of civilians; the unlawful appropriation and plunder of real and personal property; the destruction of homes and businesses; and the destruction of places of worship. The procedural history of the instant case will be discussed, with a focus on the international criminal-legal mental health standards for competency to pursue appeals in this context. In addition, the issues leading to the referral for a forensic neuropsychological examination in the case will be discussed. This will include a review of the relevant a priori medical and forensic psychiatric findings in the case and the issues that arose therefrom, driving the need for this assignment. The logistics of arranging travel, accommodations, security, the examination location, and coordination with international attorneys and United Nations (UN) representatives will be presented, including what happens when one’s luggage gets lost in Serbia. Insights with regard to maintaining objectivity and professional neutrality in such emotionally charged circumstances will also be shared. Specific challenges encountered during the assignment will be explored, including: (1) cross-cultural issues in conducting the forensic neuropsychological examination; (2) addressing reliability and validity concerns through selection of culture-free test instruments and measures; (3) issues of literal language translation; and, (4) interdisciplinary consultation with experts from other countries. Findings from the examination will then be summarized. This will include personal observations and a phenomenological account of the examination itself, as well as a discussion of how the results from the neuropsychological test battery were directly related to the issues of trial competency before the Court. Finally, the process of report writing and communication of findings to the Appeals Chamber of the International Criminal Tribunal for the former Yugoslavia will be discussed, together with the Appeals Chamber’s decision and the outcome of the case. Keywords: THE HAGUE; BOSNIAN WAR CRIMINAL; COMPETENCY EVALUATION Crime Scene Reconstruction Henry C. Lee, PhD Forensic Research & Training Center, West Haven, CT, United States Contemporary law enforcement has greatly expanded its ability to solve crimes by the adoption of new surveillance techniques and forensic procedures. Today’s crimes are most often solved by analysis of image recordings, digital evidence, and forensic evidence. The work of forensic scientists is not only crucial in criminal investigation, but also vital for civil litigation, major man-made or natural disasters, and the investigation of global crime. 
However, the success of analysis of forensic evidence is based on a system that focuses on teamwork, advanced investigative skills, and the ability to process a crime scene properly, recognizing, collecting, and preserving all relevant physical evidence and information. Crime scene investigation is much more than just processing or documentation of crime scenes, nor is it just the collection or packaging of the physical evidence. It is the first and most crucial step of any forensic investigation. The foundation of all forensic investigations is based on the ability of the crime scene investigator or forensic scientist to recognize the potential physical evidence, large or small, visible or latent, exculpatory or inculpatory, at the crime scene. The subsequent identification of the physical evidence along with determination of the possible source or origin of the evidence, that is, its individualization, are the next steps in the forensic investigatory process. Proper crime scene investigation is the starting point for the process of establishing what has happened, when and where it happened, who is involved and why, and how it occurred. Of course, careful processing, documentation, and collection of physical evidence are integral parts of the “investigation process” and crime scene reconstruction. If the potential physical evidence was not recognized, collected, or properly preserved, the forensic value of this piece of evidence is lost. Despite available current crime scene technologies, specialized equipment, and personnel, the integrity of the forensic services system and the effective utilization of physical evidence in crime solving are only as good as the integrity of the crime scene investigator and the objective legal system that supports those functions. Routine and high-profile cases will be used to demonstrate the methods and techniques in crime scene reconstruction. Keywords: CRIME SCENE RECONSTRUCTION; FORENSIC PROCEDURES; FORENSIC INVESTIGATION The Examination of Biological Forensic Evidence on Exhibits and the Scientific Method Jane Moira Taupin, MA Forensic Consultant, Melbourne, Australia This presentation will describe how physical evidence on exhibits associated with a criminal case involves the analysis and consideration of multiple factors. Attendees will understand that the evaluation of body fluid stains and biological trace matter is more than DNA analysis. The attendee will understand that an examination of an exhibit is not a “screening” process but one that depends on the scientific method. The introduction of sensitive DNA profiling techniques has meant that DNA profiles can be obtained from items where there is no visible staining and no biological matter can be associated with the DNA. The proposition of direct or indirect transfer of the DNA may be problematic. Information may be lost when the context is lost because items are simply swabbed or scraped to collect the trace debris. Rigorous sampling decisions ensure that any subsequent testing is relevant and useful. At this point of examination, many important decisions are made that may have an impact on any later analyses or conclusions. The examination of an exhibit should proceed through multiple hypotheses and alternative explanations. Different agencies may be involved at each stage of the analysis and there is a potential for fragmentation of the results. The problems inherent in a cohesive narrative of scientific results will be discussed. 
Keywords: BIOLOGICAL EVIDENCE; EXHIBITS; SCIENTIFIC METHOD

Likelihood Ratios in Sub-Optimal DNA Profiles
Jane Moira Taupin, MA
Forensic Consultant, Melbourne, Australia
The goal of this presentation is to provoke discussion on the necessary caveats and measures required when using the likelihood ratio as the weight of evidence in sub-optimal DNA profiles. It will present a perspective on the likelihood ratio as viewed recently in criminal cases and in the literature. The discovery that merely touching an object may result in a DNA profile has revolutionized the way investigations are conducted and the way biological evidence is interpreted in criminal cases. This discovery has also led to a desire to obtain information from ever-smaller amounts of sample and to attempt to determine a contributor profile from an unresolved DNA mixture from two or more people. Statistical analyses using the likelihood ratio, a measure of the weight of the evidence, have recently been implemented in the court system in Australia. Very large likelihood ratios, on the order of billions, have been presented as evidence for low-level and/or unresolved mixture DNA profiles. A likelihood ratio may have different meanings and very different values in different contexts. The hypotheses proposed in likelihood ratios for mixture profiles evaluate scenarios as to whether or not the person of interest is a contributor. This is a different concept from the weight of evidence obtained from single-source, optimal DNA profiles, where source probability is the aim. Problems and potential solutions will be discussed. Keywords: FORENSIC DNA PROFILES; LIKELIHOOD RATIO; SUB-OPTIMAL DNA
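As background for the discussion above, the likelihood ratio for DNA evidence is conventionally defined as the probability of the observed profile under two competing propositions; the notation below is the standard textbook form rather than anything specific to this presentation.

```latex
% Likelihood ratio for DNA evidence E, in the conventional notation:
%   H_p: the person of interest is a contributor to the profile
%   H_d: the person of interest is not a contributor
\[
  \mathrm{LR} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}
\]
% Values far above 1 support H_p and values far below 1 support H_d. For
% low-level or unresolved mixtures, Pr(E | H) must additionally model allelic
% drop-out, drop-in, and the assumed number of contributors, which is one
% reason the same numerical LR can carry different meanings in different contexts.
```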
Cold Season Forensic Entomology and Bacteriology Experimental Model on Romanian Urban Territory
Lavinia Iancu, MS1, Cristina Purcarea, PhD2
1Grigore Antipa National Museum of Natural History, Romania; 2Institute of Biology of the Romanian Academy, Bucharest, Romania
Although forensic entomology is not used as a valid method for postmortem interval establishment in Romania, this study attempted to acquire area-specific data on the dynamics of insect species in this country, using a forensic experimental model. Moreover, forensic microbiology, a relatively novel discipline worldwide, has gathered only scarce information on bacterial dynamics during the decomposition process. Therefore, the succession pattern of bacterial communities from the carcass tissues was investigated, providing novel insight for the future development of a complementary tool in criminal investigations. In this context, the study strove to acquire pioneering forensic data for the specific Romanian range by identifying the succession of necrophagous insect species, combined with complementary data on microbial diversity dynamics within animal carcasses, in order to define entomological and microbial targets in the decomposition process. The study focused on the characterization of the chronological succession of necrophagous insect species, and their stages of development, on pig carcasses exposed in an urban natural environment (Bucharest, Romania), and on the composition of the bacterial communities inhabiting the carcasses' colon and mouth cavities, in correlation with climate conditions and decomposition stages. The forensic experiment, carried out in triplicate, was monitored for up to seven months during the cold and the beginning of the warm period (November 2012–May 2013), with all meteorological parameters being continuously recorded. The baits were placed on the ground and protected from vertebrate scavengers. The visiting necrophagous insect species were collected, including both adults and immature stages, and taxonomically identified. Tissue samples were harvested weekly from internal (4–8cm) and external sections of the pigs' colons and mouths. Total bacterial genomic DNA was extracted from each sample, and bacterial 16S-rDNA fragments were amplified by Polymerase Chain Reaction (PCR). Bacterial diversity was investigated by Denaturing Gradient Gel Electrophoresis (DGGE) analysis and sequencing of DNA fragments extracted from the electrophoresis gel. The results showed an accelerated activity of necrophagous insect species in the warm period. Insect diversity was high, and the time-course of appearance of new species and their development stages correlated with carcass decomposition stages and meteorological parameters. Bacterial 16S-rDNA DGGE analysis indicated the presence of a high number of Operational Taxonomic Units (OTUs), with a higher diversity in the mouth than in the colon cavity, and different times of appearance of various phyla. During the cold period, the number and representation of bacterial species was constant in both cavities. After a ten-week exposure, new bacterial species appeared in the colon cavity, while in the mouth cavity this diversity occurred after six weeks, a phenomenon associated with temperature increase. The novelty of this study consists of: (1) identifying area-specific necrophagous insect succession for forensic investigations on Romanian territory; (2) determining bait-embedded bacterial diversity in a decomposing carcass during a cold period; and, (3) the first complementary data on bacterial and necrophagous insect dynamics for postmortem interval estimation. Keywords: FORENSICS; COLD SEASON; INSECTS; BACTERIA; ROMANIA

An Unusual Case of Commotio Cordis Resulting From a Side-Impact Airbag Deployment
Paul Cahn, MD, MS1, Michael D. Freeman, MD, PhD, MPH2
1Forensic Research & Analysis, Portland, OR, United States; 2Umeå University, Umeå, Sweden; Oregon Health & Science University, Portland, OR, United States; and Aarhus University, Aarhus N, Denmark
The goal of this presentation is to describe an unusual mechanism of a cardiac arrest associated with a relatively low-speed, side-impact collision resulting in an unintended side-impact airbag deployment. This presentation will impact the forensic science community by demonstrating the investigation of a cardiac arrest following an airbag deployment using the principles of counterfactual causation, an important tool for investigating causation in forensic medicine, in which the collective probability of alternative causal explanations is quantified and compared to the probability of a primary causal explanation. This discussion concerns a 2007 single-vehicle traffic crash involving a previously healthy 19-year-old Caucasian male who was the restrained driver of a mid-1990s model SUV traveling on a freeway in the far left lane, nearest the median barrier. For an unknown reason, the vehicle exited the left lane and struck the median barrier at a relatively low speed (<15km/h), resulting in the deployment of the driver's side-impact airbag. An emergency services vehicle coincidentally passed the stopped vehicle within five minutes of the collision, and the driver was observed slumped over the steering wheel. The victim was pulseless by the time he was reached by the Emergency Medical Services (EMS) personnel.
The victim was defibrillated and his heart was restarted, but he was subsequently found to have suffered a permanent anoxic brain injury. The injury was later explained by the presumption that the victim had suffered a commotio cordis as a result of the airbag impact. Commotio cordis is a phenomenon in which the heart arrests following an impact over the cardiac silhouette that coincides with the ascending phase of the T wave, which comprises only about 1%-3% of the total cardiac cycle. The mortality rate of the condition is very high, currently around 65%. The condition is seen most commonly in young male athletes. The kinetic energy required to cause the injury is thought to be approximately 50 joules or more, or approximately the amount of energy in a 2 kg object dropped 2.5 meters. An engineering analysis of the collision event indicated that the airbag deployment was unintentional given the collision circumstances, and that the deployment was likely due to a design defect. In a subsequent legal action, the defending manufacturer asserted, via expert opinion, that the cause of the cardiac arrest was not a commotio cordis because: (1) the chest impact from the airbag was too distal or lateral to the victim's precordium to have impacted the heart; and, (2) the most likely cause of the cardiac arrest was an undiagnosed viral myocarditis that coincided with the airbag deployment. A forensic epidemiologic and biomechanical analysis was undertaken to evaluate the strength of the opposing claims. The first question was whether there was a biomechanically plausible explanation of how the impact to the left side of the victim's chest could have resulted in an impact to the heart. A plausible explanation was found in the fact that the victim had a severe pectus excavatum deformity, which both shifted the heart to the left, and thus closer to the impact, and brought the sternum into closer proximity with the heart. More importantly, the severe concavity of the chest would tend to redirect a left-to-right force applied to the lateral chest into an anterior-to-posterior force at the sternum. A counterfactual approach was used to assess the probability that the airbag impact was the cause of the cardiac arrest. The counterfactual approach to causal assessments examines the "but-for" perspective; that is, but for the hazard, how likely is it that the injury would have occurred at the time that it did? This method is useful when precise quantification of the risk of a hazard is not practical, but it can be determined that the alternative risks are either very large or very small relative to the probable range of the hazard risk. For the subject case, it was impossible to say how likely it is that a commotio cordis would result from a side-impact airbag deployment in an adult male with pectus excavatum, but the alternative hypothesis, that the victim was suffering from an occult case of myocarditis which in turn caused a cardiac arrest that occurred during the same five-minute window as the airbag deployment, could be more precisely quantified. In order to estimate this risk, the annual incidence of myocarditis in males 15-24 years of age was first estimated using U.S. hospital discharge data. Data from the Nationwide Inpatient Sample (NIS) of the Healthcare Cost and Utilization Project of the U.S. Agency for Healthcare Research and Quality were accessed for the years 2006-2008.
The NIS is a unique database of hospital inpatient stays that contains data from approximately eight million hospital stays each year, or around a 20% sample of all hospital discharges in more than 40 states. All cases identified by ICD-9 codes for myocarditis and cardiac arrest were pulled and compared to the U.S. population of the same age and gender (based on U.S. Census Bureau estimates) in order to arrive at annual incidence rates. The results of the analysis were as follows: the annual rate of myocarditis in the male 15-24-years-of-age population was 1 in 29,388, and the annual incidence of cardiac arrest given a diagnosis of myocarditis was 1 in 4,007,750. From the annual rate, the five-minute risk of cardiac arrest associated with myocarditis was calculated to be 1 in 439 billion. During the same years of the NIS, it was also determined that there were no cardiac arrests in patients like the victim, i.e., with no known comorbidities. Thus, the myocarditis-related cardiac arrest theory was the only alternative theory investigated. A rough estimate of the risk of cardiac arrest due to airbag deployment, based on data from the National Highway Traffic Safety Administration Special Crash Investigation reports, indicated that over the prior 25 years, there had been at least 1 cardiac arrest per 800,000 airbag deployments. A comparison between the two risks favors the known airbag trauma as the cause of the cardiac arrest over the hypothesized myocarditis-related cardiac arrest by approximately 550,000 times. This enormous disparity allowed for the conclusion that the airbag trauma was the most probable cause of the cardiac arrest, via the explanatory mechanism of commotio cordis. Keywords: COMMOTIO CORDIS; AIRBAG; COUNTERFACTUAL CAUSATION; FORENSIC BIOMECHANICS; FORENSIC EPIDEMIOLOGY
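The counterfactual comparison above reduces to simple arithmetic on the reported rates. The sketch below is an illustrative recomputation using the figures quoted in the abstract; the conversion steps are assumed and are not taken from the authors' actual analysis.

```python
# Illustrative recomputation of the counterfactual comparison reported above.
# The input figures come from the abstract; the calculation steps are assumed,
# not reproduced from the authors' analysis.

FIVE_MIN_WINDOWS_PER_YEAR = 365.25 * 24 * 12          # ~105,192 five-minute windows

annual_risk_myocarditis_arrest = 1 / 4_007_750        # reported annual rate
five_min_risk = annual_risk_myocarditis_arrest / FIVE_MIN_WINDOWS_PER_YEAR
print(f"five-minute risk ~ 1 in {1 / five_min_risk:,.0f}")
# -> roughly 1 in 4.2e11, the same order of magnitude as the reported 1 in 439 billion

airbag_arrest_risk = 1 / 800_000                       # reported rate per deployment
print(f"risk ratio ~ {airbag_arrest_risk / (1 / 439e9):,.0f}")
# -> using the reported 1-in-439-billion figure, the airbag explanation is favored
#    by roughly 549,000 to 1, consistent with "approximately 550,000 times" in the text

# Side check of the stated kinetic-energy analogy: a 2 kg mass dropped 2.5 m
print(f"energy ~ {2 * 9.81 * 2.5:.0f} J")              # ~49 J, i.e., about 50 joules
```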
Injuries Arising From Glass Drinking Vessels Used in Stabbing and Slashing Attacks
S.V. Hainsworth, PhD, R. Pitchford, R.W. Earp, S.J. Hamilton, G.N. Rutty, MD
University of Leicester, Leicester, United Kingdom
Recent reports in the United Kingdom estimate the annual cost to the National Health Service (NHS) as a result of alcohol harm at £2.7bn. Glassware is used as an impulsive weapon in 4% of violent incidents in the United Kingdom. The injuries that occur can be significant, leading to serious injury and death, and usually fall into categories of either stabbing or slashing. Injuries can also have a component of blunt force trauma, depending on the way in which the weapon is used. In order to better understand the injury potential of different types of glassware and measure the forces involved in glass-related attacks, a study was conducted using English pint glasses, in particular "Nonic" glasses and "Tulip" glasses. Slapping attacks, where a glass is held in the hand and slapped onto the victim, are dynamic attacks, and in order to determine the level of force that can be generated, a novel force plate dynamometer was used to record the peak forces generated by a number of assailants. Additionally, high-speed video was used to record the way in which the glasses fractured and any shards from the glasses penetrated a synthetic skin simulant. Tests were made using both a flat plate and a mannequin's head. A silicone rubber skin simulant was used to allow the damage created by shards to be assessed. Annealed and tempered glassware was tested, and the glass fracture patterns and types of shards that were generated were compared in terms of the damage that was obtained. The average force generated during a slapping attack was found to be ~1,000 N. This is a significant force; therefore, it would be expected that the injuries would be a combination of blunt force injuries and sharp force injuries from the glass fragments that result on impact. The results of the engineering experiments are presented in terms of observed forces and damage patterns and compared to those found in a pathology context in order to gain an improved insight into the way in which injuries arise in assaults using glass as an impulsive weapon. Keywords: GLASS VESSELS; STABBING AND SLASHING ATTACKS; ALCOHOL HARM

Fingernail Biometric Identification
Henry C. Lee, PhD
Forensic Research & Training Center, West Haven, CT, United States
Biometrics or biometric authentication is the technology devoted to identification of individuals using biological traits or characteristics. Biometric characteristics can be either physiological or behavioral. Physiological characteristics include fingerprints, DNA, blood types, iris patterns, retina patterns, hand geometry, palm prints, vein geometry, face recognition, ear shape, and body odor/scent, among others. Behavioral characteristics include signature dynamics, hair, footprint, typing rhythm, gait, and voice. Fingernails as forensic evidence have not been adopted as a biometric trait in any current biometric system. A system of fingernail biometrics for human identification is proposed. Fingernails have excellent properties suitable as factors for biometrics. They are usually classified based on ridge patterns. Vertical nail ridges are seen rather commonly and usually are not signs of serious illness. These ridges generally extend from the base of the fingernail to the tip in an orderly, regular fashion and tend to become more prominent with age. Horizontal nail ridges run from one side of the nail lengthwise across to the other side. Horizontal nail ridges may indicate the presence of an underlying illness or medical condition, although this is not always the case. One special type of horizontal nail ridge that may indicate underlying illness is called Beau's lines. Fingernails are found at crime scenes and on victims, suspects, witnesses, and evidence. Unlike tissue or blood, fingernails do not easily decompose but are easy to store and transport without contamination. Collection of fingernails is painless, harmless, and convenient. Fingernails can also be found on badly decomposed bodies and body parts, especially in catastrophic incidents. Not only mitochondrial DNA but also nuclear DNA has been successfully analyzed from fingernail fragments. Fingernail patterns and physical features were extracted by image processing. Features of size, length, and width of fingernails; DNA profiles; vertical nail ridges; horizontal nail ridges; and special features, inclusions, and color of fingernails were measured and combined as a template representative of the fingernail pattern, which can be used for matching to the template database. The use of fingernails may present a human identification system that can consider multiple modalities integrated into a biometrics system. Keywords: BIOMETRICS; FINGERNAIL; EVIDENCE COLLECTION

Discriminant Function Analysis as Applied to Mandibular Metrics and Morphology to Assess Population Affinity in Asia
Gregory E. Berg, PhD, Jennie R.R. Jin, PhD
JPAC-CIL, Joint Base Pearl Harbor-Hickam, HI, United States
Anthropologists tend not to rely heavily on mandibular metric or morphological data to determine biological or population affinity for several reasons. First, the most widely available comparison groups are usually limited to Western populations, e.g., U.S. Whites and Blacks. Second, when other populations are available for comparison, the sample sizes are relatively small and typically do not have associated morphological data. Third, anthropologists are more familiar with cranial metrics than with traits or metrics from the mandible. Therefore, there is a need to develop reliable formulas for deriving population affinity based on mandibular metric and morphological data. This presentation examines the ability to classify individuals using metric and morphological data taken from the mandible, particularly examining questions of population diversity in Asia. This study concentrates on individuals from multiple populations in Southeast and Northeast Asia, including Cambodians, Vietnamese, Thais, Koreans, and Chinese. Out-groups used for comparison purposes contain U.S. Whites and Blacks. The total sample size is in excess of 1,000 individuals; all individuals are assumed to have late 19th- to 20th-century birth years. Eleven measurements were collected for each mandible in the study, eight standard and three relatively new.1 Six mandibular morphological traits also were used in the analyses. Step-wise Linear Discriminant Function (LDF) analyses using the Mahalanobis D2 distance statistic were undertaken. The LDF analysis examined not only closely related groups (e.g., Vietnamese and Thais) but also considered multiple, more geographically separated groups (e.g., Vietnamese and Koreans). Performance of the models suggests that while both metric and morphological data can separate the groups effectively, a combined approach using both sets of data simultaneously in the same LDF is superior. Comparisons using out-groups also demonstrated excellent accuracy rates. Taken together, this research shows that mandibular metrics and morphology can be used to classify unknown human remains when the biological affinity of the specimen is unknown, and these data are a valuable tool for the forensic anthropologist. Reference: 1. Berg, Gregory E. (2008). Biological Affinity and Sex Determination Using Morphometric and Morphoscopic Variables from the Human Mandible. Dissertation, University of Tennessee, Knoxville. Keywords: MANDIBULAR METRICS; ANTHROPOLOGY; POPULATION AFFINITY
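To make the statistical approach described above concrete, the sketch below shows a generic linear discriminant classification of mandibular measurements; the file name, column names, and group labels are hypothetical, and the sketch omits the stepwise variable selection and the morphological traits used in the actual study. Under the usual equal-covariance assumption, assigning a case to the group whose centroid has the smallest Mahalanobis D2 distance is equivalent to this kind of linear discriminant classification.

```python
# Minimal sketch (not the authors' code) of classifying individuals into
# population groups from mandibular measurements with linear discriminant
# analysis. The CSV layout, column names, and labels are hypothetical.
import pandas as pd
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per individual, eleven measurement columns
# plus a "group" label such as "Vietnamese", "Thai", "Korean", or "US_White".
data = pd.read_csv("mandible_metrics.csv")
X = data.drop(columns=["group"])
y = data["group"]

lda = LinearDiscriminantAnalysis()            # linear discriminant functions
accuracy = cross_val_score(lda, X, y, cv=5)   # cross-validated classification rate
print(f"mean correct classification: {accuracy.mean():.1%}")

# Group membership for an unknown mandible would then be predicted with
# lda.fit(X, y).predict(unknown_measurements).
```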
Altered Age Estimations in Populations With Primary IGF1-D
Joan Fox, DDS
Phoenix, AZ, United States
Among early-adolescent children who are short in stature, a deficiency in Insulin-like Growth Factor 1 (IGF1-D) has been observed to be an uncommon etiology. Normally, levels of IGF1 fluctuate throughout life, peaking at the onset of puberty and decreasing thereafter, and should coincide with levels of growth hormone. Subnormal linear growth is the primary clinical presentation of IGF1-D when compared with an age-similar population. From a dental perspective, children may be screened for developmental disorders by using dental radiographs and clinical examination of the oral cavity as well as by evaluating growth charts and medical history. It is conceivable that there is an under-diagnosed IGF1-D population that has the pathognomonic delay in osseous growth. In the subset of patients diagnosed with IGF1-D who have also received replacement therapy, it is also possible that the dental and chronological age estimations are skewed because of this variable. A relatively accurate age assessment, with consideration given to the delayed growth and development, can be made with dental age-estimation tools. Demirjian's method of age estimation may not be the most reliable method for this population; however, if the patient is living, it is probably the preferred approach. If the subject is deceased, evaluating the cementum incremental lines in an extracted tooth (if it is an option) may be a more accurate method. From a forensic standpoint, the examiner should take into consideration the display of unerupted, crowded, developed, and permanent as well as retained primary teeth, skeletal development, and symmetry. Keywords: GROWTH FACTOR 1 (IGF1-D); GROWTH HORMONE; INSULIN DEFICIENCY

Application of Stable Isotope Forensics for Determining Geographic Origin of Unknown Human Remains From Asia
Gregory Berg, PhD1, Eric J. Bartelink, PhD2, Lee Suhwan, MA3
1JPAC-CIL, Joint Base Pearl Harbor-Hickam, HI, United States; 2California State University, Chico, Chico, CA, United States; 3Ministry of National Defense Agency for KIA Recovery and Identification, Seoul, South Korea
The application of stable isotope analysis has provided novel approaches for determining the origin of unidentified human remains from forensic and archaeological contexts. Stable isotope values measured in hard tissues (e.g., bone and teeth) can provide a record of an individual's life history and geographic origin. Human bone stable carbon and nitrogen isotopes of collagen and stable carbon isotopes of bioapatite reflect consumption of food resources, which vary between geographic regions due to cultural dietary differences. This study discusses the application of stable isotope forensics for provenancing human bone obtained by the Joint POW/MIA Accounting Command Central Identification Laboratory (JPAC-CIL) and the Ministry of National Defense Agency for KIA (Killed in Action) Recovery and Identification (MAKRI) Central Identification Laboratory in Seoul, South Korea. Stable isotope analysis was used to determine whether the remains of U.S. American military members could be distinguished from those of Northeast Asians, as well as from Southeast Asian reference data previously obtained. More specifically, isotope values consistent with consumption of a strictly C3-based diet (e.g., rice) were considered more likely to be associated with individuals from Asia, whereas values consistent with greater consumption of a mixed C3/C4 diet (e.g., corn, sugar) were considered more likely to be from North America. The study included more than 50 human bone samples recovered from various sites on the Korean Peninsula as well as from Southeast Asian and Pacific contexts. Samples were selected from known battles or incidents (i.e., while the exact identity of each individual may not be known, the origin of the individual is known, either U.S. American or Asian). Mitochondrial DNA with haplogroup assignments from MAKRI-contracted laboratories or from the Armed Forces DNA Identification Laboratory was available for most samples. The stable carbon isotope values from collagen extracts formed a bimodal distribution and indicated diets consisting of varying amounts of C3 and C4 resources.
When compared by geographic reference data, there was minimal overlap in collagen carbon isotope values between U.S. Americans and individuals from Asia. As expected, U.S. American stable carbon isotope values were significantly elevated relative to Asians, reflecting greater contribution of C4 resources in the diet. An earlier study by Bartelink et al. (2014) demonstrated that linear discriminant function analysis could correctly classify over 95% of all samples based on stable carbon isotope values. Preliminary data from this study suggests classification rates at or near the same levels, even when including Northeast and Southeast Asians in the dataset. Keywords: STABLE ISOTOPE FORENSICS; HUMAN IDENTIFICATION; MITOCHONDRIAL DNA Reliability and Confidence of Fingerprint Features Selection 1 Shiquan Liu, MS1, Luo Yaping,1 Glenn M. Langenburg, MS, PhD2, People’s Public Security University of China, Beijing, China; 2Minnesota State Bureau of Criminal Apprehension, Saint Paul, MN, United States The fingerprint identification process begins with an analysis of the latent prints. It requires conducting a full analysis of the quantity and quality of both the impression and the minutiae. Some groundwork has already shown how to conduct a complete analysis of latent prints in the United States. However, this experiment explored how to document the reliability and confidence with a minutiae selection among Chinese fingerprint examiners in the Analysis, Comparison, Evaluation-Verification (ACE-V) process. The Green-Yellow-Red-Orange (GYRO) system developed by Dr. Glenn Langenburg was used to annotate the minutiae during the analysis phase of the ACE-V process, but this study changed to the Green-Blue-Red (GBR) system. Green features represent the examiner’s high confidence of existence, Blue features represent medium confidence, and Red features represent low confidence. This is the first time a national experiment has been conducted with fingerprint examiners around China. Using the GBR system, the examiner’s confidence with the features was recorded. Four fingerprint trials (two from Langenburg and two from the crime lab of the People’s Public Security University of China) were selected for the experiment; the data will provide a large-range analysis of minutiae selection for approximately 200 fingerprint examiners from different provinces in China. There are four aspects that will be discussed from the results calculated by the statistics: 1. Selection of relevant level-two features; results from the “confidence” of the latent fingerprints. Based on Cedric’s model, a new model will be set up to qualify the weight of minutiae configuration of different confidences. 2. Accuracy of minutiae selection and error rates of the final conclusion. Attempt to discover what types of minutiae resulted in most of the experts making errors, then describe the minutiae and provide recommendations. 3. Present intra-examiner and inter-examiner statistics results. 4. Results will be compared to the United States’ and Holland’s results to determine the difference. This research will have an impact on latent print operations throughout China and other fingerprint communities by increasing the understanding of fingerprint features selection and increasing the examiners’ awareness for potential errors generated in low-quality areas or other factors. 
Reducing the errors of fingerprint features selection will decrease the chance of wrongful comparison, reduce the risk of erroneous identifications, and, most importantly, maintain the public's confidence in the police and forensic science in China. Keywords: FINGERPRINT IDENTIFICATION; GYRO; LATENT PRINT
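Inter-examiner results of the kind mentioned above are commonly summarized with agreement statistics. The following sketch is a hypothetical illustration, not the study's analysis: it computes raw agreement and Cohen's kappa for two examiners' GBR confidence labels on the same set of minutiae.

```python
# Hypothetical example (not the study's data): agreement between two examiners'
# GBR confidence labels (G = high, B = medium, R = low) for the same ten
# minutiae, summarized with raw agreement and chance-corrected Cohen's kappa.
from sklearn.metrics import cohen_kappa_score

examiner_a = ["G", "G", "B", "R", "G", "B", "G", "R", "B", "G"]
examiner_b = ["G", "B", "B", "R", "G", "B", "G", "B", "B", "G"]

raw_agreement = sum(a == b for a, b in zip(examiner_a, examiner_b)) / len(examiner_a)
kappa = cohen_kappa_score(examiner_a, examiner_b)

print(f"raw agreement: {raw_agreement:.0%}")   # 80% for this made-up example
print(f"Cohen's kappa: {kappa:.2f}")
```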
It is believed that increased education on the significance of bitemarks, not only among the police and other agencies concerned with children's and women's welfare but also among the general population, will bring more and more cases to light, providing the judicial system with additional evidence. Understanding the value of education in bitemark analysis, the International Law Enforcement Academy (ILEA) in Budapest, which is run by the FBI, has asked for these topics to be incorporated into the training of its international police recruits and agents. Thus, the knowledge has been transmitted to Bulgaria, Kosovo, Romania, Macedonia, Montenegro, and other countries in Europe. The Semmelweis Medical University, Department of Forensic Medicine, in Budapest, which has integrated bitemark analysis, age estimation, and disaster victim identification into its curriculum, allows newly trained dentists to bring their knowledge and learning back to their home countries, including Germany, Norway, Austria, Sweden, the United Kingdom, Greece, Cyprus, Iran, Israel, and other countries in the Middle East. In the dental forensic community, it is well known that some countries are more advanced than others in utilizing forensic dentistry in crime investigation and/or disaster victim identification, while others struggle to achieve a united vision and understanding among the different governmental agencies involved. Therefore, it becomes the moral and ethical responsibility, and the privilege, of the more advanced forensic communities to engage in a collaborative effort to integrate this science into police investigative work in less-developed territories. This presentation will show that these remarkable, unforeseen accomplishments were made possible by continued collaboration, exchange of knowledge and ideas, accompaniment, and capacity building, together with the support and encouragement received from mentors and colleagues working in the field in the United States. In facing common challenges and in service to humanity, the concept of accompaniment and assistance in developing capacities is vital in charting an individual path for progress. Great accomplishments in public service are achieved through unity of vision and action plus mutual and ongoing collaboration and support.
Keywords: BITEMARK ANALYSIS; FORENSIC DENTISTRY; INTERNATIONAL COLLABORATION

The Impact of Modified Extraction Methods on the Recovery of DNA From Skeletonized Remains Returned From the DPRK – Is There Regional Variability?
S.M. Edson, MS1 and S.R. Ah Sam, MS2
1Armed Forces DNA Identification Laboratory, 115 Purple Heart Drive, DAFB, DE 19902, United States; 2Joint POW/MIA Accounting Command – Central Identification Laboratory, 310 Worchester Ave., Joint Base Pearl Harbor – Hickam, HI 96853, United States

The Armed Forces DNA Identification Laboratory (AFDIL) and the Joint POW/MIA Accounting Command – Central Identification Laboratory (JPAC-CIL) work toward the identification of missing U.S. service members from past military conflicts. To that end, both laboratories continually refine their protocols to improve both the speed and the efficiency of processing. Since the inception of AFDIL in 1992, the protocol for extracting total genomic DNA from skeletal samples has undergone two major modifications.
The first, released to casework in 2007, was a change to a new extraction buffer that provided complete demineralization of the osseous material and a ten-fold reduction in the amount of sample required for processing (2.0g was reduced to 0.2g). The second, released to casework in 2013, removed the phenol-chloroform (PCIA) purification and incorporated a silica purification column. This second modification, known in-house as 'demin2', has shown a positive difference in success for some skeletal elements when testing for mitochondrial DNA (mtDNA), an ~14% increase in success with a low-copy-number Y-STR analysis protocol (LCN-Y), and an ~5% increase in success for autosomal testing. Of note was a Y-STR artifact observed only in remains recovered from one region in North Korea. While this was only a single incident, it encouraged the scientists to examine whether skeletal elements recovered from different localities would show locality-specific success rates. To examine this possibility, only skeletal elements from the Korean War were examined. Between 1990 and 1994, 208 boxes of remains were returned to the United States by the Democratic People's Republic of Korea (North Korea). Each box was attributed to a specific village or region in North Korea and purportedly comprised a single individual. Through anthropological and mtDNA analysis, it has been determined that most of the boxes contain commingled remains and that significantly more than 208 individuals are represented in the assemblage. To further complicate matters, shared mtDNA sequences have been observed between the K208 boxes and remains recovered during US/DPRK Joint Recovery Operations (JRO). mtDNA, LCN-Y, and autosomal DNA testing success rates for each skeletal element were examined for each regional assemblage of remains. Comparisons were made between the original extraction method, the first modification involving complete demineralization ('demin1'), and the most recent extraction modification ('demin2'). The goals are to: 1) provide a clearer picture of the regional taphonomic effects on skeletal remains as they relate to DNA analysis, and 2) demonstrate how advances in DNA technology have aided in sorting a highly commingled skeletal assemblage and led to the identification of U.S. service members. Further studies will involve similar analyses on remains recovered from regions associated with Southeast Asia and World War II conflicts. Disclaimer: The opinions or assertions presented are the private views of the authors and should not be construed as official or as reflecting the views of the Department of Defense, its branches, the U.S. Army Medical Research and Materiel Command, the Armed Forces Medical Examiner System, or the Joint POW/MIA Accounting Command – Central Identification Laboratory.
Keywords: DNA; AFDIL; JPAC-CIL

The Effect of Soft Tissue on Exposure Temperature Prediction From Burnt Bone
Sarah Ellingham, MSc, Tim Thompson, PhD, Meez Islam, PhD, Gillian Taylor
Teesside University, Middlesbrough, United Kingdom

Determining the maximum exposure temperature from burnt skeletal tissue can be of crucial importance for the reconstruction of incineration conditions in forensic casework. Traditionally used indicators, such as color change, microstructural alterations, or crystallinity measurements alone, are merely qualitative, are subject to fluctuations caused by a variety of factors other than heat exposure, and are therefore inherently unreliable.
The goal of this study was to develop a statistically robust and reliably reproducible quantitative method of burn temperature prediction that takes into account the effect of soft tissue presence as well as different exposure times. Research was carried out using sheep (Ovis aries) rib bones in two experimental groups, defleshed "green" bones and bones with soft tissue present, which were burnt at temperatures between 100°C and 1,100°C in 100°C increments for 15, 45, and 90 minutes. The external bone surface was subsequently removed, ground to powder, and analyzed through Attenuated Total Reflectance Fourier-Transform Infrared Spectroscopy (ATR-FTIR) over an optical range of 400 cm-1 to 2,000 cm-1. From the resulting spectra, eight absorption peak ratios, as described in Thompson et al., were calculated, and linear regression analysis was performed on these ratios to create a formula for predicting the burn temperature.1 Results indicated that, from defleshed sample spectra, burn temperatures can be predicted with a standard error of ±70°C. Variation in the exposure time does not make a significant difference to the prediction accuracy. The presence of soft tissue, however, has a significant influence on heat-induced changes of the bone matrix at low (<300°C) as well as high (>800°C) temperatures, shielding the bone matrix and slowing combustion in the former and acting as fuel and accelerating combustion in the latter (p<0.05). At medium temperatures, no significant difference between bones burnt with and without soft tissue was noted. This research allows an accurate determination of exposure temperature from defleshed bones and a well-founded estimation of the temperature range for bones burnt with soft tissue present, marking a significant advance in the understanding of burnt bone. The clear impact of the presence of soft tissue, and the lack of influence of duration, provides forensic investigators with a new perspective from which to interpret FTIR measurements derived from burnt bone.
Reference: 1. Thompson TJU, Islam M, Bonniere M. A new statistical approach for determining the crystallinity of heat-altered bone mineral from FTIR spectra. Journal of Archaeological Science. 2013;40(1):416-22.
Keywords: BURNT BONE; EXPOSURE TEMPERATURE; INCINERATION

Assessing DNA Quality, Quantity, and Inhibition Using a Highly Sensitive Multiplex Quantification System for Forensic Samples
Stephen Lee, PhD1, Jesse Ramirez, BS1, Ryan Yee, BS1, Zach Goecker, BS1, Gina Pineda, MS2, Anne Montgomery, MS2, Robyn Thompson, MS2, Sudhir Sinh, PhD2
1San Jose State University, San Jose, CA, United States; 2InnoGenomics Technologies, New Orleans, LA, United States

Real-time Polymerase Chain Reaction (PCR) quantification of human DNA provides an important estimate of amplifiable DNA in a biological sample. Current methods in forensic laboratories include SYBR® Green, Plexor® HY, and Quantifiler® Duo assays. Recent advances in mini Short Tandem Repeat (STR) analysis systems have made it possible to analyze highly compromised samples. A quantification system that estimates the level of degradation in a forensic sample is therefore a useful tool for DNA analysts. Systems providing a quality assessment for degraded DNA samples have already been reported in the literature: one uses a Ya5-lineage Alu genetic element, and a second employs single-copy targets in a multiplex assessing nuclear and Y-chromosome targets ranging from 67-190bp.
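In general terms, such degradation-aware assays compare recovery from a longer and a shorter target; the sketch below (hypothetical quantities and a simple ratio of my own devising, not the specific assay described in this abstract) illustrates the idea:

def quality_ratio(long_target_ng_ul: float, short_target_ng_ul: float) -> float:
    """Illustrative DNA quality ratio: quantity recovered from a long qPCR target
    divided by the quantity from a short target. Intact DNA gives a ratio near 1;
    degradation depresses recovery of the long target and drives the ratio toward 0."""
    if short_target_ng_ul <= 0:
        raise ValueError("no amplifiable DNA detected for the short target")
    return long_target_ng_ul / short_target_ng_ul

# Hypothetical quantification values (ng/uL) for an intact and a degraded sample.
print(round(quality_ratio(0.95, 1.00), 2))  # ~0.95 -> little degradation
print(round(quality_ratio(0.05, 0.40), 2))  # ~0.12 -> substantial degradation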
The advantage of an Alu system is the presence of a large number of fixed insertions. Published reports state that only 20% of the Yb-lineage Alu elements are polymorphic for insertion presence or absence in the human genome. A large number of these fixed elements are present in every human genome, enhancing sensitivity and minimizing the individual-specific variation possible when using a multi-copy target quantification system. Alus are Short Interspersed Elements (SINEs), ~300bp insertions distributed in large copy number (>1,000 copies/cell). The use of internal primers to amplify a segment of an Alu element allows for higher primate specificity and higher sensitivity than a single-copy target. The new qPCR system utilizes two independent genomic targets: an 80bp "short" target sequence and a 207bp "long" target sequence. Primers and probes are selected so that they are completely independent. The ratio of the long target to the short target provides a useful assessment of DNA quality. This Degradation Index (DI) has applications in predicting the profiling success of forensic samples. Use of a synthetic target as an Internal Positive Control (IPC) provides an additional assessment of the presence of PCR inhibitors in the test sample. Initial inter-laboratory testing indicates the PCR efficiency for both long and short targets is consistently above 90%. The amount of synthetic IPC target was adjusted to provide reproducible Ct values between 18 and 22 cycles for samples with no inhibition. Precision and sensitivity studies indicated this system has a sensitivity threshold of 1pg, similar to those reported for other Alu-based quantification systems and comparable to other commercially available systems. Studies comparing this system with other commercially available quantitation systems show concordance of quantitation values between systems. Furthermore, the preliminary inter-laboratory results demonstrate the predictive value of the DI on degraded DNA and of the IPC results on humic acid (inhibitor) spiked samples. In conclusion, a DNA-based qualitative/quantitative/inhibition assessment system is a valuable tool for deciding which DNA test kit to utilize when processing forensically compromised samples.
Keywords: DNA; ALUS; DNA ANALYSIS

A Reality Show: You Are Going to be an Expert Witness
Haskell M. Pitluck, JD
Crystal Lake, IL, United States

If you are a forensic scientist and work on cases in the legal justice system, the odds are high that at some point you are going to have to testify before a body that will make a decision based, at least in part, on your work. It will not be like the television episodes and movies. It will be reality. Your work should be accurate and correct, but it is equally important that you know how to conduct yourself before the adjudicative body so you can effectively impart your work and testimony in a manner understandable to all concerned, a very small and special audience. You may be the best forensic scientist in the world, but if you are not able to explain your work, your conclusions, and your opinions, you will not be an effective witness. Unlike the quick resolution of forensic matters portrayed in the media, in real cases, if the parties cannot attack the science, they attack the scientist. This presentation should assist in preparing you to be the best witness you can possibly be. The key is preparation; there is no substitute for good preparation.
Topics to be covered will include preparation of your CV; dealings with attorneys, other experts, judges, and the jury; some do's and don'ts; and even some tips on ethics, as well as other suggestions to help you become a better witness. The goal is to be the most prepared, honest, and effective witness possible.
Keywords: LEGAL JUSTICE SYSTEM; JUDICIAL ETHICS; EXPERT WITNESS

A Little Lesson in Logic (논리의 작은 교훈)
Thomas W. Young, MD
Heartland Forensic Pathology, LLC, Kansas City, MO, United States

After attending this presentation, attendees will recognize logically sound and unsound ways to reason from evidence. This presentation will impact the international forensic community by instructing forensic scientists in how to infer in a way that is logical, truthful, and reliable, particularly when offering sworn testimony in a courtroom. Scientists have made great progress in recent years in developing technology useful for court cases; however, there is little mention in the forensic science literature of how to draw truthful conclusions from such scientific evidence. Consider this "little lesson in logic":
The Inferential Test (추론 테스트): One can be reasonably certain whether witness accounts of the past are or are not consistent with physical evidence in the present, but one cannot reliably surmise past events from physical evidence unless there is only one plausible explanation for that evidence. (과거에 발생한 일에 대한 목격자의 진술과 현재 존재하는 물리적 증거가 일치하거나 일치하지 않는 경우에는 상당히 확신을 가질 수 있지만, 해당 증거에 대한 타당한 설명이 한 가지 밖에 없는 경우가 아니라면, 물리적 증거를 통해서 과거에 발생한 일에 대해 확실히 추정할 수는 없다.)
This statement has been translated into logical operator notation and demonstrated to be a tautology, a statement that is necessarily true, using truth tables and proofs.1 Consider the following five statements, all given the scientific fact that 1 + 1 = 2:
1. (1 + 1) → 2: If one was added to one, then the sum is two.
2. ~(2) → ~(1 + 1): If the sum is not two, then one was not added to one.
3. 2 → (1 + 1): If the sum is two, then one was added to one.
4. 2 → (1 + _): If the sum is two, then one was added to __.
5. ~(1 + 1) → ~(2): If one was not added to one, then the sum is not two.
These are conditional "if…, then…" statements. The item to the left of the conditional arrow, the antecedent, can represent what happened in the past, and the item to the right of the arrow, the consequent, can represent the physical evidence discovered in the present. The first two statements are simplified versions of modus ponens (the way of affirming) and modus tollens (the way of denying), two famous valid argument forms. The first two statements are true. They represent comparing a witness account of the past to physical evidence in the present for consistency ("What he said could happen.") or inconsistency ("What he said could not happen."). Such comparisons are valid grounds for certainty. The third statement, in the invalid argument form of affirming the consequent, is not correct, because countless numbers other than 1 and 1 can be added together to yield a sum of 2. Truthfully, the statement should read: 2 → (? + ?). Surmising past events from physical evidence cannot be done reliably. Such complex inferences are unlikely to be true unless there is only one plausible explanation for the evidence (Statement 4).1 The fifth statement, in the invalid argument form of denying the antecedent, is also not correct, because not adding 1 to 1 can also lead to a sum of 2.
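These argument forms can be checked mechanically. The following minimal sketch (Python; the encoding is an illustration added here, not the author's published truth-table proof) enumerates all truth assignments and confirms that the two valid forms admit no counterexamples, while affirming the consequent and denying the antecedent do:

from itertools import product

def implies(p: bool, q: bool) -> bool:
    """Material conditional: p -> q is false only when p is true and q is false."""
    return (not p) or q

def counterexamples(premises, conclusion):
    """Return truth assignments where all premises hold but the conclusion fails."""
    bad = []
    for p, q in product([True, False], repeat=2):
        if all(prem(p, q) for prem in premises) and not conclusion(p, q):
            bad.append((p, q))
    return bad

# Valid forms: no counterexamples.
print(counterexamples([lambda p, q: implies(p, q), lambda p, q: p], lambda p, q: q))          # modus ponens -> []
print(counterexamples([lambda p, q: implies(p, q), lambda p, q: not q], lambda p, q: not p))  # modus tollens -> []

# Invalid forms: counterexamples exist.
print(counterexamples([lambda p, q: implies(p, q), lambda p, q: q], lambda p, q: p))          # affirming the consequent
print(counterexamples([lambda p, q: implies(p, q), lambda p, q: not p], lambda p, q: not q))  # denying the antecedent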
Truthfully, the fifth statement should read: ~(1 + 1) → ?. If witnesses are presumed to offer false statements, then the truth of what happened cannot be known. The illustrations above can also apply to forensic casework. The famous Australian case of Lindy Chamberlain is an important example of what can happen when inferences from scientific evidence are not sound.
Reference: 1. Young TW. The Inferential Test is Always True. Think of it as a Law. http://www.heartlandforensic.com/writing/the-inferential-test-is-always-true-think-of-it-as-a-law.
Keywords: EVIDENCE REASONING; INFERENTIAL TEST; LOGIC

END SESSION I

BEGIN SESSION II

GLOBAL PERSPECTIVES ON CONTEMPORANEOUS TESTAMENTARY CAPACITY EVALUATIONS
Daniel A. Martell, PhD
Park Dietz & Associates, Newport Beach, CA, United States

Contested wills, trusts, and estate plans are extremely expensive to litigate and can significantly erode the corpus of the estate while undermining the wishes of the testator. This is particularly true when late changes are made to an existing plan. Planning for the possibility of a will contest is both prudent and cost-effective. International standards and practices in this area of mental health law will be presented. A careful and well-documented assessment of the testator's capacity, prepared at the time that a will is executed or any significant changes are made, can document and preserve evidence of the testator's competency and freedom from undue influence. This simple step can prevent years of litigation and unnecessary delays in executing the client's desires. The evaluations include both psychiatric examination and neurocognitive testing. Evaluations are generally conducted as close in time as possible to the date that a will or trust will be signed and can usually be completed in one day. The process includes a careful review of the medicolegal record; objective psychodiagnostic and neuropsychological testing; a meticulous forensic psychiatric examination of the testator(s); and interviews of significant others, as needed. A customized neuropsychological test battery is administered, tailored specifically to those cognitive functions most relevant to testamentary capacity, including attention, concentration, memory, and executive functioning. Psychodiagnostic testing is used to evaluate psychiatric symptoms. The forensic psychiatric examination includes taking a complete history, a mental status examination, careful documentation of the testator's competency, and assessment of the factors that increase susceptibility to, or protect against, undue influence. The evaluation can be digitally recorded, if desired. A comprehensive report is then prepared to be filed with the estate plan.
Keywords: NEUROPSYCHOLOGICAL TEST; MENTAL HEALTH LAW; PSYCHIATRIC EXAMINATION

Forensic Linguistics: Language as Clue and Evidence
Carole E. Chaski, PhD
ALIAS Technology, LLC, Institute for Linguistic Evidence, Georgetown, DE, United States

This presentation explains the use of linguistics as a forensic science by describing cases in which language is a clue and/or evidence at trial, and by showing how standard analytical methods from linguistics can produce reliable and admissible forensic evidence. There are some criminal and civil cases in which language plays a crucial role as an investigative clue, such as kidnapping, defamation, suicide/murder, and stalking/threatening. These cases exemplify the "four corners of forensic linguistics" (Chaski 2013): linguistic profiling, identification, text-typing, and intertextuality.
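One of these, intertextuality, is taken up in more detail at the end of this abstract. As a purely illustrative sketch of how overlap between two documents can be quantified (shared word trigrams scored with a Jaccard index, my own illustration rather than Chaski's method):

def ngrams(text: str, n: int = 3) -> set:
    """Set of word n-grams (here trigrams) in a lowercased text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(doc_a: str, doc_b: str, n: int = 3) -> float:
    """Jaccard overlap of word n-grams between two documents (0 = disjoint, 1 = identical)."""
    a, b = ngrams(doc_a, n), ngrams(doc_b, n)
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical snippets from two employee manuals.
manual_1 = "employees must report safety incidents to a supervisor within one business day"
manual_2 = "employees must report safety incidents to a supervisor before the end of the shift"
print(round(overlap(manual_1, manual_2), 3))

In practice, such an overlap score would be judged against the baseline expected for independently written documents of the same kind, as discussed later in the abstract.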
In the kidnapping of the Lindbergh baby, the ransom note was important in determining that its author was a non-native speaker of English and a native speaker of German. The language of the note provided an investigative clue. This clue was developed by applying a standard method in linguistics to a forensic issue. The standard method is contrastive analysis, a tool in linguistics that demonstrates differences in structure between languages. These differences between English and German can then be applied to the forensic issue of authorship: whoever wrote the note showed native German-speaking tendencies in writing English. Contrastive analysis is a reliable method in linguistics based on the theory of linguistic structure; when it is applied forensically, it produces reliable results as linguistic profiling. Defamatory statements posted anonymously or pseudonymously on the internet create an actionable event, and such cases are becoming more frequent. The investigative question is who authored the defamatory statements. Given a pool of suspects, it is possible to determine authorship with a high degree of accuracy based on syntactic structures. The linguistic method of syntactic analysis can be combined with a proprietary categorization mechanism so that documents from one author can be reliably differentiated from those of another, and a questioned document can be assigned to a known author. This authorship method has been used in more than 25 cases and admitted into trial testimony in both federal and state courts in the United States under both Frye and Daubert standards. In suicide investigations, two questions can arise about a suicide note: (1) Is this a real suicide note? and (2) Did the decedent author the note? The authorship issue can be resolved using the method previously described, while the authenticity of the suicide note requires a different method. Based on text classification methods in computational linguistics, a statistical procedure can determine with a strong degree of accuracy whether the note is a real suicide note or some other type of document. Text authentication or text-typing procedures have also been developed for threatening communications, predatory chats, and deceptive language. In claims of ownership and contractual torts, an issue that often arises is whether or not two documents are truly independent of each other. If, for example, employee manuals or patents are truly independent, they will overlap to a baseline level, because they include similar information, but they will not overlap much above that baseline. When texts are "too close," because their overlaps are far higher than the baseline expected in an industry, there is a linguistic basis for asserting that the employee manuals or patents are not independent and constitute a violation of a non-compete agreement or patent infringement. Intertextuality can be measured, and the similarity can be used to demonstrate dependence and relationship between texts.
Keywords: FORENSIC LINGUISTICS; LINGUISTIC PROFILING; SYNTACTIC STRUCTURES

Diminishing the Death Penalty
John L. Young, MD
New Haven, CT, United States

On May 27, 2014, the U.S. Supreme Court announced its decision in the case of Hall v. Florida.1 It reversed that state's highest court's decision upholding the death penalty for Mr. Freddie Lee Hall, a cognitively impaired defendant convicted of murder.
In essence, the Florida Court must now address its narrow use of IQ scores to determine the presence or absence of cognitive impairment for sentencing in death penalty cases. Writing for the 5-4 majority, Justice Anthony Kennedy succinctly (and most quotably) pointed out that "intellectual disability is a condition, not a number." For its part, the dissent, written by Justice Samuel Alito, warned against "instability" and "protracted litigation," by no means an inapt concern given that Mr. Hall's crime took place in 1978. This decision is unusual for the extent to which it favorably quotes from professional societies' amicus curiae briefs. According to some commentators, it may also be seen as following the general direction of evolving public opinion.2 The decision is likely to have an immediate impact on several other states. It will also be interesting to continue to follow Mr. Hall's own case. We should, in addition, look for global opinions responding to the decision. These may in turn affect the thinking of the American public, including its reactions to such developments as the ongoing difficulties of procuring chemicals for execution by lethal injection. American states are gradually abolishing the death penalty, and none has recently reestablished it. Perhaps a welcome maturing trend is at work here. But what, then, is to be made of developments in such areas as the control of firearms? As citizen professionals sharing a shrinking globe, we are subject to increasing scrutiny of ourselves and of one another. As responsible forensic scientists, we share the burden of managing our collective reputation in the public arena. The Hall case can well serve to focus our responses to these challenges.
References: 1. Hall v. Florida, No. 12-10882, citations pending. 2. Liptak A. Justices reject a rigid IQ rule for executions. New York Times. May 28, 2014:A1.
Keywords: DEATH PENALTY; COGNITIVE IMPAIRMENT; HALL V. FLORIDA

Detecting Malingering with the Autobiographical Implicit Association Test
Laura Muscatello1, Annabella Alice Pozzoli2
1Reggio Emilia, Italy; 2Legnano, Italy

A new method that can be used to identify a true autobiographical memory (the intentions and reasons that motivate an act) is the autobiographical Implicit Association Test (aIAT).1 It is a variant of the Implicit Association Test (IAT), which is used to establish whether an autobiographical memory trace is encoded in the respondent's mind/brain.2 With the aIAT, it is possible to evaluate which of two autobiographical events is true. The method consists of a computerized categorization task. The aIAT includes stimuli belonging to four categories. Two of them are logical categories, represented by sentences that are always true (e.g., "I am in front of a computer.") or always false (e.g., "I am climbing a mountain.") for the respondent. The other two categories are represented by alternative versions of an autobiographical event (e.g., "I went to London for Christmas." or "I went to Seattle for Christmas."), only one of which is true. The true autobiographical event is identified because, in a combined block, it gives rise to faster Reaction Times (RTs) when it shares the same motor response with the true sentences. Validation experiments have documented very high classification accuracy over a wide range of tests, with average accuracy rates exceeding 90%. The aIAT has been validated in both forensic and clinical settings.
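As a schematic illustration of that decision logic (hypothetical reaction times and a deliberately simplified rule, not the validated aIAT scoring procedure):

from statistics import mean

# Hypothetical reaction times (ms) from the two combined blocks.
# Block A pairs true sentences with autobiographical event 1 ("London");
# Block B pairs true sentences with autobiographical event 2 ("Seattle").
rt_block_a = [612, 598, 640, 587, 605, 622]
rt_block_b = [781, 744, 802, 766, 790, 755]

def likely_true_event(rt_a, rt_b, label_a, label_b):
    """The event paired with true sentences in the faster combined block is
    taken to be the true autobiographical memory (simplified decision rule)."""
    return label_a if mean(rt_a) < mean(rt_b) else label_b

print(likely_true_event(rt_block_a, rt_block_b,
                        "went to London for Christmas",
                        "went to Seattle for Christmas"))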
This presentation reviews the aIAT's main applications (including malingered whiplash syndrome, malingered depression, etc.) and features one of the most widely publicized Italian cases in which it was used: the Case of Como.
References: 1. Sartori, Agosta, Zogmaister, Ferrara, & Castiello, 2008. 2. Greenwald, McGhee, & Schwartz, 1998.
Keywords: AUTOBIOGRAPHICAL IMPLICIT ASSOCIATION TEST (aIAT); IMPLICIT ASSOCIATION TEST (IAT); THE CASE OF COMO

Technologic Advances in Chemical Identification Standards
Victor W. Weedn, MD, JD
George Washington University, Washington, DC, United States

The Scientific Working Group for the Analysis of Seized Drugs (SWGDRUG) has divided chemical analytical techniques into three categories based upon their discrimination potential. However, advances in these technologies have rendered this schema overly simplistic. Gas Chromatography–Mass Spectrometry (GC/MS) has been a mainstay of chemical identification in forensic science laboratories since the 1970s. GC/MS has been useful for drug chemistry, toxicology, and trace evidence and was considered a definitive chemical identification technique. However, traditional GC/MS produces only nominal mass values, and different compounds may have the same nominal mass. Current MS instrumentation can produce accurate and exact masses, which can differentiate many compounds with the same nominal mass, but there are still compounds with the same exact mass that must be discriminated by separation techniques, fragmentation, or alternate interrogation. Stereoisomers are particularly problematic. Time-of-flight mass spectrometry, capillary electrophoresis, new chromatographic columns and nano-Liquid Chromatography (LC), supercritical fluid extraction, Raman spectroscopy, and benchtop Nuclear Magnetic Resonance (NMR) analyzers are among the newer techniques that have yet to achieve widespread use in forensic science laboratories. Meanwhile, forensic science laboratories seem to cling to older technologies that are particularly problematic. The relationship of these newer technologies to the SWGDRUG criteria will be discussed.
Keywords: CHEMICAL IDENTIFICATION; SWGDRUG; GC/MS

Trends in Licit and Illicit Drug-Related Deaths in Florida from 2001 to 2012
Dayong Lee, PhD1, Chris Delcher, MS2, Mildred M. Maldonado-Molina, PhD3, Lindsay A. Bazydlo, PhD4, Bruce A. Goldberger, PhD5
1-5University of Florida College of Medicine, Gainesville, FL, United States

Background: Drug-induced mortality has steadily increased over the past two decades in the United States. Drug mortality surveillance systems are valuable for monitoring drug use patterns over time and for evaluating the impact of drug control policy. Florida, the fourth most populous state and the epicenter of the recent prescription drug epidemic in the U.S., maintains a timely drug mortality surveillance system. In this presentation, attendees will be informed about the mortality rates for licit and illicit drugs in Florida from 2001 to 2012. This study examined yearly patterns, demographic characteristics, and statistical correlations between drug trends. The ratio of drug-caused deaths to drug-present deaths was evaluated to study the relative lethality of specific substances.
Methods: All drug-related deaths reported by Florida medical examiners to the Medical Examiners Commission (MEC) through toxicology reports from 2001 to 2012 were included. A death was considered "drug-related" if at least one drug was identified in the decedent.
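As a simple illustration of the two quantities described above, a crude rate per 100,000 and a caused-to-present ratio might be computed as follows (the counts and population are hypothetical, not Florida data):

def rate_per_100k(deaths: int, population: int) -> float:
    """Crude mortality rate per 100,000 population."""
    return deaths / population * 100_000

def lethality_ratio(caused: int, present: int) -> float:
    """Ratio of deaths a drug caused to deaths in which it was merely present;
    used here as a rough index of relative lethality."""
    return caused / present if present else float("inf")

# Hypothetical yearly counts for one drug in a state of ~19 million people.
population = 19_000_000
caused, present = 1_520, 3_040

print(round(rate_per_100k(caused, population), 1))   # 8.0 per 100,000
print(round(lethality_ratio(caused, present), 2))    # 0.5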
A single drug was reported as either a causative factor or a presenting factor (i.e., identified in nonlethal concentrations), but not both. Multiple drugs could be listed as causing a death and/or as present. On the whole, deaths related to the following drugs or drug groups were reported: amphetamines, benzodiazepines, cannabis, carisoprodol/meprobamate, cocaine, ethanol, gamma-hydroxybutyrate, heroin, inhalants, opioids, phencyclidine, and zolpidem.
Results: The rate of drug-caused deaths was 8.0 per 100,000 population in 2001, more than doubling to 17.0 in 2010, then decreasing to 13.9 in 2012. Among drug-caused deaths, 34.7%-59.2% involved more than one drug, and these polydrug deaths increased continuously over the years. Ethanol-caused mortality increased until 2009, then stabilized at 3.0-3.1 per 100,000 for the subsequent four years. Alprazolam and diazepam contributed to the majority of benzodiazepine-caused deaths; however, less than 10% were due solely to alprazolam or diazepam, as methadone, oxycodone, and cocaine were frequently co-involved. Similar to benzodiazepines, opioid-caused mortality rates peaked in 2010 and then declined (-28%) over 2010-2012. The rate of oxycodone-caused deaths increased from 1.9 to 8.0 per 100,000 population over 2001-2010 and decreased to 3.8 in 2012. Annual rates of heroin-caused mortality were negatively correlated with those of opioids and benzodiazepines (ρ ≤ -0.670; p ≤ 0.034, respectively). Further, an increase in heroin lethality (4.8 in 2010 to 12.0 in 2012) coincided with the increase in average heroin purity reported by the U.S. Drug Enforcement Administration. Cocaine-caused death rates decreased from 4.6 to 2.8 per 100,000 population over 2007-2012. Deaths caused by amphetamines, zolpidem, and inhalants were on the rise, although the rates were low (≤0.6 per 100,000).
Conclusions: Declines in benzodiazepine- and opioid-caused deaths in 2011-2012 may have been related to Florida's attempts to regulate inappropriate dispensing of prescription drugs, among other factors. This period, however, was also marked by a rise in heroin-caused mortality, which may reflect growing heroin abuse as an alternative to prescription opioids. Increases in amphetamine-, zolpidem-, and inhalant-induced mortality are an additional public health concern. The data provide important information for understanding the relationships among drug-related deaths, drug-intake patterns, and regulations, which could aid in establishing preventive measures against future drug overdoses.
Keywords: DRUG-RELATED DEATHS; BENZODIAZEPINES; OPIOIDS

Challenges and Opportunities in Forensic Multimedia Evidence Analysis
Zeno J. Geradts, PhD
Netherlands Forensic Institute, Ministry of Justice, Den Haag, SH, Netherlands

The amount of multimedia on the internet is growing exponentially due to the large number of cameras and other sources of video and imaging currently available, creating challenges for digital investigations that must cope with this volume of data. Consequently, new and intelligent methods are needed to filter and find relevant data within big data. Several methods have been developed for indexing multimedia data for faster searching, as well as methods, such as facial extraction, that identify biometric features. Additionally, techniques for camera identification are available for finding images made with the same source camera.
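One widely used family of camera-identification techniques compares the sensor noise residual of a questioned image against a camera's reference noise pattern (PRNU). The sketch below is a heavily simplified illustration of that idea, using a Gaussian filter as the denoiser and a normalized correlation as the similarity score; it is not the presenter's pipeline:

import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(image: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Crude noise residual: the image minus a smoothed version of itself."""
    return image - gaussian_filter(image, sigma)

def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two arrays of equal shape."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

# Hypothetical data: a camera reference pattern (as would be built from flat-field
# images) and a questioned image assumed to carry the same sensor noise.
rng = np.random.default_rng(0)
reference_pattern = rng.normal(0, 1, (256, 256))
questioned = gaussian_filter(rng.normal(128, 20, (256, 256)), 1) + 0.5 * reference_pattern

score = normalized_correlation(noise_residual(questioned), reference_pattern)
print(round(score, 3))  # higher scores suggest the same source camera

In casework, the reference pattern is typically estimated from many images known to come from the camera, and the correlation score is compared against a validated threshold or converted into a likelihood ratio.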
In this presentation, an overview is given of developments in the field, ranging from camera identification to feature extraction from images, as well as searching for features in images and methods for finding hidden features, such as veins and heartbeats, in images. Processes for video restoration of partly erased video files and of fragmented parts of a video are likewise a challenge. Detection of manipulation of videos and images is also a field for which guidelines have been developed. Forensic validation of these methods, as well as the determination of likelihood ratios and prior incidents, remains a challenge. In this presentation, several examples of determining likelihood ratios are presented, together with methods for the validation of results.
Keywords: MULTIMEDIA; DIGITAL INVESTIGATION; FACIAL EXTRACTION

The Application of Specialized Photography for the Enhancement of Forensic Evidence
Michael E. Gorn, MS
Sarasota County Sheriff's Office, Sarasota, FL, United States

Using photography for the enhancement of forensic evidence should be the first step taken in the examination process. Photographic processes are typically nondestructive, whereas chemical enhancement of patterns and impressions can introduce artifacts into the photograph, degrade the detail needed for subsequent comparisons, and/or affect further DNA testing through the introduction of a liquid reagent. This presentation will focus on specialized photographic techniques and how they can be used to develop contrast between an area of interest and the underlying substrate. The focus will be on infrared (IR) and ultraviolet (UV) techniques, although polarized-light photography and co-axial lighting will also be discussed. IR/UV photography has been around for many years; however, it does not appear to have reached widespread use in forensic analysis. One reason may be that conventional film-based IR/UV photography was challenging, whereas the advent of digital technology has made the process far simpler, so that it can now be used by someone with basic camera knowledge. The ability to view evidence in real-time infrared, similar to viewing with a video camera, makes this technology particularly useful for screening items in the field and in the lab. Research into the limits of IR photography will also be discussed, including when it is necessary to move to chemical enhancement. The benefits of this technology can be realized on a wide variety of evidence including, but not limited to, bloodstain patterns, gunshot residue, document alteration, victim identification, injury documentation, and fingerprints. This applies to evidence both in the laboratory and in the field. Various camera systems will be discussed, along with the necessary lighting and camera conversions. Examples of how IR and UV photography have been used in casework will be presented, including the reconstruction of bloodstain patterns on the clothing of suspects in two homicide cases. In both of these cases, infrared photography was crucial in visualizing patterns that could not be readily seen with the naked eye and that were needed for the interpretation of events and the reconstruction of the incident. The presenter will bring an infrared camera to give the audience the opportunity to see the technology first-hand and, ideally, to appreciate its simplicity as well as its advantages in the visualization of a variety of evidence types.
Keywords: IR/UV PHOTOGRAPHY; DIGITAL TECHNOLOGY; CAMERA SYSTEMS

A Preliminary Study on the Individualities of Monochromic Laser Printers Based on Banding Features
Ning Liu, MA1, George Chiu2, Chuntao Chen1, Daozhong Lv1
1Department of Forensic Science, Jiangsu Police Institute, Nanjing, China; 2School of Mechanical Engineering, Purdue University, West Lafayette, IN, United States

Banding artifacts, which are caused by photosensitive drum velocity variation or the resulting scanline spacing variation, are often perceived in the output of laser printers as periodic light and dark bands perpendicular to the print direction. Gear transmission errors have been shown to be the main source of these output density fluctuations, which have been treated as class features for the forensic classification of laser printers in prior research, and frequency analysis has been used to measure the frequencies of halftone banding. Based on the same theory and method, 50 devices of two models of HP® laser printers, with several different photoreceptor drums, were sampled and investigated. The banding in both printed halftone images and black text was analyzed using the signal power spectrum. The objective of this study was to demonstrate the possibility of discriminating documents printed by different laser printers of the same model by means of banding analysis. Three methods were used to optimize signal extraction: scanning in reflective mode for halftone images, scanning in film mode, and microscopic imaging with transmitted light for black text. This study showed that, while a set of specific banding frequency components characterizes the class signature of a laser printer model, the relative intensities of the banding signals consistently exhibit individuality, reflected in the varying amplitudes of those frequency components. Banding artifacts can thus be a promising feature for eliminating suspect printer(s) when notable differences in the relative intensity of the banding signals are detected.
Keywords: LASER PRINTER INDIVIDUALITY; BANDING ARTIFACT; FREQUENCY ANALYSIS
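As a simplified sketch of the frequency-analysis step described here, one can collapse a scanned patch to a one-dimensional intensity profile along the print direction, estimate its power spectrum with an FFT, and read off the amplitudes of the banding peaks. The scan resolution, banding frequencies, and noise level below are hypothetical, for illustration only:

import numpy as np

# Hypothetical scan: 600 dpi, so positions are in inches along the print direction.
dpi = 600
n_rows = 4096
y = np.arange(n_rows) / dpi  # position (inches)

# Simulated row-mean intensity profile of a halftone patch: two banding
# components (in cycles per inch) plus noise. Real profiles would come from
# averaging each scanned row perpendicular to the print direction.
banding = 0.8 * np.sin(2 * np.pi * 23.5 * y) + 0.3 * np.sin(2 * np.pi * 47.0 * y)
profile = 128 + banding + np.random.default_rng(1).normal(0, 0.2, n_rows)

# Power spectrum of the mean-removed profile.
spectrum = np.abs(np.fft.rfft(profile - profile.mean())) ** 2
freqs = np.fft.rfftfreq(n_rows, d=1 / dpi)  # cycles per inch

# Report the two strongest peaks below 100 cycles/inch.
mask = (freqs > 0) & (freqs < 100)
top = np.argsort(spectrum[mask])[-2:]
for i in sorted(top):
    print(f"peak at {freqs[mask][i]:.1f} cycles/inch, relative power {spectrum[mask][i]:.1f}")

Comparing the relative heights of such peaks across candidate printers is, in essence, the kind of individuality assessment the authors describe.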