HIPAA Constraints on Brain Fingerprinting Evidence in Criminal Court: Pitfalls & Possibilities
Orwellian fears of mass government use of neurotechnologies and the rise of a “thought police” are seemingly moving beyond fiction to the status quo. Novel neurotechnological applications have emerged in the courtroom, in public policy decision-making, and even in legal education through conferences and coursebooks. Municipalities are aware of this move: they have actively funded it. Sustaining this proliferation is millions of dollars’ worth of research grants from organizations like the John D. and Catherine T. MacArthur Foundation. [1] Yet these investments are not without reason. The current applications of neuroimaging technologies—namely fMRI, EEG, and PET scan evidence—are extensive and versatile, particularly in verifying the credibility of witness testimony. From proof of an inability to waive Miranda rights or to form criminal intent to evidence that certain plaintiffs are still experiencing pain after their accidents, the possibilities are endless. [2] The most recent development, the P300 EEG response tool, is already in use by the CIA, though details surrounding its usage remain undisclosed. [3]
With the increase in research and development of these technologies, questions arise as to whether those who would opt to undergo a P300 scan to bolster the credibility of their testimony—who are likely to be laypeople—can reasonably be expected to provide informed consent in line with the Health Insurance Portability and Accountability Act (HIPAA), which governs the disclosure of protected health information to entities like the courts. Specifically, use of the P300 scan raises two questions: (1) whether participants are familiar with the legal specifics of the provision, and (2) whether they have a working knowledge of the powers and limitations of these technologies, which researchers are still working to define. Therefore, as it currently stands, acquisition of P300 data—authorized or not—violates HIPAA guarantees: the data, by definition, falls under the category of protected health information, so any consent to its authorized disclosure could not be a fully informed decision. Such disclosures are precarious for a scan participant’s future employment opportunities because there is currently not enough research establishing the extent of the medical information that might be revealed; a P300 analysis could expose additional information that also falls under the protected health information umbrella. Thus, it is not yet possible for participants to make an ethically informed decision to consent to such disclosures in court.
‘P300’ refers to a particular brain wave response whose peak appears roughly 300 milliseconds after a familiar stimulus is presented to the participant. It rivals other current neurological techniques for lie detection, such as fMRI, as it is said to offer better temporal resolution than the localization-focused approach of the typical fMRI machine, potentially yielding more accurate results. [4] Hypothetically, this technique could serve as a better means of verifying whether suspects were at a particular crime scene. [5] Given that a 2008 analysis revealed that, in a review of 200 convictions overturned by DNA evidence, eighty percent involved mistaken eyewitness testimony, the tool emerges as having extremely valuable potential in criminal proceedings and within the domain of national security. [6]
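To make the underlying measurement concrete, the sketch below illustrates, in simplified form, the logic typically attributed to P300-based “brain fingerprinting”: EEG segments time-locked to each stimulus are averaged, and the average amplitude in a window around 300 milliseconds is compared between crime-relevant (“probe”) stimuli and irrelevant ones. This is a minimal illustration under assumed parameters, not the proprietary protocol used in any actual system; the sampling rate, window, synthetic data, and all variable names are hypothetical.

```python
# Minimal, illustrative sketch of P300-style epoch averaging (hypothetical parameters).
import numpy as np

SAMPLING_RATE_HZ = 250          # assumed samples per second
EPOCH_DURATION_S = 0.8          # each epoch spans 0-800 ms after stimulus onset
P300_WINDOW_S = (0.25, 0.50)    # window in which a P300 peak is typically sought

def mean_p300_amplitude(epochs: np.ndarray) -> float:
    """Average epochs (trials x samples), then take the mean amplitude
    of the resulting event-related potential inside the P300 window."""
    erp = epochs.mean(axis=0)                          # average across trials
    start = int(P300_WINDOW_S[0] * SAMPLING_RATE_HZ)   # window start in samples
    stop = int(P300_WINDOW_S[1] * SAMPLING_RATE_HZ)    # window end in samples
    return float(erp[start:stop].mean())

# Synthetic data: noise-only "irrelevant" trials, plus "probe" trials with an added
# positive deflection peaking near 300 ms, mimicking recognition of a familiar detail.
rng = np.random.default_rng(0)
n_samples = int(EPOCH_DURATION_S * SAMPLING_RATE_HZ)
t = np.arange(n_samples) / SAMPLING_RATE_HZ
p300_bump = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # Gaussian bump at 300 ms

irrelevant_epochs = rng.normal(0.0, 2.0, size=(40, n_samples))
probe_epochs = rng.normal(0.0, 2.0, size=(40, n_samples)) + p300_bump

difference = mean_p300_amplitude(probe_epochs) - mean_p300_amplitude(irrelevant_epochs)
print(f"Probe-minus-irrelevant mean amplitude in P300 window: {difference:.2f}")
# In this toy example, a clearly positive difference is treated as evidence of recognition.
```

Even in this toy form, the point relevant to the legal analysis is visible: the raw recording captures far more than the single recognition signal the test is designed to extract.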
No matter how alluring the applications sound, however, there is a critical concern common to many neurotechnologies currently in development: ensuring the privacy of participants. Specifically, the HIPAA Privacy Rule applies because institutions gathering such data are “covered entities” that would transmit health information in “electronic form” to courts. [7] Yet, as the Department of Health and Human Services’ own enforcement data show, private practitioners and physicians—those who would be contracted to conduct these tests—are among the most common violators of the Privacy Rule. [8]
Much like the work of the CIA, little is on the public record regarding how the courts interpret P300 data in the context of privacy concerns. In the 2003 case Harrington v. State, appellant Terry J. Harrington, convicted of first-degree murder, claimed the lower court had erroneously dismissed reports implicating another suspect, the recantation of statements made by the State’s primary witness, and new brain fingerprinting evidence. Upon review, the Supreme Court of Iowa reversed the lower court’s decision and granted him relief on a due process violation. [9] In an earlier Daubert hearing on whether to admit the brain fingerprinting evidence, Judge Timothy O’Grady had deemed the P300 technique admissible, but found that Harrington’s attorneys failed to prove that the presentation of such evidence would have changed the outcome in the lower court. [10] As such, there remains little legal scholarship directly engaging with the technique’s potential privacy concerns.
A clearer picture of what might be disclosed through evidence submissions—namely, medical data—can be seen in Koch v. Western Emulsions, Inc. (2006) and United States v. Mezvinsky (2002), both of which involved neuroimaging evidence analogous to the scan data at issue with P300. In Koch, plaintiff Carl Koch was awarded damages in a pain case arising from wrist injuries he had sustained in a work incident the year prior. To prove that he was still experiencing pain, Koch underwent an fMRI scan, which revealed a “signal in the pain matrix” from the injured wrist that was not present when his other wrist was lightly touched under the scan. [11] In United States v. Mezvinsky, defendant Edward M. Mezvinsky was charged with sixty-nine federal law violations, twenty-four of them related to fraudulent financial schemes. Mezvinsky offered PET scan evidence revealing frontal lobe organic brain damage in support of an insanity claim integral to his defense. [12]
In both cases, neuroimaging in its most basic form—scan interpretation—indicated that a participant’s mental states and sensitivities can reasonably be deduced from simple scan data. Indeed, arguments about what such observations reveal about a participant’s health have already been deemed scientifically sound enough to be presented to courts. Moreover, these observations are clearly detailed and documented for public viewing when court records are made publicly accessible online.
While the submission of P300 data in court has so far been limited to the Harrington case, preliminary research suggests that EEG readings may predict future health outcomes—ones that employers could potentially use as a rationale for denying employment opportunities. These predictions could be considered protected health information under HIPAA rules, since that category includes, but is not limited to, the participant’s past, present, or future physical or mental conditions and information that could reasonably identify the individual, such as name, address, birth date, and Social Security number. [13]
Furthermore, current EEG readings can indicate rates of neural atrophy that predict when memory decline will occur for the individual being scanned. [14] Such data have been used to demonstrate personality changes in individuals suffering from head trauma. [15] Scans have been proffered as proof of competency to stand trial. [16] Brain evidence has been offered to reduce sentence length where there is proof of brain trauma. [17] EEG and MRI results are commonly used to support findings of an organic mental disorder in Social Security disability law. [18] Similar data have been offered as proof of incapacity to enter a legally binding contract. [19]
Although this information fits the HIPAA definition of protected health information, in cases where authorization is given—that is, when courts are provided the written consent of the participant or a court-issued subpoena applies—these disclosures would be permitted under HIPAA. Of particular interest as it relates to P300 data, HIPAA currently permits disclosures “to identify or locate a suspect, fugitive, material witness, or missing person,” and, notably, to confirm or deny the presence of a defendant at a crime scene. [20] Although there is no indication that subpoenas are currently being leveraged to confirm or deny the presence of suspects at crime scenes, such uses are not out of the realm of possibility and are legally permissible given the language of the provision.
In either case, the privacy worry is ever-present. Although individuals might consent to such disclosures, the breadth of research into what can be deduced from P300 scan data suggests there is far more room within the neuroscientific research community to investigate the full capacities of these machines. As new medical conclusions arise, individuals might choose to opt out of voluntarily consenting to such a disclosure. Thus, better-informed consent for such disclosures would most appropriately align with the goals of the HIPAA Privacy Rule.
Therefore, in the wake of new applications for analyzing tangential mental and physical conditions from scan data, private practitioners ought to fully disclose any potential deductions beyond what P300 readings are intended to capture. Failing to do so calls into question whether informed consent can actually exist in this process. As the capabilities of our clinical technologies expand, it is critical that communication of their potential risks expands with them.
Edited by Alexander Liebskind
Sources:
[1] Francis X. Shen, “Neuroscience, Mental Privacy, and the Law,” 36 Harvard Journal of Law and Public Policy 2, 661 (2013).
[2] Teneille Brown and Emily Murphy, “Through a Scanner Darkly: Functional Neuroimaging as Evidence of a Criminal Defendant's Past Mental States,” 62 Stanford Law Review 4, 1132 (2010).
[3] Patricia Wen, Scientists Eyeing High-Tech Upgrade for Lie Detectors, Boston Globe (June 16, 2001), online at https://web.archive.org/web/20010625235945/http://www.boston.com/dailyglobe2/167/nation/Scientists_eyeing_high_tech_upgrade_for_lie_detectors+.shtml (visited October 14, 2021).
[4] Shen, “Neuroscience, Mental Privacy, and the Law,” 668.
[5] Michael Gazzaniga, The Ethical Brain 110 (Dana Press 2005).
[6] Brandon L. Garrett, “Judging Innocence,” 108 Columbia Law Review 55, 79 (2008).
[7] Summary of the HIPAA Privacy Rule, U.S. Department of Health & Human Services (2021), online at https://www.hhs.gov/hipaa/for-professionals/privacy/laws-regulations/index.html (visited October 14, 2021).
[8] Enforcement Highlights, U.S. Department of Health & Human Services (2021), online at https://www.hhs.gov/hipaa/for-professionals/compliance-enforcement/data/enforcement-highlights/index.html (visited October 14, 2021).
[9] Harrington v. State, 659 N.W.2d 509 (Iowa 2003).
[10] Gazzaniga, The Ethical Brain, 112.
[11] Brady Somers, “Neuroimaging Evidence: A Solution to the Problem of Proving Pain and Suffering?,” 39 Seattle University Law Review 1, 1408 (2016).
[12] United States v. Mezvinsky, 206 F. Supp. 2d 661, 663 (2002).
[13] Summary of HIPAA.
[14] Davide V. Moretti et al., “Specific EEG Changes Associated with Atrophy of Hippocampus in Subjects with Mild Cognitive Impairment and Alzheimer's Disease,” 2012 Radiology 1, 1 (2012).
[15] Donald J. Nolan and Tressa A. Pankovits, “High-Tech Proof in Brain Injury Cases,” 27 Trial 1, 27 (2005).
[16] Nathan J. Kolla and Jonathan D. Brodie, “Application of Neuroimaging in Relationship to Competence to Stand Trial and Insanity,” 1 NEUROIMAGING IN FORENSIC PSYCHIATRY ch. 9, 147–48 (2012).
[17] Judith G. Edersheim et al., “Neuroimaging, Diminished Capacity and Mitigation,” 1 NEUROIMAGING IN FORENSIC PSYCHIATRY ch.10, 163–64 (2012).
[18] 3 SOC. SEC. LAW & PRAC. § 42:147 n.1 (“In some cases, the origin of the dysfunction is readily identified with diagnostic tools such as computed tomography (CAT) scanning of the brain, magnetic resonance imaging (MRI) of the brain, or electroencephalography (EEG) which reveals the electrical brain wave patterns.”).
[19] Owen D. Jones and Francis X. Shen, “Law and Neuroscience in the United States,” 1 INTERNATIONAL NEUROLAW: A COMPARATIVE ANALYSIS 1, 354 (2012).
[20] Enforcement Highlights.