
We have been following the issue of potential product liability for software, including in connection with medical devices, for a while.  Much of our attention, predictably, has been on FDA regulation of device software, including issues related to resistance to hacking to obtain information or cause harm.  Like here, here, here, and here.  We (well, Bexis and a colleague) have also taken a dive into the legal implications of the brave new world of 3D Printing.  As of that publication in 2017, there had not yet been any published decisions finding that software was a product for purposes of any state’s product liability law.  Id. at 163.  We have been surprised by the relative paucity of litigation over personal injuries against software manufacturers, social media companies, application developers, and manufacturers of various products that utilize software.  They get sued over alleged consumer economic injuries, business torts, contract disputes, and a range of anti-competitive behaviors all the time, but not that much over claims that defects in software resulted in physical injuries to actual people.  At least so far.

Given that we are just simple drug and device product liability lawyers, the issues in this post made us enlist an honest-to-goodness tech nerd lawyer, Gerry Stegmaier.  He deals with things like cybersecurity, privacy, the internet, and investigations for tech companies.  That makes this a hybrid (or perhaps cyborg) guest post with a modified disclaimer about responsibility and blame.  Not that we really think our disclaimers do much in the real world.

If science fiction has taught us anything, however, it is that technological advances can have unintended consequences.  That certainly goes for software.  HAL 9000 in 2001:  A Space Odyssey losing it on the crew – software.  Skynet in the Terminator movies achieving self-awareness, triggering a nuclear holocaust, and sending a cyborg back in time, only to have that cyborg get crushed by a hydraulic press and give humans the technology to create Skynet – software (and the time paradox).  Self-driving cars deciding who gets injured in no-win situations – software.  Asimov’s first law of robotics, that a robot may not injure a human being or, through inaction, allow a human being to come to harm – software.  Robots, of course, need to be programmed to do their thing.  More and more, as everything from vacuums to drones to automobiles operates independently of contemporaneous human judgment and guidance, we will face questions about programming and about how that programming performs when tied to traditional products.

Marc Andreessen famously said, “software is eating the world.”  Unfortunately, much like litigation over obesity, the world will look to allocate responsibility for the effects of electronic over-eating.  Programming, at least in almost all of its current forms, reflects advanced human judgment and guidance, because artificial general intelligence does not yet exist, regardless of current industry hype cycles.  Increasing reliance on software and the programming it embodies may be scary to Luddites and anti-Luddites alike.  The role that humans play in the programming may assuage some, whereas others may despair at human fallibility (or our collective arrogance).  We cannot really spell that all out for everyone, but we do know that injuries happen and people sue over them.

In Holbrook v. Prodomax Automation Ltd., No. 1:17-cv-219, 2021 U.S. Dist. LEXIS 178325 (W.D. Mich. Sept. 20, 2021), we have what we think is the first decision to recognize software as a product for purposes of state product liability law.  It involves a death from an accident on an assembly line that used a bunch of robots and (as is almost always the case with product liability cases involving machinery) an injured person, here an employee, who undisputedly failed to follow in-place safety policies.  Her estate sought to impose liability on a number of entities principally on the theory that the software that ran the robots was insufficient to protect her from her own irresponsible actions.  Owing to some quirks of Michigan law, it was the defendants who wanted the allegedly defective software to be considered a product.  That is the opposite of what we expect most software/programming companies would want when faced with potential liability for personal injuries.

Some Software Background

The software industry, despite occasional rhetoric suggesting it makes “products,” has much to lose and little to gain should the law surrounding intangibles be retooled.  Decisions that revisit the long-standing notion that software is not a product for purposes of product liability law will most certainly catch Silicon Valley and the rest of the software world by surprise.  Few software companies possess lawyers, much less product management teams, with any specific experience or familiarity with product liability law.  Where companies provide software for products, the software itself is almost never treated as a component of the product, nor is it “manufactured” using product-related standards.  In essence, the old saying that if something has no bugs it cannot be software (which brings a smile to any software engineer’s face) has important implications if imported into product liability law.  Software is almost always intended to replace things that people used to do, whether that is driving, reading x-rays, or (as in Holbrook) operating an assembly line.  The law traditionally has judged what people do by a negligence standard of a “reasonable person” in the same position.  Product liability, however, imposes strict liability.  Treating software as a “product” thus runs the risk of replacing reasonableness with strict liability standards for everything software does.  All of which brings us directly to the present case.

Holbrook’s Facts

The facts of the case are a bit complicated for those not versed in assembly line manufacturing, but we will try to boil them down to focus on the software liability issues for the two remaining defendants, one of which (Flex-N-Gate or “FNG”) was the parent company of the decedent’s employer and the other of which (Prodomax) designed, built, and installed the employer’s assembly line.  The companies that made the robots utilized in the assembly line had been dismissed.  FNG used the assembly line to make trailer hitch receiver assemblies for a certain brand of pickup trucks, and the decedent was employed as a maintenance technician.  To perform maintenance while the assembly line was operating, the decedent was supposed to press a button and wait for a green light before entering through a security door.  Pressing the button would turn off the robots in that area and raise the height of the safety walls around it, preventing other robots from entering the area.  She was also supposed to have locked the door to the area she was entering in an open position, which would also prevent robots from entering that area.

The decedent did none of those things.

Instead, one day in 2015, the decedent climbed over a wall into a particular area to conduct some required maintenance.  With none of the safety measures activated, once she had addressed the maintenance issue, a robot from another section, performing as programmed, entered the zone and crushed her, after which other robots burned her (as programmed to weld parts together).  The injuries were fatal.  Shortly after the accident, the assembly line was re-programmed to power down all the robots when any zone door was opened.  Were the robots truly robots, in an Asimovian science fiction sense, they could have followed his first law, which would have precluded them (again, through programming) from harming any human.  But this isn’t science fiction, and that type of sentient self-awareness does not yet exist.  Nor does the programming to allow a machine to address the situation independently.  This reality is not at all new, and it remains one that philosophers, scholars, and artificial intelligence and machine learning experts continue to ponder and tackle.  (Not the most august source, but this should give you some idea of what we mean here.)
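For readers who want to picture the difference between the original interlock and the post-accident fix, here is a minimal sketch.  It is written in Python rather than the ladder logic or structured text an actual PLC would run, and every name, signal, and structural choice below is our own hypothetical illustration of the opinion's description, not anything drawn from the system at issue.

```python
# Hypothetical illustration only: nothing here reflects the actual PLC code,
# signal names, or architecture of the assembly line in Holbrook.
from dataclasses import dataclass, field


@dataclass
class Zone:
    entry_requested: bool = False   # worker pressed the button and waited for the green light
    door_locked_open: bool = False  # worker locked the zone door in the open position
    door_open: bool = False         # door position sensor


@dataclass
class Line:
    zones: dict = field(default_factory=dict)

    def robots_excluded_from(self, name: str) -> bool:
        """Original, worker-dependent interlock as the opinion describes it:
        robots stay out of a zone only if the worker followed the entry
        procedure before going in."""
        zone = self.zones[name]
        return zone.entry_requested or zone.door_locked_open

    def all_robots_powered_down(self) -> bool:
        """Post-accident logic as the opinion describes it: opening any zone
        door powers down every robot, regardless of what else the worker did."""
        return any(zone.door_open for zone in self.zones.values())


line = Line(zones={"zone_a": Zone(), "zone_b": Zone()})
line.zones["zone_b"].door_open = True          # a door opens without the button being pressed
print(line.robots_excluded_from("zone_b"))     # False: the original scheme still lets robots in
print(line.all_robots_powered_down())          # True: the revised scheme shuts everything down
```

The contrast, and the gist of the estate's theory, is that the original design made the safety shutdown contingent on the worker doing something, while the reprogrammed design added a shutdown that triggers automatically off a door sensor.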

Holbrook’s Analysis

Shortening the procedural history some, the decedent’s estate brought both common law negligence claims (which would have been the only claims possible had a non-programmed human run over the decedent with a forklift) and product liability claims based on the programming of the assembly line.  The court decided dueling summary judgment motions.

As readers will know from plenty of litigation over the years on the “FDA defense,” Michigan enacted a fairly defense-friendly Product Liability Statute (the “MPLS”) in 1995.  One aspect of that statute is that all common law negligence claims are abrogated when the statute applies.  Id. at *11.  Certain statutory requirements and defenses led to the defendants wanting claims based on the programming of the assembly line to be considered “product liability” claims.  Under the MPLS, a product liability action seeks recovery under any theory for physical injuries “caused by or resulting from the production of a product.”  Id. at *12.  A product includes all of its component parts.  Id. at *14.  Production, in turn, is defined broadly as the “manufacture, construction, design, formulation, development of standards, preparation, processing, assembly, inspection, testing, listing, certifying, warning, instructing, marketing, selling, advertising, packaging, or labeling” of a product.  Id. at *13.  This broad definition tends to be helpful to manufacturers in Michigan.

Whether software/programming is a product under Michigan law had not been addressed by any court, so the Holbrook court’s analysis was on a clean Erie slate.  As noted above, relatively few cases anywhere have addressed this issue, and those that have (without the broad definition from the Michigan statute) have found that it is not.  (A recent opportunity for an appellate decision on this issue under California and/or Wisconsin law went away when the Ninth Circuit limited its decision in Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), to the issue of immunity under Section 230 of the Communications Decency Act, which generally immunizes interactive computer service providers from liability for publishing another’s content.)  Those decisions have generally looked to definitions, as in both the Second and Third Restatements of Torts, of products as tangible things (that is, things that can be touched), whereas intangible things like software would be a service or rights to intellectual property.  With the facts here and the Michigan statutory definition, however, the court’s focus was on the programming as a “component” of the assembly line, which was clearly a product.  2021 U.S. Dist. LEXIS 178325, at **15-16.

The PLC programming is an “integral” and “essential” part of the 100 line because, as [the estate] puts it, “without . . . the PLC program . . . the robotic components would not have been orchestrated to move at all within the line.”

Id. at **16-17 (citation omitted).  Thus, the challenge to the programming of the safety measures was a challenge to the design of a product for purposes of the MPLS, a challenge that software producers in most states would not want to confront.  Interestingly, although the court stated that the programming was “either a product itself or a component of the [assembly] line,” id. at *16, it never directly ruled that the programming, alone, was a product.  Indeed, it emphasized that the “programming need not qualify as a product itself.”  Id. at *17.

The estate offered a number of arguments to try to avoid abrogation.  It argued that the programming was not part of the assembly line’s design because it occurred after the assembly line’s installation.  “That the programming was completed at the facility does not make it any less a part of the product’s design.”  Id. at *18.  It also argued that the definition of “product” in the Restatement (Third) of Torts § 19 as “tangible personal property distributed commercially for use or consumption” should be followed, but Michigan courts had not relied on this definition, and the MPLS statutory definitions discussed above were broader than the definition in the Restatement or in the current Black’s Law Dictionary.  Id. at *19.  The estate also argued that the software here would be considered a service under the Uniform Commercial Code, but the “ultra-broad definition of ‘production’ in the MPLS means that the heavy bulk of Prodomax’s work in making the [assembly] line would qualify as a provision of goods rather than services.”  Id. at **19-20.  It also challenged the premise of abrogation, but that was a dead end.

With abrogation in place, and with the estate’s agreement that FNG was not a manufacturer or seller under the MPLS, FNG was dismissed.

Prodomax could not rely on that distinction, so it argued that the decedent’s misuse, admitted by the estate, barred recovery under the MPLS.  It did not assert other defenses under the MPLS, including the comment k defense that the assembly line was unavoidably unsafe.  Under the MPLS, misuse bars recovery only if it was not reasonably foreseeable, and foreseeability is a legal issue for the court to decide.  Here, the misuse was found to be foreseeable mostly because the security walls were easily climbed over when not raised.  Id. at *23.  It was insufficient that employees were trained on and expected to follow safety protocols, because “Prodomax anticipated improper wall-climbing and took measures, mainly through installing Vertiguard walls, to make such misuse more difficult.”  Id. at *24.  That’s one difference between strict liability, even of the statutory kind, and negligence.  Plaintiffs can recover despite acting unreasonably, while products must always perform safely.

Implications

While the defendants in Holbrook achieved their near-term litigation objective, the longer-term implications of the decision, or similar decisions, could have grave consequences for software development generally and for liability considerations related to any devices that are connected to the Internet or otherwise utilize software.  As a practical matter, this will increasingly implicate virtually all “smart” devices.  Software development is generally regarded as iterative (that is, proceeding through repeated cycles of revision), and one of the most distinctive features of Internet-enabled devices is their ability not only to receive continuous software updates but also to patch and otherwise repair existing software.  Thus, the introduction of strict liability via a determination that programming or software is a product could have profound implications.

Liability for defects in the design of software, or in how other products utilize software, may not be terribly hard to prove in some cases.  In other situations, including those involving alleged defects in open source software or machine learning, it could be much more difficult to establish either what the defect was or who was responsible.

Unlike a prescription drug, for instance, software can and does get updated without running through lengthy studies and regulatory processes.  As in Holbrook, the ability to make a change to address a safety issue relatively quickly and easily could be used against software/programming companies in the context of risk/utility analysis.  Software providers probably will not have the learned intermediary (or sophisticated user) doctrine, comment k, or preemption as viable defenses.  In direct-to-consumer contexts, providers routinely seek to use contracts such as click-wrap licenses to allocate software-related liability through limitations on liability, forum-selection clauses, compulsory arbitration, and similar protective measures.  In the product liability context, when personal injuries are at issue, we would not expect those clicks to be as meaningful as their authors might hope.  Overall, technology companies and similar organizations should closely monitor the Holbrook decision and others of its ilk.  Such groups might not ordinarily pay close attention to product liability law, but it would appear that product liability law may be on the cusp of paying closer attention to software development.