
Today we have a guest post on some recent developments on whether strict liability applies to software, apps, artificial intelligence, and other forms of electronic data, which depends, as this 50-state survey addresses, on whether such intangible items meet the common-law definition of “product.” It is by Reed Smith attorneys (and repeat guest posters) Mildred Segura and Jamie Lanphear. As always, our guest posters should receive 100% of the credit (and any blame) for what they have written.

**********

Those of us who grew up watching The Jetsons remember fantasizing about a future filled with flying cars and robot housekeepers like Rosey (later spelled Rosie), who handled everything from vacuuming to snarky commentary. While the flying cars haven’t arrived—yet—our present-day product liability defense attorney selves are equally captivated by the proliferation of AI, particularly in the life sciences sector.

AI’s expanding role across diagnostics, epidemiology, drug discovery, clinical trial management, personalized medicine, and healthcare operations has established it as a critical tool for advancing patient care, accelerating research, and optimizing business processes within the life sciences sector. But with innovation comes risk—and the legal landscape is evolving to address these new advancements. One of the trickiest challenges is that AI doesn’t fit neatly into existing legal frameworks. Product liability law, for example, has traditionally dealt with physical products. For decades, the prevailing view has been that software is not a “product.” With the proliferation of mobile apps, social media, and AI-enabled tools, however, that consensus is shifting.

Courts Are Redrawing the Line

Recent decisions show that courts are increasingly willing to entertain product liability claims against software-based systems—either in whole or in part—under a few different theories. Here are three (non-exclusive) approaches we’re seeing:

  1. Function-by-Function: The “Defect-Specific” Approach

The Northern District of California was the first to articulate the “defect-specific” approach in the In re Social Media Adolescent Addiction Personal Injury/Products Liability Litigation MDL. Plaintiffs—primarily parents suing on behalf of minors—allege that several social media platforms were intentionally designed to be addictive, leading to harms such as mental health issues and the sexual exploitation of minors. A key legal battle was whether these platforms should be considered “products” under traditional product liability law. Historically, courts have treated software as a service, not a product, shielding developers from product liability claims. And that was the argument made by the defendants in their motion to dismiss. The court, however, rejected this “all or nothing” approach. 702 F. Supp. 3d 809 (N.D. Cal. 2023). Instead, the court examined the specific functions alleged to be defective, evaluating whether each could be analogized to tangible personal property. For every single function it analyzed, the court concluded that there was some tangible property analogous to it and allowed the plaintiffs’ design defect claims to proceed. We have previously reported on this decision here and here.

The “defect-specific” approach has since influenced other rulings. In In re Uber Technologies, for example, the court applied a similar framework to evaluate safety features in a rideshare app, finding some sufficiently “product-like” to allow strict liability claims to proceed. 745 F. Supp. 3d 869 (N.D. Cal. 2024). In more recent cases—like AF v. Character Techs. and Doe v. Roblox—plaintiffs appear to be borrowing this playbook by focusing their product liability claims on the design and operation of specific software features, rather than the platform as a whole.

  2. App-as-Product: The Platform-as-a-Whole View

Some courts have taken a broader view, evaluating whether the app or platform as a whole is analogous to tangible personal property. In T.V. v. Grindr, the Middle District of Florida took this position to conclude at the motion to dismiss stage that a dating app was a product subject to strict liability. 2024 U.S. Dist. LEXIS 143777 (M.D. Fla. Aug. 13, 2024). The court considered how the app was designed, mass-marketed, and monetized, emphasizing that while the Restatement (Third) of Torts generally defines “product” as tangible personal property, the definition also includes intangibles when their distribution and use are sufficiently analogous to the distribution and use of tangible goods. The court highlighted the need for common law to keep pace with technological change and underscored the policy behind strict liability: placing the cost of injury on the party best able to prevent it. Mass-marketing and distributing the app for profit was enough, in the court’s view, to make it a product under tort law.

  3. Drawing a Line Between Design and Content

A third approach—often invoked in response to arguments under Section 230 of the Communications Decency Act (CDA)—distinguishes between content and the medium through which content is delivered. In Garcia v. Character Technologies, a case involving claims against a generative AI chatbot, the court differentiated between the chatbot’s expressive outputs (not actionable under product liability) and its design features, such as inadequate age verification, which were actionable. Claims based on “content” were dismissed; claims based on “design” were not. This ruling reinforced a critical and increasingly litigated boundary: product liability applies to how a digital tool is built—not what it says.

Continuing the Trend: Nazario v. ByteDance Ltd. as the Most Recent Example

The latest example comes from New York. In Nazario v. ByteDance Ltd. (June 2025), the court declined to dismiss product liability and negligence claims against several social media platforms. The plaintiff alleged that algorithms had targeted a minor with harmful content. In its analysis, the court drew a clear distinction between passive content delivery—which is often protected by Section 230—and the use of targeted recommendation engines that leverage demographic and behavioral data.

That distinction proved significant. The court determined that, if proven, the platform’s targeting practices could constitute a design defect in a product, rather than mere editorial judgment. As a result, Section 230 did not categorically bar these claims, since they were framed around the design and function of the algorithms, not their content. This decision reinforces the growing trend of courts treating software and AI as products—at least at the motion to dismiss stage. Courts have yet to address the issue at the summary judgment stage, so it remains to be seen how they will rule once they have the benefit of a full factual record developed through discovery.

A New Wrinkle from Ohio: Deditch v. Uber Technologies, Inc.

A recent development in an Ohio case, Deditch v. Uber Technologies, Inc., 2025 WL 1928937 (N.D. Ohio July 14, 2025), adds a new wrinkle to this evolving area. There, a ride-sharing company moved to dismiss a negligence claim involving its app by arguing that the Ohio Product Liability Act (OPLA) abrogates all common law product liability claims—including those based on the alleged negligent design of its app. What’s curious about this strategy is that the defendant simultaneously agreed “that its app is not a product” under OPLA’s definition. Id. at *2. That concession raised a novel issue: whether OPLA’s broad abrogation of common-law product liability extends to litigation not involving statutory “products.” Id. at *3. The court thus certified to the Ohio Supreme Court the question of whether claims involving digital apps fall within the Act’s scope. This case is one to watch, as the Ohio Supreme Court’s answer could set an important precedent for how software is treated under state product liability statutes.

Meanwhile, in the EU…

Across the Atlantic, lawmakers in countries without the Anglo-American common law tradition are not waiting for the courts. The EU recently adopted a new Product Liability Directive that explicitly defines software and AI as “products”—even when delivered as a service. While that shift does not bind U.S. courts, it does add international momentum to an already-evolving concept. For global companies, the implications are hard to ignore.

So, Is Software Now a Product?

Not always. Not everywhere. But it’s getting harder to say no. Courts are showing a growing willingness to interrogate how software works—and, in some cases, to treat it like the very things tort law was designed to address: products.

As software increasingly serves as the bridge between human decision-making and tangible, real-world outcomes, the traditional distinction between a service and a product may be breaking down.

We are witnessing this boundary shift in real time. Software, step by step, may be moving from one side of the line to the other.