One of the things we didn’t mention in our prior post about the excellent Sardis v. Overhead Door Corp., 10 F.4th 268 (4th Cir. 2021), Rule 702 decision, was that one of the inadmissible experts had relied on “search[ing] it on Google” as the basis for some of his junk opinions.  Id. at 287 (citing transcript of expert’s testimony).

[A]t first blush, [the expert’s] testimony . . . appears relevant.  But in reality, [the expert] never explained what those . . . standards are or if they even exist.  Instead, [he] pointed the district court and the jury to “Google” for standards he could not identify.  No other witness offered testimony on these unidentified standards.  That is patently insufficient to establish a duty to test a product in a certain way and a breach of that duty.

Id. at 289.  We’ve wondered about that aspect of Sardis ever since.

Then we recently saw a blog post about another case that came to the same conclusion, Sherman v. BNSF Railway Co., 2022 WL 138630 (C.D. Ill. Jan. 14, 2022).  So we decided to take a closer look.  The answer is pretty clear − “[s]uffice it to say that Google searches by an expert to support an opinion are not an encouraging and confidence building method of research for a purported expert in a field.”  Nobles v. DePuy Synthes Sales, Inc., 2020 WL 6710810, at *2 (D.S.C. Aug. 3, 2020) (excluding search purporting to establish product identification).  Likewise, “a ‘quick Google search’ is not admissible evidence.”  Vargas v. United States, 2020 WL 6894666, at *5 (N.D. Ill. Nov. 24, 2020), aff’d, 17 F.4th 753 (7th Cir. 2021).  See Miller v. Coty, Inc., 2018 WL 1440608, at *5 (W.D. Ky. March 22, 2018) (“These opinions are excludable because they either rely on personal opinions or informal internet searches, neither of which are reliable sources of opinions under Rule 702 or Daubert.”).

An excellent example is Sherman, which involved junk science causation testimony in what we call a “toxic soup” chemical exposure case.  The plaintiff in Sherman had been “exposed to various toxic substances and carcinogens including but not limited to asbestos, coal dust residue, solvent fumes, oil mist, diesel exhaust, benzene, and brake dust,” and claimed cancer caused by “cumulative” exposure.  Id. at *1.  The excluded expert had relied on Google searches for essentially all of his litigation-inspired research.  “[H]is ‘general approach is to do a Google search, and that’s exactly what I did in this case as well.’”  Id. at *3 (quoting expert’s testimony).

But rote reliance on Google’s proprietary black-box search algorithm didn’t cut it under Rule 702.  The expert:

  • “did not retain a list of what he viewed, and what information he considered,”
  • “has no record of when the Google search was performed,”
  • did not know “what search terms he used,”
  • did not know “which sites he looked at” or “which articles he looked at,” and
  • could not testify to “what information he considered and discarded or why.”

Id.  That combination of total reliance on Google and failure to document even that methodology resulted in exclusion of a causation opinion that “seriously lacks indicia of reliability.”  Id.  Because plaintiff’s expert merely “conducted a Google search but did not retain a list of studies which related to that search,” his justification for his opinions was “entirely ipse dixit.”  Id.

The Court is entirely precluded from finding [the expert’s] methodology was reliable where he did not keep any record whatsoever of the particulars of his Google search. . . .  Significantly, it would be essentially impossible for defense counsel to effectively cross-examine [him] at trial without knowing the particulars of [his] Google searches, specifically any information he reviewed and rejected and the reasons for doing so.  [This] methodology is so lacking that it would be nearly useless to apply the non-exhaustive Daubert factors to it in order to determine its reliability.  [The expert] . . . made abundantly clear that he expects the parties to accept [his causation opinion] because he says [so].

Id.  The only thing that this excellent discussion in Sherman lacks is precedential support from other Rule 702 decisions evaluating Google-based expert testimony.  That’s what this post intends to rectify.

An expert opinion resting on a Google search was similarly excluded as ipse dixit in Walsh v. LG Chem America, 2021 WL 4859990, at *6 (D. Ariz. Oct. 19, 2021).  The mere existence of a large number of Google hits could not establish a design defect:

[T]he [expert’s] Report’s cited sources do not clearly indicate that the dangers posed by the batteries relate to the [claimed defect]. . . .  [A] Google search for “lithium ion battery 18650 fire” that returns 400,000 hits only proves that the batteries may have been involved in a number of fire-related incidents; it does nothing to prove that such incidents were directly the result of unprotected terminals.

Id.  Thus, Walsh concluded that, under Rule 702, there was “simply too great an analytical gap between the data and the opinion proffered.”  Id. (quoting General Electric Co. v. Joiner, 522 U.S. 136, 146 (1997)).

Winkler v. Madix, Inc., 2018 WL 4286197 (N.D. Ill. Sept. 7, 2018), reached essentially the same result.  The plaintiff’s warnings “expert” offered opinions for which “his sole research was to do a ‘Google search’ of potential warning labels that he believed could have been used and then to design his own label.”  Id. at *5.  That “methodology,” if it could be called that, did not pass Rule 702 muster:

[The expert’s] ultimate conclusions [case-specific description omitted] are not reliable because [he] has not adhered to the standards of intellectual rigor that are demanded in [his] professional work, such as relying on the data generated by other researchers, making proper personal observations or taking other appropriate actions. . . .  [T]he record reflects there is simply too great an analytical gap between the data and the opinion proffered by [the expert] in his expert report.

Id. at *6 (citations and quotation marks omitted).

Exclusion of an expert was affirmed in Bielskis v. Louisville Ladder, Inc., 663 F.3d 887, 894 (7th Cir. 2011), where the expert’s methodology of “using the Internet search engine Google and typing in [a] phrase” . . . “sounded more like the sort of ‘[t]alking off the cuff’ − without data or analysis − that we have repeatedly characterized as insufficient.”  Id. at 894.  In Fiveash v. Allstate Insurance Co., 2013 WL 12097615, at *3 (N.D. Ga. Aug. 1, 2013), an expert “explain[ed] that public adjusters often rely on Internet search engines, such as Google and Amazon, to look up the prices of personal property.”  Id. at *3.  That didn’t survive Rule 702 scrutiny either.  Even if that was “standard practice” outside of litigation, the expert’s “failure to provide any support requires the Court to take a leap of faith and rely on [the expert’s] ipse dixit.”  Id. at *4.  Exclusion of expert testimony was affirmed in Ervin v. Johnson & Johnson, Inc., 492 F.3d 901 (7th Cir. 2007), for similar reasons.  “[A]n Internet Google search that revealed one case report,” combined with “temporal proximity,” was not sufficient grounds to “rule in” the defendant’s drug for purposes of “differential diagnosis.”  Id. at 903.  Likewise, “a compilation of Google and Yahoo search results” was not a reliable basis for an expert opinion in vonRosenberg v. Lawrence, because “search engines tailor their results to ensure that the responses are most helpful to the searcher.”  413 F. Supp.3d 437, 451 (D.S.C. 2019) (citation and quotation marks omitted).  Similarly, a Google-search-based opinion was bounced in Price v. L’Oreal USA, Inc., 2020 WL 4937464, at *4 (S.D.N.Y. Aug. 24, 2020), because “[w]ithout a record of the materials reviewed, [the] methodology cannot be tested, challenged or replicated.”

There are many more examples of courts excluding experts where their purported methodology was little more than running Google searches and seeing what turned up.  See Longoria v. Kodiak Concepts LLC, 2021 WL 1100373, at *15 (D. Ariz. March 23, 2021) (expert’s “decision to rely on three websites solely because they were near the top of a list of 3.37 billion search results from a Google query seems questionable”); Elevate Federal Credit Union v. Elevations Credit Union, 2021 WL 4034166, at *3 (D. Utah Sept. 3, 2021) (expert’s production of Google searches not prejudicially late; inviting defendant to seek “exclu[sion]” under Rule 702 “to the extent they reflect flaws in his methodology”); Wai Feng Trading Co. Ltd. v. Quick Fitting, Inc., 2018 WL 6726557, at *10 (D.R.I. Dec. 21, 2018) (“online searching and Google” as a “methodology also fails to pass the Daubert gatekeeping stage”); B.F. v. Abbott Laboratories, Inc., 2016 WL 2609794, at *4 (E.D. Mo. May 6, 2016) (excluding opinion based on an “unpublished doctoral dissertation of an unidentified student” dredged up by a “Google search”); Toffoloni v. LFP Publishing Group, LLC, 2010 WL 4877911, at *2 (N.D. Ga. Nov. 23, 2010) (expert who “used Google searches . . . did not show how this seemingly imprecise method is objectively reliable”), aff’d, 483 F. Appx. 561 (11th Cir. 2012); Makor Issues & Rights, Ltd. v. Tellabs, Inc., 2010 WL 2607241, at *5 (N.D. Ill. June 23, 2010) (opinion that ignored material produced in discovery in favor of “search[ing] the Internet using ‘Google’ and ‘Bing’ search engines” excluded); Arista Records, LLC v. Usenet.com, Inc., 608 F. Supp.2d 409, 424 & n.22 (S.D.N.Y. 2009) (that an expert “ran a few Google searches” was not a sufficient basis for an opinion).  Plainly, an expert saying simply “I Googled it” is not enough to meet the proponent’s burden of proof under Rule 702.

Nor can Google transform someone into an expert; searching is no substitute for actual qualifications.  Hall v. Flannery, 840 F.3d 922, 930 (7th Cir. 2016), rejected the testimony of a purported expert who “acknowledged at trial that when he read the” decedent’s “autopsy report, he ‘didn’t know what [the fatal condition] was exactly.’  So he conducted a Google search and found several papers.”  Id. at 930.  That wasn’t sufficient to establish the necessary expertise:

We do not doubt that [the witness] is an intelligent doctor who possesses considerable knowledge about [certain medical fields].  However, the record lacks sufficient evidence demonstrating that this knowledge and the related experiences render [him] qualified to opine about [the decedent’s condition].

Id.  Madison v. Courtney, 2019 WL 8263428 (N.D. Tex. Jan. 26, 2019), reached a similar conclusion − albeit more pungently:

The Court finds that [the witness] is not qualified to testify as an expert on the relevant facts. . . .  [I]t appears [the witness], instead of turning to medical or dental publications, merely ran a Google search . . ., pasted the results, and then concluded, [specific opinion language omitted], with no explanation of how his special knowledge or training assisted him in reaching an expert conclusion.

Id. at *3.  Accord Rodgers v. Beechcraft Corp., 2016 WL 7911632, at *4 (Mag. N.D. Okla. Oct. 26, 2016) (that the witness “conducted a Google search” and one other search did not “qualify him to proffer the challenged opinion”), adopted, 2017 WL 465474 (N.D. Okla. Feb. 3, 2017); Loussier v. Universal Music Group, Inc., 2005 WL 5644422, at *3 (S.D.N.Y. June 28, 2005) (the “specific research that [the witness] did for this case, such as ‘Google searches on the internet,’” did not qualify the witness as an expert).

Since anyone can run a Google search, and most everyone who might serve on a jury already has, some courts have held that such activity “falls within the common knowledge of the jurors.”  Charalambopoulos v. Grammer, 2017 WL 930819, at *19 (N.D. Tex. Mar. 8, 2017) (“running a basic Google name search does not require any specialized background, so presenting evidence on the results of such a search does not require expert testimony”).  Likewise, an expert opinion based on “the results of basic ‘Google searches’” was excluded in De Boulle Diamond & Jewelry, Inc. v. Boulle, Ltd., 2014 WL 4413608 (N.D. Tex. Sept. 5, 2014), because “[n]one of this evidence requires any special knowledge, skill, or experience to interpret” and “[t]he jury is fully capable of considering the same evidence.”  Id. at *4.

Conversely, Google searches can be admissible when undertaken in support of other, more scientific methodology.  Several cases so hold.  See Takeguma v. Freedom of Expression LLC, 2021 WL 487884, at *7-8 (D. Ariz. Feb. 10, 2021) (expert opinion relying on a Google search admissible because the opinion was also based on several other sources); Scarlett v. Doe, 2020 WL 7586942, at *5 (W.D. Wash. Dec. 22, 2020) (opinion not excludable due to incidental use of Google); Morris v. Tyson Chicken, Inc., 2020 WL 4208057, at *5 (W.D. Ky. July 22, 2020) (opinions based “in part” on Google results admitted “especially considering the work [the expert] did in addition to his Google searches”); Geiger v. Creative Impact, Inc., 2020 WL 3268675, at *9 (D. Ariz. June 17, 2020) (the expert’s “opinion was based on much more than ‘a web search via Google’”); Cox v. Callaway County, Missouri, 2020 WL 1669425, at *3 (W.D. Mo. Apr. 2, 2020) (opinion admitted where expert “did not appear to rest his opinion on the standard of care . . . solely on a Google search”); In re E.I. du Pont de Nemours & Co. C-8 Personal Injury Litigation, 2017 WL 237778, at *8 (S.D. Ohio Jan. 11, 2017) (expert may use Google searches to “lead him to” information of a type that experts properly rely on).

We love Google.  We use it all the time − but we don’t want our experts relying on it, except perhaps tangentially.  We expect our experts to be real experts.