A. Vince Colella
Moss & Colella P.C.
For decades, advertising expenditure has been on the rise. Companies now spend over $300 billion a year peddling their products and services. Consequently, the acquisition of market data for creating “targeted” consumer ads is of paramount importance to sales revenue. To provide perspective, in 2021 the global market for data collection (and analysis) was estimated to be worth nearly $17.7 billion. However, the clever pitches and wild imaginations of the Mad Men advertising world of the past have long been surpassed by technologically innovative ways of determining what consumers want and, more importantly, what they need.
Has the quest for consumer preferences gone too far?
For years, internet search engines have served as windows to our consumer souls. “Cookies,” the small pieces of text a website sends to a browser, ostensibly to help users return to the site later, have proven incredibly useful in gauging interest in commercial products and services. In the last several years, tech giants have come under sharp criticism for their clandestine collection of consumer interests, resulting in multiple class action lawsuits over the unauthorized collection, distribution, and sale of our cookie data.
Yet, the practice of acquiring data continues and has evolved further with the public’s heavy reliance on voice recognition software.
Today, our voices, fingerprints, and facial features are being recognized and digitally stored (and shared) without consent, raising alarming legal and ethical concerns. While it is certainly convenient to ask Siri to call a friend or family member, it is frightening to think that “she” hears and records everything we say without our knowledge or permission. All of us have had the experience of engaging in a private conversation about an interest, say golf, only to pick up our phones or log onto our computers minutes later and be inundated with advertisements for golf products, accessories, and destinations. This raises the question: Do we have a privacy right against the unauthorized use of digitally stored information for advertising purposes?
Recently, Google settled a whopping $5 billion consumer protection lawsuit alleging that Google’s analytics, cookies, and apps allowed it to track consumer activity even when users set Google’s Chrome browser to “incognito” mode or other browsers to “private” browsing mode. Brown v Google, LLC, US District Court, Northern District of California, No. 20-03664. At the center of the controversy was whether Google had made a legally binding promise not to collect users’ data when they browsed in private mode. At the summary judgment stage, the court ruled that Google’s privacy policy and other statements made by the company suggested that there were limits on how and what data the company could collect.
Interestingly, most states do not have specific legislation carving out civil causes of action for the unauthorized collection and sale of data. In fact, only five states have comprehensive data privacy laws. The absence of legislation and statutory penalties gives tech companies a convenient defense as to the nature and extent of the “damages” caused by the unauthorized collection of browsing history and web activity. In other words, tech companies are asking: How are consumers hurt by advertisements for the very products and services they are interested in? In Brown, the plaintiffs claimed damages under several state statutes, including the California Wiretap Act, asserted a right to privacy under the Fourth Amendment, and sought disgorgement of profits related to unlawful internet tracking.
In Illinois, lawmakers passed a privacy regulation prohibiting the unauthorized collection and distribution of biometric information such as fingerprints, eye scans, voiceprints, and facial geometry. The Biometric Information Privacy Act (BIPA) carries stiff statutory penalties for the unlawful collection of biometric data, allowing a prevailing party to recover $1,000.00 for each negligent violation and $5,000.00 where the conduct is determined to be intentional or reckless. In perhaps the most consequential BIPA decision of 2023, Cothron v White Castle System, Inc., the Illinois Supreme Court ruled that a separate violation accrues each time a party collects, captures, or otherwise obtains a person’s biometric information without prior informed consent, not merely upon the first collection. White Castle estimated the decision would expose the company to over $17 billion in damages.
While biometric data may be used for far more nefarious activities (e.g., hacking) than run-of-the-mill cookie data, the Illinois statute provides a neat blueprint for other states to follow in crafting internet privacy legislation.
In the meantime, a Michigan “personal data and privacy” act sits in committee with proposed civil penalties, including fines of up to $2,500.00 per violation. If passed, the act would surely result in sweeping litigation within the state, the likes of which could prove devastatingly costly to companies that choose to engage in the unlawful acquisition and sale of data.
—————
A. Vince Colella is a founding partner of Southfield-based personal injury and civil rights law firm Moss & Colella.