
Is Facebook really having its 'Big Tobacco moment'?


The whistleblower, former Facebook product manager Frances Haugen, also asserted during an exclusive interview that aired Sunday on CBS’ “60 Minutes” that a 2018 change to the content flow in Facebook’s news feeds contributed to more divisiveness and ill will in a network ostensibly created to bring people closer together. (CNN/CNN Newsource)

HUNT VALLEY, Md. (SBG) — Scroll through Instagram or take a drag of a cigarette?

Pick your poison, Congress says. Either way, Facebook misled the public about how it could harm you, according to federal officials.

Members of Congress have been comparing Big Tech to Big Tobacco ever since the Wall Street Journal published damning internal Facebook documents and a former Facebook employee gave damning testimony alleging that social media harms teenagers and children.

The comparison

“When I sued Big Tobacco, the most compelling evidence against these companies was from their own files showing studies, reports and other documents that they knew that tobacco caused cancer,” said Sen. Richard Blumenthal, D-Conn., who is helping lead the bipartisan effort to investigate and regulate Facebook. Blumenthal sued Big Tobacco in the 1990s when he was Connecticut’s attorney general. “They knew that these products were addictive. Facebook knows that Instagram is addictive.”

Back in the 1950s and ‘60s, research began to cast doubt on the safety of cigarettes. In 1964, 42% of adults smoked, and by the next year, Congress mandated warning labels. As more research emerged, bans on certain ads and on smoking in certain places took hold, including the retirement of “Joe Camel,” who, at the time, was as easily recognizable to six-year-olds as Mickey Mouse.

The year 1998 marked a big hit to the industry — the $206 billion master settlement with 46 states, the largest in history. It settled Medicaid lawsuits against the tobacco giants.

One year later, the Department of Justice sued Philip Morris USA, Altria, R.J. Reynolds Tobacco and Lorillard. In 2006, they were found liable for engaging in a decades-long conspiracy to mislead the public about the risks of smoking, the danger of secondhand smoke and the addictiveness of nicotine. The major tobacco companies were found to have manipulated the nicotine delivery of cigarettes, deceptively marketed cigarettes as “light” or “low tar” while knowing they were just as hazardous, targeted the youth market and refused to produce safer products.

Last month, Facebook’s internal documents were released to the public.

Research reports, online employee discussions and drafts of presentations to senior management indicate the following, according to the Wall Street Journal’s reporting and the whistleblower’s testimony: the company has a secret exemption for high-profile accounts; the company knows Instagram is toxic for many teen girls; the company doesn’t respond to flagged drug cartels and human traffickers; and the company laid plans to market to children under age 13.

Blair Levin is a nonresident senior fellow in the Metropolitan Policy Program at the Brookings Institution and a policy adviser to New Street Research, a global telecommunications and tech equity research firm. A former lawyer, he served as chief of staff to the chairman of the Federal Communications Commission and oversaw the implementation of the Telecommunications Act of 1996.

Levin said the tobacco and social media stories fit a very similar framework.

“The framework is: there’s a problem that a lot of people have talked about on the outside that the company doesn’t seem to recognize. And then it turns out that the company studied that problem and found that, just as other people had found, there’s a harm being created. And then there’s a third point that the company, in response to that knowledge, did not stand up and say, ‘Oh my goodness, we must mitigate against the harm,’ but, rather, said, ‘No problem. And let’s see if we can get more kids to do it. That’s Joe Camel.’ And it increased engagement,” Levin said.

Possible fixes

'Big Tech is the new Big Tobacco. They are harming our kids for profit.' – Rep. Ken Buck, R-Colo.

So, with that framework in mind, does Big Tech face the same regulatory, restrictive fate Big Tobacco did?

Possibly.

For tobacco, the regulations included restricting marketing and sales to youth; requiring warning labels; ensuring “modified risk claims” were supported by evidence; requiring disclosure of ingredients; banning flavors; and implementing other requirements for companies that sell the products.

To regulate Big Tech, lawmakers have proposed reforming Section 230 of the Communications Decency Act, part of the 1996 telecommunications law, which states (in part), “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Some criticize the section as granting immunity to the companies, but Georgetown Law Professor Anupam Chander explained why it was originally put in place.

“The basic idea is this is not a normal newspaper. With a normal newspaper, you have two or three or five letters to the editor and you can read every one you have. You read the whole letter, you might edit the letter, and you control exactly what goes in the paper...The internet is very different...The idea is that you can’t read everything that’s posted on your site,” Chander said.

Without this law, the professor said, cases against tech companies over content on their sites would be long, expensive and hard to decide, and they often wouldn’t be worth litigating for the plaintiff.

And the companies see millions upon millions of pieces of content posted every day.

“It’s hard to figure out a way to expose the apps to liability for harm that occurs on the sites, because there’s inevitably going to be something that’s said that’s bad, and that’s wrong, and that’s illegal and harmful,” Chander said.

Levin said he doesn’t believe Section 230 grants total immunity to the companies; instead, it gives them immunity from what people do on their platform, but not from what they do with their platform.

To his point, a bill introduced last Thursday aims to hold tech companies liable not for the harmful content on their sites, but for what they do about that harmful content.

Specifically, the House Energy and Commerce Committee is targeting algorithms, after Facebook whistleblower Frances Haugen testified that Facebook’s algorithms amplify misinformation, hate speech and ethnic violence. Under the bill, the Justice Against Malicious Algorithms Act, a platform would lose Section 230 protections and be liable if it knowingly or recklessly used a personalized algorithm to suggest content that materially harms a user.

The idea, hatched and touted by Democrats, is to hold the companies accountable and strip them of their “you can’t touch us” attitude. Critics say the concept wouldn’t necessarily work, because the platforms would simply show users more sanitized content vetted by lawyers, or lean on more group-targeted algorithms.

Lawmakers have suggested other potential regulations as well, such as forcing social media companies to share more information about their software, along with data on how people interact with the apps, or letting external researchers examine that data to determine things like health effects. Levin said this is something that very well could emerge from the regulation effort.

Other ideas include creating a new government agency to oversee tech companies, which the whistleblower recommended; passing stronger privacy and security laws for children; and pushing antitrust enforcement to make the public less reliant on a few giants.

Congressional dysfunction

'Big Tech is essentially handing our children a lit cigarette and hoping they will stay hooked for life.' – Rep. Bill Johnson, R-Ohio.

Despite the bipartisan buzz in Washington, D.C., on the matter, Levin doesn’t think Congress can do the job of reining in the tech giants. Congress’s push to change the companies’ liability protections comes from two different motivations: Republicans want the companies to be liable for censorship, while Democrats want harmful content and misinformation removed. Reforming Section 230 would give the platforms an incentive to censor far more material, so it’s doubtful Republicans will be in favor of it.

“You've got two very different trains running in opposite directions for 230 reform,” Chander said. “It's hard to know where that all lands, or whether that just crashes. I think it's going to be a difficult challenge for Congress to find. They both agree that they hate Big Tech. Right. But they hate Big Tech for very different reasons.”

Legislation or litigation?

'I think that, for sure, technology has addictive qualities that we have to address, and that product designers are working to make those products more addictive, and we need to rein that back as much as possible.' - Marc Benioff, founder of B2B cloud computing company Salesforce

Levin said he could see Big Tech’s debacle playing out in two different ways: legislation or litigation. Legislation leading to regulation, the analyst said, doesn’t seem likely considering the division in Congress. On top of that, “Facebook has a lot of teammates,” meaning Congress is taking on Google, Twitter and the many tech giants likely to side with Facebook, which could be a tall task.

In the end, Facebook could face much bigger problems than regulation: class action lawsuits and probes from state attorneys general. That, Levin said, would be a “horrible story for them, and it will get worse every day.”

“If you were Mark Zuckerberg, would you rather face a Senate hearing or a deposition from a really smart, able lawyer who’s read all your documents and documents of all your employees?” Levin said. “Yeah, there is no question.”

He pointed out that Facebook’s strategy has been to say the whistleblower wasn’t a very high-up employee or involved in many meetings, but a private attorney looks at that and assumes what she revealed is just the tip of the iceberg. Furthermore, Levin said, “If there was exculpatory information that would say Facebook did a great job [handling the harmful effects found by research], I think we would have already seen that.”

Different definitions of harm

'I was once Mark Zuckerberg’s mentor, but I have not been able to speak to him about this. Unfortunately, all the internet platforms are deflecting criticism and leaving their users in peril.' – Roger McNamee, venture capitalist

Congress is divided over the problems with social media platforms, and so are many Americans.

“We’ve had these problems ever since we could communicate and put it to print and send it across to other people,” Chander said.

Some people are more concerned about harmful, hurtful or inaccurate content, while others are concerned about freedom of speech and eliminating censorship for the free circulation of ideas. Each individual can be harmed by something that others may not see any harm in.

And good has come, and is coming, out of the platforms, too. As the law professor put it, “The tobacco industry didn’t have evidence that tobacco was reducing cancer rates.” Facebook, by contrast, says its own internal research also shows that Instagram improves the mental health of teenagers and has other positive effects.

Is the comparison accurate?

So, is posting a Facebook status or Instagram photo like lighting a cigarette?

Chander says no.

“I’m really not convinced that this is a Big Tobacco moment...The reality is that the platforms are actually far more complicated in their meaning for society than we recognize.”

Levin says that if lawsuits are filed and the plaintiffs can survive a motion to dismiss and move into discovery, which he thinks they have “a material chance of doing,” a lot more damaging information will surface, and maybe, someday, a settlement will follow.

But Facebook, Instagram, Snapchat and the dozens of other social media apps don’t seem to be going anywhere soon. Despite all of Congress’s efforts to regulate and restrict tobacco companies, cigarettes and other tobacco products continue to line the shelves of gas stations and convenience stores.
