Father of slain journalist takes on YouTube over video

By Tom Jackman
The Washington Post

WASHINGTON — It has been more than four years since journalist Alison Parker, doing a live television interview in southern Virginia, was killed when a former colleague walked up and shot her and videographer Adam Ward.

Despite repeated requests from her father and others, videos of the slaying remain on YouTube, as do countless other graphic videos that show people dying or that promote various outlandish hoaxes.

Andy Parker has never watched the videos of his daughter’s death, including GoPro footage recorded and posted by the shooter. But he and others have notified YouTube and Google, YouTube’s owner, that the graphic videos continue to exist on the dominant worldwide video platform.

“We’re flagging the stuff,” Parker said. “Nothing’s coming down. This is crazy. I cannot tolerate them profiting from my daughter’s murder, and that’s exactly what they do.”

There is no specific law prohibiting YouTube from hosting disturbing videos.

So Parker filed a complaint on Feb. 20 with the Federal Trade Commission, arguing that YouTube violates its own terms of service by hosting content it claims is prohibited and urging the FTC to “end the company’s blatant, unrepentant consumer deception.”

The complaint, drafted by the Civil Rights Clinic of the Georgetown University Law Center, notes: “Videos of Alison’s murder are just a drop in the bucket. There are countless other videos on YouTube depicting individuals’ moments of death, advancing hoaxes and inciting harassment of the families of murder victims, or otherwise violating YouTube’s Terms of Service.”

YouTube said in a statement that it had removed thousands of copies of the video of Parker’s shooting since 2015.

“Our Community Guidelines are designed to protect the YouTube community, including those affected by tragedies,” the statement said. “We specifically prohibit videos that aim to shock with violence, or accuse victims of public violent events of being part of a hoax. We rigorously enforce these policies using a combination of machine learning technology and human review. … We will continue to stay vigilant and improve our policy enforcement.”

The FTC has taken action against Google and YouTube recently, fining Google $170 million in September 2019 to resolve claims that it illegally collected data about children younger than 13 who watched toy videos and television shows on YouTube.

Lenny Pozner has become a reluctant expert on how YouTube works, and how to get things removed.

His 6-year-old son Noah was killed in the shootings at Sandy Hook Elementary School in Newtown, Conn., in 2012. Soon, he faced intense harassment from people who believed the slayings of 20 children and six teachers were a hoax, including repeated publication of his home address, death threats and a book claiming that no one died at Sandy Hook. He formed the HONR Network, with hundreds of volunteers, to help change policy and remove harmful content from the Internet.

“We have had tens of thousands of pieces of content removed,” Pozner said. “Unfortunately, our success with YouTube has been largely dependent on the dedication of individual (YouTube) staff members,” who manually remove videos one at a time. “As with all of these mega-(corporations), staff turnover is high, which means that we are constantly forced to reeducate each successive team. By the time the new team understands the issues, they move on and we essentially have to start over.”

Pozner said: “Facebook has been more responsive. They have evolved their terms of service to include the banning of hate speech. Twitter has not only been the most unresponsive to date; in my opinion, their platform, as a differentiator, actively protects and promotes hate speech and disinformation.”

He said his group’s goal “isn’t a matter of removing offensive content. What is offensive is subjective. It is a matter of removing content that is already illegal because it defames, harasses or otherwise infringes on a victim’s civil rights.”

Experts said websites are largely protected from liability for the content that they host under Section 230 of the Communications Decency Act. But “Google and YouTube are engaged in essentially lying to their consumers about the type of content that’s on their platform, and how consistently they review that content,” said Spencer Myers, a Georgetown law student involved in the complaint. “Those deceptions violate the FTC Act” prohibiting deceptive trade practices.

“There is also an element of children being involved,” said Aderson Francois, the director of the Georgetown clinic. “YouTube represents to parents that their kids aren’t going to be exposed to things that are classically violent. They go to YouTube to watch cartoons or learn how to put on lipstick. Then they’re exposed to things like this.”

YouTube said that in a single quarter of 2019, it removed 1.3 million videos for violating policies on violent or graphic content. But it makes exceptions for material with educational, news, scientific or artistic value, such as news coverage of the Parker case with footage included.

If a video is not suitable for all viewers, warnings and age restrictions are applied, YouTube said in a statement.

YouTube also said it had updated its harassment policy in 2017, after hearing from victims’ families, to remove hoax claims, and in 2019 it assigned “protected group status” to victims of violent events or their families. The platform said it quickly removed content when flagged by users, though Andy Parker and Pozner disagreed.

Some advocates say YouTube can solve the problem of inappropriate content with the right algorithms. “I have a patent that can fix this problem very easily,” said Eric Feinberg, vice president of the Coalition for a Safer Web, who said he has presented his solutions to YouTube. “There’s no incentive for them to do it” because of the site’s massive profitability, Feinberg said. Google does not report YouTube’s annual earnings, but Google itself reported revenue of $160.7 billion in 2019.

Feinberg said using key data points — in Parker’s case her name, her station’s name and words like “shooting” or “killing” — would enable YouTube not only to remove existing videos but also to stop or flag potentially offensive videos for review before they are posted.
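Feinberg’s description amounts to a keyword check against a video’s metadata before it goes live. A minimal sketch of that idea, with all term lists and names invented for illustration (this is not YouTube’s system or Feinberg’s patented method), might look like this:

```python
# Illustrative sketch of the keyword-matching approach Feinberg describes.
# All term lists and function names here are hypothetical.

# Key data points for one case: the victim's name, her station's name,
# and violence-related words, per Feinberg's example.
CASE_TERMS = {"alison parker", "wdbj"}
VIOLENCE_TERMS = {"shooting", "killing", "shot", "murder"}

def hold_for_review(title: str, description: str, tags: list[str]) -> bool:
    """Return True if the upload's metadata pairs a case term with a
    violence term, so the video is held for human review before posting."""
    text = " ".join([title, description, *tags]).lower()
    matches_case = any(term in text for term in CASE_TERMS)
    matches_violence = any(term in text for term in VIOLENCE_TERMS)
    return matches_case and matches_violence

# An upload like this would be flagged before it ever goes live.
print(hold_for_review("WDBJ reporter shooting full video", "raw footage",
                      ["alison parker"]))  # True
```

A real system would also have to handle misspellings, re-uploads with blank metadata and legitimate news coverage, which is where YouTube’s stated exceptions for news and educational material come in.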

Instead, Feinberg said, YouTube uses “algorithm amplification” to suggest one related video after another, supplying a steady stream of content tied, at least initially, to the first videos a user watches.
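As a rough illustration of what that amplification means, consider a toy recommender (purely hypothetical, and far simpler than anything YouTube actually runs) that repeatedly surfaces the unwatched video most similar to the last one viewed:

```python
# Toy model of recommendation amplification: always suggest the unwatched
# video whose keywords best match the last video viewed. Deliberately
# simplified; YouTube's real recommender is proprietary and more complex.

def similarity(a: set[str], b: set[str]) -> float:
    """Jaccard similarity between two keyword sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def session(catalog: dict[str, set[str]], start: str, steps: int) -> list[str]:
    """Simulate a viewing session driven purely by similarity."""
    watched = [start]
    for _ in range(steps):
        last = catalog[watched[-1]]
        scores = {name: similarity(last, kws)
                  for name, kws in catalog.items() if name not in watched}
        if not scores:
            break
        best = max(scores, key=scores.get)
        if scores[best] == 0.0:
            break  # nothing related is left to amplify
        watched.append(best)
    return watched

catalog = {
    "news report":  {"news", "virginia", "shooting"},
    "raw footage":  {"shooting", "graphic", "virginia"},
    "hoax claim":   {"shooting", "graphic", "hoax"},
    "cooking show": {"recipe", "kitchen"},
}
# Starting from a legitimate news report, similarity alone walks the
# viewer toward progressively more graphic and conspiratorial content.
print(session(catalog, "news report", 3))
# ['news report', 'raw footage', 'hoax claim']
```

Each hop is locally “related,” which is how a viewer who starts with news coverage can end up being served exactly the kind of videos Feinberg describes.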

Feinberg said that after he watched about 20 videos of the Parker shooting, YouTube began recommending other versions of the event to him.

“YouTube tries to say, ‘We’re just a bulletin board,’” Feinberg said. “No, you’re not. Because of the algorithm, you’re the paper, the pen and the pad.”

Parker said he has been struggling with YouTube for years, including when he launched a foundation called For Alison to help underprivileged children. Anonymous posters began creating videos saying the foundation was a scam, featuring footage of his daughter’s death. “The cruelty that’s out there is just staggering,” Parker said.

As with Pozner and Sandy Hook, the hoaxers spurred Parker to “go down the rabbit hole” in trying to convince YouTube to remove such content. “But it’s not like you can just call Google to complain,” Parker said. “There’s no customer service number.”

He enlisted Pozner’s help. “He would flag stuff and nothing would happen,” Parker said. “Google’s terms of service say, ‘We don’t allow any graphic content.’ All of it is bulls---. If parents really knew that Google was doing this, they wouldn’t let their kids watch YouTube.”