Signal president Meredith Whittaker learned what not to do from Google

Meredith Whittaker, a former Google manager who is now president of Signal. (Florian Hetz for The Washington Post via Getty Images)

Meredith Whittaker took on a leading role at the Signal Foundation last year, moving into the nonprofit world after a career spanning academia, government work and the technology industry.

Now she’s president of the organization that runs one of the world’s most popular encrypted messaging apps, which tens of millions of people use to keep their chats private and out of sight of big tech companies.

Whittaker has real reasons to be skeptical of for-profit companies and their use of data — she previously spent 13 years at Google.

After more than a decade at the search giant, in 2017 she learned from a friend that Google's cloud computing division was working on a controversial Defense Department contract known as Project Maven. She and other employees saw it as hypocritical for Google to work on artificial intelligence technology that could be used in military drone programs. They began discussing collective action against the company.

“People were coming together every week talking about organizing,” Whittaker told CNBC in an interview for Women's History Month. “There was already a kind of consciousness at the company that hadn't existed before.”

In that tense environment, Google employees learned that the company had reportedly paid former executive Andy Rubin a $90 million exit package despite credible sexual misconduct claims against the Android co-founder.

Whittaker helped organize a mass walkout, joined by thousands of Googlers demanding more transparency and an end to forced arbitration for employees. The walkout was a landmark moment for the tech industry, which until then had seen few high-profile cases of employee activism.

“Give me a break,” Whittaker said of the revelations about Rubin and his subsequent exit. “Everybody knew; the whisper network was whispering no more.”

Google did not immediately respond to a request for comment.

Whittaker left Google in 2019 to return full time to the AI Now Institute at NYU, an organization she co-founded in 2017 whose stated mission is to “help ensure that AI systems are accountable to the communities and contexts in which they are applied.”

Whittaker never set out to pursue a career in technology. She studied rhetoric at the University of California, Berkeley. She said she was broke and needed a gig when she joined Google in 2006, eventually landing a temp job in customer support.

“I remember the moment someone explained to me that a server was a different kind of computer,” Whittaker said. “We didn't live in a world yet where every kid learned to code; that knowledge wasn't saturated.”

“Why do we get free juice?”

Beyond learning the technology, Whittaker had to adapt to the industry's culture. At companies like Google in those days, that meant generous perks and plenty of coddling.

“Part of it was trying to figure out, why are we getting free juice?” Whittaker said. “It was so foreign to me because I didn't grow up rich.”

Whittaker said she was “osmotically learning” about the tech sector and Google's role in it by observing and asking questions. When she was told about Google's mission to index the world's information, she remembers it sounding deceptively simple, even though it touched on complex political, economic and societal issues.

“Why is Google so passionate about net neutrality?” Whittaker said, referring to the company's push for internet service providers to treat all online content equally.

Several European telecommunications providers are now calling on regulators to require technology companies to pay them a “fair share” of network costs, while the tech industry argues such payments amount to an “internet tax” that unfairly burdens it.

“I think I learned the technological nuances and the political and economic side at the same time,” Whittaker said. “Now I understand the difference between what we say publicly and how things actually work internally.”

At Signal, Whittaker can focus on the mission without worrying about revenue. Signal has become popular among journalists, researchers and activists because it encrypts messages end to end, so that third parties cannot read them.

Whittaker said that, as a nonprofit, Signal is “existentially important” to its community, and that the app has no underlying financial incentive to deviate from its stated commitment to protecting private communication.

“We do everything we can, sometimes spending a lot more money and a lot more time, to make sure we have as little data as possible,” Whittaker said. “We don’t know anything about who’s talking to who, we don’t know who you are, we don’t know your profile picture or who’s in the groups you’re talking to.”

Tesla and Twitter CEO Elon Musk has praised Signal as a direct messaging tool, tweeting in November that the goal of Twitter DMs is to “superset Signal.”

Musk and Whittaker share some concern about companies profiting from AI technology. Musk was an early backer of ChatGPT creator OpenAI, which was founded as a nonprofit. But in a recent tweet, he said it had become a “maximum-profit company effectively controlled by Microsoft.” In January, Microsoft announced a multibillion-dollar investment in OpenAI, which describes itself as a “capped-profit” company.

Setting aside OpenAI's convoluted structure, Whittaker isn't buying the ChatGPT buzz. Google recently entered the generative AI market with the debut of its chatbot, Bard.

Whittaker said she doesn't see much value in the technology and struggles to see compelling uses for it. The excitement will eventually subside, though “maybe not as dramatically as Web3 or something like that,” she said.

“It doesn't understand anything,” Whittaker said of ChatGPT and similar tools. “It predicts what is likely to be the next word in a sentence.”

OpenAI did not immediately respond to a request for comment.

She fears that companies could use generative AI software to “justify degrading people's jobs,” costing writers, editors and content creators their careers. And she wants people to know that Signal has absolutely no plans to build ChatGPT into its service.

“On the record, as loud as possible, no!” Whittaker said.

WATCH: The AI hype is real and creates new ‘bets' for big tech companies, says Ann Winblad
