Google Stifled Scientists’ Writing About AI Research

Alphabet Inc’s Google this year moved to tighten control over its scientists’ papers by launching a “sensitive topics” review, and in at least three cases asked authors to refrain from casting its technology in a negative light, according to internal communications and interviews with researchers involved in the work, Reuters reported.

Google’s new review procedure asks that researchers consult with legal, policy and public relations teams before pursuing topics such as face and sentiment analysis and categorizations of race, gender or political affiliation, according to internal webpages explaining the policy.

According to Reuters, four staff researchers, including senior scientist Margaret Mitchell, said they believe Google is starting to interfere with crucial studies of potential technology harms.

“If we are researching the appropriate thing given our expertise, and we are not permitted to publish that on grounds that are not in line with high-quality peer review, then we’re getting into a serious problem of censorship,” Mitchell said.

Reuters reported that studying Google services for biases is among the “sensitive topics” under the company’s new policy, according to an internal webpage. Among dozens of other “sensitive topics” listed were the oil industry, China, Iran, Israel, COVID-19, home security, insurance, location data, religion, self-driving vehicles, telecoms and systems that recommend or personalize web content.

It seems to me that Google feels it has something to hide when it comes to research not only about AI, but also about several other topics. There is no point in hiring scientists to examine something if Google is just going to alter the findings to make the company look better. The fact that Google lists the oil industry among its “sensitive topics” makes me suspect that Google is getting something lucrative from that industry.
