Big Tech Speech-Police Feature To Promote ‘Inclusive Language’

Welcome to a more sensitive, alert, and attentive Google. 

The technology giant has created a “speech-police” mechanism that flags politically unacceptable phrases as users type. The feature, which will be integrated into Google Docs, alerts authors to avoid specific terms and phrases that “may not be inclusive to all readers.”

Much like predictive text, Google’s latest tool is said to propose woke alternatives. When users write “landlord,” they will be offered “property owner” instead, along with an inclusion warning. “Humankind” is reportedly the recommended replacement for the seemingly contentious “mankind.”

Also on the hit list are gender-specific terms such as “policemen” or “housewife,” with reported suggestions of “police officers” and “stay-at-home spouse.”
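For readers curious how such flagging might work mechanically, the sketch below is a purely hypothetical, rule-based illustration built only from the substitutions reported above. Google says its feature relies on language-understanding models rather than a fixed word list, so the function names and mappings here are assumptions for illustration, not the company’s implementation.

```python
# Hypothetical illustration only -- not Google's implementation.
# Google's assistive writing reportedly uses language models; this toy
# version uses a fixed lookup table of the substitutions cited in reports.

import re

# Reported term -> suggested replacement pairs (from the article above).
INCLUSIVE_SUGGESTIONS = {
    "landlord": "property owner",
    "mankind": "humankind",
    "policemen": "police officers",
    "housewife": "stay-at-home spouse",
}


def suggest_inclusive_terms(text: str) -> list[dict]:
    """Return flagged terms with suggested replacements and their positions."""
    suggestions = []
    for term, replacement in INCLUSIVE_SUGGESTIONS.items():
        # Match whole words only, case-insensitively.
        for match in re.finditer(rf"\b{re.escape(term)}\b", text, flags=re.IGNORECASE):
            suggestions.append({
                "term": match.group(0),
                "suggestion": replacement,
                "start": match.start(),
                "end": match.end(),
                "note": "May not be inclusive to all readers.",
            })
    return suggestions


if __name__ == "__main__":
    sample = "The landlord spoke about the future of mankind."
    for s in suggest_inclusive_terms(sample):
        print(f'"{s["term"]}" -> "{s["suggestion"]}" ({s["note"]})')
```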

Critics have blasted the new AI-powered language tool, dubbed “assistive writing,” accusing the search engine of being both obtrusive and sanctimonious.

Big Brother Watch, a UK civil liberties group, termed the attempt to monitor speech for political correctness “very invasive.”

“With Google’s new assistive writing tool, the company is not only reading every word you type but telling you what to type,” said Silkie Carlo, director of Big Brother Watch.

“This speech-policing is profoundly clumsy, creepy and wrong, often reinforcing bias. Invasive tech like this undermines privacy, freedom of expression and increasingly, freedom of thought,” Carlo noted.

The International Centre for Law and Economics’ Lazar Radic considered the move “incredibly conceited and patronising,” adding that the censorship “can also serve to stifle individuality, self-expression, experimentation, and — from a purely utilitarian perspective — progress.”

“What if ‘landlord’ is the better choice because it makes more sense, narratively, in a novel? What if ‘house owner’ sounds wooden and fails to invoke the same sense of poignancy? What if the defendant really was a ‘housewife’ — and refers to herself as such? Should all written pieces — including written forms of art, such as novels, lyrics and poetry — follow the same, boring template?”

Terms like “motherboard” are also considered unwoke and are subject to warnings.

During testing, however, the tool reportedly showed glaring inconsistencies.

When former Ku Klux Klan leader David Duke used offensive racial slurs and talked about hunting Black people in a transcribed interview, the words he used received no warnings. But when President John F. Kennedy’s inaugural address was put to the test, the speech received warnings such as replacing the phrase “for all mankind” with “for all humankind.”

The tool also indicated that Dr Martin Luther King Jr. should have replaced “the fierce urgency of now” with “the intense urgency of now” in his famous “I Have a Dream” speech, according to the New York Post.

Sam Bowman of Works in Progress added: “It feels pretty hectoring and adds an unwanted political/cultural slant to what I’d rather was a neutral product [as] a user.”

In its defence, Google said the feature was part of an “ongoing evolution” designed to identify and “mitigate” unwanted word biases.

A spokesperson from Google said: “Assisted writing uses language understanding models, which rely on millions of common phrases and sentences to automatically learn how people communicate. This also means they can reflect some human cognitive biases.”

The spokesperson added: “Our technology is always improving, and we don’t yet (and may never) have a complete solution to identifying and mitigating all unwanted word associations and biases.”
