Alcine, a computer programmer in New York, called out Google over the blunder, which had tagged photos he'd uploaded with the offensive racial slur.
"What kind of sample image data you collected that would result in this?" he asked in a series of angry tweets Sunday evening.
His outraged comments quickly picked up traction and caught the attention of a senior Google engineer who identified himself on Twitter as Yonatan Zunger; his account was linked to a Google+ blog under the same name.
Zunger, chief architect of the Internet giant's Google+ platform, promptly jumped into the fray, expressing horror at the bug and promising to get it fixed as quickly as possible.
"This is 100% Not OK," he told Alcine in a tweet. "Sheesh. High on my list of bugs you *never* want to see happen. ::shudder::" Zunger said in another.
Google still working on fixes
On Monday, Zunger said Google would stop using "Gorillas" as a label and was still clearing up the glitch in search results.
Google is also working on longer-term improvements to the words used to label photos and to its image recognition software, which generates the tags automatically.
"Lots of work being done, and lots still to be done. But we're very much on it," Zunger tweeted, explaining that image recognition software has problems with obscured faces, as well as different skin tones and lighting.
"We used to have a problem with people (of all races) being tagged as dogs, for similar reasons," he said.
Alcine thanked Zunger for his response.
Google didn't immediately respond to calls from CNN late Wednesday seeking comment on the matter.
Previous problems: Confusing dogs and horses
It's not the first time the Internet company's programming has misidentified subjects. In May, a user caught it tagging her dogs as horses.
The photo-sharing service Flickr has also faced difficulties, labeling photos of both black and white people with "ape" and "animal."
In a continuing discussion on Twitter of the problem Alcine highlighted, Zunger insisted that it was down to "ordinary machine learning trouble."
But he acknowledged that "the history of racism is what makes this error particularly bad."