Siri Lobotomised in China to Stop it Thinking About Prostitutes

By Gary Cutlack

Apple's Siri personal assistant, which got into a little bit of trouble for suggesting places one could pick up accommodating ladies in China, has had the sex-finding part of its brain removed by Apple's tech people.

According to China Daily, Siri now returns the response "I couldn't find any escort services" when asked to find iOS users a late-night sexy massage, with an Apple representative telling the paper: "Responding to reports from our users, we have blocked information related with 'escorts'."

Other search terms relating to violent behaviour, such as asking Siri where you can buy a gun in a hurry, appear to have been blocked too, with such queries now returning no useful results. [The Register]