
I’ve seen that face before somewhere… May 25, 2010

Posted by Mark Hillary in IT Services.
What are the commercial implications for technology such as facial recognition?

It’s a technology that is already available today. Users of Google’s popular Picasa photo-sharing service who ‘tag’ a friend in a photograph will find that the site scans their photo collection and suggests other photos where the same friend appears – asking if they also want to tag those photos.
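Under the hood, a suggestion feature like this typically reduces each detected face to a numeric ‘embedding’ and compares distances. Here is a minimal sketch of that idea – the embeddings and photo names are made up for illustration, standing in for what a real face-recognition model would produce:

```python
# Hypothetical sketch of embedding-based tag suggestion.
# A real system would compute face embeddings with a trained model;
# here we use made-up vectors and cosine similarity.
import math

def cosine_similarity(a, b):
    # Measures how closely two embedding vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def suggest_photos(reference, photos, threshold=0.9):
    """Return photo ids whose face embedding resembles the tagged reference."""
    return [pid for pid, emb in photos.items()
            if cosine_similarity(reference, emb) >= threshold]

# Hypothetical data: the friend's tagged face, plus faces found elsewhere.
tagged_friend = [0.9, 0.1, 0.3]
library = {
    "beach.jpg": [0.88, 0.12, 0.31],  # very similar face -> suggested
    "party.jpg": [0.1, 0.9, 0.2],     # different person -> skipped
}
print(suggest_photos(tagged_friend, library))  # ['beach.jpg']
```

The threshold is the tunable part: set it too low and the site pesters you with wrong guesses; too high and it misses genuine matches.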

But think of the implications if a computer can immediately recognise a person. Google recently launched a search tool called Google Goggles that lets users search the Internet using a photograph – you can photograph something with your mobile phone and then search for whatever is in the photo. But they didn’t enable facial recognition for this tool – imagine if you could photograph a stranger on the train and find all their online social networks through a photo search. It’s a stalker’s dream tool.

Commercially there should be immense opportunities for facial recognition to improve security, but the companies that are exploring these technologies also need to be aware of what people will tolerate and what is seen as beneficial. For example, most people would feel more secure at airports if passports used facial recognition technology.

But do you remember the 2002 Tom Cruise movie, Minority Report? It was set in the near future and focused on a computer that could see into the future – so the police could catch criminals before they ever committed a crime. One memorable sequence in the film shows Tom Cruise walking through a future city centre where the advertising billboards use facial recognition to profile who he is in real time and to change the advert to something appropriate to him as an individual consumer.

Privacy regulations and public mistrust are going to prevent something like that happening any time soon, but with freely available social networks now using facial recognition technology, are we already on the slippery slope to a place where anonymity is impossible?

Google: guilty as charged? May 10, 2010

Posted by Mark Hillary in Government, IT Services, Outsourcing, Software.

Last month, three Google executives in Italy were all given six-month suspended jail sentences in a criminal trial that has been condemned by Internet observers the world over.

What was their crime? Being country managers of Google – the owner of video-sharing service YouTube – and failing to act quickly enough when a video of an autistic teenager being bullied was uploaded to the service.

It sounds bizarre, because YouTube clearly cannot have a human operator watching every second of every video and approving it personally. The site has more than 24 hours of video uploaded every single minute – and that’s only increasing.

YouTube has strong controls on adult content and an easy-to-use system for users to report abusive or offensive videos. This means the community polices itself, but because the system is reactive, offensive content can remain available until someone reports it.

But Judge Oscar Magi did not rule against YouTube for not offering a strong enough system of content control. He said: “In simple words, it is not the writing on the wall that constitutes a crime for the owner of the wall, but its commercial exploitation can.”

The judge ruled that the Google managers were guilty of criminal charges in this case because YouTube earns money by placing adverts around its videos – so, in his eyes, these executives were directly profiting from the abuse of a child. Is it a fair application of criminal law to say that it applies differently when the defendant makes money from their actions? The decision suggests that if the Google executives had been running YouTube as a public service, with no adverts or profit, they might have had no charge to answer.

Now, that’s even more bizarre.