By Supantha Mukherjee
To get GT Sophy ready for the game, different units of Sony brought together fundamental AI research, a hyper-realistic real-world racing simulator, and infrastructure for massive-scale AI training, the company said in a statement.
The AI first raced against four of the best Gran Turismo drivers in July, learnt from that race and outperformed the human drivers in a rematch in October.
“It took about 20 PlayStations running simultaneously for about 10 to 12 days to train GT Sophy to race from scratch to superhuman level,” said Peter Wurman, director of Sony AI America and leader of the team that designed the AI.
While AI has been used to defeat humans at chess, Mahjong and Go, Sony said the difficulty of mastering race car driving lies in the many decisions that must be made in real time.
Sony’s rival Microsoft, which recently agreed to buy Activision Blizzard for $69 billion, has also been using games to advance AI by posing new challenges for its models to solve.
Gran Turismo, a racing simulation video game, made its debut in 1997 and has sold over 80 million units.
Sony wants to apply the learnings to other PlayStation games.
“There are a lot of games that pose different challenges for AI and we’re looking forward to starting to work on those problems,” Wurman said. (Reporting by Supantha Mukherjee, European Technology & Telecoms Correspondent, based in Stockholm; Editing by Sandra Maler)
