Originally Posted by Sony Interactive Entertainment
AI breakthroughs to date range from AI competitors who have mastered strategy games such as chess, shogi, and Go to more complex, real-time multiplayer strategy games. Gran Turismo Sophy, a collaboration between Polyphony Digital, Sony AI, and Sony Interactive Entertainment, takes machine learning to the next level by introducing an AI agent to a hyper-realistic racing simulation that requires rapid decisions within the complex dynamics of a race against opponents.
Sony AI COO Michael Spranger describes Gran Turismo Sophy as “an AI agent that learned to drive by itself at a very competitive level and is able to compete with the best GT Sport drivers in the world.” Trained via a technique called reinforcement learning, Gran Turismo Sophy began as a blank slate and evolved from an AI that could barely maintain a straight line on a track to a racer that pushed the best Gran Turismo (GT) Sport drivers in the world to their limits.
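The article does not detail Sophy's training setup, and deep reinforcement learning at that scale is far beyond a short snippet, but the core idea — an agent that starts as a blank slate and improves purely from trial-and-error reward signals — can be sketched with tabular Q-learning on a toy steering task. Everything below (states, actions, reward shape, hyperparameters) is illustrative, not drawn from Sophy's actual system:

```python
import random

random.seed(0)

# Toy "keep the car centered" task: states are lane offsets -2..2,
# actions steer left (-1), hold (0), or right (+1). Reward favors
# the center of the track. The agent begins knowing nothing (all
# Q-values zero) and learns a steering policy from reward alone.
STATES = [-2, -1, 0, 1, 2]
ACTIONS = [-1, 0, 1]
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

alpha, gamma, epsilon = 0.1, 0.9, 0.1  # learning rate, discount, exploration

def step(state, action):
    """Move within track bounds; reward peaks at the center lane."""
    nxt = max(-2, min(2, state + action))
    reward = 1.0 if nxt == 0 else -0.5 * abs(nxt)
    return nxt, reward

for episode in range(2000):
    s = random.choice(STATES)
    for _ in range(20):
        # Epsilon-greedy: mostly exploit the best known action,
        # occasionally explore a random one.
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        s2, r = step(s, a)
        # Standard Q-learning update toward reward plus discounted
        # value of the best next action.
        best_next = max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# The learned greedy policy steers back toward the center lane.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
```

Sophy's real training replaces this lookup table with deep neural networks and a full physics simulation, but the learning loop — act, observe reward, update — is the same shape.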
Gran Turismo Sophy’s large-scale training began in January 2021, and after being tested against various Polyphony Digital team members and top GT drivers, the agent was ready for its first major test.
At the 1st “Race Together” event on July 2nd, 2021, Gran Turismo Sophy faced a team that included Takuma Miyazono, the top Gran Turismo Sport driver in the world, and three other top drivers. The AI agent raced as part of a team, with the competition set across three tracks and three car combinations.
In timed trials, racers took on tracks solo and logged their times. Gran Turismo Sophy also raced the same courses solo and beat the racers’ times. However, sharing the track with human drivers was a different challenge: Gran Turismo Sophy didn’t fare so well in the wheel-to-wheel competition of a team race.