
Grand Theft Auto V: The Rise And Fall Of The DIY Self-Driving Car Lab

A few years ago, artificial intelligence researchers discovered that Grand Theft Auto V, the blockbuster 2013 video game, was good for more than stealing cars and causing mayhem on the fictional mean streets of Los Santos. Its realistic graphics and physics engine provide an excellent virtual environment for training self-driving cars. The game’s sprawling urban and rural environments are populated with pedestrians, drivers and animals that dynamically interact with the AI. Training an autonomous car system is all the more efficient when running over a pedestrian doesn’t result in real-life injuries and legal troubles.

The fun didn’t last long, though.

Over the past year, GTA V’s publisher, Take-Two Interactive, has quietly shut down a number of high-profile projects with cease-and-desist letters, according to multiple AI researchers involved in the projects. Take-Two is slamming the brakes on more than just commercial endeavors. Academic researchers have also been dealt with aggressively.

In the most notable case, a joint project between OpenAI, the Elon Musk-backed research group, and DeepDrive, a platform for training self-driving cars in GTA V, was shut down. The collaboration allowed OpenAI’s bot, called Universe, to learn how to drive a car in GTA V. The project was generating interest from researchers, but it didn’t take long for the game publisher to also take notice. A January 11, 2017 OpenAI blog post announcing the project was pulled down and the code on GitHub withdrawn. “We took down our project following the receipt of a letter from Take-Two,” said an OpenAI spokesperson.

DeepDrive was a personal project of engineer Craig Quiter, who joined Uber-owned self-driving car group Otto last November. It’s now defunct. “DeepDrive started with a mission to allow more people to work on self-driving AI, and the interest it received was absolutely incredible – something I am truly grateful to you all for,” reads a note from Quiter on the site’s home page. “Unfortunately, however, we are restricted from providing resources using GTA V for legal reasons.”

Princeton’s Autonomous Vehicle Engineering program had been using GTA V for about two and a half years before shutting the effort down earlier in 2017. “We have used [GTA V] as part of our research in ‘proof of concept’ efforts and [Take-Two is] not happy about it,” said Alain Kornhauser, a professor at Princeton University and faculty chair of the school’s autonomous vehicle program, in an email.

The Princeton lab is now building its own simulator called “MaxiumusFurtum IV,” Kornhauser said.

Grand Theft Auto’s developer, Rockstar Games, didn’t respond to multiple requests for comment, but has previously posted about its policy on single-player modding: “Take-Two has agreed that it generally will not take legal action against third-party projects involving Rockstar’s PC games that are single-player, non-commercial, and respect the intellectual property (IP) rights of third parties.”

The company sent a statement in the past to Mashable in response to AI researchers’ use of the game: “We welcome discussions about the use of our technology to help further academic research, but it’s obviously not appropriate for corporations to take our work and use it for their own financial interests or for researchers to distribute unlicensed copies of our code as part of their work without first seeking our permission.”

For the gaming world, Take-Two’s tactic comes as no surprise. The game publisher’s aggressive pursuit of game modification software, which allows users to alter gameplay logic and asset files, is well known. Most notoriously, the company went after popular modding tool OpenIV, but backpedaled after backlash from fans.

Video games — not just GTA V — have become an increasingly popular method for training AI systems to solve complex problems. A technique known as reinforcement learning enables a computer to learn how to accomplish tasks through trial and error. The AI developer sets a reward signal for a specific objective, such as staying on the road. The computer (also called an agent) attempts to earn the reward through repeated attempts, sometimes millions of them. Reinforcement learning works well in a game environment where errors — like crashing a car — won’t cause real damage.
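In code, that trial-and-error loop can be surprisingly small. The following is a minimal sketch of the idea in Python, assuming a toy, made-up LaneEnv environment whose only reward is "stay on the road"; it illustrates reinforcement learning in general and is not code from any of the projects in this story.

```python
# Minimal reinforcement-learning sketch: an agent tries actions, receives a
# reward signal for staying on the road, and gradually prefers actions that
# earn more reward. LaneEnv and its reward values are hypothetical stand-ins
# for a driving simulator.
import random
from collections import defaultdict

class LaneEnv:
    """Toy environment: the car sits in one of 5 lateral positions; position 2 is the road."""
    def __init__(self):
        self.pos = 2

    def reset(self):
        self.pos = 2
        return self.pos

    def step(self, action):  # action: -1 steer left, 0 straight, +1 steer right
        self.pos = max(0, min(4, self.pos + action + random.choice([-1, 0, 1])))
        reward = 1.0 if self.pos == 2 else -1.0  # reward for staying on the road
        return self.pos, reward

# Tabular Q-learning: learn by trial and error which action is best in each state.
q = defaultdict(float)
alpha, gamma, epsilon = 0.1, 0.9, 0.1
env = LaneEnv()
state = env.reset()
for _ in range(100_000):                      # many attempts, as described above
    if random.random() < epsilon:             # occasionally explore a random action
        action = random.choice([-1, 0, 1])
    else:                                     # otherwise exploit the best known action
        action = max([-1, 0, 1], key=lambda a: q[(state, a)])
    next_state, reward = env.step(action)
    best_next = max(q[(next_state, a)] for a in [-1, 0, 1])
    q[(state, action)] += alpha * (reward + gamma * best_next - q[(state, action)])
    state = next_state
```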

The AI can read the game’s image frames as if it were seeing the real world, by applying deep learning techniques such as a convolutional neural network for image recognition. And by digging deeper into the game’s code, AI developers can simulate sensors important to real self-driving cars, such as lidar and radar.
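A rough sketch of that perception step is below: a small convolutional network that maps one rendered game frame to a steering value. The architecture, frame size and output convention are assumptions chosen for illustration, and bear no relation to the networks DeepDrive, OpenAI or anyone else actually used.

```python
# Sketch of frame-based perception: a convolutional network that turns a
# rendered game frame into a steering command. Sizes and layers are
# illustrative assumptions only.
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),   # RGB frame in
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)   # single output: steering angle

    def forward(self, frame):
        x = self.features(frame)
        return torch.tanh(self.head(x.flatten(1)))   # steering in [-1, 1]

# One captured 84x84 frame (batch of 1), as if read straight from the game's output.
frame = torch.rand(1, 3, 84, 84)
steering = SteeringNet()(frame)
```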

Although using GTA V to train autonomous cars captured widespread attention in the press, the reality of using the game to train self-driving car systems for the real world may never have gone very far. Davide Bacchet, principal engineer of autonomous driving at the China-based car startup NIO, told Forbes: “GTA has been extensively used for research by universities and many people around the world, but nobody in the Autonomous Driving team here in NIO ever considered using it, because we are building a safety critical system, and the level of certification that is involved is completely different.”

Bacchet said NIO builds its own virtual simulator to train its neural networks.

“I personally agree with the position taken by [Take-Two],” Bacchet continued. “If people were using their game for anything outside of pure leisure, they should have been informed.”

Similarly, Alphabet’s autonomous car unit, Waymo, has developed its own AI simulation environment, which it calls Carcraft, and has recreated the cities of Phoenix, Austin and Mountain View. Waymo has 25,000 virtual self-driving cars navigating 8 million miles per day through the three virtual cities, according to The Atlantic.

Take-Two, however, hasn’t shut down all the AI projects associated with its game. Stephan Richter, a PhD candidate at the Technical University of Darmstadt in Germany, continues to publish research using GTA V related to visual perception systems. Richter declined to answer specific questions about his research group’s relationship with the GTA V developer, except to say that he has “taken care to comply with Rockstar Games’ policies.” “All data we publish or have published is for research and education only,” he added.

Harrison Kinsley, an independent Python programmer based in Texas, also continues to play around with GTA V for his own custom self-driving AI system that he’s built for fun. Kinsley suspects Rockstar and Take-Two only go after AI researchers who are mucking around in the game’s underlying code. He said he doesn’t modify the code; his AI only reads the game’s visual information to drive autonomously.

Kinsley maintains a 24/7 livestream of his AI — which he calls Charles — on Twitch. He trained the AI how to drive with only five hours of playing time (no reinforcement learning). Charles the AI is taught to drive as fast as possible and doesn’t bother following the law. The AI careens haphazardly down the road, frequently hitting pedestrians and cars along the way. If the AI loses the vehicle it’s driving, it immediately steals another. “I wouldn’t get into a car that has this AI,” he said.
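Kinsley’s own code isn’t reproduced here, but the approach he describes (reading frames off the screen, pairing them with the keys a human driver presses, and learning to imitate those presses instead of using reinforcement learning) might look roughly like the sketch below. The mss screen-capture library is real; get_pressed_keys() is a hypothetical placeholder, and none of this is his actual pipeline.

```python
# Sketch of imitation-style data collection: record what is on screen alongside
# which keys the human driver is pressing, then train a supervised model to
# predict the keys from the frame alone.
import time
import numpy as np
import mss

def get_pressed_keys():
    """Hypothetical placeholder: return the set of driving keys currently held down.
    A real version would query the OS key state (for example, win32api on Windows)."""
    return set()

samples = []
with mss.mss() as sct:
    region = {"top": 0, "left": 0, "width": 800, "height": 600}   # game window (assumed)
    while len(samples) < 10_000:                                  # a few hours of play
        frame = np.array(sct.grab(region))[:, :, :3]              # BGRA capture, drop alpha
        keys = get_pressed_keys()                                 # e.g. {"W", "A"}
        samples.append((frame, keys))
        time.sleep(1 / 10)                                        # roughly 10 samples per second

# The (frame, keys) pairs then become training data for a supervised model,
# for example a convolutional network like the earlier steering sketch that
# predicts which key combination to press instead of a steering angle.
```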
