Nvidia AI Can Render Complete Urban Environments in Unreal Engine 4

Interest in artificial neural networks has skyrocketed over the years as companies like Google and Facebook have invested heavily in machines that can think like humans. Today, an AI can recognize objects in photos or help generate realistic computer speech, but Nvidia has successfully built a neural network that can create an entire virtual world with the help of a game engine. The researchers speculate this “hybrid” approach could one day make AI-generated games a reality.

The system built by Nvidia engineers uses many of the same parts as other AI experiments, but they’re arranged in a slightly different way. The goal of the project was to create a simple driving simulator without using any humans to design the environment.

Like all neural networks, the system needed training data. Luckily, work on self-driving cars has ensured there’s plenty of footage of vehicles driving around city streets. The team used a segmentation network to label different object categories like trees, cars, sky, and buildings. That segmented data is what Nvidia fed into its model, which used a generative adversarial network to improve the accuracy of the final output. Essentially, one network rendered scenes, and a second network passed or failed them. Over time, the generator was tuned to produce only believable frames.
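
For a more concrete picture of that pass-or-fail loop, here is a minimal sketch in PyTorch. The tiny networks and dummy data are illustrative placeholders, not Nvidia’s actual architecture; the sketch only shows how a generator conditioned on segmentation maps and a discriminator push against each other.

```python
# Minimal sketch of an adversarial "pass or fail" training step (illustrative only).
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Turns a semantic segmentation map (one channel per class) into an RGB frame."""
    def __init__(self, num_classes=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(num_classes, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )
    def forward(self, seg_map):
        return self.net(seg_map)

class Discriminator(nn.Module):
    """Scores a frame: high for real footage, low for generated frames."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1),
        )
    def forward(self, frame):
        return self.net(frame)

gen, disc = Generator(), Discriminator()
g_opt = torch.optim.Adam(gen.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(disc.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

# One training step on a dummy batch: a segmentation map and the matching real frame.
seg_map = torch.randn(4, 8, 64, 64)
real_frame = torch.randn(4, 3, 64, 64)

# The discriminator learns to pass real frames and fail generated ones.
fake_frame = gen(seg_map).detach()
d_loss = bce(disc(real_frame), torch.ones(4, 1)) + bce(disc(fake_frame), torch.zeros(4, 1))
d_opt.zero_grad(); d_loss.backward(); d_opt.step()

# The generator learns to produce frames the discriminator passes.
g_loss = bce(disc(gen(seg_map)), torch.ones(4, 1))
g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```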

Nvidia plugged its AI into Unreal Engine 4, which powers games like PUBG, Fortnite, and Octopath Traveler. However, the team didn’t render a full environment in UE4. Rather, the AI only got the basic topology of an urban environment, and it filled in the world with buildings, roads, vehicles, and textures based on what it learned from the training data.
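
A rough sketch of what that hybrid render loop could look like appears below: the engine hands over a coarse semantic layout each frame, and the trained generator paints in the pixels. The get_layout_from_engine function is a hypothetical stand-in, not a real Unreal Engine 4 API, and the stand-in generator only exists so the loop runs end to end.

```python
# Hedged sketch of a hybrid engine + neural-network render loop (illustrative only).
import torch
import torch.nn as nn

def get_layout_from_engine(frame_index, num_classes=8):
    # Hypothetical placeholder: returns a fake semantic layout (roads, buildings, cars, etc.).
    return torch.randn(1, num_classes, 64, 64)

@torch.no_grad()
def render_frame(gen, frame_index):
    layout = get_layout_from_engine(frame_index)  # coarse topology from the engine
    return gen(layout)                            # the network fills in the appearance

# Stand-in generator so the loop runs; in practice this would be a trained GAN generator.
toy_gen = nn.Sequential(nn.Conv2d(8, 3, 3, padding=1), nn.Tanh())
frames = [render_frame(toy_gen, t) for t in range(3)]
```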

The resulting footage looks almost real, and Nvidia didn’t even need a massive server farm to run the AI generating this virtual world. It only needed a single GPU, although admittedly, it was a monstrously powerful graphics card: a $3,000 Titan V. The AI generated 25 frames per second, which is enough to make the simulation smooth and playable. However, the AI would change the colors and textures in every frame; to a machine without any memory there’s nothing strange about that, because each frame is a new event with no connection to the previous one. To fix that, Nvidia engineers gave the AI a rudimentary short-term memory to help it remain consistent.
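
One plausible way to implement that kind of short-term memory, sketched below, is to feed the previous output frame back into the generator along with the current layout so that colors and textures carry over between frames. Again, the architecture is illustrative rather than Nvidia’s published design.

```python
# Sketch of temporal "memory": condition the generator on the previous output frame.
import torch
import torch.nn as nn

class TemporalGenerator(nn.Module):
    def __init__(self, num_classes=8):
        super().__init__()
        # Input: current segmentation map concatenated with the previous RGB frame.
        self.net = nn.Sequential(
            nn.Conv2d(num_classes + 3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )
    def forward(self, seg_map, prev_frame):
        return self.net(torch.cat([seg_map, prev_frame], dim=1))

gen = TemporalGenerator()
prev_frame = torch.zeros(1, 3, 64, 64)          # blank "memory" before the first frame
for t in range(25):                             # e.g. one second at 25 fps
    seg_map = torch.randn(1, 8, 64, 64)         # layout for frame t (placeholder data)
    with torch.no_grad():
        prev_frame = gen(seg_map, prev_frame)   # output becomes the memory for frame t+1
```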

While this technology is fascinating, the designers caution that it may be decades before games actually use AI-generated worlds. They compare the project to ray tracing, a technique for generating more realistic lighting: ray tracing demos go back many years, but Nvidia only launched ray tracing-capable GPUs a few weeks ago.
