Grand Theft Auto creator wants to bring realism to massive online games
Ever since releasing the original Grand Theft Auto in 1997, Dave Jones has been obsessed with making believable worlds. He went on to build more recent online game worlds, like the one in All Points Bulletin (APB), and at each step along the way, he ran into huge technological obstacles.
Now he serves as president of Cloudgine, which has created a cloud-based game engine to alleviate the problems of running online titles. The company released a demo game called They Came From Space to show off the engine, which taps cloud computing to produce massive physics-based simulations.
Jones started Cloudgine in Edinburgh, Scotland, in 2012 to develop cloud computing technology for making games. So far, studios have used the tech to create the upcoming Microsoft exclusive Crackdown 3 and Facebook’s virtual reality title, Oculus Toybox. Cloudgine can be used to make massive online games for virtual reality.

They Came From Space is a “proof of concept” game for the PC, Oculus Rift, and HTC Vive virtual reality headsets. We talked to Jones about the demo, where as many as 10 players can participate in the same game world at the same time.
They Came From Space uses a funny art style and tone that borrow heavily from classic 1950s B-movies about alien invasions. In the demo, you can annihilate entire cities while playing as a massive alien. As the destruction takes place, you can see tons of individual particles floating in the air, pushed along by a physically accurate wind. We talked with Jones about how tough it is to make games like this.
Here’s an edited transcript of our interview.

GamesBeat: I wondered why you guys went ahead and did the new game as a proof of concept when you already had Crackdown 3 and the Oculus Toybox.
Dave Jones: It was mainly to show we could support additional players in VR, further to those that were in Toybox. One of the benefits of running a bunch of simulations in the cloud is we can connect a lot more players, pretty much for free. We wanted to demonstrate something with that.
GamesBeat: How many more players can get into the game?
Jones: On the server side, it scales really well. It’s pretty much a completely scalable architecture. You can just add players and add players. At some point, you’re going to hit a rendering limit on the client. But from a backend infrastructure perspective, that’s one of the big benefits of building the simulation in the cloud. It scales pretty much infinitely. We can just add more and more CPU threads to keep building up the simulation. Right now, the demonstration we’re doing live is just going to be 10 players. But that’s only because that particular demonstration needs to hit the right frames per second, 90 frames per second for VR.
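To make the scaling idea concrete, here is a minimal sketch of the pattern Jones describes: a backend that simply spins up another simulation worker whenever the player count outgrows capacity. The class names and the players-per-worker figure are illustrative assumptions, not Cloudgine’s actual architecture.

```python
# Toy sketch of a horizontally scaling simulation backend: each worker
# thread owns a slice of the world, and the backend adds workers as
# players join. All names here are hypothetical.
import threading
import queue
import time

PLAYERS_PER_WORKER = 10  # assumed capacity per simulation worker

class SimulationWorker(threading.Thread):
    def __init__(self, worker_id):
        super().__init__(daemon=True)
        self.worker_id = worker_id
        self.inbox = queue.Queue()  # player inputs routed to this worker

    def run(self):
        while True:
            event = self.inbox.get()  # wait for the next input to simulate
            print(f"worker {self.worker_id} handled: {event}")

class SimulationBackend:
    def __init__(self):
        self.workers = []
        self.player_count = 0

    def add_player(self, name):
        # Scale out: spawn another worker when capacity is exceeded.
        if self.player_count // PLAYERS_PER_WORKER >= len(self.workers):
            worker = SimulationWorker(len(self.workers))
            worker.start()
            self.workers.append(worker)
        self.player_count += 1
        # Route the new player's events to a worker (round-robin).
        target = self.workers[self.player_count % len(self.workers)]
        target.inbox.put(f"{name} joined")

backend = SimulationBackend()
for i in range(25):  # 25 players end up spread across 3 workers
    backend.add_player(f"player{i}")
time.sleep(0.2)  # give the daemon workers a moment to drain their queues
```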
The other thing, which we don’t have in either Crackdown or Toybox, is that we actually attach a cloud GPU to the game as well. We use that to give the VR player a selfie camera, basically. Streaming straight from your first-person perspective just doesn’t work well; a 2D view from a VR perspective doesn’t look very good. But one thing we liked was, because the VR player is the main protagonist, we gave him another camera, effectively a virtual camera that’s rendered in the cloud. It doesn’t really matter where it’s coming from … from a streaming perspective. It’s neat for streamers because obviously, that’s a new method of discovery for games. It gives them a controllable camera, so they can have it looking at their face or off to the side.
GamesBeat: Are they streaming that in VR, then, or in a 2D perspective?
Jones: To a 2D screen. It’s kind of like having a third-person camera. If you watch the video, you can see that everything’s taken from a third-person camera because it looks better that way. But it’s coming from a camera properly set up for the game. It’s not like a dev camera. It’s a camera we can give to VR players, like attaching a virtual selfie stick to themselves.
That’s a good example of — a lot of people have talked about doing actual rendered streaming for a full game. We think, from a streamer’s perspective, it makes sense because you’re sending from just one GPU to potentially hundreds of thousands of viewers. It could be applied to any game. If devs want to, they can start applying really good cinematic cameras to their games and give control of those to streamers or players. It’s a better way to stream a different perspective on the game.
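As an illustration of the “virtual selfie stick” idea, here is a minimal sketch of the camera math involved: deriving a third-person spectator camera from the player’s position plus a controllable orbit offset. This is generic camera placement code under assumed conventions, not Cloudgine’s implementation.

```python
# Minimal sketch of a controllable third-person "selfie" camera: place
# the camera at a distance from the player, orbited around them, and
# aim it back at the player. Generic math, hypothetical names.
import math

def selfie_camera(player_pos, player_yaw, distance=3.0, height=1.5, orbit=0.0):
    """Return a camera position `distance` metres from the player,
    rotated `orbit` radians around them, plus the yaw that points
    the camera back at the player."""
    angle = player_yaw + math.pi + orbit  # directly behind the player by default
    cam_x = player_pos[0] + distance * math.sin(angle)
    cam_z = player_pos[2] + distance * math.cos(angle)
    cam_pos = (cam_x, player_pos[1] + height, cam_z)
    look_yaw = math.atan2(player_pos[0] - cam_x, player_pos[2] - cam_z)
    return cam_pos, look_yaw

# A streamer could dial `orbit` to face the player or pan off to the
# side; in Jones's setup the resulting view is rendered on a cloud GPU.
pos, yaw = selfie_camera((0.0, 1.7, 0.0), player_yaw=0.0, orbit=0.5)
print(pos, yaw)
```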
GamesBeat: What are some good things to notice as far as what the game is capable of, as we look at the demo? I saw all the physics-based elements, the pieces of the buildings flying around.
Jones: Yeah, there’s a lot of that. Nothing’s scripted. Everything is completely physical. It’s the small things: if you wave your hands as a VR player, or if you watch the downdraft from the drones, that’s all modeled, so the trees bend based on wind and things like that. It’s all the small neat touches that make the world much more physical, which I think players will come to appreciate.
From a VR perspective, a lot of VR worlds are a little bit sparse because you’re trying to put all your power into rendering. But this goes the other way, because you get a very nice interaction with a VR world where everything is physical. That makes a big difference; it’s quite neat for a VR player to get that level of physicality and density.
GamesBeat: How are you able to do some of that? Are you offloading rendering tasks into the cloud?
Jones: That demonstration used Unreal and PhysX. Typically, in any game, the maximum budget for physics [is probably] about 20 percent of your CPU. Other things like AI, pathfinding, and audio all have to fit in the budget, too. Normally, running something like PhysX, you can give maybe 20 percent of your CPU to it.
With this, we run PhysX completely in the cloud, and we run multiple instances of PhysX. We used something like six instances of PhysX for that game. Effectively, you can have six times that, or even more, because each instance can run on an individual CPU. You could have 200 percent of the CPU you’d normally see in a game applied just to physics.
We’re about to do something similar with AI as well. Typically, you have about 10 percent of the CPU budget in your game for AI. What if you could have a whole CPU, multiple CPUs, just dedicated to AI? It’s taking each of those incrementally, pushing more of them to the cloud and then scaling them up. But it’s all based on technologies developers know. They know PhysX. They know Unreal. They know how to do pathfinding and so on. We’ll start to push a lot more of those services to the cloud — but in a very usable way — using very normal game technologies.
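To illustrate the offload pattern Jones is describing, here is a hedged sketch in which several physics instances and a pathfinding query run in parallel off the main game thread, each on its own CPU. A local process pool stands in for the cloud, and every name is a placeholder rather than a real Cloudgine, PhysX, or Unreal API.

```python
# Sketch of offloading game systems to parallel workers: six stand-in
# "physics instances" each step one world region while an AI pathfinding
# request runs alongside them. A process pool plays the role of the cloud.
from concurrent.futures import ProcessPoolExecutor

def step_physics_region(region):
    # Stand-in for one PhysX-style instance simulating one region's bodies.
    return {"region": region["id"], "bodies_stepped": len(region["bodies"])}

def find_path(query):
    # Stand-in for a pathfinding request an AI service would answer.
    return [query["start"], query["goal"]]

if __name__ == "__main__":
    regions = [{"id": i, "bodies": list(range(100))} for i in range(6)]
    with ProcessPoolExecutor(max_workers=6) as pool:
        # Six physics instances stepped in parallel, one per CPU...
        physics_results = list(pool.map(step_physics_region, regions))
        # ...while AI queries go to the same pool of workers.
        path = pool.submit(find_path, {"start": (0, 0), "goal": (5, 9)}).result()
    print(physics_results, path)
```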
GamesBeat: What are the different things that Crackdown 3 or Toybox used or showcased compared to what this demo does?
Jones: Crackdown 3 was using Havok in the cloud. It was more about doing complex structures, complex simulations of large buildings. Whatever way you destroy a building, the stresses throughout the superstructure are calculated properly, and the building falls over in a completely non-scripted way. It was applying heavy compute to building superstructures.
In They Came From Space, we’re applying it more to forces from the VR players. The VR players get access to these huge arsenals of super weapons, these massive destruction beams that cut through the landscape and destroy all the buildings in their path. It’s just a different application of how game design can creatively use physics.
With Toybox, it was more about super-fast interaction in a virtual space, a social space, where you’re both playing with the same things. Any latency would have been immediately noticeable at 90 frames per second when you’re stacking blocks and passing objects to another VR player. So, that was about solving problems around very smart ownership, how we transfer ownership between players in the backend, in the cloud.
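Here is a toy sketch of the ownership handoff Jones mentions: whichever player is interacting with an object becomes its simulation authority, so their own view of it carries no round-trip latency. The versioned grant below is one common way to keep two simultaneous grabs from both succeeding; the names and protocol are hypothetical, not the actual Toybox backend.

```python
# Toy model of transferring object ownership between players: the cloud
# owns idle objects, and a broker hands authority to whichever player
# grabs an object first, using a version number to reject stale requests.
class SharedObject:
    def __init__(self, oid):
        self.oid = oid
        self.owner = "cloud"  # the cloud simulates objects nobody is touching
        self.version = 0      # bumped on every handoff to detect stale grabs

class OwnershipBroker:
    def request_ownership(self, obj, player, seen_version):
        # Grant authority only if the request saw the current state, so
        # two players grabbing at once cannot both win.
        if seen_version != obj.version:
            return False
        obj.owner = player
        obj.version += 1
        return True

broker = OwnershipBroker()
block = SharedObject("block-42")
print(broker.request_ownership(block, "playerA", seen_version=0))  # True
print(broker.request_ownership(block, "playerB", seen_version=0))  # False: stale
print(broker.request_ownership(block, "playerB", seen_version=1))  # True: handoff
```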
GamesBeat: Can you talk about the reason for Cloudgine and the need to bring the cloud into modern gaming?
Jones: It’s really to open up new creative opportunities. These days, we’re mostly limited by CPUs — on console and PC alike. There’s been a lot of hardware evolution in GPUs. For the last two generations of consoles, it’s mostly been about higher resolutions and HDR. This last generation is all about 4K. But rarely have we seen these kinds of huge pushes in compute power, which I believe offers the most interesting creative opportunities for new games. That controls things like AI and physics, all the things that open more doors from a design perspective.
Our goal was to start to offer developers an easy path to taking game systems that they know and saying, “If you want, you can run some of these systems as they are in the cloud. The benefit you’ll get is we’ll make them scale — from 1X to 2X to 10X. We’ll make it very easy to pick up any of these technologies and make them scale by running them in the cloud.” That way, we can start to build huge, complex simulations and dynamic worlds that just can’t be done today.
Even today, when you look at typical game worlds, they are very static. You can fire off rocket launchers and stomp around in huge mechs in a game like Titanfall, but ultimately, everything you come across in the world, every structure, is static. We like to think these worlds are very dynamic, but the [polygons] are set. They’re indestructible. That’s one example of where we could start to make inroads on something a little bit more dynamic.