And Now For Something Completely Machinima

CM Interview with Dane Johnston (Omniverse Machinima)

May 08, 2021 Ricky Grove and Phil Rice Season 4
Chapters
2:11
Dane's background in games
7:32
Omniverse Machinima and AI
12:05
Main components of Omniverse Machinima
14:25
How the Omniverse demo was created
18:39
VFX in Omniverse Machinima
21:53
Audio2Face
30:44
Who owns created content?

Dane Johnston is the Director of Omniverse Connect at NVIDIA and a member of the team behind the recently released (May 4, 2021) Omniverse Machinima. We spoke with him just before the release and asked him about the Machinima app.

Ricky Grove  0:08  
Welcome, machinima filmmakers. My name is Ricky Grove, and you're listening to the Completely Machinima podcast. Now, many of you may have noted the announcement of Omniverse Machinima in the fall and have watched eagerly for the email that says it's been released. Well, it hasn't quite been released yet. But at the recent GPU Technology Conference, I had a chance to watch an introduction to Omniverse Machinima presented by Dane Johnston, the Director of Omniverse Connect and a member of the Omniverse Machinima team. I contacted him and asked him to talk to us about it, and he's here today. Hey, Dane.

Dane Johnston  0:48  
Hey, how's it going?

Ricky Grove  0:49  
It's going well, so glad that you agreed to come on here and talk to us about this really exciting technology that you guys are releasing.

Dane Johnston  1:00  
Yeah, we're super excited about machinima as a whole. The whole genre is just a ton of fun, so it's really exciting to be able to do that here at NVIDIA.

Ricky Grove  1:10  
Right. How did you first hear about machinima? I know you mentioned in the introduction that you had experience in the gaming industry. When did you first hear of machinima?

Dane Johnston  1:21  
So I think the first one I ever watched was the original, like, Red vs. Blue. You know, that was probably more than 10 years ago, and they were absolutely hysterical. I know some of my friends made some of their own at that time, and I've just been kind of watching that whole space since then. I've got a young daughter who watches a lot of in-game Minecraft videos and things like that, the game genre, right. So I know it's got a following across many different age groups. And I'll still watch Red vs. Blue every once in a while when I need a good laugh.

Ricky Grove  2:03  
Well, anyway, let's talk a little bit about your background. How did you come to be on the omniverse machinima team?

Dane Johnston  2:11  
Yeah, sure. I guess we'll start at the beginning. I've been a gamer since, gosh, I don't know, since I could play video games on PC, so it was kind of a natural fit. I did a bunch of mods in college for Battlefield 1942; I kind of modded off of the really popular Desert Combat mod, and oddly enough, my boss is the creator of Desert Combat, so it really came full circle for me. I ended up joining a startup after college, which was Ageia, the PhysX startup. I joined as a technical artist to work on samples and work on games with them, and we ended up getting bought by NVIDIA, so I've been at NVIDIA for 13 years now. My initial work at NVIDIA was with games, right. I would go on site to all different game developers, and we would work on increasing their usage of physics and putting in cool effects and stuff like that. And that kind of naturally segued into where I'm at with Omniverse right now, working with the artists' DCC tools. My team has been making trailers and videos for our GDC conferences for the last, you know, five to ten years, and we just love making real-time cinematography. Members of my team made the Project Sol videos, I don't know if you remember those; those were made in Unreal Engine 4. So we love the idea of real-time cinematics. Making them in games is fun, and using components from games is fun. That's pretty much how we dove into this. We were like, we've got this platform, let's see if we can take all these game assets and make some cool stuff out of it.

Ricky Grove  4:05  
Yeah, yeah. It's my understanding that it came out of an in-house system for collaboration within NVIDIA, right?

Dane Johnston  4:14  
Yeah, right. We had all these challenges, many different challenges. One was that a lot of us, even before the current situation we're in, were remote. I've always been in St. Louis, and our offices are in California. We have all these artists working with super high quality content, and it's painful cross-continent, across the world. So that was one of the big challenges. On top of that, we had all these really neat technologies from researchers within NVIDIA and from outside of NVIDIA, and it was never brought into one place. You know, someone made this really cool fire simulation in a sample application, and I'm like, I want that right now, right? I want to throw fire on everything, right? So we ended up being like, alright, I think it's time to roll our own thing here. We want to solve all these problems; we want a single place that has the best rendering in the world, the best simulation in the world, and the best back-end content management in the world. And we just dove in. It was really fortuitous that USD had become an open standard as well. We jumped in on that and were like, alright, now we've got this base to build off of. So it kind of all culminated.

Ricky Grove  5:34  
The timing was just great, because USD had slowly started to get acceptance. And now it's much more widely distributed than it would have been if you guys had launched it, say, a year ago.

Dane Johnston  5:46  
Yeah, yeah. We see now direct integrations into the latest Autodesk Maya, for instance; Blender has some USD support as well; and Epic's Unreal Engine 4 has added a lot of USD support. We work with them and do what we can to try to push that as much as we can. But yeah, it's great to see the adoption, and I think it's really going to help all artists out, because it's going to be easier to go between all these different tools.

Ricky Grove  6:15  
Indeed. Well, you made that abundantly clear in your demonstration of the Omniverse Machinima platform, moving USD content in and out; it was just great. I enjoyed that very much, and I've watched it many times. Okay, I've got another question for you. You mentioned in your talk that you wanted to make the Omniverse platform available to anybody and everybody. Does that mean that Omniverse, and Omniverse Machinima, will still be free when it comes out of beta?

Dane Johnston  6:45  
So I don't think we've arrived on our exact messaging for pricing and stuff like that. But there's a strong inclination that there will be, you know, a community license, very similar to what you see out of all the major game engines. So I can't say anything definitive right now, but for Machinima, I think there's a strong, strong chance that'll be true.

Ricky Grove  7:15  
I think that's a good idea; I hope you guys do that. I also wanted to ask you about the AI component in Omniverse Machinima. NVIDIA has really invested heavily in deep learning and AI. Why is it important to machinima filmmakers in the Omniverse Machinima platform?

Dane Johnston  7:32  
Yeah, so even in the video we made in August, there were three main AI-driven technologies that you see on screen. One is the pose estimation. Then there's Audio2Face, which does the automatic mouth movement and everything like that based on audio only. And there's DLSS as well, for the rendering, so you get every last bit of frame rate out of whatever card you have. The reason that AI is important is that it's a chance to make things easier. To me, when I think about what's important for doing machinima, one thing is just ease of use, right? So having AI to clean up my animations can make any of my puppetry of my characters significantly easier. And we have a bunch more AI technologies lined up in the wings that we're trying to integrate into Machinima. I think I alluded in my talk to doing better things with retargeting and things like that. But that's why it's so important: we can train it, and we can make it so the workflows are faster.
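NVIDIA's animation cleanup is a trained model, but the underlying idea, denoising jittery capture data, can be illustrated with a classical filter. A minimal sketch in Python, assuming NumPy and purely synthetic joint-angle data; this is not NVIDIA's method, just the kind of smoothing such tools perform:

```python
import numpy as np

def smooth_angles(angles, alpha=0.3):
    """Exponential moving average over per-frame joint angles (radians).

    angles: (num_frames, num_joints) array of noisy capture data.
    alpha:  smoothing factor; lower values smooth more but lag more.
    """
    smoothed = np.empty_like(angles)
    smoothed[0] = angles[0]
    for t in range(1, len(angles)):
        smoothed[t] = alpha * angles[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed

# 120 frames of jittery angles for two hypothetical joints
noisy = np.cumsum(np.random.normal(0.0, 0.05, size=(120, 2)), axis=0)
clean = smooth_angles(noisy)
```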

Ricky Grove  8:41  
Yep. To my knowledge, there's no other machinima application, even the GTA director mode, which is probably the most popular machinima platform right now, that has anything like that. So I'm really psyched about that aspect of it. The minimum requirements say you need an RTX GPU. Does that mean that any GPU with RTX technology works? Is there a workaround for somebody who has a GTX card? Or is it pretty solid that you really have to have RTX in order to make it work?

Dane Johnston  9:17  
Yeah, unfortunately, right now it's pretty solidly in the RTX range. We do have, and will have, different renderers, like OpenGL, that will work on lower-end hardware, but you still need the RTX card to render our path-traced and ray-traced environments. So the actual min spec right now is really any RTX card. The one that's a little tough right now is going to be the RTX 2060. It's a good card, but we really need something like an 8 gig card, for VRAM, right.

Ricky Grove  9:53  
Right. Yeah, I discovered it the hard way. I thought I had a Quadro card and that it would work; I pulled up Audio2Face and it was just gray, and I went, oh no. And I found myself in that awful position of having to get an RTX card in a market in which demand so far exceeded supply that I thought maybe I'd take out a car loan or something for it. But I managed to sell the Quadro for the same amount that I bought a nice RTX 3070 for, so I got it in there, and I was okay without too much trouble. But I was just lucky. It's going to be a little bit hard, I think, all the way through this year for people to upgrade their GPUs. Yeah, I wanted to ask you a little bit about the awesome live capture you demonstrated on your iPhone 12. The wrnch component, and wrnch (w-r-n-c-h) is the company whose technology you guys are using, is only for the iPhone, you mentioned. And yet on the wrnch website they have a whole tutorial on using a webcam to do that. Is that something you guys are going to add in the future? Or can you do that now? Or is it still only on the iPhone, with the sensor that captures the 3D image?

Dane Johnston  11:24  
Oh, right. Yeah, no.

Ricky Grove  11:25  
Um,

Dane Johnston  11:26  
So what's really funny is that I made my video for GTC ahead of time, because obviously we have to record them ahead of time, right? We're not doing a live GTC. And between the time I captured that video and it released, they released their PC version. So that works out of the box.

Ricky Grove  11:45  
That's great.

Dane Johnston  11:47  
Yeah, there's nothing needed from Machinima. It just works.

Ricky Grove  11:52  
Excellent. I can't wait to try that.

Dane Johnston  11:53  
Yeah. And I think the thing that's really interesting to me is it's gonna get better the more people use it.

Ricky Grove  11:59  
Yes. Yeah,

Dane Johnston  12:00  
I think if I get feedback like, hey, I really need this, then it's just gonna get better. Yeah.

Ricky Grove  12:05  
What are the core components of the Omniverse Machinima app? You mentioned them in your introduction, but I'd like our listeners to know exactly what they are.

Dane Johnston  12:16  
The most core component is the sequencer: the ability to take animation clips, put them in order, and play them back, to actually build your story out. And those animation clips can come from all sorts of different sources. You can source them from, you know, the Mixamo website, or whatever tool you use to generate clips, or use wrnch to generate your skel mesh clips. The second component is a camera animation tool, which has nice curves in there to make it so you can create a nice, easy set of camera animations. There's obviously the wrnch AI pose estimation that we have in there, but we also have a recorder for that pose estimation. And what's really interesting about that is, since that's basically live mocap and we have the ability to record the mocap, it doesn't have to stay in Machinima. So if you're a pure Blender machinima maker and you just want the motion capture, you can do that; we're going to export it as a USD skel animation, and you can pull that into your other tool and use it.
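Since the mocap recorder exports a standard USD SkelAnimation, any tool with USD support can read it back. A minimal sketch using Pixar's pxr Python bindings; the file name is a placeholder for whatever Machinima exports:

```python
from pxr import Usd, UsdSkel

# Open a hypothetical mocap export and find its skeletal animation prims.
stage = Usd.Stage.Open("mocap_take_01.usd")

for prim in stage.Traverse():
    if prim.IsA(UsdSkel.Animation):
        anim = UsdSkel.Animation(prim)
        joints = anim.GetJointsAttr().Get()               # per-joint paths
        times = anim.GetRotationsAttr().GetTimeSamples()  # authored key times
        print(prim.GetPath(), len(joints), "joints,", len(times), "samples")
```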

Ricky Grove  13:29  
That's great.

Dane Johnston  13:30  
Yeah, that's the great thing about the USD open standard, right? Everything that you produce, you should be able to move around.

Ricky Grove  13:36  
Yeah, I like that very much. You were talking about the materials going both ways, in and out of Omniverse Machinima?

Dane Johnston  13:44  
Yeah, that's the work of the Connectors. I mean, we're shipping with some content, right? Our partners at Offworld Industries, who make the video game Squad, and TaleWorlds, who make Bannerlord, have both given us content to give out to machinima makers. And yeah, we use the Connector workflow for that. Squad has a mod editor built on Unreal Engine 4, and we're able to just right-click and export to bring in that content.

Ricky Grove  14:13  
That's great. Speed is of the essence. What's the recommended workflow in Omniverse Machinima? Like, for example, how did you go about putting together that really great demo video that you did?

Dane Johnston  14:25  
Yeah, we've actually got another one coming out, hopefully next week. So, we tried to do as much in engine as we could. Now, when we made the video back in August, the release date was originally very aggressive; that's the best way to put that. As you can see, we're not out yet. We used that time frame to learn what the heck tools we needed to build, and we learned that we needed to build a lot. So the workflow we recommend right now is, obviously, to get the content into Omniverse first. There's a couple of different ways you can do that: we have standard FBX or OBJ import, or, if you're a DCC user, you obviously go into the DCC and export it to Omniverse. So it's aggregate content first; that's the most important thing. And then we try to stay a lot inside of the Machinima app. We think we have enough tools and components there to let you animate both your characters and, you know, random stuff like boxes, whatever type of rigid animation you want to do, inside of Machinima. And you can use PhysX for that animation as well, because we already have a physics engine in there. So if you want to put in a bunch of rigid bodies and have the guy knock them over, there are ways to do that. Right now, and this is one of the things we want to hear about from users, when it comes to your actual animation, it's very much a standard cinematic style: you're going to need an animation clip, you're going to need to put it in order, and you're going to need to set up your cameras and all that. We don't have, from the much older machinima style, a gamepad character controller yet. That's something we want to add later, to let you build cinematics the way you would if you were rolling inside of a game.
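The "aggregate content first" step boils down to getting everything into one USD scene. As a rough illustration of what that looks like at the USD level, here is a sketch that builds a shot stage and references in two converted assets; all file and prim names are placeholders:

```python
from pxr import Usd, UsdGeom

# Build a shot stage and reference in previously converted assets.
stage = Usd.Stage.CreateNew("shot_010.usd")
UsdGeom.Xform.Define(stage, "/World")

hero = stage.DefinePrim("/World/Hero")
hero.GetReferences().AddReference("assets/hero_character.usd")

crates = stage.DefinePrim("/World/Crates")
crates.GetReferences().AddReference("assets/crates.usd")

stage.SetDefaultPrim(stage.GetPrimAtPath("/World"))
stage.Save()
```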

Ricky Grove  16:35  
Right, the puppeteering method.

Dane Johnston  16:37  
Yeah, but that's kind of where we're at right now. Our main puppeteering method is just wrnch right now, but we plan on adding more as we go.

Ricky Grove  16:49  
Yeah, I noticed that when you were demonstrating adding clips, you would capture a clip and then capture another clip, and then you brought them together on the sequencer. Did it automatically make the transition between the two clips?

Dane Johnston  17:04  
Ah, that's a good question. So right now it's a hard transition; we don't have blending between clips. So it'll be, you know, whatever clip you had, and boom, it'll go straight to the next clip. That's actually on my shortlist for the next release, to make sure we have some sort of ability to do even, like, linear interpolation between those. But right now it's a hard clip-to-clip cut.
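The linear interpolation Dane mentions is essentially a cross-fade between the outgoing and incoming poses. A toy sketch of the idea, assuming NumPy and placeholder clip data; real blending would also slerp joint rotations rather than lerp positions:

```python
import numpy as np

def blend_poses(pose_a, pose_b, t):
    """Linearly interpolate two poses; t runs from 0 (all A) to 1 (all B)."""
    return (1.0 - t) * pose_a + t * pose_b

# Cross-fade the last N frames of clip A into the first N frames of clip B.
N = 12
clip_a = np.random.rand(48, 20, 3)  # (frames, joints, xyz) placeholder data
clip_b = np.random.rand(60, 20, 3)
blended = [blend_poses(clip_a[-N + i], clip_b[i], i / (N - 1)) for i in range(N)]
```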

Ricky Grove  17:22  
Okay. I loved the VFX in the demo; the fire and mist and dust and everything was just terrific. Can you tell us about some of the other VFX showcase content and effects you're going to have in the first release of Omniverse Machinima?

Dane Johnston  17:41  
Yeah, so when it comes to VFX for the first release, your two main areas of effects are going to be Flow, which was the fire and smoke simulation, and Blast, which is a destruction engine that runs in real time, to be able to break walls and stuff like that. Then we obviously have our post-process effects: there's volumetric fog, and all sorts of lens effects for depth of field and all that, which are nice and accurately rendered. We do not have a generic particle system yet in this release; that's also really high on our list. So that's the big thing I think is missing for machinima makers this time around. And because that basic particle stuff is absent, you're going to have to use more time-sampled animation this time around, more rigid animation for effects.
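For readers wondering what "time-sampled animation" means in USD terms: transform values are keyed directly on attributes at time codes. A minimal sketch with the pxr bindings, animating a stand-in crate; names and values are illustrative:

```python
from pxr import Usd, UsdGeom, Gf

# Author a simple time-sampled rigid animation: a crate sliding sideways.
stage = Usd.Stage.CreateNew("crate_anim.usd")
stage.SetStartTimeCode(1)
stage.SetEndTimeCode(48)

crate = UsdGeom.Xform.Define(stage, "/World/Crate")
translate = crate.AddTranslateOp()
translate.Set(Gf.Vec3d(0, 0, 0), time=1)
translate.Set(Gf.Vec3d(250, 0, 0), time=48)  # two seconds at 24 fps

stage.Save()
```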

Ricky Grove  18:39  
I noticed you were talking about how much you enjoyed the fire effect, because the fire actually illuminated the objects in the scene. Super cool.

Dane Johnston  18:51  
I mean, especially coming from the games background, doing that in the past was just such a pain. You'd have to, you know, probably put your own light there, and you'd have to animate the light and try to make it feel flickery, right. And it just never worked right in games. It never did; it's terrible. And why couldn't you do it? It was super expensive. So that's what's really cool with this path tracer and how fast it is: it's able to illuminate for real. It's actually got the real math behind it. It makes a big difference. It really does.

Ricky Grove  19:31  
Yeah, that's one of the experiments I'm going to do when I start working with the Omniverse Machinima app: do some fire-in-the-dark kinds of things and see what I can come up with. Oh, one thing I wanted to ask you about that I didn't quite understand. You mentioned that both of the games whose content you're including have a ton of mods in the community. How exactly do mods work in Omniverse Machinima?

Dane Johnston  20:00  
For the mods from the community: I mean, we're not replaying the game, obviously, in Machinima right now; we're just taking the content in. If you're able to open up those assets... I'm not familiar enough with the Bannerlord asset format, but if there's a way to get that format into an FBX or OBJ or USD, you'll be able to pull it in. And when you're talking about Squad, if you get access to the content for it, you should be able to bring it in.

Ricky Grove  20:37  
I see, so you're not actually using the mod in Omniverse Machinima; you're using it in the game and then bringing that content in. I get it.

Dane Johnston  20:45  
Right. So one thing I know about anybody in the modding community or the machinima community is that they're all very resourceful, is maybe the best way to put that. And I think no matter what, they'll be able to find a way to get content in, things that I'm not allowed to do as a corporate entity.

Ricky Grove  21:04  
Right. I remember those early days in machinima where every week somebody had hacked something in some game, and there was such a generosity; people would pass it around and go, hey, you can do this. Oh my god, I've got a camera system in X game, you know, and everybody rushed to do stuff. And it was so cool.

Dane Johnston  21:27  
We work on it the other way, right. So we're more of the type where, you know, there's that great World of Warcraft exporter that's been around for, like, 10 years and stuff like that. And we've done a bunch of tests with it. I can't share those, because, like I said, as NVIDIA I can't sit there and be like, hey, I'm going to make a video using World of Warcraft assets. But it all comes in fine, and you can do all sorts of really, really cool stuff with it.

Ricky Grove  21:53  
That's reassuring. I wanted to ask you a little bit about Audio2Face. You mentioned that you're not going to have auto-retargeting in the initial release, that you're working on that for the second one, but you can actually retarget the facial animation and lip sync manually right now, right?

Dane Johnston  22:15  
Right, right. I just did some of that yesterday, just to see where we were with status. So there are multiple parts to this answer. With the retargeting and the auto-retargeting, a lot of that is going to be for your skeleton. So again, if I pulled one of the World of Warcraft characters out of World of Warcraft and brought it into Machinima, and I wanted to put wrnch on it, I'd have to do a lot of manual work right now, because that's a totally different rig than the default rig. The second release is to make that automatic. So in this release, you're going to have a lot better luck playing with the existing content. When it comes to retargeting for Audio2Face, it's actually a lot simpler than I was led to believe when I did my initial video. It's about a 10-minute process: you take the face of whatever your asset is, and you paint weights on it, you know, for where's your nose, your mouth, and all that, in the Audio2Face app. And then from there you're able to export a USD cache of whatever your playback was. Once you're in Machinima, it's literally a matter of selecting the face on the character, pulling in your cache, and selecting that. It's a one-button thing; once they're linked, they play together. So it's surprisingly easy, easier than I thought it was going to be. I was like, oh man, I've got to make all sorts of UI to make this work, and no, it's one button. It's pretty straightforward. But yeah, I think people will be able to play with Audio2Face with this release and have good luck.
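The one-button link hides the mechanics, but a baked Audio2Face performance is just a USD file of time-sampled mesh data. One conceptual way to picture the hookup, not necessarily what Machinima does internally, is layering the cache over the shot so its animated opinions win; all paths here are hypothetical:

```python
from pxr import Usd

# Sublayer a baked facial-animation cache over an existing shot.
stage = Usd.Stage.Open("shot_010.usd")
stage.GetRootLayer().subLayerPaths.append("caches/line_04_audio2face.usd")
stage.Save()

# The cache's animated mesh data now composes over the character's face,
# provided the prim paths in both files line up.
```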

Ricky Grove  24:02  
Oh, I'm glad. Yeah, there are a couple of really good NVIDIA tutorials on that manual retargeting. I hope you guys, in your first release, put something out as well; it would be really helpful. You mentioned that you're going to be providing a lot of tutorials. That would be great.

Dane Johnston  24:22  
Yeah, and I'll keep doing that. I actually plan on doing a Twitch stream too, kind of a walkthrough. So people can ask questions, and I can try not to fumble through my own UI.

Ricky Grove  24:36  
That's always embarrassing, I know. A couple other questions, and then we'll let you go. I wanted to ask you a little bit about the XR and VR element. You said that there's an XR element coming to Omniverse in general. I know that's not top of your list, but how's that going to work, and what do you think of it?

Dane Johnston  24:56  
Yeah, so I don't have a date for that, but we're working on both. I think, actually, the XR for, like, a tablet camera is in an experimental state; you might actually be able to access it with the current Create. I'll have to verify that. But what I'm excited about for that is a few different things. How I've used XR and VR already in my own workflows: obviously, with a tablet, you can use it as your virtual camera, so you can sit there and see the world, and you can record that camera movement. So that can be how you make your actual camera movements, just by looking into that virtual world while you're holding your camera. It's really neat, and it's a pretty straightforward way to use it. With VR, again, you can use that for puppeteering. We'll have an IK system coming out over the summer, so if you have a Vive or something like that, you'll be able to set up IK between your head and hands, and maybe buy some of those Vive trackers and put them on your hips and things like that, to get additional movement and accuracy. One thing I experimented with previously, even with the wrnch pose estimation stuff: it's harder for a single-camera tool to really understand where your hips and your feet are in the real world. wrnch has two modes, a pro mode and a standard mode, and the standard mode does not take into account your root movement, so it's basically estimating with your pelvis locked in place, is the best way to put that. For single-camera pose estimation software, it can actually be very accurate when you're in that mode, but the root movement is not going to be as accurate. So one thing you can do, for instance, is take one of the Vive pucks and put it on your hip. Then you get the best of both worlds: accurate root location for your motion capture, and just a single camera for the rest of your body. And it's still infinitely cheaper than buying one of those $10,000 body suits.
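The hip-puck trick Dane describes is simple to picture: the single-camera estimator gives you a pose with the pelvis pinned at the origin, and the tracker supplies the missing root translation. A toy sketch of the combination, assuming NumPy and placeholder data:

```python
import numpy as np

def apply_tracked_root(local_pose, hip_world_pos):
    """Re-root a pelvis-locked pose using a tracked hip position.

    local_pose:    (num_joints, 3) joint positions with the pelvis at origin,
                   as a standard-mode single-camera estimator would output.
    hip_world_pos: (3,) world-space hip position from a tracker puck.
    """
    return local_pose + hip_world_pos  # broadcast the root offset to every joint

# One frame: estimator output plus a tracker sample (placeholder numbers).
pose = np.random.rand(20, 3) - 0.5
hip = np.array([1.2, 0.9, -0.4])
world_pose = apply_tracked_root(pose, hip)
```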

Ricky Grove  27:13  
Oh god, yes.

Dane Johnston  27:14  
Exactly, right. All sorts of things like that. And you can jump into VR and do your camera work; you can get in there with your set in VR and do all that. So I think there are just infinite possibilities with XR and VR, because you're actually getting yourself into the set. You're getting yourself into your production. And that level of virtual production, I mean, I'm super excited about it, I really am.

Ricky Grove  27:44  
Yeah, yeah, I can see it. We were all very excited in our first podcast to talk about Omniverse Machinima, and for an unusual reason. Machinima Inc., the company that bought machinima.com, then moved on to YouTube, built such a huge system, and made so much money, collapsed, and it created a sort of bad taste in a lot of people's mouths. It was one of the reasons why I left the machinima community for several years; the term machinima came to connote a sort of low-class quality that was associated with Machinima Inc. And so we noticed that a lot of the professional companies, like Epic and Unity, weren't using the term machinima; they were using terms like real-time cinema or virtual filmmaking. So we were delighted that when you guys announced Omniverse, you used the term machinima. Because although you guys may think there was no issue in the wider community, there was; in fact, some people contacted us after our podcast and said, I'm so glad you guys are using the term machinima, I was worried that it was just going to be lost, you know?

So, right, thank you guys for doing that. Was there any specific reason why you chose to do that? Or did it just seem obvious?

Dane Johnston  29:25  
Yeah, it was Rev, actually, who leads the entire Omniverse product. He was like, yeah, this needs to be called Machinima, because that's what it is. This is taking a real-time render engine that can be used for games, taking game assets, and making cinematics out of them. We need to honor that tradition. And we realized that because, you know, we all watch it. I think the biggest thing is that we're fans of the genre as a whole. It's sometimes more interesting to us than Hollywood cinema, because we like to see what people create, and we love video games, right? And gosh, video games are so pretty now; you can do just insane stuff. So that was the real reason behind calling it Machinima, and not just to make people try to figure out how to pronounce it. It really was harkening back to that history. Like you said, there are a lot of weird connotations around the actual company that was named as such, but the spirit of it is what's important.

Ricky Grove  30:44  
Well, anyway, I've just got about one or two questions, and then you're off to NVIDIA Machinima land. One question that has been on my mind quite a bit: I noticed that the licenses inside of the platform are numerous; there are like 30 or 40 licenses in there. And it made me wonder about the legal status of works created in Omniverse Machinima by a machinima filmmaker. Have you guys talked about that at all? What do you think?

Dane Johnston  31:17  
You know, I'm glad you brought this up, because that's something that's easy for us engineers and artists over here, just making a tool, to forget about, and it's a really important component. We'll spell out what you can and can't do in more human terms, non-lawyerly terms. But the content that we're shipping comes with a Creative Commons license attached to it, and it really just requires attribution. So you basically can make what you want, and you just need to make sure you say, hey, this was made in Omniverse Machinima and used the Squad assets, for instance. You know, the Squad guys gave all of us this content to work with; they just want a little tagline that says, hey, by the way, these are Squad assets. So it's that simple: a Creative Commons license with attribution. I think it's CC 4.0, but we'll make sure that's nice and blatantly in everybody's face; we're actually going to be including the license directly with the content.
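Since the license ships with the content, one low-tech way to keep the attribution attached to an asset as it moves between tools is USD custom metadata. A hypothetical sketch; the file name, key names, and wording are made up for illustration:

```python
from pxr import Usd

# Stamp attribution metadata onto an asset's root prim.
stage = Usd.Stage.Open("assets/squad_props.usd")
root = stage.GetDefaultPrim()
root.SetCustomDataByKey("license", "CC-BY 4.0")
root.SetCustomDataByKey("attribution",
                        "Contains Squad assets, courtesy of Offworld Industries")
stage.Save()
```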

Ricky Grove  32:24  
Okay, that's great; I'm glad to hear that. Well, last question for you. Since you've announced it at GTC, in your presentation and other presentations, have you had other game companies approach you about making their content part of Omniverse Machinima?

Dane Johnston  32:44  
Yeah, yeah, we have. And we're going to try to keep incorporating more and more. One thing that's really interesting, and we don't necessarily have what I would call the exact rights to this, but you'll have it right off the bat: there's another application for Omniverse called Mineways, which an NVIDIA person made in their spare time, and it takes your Java Minecraft world and pushes it into USD. And we actually have a blocky character that you can pose-estimate onto. So that's another game pipeline we're including that we can't put on the website; I can't say, hey, Minecraft included. But that doesn't mean you can't make Minecraft machinima; we have the ability to make it pretty much right away, out of the box. I had my eight-year-old daughter doing it, and she's like, this is so fun, and the quality is so high. And I'm like, yeah, I know. I think this is one that people are really going to get excited about. And we have other companies too, and we'll roll them out.
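Because Mineways writes plain USD, the exported world is immediately inspectable with the pxr bindings. A quick sketch; the file name is a placeholder:

```python
from pxr import Usd, UsdGeom

# Open a Mineways export and count the meshes that came across.
stage = Usd.Stage.Open("my_minecraft_world.usd")

meshes = [p for p in stage.Traverse() if p.IsA(UsdGeom.Mesh)]
print(f"{len(meshes)} meshes imported")
for mesh in meshes[:5]:
    print(" ", mesh.GetPath())
```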

Ricky Grove  34:04  
Okay, that's great. Yeah. Any idea when we might expect the release?

Dane Johnston  34:11  
I have a date that I'm not allowed to share, but very, very soon. I actually have the final release candidate on my computer right now. So very, very soon.

Ricky Grove  34:23  
Excellent. Well, listen, Dane, thank you so much for being with us today. I'm so glad you helped make this project happen, and thank you for sharing your insights and your information with the Completely Machinima podcast today.

Dane Johnston  34:38  
No, I appreciate it; thanks for having me on. I'm really, really excited to see what everybody makes out of this, and, you know, really excited about feedback. What's going to make your life easier to make machinima? We'll have a forum; you guys go ping me on that and tell me, hey, what do you want? What's next? What's your next impediment to making quick, high-quality content?

Ricky Grove  35:05  
Excellent. Well, we will spread the word, and I certainly will be on the platform making stuff, so we'll see what happens.

Dane Johnston  35:12  
All right, I appreciate it. Thanks.

Ricky Grove  35:23  
Music for this episode is by Mark Berg, from freemusicarchive.org. Thanks for listening.

Transcribed by https://otter.ai
