I'm creating an animation studio today

Need a writer? Call me bubba.

Awesome! Can you animate my last crazy Ambien dream?

OMG please yes.

This seems like some really cool technology. I used to love playing sports games and pitting AI vs AI just to see what kinds of things would happen. Cool stuff.

Yes Jon, exactly.

jpinard, in order to do that all we would need is 3D models of the objects in your dreams. ;)

So a quick update!

I’ve been working on this since basically the first of the year, or about 3 weeks.

In this time I’ve:

Met with an animation producer in Burbank to pitch him the idea. We talked for around 4 hours and he said all the players in town could use the tech. He wants to present it to them ASAP.

Built the algorithms for most of the way it works, and also the flow and design. I’ve pseudocoded around 50% of it so far. I even created a whitepaper explaining it! I’ll link later in the thread.

Decided that it’s definitely going to be a series of Python programs interfacing with Blender. There will be a main program, eventually an executable, that generates the script for Blender (or whatever tool), so we stay tool-independent in that regard. We’ll use each tool’s API for that.

In the meantime, I’ve learned Python and Blender and am creating the program in Python to be run natively in Blender. You just hit the “Run Script” command and it fires off a bunch of code and creates objects, moves them around, rotates them, resizes them, etc.
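To give a flavor of what a “Run Script” pass like that might do, here’s a minimal sketch. None of this is the actual code; the primitive names and parameter ranges are made up, and it only builds the transform data (in Blender, each entry would map to a `bpy.ops.mesh.primitive_*_add()` call plus keyframed location/rotation/scale), so the logic can run outside Blender too:

```python
import random

# Illustrative only: primitives and ranges are assumptions, not the real tool.
PRIMITIVES = ["cube", "sphere", "monkey", "cone"]

def random_transform(rng):
    """One random placement: location, rotation (degrees), uniform scale."""
    return {
        "location": tuple(rng.uniform(-5, 5) for _ in range(3)),
        "rotation": tuple(rng.uniform(0, 360) for _ in range(3)),
        "scale": rng.uniform(0.5, 2.0),
    }

def build_scene(n_objects, seed=0):
    """Build a list of randomly placed objects; seeded so runs are repeatable."""
    rng = random.Random(seed)
    return [
        {"primitive": rng.choice(PRIMITIVES), **random_transform(rng)}
        for _ in range(n_objects)
    ]

scene = build_scene(5)
```

Seeding the RNG is the one deliberate choice here: it makes a “random” scene reproducible, which matters once you start rendering overnight and want to re-create a result.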

It’s already creating movies that could be used for a wild music video or something. The recommendation from the producer in Burbank is that I use it to create music videos for smaller bands that are up and coming first, to show the public what it can do.

In the 3 days since he said that I managed to have the AI create its first music video. It’s pretty wild as the objects are completely random (including a sheep and a monkey head), but hey, the A.I. created it, not me. I didn’t touch the tools, just hit Run and then rendered the result. Pretty wild!

So it’s going super well. I’ll link some results soon, you’ll see it get better and better at making them as I teach it what film making, acting and editing is.

I’d certainly be interested in reading the white paper you wrote. I’m curious what exactly the AI is doing in the system.

Hey Timex, hopefully this works for the whitepaper:

https://docs.google.com/file/d/0B4m-4gIVPlIoM0NsUzNtWVh4ck0/edit?usp=docslist_api&filetype=msword

So big breakthrough today. I have a rudimentary interface now so I can “talk” to it and give it mood, energy, plot points, etc. It now knows camera tricks, lighting templates, object texturing, and all that stuff. Today I gave it several plot points and it created a 3 minute video with over 80 jump cuts. As soon as I can figure out how to batch render these scenes I can start churning out content as fast as it will render. So the first task, aligned with Phase 1 in that doc, is to start making music videos.
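The “3 minute video with over 80 jump cuts” step can be sketched as a cut planner: given a total frame count, pick random cut points and turn them into contiguous (start, end) ranges that can each be rendered as a separate scene. This is a guess at the approach, not the real algorithm, and a real version would enforce a minimum cut length:

```python
import random

def plan_jump_cuts(total_frames, n_cuts, seed=0):
    """Split [0, total_frames) into n_cuts contiguous random-length ranges."""
    rng = random.Random(seed)
    # n_cuts - 1 distinct interior cut points, sorted into order
    points = sorted(rng.sample(range(1, total_frames), n_cuts - 1))
    bounds = [0] + points + [total_frames]
    return list(zip(bounds[:-1], bounds[1:]))

# 3 minutes at 24 fps = 4320 frames, carved into 80 jump cuts
cuts = plan_jump_cuts(3 * 60 * 24, 80)
```

Because the ranges tile the timeline exactly, each one can be handed to the renderer independently, which is also what makes batch rendering a natural next step.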

Anyone have a good song to make a music video to? I’ll probably start with EDM soundtracks since my 2 year old likes them. But theoretically it could do any genre, any song. Just need to decide what the mood is.

I wonder if there is meta-data out there about songs, where it has timestamps and moods and energy levels of various music. Something like the database Shazam uses. It would be fun to let it loose on a big music library and see what it comes up with. But the metadata needs to be there already (tagging everything myself takes repeat listenings and manual encoding right now).

I think what you want exists, just not sure what’s available in a free form. There must be heaps of other tools to analyse a track and get basic data like BPM, peaks, troughs, intensity, etc.

Hmm, that service looks cool. Their dataset looks huge at over 37 million songs. I just need a few dozen to a hundred or so to test it. BPM data, etc. would be ideal; my code already uses BPM for the Dancing routine.
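The BPM-driven part is easy to picture: at a fixed frame rate, each beat lands a fixed number of frames apart, and those are the frames you keyframe dance poses on. A minimal sketch of that mapping, assuming 24 fps and a 1-based frame count (the real Dancing routine isn’t shown in the thread):

```python
def beat_frames(bpm, fps=24, n_beats=8, start_frame=1):
    """Frame numbers on which each beat lands, for keyframing dance poses."""
    frames_per_beat = fps * 60.0 / bpm
    return [round(start_frame + i * frames_per_beat) for i in range(n_beats)]

# At 120 BPM and 24 fps, a beat lands every 12 frames.
frames = beat_frames(120)
```

Rounding per beat (rather than rounding `frames_per_beat` once) keeps the pattern from drifting over a long track when the beat interval isn’t a whole number of frames.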

GraceNote does something similar, IIRC. There’s also Pandora’s Music Genome Project, but AFAIK, they never released a public API like they hinted they would a few years ago.

As for songs? I’d be curious to see how it’d do turned loose on some stuff like “Analog Nights” and “Crimewave,” both by Mega Drive. They’re back to back on the album and fairly different moods for a pair of awesome Synthwave tracks.

Totally unrelated, but if you wanted to experiment with picking up moods and movements within a single song, Blind Guardian’s “And Then There Was Silence” is basically the most bitchin’ song ever written about The Iliad, so, you know, there’s that.

Sooooo. . .

Armando, very interesting. I’m interviewing animators right now, we were just discussing this concept. The part my AI does is take that generated screenplay and do the film/animation part itself. I actually now have the algorithm written (in Python) that parses a specialized script template and turns that into animation. Wild time to be living in! The task now is to teach it all the tricks filmmakers use to execute a script. The scripts it creates on its own (not parsed, but generated at run time) are pretty nonsensical. I think clever writing is still a long way from being replaced.
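The script template itself is proprietary and never shown in the thread, so here’s a purely hypothetical stand-in to illustrate the parsing step: lines like `SHOT: wide | ACTION: sheep rolls past | DURATION: 48`, turned into structured shot records an animation pass could consume. The field names and format are my invention:

```python
import re

# Hypothetical line format, NOT the real template:
#   SHOT: <type> | ACTION: <free text> | DURATION: <frames>
LINE = re.compile(
    r"SHOT:\s*(?P<shot>\w+)\s*\|"
    r"\s*ACTION:\s*(?P<action>[^|]+?)\s*\|"
    r"\s*DURATION:\s*(?P<frames>\d+)"
)

def parse_script(text):
    """Parse template lines into shot dicts; silently skips non-matching lines."""
    shots = []
    for line in text.strip().splitlines():
        m = LINE.match(line.strip())
        if m:
            shots.append({
                "shot": m["shot"],
                "action": m["action"],
                "frames": int(m["frames"]),
            })
    return shots

sample = """
SHOT: wide | ACTION: sheep rolls past | DURATION: 48
SHOT: closeup | ACTION: monkey head spins | DURATION: 24
"""
shots = parse_script(sample)
```

Once the script is data like this, “turning it into animation” becomes a loop over shot records rather than a parsing problem.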

Rudimentary action-driven formulaic stories though? I think we (the collective, everyone working on this) are getting closer to that every day.

Yeah, I did seem to recall what you guys were plugging away at was slightly different, but the moment I saw the headline, I thought of you!

I love the idea of the computer creating the story and the people acting it out. They should experiment with avant garde plays written that way and acted out by a troupe.

Ok, so I’m around 4-6 weeks from finishing the prototype of this “Cinematic AI”. When I’m finished it should be able to parse a script (a proprietary format similar to the “standard” American screenplay/teleplay format … FYI there actually is no standard, I had to invent that) and make a movie out of it automatically. It’s been a fantastic journey and the most intellectual thing I’ve ever done. I mean, just programming the AI to know all the camera settings and shots and why to use them in different contexts was a beast. I also had to standardize model creation and metatagging, which there is also no industry standard for.

I also wrote it modularly, independent of any particular animation/game engine/rendering software. It can therefore be integrated into all the major software packages: Maya, Unity, and whatever else has an API that can control the software directly. As long as you can control everything with an API, it should work no problem. And it makes the movie totally by itself! You don’t have to do anything but write the script… for now.
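That engine-independence claim usually comes down to an adapter layer: the core logic talks to one small interface, and each engine (Blender, Maya, Unity, …) gets its own implementation of it. A minimal sketch of that shape, with a logging stand-in backend so it runs without any engine installed; the method names are assumptions, not the real interface:

```python
from abc import ABC, abstractmethod

class RenderBackend(ABC):
    """One adapter per engine; the core AI never touches an engine API directly."""

    @abstractmethod
    def add_object(self, name, primitive): ...

    @abstractmethod
    def keyframe(self, name, frame, location): ...

class LoggingBackend(RenderBackend):
    """Stand-in that just records calls, so the sketch is testable anywhere."""

    def __init__(self):
        self.calls = []

    def add_object(self, name, primitive):
        self.calls.append(("add", name, primitive))

    def keyframe(self, name, frame, location):
        self.calls.append(("key", name, frame, location))

def make_shot(backend):
    """Engine-agnostic shot logic: same code whatever backend is plugged in."""
    backend.add_object("hero", "monkey")
    backend.keyframe("hero", 1, (0, 0, 0))
    backend.keyframe("hero", 48, (0, 5, 0))

backend = LoggingBackend()
make_shot(backend)
```

Swapping targets then means writing a new `RenderBackend` subclass that forwards to `bpy`, Maya’s `cmds`, or Unity’s scripting API, without touching `make_shot`.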

I have talent that works at HBO, Blizzard, MTV, Dreamworks, and a few more lined up. Once I complete the prototype I’m going to gather them all up in LA and show them this thing. It should blow their fucking hair back.

By the way, as a completely unintended side effect, this will be the first AI that can dream. I’m going to start letting it dream in the next few weeks, hopefully. Should be interesting.

Excelsior, Guap!

Thanks Rich, I appreciate it!

Holy shit!
Just found this thread and…I don’t even know what to say.
Keep at it, Guap.
I honestly am really looking forward to seeing your work!