[ #RebusFarm #ROTM ] CG Record and RebusFarm have announced that Mobile Strike by Jocelyn Strob...
We also had a quick interview with Strob about the visual effects he did for the spot!
Could you please tell us more about Mobile Strike? How long did it take to complete?
Hi! In October, my long-time friend, director Patrick Boivin (and his producer Stéphane Tangay, a very nice guy), asked me to create some VFX and 3D animations for a Mobile Strike ad. Mobile Strike is a popular mobile MMO video game (the second highest-grossing game on the App Store in 2016). The game is well known for its ad campaign featuring none other than Schwarzy (Arnold Schwarzenegger) himself.
The plan was to use 3D models, textures and some mocap data provided by the Mobile Strike team, then animate and render them for integration into footage of the set and actors shot by Patrick's team (which implied camera tracking). The pyrotechnic effects were supposed to be done mostly practically on set using maquettes of the tanks and other assets, but I ended up making more CG pyro than planned and we didn't use any maquettes in the end.
There were still many practical pyro effects: sparks on the ceiling lamp behind the chopper, flashes on the ground when the soldiers fall and on the wall behind the missile launcher, small blasts on the couch, the computer screen disintegration and, the biggest one, the piano explosion.
In CG I did the tank, helicopter and plane explosions, as well as adding smoke here and there, missile trails, cannon blasts and the crashing plane's trail.
Patrick also took charge of compositing, adding glows, lens flares and small debris. He also played the role of the father, while his son Romeo played the kid.
Before shooting the plates I was able to handle some tasks like rigging the tank, preparing assets (importing meshes, redoing shaders) and making tests like the tank cannon blast test. The shoot then took place on November 15, leaving me one month to track all the cameras (a dozen complicated ones and a few simpler ones) and do all the animation, lighting, fluid sims and rendering. When I saw the plates, all with complicated camera motion, lots of motion blur, grain, moving objects, smoke and debris, I realized I had to hire a helper (my long-time friend Joseph Tran) to help with the camera tracking and model some missing pieces like the chopper's missile launcher. The shots could not be tracked automatically, so camera tracking alone took around two weeks, which left me very few hours a day to sleep during that month to complete the rest of the job!
You and Patrick Boivin have worked together on many cool projects. Would you mind sharing something about him?
Sure, with joy! Pat and I met around 25 years ago in high school when, in class, he took my drawing compass and plugged it into an electrical outlet, which gives you an idea of his love for pyrotechnics! For many years we were both passionate about drawing SF and horror comic books and illustrations, which led us to meet others like us (including Joseph) and form an artist group called Alliage. That group then morphed from a comic book group into a video creation group which, over 10 years, worked together on a TV show concept called Phylactere Cola. The closest thing I can compare it to is Monty Python. My role in the group was CG and practical VFX, practical make-up FX, foam latex masks, puppets, etc. Patrick was the director. But every member of the group (around 9 artists) did a bit of everything: acting, script writing, storyboarding and even stunts! The ideas for the hundreds of short films we made within that concept were all the result of intense group brainstorming.
Since then I have collaborated with Patrick on many of his projects. Patrick is very popular on YouTube, with nearly 400k subscribers and more than 300 million views. On his channel Patrick mostly experiments with personal projects, often using stop motion. But once in a while companies contact him to create a video promoting their products on his channel, as was the case with this Mobile Strike ad. Patrick develops the idea, directs the spot and gathers the whole team with the help of his producer Stéphane. Notable projects we did together that way include: Iron Baby, Little Ant-Man (for Marvel), X-games Energy (for X-Games), Tony Hawk Secret HuckJam Factory (for and with Tony Hawk), Escape (for Ford, I was just acting as a zombie) and Michael Jackson vs Mr. Bean (I just added the missiles in CG).
What did you use for the HDR 360° photo shoot?
I have a Sony A7RII camera (42 megapixels) and a Sunex superfisheye lens (185° field of view) with a panoramic tripod head, and I stitch the shots with Autodesk Stitcher. The lens is made for APS-C Canon bodies, so I still need to upgrade it to get the full 42-megapixel resolution, but it's still much more than enough for 360° HDRI and IBL usage. Those 42 MP are more useful for 3D scanning with Agisoft Photoscan, which I used to scan the couch for easier integration of the tank rolling over it, creating smoke around the couch and adding rolling grease pens on it.
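Strob doesn't detail the merging step, but the standard HDRI workflow behind shoots like this is to bracket several exposures per view and combine them into one linear radiance image before stitching. A minimal sketch of that merge for a single pixel, assuming linearized pixel values in [0, 1] and known exposure times (this is illustrative, not Stitcher's actual code):

```python
# Merge bracketed exposures of one pixel into an HDR radiance estimate.
# A "hat" weight favors well-exposed samples and down-weights clipped ones.

def hat_weight(v):
    """Highest weight near mid-grey, zero at the clipped extremes."""
    return max(0.0, 1.0 - abs(2.0 * v - 1.0))

def merge_hdr(samples):
    """samples: list of (linear_pixel_value, exposure_time_seconds).
    Returns the weighted-average radiance estimate value / exposure."""
    num = den = 0.0
    for v, t in samples:
        w = hat_weight(v)
        num += w * (v / t)
        den += w
    return num / den if den else 0.0

# The same scene point shot at 1/100 s, 1/25 s and roughly 1/6 s:
# each bracket reads differently, but all imply a similar radiance.
print(merge_hdr([(0.1, 0.01), (0.4, 0.04), (0.99, 0.16)]))
```

The nearly clipped 0.99 sample gets almost no weight, which is exactly why bracketing recovers detail that any single exposure would lose.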
Tell us more about your experience with camera tracking in Blender.
I have been learning Blender recently and it never ceases to impress me. It's a very light, stable and powerful piece of software. This was the first time I used it in production and I didn't regret it at all! Now with Alembic support, every studio should start integrating Blender into their pipeline.
The motion tracking tools (in the Movie Clip Editor) are very efficient and gave us perfect results on all the complicated shots in this project. I learned to use them quickly by following the "Track, Match, Blend" tutorial by Francesco Siddi, available on the Blender Cloud. We could track one or two shots a day since they required a lot of manual tracking, and I can say Blender is perfect for more manual tracking. By "more manual tracking" I mean tracking where we need to place many trackers by hand and animate them almost frame by frame. We still used the tracker a lot, so I should say it was semi-manual tracking. Blender can place many trackers at once automatically on one frame and start tracking them all, but it can't add new trackers along the way while tracking, which makes it a bit less powerful for fully automatic tracking. In my experience with other, more automatic tracking software, I always start by trying to track fully automatically and then very often end up tracking manually because the auto-tracking didn't work well, so I just lose the time spent on the automatic attempt. In Blender I go straight to a more manual approach and I know I will get a perfect result at the end of the day. Once enough points are tracked, Blender automatically reconstructs the scene in 3D with the camera, and it was very fast to output H.264 QuickTimes with the background plate and test objects to check the tracking. Blender was also very handy for creating proxies and undistorting the footage; there is a tool for that in the Movie Clip Editor. To undistort the footage correctly, I printed a grid pattern that I shot with every lens on set and used it in Blender to find the distortion values.
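The distortion values found from such a grid typically parameterize a radial polynomial model (Blender's solver exposes k1, k2, k3 coefficients). A sketch of that model and its numerical inversion, with made-up coefficient values for illustration:

```python
# Radial polynomial lens distortion: a point at normalized (x, y) is pushed
# to (x*f, y*f) with f = 1 + k1*r^2 + k2*r^4 + k3*r^6. Undistorting footage
# means inverting this mapping, which has no closed form, so we iterate.

def distort(x, y, k1, k2=0.0, k3=0.0):
    """Map an undistorted normalized coordinate to its distorted position."""
    r2 = x * x + y * y
    f = 1.0 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2
    return x * f, y * f

def undistort(xd, yd, k1, k2=0.0, k3=0.0, iters=20):
    """Invert distort() by fixed-point iteration (fine for mild distortion)."""
    x, y = xd, yd
    for _ in range(iters):
        r2 = x * x + y * y
        f = 1.0 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2
        x, y = xd / f, yd / f
    return x, y

# Round trip with a mild barrel distortion coefficient:
xd, yd = distort(0.3, 0.2, k1=-0.1)
print(undistort(xd, yd, k1=-0.1))  # back near (0.3, 0.2)
```

Shooting a known straight-line grid with each lens, as described above, gives the solver real-world samples to fit these coefficients against.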
There is no doubt in my mind that I will always use Blender for camera tracking on all my future projects.
Tell us about 3ds Max, V-Ray and Phoenix FD for explosion effects.
Now with the new quick simulation presets included in the latest version of Phoenix FD, basic setups are very fast. Every fluid simulator seems to have its own special look in terms of smoke and fire patterns, and I like the look I can get with Phoenix FD's massive smoke vorticity. I am also learning Houdini, but I was not yet ready to use it for pyro on this project. Phoenix is really fast to set up and simulate. I also used a few sim licenses (also new in the latest version) to speed up the process by simulating on my farm. What I like about V-Ray and Phoenix FD is the support I get from Chaos Group: Vlado, Svetlin, Ivaylo and their team are so patient and resourceful! I can always start a project using their tools with confidence. The biggest strength of 3ds Max, in my opinion, is its V-Ray integration. V-Ray is the best production renderer, and the best V-Ray integration is still the one for 3ds Max. I just hope they can bring V-Ray to the same level of perfection in Blender and Houdini. My dream would be Vlado and his team joining the SideFX team to create the most perfect 3D software ever!
The problem with 3ds Max is that it relies too much on third-party plugins (which end up being very costly), while Houdini doesn't need any plugins, and even Blender doesn't rely much on add-ons and has many features 3ds Max lacks (motion tracking, compositing, editing, fluid sim, fast start-up time). Not to mention that Houdini has a very cheap Indie version and Blender is totally free, making 3ds Max a totally insane choice money-wise. It's just a matter of taking the time to learn those newer options.
How do V-Ray and Phoenix FD work together? Tell us more about Phoenix FD simulation and shading, please!
They work pretty well together, but V-Ray has been out much longer while Phoenix FD is still a relatively new product. So I can say that V-Ray for 3ds Max is really near perfection, while Phoenix FD still has much more room for improvement.
A nice thing to note is that V-Ray now has a volume grid feature that can load Phoenix FD sims (or volume data from other software like FumeFX or Houdini) without Phoenix FD being installed. That means I could create some cloud shapes with Phoenix FD and sell them on TurboSquid for any V-Ray user to use, which I will certainly do soon, so check my TurboSquid page!
In the past I used FumeFX, but since Phoenix FD is also capable of simulating water, I switched to it. I really like all the efforts Chaos Group makes to accelerate volume and ocean rendering. I don't know how they do it, but they use some light caching tricks to accelerate rendering (like grid-based self-shadowing) that are very efficient.
In terms of simulation, one very efficient workflow is to simulate a low-resolution version and then add detail (or slow the sim down) with the resimulation process, which adds detail through vorticity (the wavelet method). On this project I had some bugs preventing me from using this process, but I know Chaos Group is working hard on the matter and I'm sure they will fix it fast. The automatic grid resizing is also a huge time-saver, since it is so much faster to simulate when the grid is small. I had some bugs with that one too on this project, but it is now fixed. When both are fixed, Phoenix FD will be a real production workhorse!
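The core idea of that resimulation workflow can be shown in a toy 1-D form: upsample a coarse field, then layer synthetic high-frequency detail on top. Real wavelet turbulence advects band-limited noise with the flow field; this skeleton is only an illustration of the upsample-plus-detail structure, not Phoenix FD's implementation:

```python
import math
import random

def upsample(coarse, factor):
    """Linearly interpolate a coarse 1-D density field to a finer grid."""
    fine = []
    for i in range(len(coarse) - 1):
        for j in range(factor):
            t = j / factor
            fine.append(coarse[i] * (1 - t) + coarse[i + 1] * t)
    fine.append(coarse[-1])
    return fine

def add_detail(fine, amplitude, freq, seed=0):
    """Layer high-frequency detail the coarse sim could not resolve."""
    rng = random.Random(seed)
    phase = rng.uniform(0.0, 2.0 * math.pi)
    return [v + amplitude * math.sin(freq * i + phase) for i, v in enumerate(fine)]

coarse = [0.0, 1.0, 0.5, 0.0]                      # cheap low-res sim result
detailed = add_detail(upsample(coarse, 4), amplitude=0.05, freq=2.5)
print(len(detailed))  # 13 fine samples from 4 coarse ones
```

The payoff is that the expensive solver only ever runs at the coarse resolution; the fine-scale swirls are synthesized afterwards, which is why the combined workflow is such a time-saver when it works.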
Regarding shading, one thing I like to use is the RGB channel to mix different colors into the smoke depending on the source; it creates interesting effects when the different smoke colors mix together. For example, in my tank firing test, the ground dust was the same color as the sand while the smoke from the cannon was white.
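Conceptually, that per-source coloring amounts to a density-weighted average in each voxel: every emitter contributes its color in proportion to the density it adds. A small sketch of the idea (illustrative only, not Phoenix FD internals):

```python
# Each source writes (density, color) into a voxel; the shaded smoke color
# is the density-weighted average, so denser contributions dominate.

def mix_voxel_color(sources):
    """sources: list of (density, (r, g, b)) contributions in one voxel."""
    total = sum(d for d, _ in sources)
    if total == 0:
        return (0.0, 0.0, 0.0)
    return tuple(sum(d * c[i] for d, c in sources) / total for i in range(3))

# Sandy ground dust mixing with white cannon smoke, dust twice as dense:
print(mix_voxel_color([(0.4, (0.8, 0.7, 0.5)), (0.2, (1.0, 1.0, 1.0))]))
```

Where the two plumes interpenetrate, the blend shifts smoothly between the sand tone and white, which is the effect described in the tank firing test.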
What do you love most in Phoenix FD? What would you like to see improved in future releases?
I like the fast setup, fast simulation, fast rendering, good integration with V-Ray and, as with V-Ray, the amazing support with direct access to the developers via the Chaos Group forum. What I would like to see in future Phoenix FD versions is the ability to simulate variable viscosity (to create lava or melting/hardening effects, for example) and tools to create and control tendrils and drops in liquids (like the micro-solvers or gas field VOPs in Houdini, or the Q-solver plugin for RealFlow). I'm also dreaming of distributed calculation of a single sim across the farm, as is possible with Houdini (with some restrictions, obviously), and of cheaper sim licenses like those of Houdini Indie (with which one can simulate on 4 computers using a single Houdini Indie license plus the 3 free Engine Indie licenses, for a total of $199/year, while a Phoenix FD license is $830 plus $210 for each sim license, a total of $1,460 to simulate on 4 computers).
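Spelled out, the license math quoted above (2016 prices as given in the interview; check current vendor pricing before relying on it) looks like this:

```python
# Cost of simulating on 4 machines, using the figures quoted in the interview.
houdini_indie_per_year = 199   # 1 Indie license + 3 free Engine Indie licenses
phoenix_fd_workstation = 830   # main license, includes simulation on 1 machine
phoenix_fd_sim_license = 210   # per additional sim node
machines = 4

phoenix_total = phoenix_fd_workstation + phoenix_fd_sim_license * (machines - 1)
print(phoenix_total)                    # 1460
print(round(phoenix_total / houdini_indie_per_year, 1))  # ~7.3x for the first year
```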
Thanks Strob for the inspiring work and nice interview!
About Reel of the Month Sponsors
Main Sponsor: RebusFarm GmbH
The sponsor for Reel of the Month, RebusFarm, is a leading render farm service which instantly provides you with 5,000 Xeon CPUs to render your animations and still images. Winning Reel of the Month means you will get 350 RenderPoints worth 350 Euro from RebusFarm.
Co-Sponsor: Real Displacement Textures
Real Displacement Textures are produced using highly detailed 3D scans to capture accurate color and depth information from real locations, with polycounts up to 20,000,000 and native scan resolutions over 16K. An image-based light-cancelling process reduces baked-in light and shadows in the diffuse map. All maps come with layers for glossiness, roughness, ambient occlusion and more, so they are ready for physically based rendering. Most maps are levelled so that 100% displacement fits out of the box, saving you time when setting up a material. Normal maps are provided as an alternative to the displacement, or to support it. With these textures you can reconstruct the original surface, including the geometry, with a minimum of resources and a maximum of speed and flexibility. Winning Reel of the Month means you will get 3 free collections of Real Displacement Textures.
Please submit your reel or film on CGRecord TV to win the CGRecord | RebusFarm Reel of the Month!