Appleseed Renderer Interview
Still image from Fetch, a short film rendered entirely in Appleseed

Interview: François Beaune on Appleseed Renderer

François Beaune is the project founder and team leader of Appleseed, a completely open source render engine like Cycles or LuxRender. Appleseed is MIT-licensed, which is probably the most liberal license in existence. It is available for Maya, Blender, Gaffer and other programs.

BD: François, tell us a little about the project and in what ways it is compatible with existing software.

FB: We have plugins for Blender and Maya, and there is a preliminary plugin for Houdini. Unfortunately, none of those plugins are really feature-complete at the moment, so it is a work in progress. We would also like to do real C++ integrations into at least Maya and Blender so users can have full real-time feedback for lighting and shading in the viewport. Those are things we definitely need to improve. But we have bridges to most DCC apps.

BD: I have also heard that you want to integrate it into Gaffer.

FB: We actually completed the integration into Gaffer recently. That is the work of an Appleseed developer called Esteban Tovagliari, who did a terrific job over the last year. Just a few words about Gaffer – it is a lookdev tool by a studio named Image Engine from Vancouver. It is also fully open source under the MIT license, so it is compatible with Appleseed. We are fully integrated into Gaffer, and the next version of Gaffer will actually ship with Appleseed built in. That's by far our best integration so far. With Gaffer and Appleseed together we have a really nice open source lookdev application – something that did not exist before.

BD: You started roughly six years ago, and right now it seems like the features are on a par with other open source path-tracing engines like Cycles. How much is Appleseed used in production right now?

FB: I started in 2009 and made the first version available in July 2010. We have most of the basic features except for volume rendering and subsurface scattering, which we are working on right now. That's a big one, of course. Actually, it is a feature many people ask us for because they want to render characters. We have a form of translucency called "thin translucency" which is very useful for things like tree leaves or grass, but it does not work at all for characters. Both volume rendering and SSS will likely be introduced in 2015 because we are working on a new short film for which we need those features.
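To make the distinction concrete, here is a minimal C++ sketch of the idea behind a thin, non-refracting translucency lobe. The names and structure are illustrative assumptions, not Appleseed's actual code:

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

static Vec3 scale(const Vec3& v, double s) { return {v.x * s, v.y * s, v.z * s}; }
static Vec3 add(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }

// Cosine-weighted direction in the hemisphere around the frame (u, v, n),
// built from two uniform random numbers r1, r2 in [0, 1).
static Vec3 cosine_sample(const Vec3& n, const Vec3& u, const Vec3& v,
                          double r1, double r2)
{
    const double kPi = 3.14159265358979323846;
    const double phi = 2.0 * kPi * r1;
    const double sin_theta = std::sqrt(r2);
    const double cos_theta = std::sqrt(1.0 - r2);
    return add(add(scale(u, sin_theta * std::cos(phi)),
                   scale(v, sin_theta * std::sin(phi))),
               scale(n, cos_theta));
}

// Thin translucency: sample the transmission lobe by flipping the
// shading normal, so the sampled direction continues on the far side
// of the surface with no refraction and no internal transport. This is
// plausible for a leaf or a blade of grass, but it cannot model the
// multiple scattering inside a thick object like a character's skin --
// which is exactly why characters need real SSS instead.
Vec3 sample_thin_translucency(const Vec3& n, const Vec3& u, const Vec3& v,
                              double r1, double r2)
{
    return cosine_sample(scale(n, -1.0), u, v, r1, r2);
}
```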

BD: So you are getting asked for features – who is your user base? Who is using Appleseed and for what?

FB: In our team we have people who use Appleseed in production – we did two documentaries for the BBC last year, and we did an advertisement and some promotional films. That's about it in terms of production use for now. As I said earlier, Appleseed is not integrated well enough into Maya, so it is still a bit difficult to use. People give it a try but are not yet ready to take the plunge and use it commercially. But there is a small community around Appleseed, people using it for character design and so on. We have a lot of interest from people doing character design. That is why we keep getting requests for SSS. It is a really high priority on our feature implementation list.

BD: The focus of your renderer is animation and VFX – how did you design the renderer to be especially useful for animation?

FB: First of all, I have a personal interest in animation. I live in Annecy, the town of the Annecy animation festival, which is a really large festival for animation with worldwide reach. I always wanted to do animation, and I have a close friend who also works in this area. We wanted to do a short film, and we recently did it – Fetch! I also believe there are a lot of open source renderers on the market, but none of them really targets animation. In my previous jobs that is what we were working on, so that is what I know how to do right – motion blur, flicker-free animation and so on.

BD: How good is your current motion blur implementation? How well are curved motion blur, deformation motion blur and so on implemented?

FB: We have really good support for all kinds of motion blur, on a par with commercial renderers like V-Ray or Mental Ray. We support transformation motion blur with as many steps as you want, including curved motion, plus deformation motion blur with as many segments as you want. It is also really fast – we were surprised that motion blur did not cause any issues on Fetch. It is just really fast.
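As a rough illustration of how multi-step transformation blur typically works – sample a shutter time per ray, then interpolate between the two authored steps that bracket it – here is a small C++ sketch. The names are hypothetical, not Appleseed's actual API:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// One transform step. A real renderer would store a full matrix or a
// decomposed translation/rotation/scale (with rotations interpolated
// spherically); translation only here, for brevity.
struct Transform { double tx, ty, tz; };

Transform lerp(const Transform& a, const Transform& b, double t)
{
    return { a.tx + t * (b.tx - a.tx),
             a.ty + t * (b.ty - a.ty),
             a.tz + t * (b.tz - a.tz) };
}

// steps[i] holds the transform at shutter-normalized time i / (N - 1);
// assumes at least one step. Each ray carries its own random time in
// [0, 1]; we pick the two steps that bracket it and interpolate. With
// many steps, the piecewise result approximates curved motion, and
// deformation blur works the same way with per-vertex positions
// instead of object transforms.
Transform eval_motion(const std::vector<Transform>& steps, double time)
{
    if (steps.size() == 1)
        return steps[0];
    const double s = time * static_cast<double>(steps.size() - 1);
    const std::size_t i =
        std::min(static_cast<std::size_t>(s), steps.size() - 2);
    return lerp(steps[i], steps[i + 1], s - static_cast<double>(i));
}
```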

BD: Appleseed is CPU only – do you see a market for it, especially with many small studios and freelancers switching to GPUs right now?

FB: Right now Appleseed is a purely CPU renderer. The reason it is CPU only is that we support programmable shading through OSL, which is CPU only for the moment. We also support other forms of programmability, for example Disney's SeExpr, which allows us to combine layers with formulas. None of this runs on the GPU at the moment, and even if it did, it is questionable whether it would be very efficient. That's why we target only CPU rendering at the moment. We don't have plans for GPU support yet. We always keep an eye on what's going on, and we are interested in GPU rendering, but it does not look like we can use it yet. We are also interested in rendering large scenes with lots of geometry and textures. Right now you cannot really do that on the GPU unless you have an extremely expensive GPU with something like 24 GB of RAM, and there are not many of those on the market. We want to support large scenes and lots of flexibility through programmable shading, both of which are not really feasible on the GPU.
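To illustrate why this kind of programmability favors the CPU, here is a purely hypothetical C++ sketch of expression-driven layer blending in the spirit of SeExpr: an indirect call into arbitrary user logic at every shading point, which is trivial on a CPU but awkward to map onto a GPU. It is not SeExpr's real API, just the shape of the idea:

```cpp
#include <functional>

struct Color { double r, g, b; };

// A "compiled" user expression: given a shading point's layer values
// and UV coordinates, it returns the blended result. In a real system
// this would come from an expression compiler (SeExpr) or an OSL
// shader group, not a hand-written lambda.
using LayerExpr =
    std::function<Color(const Color&, const Color&, double, double)>;

// The renderer makes an indirect call into arbitrary user logic at
// every shading point -- easy on a CPU, but hard to do on a GPU
// without compiling the user code into the rendering kernel itself.
Color shade(const LayerExpr& expr, const Color& base, const Color& coat,
            double u, double v)
{
    return expr(base, coat, u, v);
}

int main()
{
    // Example "formula": an 8x8 checkerboard mask selecting a layer.
    const LayerExpr checker =
        [](const Color& base, const Color& coat, double u, double v) {
            const int cell = static_cast<int>(u * 8) + static_cast<int>(v * 8);
            return cell % 2 == 0 ? base : coat;
        };
    const Color c = shade(checker, Color{1, 0, 0}, Color{0, 0, 1}, 0.3, 0.7);
    (void)c;
    return 0;
}
```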

In terms of market, it is true that GPU rendering is very attractive for smaller studios because it cuts rendering times by an order of magnitude. But we believe there is a market for a more flexible renderer that can do things you may not be able to do with a GPU renderer. You also have to keep in mind that Appleseed is still kind of a hobby project which we do in our free time, and we don't want to fight with GPU incompatibilities, driver problems and things like the split between OpenCL and CUDA. That is another reason why we are not going the GPU route right now.

BD: The other area you want to optimize Appleseed for is VFX – volume rendering and things like voxel and point density data support are crucial for that. What are your plans in that area?

FB: Absolutely! Volume rendering is a major feature we are working on at the moment. Just like with the rest of Appleseed, we want very robust support for volumes, in the sense that we want fully path-traced volumes so the only error we get is noise. As I said earlier, we are working on a new short film that will require volumes, which is why we are working on this right now, together with subsurface scattering. Volume rendering and subsurface scattering are actually interconnected fields – you could implement subsurface scattering as a form of volume rendering, for example. A lot of research about fast path-traced volumes has been published recently, some of it by Solid Angle, the makers of Arnold. There is a lot of cool technology to research and implement.
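As a taste of what "fully path-traced volumes" means in practice, here is a minimal C++ sketch of the standard unbiased building block – drawing a free-flight distance from the exponential transmittance in a homogeneous medium. It illustrates the general technique, not Appleseed's implementation:

```cpp
#include <cmath>
#include <random>

// Distance to the next interaction in a homogeneous medium with
// extinction coefficient sigma_t, sampled from the exponential
// distribution pdf(t) = sigma_t * exp(-sigma_t * t). Sampling the
// true transmittance like this (instead of ray-marching with a fixed
// step) keeps the estimator unbiased: the only error is noise.
double sample_free_flight(double sigma_t, std::mt19937& rng)
{
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    return -std::log(1.0 - uni(rng)) / sigma_t;
}

int main()
{
    std::mt19937 rng(42);
    const double sigma_t = 0.5;  // extinction coefficient, 1/m
    const double max_t = 10.0;   // distance to the surface behind the medium

    if (sample_free_flight(sigma_t, rng) < max_t)
    {
        // Scattering event inside the medium: sample the phase function
        // and continue the path from here (omitted).
    }
    else
    {
        // The ray crossed the medium unscattered: shade the surface.
    }
    return 0;
}
```

Heterogeneous media (smoke, clouds) use the same idea with techniques such as delta tracking, which is where much of the recent research mentioned above comes in.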

BD: Can you tell us a little bit about the new short film?

FB: It is still a bit of a secret. One day I noticed an electrical outlet and realized it looks like a human face. I had had this idea for ages, and finally I realized it had eyes and a mouth and it smiled, so I thought it could be a good character for a short animated film. So we started working on a script involving an outlet and a switch next to each other on a wall in a house in the 1950s. It is going to be a short film again, probably about 2 minutes long. It is going to use photorealistic rendering, probably rendered in 4K. We will use Gaffer for the lookdev and Open Shading Language for all the shading. Volumes will be needed for atmospherics like dust in the air to provide ambience. And there is a child's hand appearing at some point in the film, so we will need subsurface scattering as well.

This interview was first published in German in the magazine Digital Production. BlenderDiplom is now presenting it in its original, untranslated form.

 
