Monet Drip is an augmented-reality-enhanced art piece whose development I handled.
When I was brought onto the project, the canvas had already been created; my job was to complete the vision by creating an oil simulation that behaved in a very specific way, communicating the complex paths that artworks can take.
There were a number of challenges involved in bringing this to life. Fluid simulation is computationally intensive, but perfectly doable in a pre-rendered format such as film or television. Fluid simulation in an augmented reality application, however, is a whole other beast.
In real-time formats such as AR, fluid cannot be truly simulated on the fly; consumer hardware simply does not have the capability yet. So the effect has to be approximated.
There were a couple of potential ways to do this. The simulation could have been done in Blender and cached: a model generated for each frame, then recalled on playback in AR. However, this approach had a couple of limitations.
The cached data would have been too large for a mobile app, and it would not have allowed for real-time simulation.
On the other hand, since the movement of the oil had to be so specific, it would have been impractical to compute any part of the simulation in real time, as this could change its outcome. It had to appear to be a simulation, but in fact be pre-determined.
The solution to this technical challenge ended up being rather simple. The simulation was done in Blender, using a combination of forces and a careful balance of fluid parameters. It was then exported as a video, which was heavily manipulated in After Effects. Finally, I imported the video into Unity; the AR activation is a prefab containing video playback of the simulation, along with some shader trickery to make it appear as though the oil was interacting with light.
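To give a sense of the kind of lighting trick involved: a common way to fake a wet, glossy surface on flat video is to layer a specular highlight on top of it, driven by an assumed surface normal rather than real geometry. The sketch below is purely illustrative, not the project's actual shader; it shows the Blinn-Phong specular term that this sort of effect typically builds on, written in plain Python for readability (in practice it would live in shader code).

```python
import math

def normalize(v):
    """Scale a 3D vector to unit length."""
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

def blinn_phong_specular(normal, light_dir, view_dir, shininess=64.0):
    """Blinn-Phong specular intensity in [0, 1].

    normal    -- assumed surface normal (e.g. sampled from a normal map)
    light_dir -- direction from the surface toward the light
    view_dir  -- direction from the surface toward the camera
    shininess -- higher values give a tighter, glossier highlight
    """
    n = normalize(normal)
    l = normalize(light_dir)
    v = normalize(view_dir)
    # Half-vector between light and view directions.
    h = normalize(tuple(lc + vc for lc, vc in zip(l, v)))
    n_dot_h = max(0.0, sum(nc * hc for nc, hc in zip(n, h)))
    return n_dot_h ** shininess

# Highlight is strongest when light and camera align with the normal,
# and falls off sharply at grazing angles.
head_on = blinn_phong_specular((0, 0, 1), (0, 0, 1), (0, 0, 1))
grazing = blinn_phong_specular((0, 0, 1), (1, 0, 0), (0, 0, 1))
```

Animating the light direction or perturbing the assumed normals over time is what sells the illusion that a flat, pre-rendered video of oil is catching the light of the viewer's environment.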
