Happy Friday, 3D and VFX Artists! Today I want to talk about a new video that appeared on the web a few days ago, showing how to achieve real-time rotoscoping using the power of Machine Learning.
How to make roto using Machine Learning / AI… can we say goodbye to the green screen?
The video shows how simple and fast it is to generate a roto from footage: three videos are rotoscoped in about 10 minutes using Machine Learning / AI. Can we say that the roto artist's work is under threat? And what is the future of this technology?
Today there are several online tools that help us with rotoscoping using Machine Learning / AI; here are some of them:
Runwayml
www.runwayml.com
You can cut anything from all your videos …
Runway ML's Green Screen is a web tool that lets you instantly cut objects out of your videos with a simple click.
Unscreen is another online platform where you can generate a roto for your footage:
Unscreen
www.unscreen.com
100% Automatically, without a single click
No need to pick pixels, select colors, paint masks or move paths: Unscreen analyzes your video 100% automatically and generates a high-quality result. Simple, good.
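Under the hood, tools like these typically run a segmentation network on every frame: the model classifies each pixel and the result becomes a matte. Just to show the general idea (not the actual model behind Runway ML or Unscreen, which they don't publish), here is a minimal Python sketch using a generic pretrained segmentation model from torchvision:

```python
import torch
import torchvision
from torchvision import transforms
from PIL import Image

# A generic pretrained segmentation model (DeepLabV3). This is NOT the model
# Runway ML or Unscreen actually use; it just illustrates the per-frame idea.
model = torchvision.models.segmentation.deeplabv3_resnet50(weights="DEFAULT")
model.eval()

# The preprocessing the pretrained weights expect (ImageNet normalization).
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def person_matte(frame_path: str) -> Image.Image:
    """Return a rough black-and-white 'person' matte for a single frame."""
    frame = Image.open(frame_path).convert("RGB")
    batch = preprocess(frame).unsqueeze(0)          # add batch dimension

    with torch.no_grad():
        logits = model(batch)["out"][0]             # [num_classes, H, W]

    labels = logits.argmax(0)                       # per-pixel class index
    person = 15                                     # "person" in the VOC label set
    matte = (labels == person).byte() * 255         # 0 = background, 255 = subject
    return Image.fromarray(matte.cpu().numpy(), mode="L")

# Run it over extracted frames, e.g. frame.0001.png, frame.0002.png, ...
# person_matte("frame.0001.png").save("matte.0001.png")
```

The commercial tools add a lot on top of a raw per-frame mask like this (temporal smoothing, edge refinement, hair detail), and that extra work is exactly where the quality differences show up.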
Can we say that the roto artist's work is under threat?
Honestly, I think the web is probably the least demanding platform for this sort of thing: these products are great for creating videos for YouTube or social media.
For well-lit scenes and slow movements, these tools may be absolutely fine even now, but in my opinion we need a few more years before they become the main tool.
What is the future of visual effects?
On the other hand, if we want to work at a higher level, a tool I recommend to speed up roto work is the new auto-rotoscoping feature in DaVinci Resolve: Magic Mask runs on the DaVinci Neural Engine and is built into the color grading workflow.
Magic Mask in Resolve
/magic-mask-in-davinci-resolve-17/
Magic Mask has specific controls that you can set based on what you want to mask. It has a setting to quickly select an entire person or particular features such as the face, legs, clothes or arms.
An auto-rotoscoping feature in Nuke?
In the near future, it will be interesting to see how Nuke uses AI technology for rotoscoping; if the technology eventually becomes just another node in Nuke, it will be a thousand times more interesting.
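Just to make that idea concrete, here is a small, speculative sketch of how an AI roto node could sit in a Nuke Python script. Nuke has no such node today: the node name "MLRoto" and its "subject" knob are pure assumptions on my part; only the Read and Roto nodes around it are real.

```python
import nuke

# A real Read node pointing at the plate (the path is just an example).
read = nuke.createNode("Read")
read["file"].setValue("shot/plate.####.exr")

# Today: a manual Roto node that an artist keyframes shape by shape.
manual_roto = nuke.createNode("Roto")
manual_roto.setInput(0, read)

# Tomorrow (speculative): a single ML node that outputs a matte for a chosen
# subject, which the artist then refines with the usual tools.
# The node name "MLRoto" and the "subject" knob do not exist in Nuke today.
# ml_roto = nuke.createNode("MLRoto")
# ml_roto.setInput(0, read)
# ml_roto["subject"].setValue("person")
```

The appeal of the "just a node" approach is that the matte would flow downstream like any other channel, so the artist keeps full control over refining and compositing it.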
Also, check out this video where VFX artists compete against artificial intelligence, using After Effects and Mocha Pro for rotoscoping, to see how to speed up the roto job.
I'm curious to hear everyone's thoughts on this new tech! Write what you think in the comments!