Is Mobile Lidar any Good for VFX? - Polycam and Syntheyes

In this video we take a look at the lidar capabilities of the iPhone 16 Pro and whether they're good enough for VFX work, along with a basic workflow using Polycam, Blender, Syntheyes, and Resolve Fusion. First we use Polycam to do our scan, although other scanning apps would likely work too. Then we bring the scan into Blender, where we re-scale it and position it at the origin.
When that's done, we export the model for 3D tracking in Syntheyes. Creating seed points on the mesh lets the model assist the solve. Once we have an accurate camera solve, we export the camera to DaVinci Resolve Fusion, where we set up a test render to check our accuracy. While the accuracy is impressive, it certainly can't compete with professional lidar scanning, but there is still a lot we can do with mobile scanning.
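The re-scale step comes down to simple arithmetic: measure a distance you know in the real scene, measure the same distance in the scan, and scale uniformly by their ratio. Here's a minimal sketch of that math in plain Python; the doorway measurements are hypothetical, and applying the factor in Blender (via the object's scale and Apply Transform) is up to you.

```python
# Hypothetical rescale math for a mobile lidar scan, as described above:
# mobile scans often come in slightly off real-world scale, so we measure
# a known distance (e.g. a doorway) and derive a uniform scale factor.

def scale_factor(real_world_m: float, scanned_m: float) -> float:
    """Uniform factor that maps the scanned measurement onto the real one."""
    if scanned_m <= 0:
        raise ValueError("scanned measurement must be positive")
    return real_world_m / scanned_m

# Example: a doorway we know is 2.03 m tall measures 1.95 m in the scan.
factor = scale_factor(2.03, 1.95)
# In Blender you would then scale the scan object by this factor on all
# axes and apply the transform before exporting the mesh to Syntheyes.
```

Getting the scale right before tracking matters because Syntheyes will inherit whatever units the mesh carries, and a correctly scaled solve saves guesswork later in Fusion.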

***HOLIDAY SPECIAL 20% OFF the entire site through Friday Dec 20th!
Use code 'HOLIDAY20' at checkout.

**VFX Courses** - https://www.prophetless.com
Fusion Foundations - https://www.prophetless.com/fusionfoundations
Syntheyes Masterclass - https://www.prophetless.com/syntheyesmasterclass

Get 15% off Syntheyes with our affiliate link here, discount on checkout:
https://borisfx.com/store/affiliate/?product=syntheyes&host=standalone&purchase-options=new-annual-subscription&a_aid=65d512a4435f0&a_bid=3878d814

And if you like our videos please consider subscribing!

We've had people reach out asking how they can help. If you feel the videos add value, we've set up a PayPal.Me account where you can support us:
https://paypal.me/prophetless

00:00 - Intro
01:11 - Overview of the scan in Blender
02:12 - Bring the Mesh into Syntheyes
04:38 - Adding trackers
07:36 - Setting seed points on our mesh
09:27 - Solving from seed points
11:27 - Using constraints
12:52 - Export to Fusion
13:31 - Test composite in Fusion
16:32 - Test Render
