Hi there,
I might just be being dense, but I can't work out how this was done.
I understand image tracking was used to cover up the real-world painted section. What's stumping me is how the 3D experience stays tracked once the camera moves away from the image target.
At the moment, the only thing I can think of is using a VPS experience so that the general environment is used as the track, roughly as in the sketch below. But this demo came out with just the Image Tracking release, sooooo…
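Just to make my guess concrete, this is the kind of hand-off I was imagining: grab the pose from the image target once, leave the content anchored in world space, and let whatever environment tracking is running (VPS or plain world tracking) keep it registered after the image leaves the frame. This is only a sketch in generic three.js-style TypeScript; `onImageFound` / `onImageLost` are hypothetical callbacks standing in for whatever the SDK actually fires, not the demo's real code.

```typescript
import * as THREE from "three";

const scene = new THREE.Scene();

// Content group placed in world (session) space the first time the image
// target is found, then left there so the underlying environment tracking
// keeps it registered even after the image leaves the camera view.
const content = new THREE.Group();
scene.add(content);

let placed = false;

// Hypothetical SDK callback: fires with the image target's world-space pose.
function onImageFound(position: THREE.Vector3, rotation: THREE.Quaternion) {
  if (!placed) {
    content.position.copy(position);
    content.quaternion.copy(rotation);
    placed = true; // lock the pose; stop following the image afterwards
  }
}

// Hypothetical SDK callback: image target is no longer visible.
function onImageLost() {
  // Do nothing: the content stays at its last world pose, relying on the
  // environment tracking to keep the camera pose accurate.
}
```

That's my best theory, but I don't see how it would have worked with only the Image Tracking release available at the time.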
Any support or direction would be great, thank you!