Hello everyone, the 3D objects in my scene stutter/shake slightly even when the device is held still. Is there any way to resolve this?
Can you share a video or GIF of what you are experiencing? More information would be helpful.
This thread may also help give more information:
Hi Ian, would you have an email I can reach you at?
Please reach out to support@8thwall.com and the team can assist.
In some cases this is called the "skyscraper" effect. Our AR software detects the largest detectable surface in front of the camera. We can see in our example project here:
That the surface itself is tracking quite well (at y=0), and 3D models close to that surface are tracking well too. However, very small camera corrections happen every time the camera moves a tiny amount, so the further away you get from the detected surface in the y direction, the more a kind of "skyscraper" effect appears: the bottom of the skyscraper is stable and might move a tiny bit, but that movement multiplies as you go up in height. This becomes even more obvious as you get closer to a 3D model, which it seems you need to do in your AR experience.
We have had users with similar questions in the past, specifically with image target projects, and they implemented some lerping logic so the object follows the camera more gradually.
https://threejs.org/docs/#api/en/math/Vector3.lerp
However, there is a downside: the objects may keep moving slightly after the camera has stopped, giving a sense of drifting. We saw success incorporating this logic into a custom image target project.
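For reference, here is a minimal sketch of that kind of smoothing, assuming a three.js scene where the raw pose from the image-target tracking update is written into `trackedPosition` / `trackedQuaternion` each frame. The names and the 0.15 smoothing factor are illustrative, not from an official 8th Wall sample.

```js
import * as THREE from 'three'

// Latest raw pose from tracking (assumed to be updated elsewhere,
// e.g. in an image-target update handler).
const trackedPosition = new THREE.Vector3()
const trackedQuaternion = new THREE.Quaternion()

// The model to stabilize.
const model = new THREE.Object3D()

// Lower = smoother (less jitter) but more perceived drift/lag.
const SMOOTHING = 0.15

// Call once per render frame: move a fraction of the way toward the
// raw tracked pose instead of snapping straight to it.
function onFrame() {
  model.position.lerp(trackedPosition, SMOOTHING)
  model.quaternion.slerp(trackedQuaternion, SMOOTHING)
}
```

A lower smoothing factor hides more of the jitter but increases the drifting effect mentioned above, so the value usually needs tuning per experience.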
Hi Ian,
Regarding the solution you mentioned above using Vector3.lerp, do you have some examples of this implementation?
I'm currently working on a project with a similar approach and the jittering is noticeable.
It gets worse especially when I'm close to the model.