Help with surface detection / ground anchoring issues

OK, I’m having quite a time getting objects to appear properly sized and anchored to the ground (I’m using Studio). The ultimate goal is to have a few virtual landmarks surrounding the user, which they can approach and interact with through their phone in AR. From what I’ve read, placing objects at y=0 (and ensuring their pivot is correct) should put them at ground level once the app starts, relative to the player camera. With that in mind, I should be able to place some boxes around the origin and have the player see them on the ground and walk up to them. Instead, objects at y=0 all seem to be floating, and I’m struggling to position them at ground level even when spawning them dynamically at runtime.

Here’s a simple test to reproduce the issue:

  1. Clone the World Effects project. https://www.8thwall.com/8thwall/world-effects
  2. Change the ground plane to an unlit material and make it slightly transparent.
  3. Launch the app. Note that the ground plane is at chest height and obscures most of the world.
  4. Tap to place some cacti on varying surfaces. You’ll notice that they are placed properly at surface level, NOT at y=0 where the ground plane is.

Or here’s a link to a version of the project I’ve set up, which also includes a sphere for perspective: 8th.io/5cwvy

A few things I’ve tried which didn’t help:

  1. Adding the coaching overlay from the Absolute Scale demo. It didn’t help, and it swaps back and forth between ‘normal’ and ‘limited’ statuses so frequently that the UX would be terrible, prompting the user to recalibrate constantly.
  2. Getting the same touch position that cactus placement uses and attempting to relocate the environment to that position. Y is always a tiny number and essentially remains at 0.
  3. I considered that the objects might be scaled incorrectly or sitting further along the ground than I expected, but that doesn’t seem to be it. In absolute scale, all content seems to move with the phone vertically: if I lower my phone so that I can see ‘underneath’ the sphere, both the sphere and the ‘horizon’ stay in the same place on my screen, and everything shifts vertically along with the phone instead of staying put.
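To clarify attempt 2, the math I used to turn the touch into a ground position was a plain ray/plane intersection, roughly like this (a sketch with made-up names, not Studio’s actual API; `origin` and `dir` would come from unprojecting the touch through the camera):

```javascript
// Hypothetical helper: intersect a camera ray with the horizontal plane y = planeY.
// `origin` and `dir` are plain {x, y, z} objects; `dir` need not be normalized.
function intersectRayWithGround(origin, dir, planeY) {
  if (Math.abs(dir.y) < 1e-6) return null   // ray is parallel to the ground
  const t = (planeY - origin.y) / dir.y
  if (t < 0) return null                    // the plane is behind the camera
  return { x: origin.x + t * dir.x, y: planeY, z: origin.z + t * dir.z }
}
```

Whatever point this returns has its y pinned to the plane height by construction, which matches what I saw: relocating the environment to the touch position never changed the vertical offset.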

Other context:

  • I’m using a Galaxy S9, but we saw similar issues on a more recent iPhone, a Galaxy S21, and a Pixel 7a (and in a different location).
  • It seems like there may be a moment while the app is loading when the ground level is ‘correct’, but by the time the world camera has loaded, everything is floating.
  • I’m pretty sure that I should be using an Absolute Scale camera for this kind of experience, but the issue persists whether I am using Absolute or Responsive scale. The demo linked above is currently set to Responsive still, since that’s what the World Effects demo uses.
  • Across all these tests, the environment sometimes follows me as I walk around rather than staying put, which seems wrong.

It does seem better if I test outside in a completely open space, but even then, if I try to approach an object, the ground plane likes to shift around and my objects disappear or get resized, despite being in absolute scale mode. I’m struggling to get a remotely usable result and it feels like I must be missing something really obvious. Can anyone offer any insight?

Thanks!

Hi Logan, welcome to the forums!

I’d be happy to look into this and provide some insights.

For context, World Effects AR comes in two types: Responsive and Absolute.

Responsive mode works as soon as any flat surface is detected. However, if maintaining a 1:1 scale with real-world objects is crucial, this mode may not be ideal since it lacks enough spatial data to determine precise sizing.

Absolute mode solves this by requiring users to move their device around first, gathering spatial information before content appears. The tradeoff is that this introduces a delay in displaying objects.
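If it helps, here’s roughly how that distinction is wired up in the classic 8th Wall web API (a sketch from memory; Studio’s ECS wiring differs, and the exact option and event names should be checked against the current docs):

```javascript
// Pure helper: consider tracking settled once the last `n` reported statuses
// are all 'NORMAL'. This part is plain JS, independent of the SDK.
function isTrackingSettled(history, n = 5) {
  return history.length >= n && history.slice(-n).every((s) => s === 'NORMAL')
}

const statusHistory = []

// Guarded so the sketch is harmless outside an 8th Wall page.
if (typeof XR8 !== 'undefined') {
  XR8.XrController.configure({ scale: 'absolute' })   // request 1:1 real-world scale
  XR8.addCameraPipelineModule({
    name: 'tracking-gate',
    listeners: [{
      event: 'reality.trackingstatus',                // reports 'LIMITED' / 'NORMAL'
      process: ({ detail }) => {
        statusHistory.push(detail.status)
        if (isTrackingSettled(statusHistory)) {
          // Safe point to reveal world content.
        }
      },
    }],
  })
}
```

The idea is to delay revealing content until the status has been NORMAL for several consecutive updates, rather than the first time it flips.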

To investigate further, I cloned the World Effects sample project you linked and added a repeating texture to the ground plane to better visualize where it’s being placed.

I’m not noticing any issues on my end. Could you provide a video showing the issue you’re experiencing? Also, keep in mind that reflective surfaces or repeating patterns with no unique features can cause tracking inconsistencies.

Heya, thanks so much for the response! I have a few questions based on what you said, if you don’t mind:

  1. You mentioned that absolute scale mode introduces a delay before content appears while the device collects surface data. It feels like what I’m seeing is that there is no delay, and the world ‘loads’ before the surface detection completes, and then everything from the scene is still incorrectly placed, even once the surfaces have been detected. So:
    A) In Absolute Scale mode, should the world reposition itself as new surfaces are detected, and is there a way to control when that happens? When I was testing outside and tried to approach one of the cubes, it only sort of got closer, and then the whole world repositioned and the cube moved behind me. I did play with the ‘recenter’ functionality some; it didn’t seem to affect the y axis, and mostly I want the experience to remain anchored once it’s loaded, just at the correct location.
    B) Should the content hold off on loading until surface detection has completed, or is that a flow I need to build myself? I did try putting the environment in its own Space and only loading that space once surface detection was working, but everything was still floating. I’m not sure I tried a complete flow where the coaching overlay loads, the user completes it, and then the new space loads, so I could try that if you think it will help.
  2. Is this a reasonable approach for what I’m trying to do, given that the experience may ultimately take place on a grass lawn? Maybe that’s too repetitive a surface for ground detection to work well? Is there a better way to build world-anchored experiences like this that doesn’t involve all the VPS setup/approval?
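To make the flapping part of 1A concrete, here’s the kind of ‘hold a status before trusting it’ helper I had in mind building, so the UX doesn’t prompt a recalibration on every brief LIMITED blip (entirely hypothetical on my side, not something from the SDK):

```javascript
// Hypothetical debouncer: only report a tracking-status change after the new
// status has persisted for `holdMs` milliseconds of wall-clock time.
function makeStatusDebouncer(holdMs = 1500) {
  let reported = 'LIMITED'   // assume limited tracking until proven otherwise
  let pending = null
  let pendingSince = 0
  return (status, nowMs) => {
    if (status === reported) { pending = null; return reported }
    if (status !== pending) { pending = status; pendingSince = nowMs }
    if (nowMs - pendingSince >= holdMs) { reported = pending; pending = null }
    return reported
  }
}
```

I’d feed each raw status event (plus a timestamp) into this and only drive the coaching overlay from its return value.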

And here are some videos showing the issues I’m having. These are all in Absolute Scale mode, and I added a texture to the ground like you suggested. Sorry for the YouTube Shorts, but I’m not allowed to upload yet :sweat_smile:

First video (indoors, no coaching):

You can see that at first the app loads and the ground plane looks reasonable; then the camera loads and the plane shifts up some; then the camera finishes loading and the ground plane sits at ‘phone’ height. Next I place some cacti, which actually seem to land on the ground plane, and the plane then shifts up some more of its own accord. Finally I lower my phone straight down, and you can see that the floor moves up and down with it rather than being anchored anywhere. All of this seems wrong to me.

(Other videos coming in the next couple of posts since I can only do one embed per)

Thanks again for the help!

Second video (indoors, coaching):

I implemented the coaching overlay from the Absolute Scale project. In this test it didn’t flip back and forth between Limited/Normal as much as it sometimes does, but you can see that the ground is pinned to the phone’s height as I move it vertically, and the ground plane shifts around quite a bit as I move the phone.