Will the camera's orientation when starting the project affect the positive and negative directions of the virtual space?

I am currently working on an object-spawning project that uses GPS. Each time I calculate the target's longitude and latitude and the distance to the user, the result seems roughly correct. However, the virtual space appears to differ depending on the camera's orientation when the project is opened. Suppose I open the project facing forward and the system calculates the relative position as (1, 1). If I close the project, turn to face left, and reopen it, the calculated position is still (1, 1), so the spawned object always ends up in the same direction relative to my camera.
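For context, a minimal sketch of the kind of GPS-to-offset calculation described above (the helper name gpsToLocalOffset and the equirectangular approximation are assumptions, not taken from the original post):

const EARTH_RADIUS_M = 6371000

// Convert a target latitude/longitude into an (east, north) offset in metres
// from the user's current position, using an equirectangular approximation.
function gpsToLocalOffset(
  userLat: number, userLon: number,
  targetLat: number, targetLon: number
): {east: number, north: number} {
  const toRad = (deg: number) => (deg * Math.PI) / 180
  const dLat = toRad(targetLat - userLat)
  const dLon = toRad(targetLon - userLon)
  const north = dLat * EARTH_RADIUS_M
  const east = dLon * EARTH_RADIUS_M * Math.cos(toRad(userLat))
  return {east, north}
}

// If this offset is applied directly as scene (x, z) coordinates, it is
// expressed in the session's coordinate frame, whose axes are fixed by the
// camera's orientation at startup -- which is why the object always appears
// in the same direction relative to where the camera was facing.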

This behavior is expected. If you're calculating the position relative to the camera, you'll likely need to obtain it in world space instead.

In three.js, you can use:

https://threejs.org/docs/#api/en/core/Object3D.getWorldPosition

To access three.js, you can use:

const {THREE} = window as any
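As a rough sketch of that approach (the scene and camera variables here are assumptions, not from the original post), you can read the camera's world-space position and place the object in world coordinates rather than as a child of the camera:

const {THREE} = window as any

// getWorldPosition fills the target vector with the camera's position in world space.
const cameraWorldPos = new THREE.Vector3()
camera.getWorldPosition(cameraWorldPos)

// Place the object at a world-space offset from the camera's current position,
// instead of parenting it to the camera (which would keep it locked to the view).
const object = new THREE.Mesh(
  new THREE.BoxGeometry(0.5, 0.5, 0.5),
  new THREE.MeshNormalMaterial()
)
object.position.set(cameraWorldPos.x + 1, cameraWorldPos.y, cameraWorldPos.z + 1)
scene.add(object)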

I’ve tried for a few days, but it still doesn’t work as well as expected. I’ll keep experimenting to see if there are any other solutions. Thank you for your reply and help.