Hi all!
My group and I are working on a project creating an AR experience to complement a live performance, where the audience can use their mobile devices to look around the room and see different animated 3D models. For example, we're looking to add fire on the walls, a rising phoenix, and some light beams converging on the artist.
We want to synchronize these animations and assets with the performance itself (different animations for different parts of the song), but we aren't sure how.
A group member suggested using timecode, and I wanted to see if anybody has implemented something like this, or knows of another way to synchronize animations to some shared form of timekeeping.
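To make the idea concrete, here's a rough sketch of the kind of cue-based approach we're imagining, independent of any particular AR framework. The `/time` endpoint, the cue names, and the assumption that a show controller publishes the song's start time are all placeholders, not anything we've built:

```typescript
interface Cue {
  atMs: number;       // song position (ms) at which this cue begins
  animation: string;  // which animated asset to trigger
}

// Cue sheet mapping song position to the animation that should be active.
const CUES: Cue[] = [
  { atMs: 0,       animation: "idle" },
  { atMs: 30_000,  animation: "wallFire" },
  { atMs: 95_000,  animation: "risingPhoenix" },
  { atMs: 150_000, animation: "lightBeams" },
];

// Estimate the offset between this device's clock and the server's clock
// using a single round trip (a crude NTP-style measurement; averaging
// several samples would be more robust). Assumes GET <url> returns JSON
// like { "epochMs": 1700000000000 } -- a hypothetical endpoint.
async function estimateClockOffset(url: string): Promise<number> {
  const t0 = Date.now();
  const res = await fetch(url);
  const serverNow: number = (await res.json()).epochMs;
  const t1 = Date.now();
  const roundTrip = t1 - t0;
  // Assume the server's timestamp was taken halfway through the round trip.
  return serverNow + roundTrip / 2 - t1;
}

// Given the song's start time on the server clock and our measured offset,
// compute the current song position and pick the cue that should be active.
function activeCue(songStartServerMs: number, clockOffsetMs: number): Cue {
  const nowServerMs = Date.now() + clockOffsetMs;
  const positionMs = nowServerMs - songStartServerMs;
  let current = CUES[0];
  for (const cue of CUES) {
    if (cue.atMs <= positionMs) current = cue;
    else break;
  }
  return current;
}
```

The appeal of something like this is that each phone only needs a one-time clock sync and the song's start time, rather than a continuous timecode stream, but we don't know if that's accurate enough in practice or if real timecode (SMPTE/LTC or similar) is the more standard route.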
Any help is appreciated! Thank you!