Has anyone here already tried to implement Live View on the Nreal glasses?
I'm having some issues connecting the Live View functions to the glasses themselves (using the camera mounted on the glasses, but mainly using the glasses as the "main screen" — that is, showing interactive objects and directions on the glasses rather than on the phone).
Thanks for any help/info.
Could you clarify what you are trying to do?
My goal is to integrate Google Live View directly into the Nreal glasses.
Meaning that I do not want to use the Live View functionality on the phone, but on the glasses themselves (they would project interactive objects, such as arrows, directly into the glasses' display). Do you have any experience with this functionality? (The Live View I mean: What is Google Maps AR navigation and Live View and how do you)