Are you able to interact with those virtual MRTK objects in the scene? Hand tracking works for me as well, and you do not need MRTK for that; it's built into NRSDK.
Yes, I am able to interact with the objects in the MRTK example scene. My intention was to use Microsoft's MR UX and other SDK capabilities (spatial mapping / shared anchors). Although NRSDK has hand tracking, not all of the gestures available on HoloLens are supported, so we have to make do with NRSDK's hand-tracking gestures for interaction.
Hello. I'm using Unity 2019.4.31f1 and have an issue with hand tracking. After opening the app on my Nreal development kit, I get the message "Not supporting Handtracking calculation", and the device does not perform hand calculation. Is there any way to fix it? I followed the instructions in the video above.
What phone do you use? Some phones aren't supported, I think.
I'm not using a smartphone. I'm using the Nreal dev kit, installed the app from an APK, and viewed it using scrcpy.
I also tried upgrading my dev kit's computing unit, but it says the version is up to date. Is there anything else I can try?
After correcting the time on the dev kit's computing unit, it works properly! If anyone has the same issue, try fixing the time!
Has anyone had any success setting up gaze tracking using MRTK and nReal? We are trying to build a hands-free interface in which the user can select inputs such as buttons using head tracking/gaze.
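Not an official answer, but here is a minimal sketch of the hands-free idea: casting a ray from the head-locked camera and selecting a target after a dwell time. This uses only plain Unity physics raycasts; the class names `HeadGazeSelector` and `GazeTarget` are hypothetical, and the actual MRTK/NRSDK gaze provider APIs may differ.

```csharp
// Hypothetical sketch: dwell-based head-gaze selection with plain Unity raycasts.
using UnityEngine;
using UnityEngine.Events;

// Attach to any object with a collider to make it gaze-selectable.
public class GazeTarget : MonoBehaviour
{
    public UnityEvent onSelected;
}

public class HeadGazeSelector : MonoBehaviour
{
    public float maxDistance = 10f; // how far the gaze ray reaches
    public float dwellTime = 1.5f;  // seconds of steady gaze to "click"

    private GazeTarget current;
    private float dwell;

    void Update()
    {
        // Cast a ray from the center of the head-locked camera.
        var cam = Camera.main.transform;
        GazeTarget hitTarget = null;
        if (Physics.Raycast(new Ray(cam.position, cam.forward), out RaycastHit hit, maxDistance))
            hitTarget = hit.collider.GetComponent<GazeTarget>();

        if (hitTarget != null && hitTarget == current)
        {
            dwell += Time.deltaTime;
            if (dwell >= dwellTime)
            {
                current.onSelected.Invoke(); // fire the button's handler
                dwell = 0f;
            }
        }
        else
        {
            current = hitTarget; // gaze moved to a new (or no) target
            dwell = 0f;
        }
    }
}
```

Dwell selection avoids needing any hand or controller input at all; MRTK also ships its own gaze provider and dwell components, which may be a better fit if you are already using its input system.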
I'm having a lot of trouble with hand tracking; it's not accurate at all. I followed @robi's advice and disabled the pointers on the hands, but the raycast still jumps around wildly.
Hi, please check our latest V1.9.3 NRSDK + MRTK project.