ARCore Depth Lab is a Tools application developed by Google Samples, and with the Android emulator LDPlayer you can download and run ARCore Depth Lab on your computer.
Running ARCore Depth Lab on your computer lets you view it clearly on a large screen, and controlling the application with a mouse and keyboard is much faster than using a touchscreen, all without ever having to worry about device battery issues.
With multi-instance and synchronization features, you can even run multiple applications and accounts on your PC.
And the file-sharing feature makes it easy to move images, videos, and other files between your PC and the emulator.
Download ARCore Depth Lab and run it on your PC to enjoy the large screen and high-definition quality!
Download and install LDPlayer on your computer
Locate the Play Store in LDPlayer's system apps, launch it, and sign in to your Google account
Enter "undefined" into the search bar and search for it
Choose and install undefined from the search results
Once the download and installation are complete, return to the LDPlayer home screen
Click the ARCore Depth Lab icon on the LDPlayer home screen to start using the app
If you've already downloaded the APK file from another source, simply open LDPlayer and drag the APK file directly into the emulator.
If you've downloaded an XAPK file from another source, please refer to the tutorial for installation instructions.
If you've obtained both an APK file and OBB data from another source, please refer to the tutorial for installation instructions.
Using the Object Placement sample on a Pixel 4a 5G, when I try to place a virtual object behind a physical object in the environment, a white box covering the virtual object is visible through the physical object. Is this the intended behavior? Also, many times the virtual object appears in front of physical objects, as it would without the Depth API. Does object placement work properly for anyone on Google Pixel models that support the ARCore Depth API, or does it only work properly on phones with ToF sensors?
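For anyone debugging this on their own device: Depth Lab itself is a Unity app, but the underlying checks map to the ARCore SDK for Android. On phones without a ToF sensor, depth is estimated from motion and tends to be noisy at object edges, which can produce exactly these occlusion artifacts. Below is a minimal, hedged Java sketch of checking Depth API support and grabbing a depth image; the surrounding session setup is assumed to already exist in your app, and the class name is illustrative.

```java
import android.media.Image;
import com.google.ar.core.Config;
import com.google.ar.core.Frame;
import com.google.ar.core.Session;
import com.google.ar.core.exceptions.NotYetAvailableException;

public class DepthSupportCheck {
    // Enables depth only when the device reports support for it.
    // `session` is assumed to be an already-created ARCore Session.
    public static void enableDepthIfSupported(Session session) {
        Config config = session.getConfig();
        if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
            // AUTOMATIC works both on ToF-equipped phones and on supported
            // non-ToF phones (depth-from-motion), with varying edge quality.
            config.setDepthMode(Config.DepthMode.AUTOMATIC);
        } else {
            config.setDepthMode(Config.DepthMode.DISABLED);
        }
        session.configure(config);
    }

    // Grabs the latest depth image used for occlusion; returns null if depth
    // is not ready yet (common during the first seconds of a session).
    public static Image tryAcquireDepth(Frame frame) {
        try {
            return frame.acquireDepthImage16Bits();
        } catch (NotYetAvailableException e) {
            return null;
        }
    }
}
```

If `isDepthModeSupported` returns false, occlusion simply cannot work on that device; if it returns true but depth is still poor, the limits of depth-from-motion on non-ToF hardware are the likely cause.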
This app has blown my mind. On the side, I work with depth-based object tracking from pictures and videos, which takes hours to process on my beast of a rig, and this app does the same thing in real time. Incredible demo!
Raw LiDAR (RGB-D) data saving is missing :( I like this app, but I'd like to have an option to save raw data frames like depth, confidence, RGB and IMU.
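On the raw-data point: the app does not expose an export option, but the ARCore Raw Depth API can provide the per-frame depth and confidence images the review mentions (RGB can be read with the camera image API, and IMU data would come from the standard Android sensor APIs rather than ARCore). A hedged Java sketch of acquiring the raw depth pair, with writing to storage left to the caller and the class names being illustrative:

```java
import android.media.Image;
import com.google.ar.core.Frame;
import com.google.ar.core.exceptions.NotYetAvailableException;

public class RawDepthGrabber {
    // Holds one frame's worth of raw depth data; persisting it (e.g. as
    // binary or PNG files) is up to the caller.
    public static class RawDepthFrame {
        public final Image rawDepth;      // 16-bit depth values in millimeters
        public final Image rawConfidence; // 8-bit per-pixel confidence

        RawDepthFrame(Image rawDepth, Image rawConfidence) {
            this.rawDepth = rawDepth;
            this.rawConfidence = rawConfidence;
        }

        // Images must be closed when done, or ARCore runs out of image buffers.
        public void close() {
            rawDepth.close();
            rawConfidence.close();
        }
    }

    // Returns null if raw depth is not yet available for this frame.
    public static RawDepthFrame tryAcquire(Frame frame) {
        try {
            Image depth = frame.acquireRawDepthImage16Bits();
            Image confidence = frame.acquireRawDepthConfidenceImage();
            return new RawDepthFrame(depth, confidence);
        } catch (NotYetAvailableException e) {
            return null;
        }
    }
}
```

This assumes the session was configured with a depth mode that produces raw depth; the sketch only shows acquisition, not the file format or IMU logging.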