14.01.2021

Hololens Github

Sample highlights

HoloLensForCV-Unity is a sample project which incorporates the HoloLensForCV sample from Microsoft into a C# project in Unity using IL2CPP Windows Runtime Support.

Windows Runtime Support in Unity

The project enables access to all HoloLens research mode streams through Unity C# code using the HoloLensForCV Windows Metadata file.

To enable Windows Runtime Support, the ENABLE_WINMD_SUPPORT #define directive is used to check that the project has runtime support enabled. Any call to a custom .winmd Windows API must be enclosed within this #define directive.

Windows Runtime Support makes it far easier to integrate other libraries into Universal Windows Projects than it was using dynamic-link libraries (DLLs). Microsoft has detailed documentation on implementing Windows Runtime components in C++. Below is a simple example.

Runtime Support Example

C++ sample

In Visual Studio, create a new C++ Windows Runtime Component project.

Build the simple C++ sample and copy all output files to the Assets->Plugins->x86/x64 folder.

Unity C# sample

It is now easy to access the above C++ WinRT component in Unity C# through IL2CPP build support. Open the C# Unity scripts in Visual Studio and be sure to select the Assembly-CSharp.Player viewing option for code; otherwise, code inside the ENABLE_WINMD_SUPPORT #define directive will appear greyed out.

Build the Unity project with the runtime sample script attached to a game object and deploy to a UWP device. The WinRT component should be accessible through the Unity C# environment, though not through the Unity Editor window.

HoloLensForCV-Unity sample

As mentioned, this sample enables access to the HoloLens research mode streams and uses OpenCV for real-time processing of sensor frames streamed from the Microsoft HoloLens device in a Unity C# environment using IL2CPP Windows Runtime Support. Here is a link to the project on my personal Github repo.

The sample includes a Unity project which maps the short-throw time-of-flight (ToF) depth stream to the PV (photo/video) camera stream.

To map the depth frames to PV frames we require the following 4 x 4 transformation matrices:

  • Depth camera: frame to origin transform (frameToOriginDepth)
  • Depth camera: camera to view transform (cameraToViewDepth)
  • PV camera: frame to origin transform (frameToOriginPV)
  • PV camera: camera to view transform (cameraToViewPV)
  • PV camera: camera projection transform (cameraToProjectionPV)

The below image provides some more detail into how these camera transforms interact.

The mapping follows a 2D -> 3D -> 3D -> 2D set of transformations, resulting in 2D fused depth-PV frames.

Compute inverse transforms to relate depth frames to PV frame coordinate mapping:

Depth-to-PV 4 x 4 transformation matrix and chain of transformations. We first map the 2D depth frames from view space to camera space, then from the depth camera frame to its origin. From the origin, we map to the PV camera frame, from the PV camera frame to view space, and finally from PV view space to PV projection space.
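Using shorthand for the matrices listed earlier (V_D = cameraToViewDepth, O_D = frameToOriginDepth, O_PV = frameToOriginPV, V_PV = cameraToViewPV, P_PV = cameraToProjectionPV), this chain can be written as a single matrix product applied to a homogeneous depth-camera point p_D (the shorthand symbols are mine, not the sample's):

```latex
p_{PV} = \underbrace{P_{PV}\, V_{PV}\, O_{PV}^{-1}\, O_{D}\, V_{D}^{-1}}_{T_{\mathrm{depth}\to\mathrm{PV}}}\; p_{D}
```

Reading right to left: view to camera space (V_D^{-1}), depth frame to origin (O_D), origin to PV frame (O_PV^{-1}), PV frame to view space (V_PV), and PV view space to projection space (P_PV).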

Compute the mapping of depth points and scale the 2D points by the final coordinate of the vector.

Now we scale the depth-PV points to the size of the incoming camera frames.
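The two steps above (perspective divide, then scaling to the PV frame size) can be sketched with NumPy. The function and variable names here are hypothetical, not the sample's actual API, and the normalized [-1, 1] projection-space convention is an assumption:

```python
import numpy as np

def depth_points_to_pv_pixels(points, depth_to_pv, pv_width, pv_height):
    """Project homogeneous depth-camera points into PV pixel coordinates.

    points:      (N, 4) homogeneous 3D points in the depth camera frame
    depth_to_pv: 4 x 4 matrix chaining the depth->PV transforms
    """
    projected = (points @ depth_to_pv.T).astype(float)
    # Scale the 2D points by the final (w) coordinate of each vector.
    projected /= projected[:, 3:4]
    # Scale normalized [-1, 1] projection-space coordinates to the size
    # of the incoming PV camera frames (y axis flipped for image rows).
    u = (projected[:, 0] + 1.0) * 0.5 * pv_width
    v = (1.0 - (projected[:, 1] + 1.0) * 0.5) * pv_height
    return np.stack([u, v], axis=1)
```

With an identity transform, a point on the optical axis lands at the center of the PV frame, which is a quick sanity check for the divide-and-scale logic.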

And finally can visualize the output on a canvas in Unity!

The image at the beginning of this post also contains some simple computer vision processing to display contours in depth-PV frames using OpenCV through the included NuGet package. Please see my Github repo for more details and sample code.

The YoloDetectionHoloLens sample enables object detection through the YOLO framework by streaming sensor frame data from the HoloLens (client) to a companion PC (server) using Windows Sockets.

Desktop (server) / HoloLens (client)

Specifically, a TCP (Transmission Control Protocol) socket is used to enable low-level network data transfers (in both directions) for this long-lived connection. To allow a UWP app to connect to a TCP socket, we need to declare app capabilities for Private Networks (Client & Server) or Internet (Client & Server) in both the Windows desktop application (server) and the UWP application (client).
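The connection pattern can be sketched in Python on loopback; this is an illustration of the client/server roles only, and the payloads and names are placeholders, not the sample's actual wire format:

```python
import socket
import threading

# Minimal loopback sketch of the long-lived TCP connection: the desktop app
# plays the server role, the HoloLens app the client role.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))      # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

received = []

def serve() -> None:
    # Desktop side: accept one client, read its message, reply.
    conn, _ = server.accept()
    with conn:
        received.append(conn.recv(1024))   # frame data from the "HoloLens"
        conn.sendall(b"bounding boxes")    # detections back to the "HoloLens"

t = threading.Thread(target=serve)
t.start()

# Client (HoloLens) side: connect, send, receive -- data flows both ways.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
    client.connect(("127.0.0.1", port))
    client.sendall(b"sensor frame")
    reply = client.recv(1024)

t.join()
server.close()
```

In the actual sample the same roles are filled by Windows.Networking.Sockets on the HoloLens and the C++ desktop app, but the accept/connect/send/receive flow is the same.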

In the Package.appxmanifest file of the C++ Windows desktop application and of the C# Unity IL2CPP output build, add:
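A sketch of the capability declarations in question, using the standard UWP capability names (verify against your own manifest, since the exact set the sample declares is not shown here):

```xml
<Capabilities>
  <!-- Internet (Client & Server) -->
  <Capability Name="internetClient" />
  <Capability Name="internetClientServer" />
  <!-- Private Networks (Client & Server) -->
  <Capability Name="privateNetworkClientServer" />
</Capabilities>
```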

Simple client-server socket

More detail on the implementation coming soon!

  • The desktop streamer header contains code which builds a header indicating the size of the outgoing data.

  • The desktop streamer contains sample code to send the outgoing bounding box data as an interface vector of custom WinRT objects (bounding boxes).

  • The device receiver allows the HoloLens to listen for and receive incoming WinRT objects (bounding boxes), which are then formatted and displayed to the user.
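The size header described above is classic length-prefixed framing, which can be sketched in Python (the 4-byte big-endian header is an assumption for illustration; the sample's actual header layout may differ):

```python
import socket
import struct

def send_msg(sock: socket.socket, payload: bytes) -> None:
    # Prefix the payload with a 4-byte big-endian length header so the
    # receiver knows exactly how many bytes make up the message.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_exact(sock: socket.socket, n: int) -> bytes:
    # TCP is a byte stream: recv() may return partial data, so loop.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        buf += chunk
    return buf

def recv_msg(sock: socket.socket) -> bytes:
    # Read the header first, then exactly that many payload bytes.
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)
```

Framing like this is what lets the receiver pull whole sensor frames or bounding-box batches off a continuous TCP stream without guessing at message boundaries.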

Running the sample

The following instructions provide the necessary steps to build the YoloDetectionHoloLens sample.

YoloDetectionHoloLensUnity project

This project houses the code to enable media frame source groups, send sensor frames, receive bounding boxes from the companion PC and display the returned bounding boxes.

  1. Open the HoloLensForCV sample in VS2017 and install the included OpenCV NuGet package to the HoloLensForCV project
  2. Build the HoloLensForCV project (x86, Debug or Release)
  3. Copy all output files from the HoloLensForCV path (DLLs and HoloLensForCV.winmd) to the Assets->Plugins->x86 folder of the YoloDetectionHoloLensUnity project
  4. Open the YoloDetectionHoloLensUnity Unity project and enter the IP address of your desktop PC into the relevant field on the script holder game object
  5. Under Unity build settings, switch the platform to Universal Windows Platform and adjust relevant settings; build the project using IL2CPP
  6. Open the VS solution from the Unity build, then build and deploy to the device

ComputeOnDesktopYolo project

This project receives sensor frames from the connected HoloLens client and processes them using the Tiny YOLOv2 ONNX model. ONNX (Open Neural Network Exchange) provides an open-source format for AI models and allows easy interoperability between different frameworks.

  1. Open the HoloLensForCV sample and build the ComputeOnDesktopYolo project (x64, Debug or Release)
  2. Deploy project to Local Machine. The sample should open a window which looks similar to the below figure

Combining the samples to process HoloLens PV camera frames

  1. Ensure the HoloLens and PC are on the same network and the HoloLens is currently running the YoloDetectionHoloLensUnity sample
  2. When the deployed ComputeOnDesktopYolo desktop app opens, enter the IP of the HoloLens and click connect
  3. Sensor frames from the HoloLens should begin to stream to the desktop and appear in the app window
  4. When prompted on the HoloLens, perform the double-tap gesture to connect to the host socket and begin receiving information about detected objects in streamed frames. Bounding boxes and updated on-screen text should appear within the field of view of the HoloLens as below.

Note that this frame was taken at a different instance than the above frame from the desktop streaming, leading to the discrepancy in bounding box sizes.