Virtual Shopping: 3D cloth fitting solution with Unreal for Microsoft Kinect [Dev-log]

ColdFace Interactive
Jan 16, 2021

This post is a summary of the technical challenges & solutions for a multi-brand shopping store chain's initial foray into virtual mirror systems.

The hardware was a virtual mirror powered by a Microsoft Kinect & an Android box. To begin with, we didn't own a Kinect, but that wasn't a problem; the client was willing to give us the device, which we declined.

Part 1: Clothes fitting (Draping)

Problem Statement

  • The cloth mesh is created as per a reference avatar, i.e. it fits perfectly on one specific mesh, which is from here on referred to as the ‘Reference Avatar’ or RA.
Reference-Avatar-Character-Clothes-Mesh
  • As the clothes are modeled to fit the RA, they won't fit a new avatar (referred to as the ‘Target Avatar’ or TA) that differs from the RA.
  • The aim is to fit the 3D clothes mesh to any target avatar (TA) that is loaded into the system.
Target-Avatar-Character-Clothes-Mesh
Blend between both Avatars

Note- The Reference Avatar was scanned in a multi-camera booth setup owned by the retail shopping chain which commissioned this project. The 3D models of the clothes were scanned by another agency that works for the retail chain.

Background:

  • Two main components: a 3D scanned avatar & 3D clothes. Both are Skeletal Meshes (i.e. both meshes deform as per the positions of the bones/rig). Both meshes are independent; they are just placed on top of each other.
  • The skeletons for both meshes (i.e. the bone positions) are controlled by the Kinect gesture mirroring feature.

Proposed Solution:

  • To analyze the correlation of the cloth mesh with the RA, and apply the found relation to the cloth mesh as per the new TA.
  • The correlation refers to the relative position (distance) of the vertices of the cloth mesh from the surface of the RA.
  • The solution is to find the distance (X) between the vertices of the cloth mesh and the surface of the RA, and then modify the positions of the cloth mesh vertices so that their distance from the surface of the TA is the same (X), as sketched below.
Blend Shapes for target Avatar
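A minimal sketch of this per-vertex idea, in Unreal-style C++. This is our illustration, not the project's exact code: the function name and the way the outward direction is chosen are assumptions, and the surface hit points are assumed to come from the ray-trace queries described later in this post.

```
// Sketch only: re-seat one cloth vertex so it keeps its original offset from the body.
// HitOnRA / HitOnTA are assumed to be the surface points found for this vertex on the
// Reference Avatar and Target Avatar respectively (e.g. via a ray trace).
#include "CoreMinimal.h"

FVector RefitClothVertex(const FVector& ClothVertex,
                         const FVector& HitOnRA,
                         const FVector& HitOnTA)
{
    // X = distance of the cloth vertex from the Reference Avatar's surface
    const float X = FVector::Dist(ClothVertex, HitOnRA);

    // Direction in which the vertex sits off the reference surface
    const FVector OutwardDir = (ClothVertex - HitOnRA).GetSafeNormal();

    // Place the vertex the same distance X off the Target Avatar's surface
    return HitOnTA + OutwardDir * X;
}
```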

Prerequisites:

  • For the algorithm to work, the pose of the 3 meshes (i.e. RA, TA and the cloth mesh) needs to be the same.

Challenges:

  • This prerequisite was not fulfilled, so it needed to be solved first.

Note- The rigs in the given skeletal meshes were not aligned. They were not all in the same A or T pose; some were in A & the others in T. Another issue was that the root bones of the rigs were not in the same position across the skeletal meshes.

  • Lack of Unreal documentation and guides, plus usability issues.

Note- This was our first time developing with Unreal, and it was a tough problem statement. The project started with another development team before it came to us; they had already designed the blueprints, which included some outdated plugins that only worked with Unreal Engine. We used their existing design flow & developed the algorithm to bridge the gap between the hardware & the UI. There, too, we had to develop fixes, as there were major gaps in the design that needed to be managed.

In hindsight, we could have easily done a better job in Unity if it wasn't for those plug-ins.

Approach & Progression:

  • The first step was to find a way (i.e. an implementation) to move the vertices of a skeletal mesh at run-time. This needed 2–3 days, as we had to get familiar with Unreal.
  • The next step was to find the distance of an arbitrary point from the surface of a skeletal mesh. This needed ray tracing (i.e. a mechanism with which we can shoot a ray in 3D space and find the point at which it intersects a surface); a rough sketch of this query follows the outcomes list below. This took another 2 days.
  • For the above to work, we had to ensure the prerequisite was in place: we had to find a way to control the bones through code (and not animation). This was the first challenge we had to solve. It took a couple of days to sort out, as it couldn't be done with skeletal meshes; there was no provision to control their bones from code. After 2 days we found another component, the pose-able mesh, which supports skinned meshes and allows bones to be controlled with code (see the sketch after the outcomes list below). It took us a day to modify the poses of the RA, TA and clothes mesh so that all their poses were aligned with each other.
  • That is when we faced the second challenge. The poses were now consistent between the meshes, but pose-able meshes do not support the ray tracing mechanism. We spent 2 more days on that, but it didn't work out.
  • To ensure the algorithm worked, two things had to be in place: (A) bones should be controllable from code to match the poses (not supported by skeletal meshes), and (B) the ray tracing mechanism should be usable to find intersections with the mesh surface (not supported by pose-able meshes).
  • After another round of research we found that bones in a skeletal mesh can be controlled using animation blueprints (a type of visual scripting, specifically used for handling animations, which does not require coding).
  • From there it took another 4 days to get the following outcomes:

1) To make the poses consistent using the pose-able mesh (A)

2) To copy this pose to the skeletal mesh through an animation blueprint

3) To use the ray tracing mechanism on the skeletal mesh to analyze its surface (B)
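As an illustration of (A), here is a minimal sketch of driving bones from code on a pose-able mesh. This is our own reconstruction in UE4-style C++, not the project's blueprints; the function, bone names, angles and component variables are hypothetical.

```
// Sketch: forcing a pose-able mesh into a known pose from code.
// Pose-able mesh components expose per-bone setters, which skeletal mesh components do not.
#include "Components/PoseableMeshComponent.h"

void ForceIntoReferencePose(UPoseableMeshComponent* PoseableMesh,
                            const TMap<FName, FRotator>& ReferencePose)
{
    for (const TPair<FName, FRotator>& Bone : ReferencePose)
    {
        // Overwrite each named bone's rotation so every mesh ends up in the same pose
        PoseableMesh->SetBoneRotationByName(Bone.Key, Bone.Value, EBoneSpaces::ComponentSpace);
    }
}

// Usage idea: build one table of bone rotations (a T-pose, say) and apply it to the
// RA, TA and cloth mesh so that all three line up before measuring distances, e.g.
//   TMap<FName, FRotator> TPose;
//   TPose.Add(TEXT("upperarm_l"), FRotator(0.f, 0.f, 90.f)); // hypothetical bone/angles
//   ForceIntoReferencePose(RAPoseable, TPose);
```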

A) Reference Avatar, B) Target Avatar, C) Blend Avatar
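And for (B), a rough sketch of the ray-trace distance query against a skeletal mesh component. Again this is illustrative: the function name, the choice of trace direction and the collision setup are our assumptions (the component needs collision that traces can hit).

```
// Sketch: distance from a cloth vertex to an avatar's surface, found by tracing a ray
// against that avatar's skeletal mesh component only (ignoring the rest of the scene).
#include "Components/SkeletalMeshComponent.h"
#include "CollisionQueryParams.h"
#include "Engine/EngineTypes.h"

float DistanceToAvatarSurface(USkeletalMeshComponent* AvatarMesh,
                              const FVector& ClothVertex,
                              const FVector& TraceDirection,
                              float MaxDistance = 200.f)
{
    const FVector End = ClothVertex + TraceDirection.GetSafeNormal() * MaxDistance;

    // Trace complex (per-triangle) so the hit lands on the actual mesh surface
    FCollisionQueryParams Params(FName(TEXT("ClothFitTrace")), /*bTraceComplex*/ true);

    FHitResult Hit;
    if (AvatarMesh->LineTraceComponent(Hit, ClothVertex, End, Params))
    {
        return Hit.Distance; // this is the X used when re-seating the cloth vertex
    }
    return -1.f; // no surface found along this direction
}
```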

Now we could go ahead & implement the actual algorithm, which took us 3 days. It took another week to polish and fix bugs.

The algorithm worked out fine, though we know there is a lot of scope for improvement. This project had two parts; the above post was Part 1. Overall it took us a month to reach the desired results, which also satisfied the client's requirements.

According to my tech partner, Part 2 was more of a technical job and rather boring. So if time permits, I might post Part 2 at some point in the near future.

ColdFace Interactive is a boutique XR (Virtual/Augmented Reality, Game, 3D) studio. We have delivered projects for Training, Entertainment, Experiential Marketing, Exhibitions & MVPs. We also develop custom-built apps & tools with business potential. Our first product, the “Fur & Hair” tool for character/creature modellers, is selling on the Unity Store. You can read the case study here.

Have a look at our work & tell us how we can help with your projects.
