Texturing for AR: Enhancing The Online Shopping Experience With Shopify

Pierre Bosset on March 24, 2018 | AR, Stories, Design

We interviewed Andrei Serghiuta, Senior 3D artist at Shopify. Andrei and his team have been experimenting with innovative ways to use augmented reality in online shopping. Read on to learn how this will change traditional e-commerce, and how Substance Painter was used to texture the 3D models and make them look just as convincing as their real-life counterparts.

Andrei Serghiuta: My personal journey in 3D started with a degree in Graphic Design. This has essentially steered my career as a 3D illustrator working in the advertising industry for over a decade, most of it as a freelancer. My collaboration with Shopify started two years ago, after seeing the Vive in action for the first time. The new VR tech had me hooked instantly.

Shopify is a top e-commerce platform that enables over 600,000 merchants to easily sell products online, on social media, and in person. We’re obsessed with the future of commerce and experimenting with technologies that we can scale to serve our merchants and their global consumers.

My current role at Shopify is that of a Senior 3D artist, part of our AR/VR team that is composed of two other artists and a few developers. Our vision is that technologies like AR and VR will fundamentally change how we shop in the future. We are actively exploring the best ways of enabling our merchants to take advantage of this technology as easily as possible.

We have come up with a number of immersive buying experiences and ideas, some of which we outline on our team’s blog. To test these ideas, we create small demos that help illustrate the concepts. In some cases, we also work directly with select merchants to implement them in their stores, in order to get the ball rolling. One such example was adding an AR component to Magnolia Market’s mobile app. We modeled a couple dozen of their products in 3D and added the option to preview them in AR inside the app. To make this possible, we used Apple’s ARKit, which is now available on most iOS 11 devices. User feedback on this feature has been tremendously positive.

During our initial testing with ARKit, we quickly found out how much fun it was to inspect a 3D model up close, from all angles. There is a distinct advantage in being able to see how a product might look inside your home, on your desk, or in your office. We also found that engaging with a product in AR or VR offers a much richer experience overall, very similar to the way we would inspect a product in real life. A whole new world of possibilities opens up once a merchant has 3D models of their products. This is also where our biggest challenge lies: creating these 3D models, and creating them well, is the most difficult task in the process.

My discovery of Substance came about within this context. Although I was aware of it beforehand, I only started to use it once I had to deal with large sets of items in the VR environment of Thread Studio. This was the catalyst that forced me out of the Photoshop/Mudbox workflow for texturing. Because our particular challenge is to recreate the look of real items, having the ability to paint the materials directly in a PBR environment was critical. However, every 3D engine seems to have its own particular implementation of PBR. We have created demos using Unreal, Unity, PlayCanvas, SceneKit, Sketchfab, and Snapchat’s LensStudio, and they all come with their own special requirements. Being able to quickly re-export textures for the items to suit a particular engine was very important, and Substance Painter’s preset system was a lifesaver.
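As a concrete illustration of those per-engine differences: Unity’s metallic workflow, for example, reads a smoothness (gloss) value from the alpha channel of the metallic map, which is simply the inverse of the roughness that engines like SceneKit read directly. The sketch below shows that kind of remap in Python; it is an illustrative assumption about one export step, not Shopify’s actual pipeline or Substance Painter’s preset code.

```python
import numpy as np

def unity_smoothness_from_roughness(roughness: np.ndarray) -> np.ndarray:
    """Invert an 8-bit roughness map into a Unity-style smoothness map.

    Unity's Standard shader reads smoothness (glossiness), which is
    1 - roughness, while engines like SceneKit read roughness directly.
    """
    return 255 - roughness

def pack_metallic_smoothness(metallic: np.ndarray, smoothness: np.ndarray) -> np.ndarray:
    """Pack metallic into RGB and smoothness into alpha: the layout Unity's
    Standard shader expects for its metallic-gloss texture."""
    h, w = metallic.shape
    packed = np.zeros((h, w, 4), dtype=np.uint8)
    packed[..., 0] = metallic
    packed[..., 1] = metallic
    packed[..., 2] = metallic
    packed[..., 3] = smoothness
    return packed

# Tiny demo with a 2x2 "texture"
roughness = np.array([[0, 64], [128, 255]], dtype=np.uint8)
metallic = np.array([[255, 255], [0, 0]], dtype=np.uint8)
smoothness = unity_smoothness_from_roughness(roughness)
packed = pack_metallic_smoothness(metallic, smoothness)
print(packed[..., 3])
```

Substance Painter’s export presets automate exactly this kind of per-engine channel shuffling, which is why they are such a time-saver when the same asset targets several engines.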

The VR environment of Thread Studio.
Image Courtesy of Shopify

Most of the products we’d recreated up to that point were modeled through a manual process, while the rest of them were created through photogrammetry. Either way, Substance B2M came in very handy when we needed to generate normal and roughness maps from the photo reference. It provides an easy and integrated platform for manipulating textures from reference images, especially those coming from photogrammetry scans. To get the realistic look of a surface we needed to pay close attention not just to the mesh details, but also to the material details.
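B2M’s internals are proprietary, but the general idea of deriving a normal map from image data can be sketched with a common gradient-based approximation (not B2M’s actual algorithm): treat the grayscale image as a height field, take finite-difference slopes, and encode them in the standard tangent-space RGB convention.

```python
import numpy as np

def normal_map_from_height(height: np.ndarray, strength: float = 1.0) -> np.ndarray:
    """Approximate a tangent-space normal map from a grayscale height image.

    A gradient-based sketch, not B2M's actual algorithm: finite differences
    give the surface slope, and the slope is encoded into the usual
    0-255 RGB normal-map convention.
    """
    h = height.astype(np.float64) / 255.0
    # Finite-difference gradients along y (rows) and x (columns)
    dy, dx = np.gradient(h)
    # Steeper slope -> stronger x/y components of the normal
    nx = -dx * strength
    ny = -dy * strength
    nz = np.ones_like(h)
    length = np.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / length, ny / length, nz / length
    # Remap components from [-1, 1] to [0, 255]
    rgb = np.stack([nx + 1.0, ny + 1.0, nz + 1.0], axis=-1) * 0.5 * 255.0
    return np.rint(rgb).astype(np.uint8)

# A flat height field yields the familiar uniform blue (128, 128, 255)
flat = np.full((4, 4), 100, dtype=np.uint8)
print(normal_map_from_height(flat)[0, 0])
```

In practice a tool like B2M does far more (frequency separation, roughness estimation, delighting), but the height-to-normal step above is the core conversion it automates.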

Example showing usage of Substance B2M to generate roughness and normal maps from photogrammetry texture, and the imported model inside Substance Painter.
Image Courtesy of Shopify

Almost all of our usage in Substance comes from Substance Painter, and with good reason. It’s simply amazing how easily and efficiently Substance Painter is able to blend materials and add wear and tear specific to the particular product that we are recreating. It is a very intuitive and rapid way of painting objects. However, Substance Designer is invaluable when it comes to creating new materials with customizable repeating patterns that are difficult to paint.

For AR applications, our typical product workflow starts with the 3D modeling process in Maya. Polygon count still matters, but we found that the largest limiting factor for 3D models on mobile in particular is actually file size. As it happens, the mobile devices that support ARKit can push hundreds of thousands of triangles without issue, and can handle quite a bit of texture memory as well.

A larger item, which required a heavier polygon count. At 72k triangles, it still works seamlessly on mobile with ARKit.
Image Courtesy of Shopify

The polygon mesh file compresses to a significantly smaller size than the textures do. This means that a 2k normal map for iOS will sometimes be significantly larger on disk than if the same detail had been built directly into the mesh itself. For single-product viewing in AR, a 70k-triangle mesh is not a problem with ARKit; a 4-5MB normal map, however, often is.
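A rough back-of-envelope calculation illustrates the trade-off. The vertex layout and compression ratio below are assumptions for illustration, not figures from Shopify’s pipeline:

```python
# Back-of-envelope comparison: mesh payload vs. a 2k normal map.
# The vertex layout and compression ratio are assumptions for
# illustration, not measurements from a real pipeline.

TRIANGLES = 70_000
VERTS = TRIANGLES // 2            # a closed mesh has roughly half as many vertices as triangles
BYTES_PER_VERT = 32               # position (12) + normal (12) + UV (8), 32-bit floats
BYTES_PER_INDEX = 4               # 32-bit indices, 3 per triangle

raw_mesh = VERTS * BYTES_PER_VERT + TRIANGLES * 3 * BYTES_PER_INDEX
mesh_on_disk = raw_mesh * 0.4     # mesh data typically compresses well; 60% is an assumed ratio

raw_normal_map = 2048 * 2048 * 3  # uncompressed 2k RGB normal map, in bytes
normal_map_on_disk = 4.5e6        # the 4-5MB figure quoted above

print(f"raw mesh:            {raw_mesh / 1e6:.1f} MB")
print(f"mesh on disk (est.): {mesh_on_disk / 1e6:.1f} MB")
print(f"raw 2k normal map:   {raw_normal_map / 1e6:.1f} MB")
```

Under these assumptions the entire 70k-triangle mesh costs well under 1 MB on disk, while the normal map alone weighs several times that, which is why baking detail into geometry can be the smaller option.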

While the other attribute maps will compress nicely using formats available on iOS, we noticed that this type of compression produces very visible artifacts for the normal maps on smooth and shiny surfaces. Rougher detail, however, seems to compress without noticeable artifacts. Another advantage of having as much of the mesh detail as possible built into the model is that the user will have fewer visual issues when they inspect the model up close.

Normal Map JPEG compression artifacts. Left example shows the effect of compression, right example shows no compression.
Image Courtesy of Shopify

Once we have the model created, with the UVs nicely unwrapped, we’ll almost always bring it directly into Substance Painter for texturing. Some models will benefit from separate meshes, each with their own textures, especially when transparency is required. Here, once again, Substance Painter makes it very easy to work on the different materials, all within the same project. What makes Substance so valuable to us is having the ability to easily work with customizable material presets, which can be layered together in order to reproduce the look of surfaces in real life. Since the range of consumer products out there is so wide, we would regularly deal with different kinds of metals, woods, fabrics, and plastics.

Example of product with a mix of wood and metal.
Image Courtesy of Shopify

Most of the time we are able to start with the existing material presets, and tweak them from there. Some common features that we pay attention to are breaking up the specular with some sort of roughness detail, edge wear and tear, and ambient occlusion. Substance gives us plenty of control over all these aspects. Sometimes we also need to use custom normal maps for details that would take too long to model properly. We still go back to Photoshop or Maya for that.

Overall, we wouldn’t normally need to spend more than a day texturing a single product, and the results from that day of work would outdo what we could accomplish in the same time with a more traditional workflow. With the texturing process complete, we export the maps for use in Xcode, with SceneKit. By default, SceneKit requires the maps to be exported individually, with the PBR shader set up as metallic/roughness. It also requires the meshes to be in Collada format. Once we learned the quirks of working with SceneKit, the process was actually pretty straightforward. Two more components complete the scene: the contact shadow, which we generate inside Maya, and a proper environment map for lighting and reflections.

Export settings for SceneKit
Image Courtesy of Shopify
Example showing one of the products inside SceneKit.
Image Courtesy of Shopify

Overall, Substance has proven itself an integral tool in our workflow. It enabled us to bring the look and feel of real products to their virtual counterparts. We received fantastic feedback on the reproduction of the Magnolia products in particular, and that is in large part due to the role Substance played in the production; just read the reviews on the App Store. The ability to virtually place accurate 3D models of products within their own space allows shoppers to make better-informed decisions and feel more comfortable pressing that Buy button.

We anticipate steady growth in the demand for 3D models over the coming years. All of the AR and VR tech is dependent on them, whether they are built manually, scanned, or photographed. Although this demand has already sent many people on a quest for automation, I would venture to guess that some form of manual control is likely to remain for a while as the main method of ensuring quality.

If you are excited about creating quality 3D models for our merchants, please contact us!
