New Game Changing Vulkan Extensions for Mobile: Descriptor Indexing

The Samsung Developers team works with many companies in the mobile and gaming ecosystems. We're excited to support our partner, Arm, as they bring timely and relevant content to developers looking to build games and high-performance experiences. This Vulkan Extensions series will help developers get the most out of the new and game-changing Vulkan extensions on Samsung mobile devices.

As I mentioned previously, Android is enabling a host of useful new Vulkan extensions for mobile. These new extensions are set to improve the state of graphics APIs for modern applications, enabling new use cases and changing how developers can design graphics renderers going forward. These extensions will be available across various Android smartphones, including the new Samsung Galaxy S21, which was recently launched on 14 January. Existing Samsung Galaxy S models, such as the Samsung Galaxy S20, also allow upgrades to Android R.

I have already discussed two of these extensions in previous blogs - Maintenance Extensions and Legacy Support Extensions. However, there are three further Vulkan extensions for Android that I believe are ‘game changers’. In the first of three blogs, I will explore these individual game changer extensions – what they do, why they are useful, and how to use them. The goal here is not to provide complete samples, but there should be enough to get you started. The first Vulkan extension is ‘Descriptor Indexing’. Descriptor indexing may also be available on handsets released prior to Android R. To check which Android devices support ‘Descriptor Indexing’, check here. You can also directly view the Khronos Group Vulkan samples that are relevant to this blog here.



In recent years, we have seen graphics APIs greatly evolve in their resource binding flexibility. All modern graphics APIs now have some answer to how we can access large swathes of resources in a shader.


A common buzzword that is thrown around in modern rendering tech is “bindless”. The core philosophy is that resources like textures and buffers are accessed through simple indices or pointers, and not singular “resource bindings”. To pass down resources to our shaders, we do not really bind them like in the graphics APIs of old. Simply write a descriptor to some memory and a shader can come in and read it later. This means the API machinery to drive this is kept to a minimum.

This is a fundamental shift away from the older style where our rendering loop looked something like:

render_scene() {
    foreach(drawable) {
        update_descriptors(drawable);
        cmd_bind_descriptor_set(cmd, drawable->descriptor_set);
        cmd_draw(cmd, drawable);
    }
}

Now it looks more like:

render_scene() {
    large_descriptor_heap->write_global_descriptors(scene, lighting, shadowmaps);
    foreach(drawable) {
        offset = large_descriptor_heap->allocate_and_write_descriptors(drawable);
        cmd_push_constants(cmd, offset);
        cmd_draw(cmd, drawable);
    }
}

Since we have free-form access to resources now, it is much simpler to take advantage of features like multi-draw or other GPU driven approaches. We no longer require the CPU to rebind descriptor sets between draw calls like we used to.
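To sketch what a shader might look like in this style, here is a hedged GLSL fragment (all names are hypothetical): per-draw data arrives as a small index rather than fresh descriptor bindings, and all textures live in one unsized array.

```glsl
#version 460
#extension GL_EXT_nonuniform_qualifier : require

// One large bindless array of textures; never rebound between draws.
layout(set = 0, binding = 0) uniform texture2D Textures[];
layout(set = 0, binding = 1) uniform sampler Samp;

// Per-draw data arrives as a small index, not as new descriptor bindings.
layout(push_constant) uniform Registers { uint material_index; } registers;

layout(location = 0) in vec2 UV;
layout(location = 0) out vec4 Color;

void main() {
    // The index is dynamically uniform here (same for the whole draw), so no
    // special qualifier is needed yet; the sections below cover the case
    // where the index diverges.
    Color = texture(sampler2D(Textures[registers.material_index], Samp), UV);
}
```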

Going forward, when we look at ray tracing, this style of design is going to be mandatory: shooting a ray means we can hit anything, so all descriptors are potentially used. It is useful to start designing for this pattern now.

The other side of the coin with this feature is that it is easier to shoot yourself in the foot. It is easy to access the wrong resource, but as I will get to later, there are tools available to help you along the way.

VK_EXT_descriptor_indexing features

This extension is a large one and landed in Vulkan 1.2 as a core feature. To enable bindless algorithms, there are two major features exposed by this extension.
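Before using either feature, the relevant capabilities must be queried and enabled at device creation. A minimal sketch, assuming a valid physical_device handle (structure and member names come from VK_EXT_descriptor_indexing; which members you check depends on which descriptor types you index):

```cpp
// Sketch: query the descriptor indexing features and chain them into
// device creation so they are enabled.
VkPhysicalDeviceDescriptorIndexingFeaturesEXT indexing_features = {
    VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_DESCRIPTOR_INDEXING_FEATURES_EXT };
VkPhysicalDeviceFeatures2 features2 = {
    VK_STRUCTURE_TYPE_PHYSICAL_DEVICE_FEATURES_2 };
features2.pNext = &indexing_features;
vkGetPhysicalDeviceFeatures2(physical_device, &features2);

// Capabilities this blog relies on (for sampled images in particular).
assert(indexing_features.runtimeDescriptorArray);
assert(indexing_features.shaderSampledImageArrayNonUniformIndexing);
assert(indexing_features.descriptorBindingSampledImageUpdateAfterBind);
assert(indexing_features.descriptorBindingVariableDescriptorCount);

VkDeviceCreateInfo device_info = { VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO };
device_info.pNext = &features2;  // chaining features2 enables the features
// ... fill in queues and extensions (VK_EXT_descriptor_indexing),
// then call vkCreateDevice().
```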

Non-uniform indexing of resources

How resources are accessed has evolved quite a lot over the years. Hardware capabilities used to be quite limited, with a tiny bank of descriptors being visible to shaders at any one time. In more modern hardware however, shaders can access descriptors freely from memory and the limits are somewhat theoretical.

Constant indexing

Arrays of resources have been with us for a long time, but mostly as syntactic sugar, where we can only index into arrays with a constant index. This is equivalent to not using arrays at all from a compiler point of view.

layout(set = 0, binding = 0) uniform sampler2D Textures[4];
const int CONSTANT_VALUE = 2;
color = texture(Textures[CONSTANT_VALUE], UV);

HLSL in D3D11 has this restriction as well, but it has been more relaxed about it, since it only requires that the index is constant after optimization passes are run.

Dynamic indexing

As an optional feature, dynamic indexing allows applications to index into arrays of resources with dynamic values. This enables a very restricted form of bindless. Outside compute shaders, however, using this feature correctly is quite awkward, due to the requirement that the resource index be dynamically uniform.

Dynamically uniform is a somewhat intricate subject, and the details are left to the accompanying sample in KhronosGroup/Vulkan-Samples.
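As a rough illustration of the distinction (hypothetical GLSL; Textures, registers and material_id are illustrative names):

```glsl
// Dynamically uniform: every invocation in the draw call computes the same
// index, e.g. a value read from a push constant. Plain dynamic indexing
// is well-defined here.
color = texture(Textures[registers.material_index], UV);

// NOT dynamically uniform: the index varies per fragment, e.g. fetched from
// a G-buffer or interpolated vertex data. Plain dynamic indexing is
// undefined behavior here; this case requires non-uniform indexing.
color = texture(Textures[material_id], UV);
```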

Non-uniform indexing

Most hardware assumes that the resource index is dynamically uniform, as this has been the restriction in APIs for a long time. If you are not accessing resources with a dynamically uniform index, you must notify the compiler of your intent.

The rationale here is that hardware is optimized for dynamically uniform (or subgroup uniform) indices, so there is often an internal loop emitted by either compiler or hardware to handle every unique index that is used. This means performance tends to depend a bit on how divergent resource indices are.

#extension GL_EXT_nonuniform_qualifier : require
layout(set = 0, binding = 0) uniform texture2D Tex[];
layout(set = 1, binding = 0) uniform sampler Sampler;
color = texture(nonuniformEXT(sampler2D(Tex[index], Sampler)), UV);

In HLSL, there is a similar mechanism using NonUniformResourceIndex, for example:

Texture2D<float4> Textures[] : register(t0, space0);
SamplerState Samp : register(s0, space0);
float4 color = Textures[NonUniformResourceIndex(index)].Sample(Samp, UV);

All descriptor types can make use of this feature, not just textures, which is quite handy! The nonuniformEXT qualifier removes the requirement to use dynamically uniform indices. See the code sample for more detail.


Update-after-bind

A key component to make the bindless style work is that we do not have to … bind descriptor sets all the time. With the update-after-bind feature, we effectively block the driver from consuming descriptors at command recording time, which gives a lot of flexibility back to the application. The shader consumes descriptors as they are used and the application can freely update descriptors, even from multiple threads.

To enable update-after-bind, we modify the VkDescriptorSetLayout by adding new binding flags. The way to do this is somewhat verbose, but at least update-after-bind is something that is generally used for just one or two descriptor set layouts throughout most applications:

VkDescriptorSetLayoutCreateInfo info = { … };
info.flags = VK_DESCRIPTOR_SET_LAYOUT_CREATE_UPDATE_AFTER_BIND_POOL_BIT_EXT;

const VkDescriptorBindingFlagsEXT flags =
    VK_DESCRIPTOR_BINDING_UPDATE_AFTER_BIND_BIT_EXT |
    VK_DESCRIPTOR_BINDING_PARTIALLY_BOUND_BIT_EXT |
    VK_DESCRIPTOR_BINDING_VARIABLE_DESCRIPTOR_COUNT_BIT_EXT |
    VK_DESCRIPTOR_BINDING_UPDATE_UNUSED_WHILE_PENDING_BIT_EXT;

VkDescriptorSetLayoutBindingFlagsCreateInfoEXT binding_flags = { … };
binding_flags.bindingCount = info.bindingCount;
binding_flags.pBindingFlags = &flags;
info.pNext = &binding_flags;

For each pBinding entry, we have a corresponding flags field where we can specify various flags. The descriptor_indexing extension has very fine-grained support, but UPDATE_AFTER_BIND_BIT and VARIABLE_DESCRIPTOR_COUNT_BIT are the most interesting ones to discuss.

VARIABLE_DESCRIPTOR_COUNT deserves special attention as it makes descriptor management far more flexible. Having to use a fixed array size can be somewhat awkward, since in a common usage pattern with a large descriptor heap, there is no natural upper limit to how many descriptors we want to use. We could settle for some arbitrarily high limit like 500k, but that means all descriptor sets we allocate have to be of that size and all pipelines have to be tied to that specific number. This is not necessarily what we want, and VARIABLE_DESCRIPTOR_COUNT allows us to allocate just the number of descriptors we need per descriptor set. This makes it far more practical to use multiple bindless descriptor sets.
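On the layout side, the variable-count binding only declares an upper bound. A hedged sketch (binding index, descriptor type and count are chosen purely for illustration):

```cpp
// Sketch: the variable-count binding must be the highest-numbered binding
// in its set layout. descriptorCount is only an upper bound here; the
// actual count is chosen later, at vkAllocateDescriptorSets time.
VkDescriptorSetLayoutBinding binding = {};
binding.binding         = 0;                                // illustrative
binding.descriptorType  = VK_DESCRIPTOR_TYPE_SAMPLED_IMAGE; // illustrative
binding.descriptorCount = 64 * 1024;                        // upper bound only
binding.stageFlags      = VK_SHADER_STAGE_FRAGMENT_BIT;
```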

When allocating a descriptor set, we pass down the actual number of descriptors to allocate:

VkDescriptorSetVariableDescriptorCountAllocateInfoEXT variable_info = { … };
variable_info.sType =
    VK_STRUCTURE_TYPE_DESCRIPTOR_SET_VARIABLE_DESCRIPTOR_COUNT_ALLOCATE_INFO_EXT;
variable_info.descriptorSetCount = 1;
variable_info.pDescriptorCounts = &NumDescriptorsStreaming;
allocate_info.pNext = &variable_info;
VK_CHECK(vkAllocateDescriptorSets(get_device().get_handle(), &allocate_info,
                                  &descriptor_set));

GPU-assisted validation and debugging

When we enter the world of descriptor indexing, there is a flipside where debugging and validation is much more difficult. The major benefit of the older binding models is that it is fairly easy for validation layers and debuggers to know what is going on. This is because the number of available resources to a shader is small and focused.

With UPDATE_AFTER_BIND in particular, we do not know anything at draw time, which makes this awkward.

It is possible to enable GPU-assisted validation in the Khronos validation layers. This lets you catch issues like:

"UNASSIGNED-Descriptor uninitialized: Validation Error: [ UNASSIGNED-Descriptor uninitialized ] Object 0: handle = 0x55625acf5600, type = VK_OBJECT_TYPE_QUEUE; | MessageID = 0x893513c7 | Descriptor index 67 is uninitialized__.  Command buffer (0x55625b184d60). Draw Index 0x4. Pipeline (0x520000000052). Shader Module (0x510000000051). Shader Instruction Index = 59.  Stage = Fragment.  Fragment coord (x,y) = (944.5, 0.5).  Unable to find SPIR-V OpLine for source information.  Build shader with debug info to get source information."


"UNASSIGNED-Descriptor uninitialized: Validation Error: [ UNASSIGNED-Descriptor uninitialized ] Object 0: handle = 0x55625acf5600, type = VK_OBJECT_TYPE_QUEUE; | MessageID = 0x893513c7 | Descriptor index 131 is uninitialized__.  Command buffer (0x55625b1893c0). Draw Index 0x4. Pipeline (0x520000000052). Shader Module (0x510000000051). Shader Instruction Index = 59.  Stage = Fragment.  Fragment coord (x,y) = (944.5, 0.5).  Unable to find SPIR-V OpLine for source information.  Build shader with debug info to get source information."
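GPU-assisted validation is requested at instance creation time. A minimal sketch, assuming the Khronos validation layer is also enabled on the instance (names come from VK_EXT_validation_features):

```cpp
// Sketch: opt in to GPU-assisted validation when creating the instance.
VkValidationFeatureEnableEXT enables[] = {
    VK_VALIDATION_FEATURE_ENABLE_GPU_ASSISTED_EXT,
    VK_VALIDATION_FEATURE_ENABLE_GPU_ASSISTED_RESERVE_BINDING_SLOT_EXT };

VkValidationFeaturesEXT validation_features = {
    VK_STRUCTURE_TYPE_VALIDATION_FEATURES_EXT };
validation_features.enabledValidationFeatureCount = 2;
validation_features.pEnabledValidationFeatures = enables;

VkInstanceCreateInfo instance_info = { VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
instance_info.pNext = &validation_features;
// ... also list "VK_LAYER_KHRONOS_validation" in ppEnabledLayerNames,
// then call vkCreateInstance().
```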

RenderDoc supports debugging descriptor indexing through shader instrumentation, and this allows you to inspect which resources were accessed. When you have several thousand resources bound to a pipeline, this feature is critical to make any sense of the inputs.

If we are using the update-after-bind style, we can inspect the exact resources we used.

In a non-uniform indexing style, we can inspect all unique resources we used.


Descriptor indexing unlocks many design possibilities in your engine and is a real game changer for modern rendering techniques. Use with care, and make sure to take advantage of all debugging tools available to you. You need them.

This blog has explored the first Vulkan extension game changer, with two more parts in this game changer blog series still to come. The next part will focus on ‘Buffer Device Address’ and how developers can use this new feature to enhance their games.

Follow Up

Thanks to Hans-Kristian Arntzen and the team at Arm for bringing this great content to the Samsung Developers community. We hope you find this information about Vulkan extensions useful for developing your upcoming mobile games. The original version of this article can be viewed at Arm Community.

The Samsung Developers site has many resources for developers looking to build for and integrate with Samsung devices and services. Stay in touch with the latest news by creating a free account or by subscribing to our monthly newsletter. Visit the Marketing Resources page for information on promoting and distributing your apps and games. Finally, our developer forum is an excellent way to stay up-to-date on all things related to the Galaxy ecosystem.

