NVIDIA Vulkan SC 1.0.15 SDK README

Prerequisites
=============

1) Operating System: 64-bit x86 Windows or Linux.

2) The NVIDIA GPU Driver must be installed, with minimum version 560.00.

3) Vulkan SC development also requires the standard Vulkan SDK, which is
   available at https://vulkan.lunarg.com/sdk/home or through your
   distribution's package manager. This SDK was tested with version
   1.3.283.0 of the Vulkan SDK, but other recent versions are likely to
   work.

4) An NVIDIA GPU with a recent architecture. The Ampere and Ada (and
   future) architectures are fully supported. Experimental support is
   provided for the Turing architecture, but it is not fully guaranteed.
   An approximate mapping from product name to architecture is as
   follows:

     NVIDIA GeForce RTX 20 Series, NVIDIA RTX 20 Series => Turing
     NVIDIA GeForce RTX 30 Series, NVIDIA RTX 30 Series => Ampere
     NVIDIA GeForce RTX 40 Series, NVIDIA RTX 40 Series => Ada

   PLEASE NOTE: There are exceptions to the above rule! For example,
   some products sold under the GeForce RTX 30 Series label are not
   Ampere cards.

5) For Windows users interested in using the Direct-to-Display feature
   (VK_KHR_display extension), a specific type of Windows license (such
   as Windows Enterprise or Windows Pro for Workstations) may be
   required. For more information, please refer to the Microsoft
   documentation:
   https://learn.microsoft.com/en-us/windows-hardware/drivers/display/specialized-monitors
   This feature is also only supported by the NVIDIA Driver on Windows
   11 and later.

Installing the Vulkan SC SDK
============================

To install the Vulkan SC SDK, download the archive
(vulkansc-sdk-1.0.15.tar.gz for Linux; vulkansc-sdk-1.0.15.zip for
Windows) and extract it:

  Linux:
    $ tar xvzf vulkansc-sdk-1.0.15.tar.gz

  Windows PowerShell:
    PS> Expand-Archive -Path vulkansc-sdk-1.0.15.zip -DestinationPath .

The above will extract the SDK to the vulkansc-sdk-1.0.15/ directory.
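A quick way to confirm the extraction succeeded is to check for the top-level directories this README refers to. The following is an illustrative sketch (the helper function name is our own, not part of the SDK; the directory list reflects the Linux layout described below):

```shell
# Sketch: verify that an extracted SDK directory has the layout this
# README describes. Prints "layout OK" if all expected subdirectories
# are present, or names the first missing one.
check_sdk_layout() {
    sdk_root="$1"
    for d in sdk/include sdk/lib64 sdk/bin samples; do
        if [ ! -d "$sdk_root/$d" ]; then
            echo "missing: $sdk_root/$d"
            return 1
        fi
    done
    echo "layout OK"
}

# Example usage:
#   check_sdk_layout vulkansc-sdk-1.0.15
```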
For Linux users: if you intend to copy the Vulkan SC headers to
/usr/include, consider copying them to /usr/include/VulkanSC/vulkan/*
(and then passing this directory to the compiler with the -I flag in
Vulkan SC builds). Doing so prevents the Vulkan and Vulkan SC headers
from conflicting with one another.

Included in the Vulkan SC SDK
=============================

The sdk/ directory includes builds of the following Vulkan SC Ecosystem
components:

VulkanSC-Headers
  - Files: sdk/include/vulkan/*
  - These headers are the Vulkan SC equivalent of the standard Vulkan
    headers. Use these when compiling Vulkan SC applications.

VulkanSC-Loader
  - Files: sdk/lib64/libvulkansc.so (Linux), sdk/bin/vulkansc-1.dll (Windows)
  - This library finds Vulkan SC drivers and layers on your system and
    enables Vulkan SC applications to query them through a consistent
    interface. Link against this library when building Vulkan SC
    applications.

VulkanSC-Tools
  - Files: sdk/bin/vulkanscinfo (Linux), sdk/bin/vulkanscinfo.exe (Windows)
  - vulkanscinfo helps verify that the Vulkan SC driver is set up
    correctly and prints information about the driver's capabilities.

VulkanSC-ValidationLayers
  - Files: sdk/lib64/libVkSCLayer_khronos_validation.so (Linux),
           sdk/bin/VkSCLayer_khronos_validation.dll (Windows)
  - This layer performs extra runtime checks for valid usage of the
    Vulkan SC API, and may be helpful while debugging Vulkan SC
    applications.

VulkanTools-ForSC
  - Files: sdk/lib64/libVkLayer_json_gen.so (Linux),
           sdk/bin/VkLayer_json_gen.dll (Windows)
  - This layer saves pipeline information to disk in a format that can
    be read by nvidia-pcc. For more information, see the "Regenerating
    the pipeline JSON" section below.

More information about the sources of the Vulkan SC Ecosystem
components, and the dependencies of the Ecosystem components, can be
found in the config.txt file at the root level of this SDK.
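To make the "compile against the headers, link against the loader" advice concrete, a Linux build command might look like the following sketch. This is illustrative only: main.cpp and my_vksc_app are placeholder names, and the paths assume the SDK was extracted as described above.

```shell
# Illustrative build fragment, not part of the SDK.
# SDK_ROOT_DIR: the extracted vulkansc-sdk-1.0.15/ directory.
SDK_ROOT_DIR=/path/to/vulkansc-sdk-1.0.15

# Compile against the Vulkan SC headers and link against the loader
# (libvulkansc.so lives in sdk/lib64/, per the component list above).
g++ -std=c++20 \
    -I"$SDK_ROOT_DIR/sdk/include" \
    main.cpp \
    -L"$SDK_ROOT_DIR/sdk/lib64" \
    -lvulkansc \
    -o my_vksc_app
```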
Also included in this SDK is the source code for two sample Vulkan SC
applications, in the samples/ directory:

  - vksc_01tri
  - vksc_computeparticles

Precompiled binaries for both the Vulkan and Vulkan SC versions of the
samples are included in the samples/bin/ directory.

Compiling the pipeline cache
============================

Vulkan SC uses offline pipeline compilation, and therefore requires the
additional step of compiling a pipeline cache binary before running the
application. The Pipeline Cache Compiler (nvidia-pcc) has the following
requirements:

1) nvidia-pcc/nvidia-pcc.exe must be installed on the PATH.

2) Pipeline JSON descriptions and shader SPIR-V must be available on
   the filesystem. For vksc_01tri and vksc_computeparticles, these are
   provided in the samples/data/ directory.

3) The target architecture for PCC must be identified.

Requirement 1 can be satisfied by installing the NVIDIA Driver Package
(minimum version 560.00). Requirement 2 is satisfied by the
pre-packaged files under the samples/data/ directory; the "Regenerating
the pipeline JSON" section describes how to create these files
yourself. Requirement 3 is slightly more involved: nvidia-pcc needs to
be told the target GPU architecture in order to generate shader code.
This can be done either by providing the chip ID (e.g., ga102, ad102)
or by providing the 4-hex-digit PCI device ID. Identifying the target
GPU by PCI device ID is easier to accomplish programmatically.
On Linux, this can be accomplished with the `lspci` utility:

  $ export GPU_DEVICE_ID=$(lspci -n -vmm -d 10de::03xx | grep "^Device:" | cut -f 2)

On Windows, the same can be done with `pnputil` in PowerShell:

  PS> $GPU_DEVICE_ID = pnputil /enum-devices /bus PCI /deviceids /class Display |
        select-string -Pattern 'DEV_[0-9A-F]+' -List |
        ForEach-Object { $_.Matches.Value } |
        Select -First 1 |
        select-string -Pattern "[0-9A-F]+$" |
        ForEach-Object { $_.Matches.Value }

In either case, GPU_DEVICE_ID should be set to a four-digit hex code
corresponding to the PCI device ID, such as "1e87" or "2204".
Upper/lower case does not matter in this context.

Now, nvidia-pcc can be invoked as follows:

  $ cd samples/
  $ nvidia-pcc -out data/pipeline/vksc_01tri/pipeline_cache.bin -device $GPU_DEVICE_ID

Running the precompiled samples
===============================

Running either vksc_01tri or vksc_computeparticles has the following
requirements:

1) The application must be run from the samples/ directory.

2) The Vulkan SC loader library (libvulkansc.so / vulkansc-1.dll) must
   be in a location in the linker search path.

3) The pipeline cache must be compiled as described in the "Compiling
   the pipeline cache" section.

On Linux, execute the following:

  $ SDK_ROOT_DIR=/path/to/sdk/root
  $ export LD_LIBRARY_PATH=$SDK_ROOT_DIR/sdk/lib64/
  $ cd $SDK_ROOT_DIR/samples/
  $ bin/vksc_01tri

On Windows PowerShell, execute the following:

  PS> $SDK_ROOT_DIR = "C:\path\to\sdk\root"
  PS> $env:Path = "$SDK_ROOT_DIR\sdk\bin;" + $env:Path
  PS> cd $SDK_ROOT_DIR\samples\
  PS> bin\vksc_01tri.exe

The first three steps are the same for running vksc_computeparticles.

To view output, there are two options:

1) Add the -o option to the vksc_01tri/vksc_computeparticles command
   line. This produces a PPM image on disk (tri.ppm for vksc_01tri,
   particles.ppm for vksc_computeparticles).

2) Add the -d option to the vksc_01tri/vksc_computeparticles command
   line to enable Direct-to-Display mode.
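The lspci/pnputil pipelines above can silently produce an empty or malformed GPU_DEVICE_ID (for example, when no matching GPU is found). A small guard like the following sketch can catch that before invoking nvidia-pcc; the function name is our own, not part of the SDK:

```shell
# Sketch: check that a candidate device ID is exactly four hex digits
# (case does not matter, per the text above) before passing it on.
is_valid_device_id() {
    case "$1" in
        [0-9a-fA-F][0-9a-fA-F][0-9a-fA-F][0-9a-fA-F]) return 0 ;;
        *) return 1 ;;
    esac
}

# Example usage:
#   if is_valid_device_id "$GPU_DEVICE_ID"; then
#       nvidia-pcc -out data/pipeline/vksc_01tri/pipeline_cache.bin -device "$GPU_DEVICE_ID"
#   else
#       echo "GPU_DEVICE_ID is not a 4-hex-digit PCI device ID" >&2
#   fi
```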
Instructions for setting up your system for use with Direct-to-Display
can be found in the "Using Direct-to-Display (VK_KHR_display)" sections
below.

Regenerating the pipeline JSON
==============================

As you develop Vulkan SC applications, the configuration of the
VkPipelines that you use will evolve. Therefore, the JSON descriptions
of those pipelines will need to be regenerated, and the new JSON will
need to be passed to nvidia-pcc in the process described in "Compiling
the pipeline cache".

One approach to generating the pipeline JSON is to use a version of the
application that uses standard Vulkan as its backend. For convenience,
we provide precompiled Vulkan versions of the samples in the
samples/bin/ directory, named vk_01tri and vk_computeparticles. These
versions of the applications use the VkLayer_json_gen layer to capture
details about the pipelines and shaders used by the application and
dump them to disk in a format that can be used by nvidia-pcc.

On Linux, execute the following:

  $ SDK_ROOT_DIR=/path/to/sdk/root
  $ export VK_LAYER_PATH=$SDK_ROOT_DIR/sdk/etc/vulkan/explicit_layer.d/
  $ cd $SDK_ROOT_DIR/samples/
  $ bin/vk_01tri

On Windows PowerShell, execute the following:

  PS> $SDK_ROOT_DIR = "C:\path\to\sdk\root"
  PS> $env:VK_LAYER_PATH = "$SDK_ROOT_DIR\sdk\bin"
  PS> cd $SDK_ROOT_DIR\samples\
  PS> bin\vk_01tri.exe

Building the samples
====================

To build the samples, the following are required:

  - cmake version >= 3.7
  - A C++ compiler that supports the C++20 language standard.

To build vksc_01tri on Linux:

  $ cd samples/vksc_01tri/
  $ mkdir build
  $ cd build
  $ cmake ..
  $ cmake --build .

To build vksc_01tri on Windows PowerShell:

  PS> cd samples\vksc_01tri\
  PS> mkdir build
  PS> cd build
  PS> $env:VULKAN_SDK = "C:\VulkanSDK\1.3.283.0\"
  PS> cmake.exe ..
  PS> cmake.exe --build .

vksc_computeparticles can be built in the same way from the
samples/vksc_computeparticles directory.
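For your own Vulkan SC project, a build setup along the lines of the samples might look like the following CMakeLists.txt sketch. This is hypothetical and not shipped with the SDK: my_vksc_app and main.cpp are placeholder names, SDK_ROOT_DIR must point at the extracted SDK, and the link step shown is Linux-specific.

```cmake
# Hypothetical CMakeLists.txt sketch (Linux), assuming the SDK layout
# described in this README. CMake 3.12+ is needed here because
# CMAKE_CXX_STANDARD 20 is not recognized by older versions.
cmake_minimum_required(VERSION 3.12)
project(my_vksc_app CXX)

set(CMAKE_CXX_STANDARD 20)
set(CMAKE_CXX_STANDARD_REQUIRED ON)

# Pass -DSDK_ROOT_DIR=/path/to/vulkansc-sdk-1.0.15 when configuring.
set(SDK_ROOT_DIR "" CACHE PATH "Path to the extracted Vulkan SC SDK")

add_executable(my_vksc_app main.cpp)

# Compile against the Vulkan SC headers, link against the loader.
target_include_directories(my_vksc_app PRIVATE "${SDK_ROOT_DIR}/sdk/include")
target_link_libraries(my_vksc_app PRIVATE "${SDK_ROOT_DIR}/sdk/lib64/libvulkansc.so")
```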
Now, the samples (both Vulkan and Vulkan SC versions) can be run using
the locally built binaries at:

  (For Linux:)
  samples/vksc_01tri/build/{vksc_01tri,vk_01tri}
  samples/vksc_computeparticles/build/{vksc_computeparticles,vk_computeparticles}

  (For Windows:)
  samples/vksc_01tri/build/Debug/{vksc_01tri.exe,vk_01tri.exe}
  samples/vksc_computeparticles/build/Debug/{vksc_computeparticles.exe,vk_computeparticles.exe}

Using Direct-to-Display (VK_KHR_display)
========================================

The NVIDIA Vulkan SC driver for x86_64 Linux and Windows supports the
VK_KHR_display extension (also referred to as Direct-to-Display) as its
sole presentation mechanism. VK_KHR_display bypasses the standard
windowing system and allows the application to program the display
hardware directly, presenting VkImages to a specified monitor.

Direct-to-Display has some caveats, and its setup is more complex than
that of the standard Vulkan window system extensions. The next two
sections describe how to use Direct-to-Display on Windows and Linux. On
both platforms, once the system is set up for Direct-to-Display, the
samples can be run in Direct-to-Display mode using the '-d' flag, e.g.:

  $ bin/vksc_01tri -d
  $ bin/vksc_computeparticles -d

Using Direct-to-Display (VK_KHR_display) on Linux
=================================================

There is a known limitation with Vulkan SC Direct-to-Display on Linux:
it cannot acquire a display while any other application holds a display
acquired. Therefore, before starting a Vulkan SC Direct-to-Display
application, any other DRM users need to be terminated first (in
particular, the X server). There are many different ways X servers can
be launched but, on a distribution that uses systemd, the following
should work:

  $ sudo systemctl stop lightdm.service

You may need to replace lightdm with the name of the display manager
that your system uses.
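On systemd distributions, the active display manager is conventionally reachable through the display-manager.service alias, so the stop/start commands can be written without hard-coding lightdm. The following is a sketch under that assumption; the helper function name is our own:

```shell
# Sketch: derive the display manager unit name from the resolved path
# of systemd's display-manager.service alias, so the same commands work
# for lightdm, gdm, sddm, etc.
dm_unit_name() {
    # e.g. /usr/lib/systemd/system/lightdm.service -> lightdm.service
    basename "$1"
}

# Example usage (requires systemd and root):
#   DM_UNIT=$(dm_unit_name "$(readlink -f /etc/systemd/system/display-manager.service)")
#   sudo systemctl stop "$DM_UNIT"
#   bin/vksc_01tri -d
#   sudo systemctl start "$DM_UNIT"
```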
When finished, the corresponding 'start' command will restore the
display manager:

  $ sudo systemctl start lightdm.service

Using Direct-to-Display (VK_KHR_display) on Windows
===================================================

Only certain editions of Windows support removing a display from the
desktop, which is a required step in running Direct-to-Display
applications on Windows. For more information, please see item 5 in the
Prerequisites section.

In order to run Direct-to-Display applications, the test system needs
to have at least two displays connected, and at least one display needs
to be removed from the desktop. The first time a Direct-to-Display
application is run, it may fail -- this appears to be related to the
monitor powering on. Once the monitor has powered on, subsequent
Direct-to-Display applications should operate normally.

Displays can be removed from the Windows desktop using the Advanced
Display Settings UI:

  - Go to Start -> Settings -> System -> Display -> Advanced display
    settings.
  - Select the monitor that should become the Direct-to-Display output
    and toggle the "Remove display from desktop" switch to 'on'. The
    monitor will go blank, and the Windows desktop will no longer be
    extended to this display.
  - Once finished, the display can be returned to the system by
    toggling the "Remove display from desktop" switch back to 'off'.

PLEASE NOTE: There is a bug in certain versions of Windows which may
prevent displays from being usable after being removed from the
desktop. To fix this, use the "--force" option of configureDdisplay.exe's
"return-to-desktop" command, as described below.

Displays can also be managed using a command line utility, called
configureDdisplay.exe, which can be downloaded from
https://www.nvidia.com/en-us/drivers/ddisplay-utility/ .
To use this utility, launch PowerShell as administrator and run the
following commands:

For command line help:

  PS> .\configureDdisplay.exe -h

To gather information about the displays attached to the system:

  PS> .\configureDdisplay.exe status

Once you've identified which display you wish to use as the
Direct-to-Display output, keep track of the index of that display and
use the following command to remove it from the desktop:

  PS> .\configureDdisplay.exe remove-from-desktop --do-not-manage-display-mode --index <index>

At this point, the display can be used by the Vulkan SC application.
To return the display to the desktop, use the following command:

  PS> .\configureDdisplay.exe return-to-desktop --index <index>

There is a bug in certain versions of Windows which may prevent
displays from being usable after being removed from the desktop. To fix
this, run the following command:

  PS> .\configureDdisplay.exe return-to-desktop --force --index <index>

Support / Contact / Bug Reports
===============================

For support, please create a thread with the "vulkansc" tag on the
NVIDIA Vulkan developer forums at:

  https://forums.developer.nvidia.com/c/gaming-and-visualization-technologies/apis/vulkan/205

Please note that you will have to create an account on the developer
forums before you can create a new topic.