vkDuck Tutorial ๐“…ฐโ‹†หš

(*แด—อˆหฌแด—อˆ) Getting Started

Prerequisites

Before building, make sure you have the following installed: Git, Meson (with Ninja), a C++ toolchain, and the Vulkan SDK.

Building and Running

Clone the repository and build with Meson:

git clone https://github.com/fini03/vkDuck.git
cd vkDuck
meson setup build
meson compile -C build
./build/main

Project Structure

After building, the editor opens. Your working project follows this structure:

my-project/
  shaders/          # your .slang shader sources
  data/
    models/         # glTF/GLB models go here
  saved_states/     # scene configuration files
Note: All 3D models must be placed inside data/models/ at the project root before they can be loaded in the Asset Library.
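Assuming a fresh project, the layout above can be created from the shell (the directory names are exactly what the editor expects; the model filename in the comment is just a placeholder):

```shell
# Create the expected project layout
mkdir -p my-project/shaders my-project/data/models my-project/saved_states

# Copy your models into place before opening the Asset Library
# (duck.glb is a placeholder filename)
# cp ~/Downloads/duck.glb my-project/data/models/
```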

โœŽแฐ Writing Shaders for vkDuck

vkDuck uses the Slang shading language. Each shader file must follow a specific structure so the editor can reflect on it, auto-generate node pins, and bind resources correctly.

The common.slang Module (Fixed Contract)

The common.slang file defines the shared CPU/GPU types for lights, camera, object transforms, and materials. The Light and Camera structs are fixed and must not be changed. Every shader you write should import this module.

module common;

export public struct Light {
    public float3 position;
    public float  radius;
    public float3 color;
    public float  intensity;
};

export public struct LightsBuffer {
    public int   numLights;
    public int   _pad0;
    public int   _pad1;
    public int   _pad2;
    public Light lights[128];
};

export public struct Camera {
    public float4x4 view;
    public float4x4 invView;
    public float4x4 proj;
    public float4x4 invProj;
};

export public struct ObjectUniforms {
    public float4x4 model;
    public float4x4 normal;
};

export public struct MaterialParams {
    public float4 baseColorFactor;
    public float4 emissiveFactor;
    public float  metallicFactor;
    public float  roughnessFactor;
    public float  _padding0;
    public float  _padding1;
};
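Because these structs are shared CPU/GPU types, the host-side declarations must mirror the Slang layout exactly, including the explicit padding. A minimal sketch of a matching C++ declaration (the struct name and the use of float arrays are illustrative, not part of vkDuck's API):

```cpp
#include <cstddef>

// Host-side mirror of the Slang MaterialParams struct. The explicit
// padding floats keep the struct a multiple of 16 bytes, matching
// uniform-buffer alignment rules on the GPU.
struct MaterialParamsCPU {
    float baseColorFactor[4];  // float4
    float emissiveFactor[4];   // float4
    float metallicFactor;
    float roughnessFactor;
    float _padding0;
    float _padding1;
};

static_assert(sizeof(MaterialParamsCPU) == 48,
              "layout must match the GPU-side MaterialParams");
```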

Required Shader Structures

Every shader file must define three specific structures. The names don't matter โ€” the semantic annotations do, since the editor uses them for reflection.

Vertex Input (VSInput)

Supported semantics:

Semantic     Type             Description
POSITION     float3 / float4  Vertex object/world position
NORMAL       float3           Vertex normal
TEXCOORD0    float2           Primary UV coordinates
TANGENT      float4           Tangent vector

struct VSInput {
    float3 position : POSITION;
    float3 normal   : NORMAL;
    float2 uv       : TEXCOORD0;
};

Vertex Output / Fragment Input (VSOutput)

Must include SV_Position for clip-space position. Any other fields (UV, world-space position, normals, etc.) are interpolated and passed to the fragment stage.

struct VSOutput {
    float4 position  : SV_Position;
    float2 uv        : UV;
    float3 positionW : POSITION;
    float3 normalW   : NORMAL;
};

Fragment Output (FSOut)

Must write to SV_Target0 for the main colour attachment. Additional render targets can be added as SV_Target1, SV_Target2, etc.

struct FSOut {
    float4 color : SV_Target0;
};
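For example, a G-buffer-style pass could add a second target for world-space normals (a sketch; the field names are illustrative):

```slang
struct FSOut {
    float4 color  : SV_Target0;  // main colour attachment
    float4 normal : SV_Target1;  // e.g. world-space normals
};
```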

Resource Bindings

All resources must be explicitly tagged with [[vk::binding(binding, set)]]. The first argument is the binding number within the descriptor set; the second is the set number.

// Texture in set 0, binding 0
[[vk::binding(0, 0)]] Sampler2D albedoTexture;

// Camera UBO in set 1, binding 0
[[vk::binding(0, 1)]] ConstantBuffer<Camera> camera;

// Object transform UBO in set 1, binding 1
[[vk::binding(1, 1)]] ConstantBuffer<ObjectUniforms> objectUBO;

Shader Entry Points

Slang uses [shader("vertex")] and [shader("fragment")] attributes to mark entry points. The editor reflects these to determine what stages the shader file exposes.

[shader("vertex")]
VSOutput vertexMain(VSInput IN) { ... }

[shader("fragment")]
FSOut fragmentMain(VSOutput IN) { ... }

Shader Reflection and Node Pins โ‹†ห™โŸก

When you load or save a shader in the editor, vkDuck reflects the SPIR-V to discover inputs, outputs, and resource bindings. Each reflected resource automatically becomes a pin on the Pipeline node, letting you wire up Camera, Light, and Model nodes without any manual configuration.

โ–ณโ–ผ Building a Renderer in the Node Graph

The heart of vkDuck is the visual node graph. A renderer is assembled by creating nodes and connecting their output pins to input pins of other nodes.

Node Types Overview

Node          Purpose
Model Source  Loads a glTF/GLB model file and exposes its mesh sub-nodes
Vertex Data   Provides vertex/index buffer data from a model mesh to the pipeline
UBO           Provides per-object uniform buffer data (model & normal matrices)
Material      Provides texture and material parameter bindings for a mesh
Camera        Provides view/projection matrices and camera mode controls
Light         Provides a light source (position, colour, intensity) to the pipeline
Pipeline      The core render pass node — takes a shader, model data, and resources
Present       Outputs the final rendered image to the swapchain / preview window

Step-by-Step: Rendering a Model (โแด—อˆหฌแด—อˆ)

Step 1 โ€” Place your model files

Copy all .gltf or .glb files into data/models/ at your project root. The editor will not see them otherwise.

Step 2 โ€” Load models from the Asset Library

Create a Model Source node. Open the Asset Library tab in the editor. Your model files will appear there. Click a model to import it into the scene.

Step 3 โ€” Create per-mesh nodes

For each mesh inside the Model Source, create three companion nodes and connect them:

- Vertex Data: supplies the mesh's vertex/index buffers to the pipeline
- UBO: supplies the per-object model and normal matrices
- Material: supplies the mesh's textures and material parameters

Note: A single model with multiple meshes (e.g. a character with separate body/armour/hair meshes) needs one set of Vertex Data + UBO + Material per mesh.

Step 4 โ€” Add a Camera node

Create a Camera node from the node menu. Choose a camera mode (FPS, Orbital, or Fixed) using the node's drop-down. Connect its output pin to the Camera input of the Pipeline node.

Step 5 โ€” Add Light nodes

Create one or more Light nodes and set the position, colour, radius, and intensity for each. Connect their outputs to the LightsBuffer input on the Pipeline node.

Step 6 โ€” Create and configure the Pipeline node

Create a Pipeline node and, in the node's properties, select the shader file to use. vkDuck reflects the shader and generates an input pin for each resource binding; connect the Vertex Data, UBO, Material, Camera, and Light outputs to their matching pins.

Step 7 โ€” Connect to Present

Create a Present node and connect the Pipeline node's output to it. The real-time preview window will immediately show the result หšห™๐“…ฐห™หš
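Putting Steps 1 through 7 together, the finished graph for a single mesh looks roughly like this (repeat the Vertex Data / UBO / Material trio for each additional mesh):

```
Model Source ──► Vertex Data ──────┐
             ──► UBO ──────────────┼──► Pipeline ──► Present
             ──► Material ─────────┤
Camera ────────────────────────────┤
Light ─────────────────────────────┘
```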

โŒฏโŒฒ Complete Shader Example

A complete, minimal diffuse shader demonstrating all required conventions:

import common;

// โ”€โ”€ Resource Bindings โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
[[vk::binding(0, 0)]] Sampler2D       albedoTexture;
[[vk::binding(0, 1)]] ConstantBuffer<Camera>         camera;
[[vk::binding(0, 2)]] ConstantBuffer<ObjectUniforms> obj;
[[vk::binding(0, 3)]] ConstantBuffer<LightsBuffer>   lightsBuffer;
[[vk::binding(1, 0)]] ConstantBuffer<MaterialParams> material;

// โ”€โ”€ Vertex Input / Output โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
struct VSInput {
    float3 position : POSITION;
    float3 normal   : NORMAL;
    float2 uv       : TEXCOORD0;
};

struct VSOutput {
    float4 position  : SV_Position;
    float2 uv        : UV;
    float3 positionW : POSITION;
    float3 normalW   : NORMAL;
};

struct FSOut {
    float4 color : SV_Target0;
};

// โ”€โ”€ Vertex Shader โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
[shader("vertex")]
VSOutput vertexMain(VSInput IN) {
    VSOutput OUT;
    float4 worldPos = mul(obj.model, float4(IN.position, 1.0));
    OUT.positionW   = worldPos.xyz;
    OUT.normalW     = normalize(mul((float3x3)obj.normal, IN.normal));
    OUT.position    = mul(camera.proj, mul(camera.view, worldPos));
    OUT.uv          = IN.uv;
    return OUT;
}

// โ”€โ”€ Fragment Shader โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€
[shader("fragment")]
FSOut fragmentMain(VSOutput IN) {
    FSOut OUT;

    float4 baseColor = albedoTexture.Sample(IN.uv)
                     * material.baseColorFactor;

    // Simple Lambertian shading over all lights
    float3 diffuse = float3(0.0, 0.0, 0.0);
    for (int i = 0; i < lightsBuffer.numLights; ++i) {
        Light  l     = lightsBuffer.lights[i];
        float3 dir   = normalize(l.position - IN.positionW);
        float  NdotL = max(dot(IN.normalW, dir), 0.0);
        float  dist  = length(l.position - IN.positionW);
        float  atten = clamp(1.0 - dist / l.radius, 0.0, 1.0);
        diffuse += l.color * l.intensity * NdotL * atten;
    }

    OUT.color = float4(baseColor.rgb * diffuse, baseColor.a);
    return OUT;
}

โŒฏโŒฒ GitHub  ยท  โŒฏโŒฒ Vulkanised 2026 Talk  ยท  โ† quackie.at