Triangles on Web Ch3: Texture
In the last part, we built the Game of Life with WebGPU. In this part, we will add a texture to the game.
A texture is a 2D image that can be used in shaders, either to color primitives or to store data. In this part, we will use a texture to color the triangles.
Texture
A texture is simply a 2D image that shaders can read to color the triangles. In WebGPU, a texture is created with the device.createTexture
method, which takes a GPUTextureDescriptor
object.
Creating a Texture
Let's first find an image on the internet. I found one on Freepik: a grass texture by macrovector.
First, we need to load it as a bitmap. This is simple: use fetch
to download the image, and createImageBitmap
to decode it into a bitmap.
// Fetch an image from a URL and decode it into an ImageBitmap.
async function loadImageBitmap(url: string): Promise<ImageBitmap> {
  const res = await fetch(url);
  const blob = await res.blob();
  return await createImageBitmap(blob);
}
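For example, assuming the image is served at /grass.png (a placeholder path, not from the original project):

const bitmap = await loadImageBitmap("/grass.png");
// The bitmap now carries the decoded pixels and its dimensions.
console.log(bitmap.width, bitmap.height);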
Then, create a texture to hold the bitmap,
const texture = device.createTexture({
  label: "grass",
  format: 'rgba8unorm',
  size: [bitmap.width, bitmap.height],
  usage: GPUTextureUsage.TEXTURE_BINDING |
    GPUTextureUsage.COPY_DST |
    GPUTextureUsage.RENDER_ATTACHMENT,
});
TEXTURE_BINDING
means the texture can be bound in a bind group and sampled in shaders. COPY_DST
, which we have already seen, means it can be the destination of a copy. RENDER_ATTACHMENT
means it can be rendered to directly. We will use copyExternalImageToTexture
below, which requires both COPY_DST and RENDER_ATTACHMENT, so all of these usages are necessary.
Now, let's copy,
device.queue.copyExternalImageToTexture(
  { source: bitmap, flipY: true },
  { texture },
  { width: bitmap.width, height: bitmap.height },
);
copyExternalImageToTexture
copies the bitmap into the texture. flipY
flips the image vertically during the copy. We will derive our texture coordinates from NDC, where the y axis points up, while the image's rows are stored top to bottom, so without the flip the picture would render upside down. For a grass texture, though, you can hardly tell the difference.
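If you like, the three steps can be wrapped into one helper. Here is a minimal sketch using only the calls above (the name createTextureFromUrl is ours, not a WebGPU API):

// Load an image from a URL straight into a GPU texture.
async function createTextureFromUrl(
  device: GPUDevice,
  url: string,
): Promise<GPUTexture> {
  const bitmap = await loadImageBitmap(url);
  const texture = device.createTexture({
    label: url,
    format: 'rgba8unorm',
    size: [bitmap.width, bitmap.height],
    usage: GPUTextureUsage.TEXTURE_BINDING |
      GPUTextureUsage.COPY_DST |
      GPUTextureUsage.RENDER_ATTACHMENT,
  });
  device.queue.copyExternalImageToTexture(
    { source: bitmap, flipY: true },
    { texture },
    { width: bitmap.width, height: bitmap.height },
  );
  return texture;
}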
Using the Texture
Now that we have the texture, let's use it in the shader.
A texture is just a picture, and pictures are painted by the fragment shader: for each fragment, it picks a pixel from the texture and uses it as the fragment's color, so the texture ends up applied to the triangle.
However, there may be no direct match between the displayed primitive and the texture; the texture might be smaller or larger than the primitive. Besides, our shaders only specify values at the vertices, while the fragment shader needs a value for every pixel in between.
The solution is texture coordinates. Texture coordinates are positions within the texture, in the range [0, 1], and they determine which pixel of the texture is read.
For example, coordinates (0, 0) pick the pixel at the top-left corner of the texture, and (1, 1) pick the pixel at the bottom-right corner. For coordinates that fall between pixels, the colors are interpolated.
This process is called texture sampling: the texture is sampled at the given coordinates, and a color is produced.
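To build intuition, here is a rough CPU-side sketch of what linear sampling computes. The row-major RGBA layout and every name here are our own assumptions for illustration, not WebGPU API:

// Bilinear ("linear") sampling: blend the four texels nearest to (u, v).
function sampleLinear(
  texels: Float32Array, // RGBA values, row-major, 4 floats per texel
  width: number,
  height: number,
  u: number, // texture coordinates in [0, 1]
  v: number,
): [number, number, number, number] {
  // Map [0, 1] to texel space, sampling at texel centers, clamped to edges.
  const x = Math.min(Math.max(u * width - 0.5, 0), width - 1);
  const y = Math.min(Math.max(v * height - 0.5, 0), height - 1);
  const x0 = Math.floor(x), x1 = Math.min(x0 + 1, width - 1);
  const y0 = Math.floor(y), y1 = Math.min(y0 + 1, height - 1);
  const fx = x - x0, fy = y - y0;
  const texel = (tx: number, ty: number, c: number) =>
    texels[(ty * width + tx) * 4 + c];
  // Blend the four nearest texels, weighted by distance.
  const out: [number, number, number, number] = [0, 0, 0, 0];
  for (let c = 0; c < 4; c++) {
    const top = texel(x0, y0, c) * (1 - fx) + texel(x1, y0, c) * fx;
    const bottom = texel(x0, y1, c) * (1 - fx) + texel(x1, y1, c) * fx;
    out[c] = top * (1 - fy) + bottom * fy;
  }
  return out;
}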
Previously, we loaded the picture into the variable texture
. Now, we need to make the texture available to the shader, along with a sampler to sample it. Both are just special bindings.
We also add texture coordinates to the vertex output, and use textureSample
to let the fragment shader sample the texture.
const getShaderModule = async (device: GPUDevice): Promise<GPUShaderModule> => {
  return device.createShaderModule({
    label: "shader",
    code: `
      @group(0) @binding(0) var<storage> states: array<u32>;
      @group(0) @binding(2) var grass_texture: texture_2d<f32>;
      @group(0) @binding(3) var grass_sampler: sampler;

      struct VertexOutput {
        @builtin(position) pos: vec4f,
        @location(0) @interpolate(flat) cell: u32,
        @location(1) texCoord: vec2<f32>,
      };

      @vertex
      fn vertexMain(@location(0) pos: vec2f, @builtin(vertex_index) vertexIndex: u32) -> VertexOutput {
        var output: VertexOutput;
        output.pos = vec4<f32>(pos, 0.0, 1.0);
        output.cell = vertexIndex / 6;
        // Map NDC ([-1, 1]) to texture coordinates ([0, 1]).
        output.texCoord = output.pos.xy * 0.5 + 0.5;
        return output;
      }

      @fragment
      fn fragmentMain(input: VertexOutput) -> @location(0) vec4<f32> {
        if (states[input.cell] == 0u) {
          discard;
        }
        return textureSample(grass_texture, grass_sampler, input.texCoord);
      }
    `
  });
}
Please note that NDC is in the range [-1, 1], so we convert it to [0, 1] by multiplying by 0.5 and adding 0.5; for example, the bottom-left corner (-1, -1) maps to (0, 0).
The vec2<f32>
output is automatically interpolated by the rasterizer, with the default mode @interpolate(perspective)
, which means the texture coordinates are interpolated in a perspective-correct way. Since our geometry is flat 2D, you can regard it as plain linear interpolation based on the vertex coordinates: a fragment halfway along an edge gets the average of the two endpoint texture coordinates.
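As a rough mental model (the names here are ours, purely for illustration), the rasterizer blends the three vertices' texture coordinates with each fragment's barycentric weights; for our flat quads, where every vertex has w = 1, perspective correction changes nothing:

type Vec2 = [number, number];

// Each fragment inside a triangle has barycentric weights w0 + w1 + w2 = 1;
// its texCoord is the weighted average of the three vertex texCoords.
function interpolateTexCoord(
  t0: Vec2, t1: Vec2, t2: Vec2,
  w0: number, w1: number, w2: number,
): Vec2 {
  return [
    t0[0] * w0 + t1[0] * w1 + t2[0] * w2,
    t0[1] * w0 + t1[1] * w1 + t2[1] * w2,
  ];
}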
Now we need to pass the values with the bindings. It's the same as before, except that we also pass the texture and the sampler.
const getBindGroupLayout = async (device: GPUDevice): Promise<GPUBindGroupLayout> => {
  const bindGroupLayout = device.createBindGroupLayout({
    entries: [{
      binding: 0,
      visibility: GPUShaderStage.FRAGMENT | GPUShaderStage.COMPUTE | GPUShaderStage.VERTEX,
      buffer: {
        type: "read-only-storage"
      }
    }, {
      binding: 1,
      visibility: GPUShaderStage.FRAGMENT | GPUShaderStage.COMPUTE,
      buffer: {
        type: "storage"
      }
    }, {
      // The empty objects request the defaults: a float-sampleable
      // 2D texture and a filtering sampler.
      binding: 2,
      visibility: GPUShaderStage.FRAGMENT,
      texture: {}
    }, {
      binding: 3,
      visibility: GPUShaderStage.FRAGMENT,
      sampler: {}
    }]
  });
  return bindGroupLayout;
}
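As in the previous parts, this layout is attached to the pipelines through a pipeline layout; in case you need a reminder, a minimal sketch:

const pipelineLayout = device.createPipelineLayout({
  label: "pipeline layout",
  bindGroupLayouts: [bindGroupLayout],
});
// Pass this as `layout` when creating the render and compute pipelines.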
We have already created the texture, but not the sampler. A sampler is created with the device.createSampler
method.
const getSampler = (device: GPUDevice): GPUSampler => {
  return device.createSampler({
    magFilter: "linear",
    minFilter: "linear",
  });
}
This is a simple sampler with linear interpolation for both magnification and minification. There are other options, such as nearest
for nearest-neighbor filtering.
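For a pixelated look, you could instead create (a sketch; pixelatedSampler is our name):

// Nearest-neighbor filtering picks the single closest texel instead of
// blending the four nearest ones, giving hard pixel edges.
const pixelatedSampler = device.createSampler({
  magFilter: "nearest",
  minFilter: "nearest",
});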
Finally, we just change the bind group creation to include the texture and the sampler.
const bindGroup = device.createBindGroup({
  label: "bind group",
  layout: bindGroupLayout,
  entries: [{
    binding: 0,
    resource: { buffer: statesStorageBuffer[step % 2] },
  }, {
    binding: 1,
    resource: { buffer: statesStorageBuffer[(step + 1) % 2] },
  }, {
    binding: 2,
    resource: texture.createView(),
  }, {
    binding: 3,
    resource: sampler,
  }]
});
Note that you need to pass a view of the texture, not the texture itself; calling createView() with no arguments gives a view covering the whole texture.
Now we can see that each cell shows part of the grass texture: the picture is spread across the whole board, and each triangle displays the piece it covers.
If you want each cell to show the full picture instead, change the texture coordinates to the following,
@vertex
fn vertexMain(@location(0) pos: vec2f, @builtin(vertex_index) vertexIndex: u32) -> VertexOutput {
  var output: VertexOutput;
  output.pos = vec4<f32>(pos, 0.0, 1.0);
  output.cell = vertexIndex / 6;
  // Each cell is a quad of 6 vertices (two triangles); assign each
  // vertex one corner of the texture.
  if (vertexIndex % 6 == 0) {
    output.texCoord = vec2<f32>(0.0, 0.0);
  } else if (vertexIndex % 6 == 1 || vertexIndex % 6 == 4) {
    output.texCoord = vec2<f32>(1.0, 0.0);
  } else if (vertexIndex % 6 == 2 || vertexIndex % 6 == 3) {
    output.texCoord = vec2<f32>(0.0, 1.0);
  } else {
    output.texCoord = vec2<f32>(1.0, 1.0);
  }
  return output;
}
Now, each cell shows the full picture, because every quad's texture coordinates span the full [0, 1] range, from one corner of the texture to the other.