People new to 3D are frequently confused by the extra level of indirection introduced by texture coordinates. 3D veterans familiar with texture coordinates from other platforms are sometimes surprised that the default behavior in the Windows Presentation Foundation (formerly code-named Avalon) is a bit different from what they are accustomed to in other 3D APIs. Below is a quick overview of what texture coordinates are and how they are specified. The last section discusses the default behavior in WPF and how the traditional behavior can be attained by tweaking Brush properties.
What are Texture Coordinates?
Texture coordinates specify which part of the Material shows up on which part of the MeshGeometry3D. Developers from a 2D background are usually surprised that this is necessary. Most 2D graphics APIs give users drawing primitives like Ellipses, Rectangles, Paths, etc. with a predefined fill behavior that stretches the brush to the bounds of the given geometry. This is the default in WPF as well. Consider the following diagonal GradientBrush, which is white in the upper left corner, red in the middle, and black in the lower right corner:
<LinearGradientBrush x:Key="MyGrad" StartPoint="0,0" EndPoint="1,1">
  <GradientStop Color="#FFFFFF" Offset="0" />
  <GradientStop Color="#FF0000" Offset="0.5" />
  <GradientStop Color="#000000" Offset="1" />
</LinearGradientBrush>
By default, when a brush is used to fill a 2D geometry, the brush is applied relative to the bounding box of the shape. If applied to a Rectangle like the 100×100 square below, the entire brush will be visible. If the geometry does not entirely fill its bounding box, parts of the brush will not be visible. For example, the non-rectangular parallelogram below does not show the extreme white and black corners of the brush.
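As a sketch of this default fill behavior, the two shapes can be declared as follows. The gradient is referenced by the "MyGrad" key defined above; the parallelogram's Path data is illustrative, not taken from the original figures:

```xml
<StackPanel Orientation="Horizontal">
  <!-- 100x100 square: the brush's bounding box matches the shape,
       so the full gradient, white corner to black corner, is visible. -->
  <Rectangle Width="100" Height="100" Fill="{StaticResource MyGrad}" />

  <!-- A parallelogram inside a 100x100 bounding box: the brush is still
       mapped to the bounding box, so its white and black extremes fall
       outside the filled area and are clipped away. -->
  <Path Fill="{StaticResource MyGrad}" Data="M 25,0 L 100,0 75,100 0,100 Z" />
</StackPanel>
```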
People working with 3D for the first time often expect that materials will be similarly stretched to fit the shape of their geometry. Indeed, in my last post my sample 3D primitives have default texture coordinates which mimic this behavior:
However, there is no single mapping which provides a reasonable default for all 3D geometry. Even with simple geometry like the three quadrics depicted above, the method for generating the texture coordinates of the sphere differs from that of the cone and cylinder. In order to make the material wrap sensibly around each primitive, I had to explicitly specify a 2D texture coordinate for each 3D position. For example, the cone primitive generates a mesh similar to the one depicted below. Each 3D position in the mesh has a corresponding 2D texture coordinate which maps that 3D position to a 2D coordinate in the material.
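For reference, the standard parameterizations behind such mappings (the usual approach, not spelled out in the original post) look roughly like this. For a unit sphere with longitude $\theta \in [0, 2\pi)$ and latitude $\phi \in [0, \pi]$:

```latex
u = \frac{\theta}{2\pi}, \qquad v = \frac{\phi}{\pi}
```

whereas for a cone or cylinder of height $h$, the angle around the axis drives $u$ while the height along the axis drives $v$:

```latex
u = \frac{\theta}{2\pi}, \qquad v = \frac{y}{h}
```

This is why the sphere's mapping differs from the cone's and cylinder's: latitude and axial height are different parameters even though both end up in $v$.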
Specifying Texture Coordinates
You specify texture coordinates by adding 2D Points to the MeshGeometry3D.TextureCoordinates collection. For example, the following MeshGeometry3D maps the lower-right half of the material to a triangle:
<MeshGeometry3D
    Positions="-1,1,0 -1,-1,0 1,-1,0"
    TextureCoordinates="0,0 0,1 1,1" />
The ith entry in the TextureCoordinates list corresponds to the ith entry in the Positions list, so texture coordinate 0,0 is associated with the vertex at position -1,1,0. The mapping between texture coordinates and positions is depicted below:
[Figure: texture coordinates (left), rendered triangle (right)]
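Putting the mesh to work, a minimal model might look like the following. The DiffuseMaterial wrapper and the explicit TriangleIndices winding are my additions for completeness, not from the original post; the brush reuses the "MyGrad" gradient defined earlier:

```xml
<GeometryModel3D>
  <GeometryModel3D.Geometry>
    <!-- One triangle; texture coordinate i pairs with position i. -->
    <MeshGeometry3D Positions="-1,1,0 -1,-1,0 1,-1,0"
                    TriangleIndices="0 1 2"
                    TextureCoordinates="0,0 0,1 1,1" />
  </GeometryModel3D.Geometry>
  <GeometryModel3D.Material>
    <!-- Any 2D Brush can serve as the texture source. -->
    <DiffuseMaterial Brush="{StaticResource MyGrad}" />
  </GeometryModel3D.Material>
</GeometryModel3D>
```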
Of course, this is not the only possible mapping. If we change the 2nd texture coordinate from 0,1 to 0,0.5, we produce the following image:
[Figure: texture coordinates (left), rendered triangle (right)]
Notice that the area of the brush selected by the texture coordinates on the left is stretched to fit the rendered triangle.
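Spelled out, the modified mesh is:

```xml
<!-- Only the second texture coordinate changes: 0,1 becomes 0,0.5,
     so half the brush's height is stretched across that edge. -->
<MeshGeometry3D
    Positions="-1,1,0 -1,-1,0 1,-1,0"
    TextureCoordinates="0,0 0,0.5 1,1" />
```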
Texture coordinates are required when using a Material with an ImageBrush, DrawingBrush, VisualBrush, LinearGradientBrush, or RadialGradientBrush (i.e., any Brush other than a SolidColorBrush, in which case every position in the mesh maps to the same color).
Texture Coordinates in the Windows Presentation Foundation
There are a couple of behavior differences that 3D veterans should be aware of when working with texture coordinates in the Windows Presentation Foundation. These differences arise from the fact that 3D Materials leverage 2D Brushes as the source of their textures. The wins from being able to take any arbitrary 2D brush and use it to paint a 3D mesh are enormous. However, brushes default to behavior which, while traditional for 2D, is sometimes surprising for 3D developers. Fortunately, brushes are highly configurable, and it is easy to get the typical 3D behavior.
The first difference is that the +Y axis in brush space (and consequently texture coordinate space) points down by default. This is consistent with 2D but different from the traditional 3D coordinate system, where the +Y axis points up. To get the +Y axis pointing up, apply a Y scale of -1 and a Y translation of +1:
<ImageBrush Transform="scale(1,-1) translate(0,1)" ImageSource="…" />
The second difference is that brushes default to relative units. At the beginning of this article I discussed how, by default, the brush maps to the relative bounding box of the 2D geometry it is being used to fill. The same thing happens by default with the bounding box of your texture coordinates. In the above samples I happened to use texture coordinates in the range of 0..1, but texture coordinates of "-20,-20 -20,10 10,10" would have also filled the triangle with the same part of the image. TileBrush's Viewport property defaults to the rect (0,0)-(1,1), so to get the standard behavior where the bounds of the texture source map to 0..1, you just need to specify that you are using absolute units:
<ImageBrush ViewportUnits="Absolute" Transform="scale(1,-1) translate(0,1)" ImageSource="…" />
These two settings should configure brush / texture space to be identical to what you are probably used to from other APIs. (Incidentally, all of these Brush settings have the same effect in 2D as well.) Finally, if you need tiling, you turn it on with TileMode="Tile".
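Putting the pieces together, a brush configured for traditional 3D texturing with tiling might look like this. The image name is a placeholder of my own, and the transform string follows the notation used earlier in this article:

```xml
<!-- ViewportUnits="Absolute" maps the image's bounds to 0..1 in texture
     space (rather than to the texture coordinates' bounding box), the
     Transform flips +Y to point up, and TileMode repeats the image for
     texture coordinates outside 0..1. -->
<ImageBrush ImageSource="checker.png"
            ViewportUnits="Absolute"
            Viewport="0,0,1,1"
            TileMode="Tile"
            Transform="scale(1,-1) translate(0,1)" />
```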