Description
Part 3 of a three-part series explaining how to add textures to DirectX primitive shapes.
Welcome to Ask The ZMan, a new column on the Coding4Fun game portal. The ZMan is here to solve your Managed DirectX programming problems. If you have a question for the ZMan send it to zman@thezbuffer.com.
Since Coding4Fun has only just launched, there aren't any specific questions to answer so I thought I would start with a question that occurs frequently on the Managed DirectX newsgroup:
"I've created my first Managed DirectX project and can successfully draw shapes created using the Mesh.{shape} primitives. However, when I try to apply a texture to them, nothing happens (b) the shapes go black or (c) the shapes disappear. How can I add a texture to the DirectX primitive shapes?"
Homework
Last time I covered how to manually assign points on a texture map to a box. If you spent any time thinking about the homework, you will have realised that as the meshes get more complex, the texture map that goes with them and the set of texture coordinates quickly become too complex to work out by hand. I left you with the following question:
You have seen what the texture map template for the box looks like. What do you think a texture map for a cylinder or a sphere would look like?
Download textures5.msi using the link at the top of this article. This solution allows you to display any of the Mesh.{shape} primitives and to render them in wireframe so that you can see the underlying mesh. Imagine unfolding those shapes until all the triangles lie flat. Figure 1 shows the cylinder and a texture map that would work for this mesh.
Figure 1. The wireframe cylinder and one possible texture map that could be associated with it
Not only do you have to work out a suitable texture map, you also have to work out all of the corresponding texture coordinates. The cylinder isn't too bad: the sides are represented by the rectangle in Figure 1, and the texture coordinates would be evenly distributed across it. The top and bottom are represented by the two circles, and those coordinates can be calculated with some simple sine and cosine formulas. You would still have to work out which vertex is which in the mesh structure, e.g., is vertex[49] part of the sides or the top?
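To make the sine and cosine calculation concrete, here is a rough sketch in Python (not the sample's C# code; the cap's position and radius on the texture map are assumptions purely for illustration):

```python
import math

def cylinder_cap_uvs(sides, center_u, center_v, radius):
    """Texture coordinates for the rim of one cylinder cap, laid out as a
    circle of the given radius centred at (center_u, center_v) on the map."""
    uvs = []
    for i in range(sides):
        angle = 2.0 * math.pi * i / sides  # angle of this rim vertex
        u = center_u + radius * math.cos(angle)
        v = center_v + radius * math.sin(angle)
        uvs.append((u, v))
    return uvs

# an 8-sided cap placed in the lower-left quarter of the texture
rim = cylinder_cap_uvs(8, 0.25, 0.75, 0.2)
```

You would still need to match each computed pair to the right vertex in the mesh's vertex buffer, which is exactly the tedious part the article warns about.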
Hopefully, you now agree with my assertion from the first article in the series that texturing is best left to the artists in most cases. At the very least you should not do it manually for anything other than the simplest meshes. There are, however, some automated ways of assigning textures.
Automatic Texture Mapping
Imagine a globe of the earth. Now take an imaginary pocket knife, cut a line straight down the International Date Line, and peel the skin off the globe. What you have is a texture that represents the surface of the earth. The first problem is that it's not flat. Imagine that the skin is made of rubber so you can stretch it until it is flat. You have to do most of the stretching at the north and south poles to make them as wide as the equator, but when you are done you will have a flat texture that looks something like Figure 2.
Figure 2. An example texture map for the earth.
The problem to solve is how to map this texture back onto the sphere. It's obviously possible, since it came from there in the first place. The solution is called spherical texture mapping.
Take a look at Figure 3 (click it for a large version). The red vector is what I will call the vertex ray. It represents a point on the surface of the sphere that needs texture coordinates and is a vector starting at the center of the object and passing through the point of interest. The point can be described by a longitude (the light blue curve) and latitude (the light green curve). The longitude and latitude equations are based on the angles between the vertex ray and the north vector and the equator vector, respectively (the two black vectors). After these are calculated, they are scaled into the range 0.0 to 1.0 to give the corresponding point on the texture map.
Figure 3. The mapping between longitude and latitude and the uv texture space
The longitude and latitude functions are described in Paul Rademacher's Ray Tracing article and the code looks like the following:
C#
//Ref: http://www.cs.unc.edu/~rademach/xroadsRT/RTarticle.html and
//Glassner, A. (ed) An Introduction to Ray Tracing. Academic Press, New York, N.Y. 1989.
phi = Math.Acos((double)Vector3.Dot(north, vertexRay));
verts[i].Tv = (float)(phi / Math.PI);
if (phi == 0.0)
{
    //if north and vertex ray are coincident then we can pick an
    //arbitrary u since it's the entire top/bottom line of the texture
    u = 0.5f;
}
else
{
    //Clamp the acos() param to -1.0/1.0
    //(rounding errors are sometimes taking it slightly over).
    u = (float)(Math.Acos(Math.Max(Math.Min((double)Vector3.Dot(equator, vertexRay) /
        Math.Sin(phi), 1.0), -1.0)) / (2.0 * Math.PI));
    if (Vector3.Dot(northEquatorCross, vertexRay) < 0.0)
    {
        verts[i].Tu = u;
    }
    else
    {
        verts[i].Tu = 1 - u;
    }
}
Visual Basic
'Ref: http://www.cs.unc.edu/~rademach/xroadsRT/RTarticle.html and
'Glassner, A. (ed) An Introduction to Ray Tracing. Academic Press, New York, N.Y. 1989.
phi = Math.Acos(CDbl(Vector3.Dot(north, vertexRay)))
verts(i).Tv = CSng(phi / Math.PI)
If phi = 0.0 Then
    'if north and vertex ray are coincident then we can pick an
    'arbitrary u since it's the entire top/bottom line of the texture
    u = 0.5F
Else
    'Clamp the acos() param to -1.0/1.0
    '(rounding errors are sometimes taking it slightly over).
    u = CSng(Math.Acos(Math.Max(Math.Min(CDbl(Vector3.Dot(equator, vertexRay)) / _
        Math.Sin(phi), 1.0), -1.0)) / (2.0 * Math.PI))
    If Vector3.Dot(northEquatorCross, vertexRay) < 0.0 Then
        verts(i).Tu = u
    Else
        verts(i).Tu = 1 - u
    End If
End If
Though this looks complex, it can be simplified in this case because the north vector is the z-axis (0, 0, 1) and the equator vector is the y-axis (0, 1, 0). Since (a, b, c) dot (d, e, f) is ad+be+cf, most of the dot products reduce to a single component of the vertex ray, and the cross product of the two basis vectors is also a constant axis vector. The final code is therefore:
C#
//Since we know we are using normalised axes we can simplify this somewhat!
//Note these simplifications only apply if the basis vectors are the unit axes:
//north=(0,0,1)=z-axis, equator=(0,1,0)=y-axis and north x equator=(1,0,0)=x-axis
//since (0,0,1)dot(x,y,z)==z and (0,1,0)dot(x,y,z)==y
//if north and vertex ray are coincident then we can pick an arbitrary u, since
//it's the entire top/bottom line of the texture
phi = Math.Acos((double)vertexRay.Z);
verts[i].Tv = (float)(phi / Math.PI);
if (vertexRay.Z == 1.0f || vertexRay.Z == -1.0f)
{
    verts[i].Tu = 0.5f;
}
else
{
    u = (float)(Math.Acos(Math.Max(Math.Min((double)vertexRay.Y /
        Math.Sin(phi), 1.0), -1.0)) / (2.0 * Math.PI));
    //Since the cross product is just giving us (1,0,0) i.e. the x-axis
    //and the dot product was giving us a +ve or -ve angle, we can just
    //compare the x value with 0
    verts[i].Tu = (vertexRay.X > 0f) ? u : 1 - u;
}
Visual Basic
'Since we know we are using normalised axes we can simplify this somewhat!
'Note these simplifications only apply if the basis vectors are the unit axes:
'north=(0,0,1)=z-axis, equator=(0,1,0)=y-axis and north x equator=(1,0,0)=x-axis
'since (0,0,1)dot(x,y,z)==z and (0,1,0)dot(x,y,z)==y
'if north and vertex ray are coincident then we can pick an arbitrary u since
'it's the entire top/bottom line of the texture
phi = Math.Acos(CDbl(vertexRay.Z))
verts(i).Tv = CSng(phi / Math.PI)
If vertexRay.Z = 1.0F OrElse vertexRay.Z = -1.0F Then
    verts(i).Tu = 0.5F
Else
    u = CSng((Math.Acos(Math.Max(Math.Min(CDbl(vertexRay.Y) / _
        Math.Sin(phi), 1), -1)) / (2 * Math.PI)))
    'Since the cross product is just giving us (1,0,0) i.e. the x-axis and the dot
    'product was giving us a +ve or -ve angle, we can just compare the x value with 0
    verts(i).Tu = IIf((vertexRay.X > 0.0F), u, 1 - u)
End If
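As a quick sanity check on that simplification, here is a throwaway Python snippet (not part of the sample): dotting any vector with a unit axis just selects one of its components, which is why the full dot products collapse to .Z and .Y.

```python
def dot(a, b):
    """Plain 3-component dot product: (a1,a2,a3).(b1,b2,b3) = a1*b1 + a2*b2 + a3*b3."""
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

north = (0.0, 0.0, 1.0)    # the z-axis
equator = (0.0, 1.0, 0.0)  # the y-axis

vertex_ray = (0.3, -0.5, 0.8)  # an arbitrary vertex ray

# (0,0,1) dot (x,y,z) == z and (0,1,0) dot (x,y,z) == y
assert dot(north, vertex_ray) == vertex_ray[2]
assert dot(equator, vertex_ray) == vertex_ray[1]
```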
Download textures6.msi using the link at the top of this article for the complete application.
For an interesting aside into debugging this algorithm and why the Min and Max functions are necessary, see my blog entry on the subject.
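To see why the clamp matters, here is a small Python illustration (clamped_acos is a hypothetical helper, not from the sample code). Floating-point rounding can push the acos() argument fractionally outside [-1, 1], at which point acos() fails, so the argument is clamped first:

```python
import math

def clamped_acos(x):
    """acos() with its argument clamped to [-1.0, 1.0] to absorb rounding error."""
    return math.acos(max(min(x, 1.0), -1.0))

# The smallest double just above 1.0, i.e. the kind of value a dot product of
# two normalised vectors can produce through rounding (math.nextafter needs
# Python 3.9+)
slightly_over = math.nextafter(1.0, 2.0)

try:
    math.acos(slightly_over)  # outside acos's domain
    raised = False
except ValueError:
    raised = True

result = clamped_acos(slightly_over)  # clamps to 1.0, and acos(1.0) is 0.0
```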
So that has put the imaginary globe back together, as you can see from Figure 4, but how can the same theory be applied to arbitrary meshes?
Figure 4. The texture mapped sphere
It turns out that spherical texture mapping can be applied to any shape. Once the center is found, a ray can be projected through each vertex, just as was done with the sphere, and the texture map sampled in the same way. (There are lots of ways of visualizing what is happening here. If my explanation doesn't do it for you, then try reading some of the references below, as everyone's mind works differently.)
I like to imagine a sphere surrounding the mesh. When a ray is projected from the center through a mesh vertex, it can be extended out until it touches the surrounding sphere, giving you the texture coordinates for that point.
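The whole idea fits in a few lines. This is an illustrative Python version of the same calculation (the function name and the centre-as-parameter are my own, not from the sample), using the simplified formulas with the z-axis as north:

```python
import math

def spherical_uv(vertex, center):
    """Spherical texture coordinates for one mesh vertex: project a ray from
    the mesh centre through the vertex and sample the texture as for a sphere
    (north = z-axis, equator = y-axis)."""
    # the vertex ray: from the centre through the vertex, normalised
    rx = vertex[0] - center[0]
    ry = vertex[1] - center[1]
    rz = vertex[2] - center[2]
    length = math.sqrt(rx*rx + ry*ry + rz*rz)
    rx, ry, rz = rx/length, ry/length, rz/length

    phi = math.acos(max(min(rz, 1.0), -1.0))
    tv = phi / math.pi
    if rz == 1.0 or rz == -1.0:
        tu = 0.5  # pole: any u will do, the whole top/bottom texture edge maps here
    else:
        u = math.acos(max(min(ry / math.sin(phi), 1.0), -1.0)) / (2.0 * math.pi)
        tu = u if rx > 0.0 else 1.0 - u
    return tu, tv

# a vertex of an arbitrary mesh, mapped as if it sat on a surrounding sphere
tu, tv = spherical_uv((0.0, 2.0, 0.0), (0.0, 0.0, 0.0))
```

Because the vertex ray is normalised, the vertex is effectively pushed out onto the surrounding sphere before the sphere formulas are applied, which is exactly the picture described above.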
The sample code allows you to switch between all of the DirectX primitives and see how a spherical mapping is applied to each of them. In addition, I have added the ability to show the vertex rays and the surrounding texture mapped sphere to help you get your head around how it all works.
Figure 5. A texture mapped cylinder using spherical mapping
Homework for this time: Explain the odd texturing you see in the middle of the Pacific Ocean. It's easiest to see on the sphere and cylinder. I will revisit this and fix it in a future article when I cover mesh structures.
This concludes the series on texture mapping, until I get some other questions on the subject.
Credits:
Thanks to
 Carlos Aguilar for the code colorizer
 Paul Rademacher for his Ray Tracing article with the Spherical texture mapping formula
 Robert Dunlop for his spherical texture mapping article
 Peter Donnelly for an old article about Texture Wrapping
 Fridger Schrempp for the earth texture from the Celestia MotherLode
Copyright © 2005 TheZBuffer.com
(Editor's note: These examples, like the two previous articles in the series, were built using the August 2005 DirectX 9.0 SDK update.)
The Discussion

Hi,
I put a fix in for the spherical mapping:
device.RenderState.Wrap0 = WrapCoordinates.Zero;
This will fix the seam. However, if you try a different texture like a grid, you'll notice that the poles are still not mapping correctly. Do you have a fix for this?

I have the same problem as codemonkeyjas...
But I'm using OpenGL with a fragment shader (per-pixel), so I shouldn't have interpolation problems. Any info?

Using the default sphere that gets generated by DirectX, you can never map the poles perfectly, for two reasons:
1. The poles actually map to the entire edge of the texture, which you can't do.
2. The poles all share a single vertex and therefore have to share the same u,v coordinates, which means everything is stretched to that single point.
The only way to solve this is to generate your own spherical (or other) mesh and map the points yourself, which was beyond the scope of this article. This article covers other ways of mapping textures to spheres: http://vterrain.org/Textures/spherical.html

Fixed it!
Now I'm having the same problem as codemonkeyjas at the poles.

I'm brand new to graphics programming and this article has been extremely helpful.
I'm trying to dynamically draw the texture being mapped onto the sphere. The glitch mentioned at the end of the article is there when I call TextureLoader.FromFile(), but when I call TextureLoader.FromStream() it only paints half the sphere.
I'm having trouble finding good tutorials online so any references/advice would be much appreciated!