Friday 13 February 2015

Coming from Shaders in XNA to Shaders in Unity3D


This is the first post in an intended series of tutorials covering shaders in Unity3D. Now, I am no expert in this area; this is just my experience of working with shaders in Unity3D. I came to Unity3D shaders pre-armed, I guess, with knowledge gained from writing DX9 shaders in XNA. I have written a lengthy set of tutorials for that, as well as creating my own deferred lighting engine in XNA with a post-processing pipeline, hardware-instanced meshes and particle systems, all with accompanying shaders, so as far as the GPU goes I do have some experience and, I think, a good understanding of HLSL and the graphics pipeline.
So what has got me wanting to write this post, especially as there are loads of tutorials out there on Unity3D shaders, as well as the excellent Unity3D documentation? Well, I remember starting out with shaders in XNA, and, well, it's tough to get the concepts and ideas under your belt at first, and from what I have seen (I may not have looked very hard) most of those tutorials assume existing knowledge. Also, for a lot of developers all these sources are very 3D driven, and a lot of devs don't then see that these shaders can also be used in their 2D projects.
In the past I have heard people say that 3D is much harder than 2D, but what I don't think a lot of people realize is that when you were working with SpriteBatch (XNA), and now with Sprites (Unity3D), you are still working in 3D; it's just that everything is a quad that is always facing the camera.
So I am going to start from the very, very beginning. The first thing we will do is create a mesh ourselves in code, so we can actually understand what gets passed to the shader; we may not even get to look at a shader in this post :S
A friend of mine, Jay from Drop Dead Interactive, sent me a great YouTube link to a series of Unity shader tutorials; you might want to check that out too. I liked it, and it's worth a look if you have time.

The Vertex

Please forgive me if I am teaching you how to suck eggs (showing you something you already know), but if people don’t understand the very root of the pipeline, then it’s easy to get confused. 
 

So what is a Vertex (plural: Vertices)?

OK, a vertex is a structure that holds information for a given point on a mesh. At the very least this will be a Vector3 for the point's position on the mesh (not in the world, but on the mesh).
For the GPU to draw anything on the screen it needs at least 3 vertices, and to draw a quad (a flat square, billboard or sprite) at least 4 vertices (you could use 6 vertices to draw its 2 triangles; with indexing the GPU sort of does this anyway). All meshes are made up of triangles, so you can imagine that the more curved your mesh is, the more triangles are needed.
[Image: a mesh made up of triangles]
Each one of the triangles in the image above is built from 3 vertices.
The vertex also carries other information with it: the direction this point is facing, also known as the "normal", and texture coordinates (texcoords) that describe how a texture is applied to it. For now, we will focus on these three elements: position, normal and texcoord. Texcoords are also referred to as UVs; I am sure you have heard artists talk about UV mapping in modelling tools, and this is the process of setting up the vertices so that the right bits of the texture are mapped to the right parts of the mesh. I am terrible at UV mapping, it's a real skill and I just don't have the patience…
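To make that a bit more concrete, here is a rough sketch of the data a single vertex carries. Unity doesn't actually expose a vertex struct like this (it stores positions, normals and UVs as separate parallel arrays on the Mesh, as we will see below), so treat this purely as an illustration:

using UnityEngine;

// Conceptual sketch only - Unity keeps this data in parallel arrays on the
// Mesh (mesh.vertices, mesh.normals, mesh.uv) rather than in a struct like this.
public struct VertexSketch
{
    public Vector3 position;    // Where the point sits on the mesh (model space, not world space)
    public Vector3 normal;      // The direction this point is facing
    public Vector2 texCoord;    // UV - which part of the texture maps to this point
}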

Unity is Left Handed – XNA was Right Handed

So, why does this matter? Well, you can get yourself in a bit of a pickle if you are not aware of it, and I get in a pickle all the time as I am so used to working right-handed. But what does it mean for a graphics system to be left or right handed? Well, it's about how the coordinate system works. I could describe it here, but I found a great link explaining handedness here.
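Just to show what that means in Unity terms (a trivial example of my own, not something you need for the quad below):

using UnityEngine;

public class HandednessCheck : MonoBehaviour
{
    void Start()
    {
        // Unity is left handed: +Z points away from a default camera, so
        // forward is (0, 0, 1). In XNA's right-handed system Forward was (0, 0, -1).
        Debug.Log(Vector3.forward);     // (0.0, 0.0, 1.0)

        // This is also why the default camera sits at (0, 0, -10) looking towards +Z,
        // and why the quad we build later gets normals of Vector3.back (0, 0, -1)
        // so that it faces that camera.
        Debug.Log(Vector3.back);        // (0.0, 0.0, -1.0)
    }
}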

RuntimeQuadScript.cs

To help me show you the sort of data that gets sent to the shader, we are going to create a script that will generate a quad for us at run time. Now, for those of you that don't like working in 3D, see this quad as a sprite, not as 3D geometry. The sprites and GUI.Image elements you have been working with, and even the 3D text, are all quads, so think of this as a sprite; I'll even show you how we can use our shaders on sprites and even text in our games.
What do we need then? We need a list of position data (Vector3, 4 in total), an index list used to draw the positions in the correct order (winding order, 6 in total), a list of Vector2s for the texture coordinates (again 4 in total) and a list of Vector3s for the normals (again, 4 in total). We also need a Mesh to put it all in, which we can apply a material, and so in turn a shader, to.
using UnityEngine;
using System.Collections;
using System.Collections.Generic;
using System.Linq;

[RequireComponent(typeof(MeshFilter))]
[RequireComponent(typeof(MeshRenderer))]
[ExecuteInEditMode]
public class RuntimeQuadScript : MonoBehaviour 
{
    /// <summary>
    /// List to store all our vertex positions
    /// </summary>
    List<Vector3> positions = new List<Vector3>();

    /// <summary>
    /// List to store the order we want the positions rendered in.
    /// </summary>
    List<int> index = new List<int>();

    /// <summary>
    /// List to hold the texture coordinates we want for the mesh.
    /// </summary>
    List<Vector2> texCoords = new List<Vector2>();

    /// <summary>
    /// List to hold the normals we wish to create.
    /// </summary>
    List<Vector3> normals = new List<Vector3>();

    /// <summary>
    /// The mesh that our variables will create.
    /// </summary>
    Mesh thisMesh;

Now we can use these elements to create our quad. The first thing we will do is initialize our mesh and load up the vertex positions we need.

    // Use this for initialization
    void Start()
    {
        // Initialize the mesh object
        thisMesh = new Mesh();

        // Set up position data.
        positions.Add(new Vector3(-.5f, .5f, 0));           // Top left corner
        positions.Add(new Vector3(-.5f, -.5f, 0));          // Bottom left corner
        positions.Add(new Vector3(.5f, -.5f, 0));           // Bottom right corner
        positions.Add(new Vector3(.5f, .5f, 0));            // Top right corner

We can then set the draw order of these positions with a list of indices. Each value in this next list corresponds to a Vector3 in our position list, so 0 would be the first Vector3 and 3 would be the last.

        // Set up the draw index
        index.Add(0);           // Draw from the top left corner
        index.Add(3);           // to the top right
        index.Add(2);           // then to the bottom right

        // Next triangle
        index.Add(2);           // Draw from the bottom right
        index.Add(1);           // to the bottom left
        index.Add(0);           // then back to the top left

We can then give these values to the mesh and set it in the MeshFilter like this. Remember, we are rendering the mesh to face our camera at 0,0,-10.

        // Now give our mesh this data
        thisMesh.vertices = positions.ToArray();
        thisMesh.triangles = index.ToArray();
        
        // Now put this in the mesh filter so the renderer can apply a material to it.
        GetComponent<MeshFilter>().sharedMesh = thisMesh;

We can now create an empty game object, rename it RuntimeQuad and add our script to it like this

[Image: the empty game object renamed RuntimeQuad with the script added in the Inspector]

The quad
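If you'd rather do that step from code than through the editor, the equivalent is roughly this (purely illustrative; the QuadSpawner name is just something I made up for the example):

using UnityEngine;

// Purely illustrative alternative to the editor steps above.
public class QuadSpawner : MonoBehaviour
{
    void Start()
    {
        // Create the game object and attach the script we wrote above.
        GameObject quadObject = new GameObject("RuntimeQuad");
        quadObject.AddComponent<RuntimeQuadScript>();
    }
}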

Because we have the RequireComponent attributes at the top of our class, the MeshFilter and MeshRenderer have been added automatically for us. You will also see that I have set the MeshRenderer to use the Default-Diffuse material; before you do that, you will see your quad rendered as a magenta square like this

[Image: the quad rendered as a magenta square before a material is set]

Once you have set the material it will look like this

[Image: the quad with the Default-Diffuse material applied]

You can see that it's being shaded in an odd way, with that shadow on the bottom. This is because we have not set up the normals, so the default shader does not know which direction the mesh is facing. We can set the normals in a couple of ways; first we will do it by hand

        // Create our own normals
        normals.Add(Vector3.back);
        normals.Add(Vector3.back);
        normals.Add(Vector3.back);
        normals.Add(Vector3.back);
        
        // Now give our mesh this data
        thisMesh.vertices = positions.ToArray();
        thisMesh.triangles = index.ToArray();
        thisMesh.normals = normals.ToArray();

So I am setting the normals to Vector3.back, which is 0,0,-1, so they point in the direction of the camera. The quad now renders like this

[Image: the quad rendered evenly now that the normals point towards the camera]

Unity3D's Mesh has a lovely built-in method called RecalculateNormals, so we can do away with the normals list, but I had it in here just to illustrate how they look and what they are referring to.
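For the curious, roughly what it does conceptually (a simplified sketch of my own, not Unity's actual implementation) is take the cross product of two edges of each triangle to get a face normal, add that onto each of the triangle's three vertices, and then normalize the results:

    // Simplified sketch of what RecalculateNormals does conceptually.
    Vector3[] CalculateNormals(Vector3[] verts, int[] tris)
    {
        Vector3[] result = new Vector3[verts.Length];

        for (int i = 0; i < tris.Length; i += 3)
        {
            int i0 = tris[i], i1 = tris[i + 1], i2 = tris[i + 2];

            // Face normal from two of the triangle's edges.
            Vector3 faceNormal = Vector3.Cross(verts[i1] - verts[i0], verts[i2] - verts[i0]);

            result[i0] += faceNormal;
            result[i1] += faceNormal;
            result[i2] += faceNormal;
        }

        for (int i = 0; i < result.Length; i++)
            result[i] = result[i].normalized;

        return result;
    }

For our quad this gives exactly the Vector3.back normals we set by hand above.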

We can do that like this

        // Now give our mesh this data
        thisMesh.vertices = positions.ToArray();
        thisMesh.triangles = index.ToArray();
        //thisMesh.normals = normals.ToArray();

        // Thankfully, Unity provides a method to calculate the normals of the mesh
        thisMesh.RecalculateNormals();

        // Now put this in the mesh filter so the renderer can apply a material to it.
        GetComponent<MeshFilter>().sharedMesh = thisMesh;

OK, so let's now set up a new material and call it IntrinsicDiffuse, as we are going to use the Diffuse shader provided by Unity, like this

[Image: the IntrinsicDiffuse material created from Unity's Diffuse shader]
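(You could also build that material from code if you prefer, for example at the end of Start; something along these lines should work, though I am setting it up as an asset in the editor for the rest of this post.)

        // Build a material from Unity's built-in Diffuse shader and assign it to our renderer.
        Material intrinsicDiffuse = new Material(Shader.Find("Diffuse"));
        GetComponent<MeshRenderer>().sharedMaterial = intrinsicDiffuse;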

So, now we can set the MeshRenderer to use the new material and change the color, say to red, and the quad will render red, thanks to the material.

[Image: the quad rendered red]

But, what happens if we pass it a texture?

Let's give it my handsome face to render
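(In the editor you just drag the texture onto the material's texture slot; if you were doing it from code it would be something like this, where faceTexture is a hypothetical Texture2D field exposed in the Inspector, not part of the script above.)

        // faceTexture is a hypothetical Inspector field, not part of the script above.
        GetComponent<MeshRenderer>().sharedMaterial.mainTexture = faceTexture;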

[Image: the texture rendered incorrectly because no texture coordinates have been set]

As you can see, it hasn't rendered it right. That's because we have not set the texture coordinates, so before we can set them we sort of need to know how they work: how does the texture coordinate relate to the texture that gets passed in?

The texture coordinate system is a Vector2: the top left corner of the image is 0,0 and the bottom right is 1,1, making the top right 1,0 and the bottom left 0,1; ergo, the centre of the texture would be .5f,.5f.
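To put some numbers on that (purely illustrative, assuming a 256x256 texture):

    // Purely illustrative: converting a UV into a pixel position on a 256x256
    // texture, using the Direct3D-style convention described above where 0,0
    // is the top left corner. Keep reading - this can get flipped vertically.
    Vector2 UvToPixel(Vector2 uv, int width, int height)
    {
        return new Vector2(uv.x * width, uv.y * height);
    }

    // UvToPixel(new Vector2(.5f, .5f), 256, 256) -> (128, 128), the centre of the texture.
    // UvToPixel(new Vector2(1, 0), 256, 256)     -> (256, 0), the top right corner.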

BUT Unity pulls a few tricks depending on what graphics API it is using to render, as you can read here. The above applies to Direct3D, so keep in mind that this can get flipped.

Judging by the way the default shader is working, it's using the OpenGL convention, as it's flipping the texture coords. So, with the texture coordinates set, the class looks like this

using UnityEngine;
using System.Collections;
using System.Collections.Generic;
using System.Linq;

[RequireComponent(typeof(MeshFilter))]
[RequireComponent(typeof(MeshRenderer))]
[ExecuteInEditMode]
public class RuntimeQuadScript : MonoBehaviour 
{
    /// <summary>
    /// List to store all our vertex positions
    /// </summary>
    List<Vector3> positions = new List<Vector3>();

    /// <summary>
    /// List to store the order we want the positions rendered in.
    /// </summary>
    List<int> index = new List<int>();

    /// <summary>
    /// List to hold the texture coordinates we want for the mesh.
    /// </summary>
    List<Vector2> texCoords = new List<Vector2>();

    /// <summary>
    /// List to hold the normals we wish to create.
    /// </summary>
    List<Vector3> normals = new List<Vector3>();

    /// <summary>
    /// The mesh that our variables will create.
    /// </summary>
    Mesh thisMesh;

    // Use this for initialization
    void Start()
    {
        // Initialize the mesh object
        thisMesh = new Mesh();

        // Set up position data.
        positions.Add(new Vector3(-.5f, .5f, 0));           // Top left corner
        positions.Add(new Vector3(-.5f, -.5f, 0));          // Bottom left corner
        positions.Add(new Vector3(.5f, -.5f, 0));           // Bottom right corner
        positions.Add(new Vector3(.5f, .5f, 0));            // Top right corner

        // Set up the draw index
        index.Add(0);           // Draw from the top left corner
        index.Add(3);           // to the top right
        index.Add(2);           // then to the bottom right

        // Next triangle
        index.Add(2);           // Draw from the bottom right
        index.Add(1);           // to the bottom left
        index.Add(0);           // then back to the top left

        // Set up the texture coords (note that here 0,0 ends up at the bottom left of the texture)
        texCoords.Add(new Vector2(0, 1));       // Top left corner
        texCoords.Add(new Vector2(0, 0));       // Bottom left corner
        texCoords.Add(new Vector2(1, 0));       // Bottom right corner
        texCoords.Add(new Vector2(1, 1));       // Top right corner

        // Create our own normals
        normals.Add(Vector3.back);
        normals.Add(Vector3.back);
        normals.Add(Vector3.back);
        normals.Add(Vector3.back);
        
        // Now give our mesh this data
        thisMesh.vertices = positions.ToArray();
        thisMesh.triangles = index.ToArray();
        thisMesh.uv = texCoords.ToArray();
        //thisMesh.normals = normals.ToArray();

        // Thankfully, Unity provides a method to calculate the normals of the mesh
        thisMesh.RecalculateNormals();

        // Now put this in the mesh filter so the renderer can apply a material to it.
        GetComponent<MeshFilter>().sharedMesh = thisMesh;
    }
 
    // Update is called once per frame
    void Update()
    {

    }
}

And it now renders like this

[Image: the quad rendered with the face texture correctly mapped]

I know, I know, you are wondering when we are going to start looking at actually writing a shader, and we will, but first, PLEASE spend some time understanding the values used to create the mesh we just made. Have a play around with them: set one of the texcoords to .5f,.5f, set one of the vertex positions to 1,2,0, or go back to manual normals and set one to Vector3.forward, so you have a clear idea of what all these elements are doing, because if you don't get these fundamentals clear in your mind, shaders are going to cause you nothing but pain.
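For example, the texcoord experiment would look like this back in the Start method (the rest of the list stays the same):

        // Experiment: pull the top left corner's UV into the middle of the
        // texture and watch how the image gets stretched across the quad.
        texCoords.Add(new Vector2(.5f, .5f));   // was (0, 1), the top left corner
        texCoords.Add(new Vector2(0, 0));
        texCoords.Add(new Vector2(1, 0));
        texCoords.Add(new Vector2(1, 1));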

As ever, if you spot anything in my post(s) that is incorrect or misleading, then please let me know; just post a comment below and I'll sort it out. Same with any questions you might have, feel free to fire off a comment below.

In the next post we are going to look at creating our first shader, honest :P







