
Thursday, October 05, 2017


Graphics (6 of ∞) Playing with images. No more empty boxes :)

Posted in EAE 6320-001

In this write-up, I will talk about my experience of wiring images as textures into the sprite-and-effect pairs that we have been rendering so far. The purpose of this assignment is to leverage some of the things I did previously, like adding new project modules, updating references, and setting up the required dependencies. I will also be leveraging the AssetBuildLibrary, via AssetBuildFunctions.lua, to load all of the images we want in our game and build the corresponding textures.

Starting with Lua
The first thing I did in this project was to find the images I want to use for my textures. In this game, I will be using authored images from my favorite game, Paragon, and my favorite pop singer, Taylor Swift. These images are PNG and JPG files. I then had to modify my AssetBuildFunctions.lua to support loading these new images as built textures. In my application, I have chosen the extension .tex for the built textures. To get the build working, I had to add a TextureBuilder project that is used for building the texture files. This project has references to DirectXTex, AssetBuildLibrary, and Platform. Meanwhile, the TextureBuilder project is set as a dependency of the BuildExampleGameAssets project, because we need TextureBuilder.exe while building our images into textures. We did not have to add the TextureBuilder project as a reference because we are not making any calls to TextureBuilder's methods; rather, we need the executable to be available when building BuildExampleGameAssets -> AssetBuildExe -> AssetBuildLibrary -> AssetBuildFunctions.lua.
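As a rough sketch of what that Lua change looks like, the idea is to register a new asset type that routes source images to the texture builder and swaps the extension to .tex. The helper names below are illustrative; the exact registration API depends on the version of the course framework.

```lua
-- Illustrative sketch only: the actual registration helpers come from the
-- course framework's AssetBuildFunctions.lua and may be named differently.
NewAssetTypeInfo( { "textures" },
	{
		ConvertSourceRelativePathToBuiltRelativePath = function( i_sourceRelativePath )
			-- Replace the source extension (.png, .jpg) with the built one (.tex)
			local relativeDirectory, fileName = i_sourceRelativePath:match( "(.-)([^/\\]+)$" )
			local fileNameWithoutExtension = fileName:gsub( "%.[^%.]+$", "" )
			return relativeDirectory .. fileNameWithoutExtension .. ".tex"
		end,
		GetBuilderRelativePath = function()
			-- This is why BuildExampleGameAssets needs TextureBuilder as a
			-- dependency: the executable must exist before assets are built.
			return "TextureBuilder.exe"
		end
	}
)
```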

Mapping UV
UV mapping is the 3D modeling process of projecting a 2D image to a 3D model's surface for texture mapping. - Wiki

To get the sprites rendered with textures, I started by setting the UV coordinates. We abstract the calculation of UV coordinates away from the user by computing them internally inside the Graphics project's sprites. Here we have a problem: OpenGL and Direct3D use the same U values, but for V, OpenGL goes from 0 to 1 bottom to top while Direct3D goes from 1 to 0 bottom to top. To solve this, I have written an enum and a platform-independent namespace method that return the right coordinates for the lower-left, lower-right, upper-right, and upper-left points. This way I don't have to specify individual UV coordinates for OpenGL and Direct3D. When we set those UV coordinates in the vertexData, we account for the winding order I discussed in my previous write-up for OpenGL and Direct3D so that the image is mapped correctly. The advantage of doing all of the UV-coordinate logic inside the Graphics project is that the user doesn't have to provide any of this complex data; they just have to specify where the sprite is (in our case, the abstracted information of the width, height, and center point).
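A minimal sketch of that corner-based helper, assuming names of my own choosing (the framework's actual enum and function names may differ). Only the V axis changes between platforms, so a single preprocessor check hides the difference from callers:

```cpp
// Illustrative sketch: enum + helper that hide the OpenGL/Direct3D
// V-axis difference from the caller. Names are assumptions, not the
// course framework's actual identifiers.
enum class eCorner { LowerLeft, LowerRight, UpperRight, UpperLeft };

struct sUV { float u, v; };

sUV GetUV( const eCorner i_corner )
{
	// U is the same on both platforms: 0 on the left, 1 on the right
	const float u = ( ( i_corner == eCorner::LowerRight ) || ( i_corner == eCorner::UpperRight ) ) ? 1.0f : 0.0f;
	// V flips: OpenGL has 0 at the bottom and 1 at the top,
	// Direct3D has 1 at the bottom and 0 at the top
#if defined( EAE6320_PLATFORM_GL )
	const float bottomV = 0.0f, topV = 1.0f;
#else
	const float bottomV = 1.0f, topV = 0.0f;
#endif
	const float v = ( ( i_corner == eCorner::UpperLeft ) || ( i_corner == eCorner::UpperRight ) ) ? topV : bottomV;
	return { u, v };
}
```

The sprite code can then ask for `GetUV( eCorner::LowerLeft )` and so on for each vertex, and the same call site works on both platforms.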

Shader Mods
Then I had to modify the shaders to support UV mapping. I followed JP's convention of using a semantic to map the texture coordinates into the shader. This involves wiring the UV data from the vertex shader to the fragment shader. Right now the shader files have code that is duplicated across the OpenGL and Direct3D implementations using macros, so for any change I make, I have to ensure that I modify it in both places. As a TODO, I am planning to make these shaders platform independent and will talk more about that in a future write-up.
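On the C++ side, the semantic has to line up with the vertex format's memory layout. Here is a hedged sketch of what the sprite vertex might look like once a UV channel is added (the struct name and packing are my assumptions, not necessarily the framework's): on Direct3D, the `"TEXCOORD"` semantic in the input layout description is what ties the `u, v` members to the matching vertex-shader input, and on OpenGL the equivalent is a `glVertexAttribPointer()` call with the same byte offset.

```cpp
#include <cstddef>	// offsetof

// Illustrative vertex format for a textured sprite:
// a 2D screen position followed by texture coordinates.
struct sVertex_sprite
{
	float x, y;	// position
	float u, v;	// texture coordinates, consumed via the TEXCOORD semantic
};

// On Direct3D, the input layout entry for the UV channel would look like:
//   { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT, 0,
//     offsetof( sVertex_sprite, u ), D3D11_INPUT_PER_VERTEX_DATA, 0 }
// On OpenGL, the same offset is passed to glVertexAttribPointer().
```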

Texture Handles and Submission
In my existing implementation of effects and sprites, the approach I took was to have a wrapper class, hold these elements as pointers, and submit them directly to the graphics code as part of a collection (a vector). For textures, we will instead be using handles, which are managed by a manager. The advantage of this approach is that it minimizes the manual reference-count bookkeeping a programmer would otherwise do with a raw pointer: we go through the manager and use the corresponding handle to get access to a specific asset.

We explicitly call incrementReferenceCount when submitting the texture data. But we do not manually decrement the reference count the way we did with effects and sprites; rather, we release the handle through the manager. This happens at the end of RenderFrame, in the Graphics cleanup, and in the ExampleGame cleanup. Another nice thing about handles is that whenever I request an asset that is already loaded by the manager, the manager gives me the exact same asset. The reason we don't do the same for effects and sprites right now is that we might have to add additional initialization parameters to the handle's load method, which would amount to abusing the existing API design.

Overall, a handle is an intuitive way to get access to an asset through a manager and control its scope. In my case, I am using a vector of textures that is submitted by the submit-data function to the graphics code, where I perform the necessary increment, bind, and release. We don't explicitly decrement the handles' reference counts ourselves, as that would conflict with the way the manager is set up: decrementing the reference count could end up calling the destructor/cleanup on the texture, which might leave the manager holding references to assets that are no longer valid. Using the manager's release method performs the decrement and invalidates the handle, so the manager can keep track of which handles are valid for further operations across different systems.
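To make the ownership flow concrete, here is a deliberately tiny sketch of a handle-based manager, assuming my own simplified names and using a plain index as the handle (the real framework's handles also encode validation data, and its manager deduplicates loads by path). The point it illustrates is that user code only ever increments, and only the manager's release performs the decrement:

```cpp
#include <cstdint>
#include <vector>

// Illustrative only: a stripped-down handle/manager pair, not the
// framework's actual API.
struct sTexture
{
	int referenceCount = 0;
};

class cTextureManager
{
public:
	// Loading creates the asset with one reference and hands back a handle.
	// (The real manager would return the existing handle for an
	// already-loaded path instead of creating a duplicate.)
	uint32_t Load()
	{
		m_assets.push_back( sTexture{ 1 } );
		return static_cast<uint32_t>( m_assets.size() - 1 );
	}
	sTexture* Get( const uint32_t i_handle ) { return &m_assets[i_handle]; }
	// Called when submitting the texture for a frame
	void IncrementReferenceCount( const uint32_t i_handle )
	{
		++m_assets[i_handle].referenceCount;
	}
	// The only sanctioned way to decrement: the manager does it, so it can
	// also invalidate handles whose asset is being cleaned up.
	void Release( const uint32_t i_handle )
	{
		--m_assets[i_handle].referenceCount;
	}
private:
	std::vector<sTexture> m_assets;
};
```

Submission would call `IncrementReferenceCount()`, and the end of RenderFrame (and the Graphics/ExampleGame cleanup paths) would call `Release()`, keeping the counts balanced without user code ever decrementing directly.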

Dynamic update of images
[Base Image without any keyboard events(Includes time based update)]

[Moving image on Space Key press]

[Image update on Shift key press]

Once I was able to get my images on screen, I wanted to get fancier by displaying a moving image. To get this working, I kept my SpriteWithEffect independent of the texture collection so that I can hook up any texture with any SpriteWithEffect pair at runtime. My favorite MOBA game is Paragon, and I wanted to give some feel of that game in this application, so I took a GIF and converted it into individual PNG frames. I then loaded them into my texture vector, and in UpdateSimulationBasedOnInput() I used the Space key press event to dynamically iterate through the sequence of images. You will literally see Gideon (a Paragon hero) performing his ultimate, Black Hole. Similarly, I wanted to show more Paragon characters and my favorite singer, Taylor Swift. I got this done in two ways: the user can iterate through the sequence of available images using the Shift key, and you will also notice images being auto-shuffled by the UpdateSimulationBasedOnTime method. At the moment this works in a hackish way, by manually accessing a specific index group for the moving image; I am planning to refactor it into a more meaningful sprite sheet that would directly hold all the required items and do all the hard work of submission.
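The key-driven frame cycling can be sketched like this, under the assumption that an index into the texture vector selects which texture gets bound to the moving sprite each frame (the class and its names are mine, not the framework's; in the real code the key state would come from the engine's UserInput query inside UpdateSimulationBasedOnInput()):

```cpp
#include <cstddef>

// Illustrative sketch: cycle through a contiguous group of texture
// indices (the GIF's PNG frames) while a key is held down.
class cFrameCycler
{
public:
	explicit cFrameCycler( const size_t i_frameCount )
		: m_frameCount( i_frameCount )
	{}
	// Called once per simulation update; advances one frame per update
	// while the key is pressed, wrapping back to the first frame.
	size_t Update( const bool i_isKeyPressed )
	{
		if ( i_isKeyPressed )
		{
			m_currentFrame = ( m_currentFrame + 1 ) % m_frameCount;
		}
		return m_currentFrame;
	}
private:
	size_t m_frameCount;
	size_t m_currentFrame = 0;
};
```

The returned index is then used to pick which texture handle to submit for the moving SpriteWithEffect; a second cycler driven by the Shift key (or by UpdateSimulationBasedOnTime for the auto-shuffle) works the same way over a different index group.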

While getting this to work, I was stuck for many hours on one bug where a handle would hold an invalid asset, or the manager would continue to hold a reference to an asset. Ameya was very approachable during TA hours, where I was able to experiment with some debugging steps to arrive at the solution. Thanks to Ameya and Ajay for offering some ideas that helped me resolve it.

Play with the following x64 release game.
Controls:
Space - Sequence of moving images
Shift - Sequence of random images