This is a project that I started working on with my friend Henry Chronowski back in early 2022. We aimed to build a renderer from scratch using the Vulkan graphics API and use it to grow our knowledge of graphics programming. We wanted to explore and gain experience with a modern graphics API that we had not yet been exposed to, as well as gain practical knowledge from architecting a real-time graphics application from the ground up. I still actively work on the project in my free time to brush up on my skills or to try something new. There are a bunch of things I would like to add support for in the future, which can be found here. Some of those new features include: a material system, physically based rendering, ray-traced lighting, and skeletal animation support.
I am currently working on getting my scene to render within an ImGui window so that I can use the ImGuizmo library to draw gizmos on selected objects in the scene. Initially this was supposed to be a very quick implementation before moving on to a material system, but I have since fallen down a massive rabbit hole trying to get it to work. The main issue is that ImGuizmo gizmos can only be rendered within an ImGui window, so I needed a way to capture my scene into a texture/framebuffer that I can hand to ImGui to display. Enter multi-pass rendering! This has been the main time sink for me so far. Render passes in Vulkan can get a bit complicated, and it has taken some time to wrap my head around them. However, I was able to get multiple passes working! I currently have one render pass for the scene and another for all of the ImGui UI. The last thing I think I need to do is take the result of the scene render pass and use it in the ImGui render pass. I haven't quite gotten that working yet, but when I do I'll be sure to update this page!
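For anyone attempting the same thing, the rough shape of that last step (displaying the scene pass's output inside an ImGui window) might look like the sketch below. The helper comes from ImGui's Vulkan backend (imgui_impl_vulkan), while sceneSampler and sceneImageView are hypothetical handles assumed to come from the offscreen scene pass:

```cpp
// Done once (or whenever the offscreen image is recreated): register the
// scene pass's color attachment with the ImGui Vulkan backend.
VkDescriptorSet sceneTexture = ImGui_ImplVulkan_AddTexture(
    sceneSampler, sceneImageView, VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL);

// Each frame, while building the ImGui UI:
ImGui::Begin("Scene");
ImVec2 size = ImGui::GetContentRegionAvail();   // fill the window
ImGui::Image((ImTextureID)sceneTexture, size);
ImGui::End();
```

The one catch is that the scene image must actually be in SHADER_READ_ONLY_OPTIMAL layout by the time the ImGui pass samples it, which the scene render pass's final layout (or a barrier between the passes) has to guarantee.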
Over the past couple of months I have been working on getting textures working with Vulkan (as most of the previous texture work was scrapped during the architecture rework), as well as setting up dockspaces and viewports using ImGui. After a bit of trial and error I was able to get textures working with little issue. The only problem is that every object in the scene uses the same texture, which is why the next major feature I want to implement is a basic material system. Having a material system will allow me to customize the scene on a more granular level instead of a global one. As for dockspaces and viewports, thankfully they were fairly easy to implement, since it was mostly a matter of using already established code from ImGui.
Previously our renderer could only support a single point light, but it can now support any number of point lights. I also updated our lighting to use Blinn-Phong shading instead of regular Phong. Each light's position and intensity can be controlled through our ImGui user interface.
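The gist of the Blinn-Phong change is that the specular term compares the surface normal against a half-vector between the light and view directions, rather than comparing the view direction against Phong's reflected light vector. A minimal sketch of that term, with a throwaway Vec3 type standing in for whatever math library a renderer actually uses:

```cpp
#include <cmath>

// Throwaway vector helpers; a real renderer would use a math library.
struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// Blinn-Phong specular: N . H raised to a shininess exponent, where H is
// the half-vector between the (normalized) light and view directions.
float blinnPhongSpecular(Vec3 normal, Vec3 lightDir, Vec3 viewDir, float shininess) {
    Vec3 halfVec = normalize({lightDir.x + viewDir.x,
                              lightDir.y + viewDir.y,
                              lightDir.z + viewDir.z});
    float nDotH = std::fmax(dot(normal, halfVec), 0.0f);
    return std::pow(nDotH, shininess);
}
```

In the shader this runs per fragment, once per light, with each light's position and intensity fed in from the uniform buffer; supporting "any number" of lights is then just a loop over the light array.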
Over the past few months I have been reworking the core architecture of the renderer. Previously, the way we structured our code made it difficult to implement new features and maintain the codebase. I mainly focused on abstracting most of the Vulkan-specific code out of the main renderer itself, along with cleaning up some unnecessary and unused code. This alone made it much easier to work within the engine.
Over the past month or so, I have been working on a lot of backend changes to the renderer, most of which came from trying to implement Phong lighting correctly. We had been hung up for weeks on simply trying to get lighting data sent to the GPU correctly. After exploring many different avenues and reading countless sources online, I ran into a small aspect of Vulkan that I hadn't ever seen when working in other frameworks (though I was not building those frameworks from scratch, so it may exist there too): memory alignment of buffers. Our main issue on the CPU side was that our uniform buffers were not packed properly, so even though the correct data was being sent to the GPU, it arrived with a different layout than the shader expected, causing major issues when modifying the lighting data.

The solution is to pack the data according to Vulkan's alignment requirements, which can be achieved in a few different ways. The first is to use alignas(16) to force each member of the buffer onto a 16-byte boundary. While this is handy, it doesn't work in every situation, as it is still possible to pack data too tightly. Another option is to use vec4s instead of vec3s in the uniform buffer and just ignore the unneeded component in the shader. This method works best for vec3s, since a vec4 is already 16-byte aligned, but it doesn't really make sense for a vec2 or a scalar value. The final way is to stick dummy padding variables in the buffer where needed. We used a combination of these methods to solve our problem; since we were mostly working with vec3s, we just bumped them up to vec4s and ignored the w component. In the future we will need to modify our light uniform buffer to hold a struct representing a light, so we can work on having multiple lights in the scene.
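To make the mismatch concrete, here is a hypothetical light buffer (the struct names are made up for the example, not our actual code). On the CPU, two vec3s naively pack back to back at offset 12, while std140-style uniform layout rounds a vec3 up to a 16-byte slot, so the shader reads the second member at offset 16:

```cpp
#include <cstddef>

struct Vec3 { float x, y, z; };
struct Vec4 { float x, y, z, w; };

// Naive layout: members pack at 4-byte boundaries, so lightColor lands at
// offset 12 on the CPU while the shader expects it at offset 16.
struct LightDataNaive {
    Vec3 lightPos;    // offset 0
    Vec3 lightColor;  // offset 12 -- mismatched!
};

// Fix 1: alignas(16) forces each member onto a 16-byte boundary.
struct LightDataAligned {
    alignas(16) Vec3 lightPos;    // offset 0
    alignas(16) Vec3 lightColor;  // offset 16 -- matches the shader
};

// Fix 2: promote vec3 to vec4 and ignore .w in the shader.
struct LightDataVec4 {
    Vec4 lightPos;    // offset 0
    Vec4 lightColor;  // offset 16
};

static_assert(offsetof(LightDataNaive, lightColor) == 12, "naive packing");
static_assert(offsetof(LightDataAligned, lightColor) == 16, "std140-compatible");
static_assert(offsetof(LightDataVec4, lightColor) == 16, "std140-compatible");
```

The static_asserts are a cheap way to catch this class of bug at compile time instead of staring at corrupted lighting.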
The other major feature I was working on was a user interface using the ImGui library. ImGui is an incredibly useful library for setting up good-looking UI very quickly. The most difficult thing I ran into when using this library was figuring out how to make it play nicely with the existing architecture that I had, but once that problem was solved, the rest was pretty smooth sailing. Currently we have one window that displays debug information as well as scene data. In the future I think I will break the scene section out into its own window and try to set up a scene hierarchy similar to what you might find in Unity or Unreal. For now the scene data allows the user to control the single light and camera in the scene.
Over the past few weeks we've been focusing on adding three main features to our renderer: camera controls, basic lighting, and the option to toggle wireframe rendering. I was responsible for getting the camera controls and wireframe rendering mode working.
I originally started with an Euler-angle-based system for the camera rotations, but this turned out not to be an ideal solution. One of the biggest drawbacks of an Euler-based system is that it is very easy to run into gimbal lock. Gimbal lock occurs when two axes of rotation align, effectively locking out an axis of rotation. This presents a major problem when working with a 3D camera, so I decided to switch the camera over to using quaternions instead of Euler angles. Once that was working, I added in a bit of smoothing to the camera so it wouldn't look jittery when moving around, and the result was a great success!
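Here's a simplified, self-contained sketch of the core mechanics (a real camera would build on a math library's quaternion type; all names are illustrative). The key point is that yaw and pitch are composed as rotations rather than stored as independent angles, so no two axes ever collapse together:

```cpp
#include <cmath>

struct Quat { float w, x, y, z; };

// Build a unit quaternion representing a rotation of `angle` radians
// around the (unit) axis (ax, ay, az).
Quat fromAxisAngle(float ax, float ay, float az, float angle) {
    float s = std::sin(angle * 0.5f);
    return {std::cos(angle * 0.5f), ax * s, ay * s, az * s};
}

// Hamilton product: composes two rotations (b applied first, then a).
Quat mul(Quat a, Quat b) {
    return {
        a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
        a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
        a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
        a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
    };
}

// Rotate a vector by q: v' = q * v * q^-1 (q assumed to be unit length,
// so the inverse is just the conjugate).
void rotate(Quat q, float v[3]) {
    Quat p = {0.0f, v[0], v[1], v[2]};
    Quat conj = {q.w, -q.x, -q.y, -q.z};
    Quat r = mul(mul(q, p), conj);
    v[0] = r.x; v[1] = r.y; v[2] = r.z;
}
```

Each frame, the camera's orientation quaternion is multiplied by small yaw (world-up axis) and pitch (local-right axis) rotations from the mouse delta, and the forward/right vectors are recovered by rotating the basis vectors; the smoothing on top is just interpolating toward the target orientation instead of snapping to it.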
Getting wireframe rendering to work was actually much simpler than I thought it would be. It effectively boiled down to adding a second pipeline with the rasterization mode set to VK_POLYGON_MODE_LINE. This meant that I had to abstract pipeline creation into its own class so that I could easily create multiple pipelines and adjust their parameters. In the future I would like to have this pipeline use a different shader from the default, so that the wireframe really stands out.
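For reference, the only rasterization-state difference between the two pipelines is roughly the following (a sketch, not our exact creation code):

```cpp
// Rasterization state for the wireframe pipeline; the default pipeline is
// identical except it uses VK_POLYGON_MODE_FILL.
VkPipelineRasterizationStateCreateInfo rasterizer{};
rasterizer.sType = VK_STRUCTURE_TYPE_PIPELINE_RASTERIZATION_STATE_CREATE_INFO;
rasterizer.polygonMode = VK_POLYGON_MODE_LINE;
rasterizer.lineWidth = 1.0f;
rasterizer.cullMode = VK_CULL_MODE_NONE;  // often disabled so back-facing wires stay visible
// Note: VK_POLYGON_MODE_LINE requires the fillModeNonSolid device feature
// to be enabled at logical device creation.
```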
For this first portion of the project, I was mainly responsible for laying most of the groundwork for the renderer. I worked primarily on setting up the main graphics pipeline using the Vulkan API, along with implementing support for vertex, index, and uniform buffers. In combination with Henry's work, we were able to get a Utah Teapot loaded from a file and successfully rendering in 3D space with working texture mapping.
The biggest challenge for me these past few weeks has been wrapping my head around all of the setup involved in using Vulkan. The large majority of the work I have done was setup, and because of how explicit Vulkan is, I found myself doubling back constantly to make sure what I was doing was correct. Another major challenge was debugging crashes with little to no error messages. While Vulkan does offer some form of error catching, there were many times when the program would crash with no explanation, and I would have to carefully step through the code and retrace my steps to figure out what was wrong.
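For anyone hitting the same wall: the error catching Vulkan offers comes from its optional validation layers, which have to be explicitly enabled at instance creation. A rough sketch (standard Vulkan API; the surrounding setup is omitted):

```cpp
// Enable the Khronos validation layer so misuse of the API is reported
// instead of silently crashing.
const char* layers[] = { "VK_LAYER_KHRONOS_validation" };

VkInstanceCreateInfo createInfo{};
createInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
createInfo.enabledLayerCount = 1;
createInfo.ppEnabledLayerNames = layers;
// ... fill in application info and extensions, then vkCreateInstance ...

// A VK_EXT_debug_utils messenger can then route the layer's messages to a
// custom callback for logging or breakpoints.
```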
Jan 2021 - Present
Visual Studio 2019/2022