Journey to Procedural Planets in Unreal Engine 5 - Part 1
This screenshot is from my final iteration of procedural terrains with PMC,
which these posts will eventually catch up to.
Why am I writing these entries?
I need a way to hold myself accountable as I continue developing these planets, and documenting my ongoing journey should not only solidify the concepts presented here, but also serve as a way to showcase both the successes and failures along the way. It's important for budding developers to understand that knowledge and success are not reserved for the perceived rockstars of the industry, but are unique to each individual.
To explain that last statement, let me provide an example. There is a
developer, around 17 or 18 years old, who has achieved very detailed planets
in Unreal, yet I don't let this deter me or get me down. It would be quite
easy to look at his work and think, "I have a lot of work experience under my
belt, and given my age and position, I should be able to do what he's done
with ease," and become downtrodden because achieving my end goal is difficult
for me. But everything is relative, down to when we became developers, the
time and effort we've put in, and so on, and I feel this is very important
for anyone coming into this industry to keep in mind.
I have not yet
achieved my goal of procedural planets in UE5, but even if I don't, it
doesn't mean I've failed. Each step forward in this unbelievably complex
process will unlock knowledge I currently don't have, and in that light, I
will have succeeded.
What is the end goal?
Ideally, I'd like to achieve something similar to the planets in Elite:
Dangerous or Space Engine. Granted, that is an extremely tall order, and
truthfully I'll be happy if I can get 25-50% of that quality. It's possible
this series of posts could take years, as I suspect it'll take that long to
get there, but that's alright by me. It's about the journey, not the
destination. Achieving something of this magnitude will have many
challenges, but my hope is that by encountering them, I will advance my
knowledge in programming, rendering and tools development.
Planetary surface rendered in Space Engine.
How it all started
I've always been fascinated by the wonders of space, but it wasn't until I
saw a gameplay video of the Elite: Dangerous alpha in 2014 that I became
truly obsessed with 3D-rendered space. The problem was that I had only just
started my own foray into game development and had less than zero knowledge
of how anything worked.
Fast forward to 2022: I had been working at Quixel for some years, and I was
ready (spoiler: I wasn't) to attempt the humble beginnings of tackling
procedural planets within Unreal Engine. As a technical artist/programmer,
my logical path forward was to break things down to their smallest steps and
start there, since I need to understand all parts of the whole, no matter how
small or insignificant they may seem. To that end, I started with the
Procedural Mesh Component (PMC) and a humble triangle.
Creating the Triangle
My understanding of procedural geometry was (and still is, to a lesser
degree) not the best, so even this seemed more complex than it really was.
To break it down for those who are unfamiliar, the bare minimum of data
needed to produce a usable triangle with a PMC is:
- Positions (FVector)
- Triangle Indices (int32)
- Normals (FVector)
- Tangents (FProcMeshTangent)
- UVs (FVector2D)
Putting the data together in a usable manner is trivial: create three entries in the position array with whatever positions you desire, then create three entries in the triangle index array. Winding order is important here; in Unreal, the indices must be wound counter-clockwise, e.g., (0, 1, 2). We want to see the front face of the triangle, not the back face, which is why this matters.
At this point you can plug the position and triangle arrays into Create Mesh Section and see your first triangle. However, you'll notice right away that it looks incorrect; this is because we haven't supplied UVs to the mesh.
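To make that concrete, here's a minimal sketch of the setup so far, assuming an actor that already owns a UProceduralMeshComponent. The actor class, function, and ProcMesh member name are mine, not part of any particular project:

```cpp
#include "ProceduralMeshComponent.h"

// Minimal sketch: three vertices and one triangle on a PMC.
// AMyTriangleActor and its ProcMesh member are hypothetical.
void AMyTriangleActor::BuildTriangle()
{
    TArray<FVector> Vertices;
    Vertices.Add(FVector(0.0, 0.0, 0.0));
    Vertices.Add(FVector(0.0, 100.0, 0.0));
    Vertices.Add(FVector(100.0, 50.0, 0.0));

    // Counter-clockwise winding so the front face is the visible one.
    TArray<int32> Triangles = { 0, 1, 2 };

    // Normals, UVs, vertex colors, and tangents are left empty for now,
    // which is why the triangle won't look right yet.
    ProcMesh->CreateMeshSection(0, Vertices, Triangles,
        TArray<FVector>(), TArray<FVector2D>(), TArray<FColor>(),
        TArray<FProcMeshTangent>(), /*bCreateCollision=*/false);
}
```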
UVs are quite simple for a triangle or square/grid mesh. We want to essentially map the position of each vertex on the triangle to a corresponding point in texture space. UV coordinates represent positions in a 2D texture space, where U corresponds to the horizontal axis (X), and V corresponds to the vertical axis (Y). UVs typically range from 0 to 1, with (0, 0) being the bottom-left corner of the texture and (1, 1) being the top-right corner.
At this point, you might be asking "How does this help us and how do we move forward with this information?"
Well, we already have our positional data for our triangle, so we can simply take the XY values from the position entries and remap those values to fit in the 0-1 range of our UV entries.
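As a rough sketch of that remap (the helper is my own, assuming the triangle spans 0-100 units on both axes):

```cpp
// Hypothetical helper: remap a vertex's XY position into the 0-1 UV
// range by dividing by the mesh's 2D extent (100 units in our case).
FVector2D PositionToUV(const FVector& Position, float Extent)
{
    return FVector2D(Position.X / Extent, Position.Y / Extent);
}
```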
Let's say we've defined an isosceles triangle with the following
positional data:
- (0,0,0)
- (0,100,0)
- (100,50,0)
We would then create UV information like this for the 3 entries:
- (0,0)
- (0,1)
- (1,0.5)
When set up like this, you will produce your first procedural triangle.
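Continuing the earlier sketch (same assumed names), the UV array is built the same way and passed into the same Create Mesh Section call:

```cpp
// UVs matching the example above: each vertex's XY remapped from
// 0-100 world units into the 0-1 UV range.
TArray<FVector2D> UVs;
UVs.Add(FVector2D(0.0f, 0.0f));
UVs.Add(FVector2D(0.0f, 1.0f));
UVs.Add(FVector2D(1.0f, 0.5f));

ProcMesh->CreateMeshSection(0, Vertices, Triangles,
    TArray<FVector>(), UVs, TArray<FColor>(),
    TArray<FProcMeshTangent>(), /*bCreateCollision=*/false);
```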
This is looking great so far, but there's a long way to go towards procedural terrain. In the next post, I'll cover how I went from a triangle to a grid mesh and implemented terrible Z offsetting with procedural noise.
The keen amongst you will notice we did not cover normals/tangents, which I've opted to save for a future entry.