I came across this post on the Reddit sub r/Generative the other day and thought that u/ivanfleon had done something both relatively simple and very cool. I had some ideas for generative glyphs and started by mimicking his sample there; thus was born the RectGlyph:
The interface came shortly after the RectGlyph was done, as I was trying to troubleshoot work on the PolarGlyph. It made it easier to see what sorts of variations were possible, and it also made debugging more visual (which really helps me a lot).
I’ve always been fascinated with languages, both real and imagined. As I was working toward my PolarGlyph idea, I stumbled upon a few happy accidents, such as the RunicGlyph.
And also the AngularGlyph:
And eventually worked out the kinks for the PolarGlyph:
I have a few others being worked on, as well as some ideas for an editor, so you can take your randomly generated glyphs and add line segments to, or remove them from, any glyph in the set.
My pie-in-the-sky idea is to also be able to save them as a TrueType font so that they can be used in Unity (or anywhere), and possibly to save them as an SVG or vector sheet for use in various vector-based software.
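Since a glyph here is really just a set of line segments, the vector-export idea is fairly approachable. As a rough sketch (all names here are my own inventions for illustration, not part of any real glyph tool), a glyph could be serialized to a minimal SVG by emitting one `<line>` element per segment:

```python
# Hypothetical sketch: a glyph as a list of (x1, y1, x2, y2) line segments,
# exported to a minimal SVG document. The function name and glyph format
# are illustrative assumptions, not the actual project's API.

def glyph_to_svg(segments, size=100):
    """Render line segments as a standalone SVG string."""
    lines = [
        f'<line x1="{x1}" y1="{y1}" x2="{x2}" y2="{y2}" '
        'stroke="black" stroke-width="2"/>'
        for (x1, y1, x2, y2) in segments
    ]
    body = "\n  ".join(lines)
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" '
        f'width="{size}" height="{size}">\n  {body}\n</svg>'
    )

# A tiny three-stroke glyph on a 100x100 canvas.
glyph = [(10, 10, 90, 10), (10, 10, 10, 90), (10, 90, 90, 90)]
svg = glyph_to_svg(glyph)
```

A sheet of glyphs would just be many of these laid out in a grid; the TrueType route is a much bigger lift, since font outlines need filled contours rather than stroked lines.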
Not being an artist can seem like one of the most daunting parts of going it alone (or mostly alone) as an indie dev. I’m a fair photographer, but that’s as far as my artistic capabilities have previously taken me. Most of my stick figures look, well… disfigured, and things like straight lines are as simple for me as writing in Martian. But I’ve been really working to increase my skill set here, and be able to create things in Labyrintheer that actually LOOK pretty decent.
In light of that, I started by picking up some great models from InfinityPBR. Andrew, the awesome dude behind iPBR (as I call it for myself), includes some great PBR materials as SBSAR files (and recently SBS files). Those really helped me dig into how materials work and what all of the components mean: normal, roughness, metallic, and height (or specular and glossiness, depending on your preferred workflow); the metallic/roughness versus specular/glossiness workflows; and a bit about what each of those components can do.
But this wasn’t enough customization for me. As I’ve mentioned before, I started using Archimatix to design some architectural models (which is still something I’m getting a handle on, simply due to the sheer variability of AX and its nodes). As I worked through some (very simple) wall models and such, I realized that I also wanted more control over the materials themselves. iPBR offers an INCREDIBLE amount of customization, but I’m just a pain in my own arse that way. So the next step was… Substance Designer.
For those new to the art game, Substance Designer is an amazing software package that lets you node-author your own materials and export SBSAR files that can be used to procedurally create materials in Unity (and, I believe, Unreal Engine).
The beauty of the node-driven design is that, while artist-type folk seem to settle into it easily enough, we logic-driven code monkey types can also create some really stunning work since everything can be reduced to parameters and inputs and outputs. But before you can do any of this, you need to grasp some of the basics. I’m not high-speed enough yet to really offer a tutorial, but I can share the wealth of tutorial love that I’ve been using. So, let’s start with normal maps.
I ran across this tutorial just this morning, and it’s what prompted me to create this post and share this info. It’s more blog post than actual tutorial, but the information about normal maps is presented concisely and with excellent clarity. Even someone who has never heard of a material in game parlance can probably follow what’s being explained.
The gist of normal maps is to control how light interacts with the materials in your scene. This isn’t the same as the metallic/roughness aspects; it’s more about “pretending” that there’s dirt, smudges, small deformities, or other similar features on your object. When making a material, you often preview it on a perfectly flat surface, but you still want to see the details – details that offer a 3D appearance on a completely flat 2D plane. This is where normal maps come in.
Let’s look at the image below, for instance:
This is meant to be the head of a coin I’m working on as sort of a self-tutorial. The eye easily reads this as an embossed image, but thanks to the normal map, moving the light around changes how and where shadows fall. Here the light is off to the left, so the left side of the “ridges” (it’s still just a flat plane) looks brighter, and the right side of them produces shadows. If I were to move the light source to the other side, the opposite would be true. This is how normal maps reinforce the 3D appearance of an object that doesn’t have detailed modeling done. This is a HUGE benefit to game performance – it’s much easier to draw a coin that is perfectly flat on both sides and apply this material to make it appear 3D than it is to produce a 3D model with this level of detail. Easier both in the actual creation of the object and on your GPU when rendering it.
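The math behind that ridge effect is surprisingly small. A minimal sketch (plain Python rather than shader code, with made-up example vectors): diffuse brightness at a pixel is just the clamped dot product of the surface normal with the direction toward the light, and the normal map’s whole job is to perturb that normal per pixel.

```python
# Minimal sketch of per-pixel Lambert shading: brightness is the clamped
# dot product of the (possibly normal-mapped) surface normal with the
# direction toward the light. Vectors below are illustrative examples.

def lambert(normal, light_dir):
    """Clamped dot product of unit vectors -> brightness in [0, 1]."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

# Light coming from the left and above (negative x = toward the left).
light = (-0.707, 0.0, 0.707)

flat = (0.0, 0.0, 1.0)          # unperturbed normal: straight out of the plane
tilt_left = (-0.5, 0.0, 0.866)  # normal-map texel leaning toward the light
tilt_right = (0.5, 0.0, 0.866)  # normal-map texel leaning away from it

bright_flat = lambert(flat, light)
bright_left = lambert(tilt_left, light)
bright_right = lambert(tilt_right, light)
```

The texel tilted toward the light comes out brighter than the flat plane, and the one tilted away comes out darker – the left-bright/right-shadowed ridges in the coin image, with zero actual geometry.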
This video shows the coin in Unity. The scuffs and scratches are part of the base color of the item, but the deeper scratches are also mostly in the normal map, which allows light to fall into them or be occluded from them based on the light angle. Note that in the above video, the edges of the coin are NOT flat; those are actually angled on the model itself. That would not be a great candidate for faking the 3D effect with normal maps (at least not in any way I would be able to do it).
That’s what I have for normal maps, for now. But I plan to continue this as a growing series of posts on PBR materials to help demystify them for those of us new to this whole thing.
So… how’s things? Yeah, that’s cool. I’m over here playing with Archimatix (AX) and it’s pretty much the best thing ever. I’ve been watching this great tool develop over the past several months (though the dev has been working on it for much, much longer than that), and I have to say that I am incredibly impressed. I’ve been fooling around with tutorials and random stuff for the past day to see how things can fit into Labyrintheer.
If you recall my earlier posts on the topic, I am not an artist. I am especially not a 3D modeler. Much of what I’ve been able to accomplish thus far has been thanks to the incredibly awesome 3D assets provided by InfinityPBR. I’ve seriously considered how I’d ever get those sweet, custom models for some of the things I’ve considered in Labyrintheer over the past few years, and AX is the answer (I hope).
I’ve been messing with some of the node types just to get a feel for it, and one of the things I wanted to be able to do was some oddly modern, organic shapes as structures and “art” statues in some areas – mostly towns and wilderness, but not so much dungeons and caves. This was my first attempt at using the FreeCurve node to knock a shape out of a block and end up with something very open and extremely simple.
Ignore the robot – I don’t think that Labyrintheer is going to have robots. But Robot Kyle is the AX spokesbot, and it’s here just for scale.
While I don’t think this particular piece will make it into the game, I plan to use this concept to create monoliths that showcase the twelve elements in the game. This was actually an attempt at using something like a freehand-drawn Fibonacci spiral as a cutout. It didn’t do quite what I expected, but that’s part of the true fun with AX… the things that don’t work the way you expect, but give you interesting new ideas.
At any rate, I’m sure I’ll be posting about AX now and then and maybe even showing off some assets that could end up in the game. But for now I cannot recommend Archimatix highly enough for any Unity developer or artist. It’s an utterly fantastic tool.
I’ve been working on the core combat mechanics for the past week or two, and have finally got things feeling decent – at least to start with. I wanted to do something modular so that over time it would not only be easy to tweak, but also easy to add additional features. So, here’s a sample of what I have so far:
The base element here is the DamagePackage, which contains information about the raw damage amount (DamageArray – damage from each of the 12 possible elements), the cause of the damage (DamageVehicle – e.g., weapon, creature, trap, environment, effect), any damage effects, which would include damage-over-time (DoT) effects, and any special effects from the attack (chance to be set ablaze, frozen, sleep, silence, etc.).
There is a slight error in the diagram above, as I’m typing this all out – the CharacterStats are brought in at the Talker and Listener levels.
At any rate, the DamagePackage moves to the DamageTalker, which takes the attacker’s stats into account for any modifications, then sends the package along to the DamageListener on the attacked actor. The Listener works in reverse: it unpacks the information from the package, applies resistances and such from the attacked actor’s stats, then applies the modified damage to the actor itself.
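The flow above can be sketched in a few lines. To be clear, this is my own illustrative pseudocode-in-Python, not the actual Unity components – the class and function names mirror the post’s terminology, but the exact stat math (a flat damage multiplier on the Talker side, a per-element resistance fraction on the Listener side) is an assumption:

```python
# Hedged sketch of DamagePackage -> DamageTalker -> DamageListener.
# Names follow the post; the stat formulas are illustrative assumptions.
from dataclasses import dataclass

NUM_ELEMENTS = 12  # the twelve damage elements mentioned in the post

@dataclass
class DamagePackage:
    damage: list             # raw per-element damage (the DamageArray)
    vehicle: str = "weapon"  # cause of damage (weapon, trap, effect, ...)

def talker_send(package, attacker_damage_bonus):
    """DamageTalker: fold the attacker's stat modifiers into the package."""
    package.damage = [d * attacker_damage_bonus for d in package.damage]
    return package

def listener_receive(package, resistances):
    """DamageListener: unpack the package and apply defender resistances."""
    return [max(0.0, d * (1.0 - r))
            for d, r in zip(package.damage, resistances)]

# Usage: 10 fire damage, attacker has +20% damage, defender resists 50% fire.
pkg = DamagePackage(damage=[10.0] + [0.0] * (NUM_ELEMENTS - 1))
pkg = talker_send(pkg, attacker_damage_bonus=1.2)
final = listener_receive(pkg, resistances=[0.5] + [0.0] * (NUM_ELEMENTS - 1))
# final[0] == 6.0
```

The nice property of this shape is exactly the modularity described above: new effect types or stat modifiers only touch the Talker or Listener side, never both.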
The beauty of this system is its flexibility. The downside is that different components reside on different levels of the actors and their equipment. For instance, on the player, the stats and the listener are on the root, but the talker and all of its constituent parts live on each weapon (or spell). For a non-humanoid creature, almost everything lives on the root level – except listeners. The listener must always reside on the same layer that the collider lives on. Well, it doesn’t have to, but it makes things worlds easier. So… this is something I’m still trying to figure out. Honestly, I could probably package this up for the Unity Asset Store once I get it all cleanly situated. I’m happy with where it’s at now, but it has a LOT of work still ahead of it to become a nice, simple-to-use package.
Of course, the system needs to be marked dirty whenever an actor’s stats change, or whenever weapons/armor that might impact stats are swapped out.