GoedWare Game Jam 14 dev log and retrospective
In case the images don't load, you can check the PDF version here
Table of Contents
- About
- Preparing, brainstorming and planning
- Creating sprites in Blender
- Creating the world
- Creating the weapons
- Running into performance issues
- Playtesting
- Experiences with Unity
- Conclusion
This post is about the topics we dealt with during the GoedWare Game Jam and my reflections on them. It’s a combination of a dev log and a retrospective. In case you haven’t played the game yet, you can try it out in your browser here.
Preparing, brainstorming and planning
The jam takes place over 10 days, and our team works remotely. To ensure we would deliver something workable, I set up a few processes:
- Make sharing ideas as easy as possible. For me the best way to do this is via a whiteboard, so I set up a Miro board where we could write our notes, collect our ideas and hold our discussions.
- Have a roadmap so we know we're on track. Most importantly, define how long we want to spend on brainstorming, prototyping and polishing.
- Ensure our vision stays aligned by planning few but focused meetings. From experience I know it can be tiring to brainstorm for long stretches, especially when collaborating, so to prevent fried brains it helps to agree up front on how much time we want to spend talking in a meeting.
To start brainstorming we dedicated an hour to talk on Discord and fill out a mind map about the theme.
Note that we’re Dutch so our communication tends to be a mix of English and Dutch. Miro has a really nice mindmap tool that automatically positions your nodes.
That was all for day 1, as it was getting late (I have a job and the others have school). On day 2 we would brainstorm again, but now that we had discussed the theme broadly, we gave ourselves the task of figuring out how to turn these notes into a game.
I noticed that things didn’t really go well during the 2nd day. Delving deeper into our ideas and trying to see how they related to resources just wasn’t sparking any creativity. So half an hour into our 2nd brainstorm, I had the team take a 1-hour break and use that time to think up a gameplay idea using the ideas we had been discussing. Then we would pitch these ideas against each other and combine the good parts into our game.
This went quite well; the pitch that we ultimately went with was this one
With a gameplay idea set we went on to make a little roadmap to see if the work would fit.
I expected this roadmap to change right after the first day, and it did, but its goal was to align our vision and set expectations for the upcoming days. With that, our planning was done and we were ready to start building on day 3.
Reflection
- Miro is a very useful tool for collaboration.
- Short and focused meetings were nice and productive
- The roadmap helped with identifying requirements; it could also have been a plain list, but I believe it made a good overview
- Brainstorming is difficult. I feel our method of exploring the theme needs improvement; perhaps have everyone brainstorm on their own first and then do pitches.
- The roadmap slowly but surely started being used as a planning tool. Any task system, like Trello or even a Kanban board in Miro, would have worked far better for this, so it's worth setting one up next time.
- 2 days for polish is too short, and in our case that also included playtesting. Because our game required so many systems to actually be playable, we were taking a huge risk: the idea might have ended up being very boring. Next time, more focus should be put on a creative interpretation of the theme; maybe flip it on its head, so that instead of the player collecting resources, you are a resource about to be collected.
Creating sprites in Blender
A technique I had wanted to try out for a while is creating 2D assets from 3D models. It’s a workflow Dead Cells makes use of, and it allows you to quickly create and adjust tons of animations. There were 2 things I wanted to achieve:
- Turning a Blender animation into a sprite sheet
- Generating a normal map for more interesting light interactions
To achieve this, you first create the model and animation in Blender. To create the station, I created a hollow cylinder and used the Discombobulate modifier on it. Afterwards I disabled all viewport gizmos and rendered the viewport twice:
- Once with the render mode set to Material Preview, as we don’t want lighting
- Once with the render mode set to Solid and the normal material, to capture the normal map
I exported 120 frames for the station to get reasonably smooth movement. This rendering produces 120 images, so we still had to turn them into a single texture/sprite sheet.
To merge them I used Aseprite: you can import all the images as a sprite sheet and then export them as one sprite sheet. Initially our texture was about 500 MB, but after enabling the settings to trim empty space and reducing the resolution, we managed to shrink it all the way down to 10 MB.
Reflection
- The workflow works well, but you have to be careful with detailing; consistency in art style is important.
- Due to how we exported the textures, their dimensions weren’t powers of two, so we couldn’t compress them.
Creating the world
The world is a very important system as it drives the whole game. It does a few things:
- Generate and modify the terrain that the player will mine
- Render that data to a texture
- A fog system to hide resources that haven’t been scanned yet
- A lighting system for our god rays
Generate the terrain that the player will mine
To generate the world we use a combination of multiple noise maps and thresholds. I actually wanted to explore a Voronoi solution to get some more interesting visuals, but we had to cut our scope.
Each ore samples from its own mixed noise texture, and we use threshold values and multipliers to control which resources overwrite which.
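To make that concrete, here is a minimal sketch of what such layered noise sampling can look like in Unity C#. All type names, scales and thresholds here are illustrative, not our actual tuned values:

```csharp
using UnityEngine;

public enum ResourceType { Empty, Rock, Iron, Gold }

public static class TerrainGenerator
{
    // Hypothetical per-ore settings: a noise frequency, a spawn threshold
    // and a priority so rarer ores can overwrite more common ones.
    struct OreLayer
    {
        public ResourceType Type;
        public float NoiseScale;
        public float Threshold;
        public int Priority;
    }

    static readonly OreLayer[] Layers =
    {
        new OreLayer { Type = ResourceType.Iron, NoiseScale = 0.05f, Threshold = 0.75f, Priority = 1 },
        new OreLayer { Type = ResourceType.Gold, NoiseScale = 0.08f, Threshold = 0.85f, Priority = 2 },
    };

    public static ResourceType[,] Generate(int width, int height, int seed)
    {
        var grid = new ResourceType[width, height];
        for (int x = 0; x < width; x++)
        for (int y = 0; y < height; y++)
        {
            grid[x, y] = ResourceType.Rock; // base terrain everywhere
            int bestPriority = 0;
            foreach (var layer in Layers)
            {
                // Offset by the seed and the layer index so each ore
                // effectively samples its own region of the noise field.
                float n = Mathf.PerlinNoise(
                    (x + seed + layer.Priority * 1000) * layer.NoiseScale,
                    (y + seed) * layer.NoiseScale);
                if (n > layer.Threshold && layer.Priority > bestPriority)
                {
                    grid[x, y] = layer.Type;
                    bestPriority = layer.Priority;
                }
            }
        }
        return grid;
    }
}
```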
Terrain data itself is stored in a 2D array of enums. We have several data structures:
- ResourceType: an enum identifying the resource type
- ResourceData: per-instance data for each resource, such as whether it’s revealed and its world and index positions
- ResourceDefinition: a struct containing static info about a resource, like its yield, preferred weapon and display name
We have another array for the fog mask, stored as floats. On top of that there are several functions to interact with these arrays, like getResourcesAffectedByPayload, damage or getAdjacentResourcesOfType.
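Roughly, the data model and one of those helpers could look like this. This is a sketch reusing the ResourceType enum from the generation example; the field names and container choices are guesses based on the description above:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Per-instance data for one cell of the terrain.
public class ResourceData
{
    public ResourceType Type;
    public bool Revealed;       // has the scanner uncovered this cell?
    public Vector2Int IndexPos; // position in the grid
    public Vector2 WorldPos;    // position in world space
}

// Static info shared by every instance of a resource type.
public struct ResourceDefinition
{
    public string DisplayName;
    public int Yield;
    public string PreferredWeapon;
}

public class World
{
    ResourceData[,] grid;
    float[,] fogMask; // 0 = fully fogged, 1 = fully revealed

    static readonly Vector2Int[] Neighbours =
        { Vector2Int.up, Vector2Int.down, Vector2Int.left, Vector2Int.right };

    // Flood-fill lookup of connected cells sharing one type, e.g. a whole ore vein.
    public List<ResourceData> GetAdjacentResourcesOfType(Vector2Int start, ResourceType type)
    {
        var result = new List<ResourceData>();
        var seen = new HashSet<Vector2Int>();
        var stack = new Stack<Vector2Int>();
        stack.Push(start);
        while (stack.Count > 0)
        {
            var pos = stack.Pop();
            if (!seen.Add(pos)) continue;
            if (pos.x < 0 || pos.y < 0 ||
                pos.x >= grid.GetLength(0) || pos.y >= grid.GetLength(1)) continue;
            if (grid[pos.x, pos.y].Type != type) continue;
            result.Add(grid[pos.x, pos.y]);
            foreach (var n in Neighbours) stack.Push(pos + n);
        }
        return result;
    }
}
```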
Render that data to a texture
The render system simply takes the array and builds a texture from it, generating a pixel of the corresponding color based on each cell’s ResourceType. We actually wanted to use tiled textures: instead of using this texture directly, we would sample it to index into the tiled textures and add some detail to the terrain.
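In Unity terms, the simplest version of that render step might look something like this. It is a sketch (again reusing the ResourceType enum from earlier), and the palette colors are made up:

```csharp
using UnityEngine;

public class WorldRenderer
{
    // One flat color per resource type; indices match the ResourceType enum.
    static readonly Color32[] Palette =
    {
        new Color32(0, 0, 0, 0),         // Empty
        new Color32(90, 80, 70, 255),    // Rock
        new Color32(180, 160, 150, 255), // Iron
        new Color32(230, 190, 60, 255),  // Gold
    };

    public Texture2D RenderToTexture(ResourceType[,] grid)
    {
        int w = grid.GetLength(0), h = grid.GetLength(1);
        var tex = new Texture2D(w, h, TextureFormat.RGBA32, false);
        tex.filterMode = FilterMode.Point; // keep crisp pixels

        var pixels = new Color32[w * h];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                pixels[y * w + x] = Palette[(int)grid[x, y]];

        tex.SetPixels32(pixels); // one bulk write instead of per-pixel SetPixel
        tex.Apply();             // push the data to the GPU
        return tex;
    }
}
```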
A fog system to hide resources not scanned
We have a custom shader for the planet that takes a fog texture. We simply use it to modify the color in the fragment shader, though we were thinking of using it as a mask over a dedicated fog texture.
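On the C# side, keeping the shader's fog texture in sync with the float array could look roughly like this. This is a sketch; the `_FogTex` shader property name is hypothetical:

```csharp
using UnityEngine;

public class FogUploader
{
    Texture2D fogTex;

    // Writes the fog mask floats into a texture the planet shader can sample.
    public void UploadFog(float[,] fogMask, Material planetMaterial)
    {
        int w = fogMask.GetLength(0), h = fogMask.GetLength(1);
        if (fogTex == null)
        {
            fogTex = new Texture2D(w, h, TextureFormat.RGBA32, false);
            planetMaterial.SetTexture("_FogTex", fogTex); // hypothetical property name
        }

        var pixels = new Color32[w * h];
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
            {
                // Store the fog value in every channel; the shader only reads one.
                byte v = (byte)(Mathf.Clamp01(fogMask[x, y]) * 255f);
                pixels[y * w + x] = new Color32(v, v, v, 255);
            }

        fogTex.SetPixels32(pixels);
        fogTex.Apply(); // push the update to the GPU
    }
}
```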
A lighting system for our god rays
This is again a separate texture that feeds into the same shader. Our approach was quite naive: we took the fog texture (where red represents alpha) and repeatedly stamped and translated it along the light direction. This way you end up with a texture that represents where light would land.
It was really bad for performance, but with smoothing it did look very nice.
Next time I would implement this as a post-process shader. You already have an occlusion texture (the fog texture), so for each fragment you trace a ray toward the sun; if it collides somewhere in the fog texture, you know the fragment is shaded. That should be far more performant.
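To illustrate the idea, here is the per-fragment trace written as plain C# on the CPU for readability; in practice it would live in the post-process shader, and all names and constants here are made up:

```csharp
using UnityEngine;

public static class GodRayTrace
{
    // Marches from a cell toward the sun through the fog/occlusion mask.
    // Returns how much light survives (0 = fully shaded, 1 = fully lit).
    // fogMask convention: 1 = occluding fog, 0 = clear.
    public static float TraceToSun(
        float[,] fogMask, Vector2 start, Vector2 sunDir,
        float stepSize = 1f, float absorption = 0.1f)
    {
        int w = fogMask.GetLength(0), h = fogMask.GetLength(1);
        float light = 1f;
        Vector2 pos = start;
        Vector2 dir = sunDir.normalized;

        while (pos.x >= 0 && pos.y >= 0 && pos.x < w && pos.y < h && light > 0f)
        {
            // Accumulate occlusion along the ray; denser fog eats more light.
            light -= fogMask[(int)pos.x, (int)pos.y] * absorption;
            pos += dir * stepSize;
        }
        return Mathf.Max(0f, light);
    }
}
```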
Reflection
- Collaboration on this core system worked well because we split it into all these subsystems instead of one big script
- I’m not sure whether it’s best to have 4 different sprites, or to keep them as textures and combine them all in one shader. I think the latter gives more flexibility, but the former has fewer dependencies, and we can easily enable or disable layers to debug.
Creating the weapons
The weapons were downscoped a lot as well. Initially there would be an upgrade system for your weapons as well as an energy system; you would have to queue projectiles and build stations to increase how many projectiles you could shoot in parallel. Not only was this obviously overscoped, it also increased the complexity of the game a lot, meaning we would have to teach the player all of it, and the game already runs on a timer.
Needless to say, it got scrapped. The weapons themselves just call into the World object, modifying the terrain by calling Damage with a payload; there’s nothing too crazy to mention. All weapon and projectile stats are defined in the script variables, so when I implemented upgrades, I changed those into getters and applied the upgrade multipliers there. It’s not the most robust solution (and I think the current upgrades are actually bugged), but it worked quite well for this project.
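Concretely, the getter trick looks something like this (a simplified sketch, not our exact code; UpgradeManager and its multipliers are illustrative):

```csharp
using UnityEngine;

public class Weapon : MonoBehaviour
{
    // Base stats tuned in the inspector.
    [SerializeField] float baseDamage = 10f;
    [SerializeField] float baseCooldown = 1.5f;

    // Hypothetical upgrade store holding the current multipliers.
    [SerializeField] UpgradeManager upgrades;

    // Properties, so every read reflects the current upgrade level.
    // Callers must not cache these, or upgrades silently stop applying.
    public float Damage => baseDamage * upgrades.DamageMultiplier;
    public float Cooldown => baseCooldown / upgrades.FireRateMultiplier;
}

public class UpgradeManager : MonoBehaviour
{
    public float DamageMultiplier { get; private set; } = 1f;
    public float FireRateMultiplier { get; private set; } = 1f;

    public void BuyDamageUpgrade() => DamageMultiplier += 0.25f;
    public void BuyFireRateUpgrade() => FireRateMultiplier += 0.25f;
}
```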
Reflection
- Properties are quite useful for introducing state-dependent values, like upgrades. You just have to make sure systems don’t cache these values.
- Initially, weapons were supposed to be the gameplay and strategy, but instead of making upgrading weapons the gameplay, we made modifying the terrain as efficiently as possible the gameplay. Making just weapon upgrades the progression is backwards.
Running into performance issues
Performance was fine in the editor, but the game ran terribly in a cooked build; I’ve never seen a game perform that much worse in a build. At first I thought it was due to the resolution, but I started profiling it. Whenever you scrolled to the bottom of the planet, the game would drop to 15-30 fps. We also had an issue where the game would freeze every time you broke resources.
For the fps issue, the profiler showed that 40ms were being spent in VSync on the renderer. I was shocked to see that, and thought simply disabling vsync would save the game. It didn’t do anything, and I was sent on a wild goose chase where I suspected the build profile settings weren’t being applied, and even tried turning vsync off in script.
After some further research it turned out the profiler attributes time to VSync when the GPU is busy; it isn’t necessarily vsync that it’s actually waiting on. What ended up being the bottleneck was the god ray shader and all the iterations we did to stamp the fog texture for light info. Disabling smoothing made the god rays quite ugly, but performance was back in the hundreds of fps.
For the freeze on resource destruction, we initially suspected our raycast function, but profiling pointed at our world render script. Since our world systems are decoupled, they communicate all mutations with events, and for the render part we simply re-rendered the whole texture on every mutation. The fix was to pass along an array of the modified resources, so we could see exactly which cells were affected and update only those.
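The fix boiled down to passing the affected cells along with the event and only rewriting those pixels, roughly like this (names are illustrative):

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

public static class WorldEvents
{
    // Instead of a parameterless "world changed" event, pass what changed.
    public static event Action<List<Vector2Int>> ResourcesModified;

    public static void RaiseResourcesModified(List<Vector2Int> cells) =>
        ResourcesModified?.Invoke(cells);
}

public class WorldTextureUpdater : MonoBehaviour
{
    [SerializeField] Texture2D worldTexture;

    void OnEnable()  => WorldEvents.ResourcesModified += OnResourcesModified;
    void OnDisable() => WorldEvents.ResourcesModified -= OnResourcesModified;

    void OnResourcesModified(List<Vector2Int> cells)
    {
        // Only rewrite the pixels that actually changed instead of
        // regenerating the full texture on every mutation.
        foreach (var cell in cells)
            worldTexture.SetPixel(cell.x, cell.y, ColorFor(cell));
        worldTexture.Apply();
    }

    Color ColorFor(Vector2Int cell)
    {
        // Look up the cell's ResourceType and return its palette color;
        // omitted here for brevity.
        return Color.clear;
    }
}
```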
Reflection
- The profiler and tooling in Unity are pretty good in general. The debugger works, the profiler works; I never really got stuck, as there are tons of tools available to approach your problems
- Just be sure not to take profiler results at face value; 95% in vsync is very unlikely, yet I still assumed the tool must be right.
Playtesting
Playtesting happened during the last 2 days, together with polishing. During playtesting we added a ton of missing feedback, like showing the resources you gained and which weapon you should use, plus the upgrade system (we decided this was the one thing truly missing). With 1 day left, we had just enough time to add those systems.
Reflection
- Playtesting, as always, was very important, and it’s good we had a few days for it
- 2 days for playtesting and polish really is too little, but that is the problem with a concept that requires a ton of systems before you can even see if it's fun
- In the future, I would say 50% of the time should go to playtesting and polish. Because of this, I will focus more on a creative interpretation of the theme rather than on games with multiple systems that need balancing, and only then consider gameplay progression.
Experiences with Unity
Nobody on the team had any prior experience with Unity, but one of our members wanted to learn it, so this seemed like a good opportunity.
Reflection
Overall, I'm not the biggest fan. Being unable to read some of the engine's API internals leaves me guessing, and some of the newer systems, like the new Input System, didn’t work well with the short timespan we had.
For example, I tried to use 2 PlayerInput components on different GameObjects because I assumed they would both forward input events to our scripts. Instead, the 2nd input got bound to another device, which makes sense for co-op gaming, but I couldn't find a way around it.
Another issue was that pressing UI elements would not eat input events. To solve this, the official docs mention: “...The easiest way to resolve such ambiguities is to respond to in-game actions by polling from inside MonoBehaviour.Update…”. But polling kind of goes against making an event-driven input system, right? A lot of my confusion can also be attributed to not wanting to deep-dive into the system given the timeline we had, so I do give it the benefit of the doubt.
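For what it's worth, one common workaround we could have tried is asking the EventSystem whether the pointer is over UI before handling a click (sketched here with the legacy Input class for brevity):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

public class WorldClickHandler : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Ignore clicks while the pointer is over a UI element, so a
            // button press doesn't also fire an in-game action underneath it.
            if (EventSystem.current != null &&
                EventSystem.current.IsPointerOverGameObject())
                return;

            HandleWorldClick();
        }
    }

    void HandleWorldClick()
    {
        // Fire the weapon, select a tile, etc.
    }
}
```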
On the positive side, Unity did contain all the tooling needed to create our project. It was also quite performant: we did run into performance issues, but when I tried to build the same project in GDScript, iterating over 200,000 pixels already froze the application for 500ms. And lastly, the experience did increase my motivation to use a custom engine or Godot for my next projects.
Conclusion
This was our first game jam. At the beginning, time felt quite abundant, but it quickly became apparent that the scope was too big. We cut quite a few features but still managed to make something that feels quite polished and fun to play.
Getting to experience working together towards a deliverable in a short timespan has been very insightful. The biggest concern was that our idea required quite a few systems before we could even start testing, so we never really knew whether it would be fun to play until it was done (and we didn't have anyone on the team focused on design to ensure we were making something that's actually fun).
Once the systems were finished and we tried the game, it was quite rough, but by playing it and coming up with ideas to improve the gameplay, while ensuring they stayed within scope, we were able to deliver something that we are satisfied with.
For next time I want to focus on:
- Game design and ideation. Exploring the theme more to come up with a creative interpretation that requires less foundation to start with.
- Taking more risk. Since this was our first game jam, I really put the focus on getting a tangible idea and working towards it; with clear expectations it's easier to visualize the end product and keep the team aligned. Now that I’ve had a taste of the entire process, I’d like to take a bit more risk next time.
- My skills in audio and making games visually pleasing. This jam has given me first-hand experience with all the systems required to bring a project from concepting to polishing. I’ve mostly been doing gameplay programming, but I tried many new tools and topics, to name a few: Miro for collaboration, art with Blender and Aseprite, sfx, and creating assets for Itch. Our music was downloaded from a royalty-free channel, and the project had way more custom shaders than expected; our approaches there were quite naive and caused performance issues. I think there’s a lot to gain in these areas, both for personal development and for the quality of my future games, and I’ve gained a newfound interest in developing them further.