Week Three

In week three, after fixing some issues from the previous week, the second development sprint began. This sprint covered prototyping UI, sound, and collectables, and really fleshing out the test level ready for the playtest sessions beginning in week five.

After deciding to give the player full control over the camera, I realised I would need to solve a problem where the player would not be visible behind certain objects. I had a few options for fixing this. One was to dive into the Cinemachine camera component and give the camera collision detection, so that it would shorten the distance between the character and the camera - essentially zooming in.

Having played a lot of platformers on different consoles over the years, my pet peeve is a camera that constantly clips into objects and jumps between being the optimum distance away and diving right behind the character.

I decided against implementing that for now. It would take a lot of trial and error to create custom collision behaviour for the camera so that it is only affected by certain objects - more than doable, but time-consuming, and something better perfected later on when I have a proper environment rather than just a level showcasing the mechanics.

Instead, I decided to implement a feature within URP that allows me to render the character even when they are behind an object - an effect used in games such as Super Mario Sunshine (on the right).

Essentially, the renderer switches between two different materials on the turtle: the original material, and one that is rendered whenever the turtle is behind an object.

The player sits on its own layer, which means that if I switched this to, say, a Pickup layer containing all of the pickups, every pickup would be rendered visible above everything else.

The way the turtle was set up caused some issues. I added all of its parts to the Player layer and set everything up as seen above, then duplicated the original turtle material and darkened it, hoping to mimic the Super Mario effect shown earlier. However, because the turtle is split into different parts, problems appeared.

(See Below)

If you look very closely above, you can see the two right legs through the character on the left side, and unless the material is pure black this is extremely obvious and does not look good.

This method caused a few other issues as well. The character is aligned to the floor from its centre, meaning the player's head and front legs can sometimes clip slightly into the floor. This is not noticeable from the player's point of view unless you are really looking.

However, with this effect set up it becomes very obvious: as soon as the head clips through, it switches to the darker material, which is visible right through the character. I was not really happy with this, so I began researching a different way of creating a similar effect.

I found two different options: casting a ray from the camera towards the character, which returns the object it hits, and then changing the transparency of that particular object; or creating a shader that changes the transparency of only PART of the object, using a similar ray-based method to work out where the transparency should be.
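As a rough illustration of the first option, the sketch below casts a ray from the camera towards the player each frame and fades whatever obstacle it hits. This is not my final implementation - the class, the field names, and the assumption that obstacles use a material with a transparent surface type are all placeholders.

```csharp
using UnityEngine;

// Hypothetical sketch of the "fade the occluding object" option.
public class CameraOcclusionFader : MonoBehaviour
{
    [SerializeField] private Transform player;          // the turtle
    [SerializeField] private LayerMask obstacleLayers;  // layers that can block the view
    [SerializeField, Range(0f, 1f)] private float fadedAlpha = 0.3f;

    private Renderer lastHit;
    private Color originalColour;

    private void LateUpdate()
    {
        // Restore whatever was faded last frame before re-checking.
        if (lastHit != null)
        {
            lastHit.material.color = originalColour;
            lastHit = null;
        }

        Vector3 toPlayer = player.position - transform.position;

        // If something on the obstacle layers sits between the camera and the player, fade it out.
        if (Physics.Raycast(transform.position, toPlayer.normalized, out RaycastHit hit,
                            toPlayer.magnitude, obstacleLayers))
        {
            lastHit = hit.collider.GetComponent<Renderer>();
            if (lastHit != null)
            {
                originalColour = lastHit.material.color;
                Color faded = originalColour;
                faded.a = fadedAlpha; // only visible if the material's surface type supports transparency
                lastHit.material.color = faded;
            }
        }
    }
}
```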

I am in a few different Discords related to the development of platformers, and one I am particularly active in is for a platformer in development called Vista World. The developer directed me to a website with tutorials on shaders within Unity (available here), which contains a section called an Introduction to the Shader Graph that, after reading, proved vital to my development.

After watching some tutorials and videos explaining the Shader Graph in detail, I felt comfortable enough to dive into my own shader, which I felt would be a good solution. It works with a custom material, based on the object's original material, with the custom shader applied. This means that for every object I want to be able to turn transparent I need to create another material - not very performance-friendly if I want everything to use this shader, but performance is a problem for later on in the project.

Within this custom material I have specified the area around the character that needs to be transparent, and the shader below is grouped into sections describing the functionality.

The first is the Screen UV group, which gets the screen-space offset and allows me to be more precise about where the transparent area sits in relation to the character.
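For context, the shader needs to know where the character sits on screen. A minimal way to feed it that information is a small script that converts the player's world position into viewport space each frame and pushes it into the material - the property name _PlayerScreenPos below is an assumption and would need to match whatever the Shader Graph actually exposes.

```csharp
using UnityEngine;

// Rough sketch of driving the see-through shader with the player's screen position.
public class OcclusionShaderDriver : MonoBehaviour
{
    [SerializeField] private Transform player;
    [SerializeField] private Material seeThroughMaterial;
    [SerializeField] private Camera mainCamera;

    // Assumed exposed Vector2 property on the Shader Graph.
    private static readonly int PlayerScreenPos = Shader.PropertyToID("_PlayerScreenPos");

    private void LateUpdate()
    {
        // Convert the player's world position into 0-1 viewport space,
        // which lines up with the Screen UV used inside the shader.
        Vector3 viewportPos = mainCamera.WorldToViewportPoint(player.position);
        seeThroughMaterial.SetVector(PlayerScreenPos, new Vector4(viewportPos.x, viewportPos.y, 0f, 0f));
    }
}
```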

Version one of the shader can be seen above. I was not happy with the size of the transparent area, so I added another group that keeps the size consistent regardless of screen resolution, as well as a section that mimics the Ellipse node within the graph, so I could shape the area more precisely than the built-in node allows.

With those implemented, the result can be seen below - the transparent space fits more suitably around the character.

I did not like how abrupt and sharp the transition between the two zones was, so I added two nodes that let me alter the smoothness between the zones as well as the transparency of the inner zone.

After exposing these float values as sliders within the editor, I played around with them at runtime and found something suitable - I was happy with the result.

The entire shader can be seen below split into those individual parts.

After that I added some more lighting to the test scene to see how the shader held up with different lighting and materials. I searched the Asset Store for some good skyboxes and found a suitable asset, seen below, which includes a few different styles and shades of skybox that could be used for different levels later in development.

After completing the shader work above, I decided to add an asset that will be really important when it comes to having people playtest the game, as well as adding value to the development updates on my blog: on-screen input display. Have you ever noticed how in tutorials - by Blender Guru, for example - whatever buttons he presses appear in a little text box at the bottom of the screen, making it clear what to click and when if you are not that familiar with the software? Something similar can be added within OBS (the recording software I use), which covers showcasing features with a controller or keyboard, but is not so useful for playtesting. After some research I came across a free asset on the Unity Asset Store with exactly the functionality I wanted for keyboard, and a relatively cheap add-on that provides the same for controllers.

This asset did not work right away in the particular Unity version I am on (2022.3), which supports both the legacy input system and the new Input System when it comes to showing what has been pressed on the UI. After some investigation and email conversations with the developers of the tool, we figured out that the default behaviour for this Unity version is to have Active Input Handling set to both, while the script behind the asset uses compiler directives to include or ignore code for either the old or the new input system. Once we came to this conclusion, they changed the code and sent it to me via a custom package; after I tested it, the change was pushed to the Asset Store. This added the functionality that can be seen below.
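For anyone unfamiliar with how those compiler directives behave, here is a simplified, hypothetical example (not the asset's actual code). Unity defines ENABLE_INPUT_SYSTEM and/or ENABLE_LEGACY_INPUT_MANAGER depending on the Active Input Handling setting, so with it set to Both, both blocks end up compiled.

```csharp
using UnityEngine;

public class InputBackendExample : MonoBehaviour
{
    private void Update()
    {
#if ENABLE_INPUT_SYSTEM
        // New Input System path.
        if (UnityEngine.InputSystem.Keyboard.current != null &&
            UnityEngine.InputSystem.Keyboard.current.spaceKey.wasPressedThisFrame)
        {
            Debug.Log("Space pressed (new Input System)");
        }
#endif
#if ENABLE_LEGACY_INPUT_MANAGER
        // Legacy Input Manager path.
        if (Input.GetKeyDown(KeyCode.Space))
        {
            Debug.Log("Space pressed (legacy Input Manager)");
        }
#endif
    }
}
```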

I then jumped right into implementing a really basic UI. I have not created a menu in Unity for quite a while, so before trying to design something complex I wanted to familiarise myself with the tools available to me.

I created a new scene to hold the entire start menu: the Canvas component containing all the UI elements, and the Event System that handles what happens when each button is pressed.

After some research, I found that to make this work in tandem with the new Input System I needed to add a new module to the Event System object; after a few changes to the UI section of the DefaultInputActions input asset, it now works with all types of controller.

After putting it to the test I came across two different bugs. The first was an issue with navigating between buttons on a controller: when I pressed the D-pad or used the left joystick, swapping between buttons would only work some of the time. I fixed this by changing the buttons' Navigation to Explicit, which not only made it work but also let me make the buttons cycle through - so if you press down while Credits is selected, you move to Start rather than nowhere at all.
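For reference, the same explicit, wrapping navigation could be set up in code rather than the inspector - the button names and ordering below are just an assumption to show the pattern.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of explicit button navigation with wrap-around.
public class MenuNavigationSetup : MonoBehaviour
{
    [SerializeField] private Button startButton;
    [SerializeField] private Button settingsButton;
    [SerializeField] private Button creditsButton;
    [SerializeField] private Button quitButton;

    private void Awake()
    {
        Link(startButton, up: quitButton, down: settingsButton);
        Link(settingsButton, up: startButton, down: creditsButton);
        Link(creditsButton, up: settingsButton, down: quitButton);
        Link(quitButton, up: creditsButton, down: startButton); // wraps back to the top
    }

    private static void Link(Button button, Button up, Button down)
    {
        Navigation nav = button.navigation;
        nav.mode = Navigation.Mode.Explicit;
        nav.selectOnUp = up;
        nav.selectOnDown = down;
        button.navigation = nav;
    }
}
```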

The second issue was that when switching from testing with a mouse to testing with a controller, the button would be deselected and there was no way to reselect it with the controller. After some research it turned out to come down to a simple checkbox behaviour that I had turned on!

A final issue was that if I started the game and tried to navigate immediately with the controller, no button would be selected. So I created a script with a public button variable, set in the editor, that automatically selects the specified button when the script runs.
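A minimal sketch of that auto-select script, assuming the button is assigned in the inspector (the class and field names here are my own):

```csharp
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.EventSystems;

public class AutoSelectButton : MonoBehaviour
{
    [SerializeField] private Button firstSelected; // assigned in the editor

    private void OnEnable()
    {
        // Clear any stale selection, then hand focus to the chosen button so
        // controller navigation has a starting point.
        EventSystem.current.SetSelectedGameObject(null);
        EventSystem.current.SetSelectedGameObject(firstSelected.gameObject);
    }
}
```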

After getting more comfortable with Unity's UI system, I began researching appropriate assets I could use within my game and came across this Modern UI Pack, which comes with built-in prefabs and would save a LOT of time by not having to create all of the assets myself. The pack includes lots of different types of buttons, sliders, checkboxes, and icons - a lot for the $40 they are charging - so it was a good investment that I can reuse across many future projects.

After replicating the system I had created with the new assets, to see whether they would fit into my project, I came across an issue with either the UI pack or my script - I am still unsure which. I mentioned earlier that I needed a script that auto-selects a desired button when it runs, so that when playing with just a controller the Play button is selected automatically. My plan for a quick UI system was to make the settings accessible through the start menu rather than within the game itself (for now), handled using OnClick events within the Event System. For example, if Settings is pressed, I would set Start / Settings / Credits / Quit to inactive and the settings-related buttons to active, meaning I could handle it all in one scene.

When quickly prototyping this, I removed my script, as the Modern UI Pack has a Button Manager that lets me set the auto-selected button on start - so I dragged it in and it worked.

However, when I began doing what I mentioned above using OnClick events, I came across an issue: clicking a button would set the right buttons to active or inactive, but it would not auto-select a button among the newly active ones, meaning controllers were stuck while mouse and keyboard worked fine.

To combat the issue I reintegrated my script, adding two new functions and button variables to let me manually set the selected button when certain buttons were pressed. After testing it, though, the Unity-type Button variable in the inspector would not let me assign the button in the scene. After digging through the Modern UI documentation I could not figure out why, so - having dedicated too much time to this issue - I split the menu into different scenes (as seen below) and the functionality worked for now.

After testing, it worked with both controller and mouse and keyboard, as seen below.

Feeling comfortable with this asset pack, I began to design the user interface of the entire game in more detail. I did this in the GoodNotes app on my iPad Mini, which has proven to be a really useful tool in my game design work. It not only lets me work on the go, but at recent networking events it has also been handy for showing off my CV / portfolio for feedback.

After designing the UI on the iPad Mini, I decided to carry at least the fundamentals across into Unity to prove the concept - so I dove right in and managed to create the fully functional menu below, using the assets I got from the Asset Store.

Example of One Menu Layout

Flow Chart Showing The User Interface Journey

After further development I quickly added a ScreenCast Input toggle to the menu, so I can remove it from the UI when I do not need it there, as it is only intended for playtests and video showcases. A rather long video can be watched below to see the full menu in action.

After testing the menu with both controller and keyboard and mouse, I identified two main issues to solve. The first stems from the way I set up the menu: everything lives in separate GameObjects, similar to the first example, and I was toggling each GameObject's active state. When I added functionality for the pause button on controller (Esc on PC) to pause on the first press and unpause on the second, I was only toggling the settings menu, which left the other menus active if I happened to be on one of those pages. Footage can be seen below.

This was not a hard fix. I already had a custom function that triggers when the pause button is pressed while the game is paused, to unpause it - hence the name ResumeGame. I added some [SerializeField] GameObject fields so that I could drag the corresponding objects onto them in the editor, and whenever the function runs it sets every one of those GameObjects to inactive, just to be safe.
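A rough reconstruction of that fix (the field names and the timescale handling are assumptions on my part):

```csharp
using UnityEngine;

public class PauseMenuController : MonoBehaviour
{
    [SerializeField] private GameObject[] menuPanels; // settings, debug, pause root, etc.

    public void ResumeGame()
    {
        // Deactivate every menu panel, not just the one currently showing,
        // so no page is left behind when the game unpauses.
        foreach (GameObject panel in menuPanels)
        {
            panel.SetActive(false);
        }

        Time.timeScale = 1f; // resume gameplay (assuming pause set timeScale to 0)
    }
}
```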

The second issue is something I faced earlier in the week: the menu works flawlessly with keyboard, but with a controller, when switching between menus, the controller has no way of knowing what to select - so nothing is selected and there is no way to select anything. I started by trying to replicate my earlier fix, creating a script that let me assign the selected button in the inspector and then setting it as the selected button in the OnClick events. This did not work, as the Unity Button type was not what the Modern UI Pack classes as a button. After some research I decided on a different approach.

I began by creating variables that let me assign the GameObjects instead (after spending 30 MORE minutes reading the Modern UI documentation and realising the buttons were custom GameObjects), the idea being that when you enter the Debug menu, for example, the first selected button is always the same.

The way I did this was, once again, with OnClick events. Using OnEnterDebug as the example: when you click the Debug Menu button it not only enables the Debug menu and disables the Settings menu, but also runs the OnEnterDebug function, which uses the UI's EventSystem to set the selected GameObject. After some tweaking and messing around with the navigation, it worked fully with both controller and PC.
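Something along these lines, assuming the first-selected items are assigned as plain GameObjects since the Modern UI Pack buttons are not UnityEngine.UI.Button components (the names here are my own):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

public class MenuSelectionRouter : MonoBehaviour
{
    [SerializeField] private GameObject debugMenuFirstSelected;
    [SerializeField] private GameObject settingsMenuFirstSelected;

    // Hooked up to the Debug Menu button's OnClick event,
    // alongside the enable/disable toggles.
    public void OnEnterDebug()
    {
        EventSystem.current.SetSelectedGameObject(debugMenuFirstSelected);
    }

    public void OnEnterSettings()
    {
        EventSystem.current.SetSelectedGameObject(settingsMenuFirstSelected);
    }
}
```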

(Seen Below)

The functionality for the VFX toggle and the resolution setting is not linked up yet, but that is a feature I can add later on as it is not really needed right now.

The final step for this week was adding audio, and where did I end up? The Asset Store. After not finding anything that stood out as suitable for a prototype, I looked into OpenGameArt, which has a rather large open-source game audio library, and managed to find a pack with very basic jump / land / pickup sound effects that can serve as placeholders for now.
