Week One
I used most of the time this week to really plan out the project. I decided to split the 12 week project (15 with the Easter break) into 6 development sprints, each followed by playtesting and feedback sessions. This keeps me on track and to a tight deadline, all while constantly iterating.
I am using Hack N Plan to organize those development sprints into smaller, manageable tasks, which also helps me estimate how long each one will take to implement.
Alongside planning, the fundamentals were set up next. The GitHub repo, my version control of choice, was set up in no time at all, letting me access my project files in lab sessions instead of having to bring a laptop in, and letting me work on multiple different features without having to worry about making a mistake.
After speaking to my Supervisor, I decided it was best to spend the first week of the project creating a proof of concept build that would do as it said on the tin: help me better explain the project as well as prove my overall vision.
Before the project began, I had spent quite a large amount of time searching for assets I thought would suit the vision I had. The Unity Asset Store had an amazing asset pack from a creator called Omabuarts Studio; that asset pack can be seen below.
The asset pack houses 45 different fully textured and animated models, with blend shapes for eye animations, all with 4 different LOD levels built in for performance. After seeing the turtle and its animations I fell in love with the little guy, and he became the main character in my game. I do plan to use some of the other underwater assets for friendly / enemy AI, if development all goes to plan.
After importing these amazing assets into the project and converting the materials over to the Unity Render Pipeline, it came to the point where I needed to make a decision: do I spend the time now diving into a hierarchical state machine, which I had not tackled before at that point, or do I just do what I am comfortable with? I decided against using the state machine for now. For the prototype proof of concept I only planned to have 4 states (Idle, Walking, Running and Jumping), so if / else based logic, all held in one script that handled movement and animation, would suffice. I would come back and refactor it when the time came.
I handled all of my movement inputs with Unity's new Input System, which allowed me to implement controller support. After setting up my action mappings and generating a C# file alongside them, it was down to getting the character animations set up.
Using Unity's built-in Animator Controller, I set up the animations and their transition states and logic to run off 4 boolean variables that are accessible from the character movement script, which can then be used to trigger the right movement logic depending on the state. Test footage can be seen below, where I make sure all the transitions are functioning as intended.
After this, getting the character actually moving was the next task, so I set up callback functions that are triggered when any of my assigned actions (set up earlier) have been pressed, are being pressed, or are cancelled. As I intend to make it fully functional on controller as well as mouse and keyboard, using Move.performed was necessary.
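To give a rough idea of how that is wired up, here is a simplified sketch of the movement script's input setup. The class name PlayerInputActions, the action map name CharacterControls and the action names are placeholders for whatever the generated C# file actually contains, not the exact code from the project.

using UnityEngine;
using UnityEngine.InputSystem;

public class CharacterMovement : MonoBehaviour
{
    // Placeholder name for the C# class generated from the .inputactions asset
    PlayerInputActions playerInput;

    Vector2 currentMovementInput;
    bool isMovementPressed;
    bool isJumpPressed;
    bool isRunPressed;

    void Awake()
    {
        playerInput = new PlayerInputActions();

        // started / performed / canceled cover pressed, being held and released,
        // for both keyboard keys and an analogue controller stick
        playerInput.CharacterControls.Move.started += OnMovementInput;
        playerInput.CharacterControls.Move.performed += OnMovementInput;
        playerInput.CharacterControls.Move.canceled += OnMovementInput;

        playerInput.CharacterControls.Jump.started += ctx => isJumpPressed = true;
        playerInput.CharacterControls.Jump.canceled += ctx => isJumpPressed = false;

        playerInput.CharacterControls.Run.started += ctx => isRunPressed = true;
        playerInput.CharacterControls.Run.canceled += ctx => isRunPressed = false;
    }

    void OnMovementInput(InputAction.CallbackContext context)
    {
        // Read the stick / WASD value and remember whether there is any input at all
        currentMovementInput = context.ReadValue<Vector2>();
        isMovementPressed = currentMovementInput != Vector2.zero;
    }

    void OnEnable()  { playerInput.CharacterControls.Enable(); }
    void OnDisable() { playerInput.CharacterControls.Disable(); }
}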
I split the next chunk of important functionality into separate functions: OnJump / OnRun / OnMovementInput / HandleRotation / HandleGravity / HandleAnimation / HandleJump. This makes for a more organised script, which will help me find where bugs / issues are, as well as making it much easier to alter this code later.
HandleJump Functionality:
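Roughly speaking, the function does something like this. It is a simplified sketch rather than the exact code, and it assumes the CharacterMovement script sketched above also has a CharacterController reference plus currentMovement, isJumping and initialJumpVelocity fields (all placeholder names).

// Sketch only - starts a jump when grounded and the button is pressed,
// and resets the flag once the character lands with the button released.
void HandleJump()
{
    if (!isJumping && characterController.isGrounded && isJumpPressed)
    {
        // Launch the character with the precomputed take-off velocity
        isJumping = true;
        currentMovement.y = initialJumpVelocity;
    }
    else if (isJumping && characterController.isGrounded && !isJumpPressed)
    {
        // Back on the ground with the button released, so we can jump again
        isJumping = false;
    }
}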
HandleAnimation Functionality:
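Again as a simplified sketch rather than the exact code, assuming an Animator reference on the same script and illustrative parameter names ("isWalking", "isRunning").

// Sketch only - keeps the Animator's boolean parameters in sync with the input state.
void HandleAnimation()
{
    bool isWalkingAnim = animator.GetBool("isWalking");
    bool isRunningAnim = animator.GetBool("isRunning");

    // Start / stop the walk cycle based on movement input
    if (isMovementPressed && !isWalkingAnim)
        animator.SetBool("isWalking", true);
    else if (!isMovementPressed && isWalkingAnim)
        animator.SetBool("isWalking", false);

    // Only run when both move and run are held
    if ((isMovementPressed && isRunPressed) && !isRunningAnim)
        animator.SetBool("isRunning", true);
    else if (!(isMovementPressed && isRunPressed) && isRunningAnim)
        animator.SetBool("isRunning", false);
}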
After implementing the base logic of moving / jumping, I managed to go from this basic movement:
To adding logic to the HandleRotation function, so that the character rotates correctly towards where he is moving.
This wasn't too difficult. After researching Quaternions through YouTube and online sources, I managed to set up a script that rotates the character towards his movement direction on two axes: X and Z.
This logic works for the moment, but when I come to adding swimming into the game the character will need to rotate on the remaining axis too. That shouldn't be too much trouble, as I already have a Vector3 local to that function, called positionToLookAt, where I am currently setting the Y axis to zero manually.
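A simplified sketch of that rotation logic, assuming the same script holds the currentMovement vector and a rotationFactorPerFrame tuning value (both placeholder names).

// Sketch only - rotates the character towards the direction it is moving in,
// ignoring the Y axis for now.
void HandleRotation()
{
    Vector3 positionToLookAt;
    positionToLookAt.x = currentMovement.x;
    positionToLookAt.y = 0f;                 // will need changing once swimming is in
    positionToLookAt.z = currentMovement.z;

    Quaternion currentRotation = transform.rotation;

    if (isMovementPressed)
    {
        // Slerp from the current facing towards the movement direction
        Quaternion targetRotation = Quaternion.LookRotation(positionToLookAt);
        transform.rotation = Quaternion.Slerp(currentRotation, targetRotation,
            rotationFactorPerFrame * Time.deltaTime);
    }
}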
Next up was the jump, which worked well after adding simple logic to the HandleGravity function: the character jumped the same height no matter how long the button was held.
After the basic jump was working, I decided to give players more control over the character's jump, and by that I mean variable jump height, which is commonly found in both 2D and 3D games; Super Mario and Celeste are two well known examples. If you press jump and hold it, you jump really high and then slowly come back down, whereas a short tap gives a much smaller jump. Releasing the jump button causes the character to speed up towards the floor, so you as a player have a really high level of control over where you are within your jump.
HandleGravity Function
Within this function I have included a local isFalling boolean alongside a fallMultiplier that I can alter later to help make the jump feel snappier if needed. What this does, essentially, is apply the fall multiplier to the current gravity whenever the player is falling or the jump button ISN'T held.
Alongside this, I researched Velocity Verlet integration, which is a way developers make the jump physics and characteristics consistent regardless of the framerate.
The way I handle this is by storing the previous frame's Y velocity, calculating a new Y velocity from it (previous Y velocity + gravity * Time.deltaTime), then adding the two together and multiplying by 0.5 to average them out.
Functionality can be seen below:
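As a simplified sketch rather than the exact code: gravity, groundedGravity and fallMultiplier are placeholder tuning fields on the same script.

// Sketch only - applies a small grounded gravity, a fall multiplier when falling
// or when jump is released, and a Verlet-style average of old and new Y velocity
// so the jump arc stays consistent regardless of framerate.
void HandleGravity()
{
    bool isFalling = currentMovement.y <= 0f || !isJumpPressed;

    if (characterController.isGrounded)
    {
        // Small constant downward force keeps isGrounded reliable
        currentMovement.y = groundedGravity;
    }
    else
    {
        float previousYVelocity = currentMovement.y;
        float multiplier = isFalling ? fallMultiplier : 1f;

        float newYVelocity = previousYVelocity + (gravity * multiplier * Time.deltaTime);

        // Average the old and new velocity (multiply the sum by 0.5) - Velocity Verlet
        currentMovement.y = (previousYVelocity + newYVelocity) * 0.5f;
    }
}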
Barebones movement prototype complete. While I was in the Animator Controller, I used a different layer with blend shapes to add the character's eye animations to the different states. It is almost identical to the Animator Controller I showed earlier, with the same logic, just with individual eye animations.
Up until now the character has been moving in world space, not relative to the camera, unlike most platformers, which have camera-relative movement. I thought that would be a nice last step before finishing up the first development sprint of the project, since I had more time than I imagined.
After I created that script, I just needed to update the movement section of the Update function that calls it. This took a little longer than I wanted and is not quite perfect, but I am happy with it for the moment.
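For reference, the conversion itself boils down to something like this (a simplified sketch; ConvertToCameraSpace is a placeholder name and Camera.main is assumed to be the gameplay camera).

// Sketch only - converts a world-space movement vector into a direction relative
// to where the camera is facing, flattened onto the XZ plane.
Vector3 ConvertToCameraSpace(Vector3 vectorToRotate)
{
    Transform cam = Camera.main.transform;

    // Flatten the camera's forward and right so the character doesn't tilt up / down
    Vector3 cameraForward = cam.forward;
    Vector3 cameraRight = cam.right;
    cameraForward.y = 0f;
    cameraRight.y = 0f;
    cameraForward.Normalize();
    cameraRight.Normalize();

    // Recombine the input along the camera's flattened axes
    return cameraForward * vectorToRotate.z + cameraRight * vectorToRotate.x;
}

The idea is that, in Update, the raw movement vector gets passed through this conversion before being used for rotation and for moving the character.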
Once this was completed, I was happy enough with Week One's prototype progress. I created a basic Canvas and, using Unity's TextMesh Pro, added a little demo test message on the lower half of the screen. I also used Gridbox Prototype Materials to create just a little bit of a test scene.