
Gameplay and UI Programmer | Created in 4 months | Team of 8
Unity | C#

  • Made with a team of 8 people - 2 designers, 3 programmers and 3 artists

  • Pitched and developed prototype game for client 'Society of Antiquaries of Scotland'

  • Designed and developed mechanics such as player movement, artifact interaction and a flyable drone

  • Contributed heavily to Programming, UI, Animation and Design

  • Communicated with artists and designers to deliver a vision

  • Received an A+ for this module and very positive feedback from anonymous peer reviews

"The main focus of Uncovered is to provide a fun educational experience on the process of archaeology to the user. Focusing on exploration and excavation, the player will manage their budget to help them effectively excavate artifacts at a dig site, allowing them to then explore a recreation of the habitats of humanity past."


Artifact Interaction

I designed and implemented the "Artifact Interaction Mode" within the game - the visuals and function of this mechanic were heavily inspired by games such as Resident Evil 7 (2017) and Tomb Raider (2013).

 

The player can look at an artifact, pick it up and rotate it to examine it and uncover details about the object. Each artifact has "Points of Interest", so as the player rotates it they can view different pieces of information depending on where they sit on the object.

The mechanic went through a few early design changes, mainly due to ideas that I suggested and pushed to include, such as the Point of Interest system - I felt it was important that when the player finds an artifact, their hard work pays off with a satisfying, in-depth look at it that is both visually pleasing and educational.

As a team we decided that this mechanic would use the WASD keys to rotate the artifact rather than the mouse - dragging the mouse to rotate an artifact would have been great, but for scoping reasons we decided to prioritise other mechanics and gameplay elements.

I wrote the Artifact Interaction script with its key values and behaviours exposed in the Inspector, so the designers could tweak them without touching code.
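A simplified sketch of how a script like this can be structured is shown below. The class, field names and the angle-to-camera check for Points of Interest are illustrative assumptions rather than the actual project code; the idea is that rotation speed and the Point of Interest list are serialized so designers can tweak them, WASD rotates the held artifact, and whichever point currently faces the camera drives the information shown.

```csharp
using UnityEngine;

// Illustrative sketch of an artifact interaction component (not the shipped script).
public class ArtifactInspector : MonoBehaviour
{
    [System.Serializable]
    public class PointOfInterest
    {
        public Transform anchor;        // empty child placed on the artifact mesh
        [TextArea] public string info;  // educational text shown for this point
    }

    [SerializeField] private float rotateSpeed = 90f;   // degrees per second, designer-tweakable
    [SerializeField] private float revealAngle = 25f;   // how directly a point must face the camera
    [SerializeField] private PointOfInterest[] pointsOfInterest;

    private Camera cam;

    private void Start()
    {
        cam = Camera.main;
    }

    private void Update()
    {
        // WASD rotates the artifact around the camera's axes.
        float yaw   = Input.GetAxisRaw("Horizontal");   // A/D
        float pitch = Input.GetAxisRaw("Vertical");     // W/S
        transform.Rotate(cam.transform.up,    -yaw   * rotateSpeed * Time.deltaTime, Space.World);
        transform.Rotate(cam.transform.right,  pitch * rotateSpeed * Time.deltaTime, Space.World);

        // Reveal the info for whichever Point of Interest currently faces the camera.
        foreach (var poi in pointsOfInterest)
        {
            Vector3 toCamera = (cam.transform.position - poi.anchor.position).normalized;
            if (Vector3.Angle(poi.anchor.forward, toCamera) < revealAngle)
            {
                Debug.Log(poi.info); // in the real game this would drive a UI label instead
            }
        }
    }
}
```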

[Drone screenshots and flight GIF]

The Drone

In the first few weeks of development, I knew that we would need a custom cinematic camera for trailer footage and came up with the idea of a flyable drone, which I pitched to the designers - the drone would be a simple side-tool the player could use to view the level from above. The designers weren't too convinced by the idea, so I created a prototype overnight. The Drone ended up being one of the essential tools in the game, allowing the player to scan for artifacts beneath the ground.

The controls for the Drone are based on games such as Watch Dogs 2 (2016) and Call of Duty: Warzone (2020), which both let the player pilot a drone to survey their surroundings. When the player presses the '3' key, a remote controller is brought onto the screen displaying a video feed of the Drone's camera, and the Drone is spawned above the player. I used post-processing effects such as film grain, lens distortion and colour grading to make the Drone camera feel like a video stream.
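As a rough illustration of how this kind of setup can be wired up in Unity (the names below are hypothetical, and the post-processing itself would live on a volume/profile rather than in code): pressing the key spawns the drone above the player, and its camera renders into a RenderTexture that the controller UI displays as the "video feed".

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of toggling a drone whose camera feeds a controller screen.
public class DroneLauncher : MonoBehaviour
{
    [SerializeField] private GameObject dronePrefab;     // drone prefab with its own Camera
    [SerializeField] private RawImage controllerScreen;  // "video feed" image on the controller UI
    [SerializeField] private float spawnHeight = 5f;

    private GameObject activeDrone;
    private RenderTexture feed;

    private void Update()
    {
        if (Input.GetKeyDown(KeyCode.Alpha3))
        {
            if (activeDrone == null) Launch();
            else Recall();
        }
    }

    private void Launch()
    {
        // Spawn the drone above the player and route its camera into the controller screen.
        activeDrone = Instantiate(dronePrefab, transform.position + Vector3.up * spawnHeight, Quaternion.identity);

        feed = new RenderTexture(640, 360, 16); // low-res target helps sell the "video stream" look
        activeDrone.GetComponentInChildren<Camera>().targetTexture = feed;

        controllerScreen.texture = feed;
        controllerScreen.gameObject.SetActive(true);
    }

    private void Recall()
    {
        controllerScreen.gameObject.SetActive(false);
        Destroy(activeDrone);
        activeDrone = null;
        feed.Release();
    }
}
```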

We initially discussed a "battery" system where the player had a limited amount of time to use the Drone within the level, but decided instead on a signal range mechanic that prevents the player from piloting the Drone beyond a certain range from their origin. This mechanic was inspired by Watch Dogs 2.
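A signal-range limit like this can be as simple as measuring the drone's distance from its launch point each frame and clamping it; a minimal sketch under that assumption (field names are not from the project):

```csharp
using UnityEngine;

// Illustrative signal-range limit: the drone cannot fly further than maxRange from its launch point.
public class DroneSignalRange : MonoBehaviour
{
    [SerializeField] private float maxRange = 60f; // metres from the launch origin

    private Vector3 origin;

    private void Start()
    {
        origin = transform.position; // launch point recorded when the drone spawns
    }

    private void LateUpdate()
    {
        Vector3 offset = transform.position - origin;

        // Hard clamp: the drone cannot be piloted outside the signal radius.
        if (offset.magnitude > maxRange)
        {
            transform.position = origin + offset.normalized * maxRange;
        }
        // In the real game, approaching the edge could also degrade the video feed or show a warning.
    }
}
```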

Player Movement

I solely designed and developed the player movement in the game - the perspective is first-person and the player walks around an archaeological dig site. It was very important that this mechanic felt as smooth and reliable as possible, since the player spends most of the game actively walking around.

It may have been simpler to use Unity's built-in First-Person Controller, but I opted to create a custom controller tailored to the needs of our game. Because I fully understood how the script was built and how it works, I could easily make tweaks and iterations. The most important goal for this mechanic was making it feel smooth, controlled and freeing.
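For context, a custom first-person controller in Unity usually boils down to a CharacterController driven by mouse look and smoothed WASD movement. The sketch below shows that general shape; the smoothing approach and all values are assumptions for illustration, not the project's actual script, but they show why exposing speed and smoothing fields makes iteration quick.

```csharp
using UnityEngine;

// Minimal custom first-person controller sketch (illustrative, not the shipped script).
[RequireComponent(typeof(CharacterController))]
public class FirstPersonMovement : MonoBehaviour
{
    [SerializeField] private Transform playerCamera;
    [SerializeField] private float walkSpeed = 4f;
    [SerializeField] private float mouseSensitivity = 2f;
    [SerializeField] private float moveSmoothing = 10f; // higher = snappier, lower = floatier
    [SerializeField] private float gravity = -9.81f;

    private CharacterController controller;
    private Vector3 smoothedMove;
    private float verticalVelocity;
    private float cameraPitch;

    private void Start()
    {
        controller = GetComponent<CharacterController>();
        Cursor.lockState = CursorLockMode.Locked;
    }

    private void Update()
    {
        // Mouse look: yaw the body, pitch the camera.
        transform.Rotate(Vector3.up, Input.GetAxis("Mouse X") * mouseSensitivity);
        cameraPitch = Mathf.Clamp(cameraPitch - Input.GetAxis("Mouse Y") * mouseSensitivity, -85f, 85f);
        playerCamera.localEulerAngles = new Vector3(cameraPitch, 0f, 0f);

        // WASD movement, smoothed so starting and stopping feel controlled rather than abrupt.
        Vector3 input = transform.right * Input.GetAxisRaw("Horizontal")
                      + transform.forward * Input.GetAxisRaw("Vertical");
        Vector3 targetMove = Vector3.ClampMagnitude(input, 1f) * walkSpeed;
        smoothedMove = Vector3.Lerp(smoothedMove, targetMove, moveSmoothing * Time.deltaTime);

        // Simple gravity keeps the controller grounded on uneven terrain.
        verticalVelocity = controller.isGrounded ? -1f : verticalVelocity + gravity * Time.deltaTime;

        controller.Move((smoothedMove + Vector3.up * verticalVelocity) * Time.deltaTime);
    }
}
```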

[GIF: first-person walking]

UI

I also contributed to UI design and development in the game. The designers created wireframes of the UI, which I then built in-engine using assets from our artists. After playtesting I would discuss the screens with the designers and we would make alterations to the UI. The images on the left show the "Specialist" page in the Journal (the in-game menu), which changed quite substantially from its initial wireframe.

I contributed heavily to the design and development of 4 out of the 5 Journal menus, and designed the hotbar and interaction UI.
