Categories
Sound For Screen

Ghost In The Shell

I have been asked by a 2nd year student to record, sound design and score a short film they are currently working on. I intend to use an excerpt of this as my submission; however, until they have finished filming I won't be able to start any sound design or foley. As a result, I have decided to set about recording, sound designing and creating foley for the chase scene in the animation 'Ghost in the Shell' as practice for my final submission.

In order to start the process I have decided to record the atmospheres first. Having been introduced to the Sennheiser Ambeo mic in our spatialisation lessons, I thought it could be interesting to use this mic for the purpose, capturing a 360-degree sound field and making the experience all the more immersive.

After renting both the Ambeo mic and the Zoom F4, I attempted to take some ambisonic recordings in my room but struggled to understand why the recorded files on the SD card showed up on my computer as plain WAV files instead of ambisonic files.

As a result I decided to undertake some more research into how to successfully record ambisonically with this equipment. I learnt about the importance of linked gains in ambisonic recording, so that the levels of each channel are matched exactly. I also realised I had not set the right recording format on the F4, so I changed it from stereo to Ambisonics A-format.

An interesting thing to note is that if your device does not have the capacity to link the faders on each channel, you can use a pink noise generator to get the pre-fader levels of the separate channels matched.
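As a quick sanity check on a pink noise test like this, the sketch below (my own, not part of the F4 or Ambeo workflow) reads a hypothetical four-channel test recording called pink_noise_test.wav and prints each channel's level relative to the first; if the offsets aren't close to zero, the pre-fader gains aren't matched.

```python
# Sketch: verify that the four A-format channels of a pink noise test
# recording sit at matched levels. "pink_noise_test.wav" is a hypothetical
# file in which the same pink noise reaches every input at the same level.
import numpy as np
from scipy.io import wavfile

rate, data = wavfile.read("pink_noise_test.wav")   # data shape: (samples, channels)
data = data.astype(np.float64)

rms = np.sqrt(np.mean(data ** 2, axis=0))          # one RMS value per channel
offsets_db = 20 * np.log10(rms / rms[0])           # level relative to channel 1

for ch, off in enumerate(offsets_db, start=1):
    print(f"channel {ch}: {off:+.2f} dB relative to channel 1")
# Offsets of more than a fraction of a dB mean the gains need re-matching
# before the A-format capture is converted to B-format.
```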

I now intend to record the atmospheres using the city as my resource.

Though atmospheres are normally recorded with more conventional setups, out of an interest in ambisonic field recordings I decided to rent out the Sennheiser Ambeo VR mic along with the Zoom F4 portable multitrack recorder. Whilst I aim to experiment with this equipment in a variety of ways, I feel the urge to record atmospheres as a way of exploring the widened sound image given by ambisonic recordings, bringing an immersive element to my sound piece.

Categories
Sound For Screen

Booming – Location Sound Recording

Taking forward the knowledge gained from reading 'Production Sound Mixing', I did further research into booming on a film set, prompted by Jessica's mention of it in a previous lecture.

All in all, in order to be good at booming, you must be one of the most perceptive people on the set, staying aware of microphone capabilities, where the frame starts and ends and possible shadows or reflections.

In order to get the most natural sound, one must hold the microphone as close as they can to the person speaking without being seen in the frame of shot.

I borrowed a boom pole, Zoom F4 and shotgun mic from the kit room and put them to the test in my own time. I quickly learnt how tiring it can be to hold the boom up for extended periods, especially when recording audio during a wide shot. As a result I learnt the correct ways of holding the boom pole to relieve some of the strain on my arms, making use of height and of my shoulders.

Categories
Personal/ Relevant Sound For Screen

Production Sound Mixing

After reading the book 'Production Sound Mixing' by John J. Murphy, I came away with many tips and ideas on how to successfully record sound for moving image.

Regarding wind: what one might think is only a gentle breeze that can't possibly do anything to a recording most likely will, rendering the recording unusable. Use windshields, any type of shelter, and ultimately try to avoid wind at all costs.
Another must is to verbally ID every take and to rename files something useful, saving loads of time later. Stating the location, the time and what you're recording at the beginning or end of each take is key to avoiding confusion when going back over the recordings.
It is also good practice to always monitor with your headphones on while recording. You don't naturally hear what your microphone and recorder are hearing, so wearing headphones is necessary in order to identify anything disruptive, like wind or overhead noise, and to check that your recording levels are set correctly. It will also help you identify interesting details in the sound environment that you might want to focus on more.
Additionally, always record in WAV (uncompressed audio) format, in stereo, in order to ensure the highest quality of sound. Whilst the file size may be bigger, it can always be reduced later; the quality of a source recording originally captured as a low-quality mono MP3, on the other hand, can never be increased.
Lastly, when watching recording levels it is better to have them set slightly too low than too high: one sudden increase in volume in the sound environment and the sound will start clipping, causing digital distortion that is impossible to get rid of and will ruin the recording. If the sound is a bit too quiet, you can always boost it in editing later on.
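To illustrate why this advice holds, here is a tiny numpy sketch (my own illustration, not from the book): a take that clips loses its peaks for good, whereas a take recorded 12 dB too low can simply be scaled back up.

```python
# Sketch: why leaving headroom beats recording too hot.
# A 440 Hz test tone stands in for an unexpectedly loud moment.
import numpy as np

sr = 48000
t = np.arange(sr) / sr
signal = np.sin(2 * np.pi * 440 * t)

# Too hot: anything above full scale is clipped by the converter.
too_hot = np.clip(signal * 2.0, -1.0, 1.0)
turned_down = too_hot / 2.0              # turning it down later does NOT restore the peaks
print("clipped take, max error:", np.max(np.abs(turned_down - signal)))   # large

# Too quiet: recorded 12 dB low, then boosted in the edit.
quiet = signal * 10 ** (-12 / 20)
boosted = quiet * 10 ** (12 / 20)
print("quiet take, max error:  ", np.max(np.abs(boosted - signal)))       # ~0
```

The only real cost of the quiet take is that the noise floor comes up with the boost, which is usually far less damaging than distortion.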

Categories
Collaboration (VR)

Combining Individual Contributions & Mixing

After finishing all our respective parts of the Phonebooth game, we set a date to meet up and mix the project together. This was a decision we made as a group so that we could all have an equal say in how the final product sounded. Meeting at mine, we set about combining all the audio we had recorded and made, but only after we had discussed and agreed on what we thought were the aesthetic requirements of the game. These included bringing the user interface sound FX as well as the dialogue to the forefront of the mix in order to lead with the narrative, whilst the music and foley remained in the background. Some last-minute touch-ups on the overall flow of sounds within the game were also made.

Bringing all our individual elements together was very satisfying, as it brought the game to a new level of immersion. Without the other elements my outlook on the game had been fairly unimpressed, due to the minimal nature of the sounds needed. However, when they were combined with my group members' elements, we were all extremely pleased at how well our independent work complemented each other's. Surprisingly, minimal processing was needed, as each of our elements held its own space in the spatial image of the mix-down.

Categories
Collaboration (VR)

Atmosphere Design

When I eventually got round to designing the atmospheres, I intended to create immersion using the sound of howling wind. I found this changed the entire nature of the black space in the game, turning it into a more threatening and confusing place to be, echoing the mental state of the character. Keeping in line with the narrative of the game, the dialogue goes on to explain how the main character's loved one died because he drove into a storm, and I thought recreating storm-like conditions would not only increase the feeling of isolation but almost act as a foreshadowing of what we are about to be told, or rather a memory that the character cannot seem to escape from.

Inspired by the hand-turned wind machines used by Ben Burtt in the animated film WALL-E, I was tempted to create wind from foley somehow, but given the time constraints, and the potential inconsistencies in sound this method might produce, I decided to take more of a sound design approach within Ableton. Using a plugin called Audio Wind, I was able to fashion the sound of the wind and tailor it exactly how I wanted, giving it a howling, whistling feel in order to enhance the eeriness of the game. I felt that not much else was needed in the parts of the game that involve the black space, as I wanted to avoid making the sonic environment too busy, aware that Inaki's music would also have to overlay my contribution.
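I did all of this inside the plugin, but the underlying idea, noise pushed through a narrow resonant filter whose centre frequency drifts slowly, can be sketched in a few lines of Python. This is only the general technique as I understand it, not Audio Wind's actual algorithm, and the file name is my own.

```python
# Sketch of a howling wind: white noise through a resonant band-pass
# (Chamberlin state-variable filter) whose centre frequency drifts slowly.
import numpy as np
from scipy.io import wavfile

sr = 48000
n = sr * 6                                           # six seconds
noise = np.random.randn(n) * 0.3

# Centre frequency wanders between roughly 300 Hz and 900 Hz.
centre = 600 + 300 * np.sin(2 * np.pi * 0.15 * np.arange(n) / sr)

q = 12.0                                             # high resonance = whistling character
low = band = 0.0
out = np.zeros(n)
for i in range(n):
    f = 2 * np.sin(np.pi * centre[i] / sr)
    low += f * band
    high = noise[i] - low - band / q
    band += f * high
    out[i] = band

out /= np.max(np.abs(out))
wavfile.write("wind_sketch.wav", sr, (out * 32767).astype(np.int16))
```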

Sticking to the original recordings of the phone booth I had taken before the first crit, I chopped and crossfaded the sections of them that I deemed suitable for contextualising the non-diegetic sounds that surrounded the phone booth. In order to maintain continuity, however, I found myself sound designing the sound of a car, automating its volume, filter and reverb in order to introduce distance, albeit manually, as well as to glue the ambience together.

Categories
Collaboration (VR)

Re-recording Foley & FX Design

In order to implement some of the techniques I have learnt over the time I have spent researching, I decided to have another look at the gameplay footage of the Phone-booth game. Taking into account the narrative aspect and the themes of depression and isolation, I decided to record some more foley sounds to increase the presence of those themes. Recording myself breathing heavily and applying reverb really did enhance the sense of space when the player is encapsulated in darkness at the beginning and end of the game. I did the same with other elements of sound that required it, but limited these effects to the black space in which the player finds himself before and after entering the phone booth, exercising the idea of juxtaposition I had been introduced to earlier in my research.

Due to a lack of communication with the Games students we were ultimately unable to retrieve the Unity-compatible file for the game, and so were limited to working in Ableton. As a result, I re-recorded certain foley, such as the footsteps, as the one-shots we had taken were no longer relevant to the software we were using.

Adding to this, although my primary concerns were the atmospheres and the foley, I took it upon myself to create some whooshing sounds in order to introduce a sense of movement to the game, specifically when the player enters the shaft of light that the phone booth stands in. Using the sound of paper being flicked through the air close to my microphone, I applied some filtering and reverb and boosted certain frequencies, whilst also layering it with a sine wave and another whooshing sample I had made myself a while back, to create the transitional effect that defines the movement in and out of the phone booth.
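The layering part of this can be sketched roughly in Python: a recorded texture shaped by a rise-and-fall envelope with a downward sine sweep underneath. The file name is a hypothetical stand-in for my recording, and the reverb and EQ I applied in Ableton are left out.

```python
# Sketch of the whoosh layering: an envelope-shaped recording plus a
# downward sine sweep. "paper_flick.wav" is a stand-in for my recording.
import numpy as np
from scipy.io import wavfile

sr, flick = wavfile.read("paper_flick.wav")
flick = flick.astype(np.float64)
if flick.ndim > 1:
    flick = flick.mean(axis=1)                        # mix to mono
flick /= np.max(np.abs(flick))

n = len(flick)
envelope = np.sin(np.pi * np.linspace(0, 1, n)) ** 2  # swells in, dies away

freq = np.geomspace(600, 80, n)                       # sine sweep, ~600 Hz down to ~80 Hz
phase = 2 * np.pi * np.cumsum(freq) / sr
sine = 0.3 * np.sin(phase) * envelope

whoosh = 0.7 * flick * envelope + sine
whoosh /= np.max(np.abs(whoosh))
wavfile.write("whoosh_sketch.wav", sr, (whoosh * 32767).astype(np.int16))
```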

Categories
Global Sonic Cultures

How Sound Design Makes Games Feel Lonely

Whilst researching the game What Remains of Edith Finch, I felt an encompassing theme of loneliness at times and wondered how the sound design might have aided in creating such a feeling. Also considering the nature of our group's game 'Phonebooth', in which the protagonist feels depressed and alone due to the events that have unfolded in his life, I thought it would be useful to take a closer look at how to recreate what I felt whilst watching What Remains of Edith Finch.

After watching a couple of videos and visiting forums and websites discussing loneliness in video games, I made a lot of fascinating discoveries. When considering the way a train horn might sound in the distance at night and the way it cuts through the quiet darkness, one begins to realise that it is not so much the actual silence that makes us feel alone, but the sounds that break it. Sounds that remind us of the space we're in have a lot of power to elicit feelings of loneliness, as this is how we experience it in reality. For example, wind, and how it reacts with the things around us, can be a telltale sign of the size of the space we're in. Furthermore, other sounds, such as the ticking of a clock or the dripping of a tap, come to the forefront of our listening when we are in a quiet space. In a public setting such sounds would sink into the distance as we passively hear and subconsciously register them, bombarded as we are with many different kinds of noise. However, when there are only a few sounds loud enough to hear, they become a lot more memorable, igniting a sense of loneliness in whatever sense of the word is relevant to the given situation.
Moreover, sounds made by the character themselves can also elicit feelings of loneliness. One example would be footsteps. When paired with relative silence they can create a sense of isolation, and I have discovered that sound designers will actually increase the volume of footsteps, as well as other sounds, to an unrealistic level in certain settings in order to make them stand out more in the mix. When the primary sound a player hears is the one they're making, the message being communicated is that they're alone.

Reverb can also help increase the feeling of loneliness in a game, as the spatial element it brings helps conjure up the space one is in, perhaps making it seem bigger than it is. Essentially, it is not only the sounds that create loneliness, but how those sounds are heard.
Non-diegetic sounds, too, can increase the feeling of loneliness, as they can remind us of what we are not experiencing or are missing out on. An example could be the sound of muffled speaking in a room next to us.
Lastly, juxtaposition can also be a good tool for achieving a similar effect, as the contrast serves, again, as a reminder of what was and what is. For example, in the game 'Outer Wilds' we can compare the calming, cruising music that plays whilst piloting the spaceship to the barren silence, enhanced only by the sounds of breathing and the character's jetpack, once you are separated from your ship.

After all this research, a game that I played recently, 'Shadow of the Colossus', comes to mind. Set in a vast landscape where there is seemingly no one but the main character and his horse, all of the sounds play an important role in consistently maintaining the feeling of isolation throughout the game, whether that be the sound of a distant eagle overhead or the reverb-laden clatter of the horse's hooves when exploring abandoned ancient ruins. Even the thunderous crashes made by the giants of stone serve to remind us of just how vast the world is, and how alone the character subsequently is.

Categories
Collaboration (VR)

What Remains Of Edith Finch

‘What Remains of Edith Finch’ being the main reference given to us by the Games students for their Phone-booth VR game, I decided to delve further into the methodologies behind it.

It is essentially a narrative-driven, first-person game that uses storytelling, gameplay mechanics, music and a variety of visuals and movement to create a collection of short stories, meant to shed light on the odd events that caused the members of the Finch family their tragic demises. When I began to watch a walkthrough video of the game, I instantly noticed the uncluttered nature of the opening environment. Set in the woods the main character is walking through, the sonic feel is spacious, allowing us to enjoy the atmosphere of the woods whilst also registering every sound made through interaction, as well as the narrator's overarching voice.

As the designated foley artist, I tried my best to pay special attention to the foley sounds throughout the game. The first thing I noticed, however, which is also one of the first sounds we hear in the game, was the footsteps and how they seemed a little too robotic, without the variation needed to recreate the feel of reality. Once the character reaches the house, I noticed once more that the sounds of the footsteps were inaccurate relative to the number of steps being taken, and did not sound appropriate to the surfaces being trodden on. Learning from this, I will make sure to pay special attention to variation and accuracy when re-recording the foley for our group's game.
On the other hand, I noticed that in other parts of the game footsteps had been omitted altogether, which I thought was done tastefully, as it gave space for the music, narration and other sounds to take precedence. Being in a world that has been created for the purpose of narration means the creator can choose what is heard, and in a way this omission helped add to the suspension of disbelief one finds playing this game.

Considering the narrative nature of What Remains of Edith Finch, and it being the main reference, I began to see the Phone-booth VR game in a different light: less about the completion of tasks and more about the overall story or narrative. With this new perspective on our work, I wondered how else What Remains of Edith Finch could inform our game.
A couple of things I noted were that the music at many points was intertwined with the soundscape, helping to underscore the narration as well as direct the emotional undercurrents. As well as this, the music even became an interactive part of the game at certain points; for example, when winding the music box in the hallway to play the theme song. Lastly, descriptive sounds accompanied the narration at points, even being cued by it, with sounds brought in depending on what the narrator was saying. This might be a handy tool to help concretise the spoken narrative that takes place in the phone box in our game.

Categories
Collaboration (VR)

Mixing In VR

After our crits I decided it would be beneficial to do more research into mixing for virtual reality in order to bring an increased sense of sonic realism to the game.

While in two-dimensional media such as movies, mixing involves adjusting volume levels, appropriate panning and reverbs, for more interactive experiences like VR these qualities of the mix rely on distance and direction. In software such as FMOD and Unity, as opposed to Ableton (which is what I am most used to), we are able to automate and attenuate these qualities based on distance. For VR specifically, panning is replaced with HRTFs, or Head-Related Transfer Functions, which supposedly provide more accurate directional cues than panning.

To expand on this, an HRTF is seemingly a more advanced way of localising audio, as it is 'measured in a specialized facility under controlled conditions. During such a measurement, a sound is played through a loudspeaker from a certain direction and microphones placed at the entrance of your ear canal register how your ears ‘hear’ this sound'.

Moreover, while non-VR games can be played on any kind of speaker, this means somewhat compromising the quality of sound in order to ensure the final mix sounds good on all kinds of sound systems. VR headsets, however, are all used with headphones, allowing for more consistent audio reproduction. Taking this into account, when mixing for VR one has to be aware of user fatigue, which calls for volume levels that can be listened to for sustained amounts of time, as well as not cluttering the mix with too many sounds in the same frequency ranges.

Distance is also crucial to successful immersion within a game. For example, when considering audio cues, if a sound is loud despite being perceived as far away, it will most likely throw off the player and confuse them, breaking the illusion of immersion. The rule of thumb for physically accurate distance attenuation is apparently that 'the doubling of distance is a halving of intensity'. Reverb is another important tool that, if used effectively, can also create the illusion of distance: we perceive sounds that are far away as carrying more reverb relative to their direct signal.
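Reading the quoted rule as a halving of amplitude for each doubling of distance (the inverse-distance law, roughly 6 dB of level per doubling), the arithmetic behind distance-based mixing can be sketched as below. The function names, the one-metre reference and the simple linear reverb send are my own assumptions; in practice FMOD or Unity's spatialiser would apply curves like these for you.

```python
# Minimal sketch of distance-based mixing: inverse-distance gain
# (amplitude halves per doubling of distance) plus a crude reverb send
# that grows with distance. Names and constants are illustrative only.
import numpy as np

def distance_gain(distance_m: float, reference_m: float = 1.0) -> float:
    """Dry-signal gain for a source distance_m metres away."""
    return reference_m / max(distance_m, reference_m)

def wet_dry_balance(distance_m: float, max_m: float = 30.0):
    """The further the source, the more reverberant it sounds."""
    wet = min(distance_m / max_m, 1.0)
    return wet, 1.0 - wet

for d in [1, 2, 4, 8, 16]:
    g = distance_gain(d)
    wet, dry = wet_dry_balance(d)
    print(f"{d:2d} m: {20 * np.log10(g):6.1f} dB, wet/dry = {wet:.2f}/{dry:.2f}")
```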

All of this research is sure to help me when coming together with the rest of my group for the final mix-down, and it has only reinforced the importance of controlling and treating the relative qualities of each sound as a critical component of the mix.

Categories
Collaboration (VR)

Phonebooth – Designation Of Roles, Individual Contribution & First Crits

After creating an Excel file with my group members and organising all the different components needed for our game, we discussed which roles should be designated to whom. Given the task of taking charge of foley and atmospheres, I decided to take my handheld recorder to a phone box on a busy road near mine to recreate the events in the game as accurately as possible.

Before this however, I made sure to meet up with the games students one more time, in order to play the game again, refreshing the mechanics and the general feel of the game in my memory.

After getting them to send me a video file of the game, I overlaid it with the atmospheres and foley we had recorded prior to this. The experience proved to be quite rewarding in how the sonics came together. While there was some miscommunication on the musical side of things, I decided to make a simple composition using a Tibetan bowl, manipulated in Ableton, to create an eeriness in the parts of the game that required it. This was done with the intention of it being temporary, eventually to be replaced by a more collaborative effort for the final hand-in.

When evaluating our progress after showcasing our work to the rest of the class during the crits, it became apparent that we needed to focus more time on the immersive nature of the game, particularly as it is a virtual reality experience.