AUS230 Learning Journal - W4
- Clement Chan
- Mar 5, 2020
- 7 min read
Updated: Apr 5, 2020
Monday Session
On Monday, Joe, Phil, Roy, Ron and I recorded the critical Foley sounds for the "A Quiet Place II" trailer. Our session ran from 11am to 5pm, during which we tracked all of the Foley we deemed necessary for the final deliverable. I had briefed everyone to bring bottles, cans and any household items they wanted to record for the Foley session.
Figure 1 - Foley capture session with all the items used for making sound to picture
Key notes from the session:
Rode NT1 for capturing items like cans/bottles to get the high-frequency content
Rode NTG4 for capturing Foley like footsteps or a specific hinge creak from a trolley
Talkback mic (an SM58) so we could hear the people in the recording room
Items to be recorded were placed at the ready around the edge of the room for use during the session (see Figure 1)
Joe set up his laptop in the live room, playing back the specific sections of the scene that we had outlined in an Excel sheet. This allowed us to move through the session quickly and record the necessary Foley. I was in the studio control room monitoring the output with Ron while Joe, Roy and Phil were in the live room setting up.
After the session, everyone who attended got a copy of the captured Foley. It turned out to be an extremely productive day.
Tuesday Session
On Tuesday I confirmed that Lark (Tri 5 Film student) will be interviewed for the podcast project we decided to create as our final deliverable. She will give an interesting perspective on the effects of the coronavirus, or as we now call it, COVID-19. I also confirmed that Oliver (Tri 6 Animation student) is keen to be interviewed for the project. Other interviewees include Gary from audio, who is currently under lockdown in Shanghai, China, and an Indonesian friend of Gilang's who goes to QUT.
Wednesday Class
During this class session we were put into groups of 3-4 and assigned to different studios to practise dialogue editing. We had to extract three characters from badly recorded room mic, lapel mic and boom mic recordings. We first split the three characters onto their separate tracks, then used clip gain in Pro Tools to balance the volume levels of each character and removed any extraneous noise such as pops, clicks and environmental sounds like table bumps. I applied an EQ on all the tracks to cut out low-level rumble and dipped some high frequencies to balance each mic; since several mics were used in the recording, I had to make all of them sound similar and diegetic to the scene.
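As a side note on what that low-cut is actually doing under the hood, here is a minimal sketch of the standard RBJ-cookbook biquad high-pass that a DAW EQ's low-cut band is typically built on (my own illustration, not anything Pro Tools exposes; the 80 Hz corner and Q of 0.707 are assumed values for dialogue rumble, not settings from the class):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Minimal RBJ-cookbook biquad high-pass, the same maths behind a DAW
// EQ's low-cut band. fc and q are illustrative defaults for dialogue.
std::vector<float> highPass(const std::vector<float>& in, double fs,
                            double fc = 80.0, double q = 0.707)
{
    const double pi = 3.14159265358979323846;
    const double w0 = 2.0 * pi * fc / fs;
    const double alpha = std::sin(w0) / (2.0 * q);
    const double cw = std::cos(w0);
    const double a0 = 1.0 + alpha;

    // High-pass coefficients, normalised by a0.
    const double b0 = (1.0 + cw) / 2.0 / a0;
    const double b1 = -(1.0 + cw) / a0;
    const double b2 = (1.0 + cw) / 2.0 / a0;
    const double a1 = -2.0 * cw / a0;
    const double a2 = (1.0 - alpha) / a0;

    std::vector<float> out(in.size());
    double x1 = 0, x2 = 0, y1 = 0, y2 = 0;  // filter memory
    for (std::size_t n = 0; n < in.size(); ++n)
    {
        const double y = b0 * in[n] + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
        x2 = x1; x1 = in[n];
        y2 = y1; y1 = y;
        out[n] = static_cast<float>(y);
    }
    return out;
}
```

Everything below the corner frequency gets rolled off at 12 dB per octave, which is why a low-cut removes air-conditioning rumble without touching the voice.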
Wednesday Night

Figure 2 - (Jeff van Dyck, n.d.)
The highlight of this meetup was that Jeff van Dyck (audio director for Alien: Isolation and Hand of Fate) and Zander Hulme (composer and sound integrator for Sling Kong) gave a presentation, "Cutting the Chord: Exploring Interactive Music". This was a short version of what they would be demonstrating at GDC San Francisco the following month.
Their talk was eye-opening and well worth the time, even though I was tired after an early Wednesday start. I got to listen to a GDC talk without having to go to San Francisco. ;)
After the talk, those of us who stayed behind met Jeff and Zander at the local pub for networking over dinner. I had a good conversation with Jeff about the game I was working on for my final project, and he seemed very interested in the rhythm game my team was building. He suggested that instead of letting the game engine handle the timing of the beats, I should use "callback" functions within FMOD so that the timing is offloaded to the FMOD sound engine. Zander chipped in that if I let the game engine handle it we would get timing issues: inside Unity the sound engine and the game engine run on different threads, which inherently introduces delays of milliseconds and causes the rhythm of the game to drift out of sync. Talking with Zander, I also found out that he is funded by Screen Queensland and holds a residency position while working on sound projects to get his business up and running.
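To make Jeff's tip concrete, here is a minimal sketch of how a timeline-beat callback looks with the FMOD Studio API (written in C++ here; the Unity C# wrapper exposes the same calls). FMOD invokes the callback from its own thread each time the event's timeline passes a beat, so the game loop only has to poll a counter rather than keep musical time itself. The wiring and the atomic counter are my own illustrative choices, not code from our project:

```cpp
#include <fmod_studio.hpp>
#include <atomic>

// Beat counter written by FMOD's thread, polled by the game loop each frame.
std::atomic<int> g_currentBeat{0};

// Called by FMOD whenever the event timeline passes a beat marker.
FMOD_RESULT F_CALLBACK beatCallback(FMOD_STUDIO_EVENT_CALLBACK_TYPE type,
                                    FMOD_STUDIO_EVENTINSTANCE* /*event*/,
                                    void* parameters)
{
    if (type == FMOD_STUDIO_EVENT_CALLBACK_TIMELINE_BEAT)
    {
        auto* beat =
            static_cast<FMOD_STUDIO_TIMELINE_BEAT_PROPERTIES*>(parameters);
        g_currentBeat.store(beat->beat);  // beat number within the bar
    }
    return FMOD_OK;
}

void startMusic(FMOD::Studio::EventInstance* music)
{
    // Only ask FMOD to call us back for beat events on this instance.
    music->setCallback(beatCallback,
                       FMOD_STUDIO_EVENT_CALLBACK_TIMELINE_BEAT);
    music->start();
}
```

Because the beat positions come from FMOD's own playback clock, they stay locked to the audio even when the game's frame rate stutters, which is exactly the drift problem Zander described.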
After a few good laughs and some more stories from their industry experience, the night wound down; Jeff and Zander left, and the rest of us followed soon after.
Friday Session
We got some feedback on our final project plans after our pitch regarding the delivery and scope of the final deliverable. For my project, Nick suggested I follow the example of the BBC podcasts; a case study of one of them should help me understand how to create the environment for the recording.
For our studio session later in the day we mastered tracks by Jack and Alex. Guy spent the rest of the day demonstrating how to master and apply compression techniques to a well-mixed track. Making things louder was important, but ensuring that our mix was mastered for the correct platforms and adhered to each platform's LUFS target was equally important. Balance mattered just as much when mastering. An analogy Guy used was that mixing and mastering is like a "see-saw": when you EQ or compress the low end, you have to keep it in balance with the amount you do to the high end.
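As a concrete example of checking those targets, here is a small sketch of measuring a bounce against a loudness target using the open-source libebur128 library (an illustration, not the tool we used in the studio; the -14 LUFS target is a commonly quoted streaming figure and my own assumption, not a number Guy gave us):

```cpp
#include <ebur128.h>
#include <cstdio>
#include <vector>

int main()
{
    const unsigned int channels = 2;
    const unsigned long samplerate = 48000;

    // Stand-in buffer: one second of interleaved stereo float samples.
    // In practice this would be the decoded audio of the mastered track.
    std::vector<float> samples(channels * samplerate, 0.0f);

    ebur128_state* st = ebur128_init(channels, samplerate, EBUR128_MODE_I);
    ebur128_add_frames_float(st, samples.data(), samples.size() / channels);

    double lufs = 0.0;
    ebur128_loudness_global(st, &lufs);  // integrated loudness in LUFS

    const double target = -14.0;         // assumed streaming target
    std::printf("Integrated: %.1f LUFS, gain to target: %.1f dB\n",
                lufs, target - lufs);

    ebur128_destroy(&st);
    return 0;
}
```

The useful part is the last line of output: rather than just "making it louder", you measure where the master sits and apply only the gain needed for each platform's target.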
Week 4 Summary
Wow, this week really got into the heavy side of the workload and did not leave much time for me to digest the information. Even so, after taking the time to write this blog I reflected on the events and found that I have learnt a good deal. On Monday I worked with several of our classmates and found it very effective that we knew to set up a talkback mic early (something we learnt in earlier trimesters) so that everyone in the studio could talk to each other quickly and give effective feedback while recording all of the separate Foley items.
Working with Joe has always been productive since an earlier trimester, and we were both able to get straight onto the job by setting up the items around the room so that we could record quickly and purposefully. Before the session I prepared an Excel sheet with the required sounds and shared it with Joe. We moved quickly through the identified scene and the items to be recorded, and I found that this made our day more productive. Of course we deviated from the list quite a bit as our team members debated over the critical parts of the recording, but having the skeleton plan helped move the day along. Setting up the session beforehand with the two different mics gave us options for recording specific sounds. For instance, we tried to use the NT1 for the whipping noise in the trip-wire scene of the AQPII trailer, but it also picked up a lot of Joe's hand and body movement, which we did not want. Using the shotgun mic (NTG4) we were able to capture just the whipping noise from Joe whipping a plastic spatula near the end of the mic, and this sounded perfect for the recording.
During the Wednesday session Nick gave a class on dialogue editing. It was a good refresher to work in Pro Tools and split up the recording from a sample project. The sample project came from a student shoot where they acted out a scene from a movie and captured the sound with lav mics on two of the three actors, a room mic and a shotgun mic. What made it more challenging was that several characters were on one recording, and the different mic types captured different ambience and levels. Following Nick's method, we split the talent onto their respective vocal lines and cut in the clearest recordings with the least noise to get an overall sound. The next step was to make all the separate characters sound like they were recorded in the same room by EQ'ing each dialogue line so that the characters seemed diegetic to the scene. Lastly, we found a section of the room mic with the right atmos and fitted it throughout the track to give the scene life. After doing this we were able to make the dialogue fit the scene's tone.
Going to the game dev night was highly rewarding, and I was able to meet one of the industry's veteran game audio engineers, Jeff van Dyck. I gained insight into how he and Zander were able to get funding from Screen Queensland to support the development of game audio in QLD, as well as a super useful tip on using callback functions for triggering in-game events in a rhythm game. Afterwards I got Jeff's Twitter handle, gaining an industry contact who was interested in the final project I was working on.
During the Friday session I took Nick and Guy's advice and found that the BBC has great podcast examples. After listening to several BBC podcasts I understood what they meant about how the narrator or interviewer can take you to a different place through sound design. An example is their world environment podcast, where they incorporated city sounds from the location being discussed and used music suited to that part of the world to immerse the listener. I can definitely use this knowledge; it is the same principle I applied in the dialogue editing on Wednesday. I need the correct atmos to make each interview or scene in my final project feel diegetic. If I interview someone in a cafe there should be cafe ambience, and if we are at a bar there should be clinking glasses and background chatter.
This was a highly valuable week and I have learnt a great deal about sound design. Correct ambience placement can immerse the audience in the narrative of a movie or transport the listener to the locale of the recording.
References:
Jeff van Dyck [Image]. (n.d.). Retrieved from http://www.game-ost.com/persons/22/jeff_van_dyck/