Will Cammell
Videographer
Video Editor
3D Artist
My interest in 3D software such as Blender began in January, when I saw an opportunity to film a music video for my good friend Henry Parworth, a fellow WMU student. Henry releases music under the name Headband Henry on Apple Music and Spotify, and in 2023 he amassed a following of 13,000 monthly listeners and 200,000 total streams. Henry works alongside Sam Peters, a producer who lives in his neighborhood; Sam's in-home studio occupies his entire basement due to the sheer amount of equipment he owns. Later on, I had the opportunity to film a recording session between the two. After the session, I knew I had just stumbled into a basement with two real-life geniuses.
I didn't know Henry personally, but I had friends who did. He had just released a music video for one of his songs. Honestly, I wasn't too impressed with the video. I thought, given how good he sounded, he needed something that matched the effort he puts into his music. I decided to direct message him on Instagram to see if he would be interested in working together. Fortunately, Henry is incredibly receptive to working with new people and agreed to hear me out. In the messages that followed, we agreed to meet up at Sprau Tower to discuss what I could do for him.
When Henry and I finally met, we instantly clicked on a personal level. I hadn't done anything in Blender yet, but since I had offered to work for free, I wanted to use this as my practice run with the software. One of the things I remember saying to him was, “With Blender, I can do whatever you could possibly imagine.” Looking back, that was incredibly cocky and naive. What I meant was, "I could potentially do anything you wanted," but that just didn't sound quite as compelling. Up to that point, I had only rendered out six still images with my friend on his PC, which had an RTX 3090 graphics card. When I made that promise to Henry, I hadn't accounted for what it would be like to render thousands of frames of animation on a vastly slower MacBook Air. This was the true beginning of my personal Blender journey.
Dreamcatcher Music Video
Several compositions were created using the same method: keying Henry's footage in After Effects, exporting it as a .MOV, and importing it into Blender as an image plane. Discussing each of them would be repetitive. The next significant composition in my journey was the reshoot Henry and I undertook a month before the music video's release. We needed to redo the original intro I had created for him; after reviewing it, we found it too random, too weird, and poorly color-graded. So I asked Henry what he would like to see in the video. He paused, looked off into the distance for a few seconds, and said, “A spaceship.” This was the first time I was fulfilling someone else's request in Blender, so I felt compelled to work harder than usual to realize his vision. Now the project was about more than just my own desires.
This shoot turned out to be one of the most formative days of the entire process. I went in thinking I could pull off a moving shot in front of the green screen using tracker points. However, when I imported my footage into Blender, I realized I couldn't use what I had captured: the background was too uniform, and I hadn't placed enough tracker points for Blender to solve the scene. Blender needs at least eight high-contrast tracking points visible on screen at all times for a usable camera track. Fortunately, I had filmed a completely still version of what we needed, just in case the tracking didn't work.
Another area where I gained experience was in directing talent on an entirely green screen set. We had six shots for the spaceship intro to fit within the first 17 seconds, so I had to cue Henry for all his actions. I bought three green screens and four steady lights with money I earned from my freelancing business.
With my newly purchased steady lights, I positioned the lighting on set to match the lighting inside the Blender spaceship as closely as possible. I placed three steady lights behind Henry, very low in the scene, so that they stayed out of frame while lighting the green screen evenly. I positioned my fourth and final light two feet in front of Henry, facing directly at him.

I did this because every other time I had composited Henry into a Blender scene, something about him was always slightly off, and my brain refused to believe he belonged there. After some thought, I suspected that the lighting captured on set did not match the relative position, wattage, and/or color of the virtual light sources in Blender. That suspicion turned out to be correct: simply matching your key light between the set and Blender gets you almost all the way to the result you're looking for, assuming all the lights are the same color.

In Henry's spaceship, a control panel sits in front of him, casting pale blue light back onto his face. I adjusted the color temperature (Kelvin) on my light accordingly and physically placed the light stand between his legs, roughly where the control panel sat in the virtual spaceship. The light gave me a beautiful blue reflection off his skin and a sheen on his forehead that looked natural once composited in front of the fake control panel in Blender.

The render took my MacBook Air 19 hours to export. This was the point when I realized I needed a more powerful computer, though I had yet to realize how much time I could have saved by properly optimizing my render settings.
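A quick way to sanity-check a color-temperature match is to convert Kelvin to an approximate RGB value and compare it against the virtual light's color. The sketch below is a plain-Python approximation based on Tanner Helland's published blackbody curve fit (not a Blender API; the function name is my own), handy for eyeballing whether a set light and a Blender light land in the same color range:

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate RGB triple (0-255) for a blackbody color temperature.
    Curve-fit approximation, reasonable from about 1000K to 40000K."""
    t = min(max(kelvin, 1000), 40000) / 100.0

    # Red channel
    if t <= 66:
        r = 255
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)

    # Green channel
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)

    # Blue channel
    if t >= 66:
        b = 255
    elif t <= 19:
        b = 0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda x: int(min(max(x, 0), 255))
    return clamp(r), clamp(g), clamp(b)
```

Around 6600K the output is near-white; pushing toward 10000K shifts the result blue, which is the direction I dialed my key light to mimic the control panel's glow.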