Behind The Scenes: The Van Jones Video

It's been just under a week since I posted this video - thanks to everyone who liked and/or shared it on Facebook! In case you missed it, you can watch it again below:

A number of people have been curious about how I went about making this video, so I'll try to take you through the creative process as well as some of the more technical aspects that go into making a video like this.

First, the inspiration for this video came from the bassist Dywane "MonoNeon" Thomas, Jr., who is a genius at adding his bass sound to all sorts of unexpected sources.

To come up with my part, I first pulled the audio into one of my favorite iOS apps, AudioStretch. It's very similar to the Windows/Mac software Transcribe!, letting you view the audio waveform, slow down playback, set A-B loop points, and so on. It really makes transcribing a lot easier and faster, since you can easily isolate one section at a time to figure it out.

In the simplest form, I tried to figure out what pitches Van Jones's speaking voice was approximating, but his voice would often move through a range of pitches on a single note. That leaves you the liberty to decide how to interpret it, with freedom to make choices about pitches and their harmonic implications, register, and so on. For faster or more emphatic passages, I would often try to align the pitches with a pentatonic scale, or with fragments of different pentatonic scales. This feels more idiomatic to the bass - some of the faster pentatonic runs feel like something a gospel bassist such as Andrew Gouche might play, for example. I also used some double and triple stops to give emphasis to certain words or phrases. Although certain sections convey the feeling of being in a particular key, staying in one key for too long can have an undesired effect, since our brains associate common tonal motions with certain emotions. Sometimes that juxtaposition works, but for much of the piece the atonal/chromatic approach is actually more supportive of the spoken text, since it is less distracting.

Since I wanted to do the video in one take, the bulk of the work in this project was really nailing the synchronization of my bass with the spoken words. Internalizing something that is largely atonal and in free time is undoubtedly more challenging than learning a typical piece of music. While I was always conscious of the underlying harmony that the pitches created, the harmonic foundation changes so quickly that it was only a small help in mentally organizing the piece as a whole.

I recorded the piece using my Fodera Imperial Elite 5-string bass, tuned B-G (see below). It's an older Fodera bass and has Bartolini pickups as well as the first-generation Fodera/Pope preamp. For recording, I used the bass in passive mode since the onboard preamp contributes a little bit of undesired noise (I will add that the newest Fodera/Pope preamp is much quieter, and I wouldn't hesitate to record with it engaged). The Bartolini pickups are a custom OEM wind for Fodera, and they are quite transparent sounding, with fairly low output.

My 2000 Fodera Imperial Elite

I ran the bass through a Focusrite Scarlett interface connected to my MacBook Pro running Logic Pro X. The tone you hear on the video is largely unprocessed: there is some compression and a subtle delay, but no EQ other than rolling off the bottom end to clean up the sound slightly. The bass itself has a strong midrange voice due to the mahogany body, which is great for a contemporary dark, bridge-pickup-oriented solo tone. I usually wear my Shure SE215 in-ear monitors, which sound great and stay in my ears really well. I also have a pair of Audio-Technica M50x headphones that I use on occasion, since they are an industry-standard headphone and will often reveal trouble spots in your mixes, but for long practice sessions the in-ear monitors are more comfortable.

I shot the video on my iPhone 6S using an iOS app called FiLMiC Pro. It gives you more control over the video than the built-in video app, allowing you to manually change and lock parameters such as focus, exposure, etc. These manual controls were probably overkill for this project, but I find them useful nonetheless. Once the audio was mixed in Logic, I did the final video editing and synchronization of the audio tracks in iMovie.

One thing that was humorous to me was how many people seemed to really enjoy the stutter sound around 0:36-0:37. It's just a matter of getting dead/ghost notes on the bass by muting the strings with your left hand, but when they're perfectly aligned with Van Jones's voice, it sounds like the bass is stuttering.

I hope this explanation sheds some light on the creative and technical aspects that went into making this video. Thanks for reading!