28/03/2022
Managing Tradeoffs
By Pablo Messier
Is Distortion a bad thing?
Do we really need to avoid Distortion as much as possible?
Does having no distortion at all make things better?
Is Mastering the art of Distortion?
Introduction:
For a long time, the craft of mastering has been seen as a kind of black magic, isolated from the rest of the music production chain. That was certainly true for me: I constantly asked myself not only what mastering is, but how it is even possible. How do you get something that is the same music, yet feels different? For years I wondered how mastering changes a production, not knowing that almost the same tools used in mixing are used in the discipline of mastering. One defining factor kept mastering a mystery to me: the processing is applied to the entire mix. That was a revelation when I first learned it, but then I had to discover how it is done. How do you know which tools to use? How do you know what to listen for, and what to look for? This is the defining factor of what the discipline of mastering is: quality control and critical listening, but more importantly, the art of managing tradeoffs.
The Battle Against Distortion
When people ask me what mastering is, my short answer is usually that mastering is distortion. Most people react with “wait, what? Distortion?” and my reply is yes (but not terrible clipping). As soon as the mixing process ends and the mastering stage starts, every change we make is in the context of the full frequency spectrum and the entire dynamic range of the music, so anything we do carries more meaning and a lot more weight because of the complex nature of the signal. In the mixing process you can easily get away with distorting one element pretty hard and masking it afterwards; the same cannot be said of mastering. Once a change is made to the entire mix, the results are much more complex and evident, bringing something to the table that wasn’t there to begin with.
“I have a way of thinking about mastering,
and this is going to sound a little odd when I first say it,
but I think that a Mastering Engineer has to be a Connoisseur of Distortion,
because ultimately what you’re doing is you’re taking a source file
and distorting it in some way in order to achieve some musical result”
Paul Blakemore
Limiters & Nonlinearities
Talking about mastering and how it is achieved inevitably brings us to the world of nonlinear processing, especially limiting, because limiting sits at the core of modern mastering work. You might say that the process of limiting is to set a threshold that the peaks of the music will never exceed, and technically you’re correct: the purpose of a limiter is to prevent you from overmodulating, but more importantly to help us define the level of the program. So far so good; you might think a limiter is a helpful tool because it prevents clipping, so what is there to worry about? It is a helpful tool, but here is a small detail: limiters are not transparent. I’ll venture to say that no nonlinearity is transparent, because of the nature of the tool itself; a nonlinearity is any process in which the change at the output is not proportional to the input, and this is where the tradeoffs come into play. Limiters have very fast release times; if you think about the nature of the circuit and analyze the time constants, they are made for tracking very fast, short-term intersample peaks (they are made to deal with transient energy), and this is where the distortion comes from. When I start to use a limiter, two questions arise immediately. The first: how much limiting is necessary? (this is music dependent). The second: how much distortion are you willing to tolerate? It is important to think about this when setting the loudness of the program, but more importantly about how to do it in a way that is invisible. Years ago Mastering Engineer Jonathan Wyner was interviewed on Pensado’s Place, and near the end of the interview Mixing Engineer Dave Pensado, talking about limiting, told Jonathan Wyner the following: “Limiting has become a bad thing, but with no limiting there is no Mastering”.
“What it’s designed to do is that
the threshold stays well above the body of the music,
and all it does is stop overloads.
When you set the threshold right down into the body of the music, and then
every peak is being shaved off, that’s what we do,
and that’s wrong, we are actually destroying the process”
Dr Robert William Taylor
In an interview many years ago, Mastering Engineer Jonathan Wyner and Dr Robert Taylor were discussing the topic of hyper compression, the process of over-compressing a signal. Robert Taylor stated that hyper compression involves any process that includes compression itself, multiband processing, limiting and clipping, to the extent that clipping may involve overdriving an analog-to-digital converter. Another aspect to consider in the digital domain is that many limiter plugins emulate analog limiters. This creates another “issue”: in analog equipment the color of the unit comes from a transformer or other electronic components, and many plugins that emulate their analog counterparts also emulate the color that comes from that transformer. This reinforces what Jonathan Wyner said once: “Many people think that a little distortion is cool, well you don’t know how much distortion you’re getting, it depends on what you’re playing back on.”
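To make the release-time tradeoff concrete, here is a minimal sketch of a brickwall peak limiter gain computer in Python; the threshold and release values are illustrative assumptions, not any particular unit’s design. With a very short release the gain starts to follow the waveform itself, and that gain modulation is heard as distortion; lengthen the release and the distortion drops, but the limiter holds the level down for longer.

```python
# Minimal sketch of a peak limiter gain computer (illustrative values, not a product design).
import numpy as np

def limit(signal, threshold=0.9, release_ms=50.0, sample_rate=44100):
    """Clamp peaks above `threshold` instantly, then recover the gain
    toward unity with an exponential release."""
    release_coeff = np.exp(-1.0 / (release_ms * 0.001 * sample_rate))
    gain = 1.0
    out = np.zeros(len(signal))
    for n, x in enumerate(signal):
        # Gain that keeps this sample at or below the threshold.
        target = min(threshold / max(abs(x), 1e-12), 1.0)
        if target < gain:
            gain = target                                    # attack: instantaneous
        else:
            gain = target + release_coeff * (gain - target)  # release: exponential recovery
        out[n] = x * gain
    return out
```

The shorter the release (try 1 ms versus 500 ms on a bass-heavy program), the more the gain rides individual waveform cycles and the more the result sounds like distortion rather than level control.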
Compression & Dynamics
Compression has been a unique tool that mastering engineers rely on, but the question is, what are we doing with it? Do we really understand compression? Those are the questions I ask myself the most in the context of mastering. According to Mastering Engineer Gavin Lurssen, Doug Sax, a unique Mastering Engineer and one of the first in the business, stated that “compression became very popular as soon as CDs went into the car”. If you think about it, the noise from the engine and of the car rolling down the road would have made the listening experience a bit challenging, so here comes distortion to save the day! Compression does not leave the same imprint as a peak limiter, but it adds something to the sound in return. Why use a compressor to make music more audible, especially in a car where there is constant noise? Simply put, evening out the loudness of the program makes the music much easier to listen to in that context. You might think this is a good thing, but it is not all positive: while compressors can help the music in many ways, there are things about compression that, to a trained ear, cannot be unheard, and here lies why mastering engineers try to avoid compression as much as possible. I’ll venture to say that all compressors share the same side effects: losing low end is the first thing that happens (a result of distortion), and some highs are accentuated, but the side effects most people don’t pay attention to are the fidelity of the signal and the imaging. To some extent compression changes the relationships in a mix, very similar to what happens with loudness-normalized playback.
“I think
that the first thing that you lose
with many tools
of all kinds including compressors
is the depth”
Bob Katz
So here lies the question once more: “What are we using compressors for?” The common definition is that compression is used to restrict dynamic range, but that doesn’t tell us what it is actually doing. Mastering Engineer Bob Olhsson stated that he sees “Compression as helping the Limiter”, and that “I typically shoot out at least three different limiters, on every job, listening for what they screw up”.
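As a point of reference for what “restricting dynamic range” actually means, here is a minimal sketch of a feed-forward compressor gain computer in Python; the threshold, ratio and time constants are illustrative assumptions. Everything above the threshold is turned down according to the ratio, and the attack and release constants decide how quickly that gain change follows the music, which is exactly where the audible side effects come from.

```python
# Minimal sketch of a feed-forward compressor (static curve plus attack/release smoothing).
import numpy as np

def compress(signal, threshold_db=-20.0, ratio=3.0,
             attack_ms=10.0, release_ms=120.0, sample_rate=44100):
    """Reduce level above `threshold_db` by `ratio`, smoothing the gain in dB."""
    atk = np.exp(-1.0 / (attack_ms * 0.001 * sample_rate))
    rel = np.exp(-1.0 / (release_ms * 0.001 * sample_rate))
    gain_db = 0.0
    out = np.zeros(len(signal))
    for n, x in enumerate(signal):
        level_db = 20.0 * np.log10(max(abs(x), 1e-9))
        # Static curve: above the threshold, 1 dB of output per `ratio` dB of input.
        over_db = max(level_db - threshold_db, 0.0)
        target_db = -over_db * (1.0 - 1.0 / ratio)
        # Fast coefficient while reducing gain (attack), slow while recovering (release).
        coeff = atk if target_db < gain_db else rel
        gain_db = target_db + coeff * (gain_db - target_db)
        out[n] = x * 10.0 ** (gain_db / 20.0)
    return out
```

A slow attack lets transients through and a slow release avoids pumping, but every combination trades one artifact for another, which is the whole point of managing tradeoffs.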
Excitation, Color & A Conversation with Aliasing
Back in the day, when mastering engineers were working with analog equipment, there would always be a color, an effect, some “distortion” that might be pleasing or interesting; rarely would such a treatment in the analog domain be totally transparent. Every time a signal runs through a circuit or processor something changes, even if the processor at hand is not doing anything to the signal, and this is especially relevant in the world of nonlinear processing. Most engineers like to run signals through such devices not because they want to process or change something, but because they enjoy the sonic effect, the imprint of that gear and its filtering. I’m the kind of person who believes that distortion, in the correct context, can bring something to life.
Here is a story from Mastering Engineer Jonathan Wyner. One day he was contacted by an artist whose name he chose not to share. The artist brought him a mix that had been done by a professional mixing engineer, and came to Jonathan very concerned about it, because the mixing engineer and the rest of the team had rented a studio the engineer had never worked in before. After one week of hard work the mix was finished, but when the time came to test it the nightmare began: the mixing engineer had not bothered to check his work before signing off, and there was nothing he could do because the time was up and there was no studio left in which to correct the mix. The artist had no choice but to contact Jonathan and ask for help. The mix was gone by this point, and Jonathan had to try a lot of things to see if he could save it. After days of experimentation he found an old GATES compressor, but instead of trying to compress he just ran the record through the unit, and as he stated, “adding the distortion brought life into the recording”, and this is how he could “save the mix”. This is what he calls “Sonic Adventures in Mastering.”
Analog distortion is interesting, there is a unique texture to it, and all is good so far, but what about the digital domain? When it comes to distortion in the digital domain we have to consider two important things: distortion in the analog domain is free, while distortion in the digital domain is not, it has to be programmed and coded in all its potential complexity. But what are the implications of distortion in the digital domain? In the analog domain we judge equipment based on its sound; in the digital domain we judge plugins by how well they can do analog-style distortion. As I stated before, distortion does not come alone in the digital domain, and there is a price to pay when using coloration in a floating-point universe: latency and CPU cost, in other words bandwidth and sampling. Simply put, discrete-time digital signals operate within a sampling limit; when a nonlinearity creates content above the Nyquist frequency, the system cannot represent that activity, and instead of disappearing it folds back as side tones that do not follow the harmonic series of the fundamental. This is why distortion is such a big deal in the digital domain, and why it takes a lot of work to implement distortion in a plugin properly. In the analog domain distortion is a natural attribute of the devices themselves, part of the sound that comes from the circuitry and all the electronic components in the signal path; in most cases developers don’t like the idea of adding distortion or noise into the DSP, because battling the effects of aliasing is a tough task. So does that mean digital coloration is useless? Not in the slightest. In some cases digital coloration is a better solution than going to the analog domain, because going from digital to analog involves conversion and truncation, so having a digital tool that can do the job is crucial, but this implies that the oversampling technique used has to be spotless. In mastering we have to select our tools carefully, especially color.
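To show why developers fight aliasing so hard, here is a minimal sketch in Python, assuming numpy and scipy are available; the 5 kHz tone, the tanh drive and the 8x oversampling factor are illustrative assumptions. A saturator applied to a 5 kHz sine at 44.1 kHz creates harmonics at 15, 25 and 35 kHz; the ones above Nyquist fold back to roughly 19.1 and 9.1 kHz, tones that have nothing to do with the harmonic series. Oversampling before the nonlinearity, then filtering and decimating, keeps those folded tones out of the audible result.

```python
# Minimal sketch: aliasing from a naive waveshaper versus an oversampled one.
import numpy as np
from scipy.signal import resample_poly

sample_rate = 44100
t = np.arange(sample_rate) / sample_rate
tone = np.sin(2 * np.pi * 5000 * t)          # 5 kHz sine, 1 second

# Naive saturation: harmonics at 15 kHz, 25 kHz, 35 kHz...
# Anything above 22.05 kHz folds back (25 kHz -> 19.1 kHz, 35 kHz -> 9.1 kHz).
naive = np.tanh(4.0 * tone)

# Oversampled saturation: upsample, saturate, band-limit back down.
up = resample_poly(tone, 8, 1)                         # 8x oversampling
oversampled = resample_poly(np.tanh(4.0 * up), 1, 8)   # decimation filter removes folded content

def level_db(x, freq):
    """Spectral magnitude (dB) at `freq`, relative to the 5 kHz fundamental."""
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    bin_hz = np.fft.rfftfreq(len(x), 1.0 / sample_rate)
    mag = spectrum[np.argmin(np.abs(bin_hz - freq))]
    ref = spectrum[np.argmin(np.abs(bin_hz - 5000))]
    return 20.0 * np.log10(mag / ref + 1e-12)

for f in (15000, 19100, 9100):   # real 3rd harmonic, then two aliased (inharmonic) tones
    print(f"{f} Hz: naive {level_db(naive, f):6.1f} dB, "
          f"oversampled {level_db(oversampled, f):6.1f} dB")
```

The real harmonic at 15 kHz survives both paths, while the folded tones at 19.1 kHz and 9.1 kHz stay prominent in the naive version and drop far down in the oversampled one.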
Cramping or Warping?
Do digital equalizers sound different from their analog counterparts? This is a question most engineers have asked themselves for a long time, and with good reason. Decades ago, when digital equalizers came into the picture, the sound was something that put people off for a long time; a lot of engineers never imagined working in the digital domain because of this phenomenon. The problem was not the digital domain itself, the issue was the tools: the plugins were not accurate to begin with, and if you add the limited internal bandwidth of the plugins at the time, it’s understandable why people avoided digital.
Equalizers and filters were the first place most engineers noticed that something wasn’t quite right. Recording and mixing engineer George Massenburg, who introduced the parametric equalizer back in the day and is also a circuit designer, is someone I consider an audio thinker, because he studied and analyzed the behavior of the tools we use today, and thanks to his long research and hard work we have really good, clean equalizers. When Pro Tools was starting out, George noticed something off with the filters used in its equalizers because something sounded wrong; this is where the notion of EQ cramping comes from. Digital EQs at the time had an issue with the top end, the high end of the spectrum sounded harsh and brittle, and most people didn’t know why or what was happening. It was George Massenburg who identified this issue and stated that a better filtering and sampling approach was needed to correct this behavior.
But what does all of this have to do with distortion? I think all of it does. Cramping does not only affect the sound quality of the high bands; if you analyze the behavior of the plugin using a diagnostic tool, or even simply the spectrum itself, you can see that the band starts to deform as it approaches the Nyquist limit (the sampling boundary). Listen carefully and you’ll hear distortion and an unnatural ringing effect along with other strange artifacts, and this is what makes the top end harsh. Ringing and distortion in an EQ are not necessarily bad things, because part of this is what gives an EQ its characteristic imprint and color. Does this mean that an EQ that cramps is totally useless? Not necessarily, a cramping EQ can still be useful, but as I stated before it’s all about knowing the behavior of the tools themselves and their limitations. Suffice to say that if you’re working specifically in mastering, the quality of the tools really matters, and to my way of thinking a cramping EQ in a mastering engineer’s toolbox is totally unacceptable.
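As a small illustration of that deformation, here is a minimal sketch in Python comparing a textbook bilinear-transform peaking filter (the RBJ Audio EQ Cookbook biquad) against its analog prototype; the 16 kHz center frequency, +6 dB gain and Q of 1 are illustrative assumptions. The digital bell is forced back to unity gain at Nyquist, so its shape squashes against the top of the spectrum, which is the cramping you can see on an analyzer.

```python
# Minimal sketch of EQ "cramping": RBJ cookbook peaking biquad vs its analog prototype.
import numpy as np
from scipy.signal import freqz

def rbj_peaking(f0, gain_db, q, fs):
    """Digital peaking biquad coefficients from the RBJ Audio EQ Cookbook."""
    a = 10.0 ** (gain_db / 40.0)
    w0 = 2.0 * np.pi * f0 / fs
    alpha = np.sin(w0) / (2.0 * q)
    b = np.array([1 + alpha * a, -2 * np.cos(w0), 1 - alpha * a])
    den = np.array([1 + alpha / a, -2 * np.cos(w0), 1 - alpha / a])
    return b / den[0], den / den[0]

def analog_peaking_db(f, f0, gain_db, q):
    """Matching analog prototype bell, evaluated at frequency f (Hz)."""
    a = 10.0 ** (gain_db / 40.0)
    w0, s = 2.0 * np.pi * f0, 1j * 2.0 * np.pi * f
    h = (s**2 + s * w0 * a / q + w0**2) / (s**2 + s * w0 / (a * q) + w0**2)
    return 20.0 * np.log10(np.abs(h))

fs, f0, gain_db, q = 44100, 16000.0, 6.0, 1.0
b, den = rbj_peaking(f0, gain_db, q, fs)

# The digital response returns to 0 dB at Nyquist (z = -1); the analog bell is still boosting there.
probe = np.array([8000.0, 16000.0, 20000.0, 22000.0])
_, h_digital = freqz(b, den, worN=probe, fs=fs)

for f, hd in zip(probe, h_digital):
    print(f"{f:7.0f} Hz  analog {analog_peaking_db(f, f0, gain_db, q):5.2f} dB"
          f"  digital {20.0 * np.log10(np.abs(hd)):5.2f} dB")
```

Both filters meet at the 16 kHz center frequency, but above it the digital curve collapses toward 0 dB at Nyquist while the analog prototype keeps boosting, which is exactly the deformation described above.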
“Nothing could have been more
wrong, going into the middle
Nineties with pristine,
because it didn’t register”
George Massenburg