As mentioned in a previous column (Issue 197), the record manufacturing process actually begins with the composition, arrangement, performance and recording, since these will greatly influence the sound of the final product. With this out of the way, let us have a closer look at how records are actually made.
In the vast majority of cases since the 1950s, the recording arrives at a disk mastering studio on stereophonic magnetic tape or, more recently, as digital files.
There, it will be transferred to disk, which means that master disks will be cut, one master disk per side, containing a spiral groove, modulated by the sound of the recording. In the days prior to digital recording and magnetic tape, the only way to transfer a recording to disk was from another disk. This was sometimes done to compile a long-playing (LP) record from multiple recordings of individual songs, or to allow some edits to be worked into the record. More commonly, though, prior to the widespread use of magnetic tape, recordings were done direct-to-disk, rather than being transferred to disk from another medium.
The disk mastering setup is, in effect, a sound recording system (a rather bulky one at that), consisting of a disk recording lathe, a cutter head, control units, and audio electronics.
The disk recording lathe, as its name implies, is a machine tool related to the lathe used for metalworking, and for screw cutting in particular. It is a very accurate mechanical assembly, which clamps and rotates the workpiece (a blank disk) at a steady speed while feeding a cutting tool across its surface to cut a spiral groove. What may appear a crude and primitive process to the uninitiated is the grandmother of modern micromachining, resulting in one of the most accurate products humans can make.
The feed rate, rotational speed, depth of cut and many other parameters are often adjusted through control units which even permit some limited degree of automation and repeatability. The cutting tool is held in the cutter head, which can impart vibrations to the tool in proportion to an electrical signal driving it. The audio electronics include, as a minimum, the pre-emphasis module (which implements the RIAA recording characteristic) and a cutting amplifier, which is essentially a power amplifier designed to drive a cutter head. The audio electronics often also extend to signal processing units, cutter head protection units, instrumentation to monitor signal level and the coil temperature of the cutter head transducer system, and possibly more. The complete system, despite its complexity, does not know or care what happens upstream of its inputs.
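As a concrete illustration of what the pre-emphasis module implements: the RIAA recording characteristic is defined by three time constants (3180 µs, 318 µs and 75 µs), which attenuate the bass and boost the treble before cutting; playback equipment applies the inverse curve. The short Python sketch below evaluates the idealized curve, normalized to 0 dB at 1 kHz as it is usually quoted. (A real pre-emphasis module is, of course, an analog filter realizing this response, not a calculation.)

```python
import math

# Standard RIAA time constants, in seconds
T1, T2, T3 = 3180e-6, 318e-6, 75e-6

def riaa_pre_emphasis_db(f):
    """Gain in dB of the idealized RIAA *recording* (pre-emphasis) curve at f Hz.

    Magnitude of H(s) = (1 + s*T1)(1 + s*T3) / (1 + s*T2) evaluated at s = jw:
    two zeros (bass shelf, treble boost) and one pole between them.
    """
    w = 2 * math.pi * f
    num = math.hypot(1, w * T1) * math.hypot(1, w * T3)
    den = math.hypot(1, w * T2)
    return 20 * math.log10(num / den)

# Normalize to 0 dB at 1 kHz, the usual reference point for the curve
ref = riaa_pre_emphasis_db(1000.0)
for f in (20, 100, 1000, 10000, 20000):
    print(f"{f:>6} Hz: {riaa_pre_emphasis_db(f) - ref:+6.1f} dB")
```

Run this and the familiar shape appears: roughly -19.3 dB at 20 Hz and +19.6 dB at 20 kHz relative to 1 kHz, a swing of nearly 40 dB across the audio band.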
Any signal appearing at the inputs of a disk mastering system will be transferred to disk, whether this comes from a tape machine, a record player, a CD player, a DAC or even just a couple of microphones and microphone preamplifiers.
In the latter example, we would have a direct-to-disk recording, meaning that there was no other recording medium involved, and the sound is recorded directly to the master disk. In this case, the master disk is a first-generation recording, with only one round of recording side effects. All other cases of recordings transferred to disk from other media are second-generation re-recordings at best, and commonly many more re-recording generations are involved, along with several rounds of side effects.
Direct-to-disk recording may appear pure and simple, but was largely abandoned as soon as magnetic tape appeared, due to several great challenges, detailed below:
A master disk is called a “master” because its contents will be replicated onto multiple other disks during the manufacturing process. It is an aluminum substrate coated with a soft nitrocellulose lacquer. Only one side is used to cut grooves, and because the lacquer is soft and delicate, the grooves cannot be played back without damaging them! As such, a direct-to-disk recording cannot be played back until much later in the record manufacturing process.
Any errors or defects affecting the recording will only be heard after substantial investment in manufacturing operations and long after the recording session is over. These costs will have to be covered well in advance of getting to hear the first clues of the outcome.
Even if all went perfectly well during the recording session, there is still the risk of the master disks being destroyed during the subsequent manufacturing steps, rendering the entire project a total loss.
Moreover, a master disk has to be recorded in one go, with no editing possibilities. The musicians have to be able to perform a whole side of a record in its entirety, with no mistakes or unintended sounds. If something goes wrong (such as legendary stories of the police interrupting an orchestral recording session towards the end of a perfect side due to complaints about the noise), the disk is thrown away and the entire side has to be performed and recorded all over again.
In case this does not sound nerve-wracking enough, I should point out that the disk recording lathe alone can easily weigh around 700 lbs., and the rest of the setup another 700 lbs. This was often carried on location, somewhere with decent acoustics where an ensemble and crew numbering well over 100 could fit, with suitable transportation, accommodation and subsistence arrangements for everyone!
Last but not least, remember those disk mastering system automation capabilities I mentioned earlier? Forget about them! They rely on a preview signal arriving in advance of the program signal to be cut to disk, allowing enough time for the heavy parts of the mechanical assembly to speed up and slow down, and for the control systems to calculate the required changes in groove spacing and depth.
Now here is the catch: since there can be no preview signal during a direct-to-disk recording, there can be no automation! Everything has to be done manually. This was exactly what the “Keeper of the Groove” (as credited in the album notes) was there for in some of the Sheffield Lab direct-to-disk recordings. This was a musically trained person sitting in with the disk mastering engineers and reading the score, giving advance cues of all the sudden loud passages, pauses, etc., for the engineers to manually ride the lathe controls during the cut! For some of these epic sessions, they carried up to three complete mastering systems on location, all cutting simultaneously, just to make sure that at least one of the master disks would survive and be deemed usable for commercial release.
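To make the role of the preview signal concrete, here is a minimal sketch of the variable-pitch idea (my own illustrative logic and numbers, not any particular lathe's actual control law): the spacing advanced per revolution must leave a safe land between the outward swing of the groove being cut now and the inward swing of the one to be cut next, and the excursion of that next revolution is known in advance only thanks to the preview signal.

```python
def groove_pitch_um(current_peak_um, next_peak_um, min_land_um=25.0):
    """Groove-to-groove spacing for one revolution, in micrometers.

    current_peak_um: peak lateral excursion of the revolution being cut now
    next_peak_um:    peak lateral excursion of the upcoming revolution,
                     available ahead of time only via the preview signal
    min_land_um:     minimum uncut land to keep between adjacent grooves
                     (an illustrative figure, not a standard value)
    """
    return min_land_um + current_peak_um + next_peak_um

# A quiet passage packs the grooves tightly; a loud one spreads them out.
print(groove_pitch_um(5.0, 5.0))    # quiet: 35.0
print(groove_pitch_um(40.0, 60.0))  # loud passage coming up: 125.0
```

Without a preview signal, next_peak_um is simply unknown, so a direct-to-disk cut must either run at a conservatively wide fixed pitch (wasting playing time) or rely on a human anticipating the music, which is precisely what the “Keeper of the Groove” did.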
Needless to say, the aforementioned challenges render direct-to-disk recording sessions rather expensive and risky undertakings. Still, they have yielded some of the best-sounding records to be found.
Not that transferring a recording from another medium to disk is without its difficulties. But, it allows the use of a preview signal and automation, as well as the ability to take test cuts of the exact same performance and log the settings for repeatable results when cutting the masters. No heavy lifting involved and no organizational or logistics nightmares to deal with.
One would like to think that the ability to take your time in a known, controlled environment, figuring out how best to transfer the material to disk, would make up for this additional level of generation loss.
The modern reality of the music industry tends to be quite grim in this respect. Ideally, the mastering engineer would take a lot of time to fine-tune all the different settings for the most accurate transfer. Test cuts would be taken, which a producer would take home to evaluate with a clear head and return a few days later to oversee the process of cutting the actual masters. In an ideal world, the original recording would also be technically adequate and along with the composition, arrangement and performance would be compatible with the particularities of the destination medium it would be released on. In practice, however, this is rarely the case nowadays.
Recordings are often done in haste, on a small budget, without bothering with a proper arrangement of the compositions, often without even a producer. Inadequate monitoring further exaggerates the problem. Recordings make it to the mastering room and then may require radical processing and correction just to be able to be properly transferred to disk.
This takes time and experience, neither of which comes cheap. Such recordings are not usually backed by budgets sufficient to cover problem solving at the mastering stage.
This is how what I like to call “safe-mode mastering” came to be. Preset signal processing chops away the low and high frequencies (which would otherwise take some time to figure out how to cut properly), butchers the dynamics (if there were any to begin with), and narrows the stereo image, “shrinking” the recording to fit comfortably on the disk medium, with standard “safe settings” on the lathe as well. In fact, once you take out everything that makes a recording interesting and worth listening to, you can cut it without even bothering to listen through the whole thing in advance.
The automation will ensure it works, at least to the extent that it spins and makes some noise. You may wonder at this point: why bother at all, then? Why waste all this potential and all these resources on records that sound worse than their digital sources? I personally feel that if you are not going to do it properly, you might as well not do it at all and save us all the hassle for a bunch of records nobody will ever listen to more than once, if at all. There is a booming market for plastics recycling, especially targeting stacks of unsold records sitting in warehouses, which recycling facilities and record pressing plants buy by weight, demonstrating the extent of the problem. The more ethical operations at least remove the paper labels before granulating…
A great-sounding record begins with a great real life sound and a great recording of it, done with a good understanding of the recording medium and of the final distribution medium to which it will need to be transferred. The best-sounding records I have cut required no signal processing at all. The recording was great as it was. No “shrink to fit,” just an accurate transfer. But still, it took time and effort. This involved deciding on the best settings for the groove depth, the spacing between grooves and how this may change throughout the side, the cutting stylus heating system temperature, the amount of vacuum suction needed to clear the chips created during cutting with no audible effects, selecting the best cutting stylus out of a batch and the best blank lacquer master disks (carefully aged), taking several test cuts, auditioning on a calibrated reference reproducer and accurate full-range monitor loudspeakers in a room with excellent acoustics, inspecting the test cuts and then the entire length of the final master disks under the microscope, and doing this process all over again for each album.
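The trade-off between groove spacing and playing time mentioned above is simple arithmetic. Assuming approximate 12-inch LP figures (modulation starting near a 146 mm radius and ending around a 60 mm radius; these are my illustrative round numbers, and a real cut varies the pitch with the music rather than holding it constant), one side lasts:

```python
def side_duration_min(pitch_mm, r_outer_mm=146.0, r_inner_mm=60.3, rpm=100/3):
    """Approximate playing time in minutes of one LP side at a constant pitch."""
    # The number of groove revolutions that fit between the outer and inner
    # radii, divided by the rotational speed (33 1/3 rpm), gives the duration.
    revolutions = (r_outer_mm - r_inner_mm) / pitch_mm
    return revolutions / rpm

print(f"{side_duration_min(0.15):.1f} min")  # coarser pitch, shorter side
print(f"{side_duration_min(0.10):.1f} min")  # finer pitch, longer side
```

This is why the spacing between grooves, and how it changes throughout the side, is among the first decisions in a cut: every micrometer of land spent on headroom for loud passages is playing time given up.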
While in “safe-mode mastering” one can boast about how many masters they cut in a day, for serious work we talk about how many days it took to cut a master!
When seemingly perfect masters have been cut and inspected under the microscope, they are sent off to the next step of the manufacturing process, without any possibility of listening to the actual master disks and with a whole load of things that could still go wrong downstream. Should the master disk get damaged, or should other faults occur or be discovered during the subsequent stages, we are back at the beginning, cutting a new set of masters to be inspected and sent off!
In the next column, we will have a look at the subsequent stages of record manufacturing: plating, and pressing.
This article was first published in Issue 92.