1978 was an important year...
3M Digital Audio Mastering System, 1-inch tape and
Sony PCM-1600 digital audio mastering system, 3/4" tape.
The start of recording classical music digitally.
Mathematically, Fourier analysis and the sampling series are the foundation of this whole digital history.
"10. 1980. Sony makes the first compact disc and takes the cut out perforations pattern from player piano paper music rolls from 100 years earlier and duplicates the pattern to the surface of the compact disc and the perforations from the player piano roll become pits on the disc for the laser to read."
I don't see how this would work. The pattern of perforations on a piano roll only captures performance information, not audio. You need 44,100 16-bit samples per second, per channel, for the CD. Paper piano roll data density isn't even close...
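To put a number on that density argument, here is a quick back-of-the-envelope calculation of the CD's raw PCM data rate from the figures above (sample rate, bit depth, and two channels):

```python
# Raw PCM data rate of an audio CD: 44,100 samples per second,
# 16 bits per sample, 2 channels (before any error-correction overhead).
sample_rate = 44_100   # samples per second, per channel
bit_depth = 16         # bits per sample
channels = 2

bits_per_second = sample_rate * bit_depth * channels
print(bits_per_second)        # 1411200 bits per second
print(bits_per_second // 8)   # 176400 bytes per second
```

Roughly 1.4 million bits every second; a paper roll punched at mechanical-perforation densities carries nothing remotely comparable.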
"12. 1980. Pioneer makes the worlds first commercial Laser movie disc and by 1986 produced laser movie disc players for the retail market."
Laserdiscs were an analog technology, both audio and video. As digital technology matured, ways were found to add digital soundtrack information to the laserdisc, but the video remained analog (composite actually).
Agree with Marakanetz about the importance of Fourier Analysis as the underpinnings of all audio. We should also add the Nyquist Sampling Theorem to the list.
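Since the Nyquist Sampling Theorem came up, a tiny numerical illustration may help: a tone above half the sample rate produces, after sampling, exactly the same sample values as a lower-frequency "alias." The frequencies below are hypothetical round numbers chosen for clarity, not values from any actual format:

```python
import math

fs = 1000.0            # sample rate in Hz (hypothetical)
f_high = 900.0         # tone above the Nyquist limit (fs/2 = 500 Hz)
f_alias = fs - f_high  # the 100 Hz alias predicted by sampling theory

for n in range(8):
    t = n / fs
    s_high = math.cos(2 * math.pi * f_high * t)
    s_alias = math.cos(2 * math.pi * f_alias * t)
    # Sample by sample, the two tones are indistinguishable.
    assert abs(s_high - s_alias) < 1e-9
print("900 Hz sampled at 1 kHz is indistinguishable from 100 Hz")
```

This is why the theorem matters for the CD: 44.1 kHz sampling can faithfully capture content only below 22.05 kHz, and anything above must be filtered out before conversion.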
Two items I neglected to mention regarding the development of SACD technology.
1. While working for Sony in California, digital engineer Andreas Koch designed and built the world's first 8-channel DSD (SACD) recorder in 1997, and he also designed the world's first SACD D/A converters, which were used in that recorder.
2. In 1998, Sony digital engineer Dr. Yoshio Yamasaki invented the SACD disc, which Sony introduced in 1999.
Bardeen, Brattain, and Shockley developed the first bipolar point-contact transistor in 1947 at Bell Labs.
The first silicon transistor was produced by Texas Instruments in 1954.
Without this, there would not have been an explosion of digital audio equipment.
Ghostrider45... thanks for the correction on the early laser disc. I was aware that the video was analog, and your info indicates the audio was as well; the audio became digital in '97 with the DVD. Regarding the perforation patterns on player piano music rolls, let me clarify further: it had to do with the most logical approach Sony could take in forming the pits on the CD. The pit pattern taken from the piano rolls was a spatial arrangement that would work best with a laser reading the pits. It had nothing to do with data whatsoever, just a spacing arrangement of pits that would be the easiest pattern for the laser to work with for best synchronization. I read about this in a Sony article back in the late nineties.
One point (though trivial) is that the pits and lands on a CD or DVD do NOT directly represent ones and zeros.
The encoding that the pits and lands represent, and the formulas used to create and interpret those variable pits and lands, are quite complex. In a way, that spiral of raw data on the CD is a combination of analog and digital in nature.
Jitter is one consequence of its 'analog'-like construction.
Since we are talking about digital, i.e., 0s and 1s, one should surely
acknowledge Gottfried Leibniz, who invented the binary numeral system as it
is used today. (He is also responsible for calculus notation as it is used
today; Newton's notation was extremely cumbersome.)
However, according to Wikipedia (may the scholars forgive me), people
have been using binary numeral systems to encode information for a very
long time: for example, in India by Pingala in the 2nd century BC, and in China in
the classic text "I Ching" in the 11th century. Similar systems
were also used in Africa (e.g., Ifá divination) and Europe (e.g., Francis Bacon). Also, let us
not forget Morse code.
Another important figure who should be mentioned is the British
mathematician George Boole, the father of Boolean algebra, which is at the
basis of all digital electronic circuitry.
The invention of the diode laser in 1962 (by two groups in the USA) should also
be acknowledged as a crucially important step.
I remember a digital preamp made (I think) by Infinity...the 80's?
An interesting thread. As Nvp pointed out, the origins of digital technology are old. Very. The abacus
is a digital technology, and it has been around since Mesopotamia. It was also used in ancient Greece, Rome, Persia, China, and India. The abacus is a good example of what makes a technology "digital," namely that it performs computations
with data represented in discrete values.
Speaking of computation, I don't think anyone has mentioned Alan Turing, who was instrumental in the development of the theory of computation in computer science. The Turing machine
was a hypothetical device that could simulate the computational processes of virtually any digital technology.
The history of digital technology is awe inspiring, in an Arthur C. Clarke kind of way.
You mention digital audio tape came out in 1991. DAT had been out since 1987.
One more tangential note:
A lot of related technology came out of Bell Labs ca 1970, when they developed WDM (wavelength division multiplexing), and, later, DWDM (dense wavelength division multiplexing) to increase the potential capacity of fiber optic telecom cables. Today, many of the optical elements of various digital sound technologies have some roots in that research.
Correction: I mentioned that the first digital audio magnetic tape recorder was developed in 1967 by the Technical Research Labs of NHK Broadcasting in Japan. I neglected to mention
(my fault!) that the same year, Denon collaborated with NHK's Technical Research Labs and developed the first digital tape. They took 2-inch video tape and converted it to a quadruplex format for digital recording so it could record at a very high rate. Denon made this digital tape available in 1972 for use with the 8-track digital recorder they put out that year. Soundstream came out with their own 1-inch digital tape in 1976, which they had developed in 1975 for the digital audio recorder they designed.
1958-1959: Invention of the integrated circuit by Jack Kilby of Texas Instruments, for which he later won the Nobel Prize in Physics; and, a few months later and independently, by Robert Noyce, co-founder of Fairchild Semiconductor and, later, Intel.
Kilby's IC was germanium-based. The one developed by Noyce was the first silicon-based IC, and it incorporated other improvements that increased its practicality.
Note: To an electrical design engineer, "IC" = "integrated circuit," not "interconnect cable." :-)