Re: Did Commodore cheat with the quad density floppies?

From: Mia Magnusson <mia_at_plea.se>
Date: Sat, 5 Jan 2019 22:52:55 +0100
Message-ID: <20190105225255.000043f0@plea.se>
On Fri, 04 Jan 2019 10:18:50 +0100, André Fachat <afachat@gmx.de> wrote:
> 
> 
> On 3 January 2019 at 20:01:30, "Mike Stein" <mhs.stein@gmail.com> wrote:
> 
> > I wonder if part of the answer to Andre's original question may be
> > the fact that Bits per inch is not necessarily the same as Flux
> > transitions per inch/mm...
> 
> Absolutely. 300 Oersted media had 5900 flux transitions per inch,
> which gives 2900 bpi using FM, due to the many clock bits needed, or
> 5900 bpi using MFM. QD was the same media, only defined for 96/100
> tpi instead of 48 tpi.
> 
> Commodore GCR 170k used a 250 kHz write frequency, thus the same 5900
> flux transitions per inch, i.e. 4 us bit cells.
> Commodore GCR 500k used 375 kHz writes, which increased ftpi by 50%
> and reduced bit cell size by 33%. That seems to be out of spec with
> all media specifications I found.
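
Just to sanity-check the arithmetic in those numbers, here's a quick
back-of-the-envelope sketch (Python, with the figures plugged in from
the quote above, not from any datasheet):

    # Bit cell size and relative flux density, 170k vs 500k GCR formats.
    f_170k = 250e3                 # Hz, GCR 170k write clock
    f_500k = 375e3                 # Hz, GCR 500k write clock
    print(1e6 / f_170k)            # 4.0  -> 4 us bit cells
    print(1e6 / f_500k)            # 2.67 -> 33% smaller bit cells
    print(f_500k / f_170k)         # 1.5  -> 50% more flux transitions
                                   #         per inch at equal disk speed

That agrees with the quoted 4 us, -33% and +50% figures.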

I have a theory:

For a given magnetic medium, the maximum usable number of flux
transitions per unit distance depends on what signal-to-noise level
you need.

As MFM uses "half-bits" (relative to the maximum flux transition
frequency, not to the clock frequency fed to the MFM encoder/decoder),
it would need a better signal-to-noise level than GCR, which doesn't
use "half-bits".
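
To make the "half-bits" point concrete, here is a little sketch (my
own illustration, not anything Commodore shipped) that encodes random
data both ways and collects the distinct spacings between flux
transitions, measured in data bit cells. The GCR table is the standard
Commodore 4-to-5 code; the MFM rules are the usual ones (data pulse at
the cell centre, clock pulse between two consecutive zeros):

    import random

    # Commodore 4-to-5 GCR: a "1" is a flux transition, and the code
    # guarantees at most two "0" cells in a row.
    GCR = ["01010", "01011", "10010", "10011", "01110", "01111", "10110",
           "10111", "01001", "11001", "11010", "11011", "01101", "11101",
           "11110", "10101"]

    def gcr_intervals(data):
        bits = "".join(GCR[b >> 4] + GCR[b & 15] for b in data)
        pos = [i for i, c in enumerate(bits) if c == "1"]
        return {b - a for a, b in zip(pos, pos[1:])}        # whole cells

    def mfm_intervals(data):
        # Work in half-cells: cell i has a clock slot at 2*i and a
        # data slot at 2*i + 1.
        bits = "".join(f"{b:08b}" for b in data)
        pos, prev = [], "1"
        for i, d in enumerate(bits):
            if d == "1":
                pos.append(2 * i + 1)    # data pulse at cell centre
            elif prev == "0":
                pos.append(2 * i)        # clock pulse between two zeros
            prev = d
        return {(b - a) / 2 for a, b in zip(pos, pos[1:])}  # in cells

    data = bytes(random.randrange(256) for _ in range(512))
    print(sorted(mfm_intervals(data)))   # [1.0, 1.5, 2.0]
    print(sorted(gcr_intervals(data)))   # [1, 2, 3]

MFM has to tell apart spacings that differ by only half a cell, while
GCR only ever sees whole-cell steps, so GCR tolerates roughly twice
the timing noise before intervals become ambiguous.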

Thus Commodore could get away with a higher flux transition density
than almost all MFM implementations used.

Having Commodore-specific hardware hooked up directly to the read/write
heads was probably part of what made this work.

It also requires heads able to handle that flux transition density,
which in the 1001/8050/8250 was probably solved by using heads that
were really intended for the HD MFM format.


I'm not sure whether Commodore actually calculated what speed they
could use compared to what MFM could use, whether they introduced noise
to see where MFM started to fail and simply tried which GCR frequency
would start to fail at the same noise level, or whether they ordered a
bunch of every kind of diskette they could find and just tested them.
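
If anyone wants to play with the noise idea numerically first, here's
a crude Monte Carlo sketch (entirely my own toy model, with made-up
jitter figures): an interval decodes correctly as long as the timing
error stays within half the spacing between legal interval values,
i.e. 0.25 cells for MFM (half-cell steps) but 0.5 cells for GCR
(whole-cell steps):

    import random

    def error_rate(step, sigma, trials=100_000):
        # step: spacing between legal flux intervals, in bit cells
        #       (0.5 for MFM, 1.0 for Commodore GCR).
        # sigma: standard deviation of the timing noise, in bit cells.
        errs = sum(abs(random.gauss(0, sigma)) > step / 2
                   for _ in range(trials))
        return errs / trials

    for sigma in (0.1, 0.15, 0.2):
        print(sigma, error_rate(0.5, sigma), error_rate(1.0, sigma))

In this model MFM falls over at roughly half the jitter that GCR still
survives, which is the whole point of the theory above.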

If someone wants to do some experiments using rather readily available
equipment, you could compare the frequency response and the noise level
at different speeds on A) a reel-to-reel tape recorder from, say, the
mid 60's or older, and B) a reel-to-reel tape recorder from, say, the
mid 70's or newer. On both recorders, the noise and frequency response
will be rather good at 19 cm/s. On both, the noise will increase as
you lower the speed, but the older recorder, with its less refined
heads (i.e. a wider gap), will give a worse frequency response than
the newer recorder at the same low speed. (Something worth remembering,
though, is that at least the DIN standard commonly used for consumer
reel-to-reel recorders in Europe changed at least once during the
heyday of reel-to-reel recorders, which risks making this comparison
slightly misleading.)

Another way to test this is to record a cassette tape with some audio
that includes the higher treble frequencies. Really old cassette
recorders/players/decks of decent quality (like the first Tandberg
TCD 300 cassette deck from 1972, the Blaupunkt Goslar CR car stereo
from 1974, and similar) will not reproduce the highest frequencies,
due to their wider head gap; they were made back when the highest
frequencies weren't considered realistically possible on cassettes.
(Of course no one would expect great treble response from the simpler
cassette recorders of the 60's, but that's not really the point.)

-- 
(\_/) Copy the bunny to your mails to help
(O.o) him achieve world domination.
(> <) Come join the dark side.
/_|_\ We have cookies.