Improving tape loading (fwd)

From: Marko Mäkelä (
Date: 2001-06-04 12:08:29

I encountered this on the Oric mailing list.  (For those who don't know,
the Oric-1 and the Oric Atmos are 1 MHz 6502 computers that did not become
very popular.  Most users are in France or England.)

I'm writing to that list to suggest the use of PuCrunch for phase (2).
That would make it easier to adapt this scheme to the Commodore line of
computers.


---------- Forwarded message ----------
Date: Mon, 4 Jun 2001 11:41:45 +0200
From: Mickael Pointier <>
To: oric related account <>
Subject: Improving tape loading

Hi, I would like to share some ideas about how we could get more
efficient tape loading on the Oric.

I believe that most of you are aware of the fact that Fabrice Frances
has done an awesome job with all his little tools for converting tapes
to wav files, and especially with Tap2CD, which gives a nearly tenfold
speed-up (something that used to take 6 minutes to load now requires
only 30 seconds).

The problem is that those files are reliable only when loaded directly
from a perfect digital source, such as a good PC sound card. Getting
the same speed on a standard CD player would be a lot more interesting,
but unfortunately, because the Audio CD format does not include any
error control/correction, the result is largely unpredictable. (From
experience, it is nearly unusable on gold CDs, works a little better on
blue and silver ones, but stays very bad.) The standard wav files
generally load correctly, but they take ages to load...

My idea would be to have high-speed loading with high reliability, but
it requires some coding and a good knowledge of the tape internals:
bit encoding, compression, and so on.

The idea is to extend what Fabrice has done with his fast loader:

1) Create a new tape loader that handles a new bit encoding scheme
which reduces the amount of data to load, thus giving faster loading.
This new tape loader is loaded first at normal speed, and then loads
the rest in the new, improved format.

2) To this tape loader, we add some code to perform checksum validation
and depacking (this could be RLE or a variant of LZ(SS/77/W))...
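As a rough sketch of step 2), here is what the depacking and checksum
side could look like. The RLE format and the additive checksum below
are deliberately simple, hypothetical choices for illustration, not any
codec actually used on the Oric (and the real code would be 6502
assembly, not Python):

```python
def checksum(block):
    # Simple 8-bit additive checksum; one hypothetical choice among many
    return sum(block) & 0xFF

def rle_unpack(packed):
    """Unpack a toy RLE stream: a control byte n < 0x80 means "copy the
    next n literal bytes"; n >= 0x80 means "repeat the next byte
    (n - 0x80) times". Purely illustrative, not the actual format."""
    out = bytearray()
    i = 0
    while i < len(packed):
        n = packed[i]
        i += 1
        if n < 0x80:                        # literal run
            out += packed[i:i + n]
            i += n
        else:                               # repeated byte
            out += bytes([packed[i]]) * (n - 0x80)
            i += 1
    return bytes(out)
```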

3) The main data follows, split into blocks of 256 bytes. Each block
is 256 bytes long and includes one byte containing a checksum value (to
check whether the loaded data is valid), and one byte that will be
XORed with the loaded data... Why? Well, if I remember correctly,
Fabrice wrote somewhere that the "0" and "1" bits are encoded
differently, one of them taking more room than the other... The idea is
to count, for each of the 8 possible bit positions of a byte, whether
there are more "1"s or "0"s across the bytes of the block. If the "0"
encoding takes less time to load than the "1", then we want to maximize
the number of loaded "0"s, right? So we count, and if there are more
"1"s than "0"s, we simply put a "1" in the corresponding bit of the XOR
mask... We do that for each of the 8 bit positions, and we XOR all the
data with the mask before saving... After the load we do the same
thing again, thus getting back the right values...
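The per-bit counting described above can be sketched like this (a
minimal Python illustration of the idea; the real thing would live in
the 6502 loader):

```python
def bias_mask(block):
    """For each of the 8 bit positions, set the mask bit to 1 if more
    than half of the block's bytes have a 1 there. XORing the block
    with the mask then makes 0 the majority value in every position."""
    mask = 0
    for bit in range(8):
        ones = sum((byte >> bit) & 1 for byte in block)
        if ones > len(block) - ones:    # more 1s than 0s at this position
            mask |= 1 << bit
    return mask

def apply_mask(block, mask):
    # Applied once before saving and once after loading:
    # XOR with the same mask is its own inverse.
    return bytes(b ^ mask for b in block)
```

Since XOR is an involution, the loader only needs the one stored mask
byte per block and applies it after the block has been read back.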

Does it seem logical?

If we add data compression to each block, the blocks will be a lot
smaller, and so faster to load.

4) Save all the blocks from 3) again...
If we get a wrong checksum, we know the block is invalid. Assuming that
the maximum size of a file to load is 48 KB, we only need 192 bytes
(48*1024/256), or 192/8 = 24 bytes if we encode one bit per block, to
flag whether a particular block was good or bad.
After loading all the blocks, we just have to check whether all of them
were good. If not, we keep loading until we come across a block that
was bad on the first pass, try loading it again, and write it to the
right location.

Statistically, the same block should not be corrupted twice in a row...
We can even repeat phase "4" an arbitrary number of times.
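The bookkeeping for these retry passes is tiny: one bit per block, as
in this sketch (the 48 KB maximum and 256-byte blocks are the figures
from the text above; the helper names are mine):

```python
NUM_BLOCKS = 48 * 1024 // 256        # 192 blocks for a 48 KB file
bad = bytearray(NUM_BLOCKS // 8)     # 24 bytes, one bit per block

def mark_bad(bitmap, block_no):
    # Set the block's bit after a checksum failure
    bitmap[block_no // 8] |= 1 << (block_no % 8)

def mark_good(bitmap, block_no):
    # Clear the bit once a retry pass loads the block correctly
    bitmap[block_no // 8] &= ~(1 << (block_no % 8)) & 0xFF

def is_bad(bitmap, block_no):
    return bool(bitmap[block_no // 8] & (1 << (block_no % 8)))

def any_bad(bitmap):
    # Loading is finished when every byte of the bitmap is zero
    return any(bitmap)
```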

Another advantage of this block system is that we can add a "load meter".

Considering that we have a maximum of 192 blocks to load, we can fill
the last scan lines of the text screen with attribute values
corresponding to a particular color (let's say BLUE), and the last one
with a different attribute color (let's say WHITE).
Now, for each block we load, we change the corresponding attribute to
GREEN if the checksum was OK and to RED if it was not. This means that
we can check the quality of the loading dynamically, which allows us to
adjust the volume or the frequency until we maximize the loading
quality... and we know exactly when the loading will be finished...
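A sketch of the load-meter bookkeeping; the numeric color codes below
are assumptions standing in for the Oric's screen attributes, not
values checked against the hardware:

```python
# Assumed color codes; on the real Oric these would be ink attributes
BLUE, WHITE, GREEN, RED = 4, 7, 2, 1

def init_meter(num_blocks):
    # All pending blocks start blue, the final one white as an end marker
    return [BLUE] * (num_blocks - 1) + [WHITE]

def update_meter(meter, block_no, checksum_ok):
    # Recolor one cell as soon as its block's checksum is known
    meter[block_no] = GREEN if checksum_ok else RED
```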

The idea behind packing the data follows the same philosophy: if we
pack blocks, we reduce the average length, so we limit the risk of
alteration and get a speed-up of the loading.

Any feedback about the idea???

    Mickael Pointier / Dbug

This message was sent through the cbm-hackers mailing list.
