On Thu, May 30, 2013 12:42 pm, email@example.com wrote:
> I noticed that there is a function in nibtools' gcr.c called find_sync(),
> which contains the following snippet
>
> /* sync flag goes up after the 10th bit */
> //if ( ((*gcr_pptr)[0] & 0x03) == 0x03 && (*gcr_pptr)[1] == 0xff )
> /* but sometimes they are short a bit */
> if ( ((*gcr_pptr)[0] & 0x01) == 0x01 && (*gcr_pptr)[1] == 0xff )
> /* or two */
> //if ( (*gcr_pptr)[1] == 0xff )
> break;
>
> It is quite interesting to read the comments and see that there was some
> evolution in the number of bits that constitute the SYNC mark ;-)
>
> The question is different, though: the G64 doc says:
>
> "The most reliable way to read G64 track data is to read it as bits, not
> bytes as there is no way to be sure that all the data is byte-aligned."
>
> yet the above code seems to always look for the first 8 bits being
> byte-aligned. Why so? Or is it just completely unrelated to G64 and works
> well with NIB files?

Yes, the 9-bit version was the best balance. Never any false hits, and it
still catches when we miss a bit now and then (V-MAX, for one), as sync is
only measured by cycle-counting and BIT/BVC is not perfect here. :)

The 15x1 returns GCR data ALWAYS sync-aligned, as long as a sync exists, so
it never occurs that it's not aligned that way.

There is a newer function to sync-align everything so it can process
Kryoflux-captured G64s. It's not documented... I think it's switch -$.

-Pete Rittwage

Message was sent through the cbm-hackers mailing list
Received on 2013-05-30 17:01:24
Archive generated by hypermail 2.2.0.