I'm wondering if I'm using ddrescue in a way that's optimal with respect to the error-correcting codes on a Blu-ray.
Namely: do Blu-rays detect/correct errors on a 2048-byte block boundary, or is the ECC block larger?
I use ddrescue -b 2048 -d to read the discs with direct I/O, and even with difficult discs I can make progress by re-reading over time. For example, at the beginning of the week I had about 20MB of the current disc I'm trying to recover that couldn't be read; I've now gotten that down to under 4MB. Here's hoping I complete the disc.
What I'm wondering is whether reading 2048 bytes at a time is a bad way to do it. If the disc can only detect/correct errors on a larger block size (so in effect every read is really larger than 2048 bytes), am I wasting the rare good reads I do get by only asking for 2048 bytes at a time? It could also be the reverse: the error-detection/correction block size could be smaller than 2048 bytes, in which case what I'm doing is fine.
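For what it's worth, some descriptions of the BD physical format say user data is grouped into 64 KiB ECC clusters of 32 sectors x 2048 bytes, so that reading any one sector forces the drive to process the whole cluster. I haven't verified that against the spec, so treat the 32-sector cluster size below as an assumption; this is just a sketch of the alignment math I'd use to issue cluster-aligned reads if it's true:

```python
# Sketch: map 2048-byte sector numbers to hypothetical 64 KiB ECC clusters.
# ASSUMPTION: 32 sectors (65536 bytes) per ECC cluster, as some BD format
# descriptions state; check the actual spec before relying on this.

SECTOR_SIZE = 2048
SECTORS_PER_CLUSTER = 32  # assumed: ECC cluster = 64 KiB

def cluster_of(sector: int) -> int:
    """ECC cluster index containing a given 2048-byte sector."""
    return sector // SECTORS_PER_CLUSTER

def cluster_span(sector: int) -> tuple[int, int]:
    """Byte range [start, end) of the assumed ECC cluster covering `sector`."""
    start = cluster_of(sector) * SECTORS_PER_CLUSTER * SECTOR_SIZE
    return start, start + SECTORS_PER_CLUSTER * SECTOR_SIZE

# Sectors 0..31 would share one cluster; sector 32 would start the next.
print(cluster_of(0), cluster_of(31), cluster_of(32))  # -> 0 0 1
print(cluster_span(33))                               # -> (65536, 131072)
```

If the assumption holds, it would suggest trying ddrescue with a 65536-byte block so each retry lines up with one ECC cluster instead of a 32nd of one.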
I'm also left wondering if it would be possible to do repeated raw reads of the disc (à la cdparanoia, or old mode-2 CD reading), getting back the data plus the error-detection/correction codes, and accumulate enough data across passes to run the error correction in software.
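As far as I know there's no standard drive command that returns the raw channel data plus parity for BD the way READ CD does for CDs, but the accumulate-over-many-passes half of the idea works on ordinary reads too. A toy sketch (pure illustration, not a real recovery tool) that merges several flaky reads of the same block by per-byte majority vote:

```python
# Toy sketch: combine repeated, partially-wrong reads of the same block
# by per-byte majority vote (the "accumulate over many passes" idea).
# Illustration only; a real tool would also track per-read error status.

from collections import Counter

def majority_merge(reads: list[bytes]) -> bytes:
    """Return the per-byte majority value across equal-length reads."""
    assert reads and all(len(r) == len(reads[0]) for r in reads)
    out = bytearray()
    for column in zip(*reads):  # one column per byte position
        out.append(Counter(column).most_common(1)[0][0])
    return bytes(out)

good = bytes(range(8))
# Three reads, each with a different single corrupted byte:
reads = [bytearray(good) for _ in range(3)]
reads[0][1] ^= 0xFF
reads[1][5] ^= 0xFF
reads[2][6] ^= 0xFF
print(majority_merge([bytes(r) for r in reads]) == good)  # -> True
```

Of course this is much weaker than real on-disc ECC, since it only helps where errors differ between passes.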
Does anyone know the on-disc Blu-ray format spec well enough to answer this?