Experimenting with BDXL – Part 4: Scanning BDXL Discs feat. LG BH16NS55

In my earlier post about buying the Pioneer BDR-X13JBK drive from Japan, I said that I wouldn’t buy another LG BH16NS55. So, I compromised with myself and bought another LG BH16NS55. No, I haven’t gone crazy … there’s a method to my madness.

The Drive

Around AU$90 down and two days after ordering, a new LG BH16NS55 Blu-ray writer arrived for me. This is a bit of a curiosity, as the drive is sold as an “OEM” drive and, true to form, it doesn’t have a retail box. However, it does come with a SATA power cable, SATA data cable and software CD-ROM, which rarely accompany OEM drives.

Rummaging through the pile, there was also an installation guide and a packet of screws.

Well, I’ll be damned. This is a brand new drive – even newer than my Pioneer, carrying a date of 18th July 2024. As a result, it has the latest V1.05 firmware on it, which isn’t such a bad thing anymore (as we’ll see later). Despite this, LG have come out with newer versions of the drive – NS60 drives are on the market, but I suppose they probably just use the same chipset given the lack of development in this area. The power rating of 5V 1.5A and 12V 2A suggests the drive could consume more power than some external enclosures can provide.

The front bezel is nothing special, as expected of the “plainer” OEM models. The retail models have a more appealing textured front panel, but this hardly matters to me.

The back has the standard SATA power and data connectors.

As usual, mounting screw holes on the side. Nothing unusual here – the drive is “short” like most later optical drives.

Underneath, just two “divots” in the baseplate to contact heat-producing chips and act as a heatsink.

The provided CyberLink Media Suite for Blu-ray disc is dated 17th November 2023. Not really of any interest to me though.

Under the Covers

It only just got here, but I’ve decided to forego any warranty and look inside to see what you get for the money nowadays.

As it turns out, it’s a lot of empty space.

The controller PCB with the Mediatek MT1959 chipset is absolutely tiny compared to the controller boards of older optical drives. For security and perhaps for cost reduction, there is no external flash chip, so that is one less avenue for recovery if flashing goes wrong. All the ribbons bunch up into a single multi-part connector too and seem to be somewhat loose inside the drive. The PCB’s markings imply this to be an 8th May 2015 design, making the drive nine years in production!

There’s nothing to see on the back aside from a few capacitors.

This is the drive with the tray partially ejected, as required to take the tray’s faceplate and the drive’s faceplate off to release the inner unit from the outer shell. Look at that tray – it’s got some holes in the back too! Speed holes?

The optical pick-up unit … nice and clean with its two lenses.

Looking closer, I’m a little disappointed. Look at the plastic former – there seems to be a crack line running through quite a bit of it. While it probably won’t kill the drive unless it breaks apart, this doesn’t speak highly of quality control. I’ve noticed in the past that OEM drives tend to be a little worse for write quality and longevity, in part because they are often sold with shorter warranties, but it is still a bit surprising to see visible defects.

The flex on the side connecting to the laser diodes or photodiodes.

The drive also uses the classic belt mechanism … with a plastic axle and plastic clip as most later drives do. Another cost-reduction measure, I suppose. At this point, I’d be happy if it works …

Testing

As a SATA-based internal drive, I was actually a bit low on ports, so I had to resort to pulling out an old HP Compaq 8000 Elite SFF desktop I had upgraded from an Intel E8400 to an Intel Q8400. It may be old, slow, have a failing RTC and consume lots of power, but for the purposes of “hosting” the drive, it is good enough to begin with.

Plugging it in, the drive is alive and detected. It seems it has a 4MiB buffer, minus a little bit for internal drive management tasks (e.g. handling encryption) presumably.

Reading all sorts of disks proceeded just fine – recordable CDs at 48x, DVDs at 16x, DVD-DL at 12x, BD at 12x and BD-DL at 8x. All smooth lines suggesting the laser and mechanics are all good.

But, of course, it’s an LG. So QPxTool doesn’t scan with it …

… and neither will Opti Drive Control or Nero CD-DVD Speed. It’s just not a scanning drive.

Making it a Scanner!

Early in my days as an engineer, I learned the rule of having one to use and one (or more) to break. This is why I bought another BH16NS55, even though I already had one. My original drive has a corroded SATA connector and a UHD-friendly version 1.02 firmware, but I wasn’t game to get back into the optical drive flashing game with just the one drive. Now that I have a new drive, I can proceed with some level of reassurance that even if I screw up, I still have my old BH16NS55 functioning.

It turns out that the LG platform drives have been “co-opted” by Vinpower Digital, who make a robot version, the WH16NS58DUP, which has the scan commands enabled. There is some confusion, however, as there is apparently a non-robot version that doesn’t support scanning. Their software (VPTools, PlexTools) also appears to be just a reskinned version of QPxTool.

WARNING: The following procedure may cause permanent and irreparable damage to your hardware. Proceed with caution and at your own risk. The author will not be held responsible for any successes, failures or damages, however incurred.

So I embarked on this journey thanks to the resources available in the MakeMKV forums. Firstly, one needs to grab the all-in-one firmware pack from here, as well as SDFtool Flasher.

As the drive already has the V1.05 firmware, which is both encrypted and downgrade-protected, we need to “side-grade” to the same firmware version but of the MK type. You’ll find this in the firmware pack under MK\HL-DT-ST\BH16NS55\HL-DT-ST-BD-RE_BH16NS55-1.05-NM00400-212004211049.bin.

Because it is encrypted, the software can’t read the firmware out from the drive. If you try, you will only back up the calibration data, which is better than nothing. Anyway, having selected the correct file, you can click on “Start” to flash the drive. After a while, it should succeed. At this point, it is safest to shut down and restart the system to ensure the new firmware is running (although it is likely unnecessary).
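Incidentally, the model and firmware revision the drive reports after a restart come from its SCSI INQUIRY data, which is one way to confirm the flash “took”. Here’s a small sketch of parsing the standard INQUIRY fields (vendor, product and revision live at fixed byte offsets per the SPC standard) – the buffer below is synthetic, purely to illustrate the field layout, not a captured response from this drive:

```python
def parse_inquiry(data: bytes) -> dict:
    """Parse vendor/product/revision from standard SCSI INQUIRY data.
    Per SPC: vendor ID is bytes 8-15, product ID is bytes 16-31 and
    the product revision level (firmware version) is bytes 32-35."""
    return {
        "vendor":   data[8:16].decode("ascii").strip(),
        "product":  data[16:32].decode("ascii").strip(),
        "revision": data[32:36].decode("ascii").strip(),
    }

# Synthetic example buffer (hypothetical strings, for illustration only):
buf = bytearray(96)
buf[8:16]  = b"HL-DT-ST"
buf[16:32] = b"BD-RE  WH16NS58 "
buf[32:36] = b"1.V5"
print(parse_inquiry(bytes(buf)))
# {'vendor': 'HL-DT-ST', 'product': 'BD-RE  WH16NS58', 'revision': '1.V5'}
```

On Linux, tools such as sg_inq from sg3_utils will show you these same fields straight from the drive.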

Now the drive is running V1.05, but as an unencrypted, LibreDrive-enabled firmware. If you want to rip UHDs, this drive is almost good to go, with the exception of a pesky issue known as Jamless Play, which can result in garbage data being returned in the case of read errors (I suspect it is present in this firmware as it is a newer one, but I didn’t test). Apparently cross-flashing to a different firmware (perhaps ASUS BW-16D1HT) may be an avenue to eliminate this, but I haven’t tested this myself.

But I digress. The purpose of this particular endeavour is to enable scanning and it just so happens that there is a Vinpower firmware in the pack –

Flashing the firmware found in MK\HL-DT-ST\WH16NS58\HL-DT-ST-BD-RE_WH16NS58-1.V5-NM00900-211802070957.bin will cross-flash the drive to a WH16NS58 and essentially unlock the scanning capability. Knowing this, some digging suggests that many of the NSxx drives are so closely related internally that cross-flashing is a possibility, although whether this is beneficial really depends on the situation. Apparently the NS60 has an encryption chip for use with UHD 4K Blu-rays – this capability isn’t unlocked simply by cross-flashing the NS5x drives, as they don’t physically have the chip.
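As an aside, the filenames in the firmware pack encode the drive model and firmware version in a fairly regular way. Here’s a quick sketch of pulling those fields out – note the naming convention here is merely inferred from the two filenames above, so treat it as an assumption rather than a documented format:

```python
def parse_mk_firmware_name(filename: str) -> dict:
    """Extract drive model and firmware version from an MK firmware pack
    filename, e.g. 'HL-DT-ST-BD-RE_BH16NS55-1.05-NM00400-212004211049.bin'.
    The field layout is inferred from observed filenames (an assumption)."""
    stem = filename.removesuffix(".bin")
    parts = stem.split("-")
    # The token containing '_' holds 'RE_<model>'; the version token follows it.
    idx = next(i for i, p in enumerate(parts) if "_" in p)
    return {"model": parts[idx].split("_")[1], "version": parts[idx + 1]}

print(parse_mk_firmware_name(
    "HL-DT-ST-BD-RE_WH16NS58-1.V5-NM00900-211802070957.bin"))
# {'model': 'WH16NS58', 'version': '1.V5'}
```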

There are a few downsides – the tray ejects somewhat violently under this firmware as it expects a drive with a longer tray intended for use with robotic auto-loaders. Also, because the drive model is different and the firmware is a little older (from what I can tell), the write performance of the drive can no longer be assured. It is essentially best thought of as a quality scanner or a ripping drive, with the need to be cautious regarding writing as it may not perform as well as a true NS58-based drive would.

Scanning the Discs

Burn #1 – HiDisc TL BD-R @ 6x Pioneer BDR-X13JBK

This disc was defective anyway, so I decided it wasn’t worth my time to scan as 2x scanning means three hours of work …
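That three-hour figure is just the raw arithmetic of disc capacity against scan speed – a quick sketch, assuming the nominal ~100GB TL capacity and the standard 1x BD rate of 36 Mbit/s:

```python
# Estimate the time to scan a ~100GB triple-layer BD-R at 2x (constant rate assumed).
BD_1X_BYTES_PER_SEC = 36_000_000 / 8  # 1x BD = 36 Mbit/s = 4.5 MB/s
CAPACITY_BYTES = 100e9                # nominal BDXL TL capacity
SCAN_SPEED = 2

hours = CAPACITY_BYTES / (SCAN_SPEED * BD_1X_BYTES_PER_SEC) / 3600
print(f"~{hours:.1f} hours")  # ~3.1 hours
```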

Burn #2 – Verbatim TL BD-R @ 6x Pioneer BDR-X13JBK

The Verbatim-branded disc wasn’t entirely smooth sailing even on the LG. It seems that the error spike on the transition to L1 is real and does push the disc over acceptable limits. Aside from that, the errors remain within acceptable limits most of the time, albeit much higher than what I am used to from a quality BD-R SL burn. Perhaps BDXL pushes the limits of the optics so much that higher error rates are simply to be expected?

Burn #3 – Ritek TL BD-R @ 6x Pioneer BDR-X13JBK

The Ritek at 6x doesn’t have a big hump as such, but the error rates do seem to rise on a layer-by-layer basis. The transition from L0 to L1 has a bit of a spike, as do the beginning of L0 and the tail end of L2, which sit on the edge of acceptability. The disc is still readable at full speed as seen earlier, but the values correspond to the limit of acceptability.

Burn #4 – Sony QL BD-R @ 6x Pioneer BDR-X13JBK

The Sony QL disc at 6x was a bit of a spiky mess and this one wasn’t my fault. The burner did have quite a few pauses, seemingly as part of WOPC, so perhaps that explains it. Interestingly, the (visual) average error levels are quite a bit lower on the Sony QL disc which makes it quite a bit better than the others (even if the spikiness does detract from this somewhat). The layer error rates are somewhat evenly matched, with L2 seemingly being a bit lower and layer switches seem rather good despite a bit of an increase in errors between L2 and L3.

Burn #5 – HiDisc TL BD-R @ 2x Pioneer BDR-X13JBK

When it comes to burning, one of the usual lessons is simply to “go slow” as that usually results in better burns. For the Verbatim media, L0 was excellent – just as good as an average BD-R SL burn. But L1 and L2 are starting to get marginal at the edges. This is perhaps just indicative of the difficulty of getting the media and burn process for multi-layered media right.

Burn #6 – Ritek TL BD-R @ 2x Pioneer BDR-X13JBK

The Ritek disc also showed improvement at 2x and, to its credit, has some margin from the threshold. The burn would be considered average-to-good compared with the standards from the BD-R SL days, although I’d still usually aim for better.

Burn #7 – HiDisc TL BD-R @ 2x LG BH16NS55

To try to answer the question of “is it the Pioneer’s fault”, the Verbatim media was burned at 2x in the LG too. Surprisingly, the LG also showed similar elevated errors with L1 and L2, but didn’t provide as good of a burn on L0. I prefer the Pioneer’s result.

Burn #8 – Sony QL BD-R @ 2x Pioneer BDR-X13JBK

I have to confess – this is a mea culpa situation. While transferring discs between drives, I had an accident which resulted in the disc dropping to the ground and getting covered in dust. While picking it up, I managed to get some finger oil on it. In my haste to clean it with a microfibre cloth, I left some very fine scratches on it … along with lots of micro-microfibre fragments which just wouldn’t let go of the disc. This is an unfortunate fact with Blu-ray – as soon as the disc gets any speck of dirt on it, it’s impossible to get a spike-free scan ever again!

But if we look “through” the spikes to the baseline – the burn is absolutely excellent. The results for L2 and L3 are especially excellent while L0 and L1 are pretty good even by BD-R SL standards, but with some variations suggesting there were some error peaks which reached into the ~100 BIS error rate region.

Burn #9 – Sony TL BD-RE @ 2x Pioneer BDR-X13JBK

Well, unfortunately, this burn couldn’t be scanned simply because it no longer exists – the BD-RE was re-written …

Burn #10 – Sony TL BD-RE @ 2x LG BH16NS55

The Samsung thought the BD-REs were absolutely hopeless, and the LG also seems to suggest the error rates are practically right at the threshold. It’s not a great burn graph, but it will still read.

Burn #11 (NEW) – Sony QL BD-R @ 2x LG BH16NS55

Seeing that the Sony discs were so good and despising the fact I had so many spikes in the previous scans, I burned one on the LG BH16NS55 to see what it was like at 2x. My final disc gave this result, showing very good values on L0, L2 and L3. The exception, L1, showed elevated errors at the layer transition point, but still with quite a bit of margin to the threshold. I would consider this an excellent BDXL burn – one to aspire to, but I do feel that the Pioneer drive still has a slight edge in terms of quality.

Burn #12 (NEW) – Sony TL BD-RE @ 2x LG BH16NS55 (again)

Curious to see what the BD-RE would look like after repeated rewrites, here’s another scan after a full format and overwrite. Interestingly, error rates went down with layer count in this instance, although L0 is very much “at threshold”.

Burn #13 (NEW) – Sony TL BD-RE @ 2x LG BH16NS55 (again, with defect management)

Formatting with defect management enabled (preferred size), you can see the disc capacity has been reduced. It seems the error rate remains stable with overwrites – perhaps the first overwrite, going from the Pioneer to the LG, contributed to the difference in error characteristics seen in Burn #10.

Conclusion

I said I wouldn’t, but in the end, I did buy myself another LG BH16NS55 Blu-ray writer and I don’t regret it one bit. It was a brand new drive, albeit seemingly lacking in quality control, running firmware V1.05 which the community had already decrypted/MK’d/LibreDrive’d. Combining this with tools from the MakeMKV community and their version of the Vinpower Digital WH16NS58 firmware, it was possible to enable scanning functionality on an otherwise non-scanning drive.

The results are rather interesting, pointing to some media having very layer-specific tendencies, suggesting something to do with variations in layer thickness, spacer layer thickness and disc geometry. On the whole, the slower burns (2x) fared better than the faster (6x) burns, although even at the 2x speed, there were regions of many discs which got close to or even exceeded the threshold of acceptability.
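To make “threshold of acceptability” a little more concrete, here’s a rough illustrative sketch of how I’d bucket a scan result. The thresholds are hand-wavy figures from my own scanning experience (discussed further in the comments below), definitely not anything from the BD specification:

```python
def judge_bd_scan(ldc_avg: float, bis_max: float) -> str:
    """Rough quality buckets for a BD error scan. The thresholds are
    hand-wavy figures from my own experience, not any official standard."""
    if ldc_avg <= 13 and bis_max <= 9:
        return "excellent"   # the strictest figures I've seen quoted
    if ldc_avg <= 50 and bis_max <= 15:
        return "good"        # a particularly good burn in practice
    if ldc_avg <= 200:
        return "acceptable"  # bulk LDC under ~200 is where I'm comfortable
    return "marginal"        # little margin left - consider re-burning

print(judge_bd_scan(10, 5), judge_bd_scan(40, 12), judge_bd_scan(150, 30))
# excellent good acceptable
```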

For the Verbatim BD-R TL media, it seems the greatest challenges occurred for L1 and L2 which showed elevated error rates compared to L0, especially at the transitions. For Ritek BD-R TL media, it seems that error rates increased with layer number.

The main exception to this seems to be the Sony BD-R QL media, which had generally better results on L2 and L3 than on L0 and L1, but overall the media was excellent, with the best burn quality (looking through the spikes, many of which were self-imposed). Considering it was the most expensive media per disc, and the type with the fewest options on the market, perhaps such good results should be expected. But knowing this now, I feel very sorry that the Sony plant is closing down and that I didn’t take the opportunity to grab a few more discs while I was in Japan. They truly are an engineering marvel!

The Sony BD-RE TL media suggests that rewritable BDXLs really do “live at the edge” of what is possible, having error rates at the threshold level especially at L0. My disc seemed to show the reverse of the BD-R TL media, having error rates that reduce with increasing layer number. Regardless, a high error rate is not a desirable situation as it leaves little margin for error and may contribute to slower readouts. However, the tests were made on a disc with at least a few rewrites due to testing, so perhaps a “virgin” burn would produce better results.

How long would the recordings on such discs last? It’s hard to say. While Blu-ray recordables now use inorganic recording layers exclusively, as LTH technology has practically been abandoned, my earliest Ritek (inorganic, 2x media) failures were very sudden and very catastrophic. I haven’t encountered such failures later in my Blu-ray burning career, but my use of Blu-ray declined significantly after ~2015. Aside from initial material quality and burn quality, storage has a big impact on lifetime as well. Conventional wisdom suggests that the lower the initial error rate, the more margin the error correction code has to “fix” any degradation that may occur, helping the disc last longer and giving the user the opportunity to “catch” a degrading disc before data is completely lost. But this assumes the recordings degrade at all – if the recordings are very stable (or are kept in excellent storage conditions), then even discs with high initial error rates, as long as they are readable to begin with, could remain readable into the future in spite of the handicap. It’s definitely not a simple thing to know for sure, and you will still need a BDXL-capable drive to read the discs in the future.

But at least, I can say that I’m happy to have an MT1959-based drive (BH16NS55) and an RS9400/ID15-based drive (BDR-X13JBK) that I can trust, alongside an MT1939/1956-based drive (SE-506CB) that I can’t.

About lui_gough

I'm a bit of a nut for electronics, computing, photography, radio, satellite and other technical hobbies. Click for more about me!
This entry was posted in Computing. Bookmark the permalink.

8 Responses to Experimenting with BDXL – Part 4: Scanning BDXL Discs feat. LG BH16NS55

  1. Sphyblk22 says:

    Great series on BDXL, really enjoyed reading it! It also helped me a lot during my own little endeavor into BDXL drives and discs.
    As it turns out, Vinpower wasn’t the only vendor on the MT1959 platform to enable quality scanning. Amethystum drives can also do that. Currently I’m working on getting a firmware dump of this drive, as it has a normal tray length and supports writing to “vendor-locked” Amethystum discs. I’ll post updates on the MakeMKV forum. (https://forum.makemkv.com/forum/viewtopic.php?f=16&t=36135)

    I’m also curious whether you would like to try Archival Discs (AD) one day. I would consider these the true “last format” of optical discs as they’re also based on BD technology, but reached 300GB and 500GB per disc. They were co-developed by Panasonic and Sony, and were used in Panasonic’s freeze-ray systems as well as Sony’s Optical Disc Archive (Gen 2/3).

    • lui_gough says:

      Thanks for the comment and for letting me know that there’s another one out there! The encrypted, secure nature of the newer drives is a pain … I still remember the days when firmware was stored on external flash chips, so a dump was never too hard even if the chipset wasn’t helping :). I’ll be interested to hear if you manage to get the firmware, but the Vinpower MK firmware is definitely much better than having nothing.

      I’ve never encountered an Amethystum-made disc in the wild … nor have I heard of this vendor-lock regarding layer switching at all. Definitely interesting to hear about – but probably like the BeAll 4.85GB DVD-R discs, will be an oddity that will likely be forgotten otherwise.

      While I intend to cover some more random optical-disc-related technology in the coming year, I’d have to say that prices for equipment and consumables mean that it just isn’t feasible to spend the money just to chase obsolete/outdated technology for the curiosity factor. For example, while I am aware of Professional Disc (PD) which is a “relative” of Blu-ray, I doubt I’ll ever have the chance to handle the equipment, even if it’s a cousin with a cartridge (and arguably closer to what Blu-ray was initially envisaged to be).

      As for Archival Discs – I’ve only seen mention of it in passing, but I’ve not heard of them being used in any serious way. After all, TDK did claim to reach 1TB on a disc (https://www.engadget.com/2010-10-11-tdk-develops-1tb-optical-disc-leaves-other-optical-storage-feel.html?_fsig=GN1ZC70_IM3.nTRqRVAvug–%7EA) in a 16L x 32GB x 2-sides arrangement. To me, this technology feels like “vapourware” as it’s so “hidden” and likely to be used only in very niche applications by companies with very deep pockets. You may be right to consider that to be the “last” format, but to me, that’s just a technicality. At that scale, idle-disks or tape libraries would still be a potentially more attractive alternative for those applications. For me, BDXL may be the last (consumer) format I can afford to try … :).

      – Gough

      • Vincent O. says:

        Archival Discs have basically failed to catch on as an alternative to LTO tapes. The technology seems really impressive, but used drives from Sony’s ODA lineup wind up on eBay costing several thousand dollars, no matter which generation they’re from. It’s a shame, since the rewritable 1.5 TiB cartridge would be the perfect solution for my backups.

  2. It actually makes sense that BD-RE errors go down with increasing layer number. Take a four-layer disc. Layer 0 is actually the innermost layer, 1 and 2 are the middle layers, and layer 3 is the bottom surface layer. This means that to write to layer 0, the laser has to penetrate all of the outer layers. The beam is unfocused where it crosses them, of course, but it still has some effect on the other layers and introduces some noise. This is still the case with BD-RE, but while layers 1 and 2 get noise added to them during the layer 0 write, when the layer 1 write comes along, the laser “erases” those effects. When you write a BD-RE, it doesn’t just write the pits on an otherwise “blank” disc – it resets the lands too. So any effect the laser had on layer 1 during the layer 0 write is wiped clean during the layer 1 write. Thus, with BD-RE, you only see the effect of increasingly accurate writes on subsequent layers, since the laser penetrates fewer outer layers to get there.

    • lui_gough says:

      Thanks – I suppose that might make sense in a way.

      But as far as I understand, the layer thicknesses were chosen to minimise inter-layer interference. So L1/L2 performance shouldn’t be affected significantly by writing L0, due to the thresholding effect of needing a certain level of laser power to “burn” the layer in an ordinary BD-R, as it is the same laser that reads and writes. If this weren’t the case, then there would be quite a problem on TDK’s 10-layer prototype disc, and the act of reading the disc itself would be potentially quite destructive (at low speeds, write power is ~5mW and read power is ~0.35mW). In the case of writes, passing through the other layers is done below-threshold and most likely just results in slightly reduced laser power at the target layer, which is compensated out by the write strategy and OPC calibration. Otherwise, I’d be surprised that Sony QL BD-R discs even work, as L3 would be hit with three “noise-inducing” passes before actually receiving its burn, while the error scans seem to suggest lower error rates on L2 and L3.

      For BD-RE, as it’s phase-change, you are right to say its direct-overwrite property means the write lays down the full data signal in one pass, setting it to whatever you might want in an ideal world. But in practice, laser power, focus and tracking errors will still result in some noise contribution from previous recordings, while some postulate that material annealing from being exercised improves performance over time. Whether this explains the difference in error rates is hard to say – I would still expect the defocused write laser not to have enough power to enact permanent changes in the other layers during writes.

      The other issue is even if you are focused on L3, there are reflections that will land into L1 and L0 too due to the construction of the disc and semi-transparent nature of the layers, so it’s not like being focused on an outer layer spares inner layers from laser exposure entirely, albeit the reflected power may be less than the pass-through power.

      Readback inter-layer interference should be common to both types of disc but mitigating it effectively requires good layer spacer thickness control. Manufacturing deviations and tracking errors as a result would spoil both writing and reading.

      – Gough

  3. Will Ng says:

    Hi Dr Gough,

    I recently came across this post and, fortuitously enough, also purchased a similar LG drive that I managed to cross-flash to the same firmware as featured. It is currently scanning away at my previously-burned Verbatim BD-R TL quite happily in Linux, using speed47’s fork of qpxtool.

    However, I am wondering what you would consider as ‘acceptable’ for the LDC and BIS avg measurements for BDXL media. The only reference I could find was a Reddit post that mentions both being less than 15, but I gather those were for the SL&DL media, not for 3/4-layer XL.

    Cheers, Will

    • lui_gough says:

      That’s a good question, and based on the fact I’ve seen absolutely horrible scans still show up readable (including fairly thick spikes), my answer will always be “you want it as low as possible” as lower is going to be better.

      As for “acceptable”, it is an interesting concept. If the disc is readable, technically, it can be considered “acceptable” now. But since we don’t know what the degradation rate is like, we don’t know how much margin we have to failure. My informal experiments with scanning suggest that BDXL discs naturally have higher error rates as they push the optical margins to the extreme, but also, they appear to use the same error correction schemes as ordinary Blu-ray (as far as I’m aware) which suggest to me that the “acceptable” thresholds shouldn’t really change if we’re being just as stringent about maintaining equivalent margin.

      I would be generally more comfortable with discs where the bulk LDC is under 200 excluding individual spikes and the BIS is preferably under 10. However, I believe others have indicated, at least for ordinary Blu-ray, an average LDC of 13 or less and a maximum BIS of 9 or less (others 15 or less). The former is probably a bit too stringent – only my Sony QL could possibly achieve that, so perhaps you should aim for something more like an average LDC of 30-50 instead as a particularly good burn.

      Perhaps the reason BDXL disc error rates are a bit higher also has to do with manufacturing difficulties for multi-layered discs. If you see the images of light shining through the discs in my Amethystum BDXL disc review, I also show some images of Verbatim/rebadges and Ritek and it seems the internal layers have some patterning and potential holes which may serve to increase error rates. Single-layered discs have no such manufacturing complications to worry about.

      FYI – all comments are manually moderated in Sydney/Australia time zone, so don’t be surprised if it doesn’t appear immediately.

      – Gough

      • Will Ng says:

        Hi Gough,

        Ah, that figures… Thankfully, I held back on sending you an email.

        I guess I shall have to use my current measurements as a reference for the future, to see if there is a downward trend over the years. They are certainly perfectly readable now.

        My disc’s LDC avg is around the ballpark of your measurements as featured, though the BIS max is above 100. But it reads fine, so there’s that.

        Cheers, Will
