Copyright OSTA 2001
All rights reserved.


Is it necessary to verify a CD-R or CD-RW disc after recording?
Verifying discs after writing helps maintain an appropriate quality level. How much ongoing integrity checking and data verification is necessary is really a question of acceptable risk for any particular application. For example, letting recording software compare the data immediately after writing is usually sufficient in casual settings, but critical data archiving and large-scale duplication may call for more comprehensive testing. This is because of the differences that often exist among recorders, drives and players. Recorders, for example, typically incorporate higher-quality optical systems and lenses with slightly larger numerical apertures than reading devices do. Consequently, successfully verifying a written disc on a recorder does not guarantee broad playback compatibility, especially when disc jitter is marginal.

How can the quality of a written CD-R or CD-RW disc be assessed?
Several methods can be used to assess the quality of a written disc. These include measuring its optical signals, examining the integrity of its physical and logical formats, performing interchange testing and conducting data verification. Each method is a piece of the quality testing puzzle. The extent to which a disc needs to be tested depends, of course, upon the imperatives of the application.

At a basic level, it is possible to confirm that information has been correctly written to a disc by comparing it against the source material using the verification features found in many off-the-shelf writing software packages. When somewhat more detailed analysis is warranted, interchange testing can provide a practical indication of real-world compatibility. To accomplish this, audio CDs are played back in a number of consumer audio players to check for quality issues, while data discs are read in a variety of CD-ROM and DVD-ROM drives to make sure that the recorded information is completely recoverable at the manufacturers' rated speeds. Specialized computer software controlling everyday CD-ROM drives can also read a disc at a lower level of organization to verify that its physical and logical formats conform to industry specifications.
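A simple form of the basic verification described above can be performed with ordinary tools: hash each source file and compare it against the copy read back from the mounted disc. The sketch below is illustrative only, not taken from any particular writing package; it assumes the finished disc is already mounted at some path (for example `/mnt/cdrom` on a Unix-like system).

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks so large files do not exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copy(source_dir, disc_dir):
    """Compare every file under source_dir against its copy under disc_dir.

    Returns a list of relative paths that are missing from the disc
    or whose contents differ from the source material.
    """
    source_dir, disc_dir = Path(source_dir), Path(disc_dir)
    mismatches = []
    for src in sorted(p for p in source_dir.rglob("*") if p.is_file()):
        rel = src.relative_to(source_dir)
        copy = disc_dir / rel
        if not copy.is_file() or sha256_of(src) != sha256_of(copy):
            mismatches.append(str(rel))
    return mismatches
```

For instance, `verify_copy("/home/user/masters", "/mnt/cdrom")` returns an empty list when every file on the disc matches its source. Note that such a comparison, like the built-in verification features it mirrors, only confirms readability on the drive performing the check; it says nothing about compatibility with other readers.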

For situations that require appraising more fundamental physical characteristics, a number of commercial analysis tools are available to examine the optical signal characteristics of a recorded disc and so identify low-level errors. Typically, these devices are standalone or computer-attached and employ CD audio or CD-ROM drives specially modified to measure various disc parameters and produce descriptive reports. As in testing generally, results can vary significantly among inspection systems, so discs should always be evaluated on the same equipment to maintain continuity. Commercial CD testing companies offering quality verification services with such devices are also widely available.

An enduring question in compact disc testing is the uncertain relationship between results obtained by evaluating discs on low-level analyzers and real-world disc performance in the installed population of reading and playing devices. Over the years, a succession of groups and companies have labored to reconcile these two product classes using various multi-point calibration discs and other vehicles. Given the extremely rapid technological evolution of reading and playing devices, however, it has proved impossible to conclusively establish any definitive link between measured and actual performance, especially for marginal discs.

When assessing disc quality, keep in mind the huge number of variables involved: the discs themselves, with their different types, batches and manufacturers; recording software and hardware in their many varieties and versions; the diverse recording conditions encountered; the different test equipment employed; operators of differing experience; and even the handling of the discs. Consequently, judgements should be made on a relative rather than an absolute basis.