Ken Wood

IEEE 2013 Massive Storage Conference Revisited

Blog Post created by Ken Wood on Jun 21, 2013

In May, I chaired a revival session on Optical Storage Technologies at the 2013 IEEE Massive Storage Conference held at the Queen Mary Hotel in Long Beach, CA (complete with fire drill and temporary ship evacuation to give it that real Carnival feel). The session was designed to re-introduce optical storage as a long-term data preservation technology with a proven history of media stability and backwards compatibility spanning over 30 years.


The session agenda, entitled "Media I (Optical Media and Libraries)", was listed as:


Media I (Optical Media and Libraries)

Chair: Ken Wood, Hitachi

Optical Media Technical Roadmap: The Revival of Optical Storage (Presentation)

Ken Wood, Hitachi (Bio)

Abstract: Optical storage has been seeing a resurgence in many industry verticals for its unique preservation and environmental qualities. Recent developments have increased capacities and functionality while maintaining decades of backwards compatibility. This is due to the wide range of industries and markets that support this medium.

Optical Library System with Extended Error-Correction Coding for Long-Term Preservation (Presentation)

Akinobu Watanabe, Hitachi (Bio)

Abstract: Hitachi has developed an archive system with long-term preservation capability, storing data on optical disks. Extended parity mounting technology improves durability against scratches while maintaining compatibility with optical disk specifications.

Achieving 1000-year Data Persistence: "Engraved in Stone" (Presentation)

Doug Hansen, MDisc

Abstract: Proper choices in materials coupled with the flexibility of Optical Data Storage hardware enable the implementation of truly persistent digital data on a DVD or Blu-ray disc. Recently completed accelerated lifetime studies conducted in accordance with the ISO 10995 test standard demonstrate that a lifetime on the order of 1,000 years is achievable in a mass-market-priced product.


The subject of optical storage technologies hadn't been part of the conference's agenda for several years, so when program chair @Matt O'Keefe asked me to put together a session on optical storage technologies, I was a bit hesitant, yet eager to try to put something interesting together. Besides myself, I was able to recruit Akinobu Watanabe, Senior Researcher at Hitachi's Yokohama Research Lab (YRL), and Doug Hansen, Ph.D., from Hitachi partner Millenniata (MDisc), to discuss other advanced optical technologies in drives, media and library systems.


Long story short, this session was very successful. In opening the session, I asked the audience to hold all questions until the end, at which point we held a three-person panel. The panel discussion went on and on, into the break, and then some. The panelists and the crowd around us were eventually asked to leave the stage so that the next session could begin (this is one of those problems you want at a conference).


While there were a couple of "phooey" and "poppycock" questions and statements from some of the tape guys, the overall consensus was very positive and the interest in learning more was overwhelming. I am still working through action items and follow-on tasks, which should result in collaborations with some high-profile customers using futuristic technologies.


With the continuing explosion in data growth, more and more customers are questioning, and experiencing the limits of, the old practice of media migration when it comes to their archives and data that falls into the classification of long-term data preservation. This data is deemed, by choice, important enough to be preserved for the "life of the company" or the "life of the republic," and thus is stored and used forever.


In a long-term data preservation strategy, data preservationists use a 3-2-2 approach: 3 copies of the data, 2 physically separate sites, and 2 recording/media technologies. Today, the 2 recording technologies are disk and tape, but there is some concern that these should count as a single technology, since both are magnetic and subject to the same vulnerabilities, like degaussing.
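The 3-2-2 rule is easy to express as a policy check. Here is a minimal sketch, assuming a hypothetical `Copy` record with site and technology labels (the names and the `satisfies_3_2_2` helper are illustrative, not part of any real preservation product):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Copy:
    site: str         # physical location of this copy
    technology: str   # recording technology, e.g. "disk", "tape", "optical"

def satisfies_3_2_2(copies):
    """True if the copies meet the 3-2-2 rule: at least 3 copies,
    spread over at least 2 sites and at least 2 technologies."""
    return (len(copies) >= 3
            and len({c.site for c in copies}) >= 2
            and len({c.technology for c in copies}) >= 2)

copies = [
    Copy("site-east", "disk"),
    Copy("site-east", "tape"),
    Copy("site-west", "optical"),
]
print(satisfies_3_2_2(copies))  # True
```

Note that under the "magnetic counts as one technology" concern raised above, a disk-plus-tape pair would collapse to a single technology label and this check would then fail, which is exactly the argument for adding optical as the second medium.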


Also, media migration for massive amounts of data can and does result in the "Painting the Golden Gate Bridge" problem: you start painting (read: migrating data) from one end of the bridge, and before you finish you have to start over, so the task never completes. Breaking this cycle by elongating the technology refresh cycle is one way customers are looking to tackle this problem.
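The bridge problem reduces to simple arithmetic: a migration pass must cover the starting archive plus everything ingested while the pass runs, so it only ever finishes when migration throughput outpaces growth. A back-of-envelope sketch (the function and all the numbers are hypothetical, for illustration only):

```python
def migration_finish_years(archive_tb, migrate_tb_per_year, growth_tb_per_year):
    """Years until one full migration pass completes, or None if it never does.
    Solves migrate * t = archive + growth * t for t."""
    if migrate_tb_per_year <= growth_tb_per_year:
        return None  # backlog grows at least as fast as you migrate
    return archive_tb / (migrate_tb_per_year - growth_tb_per_year)

# Hypothetical numbers: a 10 PB archive, migrating 2 PB/year, growing 1.5 PB/year
print(migration_finish_years(10_000, 2_000, 1_500))  # 20.0 years

# If growth matches or exceeds throughput, you are painting the bridge forever
print(migration_finish_years(10_000, 1_500, 1_500))  # None
```

The point of a longer-lived medium is that it lengthens the refresh interval, giving each migration pass more headroom before the next one comes due.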


If you have a chance, view the presentations from the conference. The above links in the session agenda should point you to the .pdf files of the materials we used. I'll also upload the material to the HDS Community soon for posterity. If you have any questions on this subject, please feel free to ping me.


I’m looking forward to reading your comments on this subject.