We’ve all heard stories about the antiquated and sluggish legacy system that blocks your passage into the slick, responsive technological Camelot. As we race to modernize these legacy applications – systems that have been the virtual circulatory system of the organization – we often lose focus on the very lifeblood pumping through them – the legacy data.
That may sound obvious, or worse, patronizing. For the love of all things normalized, you didn’t just buy a sailboat and call yourself Larry Ellison. You know data migration is part of every large-scale legacy modernization project. But it’s the application that is shiny and new. Moving old data never added any value to your organization, right?
Respect the Data
All too often, even though we know it needs to get done, we don’t give enough respect to delivering trustworthy data to your shiny new application. And, let’s face it, data migrations are messy, time-consuming and boring.
Well boring may be going too far. Let’s replace boring with tedious. But let’s also add risky.
Unfortunately, not giving data the proper respect in large-scale modernization efforts adds unnecessary risk to the project. Don’t take my word for it. According to the 2007 Bloor Research report on Data Migration, 84% of all data migration projects were either running late, over budget or both. And as data migration goes, so goes the success of the larger project.
Bloor featured common traits of data migration success stories (psst…they mean best practices). Along with using proper tools and engaging the business side earlier, successful data migration is dependent on an unflinching focus on:
- Using a tried and tested methodology
- Separating data migration into its own project with its own budget
- Teaming internal institutional knowledge with external expertise
In other words, successful data migration can and should be treated as a separate project, not merely as a necessary evil of new system implementation.
It’s all in the execution
Some recent large-scale State of California IT projects have struggled with data migration, resulting in project overruns, delays and destructive production “glitches.” Even worse, failed data migration has forced important projects to be canceled, wasting the time and money already invested. None of us means that when we say we want to leave behind a legacy!
State IT staffs know the internal data better than anyone, but executing quality data analysis, cleansing, mapping, governance and migration requires specific experience, tools, processes and discipline. These are critical. The struggling projects above could have used a solid set of methodologies, separate data project controls and the appropriate expertise to better deliver quality data migration. Teaming these internal subject matter experts with an autonomous, focused and experienced data migration team that engages the business side of the project provides the lowest-risk approach – and the best results.
Choose your team
Many make the mistake of thinking that the vendor who will implement the new system understands data migration as well as the application they are implementing.
Generally, this is not the case – they were hired for their experience with the new application, and data migration may not be part of that knowledge base. While they will focus on the design, build and implementation of the shiny new application, they do not always have the focus or the expertise necessary to give the data the respect it needs for a smooth transition to the new system. Far too often, the plan for data migration is a mere detail in the larger modernization effort. Their passion is for the new system – your data deserves the respect of a team that focuses on it, and it alone.
We hope that as departments plan to move those antiquated legacy systems to Camelot, they consider building separately funded projects specifically focused on delivering quality data into that new world.
This is the best way to make sure that the investment in the modernization effort delivers the expected benefit, to ensure that the only legacy that remains is the legacy of a highly successful project.