Understanding Software Migration, Part 2

3 June 2008 13:35 by lleon in General

As mentioned previously, the migration process has become an ally of every company attempting to revamp its software systems. It is imperative to define rules for measuring the throughput of that process, so that all the options the market offers for this purpose can be compared. But how can such rules be described when every single vendor employs proprietary technology that differs from one to the next?

After witnessing the whole process, the impressions left on the user’s mind will decide the judgment passed on a given migration tool and on how it performs; to make sure that judgment is fair, here are some concepts, ideas and guidelines about how the migration process should be done and, most importantly, how it should be measured.

 

·        Time:

Human effort is precious; computer effort is cheap, disposable and reusable. An automated process can be repeated as many times as necessary, as long as its design allows the algorithms to accept all the possible input values. A migration can be done with straight one-on-one transformation rules, resulting in poorly mapped items that need small adjustments; but regardless of their size, those adjustments must be made by humans, so a few reckless rules can turn into hundreds of human hours spent fixing small issues. Remember, we are dealing with large enterprise software products, which means a single seemingly harmless imperfection can be replicated a million times. The other possible scenario is complex rules that search for patterns and structures in the source and generate equivalent patterns on the other side; but, as with many AI tasks, this may take a great deal of computer effort, because of the immense set of calculations needed to analyze the original constructions and synthesize new ones. For the sake of performance, the user must identify which resource is most valuable: the time people spend fixing the tool’s output, or the computer time a more sophisticated migration tool needs to generate more human-like code. The sketch below makes this trade-off concrete.
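As a hypothetical illustration (the class and method names are invented for this post and are not taken from any particular tool or vendor), compare the kind of Java output a straight one-on-one rule would leave behind with the idiomatic version that either hours of manual cleanup or a more expensive pattern-aware rule would produce:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Vector;

public class MigrationTradeoff {

    // Hypothetical output of a straight one-on-one rule: it compiles, but the raw
    // Vector, the cast and the string concatenation inside the loop all need the
    // same small manual fix everywhere this pattern occurs in the code base.
    @SuppressWarnings({"rawtypes", "unchecked"})
    public static String naiveJoin(Vector names) {
        String result = "";
        for (int i = 0; i < names.size(); i++) {
            result = result + (String) names.get(i) + ";";
        }
        return result;
    }

    // What a pattern-aware rule (or hours of human cleanup) would emit instead:
    // the same behavior expressed with the target platform's idioms.
    public static String idiomaticJoin(List<String> names) {
        StringBuilder result = new StringBuilder();
        for (String name : names) {
            result.append(name).append(";");
        }
        return result.toString();
    }

    public static void main(String[] args) {
        List<String> names = Arrays.asList("orders", "invoices", "customers");
        System.out.println(naiveJoin(new Vector(names)));  // orders;invoices;customers;
        System.out.println(idiomaticJoin(names));          // same output, cleaner code
    }
}
```

Both methods return the same string; the difference is who pays for the second one, people fixing the first version by hand or the computer time of a tool smart enough to emit it directly.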

 

·        Translation equivalence:

Legacy applications were built using the code standards and conventions of their time; the patterns and strategies used in the past have evolved, some for good, others becoming obsolete. During an automated software migration there must be a way to adapt archaic techniques to newer ones; a simple one-on-one translation will reproduce the same input pattern, and the resulting source code will not take advantage of the new features of the target platform. A brilliant migration tool should detect legacy patterns, analyze their usage and look for a pattern in the target platform that behaves the same way. Because of the time considerations explained previously, a faster tool usually means shallow, superficial transformations that are a poor replica of the original code, or, in the best scenario, a code wrapper that hides the damage done. Functional equivalence is the key to a successful migration, because software migration is not only about getting the software running on the target platform; it is about adapting to a new set of capabilities and actually using them. The example after this section shows the difference.
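Again as a hypothetical sketch (the names are illustrative and the legacy idiom is deliberately simplified), here is what detecting and replacing a pattern, rather than copying it verbatim, can look like in Java:

```java
import java.util.Enumeration;
import java.util.HashMap;
import java.util.Hashtable;
import java.util.Map;

public class EquivalentPatterns {

    // Legacy-style iteration as a one-on-one rule would reproduce it: a synchronized
    // Hashtable walked with an Enumeration, exactly as the original code did.
    public static int countActiveLegacy(Hashtable<String, Boolean> flags) {
        int active = 0;
        for (Enumeration<String> keys = flags.keys(); keys.hasMoreElements(); ) {
            String key = keys.nextElement();
            if (Boolean.TRUE.equals(flags.get(key))) {
                active++;
            }
        }
        return active;
    }

    // The functionally equivalent pattern a pattern-aware tool could emit: the same
    // result, expressed with the target platform's Map interface and for-each loop.
    public static int countActiveModern(Map<String, Boolean> flags) {
        int active = 0;
        for (Map.Entry<String, Boolean> entry : flags.entrySet()) {
            if (Boolean.TRUE.equals(entry.getValue())) {
                active++;
            }
        }
        return active;
    }

    public static void main(String[] args) {
        Hashtable<String, Boolean> flags = new Hashtable<>();
        flags.put("billing", Boolean.TRUE);
        flags.put("reports", Boolean.FALSE);
        // Both versions return the same answer: that is the functional equivalence
        // the migration must preserve while adopting the new idiom.
        System.out.println(countActiveLegacy(flags));                 // 1
        System.out.println(countActiveModern(new HashMap<>(flags)));  // 1
    }
}
```

The point is not the specific collection classes; it is that the tool recognizes the whole iteration pattern and its intent, instead of translating it token by token and leaving the legacy construct alive on the new platform.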

 

With that in mind, a comparison between different tools can be clearer. Leaving aside the competitiveness of the market, readers should separate the facts from the misleading marketing slogans and appraise the resources to be spent during a migration. Saving a couple of days of computer time may cost hundreds of human hours, and in the end that will not cure a faulty core; it will just make it run.