BLOG

Welcome to the new Gizmox Transposition blog. We will be using this space to share perspectives from the Gizmox team, comment on industry happenings, discuss how to get the most out of Gizmox products, and generally check in. User feedback is greatly appreciated and we will do our best to respond promptly. Hope you enjoy…

Legacy Modernization is a moving target

A decade or two back, the legacy application was usually a mainframe application (written in COBOL, IDMS/ADSO or Natural ADABAS, for example) and the target was a Client/Server application written in Java, C++ or C#, designed to run on open systems.  These days, the legacy application may well be a Client/Server application written in one of the RAD systems popular in the nineties, such as PowerBuilder or Visual Basic, or in early versions of Java or .NET languages.

New source & target platforms, new challenges

There are some challenges shared by both modernization paths, but there are also unique features, challenges and opportunities in each of them.  The robust code transformation solutions all share a common technological approach.  Vendors have developed solutions that parse source code into an intermediate representation, typically some form of abstract syntax tree (AST).  An AST represents the original program and its logic, independent of the original source syntax.  Vendors can then use language grammars, similar to those used by compilers, to generate new source code in the desired target language.
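To make the AST idea concrete, here is a minimal, hypothetical sketch using Microsoft's open-source Roslyn compiler APIs (not any vendor's migration engine): source text is parsed into a tree, the tree is transformed, and new source is generated from it.

// Minimal, hypothetical sketch of the parse-transform-regenerate idea using the
// Roslyn compiler APIs; the trivial rename below stands in for the far richer
// re-architecting a real migration tool performs.
using System;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

class AstRoundTrip
{
    static void Main()
    {
        var source = "class LegacyCalc { int Add(int a, int b) { return a + b; } }";

        // 1. Parse the source text into an abstract syntax tree.
        SyntaxTree tree = CSharpSyntaxTree.ParseText(source);
        var root = (CompilationUnitSyntax)tree.GetRoot();

        // 2. Transform the tree independently of the original text layout.
        var oldClass = root.DescendantNodes().OfType<ClassDeclarationSyntax>().First();
        var newClass = oldClass.WithIdentifier(SyntaxFactory.Identifier("ModernCalc"));
        var newRoot = root.ReplaceNode(oldClass, newClass);

        // 3. Generate fresh source code from the transformed tree.
        Console.WriteLine(newRoot.NormalizeWhitespace().ToFullString());
    }
}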

Additional manual work has always been a challenge

A challenge in modernizing mainframe applications written in procedural languages such as COBOL, PL/1 or Natural to object-oriented languages such as Java or C# is that additional steps are needed after the migration tool transforms the syntax.  These additional steps, performed after the process and outside the tool, are required to re-engineer the migrated program to match the form of true OO programs.  Thus, restructuring a procedural program with shared working storage - a hallmark of procedural programs - into OO classes and methods requires additional effort, and may result in code of sub-optimal quality.
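As a purely hypothetical illustration (the COBOL-style names below are invented), restructuring shared working storage into a class can look like this: the global fields become private state, and the paragraphs that mutated them become methods.

// Hypothetical illustration only: one way shared "working storage" from a
// procedural program can be restructured into an object-oriented class.
// The COBOL-style names are invented for the example.
//
// Before (conceptually): global working-storage fields mutated by many paragraphs.
//   01 WS-CUSTOMER-BALANCE   PIC 9(7)V99.
//   01 WS-CREDIT-LIMIT       PIC 9(7)V99.
//
// After: the shared state becomes private fields, and the paragraphs that
// touched it become methods, so the data can only change through the class.
public class CustomerAccount
{
    private decimal _balance;               // was WS-CUSTOMER-BALANCE
    private readonly decimal _creditLimit;  // was WS-CREDIT-LIMIT

    public CustomerAccount(decimal openingBalance, decimal creditLimit)
    {
        _balance = openingBalance;
        _creditLimit = creditLimit;
    }

    // Was a paragraph that updated working storage in place.
    public bool TryCharge(decimal amount)
    {
        if (_balance + amount > _creditLimit)
            return false;                   // reject instead of setting a shared error flag
        _balance += amount;
        return true;
    }

    public decimal Balance => _balance;
}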

An end-to-end process, with machine-based code understanding

For migration efforts where the source application is based on an open-systems OO language and framework, it has become possible to enhance the AST-based approach in several ways, resulting in a more complete solution with better code quality and without additional steps.  For these source applications, advanced vendors are able to add semantic data and understanding to the AST, so that it becomes a much more holistic and complete representation of the application, its class structure, its data components and its UI.  With this holistic approach, as well as advances in virtual compilation, it is becoming possible to perform true continuous compilation of the application while it is being modified, and before the target is complete.  This has huge implications, as it enables an IDE driven by complete code understanding and the continuous compilation process.  The IDE experience can include errors, warnings and messages produced by the virtual compilation engine, continuously guiding the modernization expert and showing the gaps that remain before the project is complete.  This type of IDE in turn enables a modernization approach which is completely tool-based, highly automated, and also highly open to expert architect input, modification and improvement.
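For a rough sense of what continuous compilation feedback can look like, here is a small sketch using the Roslyn APIs; it is not the virtual compilation engine described above, only an illustration of compiling an in-progress codebase in memory and surfacing its errors and warnings.

// Small sketch of "continuous compilation" feedback using Roslyn; illustrative only.
// It compiles an in-progress codebase in memory and reports errors and warnings,
// the way an IDE can guide a modernization expert before the target is complete.
using System;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;

class ContinuousCompileDemo
{
    static void Main()
    {
        // Imagine this is the partially migrated code, still missing a method.
        var inProgress = @"
            class OrderService
            {
                public decimal Total(int orderId) { return LoadOrder(orderId).Amount; }
                // LoadOrder not migrated yet
            }";

        var compilation = CSharpCompilation.Create(
            "MigrationPreview",
            new[] { CSharpSyntaxTree.ParseText(inProgress) },
            new[] { MetadataReference.CreateFromFile(typeof(object).Assembly.Location) },
            new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));

        // Report the gaps exactly as an IDE would: errors and warnings per location.
        foreach (var d in compilation.GetDiagnostics()
                                     .Where(d => d.Severity >= DiagnosticSeverity.Warning))
        {
            Console.WriteLine($"{d.Severity} {d.Id} at {d.Location.GetLineSpan().StartLinePosition}: {d.GetMessage()}");
        }
    }
}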

An automatic path for VB6, PowerBuilder or .NET systems to the Web/Cloud

The bottom line is that it is now possible to modernize with confidence large and complex desktop Client/Server applications based on Visual Basic, PowerBuilder, Java and .NET, and possibly other similar environments, to the web or cloud, with a mobile experience baked in.  The resulting tool-generated application will have functional equivalence with the legacy system, and will be expertly architected according to the best practices of the target platform and language.  The reduced need for additional steps (which were required for procedural program modernization and were usually manual) makes the process easily repeatable and leaves significantly fewer issues in the target application.

Case in point: VB6, MS Access & .NET applications at AIG

Gizmox Transposition is working with AIG on migrating core business applications from various legacy platforms to ASP.NET MVC web applications.  The main challenge AIG was facing was moving to the web while retaining its IP, i.e. the significant investment in business rules and logic.  By using advanced patented code understanding and unique continuous compilation technologies, Gizmox Transposition is able to re-architect this diverse legacy inventory into a set of well-written, maintainable applications that are ready for further development and adhere to industry-standard best practices on AIG's platform of choice.

Gizmox Transposition's modernization technology is a prime example of what quality migration technology now makes possible: enterprises can move their core applications from the client/server world to high-quality web systems while retaining the decades-long investment in their business differentiators.

Are you also concerned with the efforts and risks of moving desktop applications to the Cloud?

As someone who constantly interacts with decision makers in enterprises, I find myself answering the same common concerns.


The people making decisions on improving IT ROI or providing a new user experience are mostly concerned with the effort and risk of upgrading existing applications to the cloud.

I would like to share with you a new infographic article I wrote recently in an effort to answer those concerns. I think you will find it valuable. You may read or download it HERE.

Happy to provide more help and information.

 

Thank you

PJ O'ryan
 

Was it Predestined?

bbv : about the WannaCry cyber-attack: "The software exploits a security flaw in Windows XP, and once it infects a computer, it encrypts the files and spreads to other computers..."
By PJ O'ryan


Breaking news, then spoken and written opinions, and finally lengthy articles have hit us since the recent May 2017 WannaCry cyber-attack.  Most of them described the attack and its effects in terms of a disaster. Europol Chief Rob Wainwright said more than 200,000 victims had been hit in more than 150 countries. It is the largest ransomware attack observed to date, and its effects are yet to be fully summarized…
The inevitable question is: was it avoidable?

If you ask Microsoft, the blog post by Brad Smith, Microsoft's president and chief legal officer, would most probably be its answer: "there is simply no way for customers to protect themselves against threats ( N.P: like WannaCry? ) unless they update their systems. Otherwise they're literally fighting the problems of the present with tools from the past…"

Well, you are probably asking yourself, do organizations not know that? And if they do, how come they still use OSs like XP?


Maybe only a few organizations still use XP?

Figures from Netmarketshare suggest otherwise: XP still holds a share of almost 10% of all desktop OSs. So it is not a minor matter of a few companies falling behind on their IT workload; hundreds if not thousands of organizations out there still run XP.


Or maybe organizations are rushing to update their systems but we are not seeing it? 

Well, I found some up-to-date statistics at Netmarketshare. This time I looked at the pace at which organizations are moving off their XP systems, if at all.

The "at all" seems to be the real story. No, organization do not really seem to be rushing, if "at all" to upgrade away from XP. 



So why are organizations not getting off unsupported software?

Whatever the reason may be, as a vendor focused on upgrading applications that run on those OSs, we talk to companies. They seem to be concerned with many aspects of the modernization issue, one of which is whether applications written with platforms such as VB6, VB.NET, Classic ASP or PowerBuilder will still run as they used to on the patched or updated OSs. Could these concerns be the reason for the inaction? This article claims they may be: it talks about vendors reluctant to test and, if necessary, update their applications to run on the updates and patches.

But is there a good solution?

We at GizmoxTS have been working for 5 years to build a good solution.

The ideal solution would be able to fully analyze and understand the source application and be automated enough to be efficient compared to a manual rewrite.  It should, however, be customizable and able to deliver high-quality code.  And finally, the solution must allow an early and accurate assessment of the effort required to migrate an application from a specific source platform to a specific target.

The required migration solution would be able to upgrade a monolithic or client/server application written with languages or platforms such as VB6 or PowerBuilder to modern platforms such as the web or cloud, with mobile accessibility, and do it quickly, effectively and with a high ROI.  It should also be able to handle enterprise-grade, complex applications efficiently.

This is exactly the solution we have been building here at GizmoxTS.

The early versions of this solution have been tested with F500 applications, applications from large and small ISVs, and government and armed forces' applications, and have proven powerful enough to handle millions of lines of code while delivering customized, high-quality applications.







GizmoxTS to Modernize the Computerized Personal Medical Record (CPR) at the IDF

According to Pesah Galon, the company’s CEO, “the importance of the project is in retaining knowledge and business rules accumulated over many years, while adapting the system to advanced technologies and to continued improvement in a standard development environment.” The project is valued at over 3 million USD.
 

GizmoxTS is to modernize the Computerized Personal Medical Record (CPR) at the medical corps of the IDF (Israel Defense Forces), in a project valued at more than 3 million USD.

The Israeli Ministry of Defense has selected GalilCS, a GizmoxTS sister company from Kibbutz Shamir, to perform the project.

The project commenced a month ago, with a first phase that will last 18 months, during which the system will be modernized and a new architecture will be built for it, going from a client/server configuration to Microsoft-based web technology, with no functional changes.  After going live with the new version during 2018, the project will continue for two more years, during which new features will be added and the system will be maintained by GizmoxTS and GalilCS.

The IDF medical corps developed the computerized system more than a decade ago, in 2003, centralizing soldiers' medical information and making its management more efficient.  The system replaced the personal cardboard file with a computerized record, available to the different entities caring for the soldiers – the unit doctor, clinic medical staff, specialists, nurses and others.

Creating a complete medical picture

The CPR was developed with the goal of enabling the different medical entities caring for the soldier to work on a shared platform, with the medical information accumulated in any medical encounter available to all other caregivers, creating a complete medical picture.  The system has thousands of users in the IDF and outside of it – doctors, nurses, military paramedics – and it interfaces with civilian systems.  But as the years passed, problems grew in the system, and its technology and architecture became outdated.

GizmoxTS has technology (based on several software patents) that allows organizations to move older client/server enterprise-scale applications to web, cloud and mobile environments.  The technology, developed and owned by GizmoxTS, enables a fast migration to new platforms, based on smart algorithms which fully 'understand' all of the original system's components and automatically map and replace them with components and software patterns compatible with the new environment.

The technology ensures the highest code quality and a system architecture based on the target platform’s best practices for web, cloud and mobile applications – while upgrading security mechanisms for protection against cyber-attacks and also significantly improving command and control capabilities.

Pesah Galon, the CEO of both GizmoxTS and GalilCS, said in an interview that "the importance of the project is in retaining knowledge and business rules accumulated over many years, while adapting the system to advanced technologies and to continued development in a standard development environment, according to industry-accepted best practices.  The GizmoxTS methodology preserves the accumulated knowledge and rules, while bringing the system up to speed in the advanced networked world."

Galon added that "the project's success will provide the opportunity to perform many additional projects for the IDF and the Ministry of Defense, and will leverage both companies' growth, as well as adding to the high-tech employment options of the diverse communities in northern Israel, where the GalilCS delivery center is located."

Not possible to add features and improvements

Last November, Israel's State Comptroller Joseph Shapira said in a report that there were complaints about the use of the CPR system, specifically that it was slow, riddled with faulty functionality, and that it was not possible to add features and improvements.  The State Comptroller added that the Medical Corps command should advance the computer systems relevant to medical examination in the IDF's recruitment centers.

A decision was taken in 2014 to create a new Computerized Medical Record system, at a cost of over 10 million USD.  This was supposed to be performed by the Enterprise Services division of HP Israel who were going to implement the ISH-MED system, written by Siemens, which was developed in conjunction with SAP.  Ultimately, the project was canceled and a decision was taken to modernize the existing system.




This article is a translation from the Israeli online magazine PC World

So what is the difference? How does the AUTOMATED-REWRITE approach deliver higher code quality?!

Higher than what? What is low-quality code?
Our vast experience with code migration and modernization has shown code quality to be the most challenging issue when discussing code migration automation. Most of the traditional auto-migrators we know of do not produce good enough code, and in most cases much manual work is required to fix the migrated code and make it production-ready. This is the common perception of auto-migrators, and it has been justly earned.

The fundamental shortfall of most auto-migrators is in the customizability and maintainability of the code they generate. Here are the three main challenges we have experienced with such code:

  • Helper libraries are generic functions which emulate the legacy language or framework in the new environment, so that the old business code operates with little change (basically, translation only) after the migration.  These libraries would not exist in an application written natively in the new environment, and they are possibly the most acute shortfall of auto-migrator-produced code.  They introduce proprietary code into the migrated solution and make maintaining and extending it an unnecessarily complex task (a short hypothetical contrast follows this list).
  • Impossible or too hard to customize – auto-migrators usually operate as black-box, command-line-driven processes.  All enterprise-scale applications need to adhere to specific UI standards and coding styles, and additionally interface with middleware, data silos and directory services.  These requirements and others make customization imperative, and if the auto-migrator-generated code cannot adhere to them it will not be fit for production without extensive additional work.
  • And last but not least, auto-migrator tools will usually only perform part of the migration automatically.  As described above, this is partly a result of poor and limited customization capabilities.  Additionally, because security features cannot be added as part of the simplistic auto-migration, penetration testing on the migrated application will likely point to multiple weaknesses that need to be addressed manually across the board.  These and other issues that must be handled manually, outside of the auto-migrator process, may prove to be some of the most complex tasks in the modernization project.
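Here is the promised hypothetical contrast; the "Vb6Compat" helper below is an invented stand-in for the kind of emulation library some migrators ship, not a real product.

// Hypothetical contrast only: "Vb6Compat" is an invented stand-in for the kind of
// helper/emulation library a simplistic migrator might generate and then require forever.
using System;

static class Vb6Compat
{
    // Emulates the VB6 Mid() function so translated code keeps its old shape.
    public static string Mid(string s, int start, int length) =>
        s.Substring(start - 1, Math.Min(length, s.Length - (start - 1)));
}

class HelperLibraryContrast
{
    static void Main()
    {
        var customerName = "Acme Industrial Supplies";

        // Translation-only output: still VB6-shaped, still tied to proprietary helpers.
        string viaHelper = Vb6Compat.Mid(customerName, 1, 10);

        // Re-architected output: the same intent written natively in .NET,
        // with nothing extra to maintain or license.
        string native = customerName.Substring(0, Math.Min(10, customerName.Length));

        Console.WriteLine($"{viaHelper} == {native}");
    }
}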

So, how is high-quality code defined?

If the above is the definition and the reality of auto-migrator-generated low-quality code, what do we mean when we promise higher code quality?

The answer is simple but far reaching.  We mean the code quality that you would get if you were to rewrite the code from scratch with the best coders and according to the highest coding standards.  No "foreign" code.  No helper library code.  No closed proprietary DLLs.  Customized to the needs of the customer, including an upgraded user interface which is fully adjusted to the target form factor (or is responsive to changing form factors).  High-quality code is well-structured code.  Easy to read and understand.  Short and efficient, with understandable comments.  Standard and native to the environment it targets.  No additional "generic-purpose" code, only the very code that the application uses.  No dead code.  Code that adheres to the strictest best practices of the vendor who produced the framework and platform being used.  Architected according to common best practices.  High-performing code.  Code that was machine-written accurately using one standard code-pattern process.

Sounds like utopia?  Here is the patented methodology that produces such high-quality code.
 
We read and understand the legacy code HOLISTICALLY

Source: GizmoxTS slide show

Based on unique algorithms for semantic code understanding, the legacy application is read and holistically represented internally.  Unlike with the traditional ‘line by line’ processors of yesterday, the holistic approach allows re-architecting to a new platform or framework, while preserving the semantics (business rules, data relationships and flow, UI element functionality).

One important benefit of holistically understanding the legacy application is that it removes the need to reverse engineer the legacy application’s business functionality, thus considerably reducing the cost, duration and especially the risk of the modernization effort.

The semantically explicit internal representation of the legacy application serves as the basis for the machine-driven process to automatically perform a large proportion of the re-architecting.  Through a process of virtual continuous compilation, the software guides the user, in most cases a software architect, through completing, refining and modifying the migrated application.

We are the only vendor in the world to include a complete, flexible IDE (Integrated Development Environment) in the Automated-Rewrite solution
Source: GizmoxTS slide show 

The IDE is similar to the Visual Studio IDE, but is powered by the semantic understanding and continuous compilation engines.  Virtual continuous compilation allows the user to be guided, through suggestions and comments, in the refactoring, mapping, or wider-ranging re-architecting they still need to perform.  Even more remarkably, it shows the results of those changes directly to the user, without the need to complete the migration, including error messages and warnings that further guide the user on the path to a well-written, well-architected, optimized and modernized application.

An important task performed in the IDE is adding standard or custom security, identification and authorization to the modernized application, so that these extremely important parts of a modern application are not added as an afterthought to the product.

 

We are the only vendor that offers PRODUCTION READY tool-based deliveries, without any post-migration customer effort required  
Source: GizmoxTS slide show


The application re-assembly and QA are integral parts of the GizmoxTS tool-based process.  We start with the source code of the legacy application and we deliver the ready-to-deploy code of a modern, secure and tested application on your chosen platform.  The end-to-end process has a very low requirement for client personnel during most phases and does not include a post migration modification phase – we deliver to production.



Here is a description of the complete process
Source: GizmoxTS slide show


Let us prove what we promise!

Yes, we know.  Being in the software business you have heard all manner of over-promising from vendors.  Most of them prove to be disappointing, so why "buy" our promises?

Do not "buy" them!  Ask us for a POC that proves our solution against your specific challenges. Judge the code quality we deliver, in all its aspects.  Evaluate other alternatives.  Ask your service provider to do a POC too.  Compare and decide.  Our track record shows that when we are able to present a POC and demonstrate our promises coming to life, the customer chooses the high-quality option – the Automated Rewrite by GizmoxTS.
 

Brochure engineering? Who is doing brochure engineering…?!

What has triggered this write-up?

It was there in my Outlook inbox with the promising headline 'application modernization' - a roundup of news and tips. It is a space I make my living in these days, so I opened it. Looked nice. Apparently, this was the fifth issue of a new publication I had not come across before.

I took a quick glance at around 20 headlines. Not bad, I said to myself, not bad; it seems focused on application modernization, and practical.  The publisher, a guy named David Johnson, no doubt knows the space and has done a good job of collecting up-to-date material that could be useful and save some time Googling for it.

And then one of the news items caught my eye. The headline was "Microservices tooling myths: The 3 lies tool vendors will tell you". Lies, for God's sake… who is lying, and why?!

I read the article. Allow me to be upfront: I am not going to write here about microservices. I basically agree with TechBeacon that microservices are an evolving concept which is still lacking in fundamental ways. Value and ROI are still unclear, let alone good methodology and tooling for implementation… obviously another buzzword that still has to mature before it can really be evaluated.

What I am going to write about is the notion that all vendors do brochure marketing all the time as a cover-up for promises they can't fulfill, and that potential customers should not listen to those promises, because by listening they encourage lying.

What…???

If that were the case, how the hell would a vendor (a startup, for that matter) present an innovative breakthrough? I strongly reject the statement and the sentiment it entails, and that is why I decided to write!

Who did you call a liar?

In a nutshell, the author seems to suggest that organizations should not spend time evaluating new vendors' solutions, because it might be a waste of time and an encouragement for vendors to lie! In other words, there could not be anything new under the sun; a been-there, seen-that, heard-it-all attitude is the way to go! In most cases you will probably hear another set of lies anyway, he says, so why listen? The author calls it 'brochure marketing', and he seems to suggest that all vendors are good at selling on paper, but when it comes to reality they do not deliver.

Well, if that is the case how will organizations get exposure to real innovation? 

Leaving difficult-to-track-down problems…

To quote some of what the author suggests with direct relevance to what I am going to write here: "The reality of every one of these “solutions” is that they will solve easy things well and moderate problems poorly, while failing to even grapple with the most challenging situations. And guess what: The preponderance of legacy applications falls into that latter category. Even worse, many times the solution says it has done some conversion but instead has done it imperfectly, leaving difficult-to-track-down problems lurking in the updated application."

Do not underestimate the professionals in an organization!!

Well, I think the author underestimates an organization's ability to define what it needs and to evaluate whether a vendor solution can address that need. In my experience, most of the organizations we speak with have done their homework; when shopping for solutions, they know exactly what they are looking for and whether a given vendor solution answers that need.

If anything, what they might be missing is proof that the vendor solution really delivers on its promises… or not. But that is as easy as asking for a proof of concept, isn't it?

Do not ever close your door on innovation, or you will wake up to a closed door that is too late to open…

From an innovative startup's point of view, I strongly reject the implied notion that innovation is brochure marketing and that an organization should not listen to us. I also reject the accusation that when it comes to legacy applications we leave difficult-to-track problems for the customer to struggle with. The opposite is the truth: the difficult-to-track problems are the challenges we chose to solve first! This is what innovation is all about, isn't it?

And to my point! Open your door to innovation, but be wise enough to ask your vendor to do a POC that covers all the hard-to-track challenges. If your vendor is doing brochure marketing, they will not be able to deliver on such a POC!

If, as Joel on software says "…the single worst strategic mistake that any software company can make" is rewriting code from scratch

How come companies still make such mistakes?!


15 years ago, Joel answered the questions that we ask ourselves today


It is considered to be one of the shrewdest observations on the dilemma of rewriting your code from scratch versus other options for updating your application. The dilemma is still as valid as it was 15 years ago.

I have read and quoted Joel Spolsky's blog post on code rewrites so many times in recent years that I can recite what Joel says almost word for word.

Recently, after gaining more experience with customers, it struck me again: Joel was on the mark with his observations, including many insights into the decision processes of software companies. I must say Joel was certainly able to answer the questions long before we knew what to ask.

So how come software companies still make the single worst strategic mistake they can make?  Joel provides the following answers.  

How bad can a wrong decision be?

Joel points out cases that speak for themselves – cases in which companies face the dilemma and, knowingly or not, decide to rewrite code despite past experience showing that such a decision might be outright suicide.

The question of why this happens is the million-dollar (or maybe multi-million-dollar) question. Some of the cases Joel points out in his blog post are:

Netscape, for example, decided to rewrite the code of its browser. By the time they realized what this really meant, it didn't matter anymore: Netscape had lost the battle and effectively ceased to exist as a viable competitor. It just faded out of the market.

Another is Borland's decision to rewrite Argo and, later, another product. The result was the same, and Borland's fate was very similar to that of Netscape.
Yet another example is none other than Microsoft, which, just a few inches from the chasm, made a U-turn and decided to abort its plan to rewrite Word for Windows (which still runs Visual Basic code…).

A fact: "it's harder to read code than write it"

I will not go into the details of Joel's blog post about the reasons, so I kindly urge you to read it yourselves.  I will just mention the high-level reasons and ask the following questions: are these reasons still valid today?  Aren't there better, newer data points enabling companies to make a much more educated decision on this dilemma and to avoid the risks that a wrong decision entails?

I believe the answer is positive. We now have new options which should be evaluated before taking such a critical decision.  

Joel says the main reason is that "it's harder to read code than write it." I believe he is right, and I should honestly say this is not news to any of us. It is true in other domains as well: it's easier, and in most cases much faster and more efficient, to fix your house, redecorate it and bring it up to date than to rebuild it from scratch. At the end of the day, the basic functionality of a house doesn't change, even if smart houses now make our lives easier. We still need good shelter and protection from the weather…

Is this still the case in recent years?     

Yes, it is. It is still harder to read code than to write it, but I believe companies like the one I work for have made very good progress in recent years toward changing that. I will elaborate shortly.

To re-validate Joel's basic argument about how difficult it is to read code, I can add to Joel's sample quite a few cases of my own from my recent years' experience with customers. Sad cases, in which companies decided to rewrite code, got stuck at the first stage of trying to read and understand the old code and its logic, and, after years of trying and with several million dollars out of pocket, started desperately looking for a savior… some after too many years and too many dollars, and some after less. They all ran into the wall of not being able to read and understand the code, let alone rewrite it. But is this a big surprise? I am not sure. It is a well-known fact that it is close to impossible, given all the archeological layers of code, the fact that in most cases the developers who wrote the code are no longer with the company, and the level of documentation they usually left behind.
I will not mention names that are under NDA. I will just say that in most cases I am talking about large, Fortune 500-level companies that run structured software development and maintenance processes. It didn't help them reach the point of needing a code update in any better a position. The unavailability of the original developers and the lack of proper documentation are perhaps common to all those companies.

A healthy programmer will always say: if I got a second chance to rewrite this code, I would write it much better. Is that really the case?

It looks like the claim of being able to rewrite better code the second time around is a software programmer's natural inclination. It has become a mantra! I have heard it so many times that I almost believe it myself… it is the automatic answer to almost any code issue raised during an application's life cycle.

Is it really the case?

Joel suggests it is not! In short, he says that the claim is generic and has been proven wrong time and again!

Is there really a good substitute for the huge investment of years and years of software maintenance and debugging? Obviously there isn't.

So is this the sad lesson that a company should consider before embarking on a rewrite project? Joel seems to suggest it is! A company should take into consideration that even if the miracle happens and it does overcome the roadblock of reading and understanding its legacy code, it shouldn’t expect shorter development time than that of the code it is now throwing away!  

What kind of investments are we talking about? Are 10-15 years of maintenance by a good number of competent programmers an exaggeration? Obviously not for many companies and applications. And that is on top of the original investment in the application's initial development.

The Standish Group has been putting numbers behind Joel's observations

This is what this software rewrite research and management group has been doing for years: putting together statistics to measure rewrite risks, successes and failures. See HERE; most of the Standish Group research papers are free. Their research consistently shows 70-80% of rewrite projects failing: projects either run well over time and budget or are terminated due to impassable roadblocks.

What are the new data points that companies should consider, then?

A strong enough pain usually attracts investment in finding solutions to it, primarily because it is a challenge, and talented programmers love the fame, and even more the fortune, that can come with solving such problems.

Either way there is a lot of brain power being invested in finding solutions to those challenges as we speak. Especially so when we talk about technology shifts that create disruption in the market, such as cloud and mobile and the digital transformation.

One of those solutions is the one GizmoxTS is introducing to the market. It automates the well-established processes of manual code rewriting and encapsulates those best practices in a software application. This solution offers holistic reading and understanding of legacy code and logic, which, as we know, is critical for any code re-architecting solution. It does this completely automatically, in the shape of a standalone, easy-to-operate, free downloadable wizard.

And once the legacy code is read and understood, the solution offers an integrated toolkit with all you need to re-architect an application. It includes code refactoring tools, mapping, breaking an application into layers, code preview and application re-assembly tools.

The proof is in the pudding - the concept has been delivering early value with large and complex enterprise-level applications. HERE are some more details.

I believe that given the challenges of manual rewrites along with the growing need to complete the digital transformation and leverage new cloud and mobile platforms, tools such as Gizmox's and others should be carefully evaluated before companies decide on manual rewrite projects.


University of Surrey in a Survey: Escaping Legacy removes a major roadblock to digital future

The survey is downloadable HERE



Recently I read a survey about application modernization conducted by Andy Nelson and Roger Camrass, senior board advisor and visiting professor at the University of Surrey. I find it highly interesting because, based on my own experience, it reflects what we hear from enterprises in our real-world engagements.

I would like to share my impressions of it, along with a short story from one of Gizmox's real-world F500 engagements, and explain why I think this survey paints an accurate picture of real-world motivations.

The title of the survey says legacy can be a roadblock to larger digital processes. I agree, and I think the following story is a good indicator of the way a typical real-world application modernization initiative evolves within an organization.

So, we have here an F500 company that has been going through a modernization process with the goal of making its main system and some peripheral supporting systems accessible to multiple distributed users via the web.

The accessibility and availability of the system, along with a modern Web / Mobile-friendly user interface, were the main requirements in terms of user experience. The source systems were multiple VB6 / Access systems that were written 15-20 years ago. The company decided to use MS Dynamics as its main target platform.

After more than 3 years and millions of dollars' worth of effort to extract and integrate the peripheral systems' functionality into Dynamics with Microsoft's help, the company reached the understanding that: A – the business logic of these systems could not be extracted manually; B – it would be too lengthy and risky to analyze the systems and write new specifications, and in any case an unbearable waste of resources and assets; C – even if the business logic were extracted successfully, Dynamics doesn't necessarily support such functionality, or it would be too painful to customize it to do so.

The company became desperate. These peripheral systems were blocking a major multi-million-dollar move, which the business side had been requesting for years in order to keep up with customers' requests and the competition in the market.

The company started looking for solutions to remove this roadblock and continue with the implementation of Dynamics.

The company and Microsoft started looking for solutions. One of them was Gizmox. Soon the short list of solutions and vendors became a one-vendor list – Gizmox. No other vendor on the market could read and understand the original source systems' functionality in such an efficient and wide-ranging way, analyze it, and suggest a path to automatically rewrite it and integrate it with Dynamics.

Gizmox demonstrated its unique capabilities in a fast POC that was done and delivered in two weeks' time. The POC encompassed the same main challenges that the company had pointed out and had struggled with for so many years.

It took Gizmox a few more months to automatically rewrite these systems from VB6 / Access to ASP.NET and integrate them with Dynamics – in fact, to remove what had seemed to be an insurmountable roadblock.


For more details please download our Showcases 

CIO Insight: Choose your potential application modernization partner by the following 9-point test!

Recently I stumbled upon a slideshow by CIO Insight. The slideshow recommends a set of 9 parameters for choosing your modernization partner.

 


It is a sort of test which you should ask your potential partner to take. The results can then be used to choose between candidates.  I am impressed with the magazine's insight into the real-world challenges, and I think the 9-question test is excellent and should reflect your partner's ability to help you with your application modernization needs. It is exactly what you should grade your partner on.

While going through the slideshow, I asked myself how my company would be graded on this test. Well, I put it forward for your judgment.  Here are my answers to the 9-question test.

I’d be happy if you let me know what you think about my test results.   

The first question CIO Insight asks is: "do you offer up-front risk assessment?" 

My answer:  Yes we do; we think it is fundamental. The customer should make sure they get a FIXED proposal based on the assessment results. The better the assessment, the better the pre-engagement estimation.  We offer an assessment in the form of a free downloadable, self-service wizard. It performs a comprehensive assessment of your application – as far as we know, the most comprehensive available – and provides up-front indications of all the challenges we might face in re-architecting the app to your chosen target. We rely on this assessment, which is how we can provide fixed cost and time commitments.


The second question is: "Can you work with different languages?"

My answer:  Yes, we certainly support different languages. With the diversified needs we see in the market, we think this is one of the essentials as well.  A good application modernization provider should offer customers a choice of target platforms.  We prioritize language support according to the demand we see in the market. For example, on top of the supported Microsoft and Java stacks, we recently added support for PowerBuilder as a source and Angular 2 as a target. Both were added as a result of the strong demand we see in the market.

The third question is: "Can you refactor the code as part of the transformation process?"

My answer: We certainly can!  At GizmoxTS we believe that you cannot really deliver good target-optimized code and architecture without refactoring the original code.  Consequently, this is a basic component of our approach, facilitating extensive code refactoring. It is a customizable, tool-based or automated code refactoring capability – just what a skilled developer would do manually, only faster and more accurately.  We also allow refactoring customization according to our customers' specific demands and standards.

Here is a VIDEO that demonstrates this basic component in our approach.


The fourth question is: "Will you offer a prototype solution?"

My answer: a definite Yes! We think it is important not only for building confidence in the proposed solution but also for proving the code quality. We urge organizations to request a prototype in order to assess and compare code quality. After all, we are talking about starting a new life cycle with the target application, and the code and architecture should be performance-optimized as well as standard and native to the target platform, to facilitate easy maintenance and extension with standard tools and skill sets.

The fifth question is: "Can enhancements be included as part of the automatic transformation process?"

My answer: Actually, my advice to you is to eliminate partners who don't offer code enhancements as part of the process. My experience is that in most cases, going from a platform like a monolithic client/server architecture to the web will require a level of enhancement without which the target application will not be a true web application. This extends from breaking the monolithic application into layers to upgrading the user experience into a native web user experience.  Customers do not want "remains" of the old code, for example "dead code" that makes it hard to maintain the new application, let alone helper libraries and emulation layers that again make maintenance a nightmare. A responsive UI is another good example; we offer this enhancement as a baked-in feature of our tool-based approach. To summarize, we offer complete tool-based code enhancement that includes removal of dead code, breaking the application into layers, exposing services and APIs, integrating frameworks such as Entity Framework, data security enhancements, standardizing code patterns and so on – you name it, our tool-based approach can deliver it. A rough sketch of what such layering can look like follows below.
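As a rough, hypothetical sketch of that layering on an ASP.NET Core target (all names are invented for illustration, not generated output from our tool): a thin controller exposes the API, a service class holds the business rule, and a repository interface isolates data access.

// Hypothetical sketch of "breaking a monolith into layers" on an ASP.NET Core
// target: a thin API controller, a business-logic service, and a data-access
// abstraction. All names are invented for illustration.
using System;
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

public record Order(int Id, decimal Amount);

public interface IOrderRepository             // data-access layer boundary
{
    Order Find(int id);
}

public class OrderService                     // business-logic layer
{
    private readonly IOrderRepository _orders;
    public OrderService(IOrderRepository orders) => _orders = orders;

    public decimal TotalWithVat(int id)
    {
        var order = _orders.Find(id) ?? throw new KeyNotFoundException();
        return order.Amount * 1.17m;          // a rule that used to live behind a button click
    }
}

[ApiController]
[Route("api/orders")]
public class OrdersController : ControllerBase   // exposed service/API layer
{
    private readonly OrderService _service;
    public OrdersController(OrderService service) => _service = service;

    [HttpGet("{id}/total")]
    public ActionResult<decimal> GetTotal(int id) => _service.TotalWithVat(id);
}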


The sixth question is: "What is your strategy for modernizing the application and the database?"

My answer: Our strategy is to take end-to-end responsibility. We have found that this is what customers want.  We start with our assessment, which looks not only at the application itself but also at the environment it runs on. We finish by running and testing the new application in its target environment. This means that if a data access or other middleware layer needs to be upgraded as well, we will include it in our application modernization proposal and execution.


The seventh question is: "Can the partner work with your current team to execute updates?"

My answer:  Yes, we encourage that. We believe it is the best way to build trust between the companies and a long-lasting relationship. Our technology shines with the larger, more complex systems which are nearly impossible to upgrade manually.  In more than one case, we got involved after an organization had started modernization by working on the simpler and smaller applications, and we were called in for the larger, more complex ones. In other scenarios, the organization had been trying to modernize systems that were taking more time and effort than anticipated, and we were called in to help. And we have also had successes with a shared-effort model, where the customer does the parts they have the manpower and skill sets for and we do all the rest.


The eighth question is: "Can the product UI be customized?"

My answer: The straightforward answer is that most platform-to-platform transformations will require not only multilingual support but complete UI optimization for the new platform, and therefore this is an important part of our offering. Good examples would be an adaptive UI for different form factors, replacing the desktop right-click with touch, or breaking a desktop full-screen UI into a small smartphone screen flow.  All are included in our offering.


The last but not least question is: "Do you have credibility?"

My answer: I believe this question should be answered by my customers rather than by me.


HERE is a short success-stories paper for you to download. And I invite you to call Gil Mor in the US (6756757856755) or myself (972-50-5215436), or email Gil (gilmor@gizmoxts.com) or me (Navot.peled@gizmoxts.com), for contact details and testimonials.



Cheers

Navot Peled   


Automated Rewrite

Finally, an Application Modernization Solution That Produces the Code Quality and Maintainability You Want

 


Gizmox Transposition provides a patented solution for modernizing enterprise-level applications to the latest platforms and operating systems, such as web, cloud, and mobile. Applications rewritten by our automated, fast, reliable, and low-risk solution are currently being used by Fortune 500 companies, including financial organizations, health care institutions, insurance companies, and ISVs.

The Need for Modernization


As technology advances, web, cloud, and mobile compatibility is becoming more important for remaining relevant in a competitive market.

Companies that do not modernize face:

  • Rising maintenance challenges and excessive costs
  • Losing business opportunities
  • A decline in customer loyalty due to deteriorating user experience
  • Security vulnerabilities and malfunctions due to lack of vendor attention and support

Modernization is crucial to keeping up with the latest technological benefits and to remaining one step ahead of the competition.

Modernization Challenges



While the need for modernization is clear, the solutions are not as straightforward, and companies face many challenges in a process that can be time-consuming, costly, and risky. Current methods include:

  • Manual Rewrite: Expensive and very risky every time it is used. Companies need to reverse engineer existing applications to identify undocumented business logic, which is a big challenge.
  • Automatic Tools: While this is a much cheaper and faster solution than manually rewriting the code, it produces low-quality code that is hard to maintain and extend.
  • Off-the-Shelf Software: In most cases, this method can't really replace the homegrown, specific business logic, and customization, if at all possible, is very expensive.
  • Virtualization: Creating a virtual environment that is compatible with new platforms and running the application there; this doesn't solve the challenge of the legacy application's obsolete code and architecture.

Preserving the application's proven business logic. All the rest is upgradeable



Gizmox's automated rewrite solution supports the migration of legacy applications to modern platforms while preserving the application's proven business logic. All the rest is upgradeable: new target optimized components, code, user interface, and application architecture. Our customers benefit from the enhanced performance, scalability, security, and mobility of using modern platforms while limiting the risk associated with migrating a mission-critical application. Our patented approach has proven itself time and time again as the best solution to bringing business applications up to date with the latest cloud, web, and mobile platforms, systems, and technologies.

Should you modernize your Client / Server applications?

Gartner says you should!  Within no more than 5-10 years! In its annual Market Clock report for 2016, Gartner puts a strong REPLACEMENT recommendation on this aging architecture.
 


Why? Here are some of the reasons:

  • New integration needs: The classic two-tier client/server model might still be good for an isolated workgroup, but most applications need to be composite-type, integrated with external resources beyond the scope of the LAN.

  • New architecture: Applications should be re-architected to three tiers, separating the user-facing logic from the backend, data-facing logic of the application.
 
  •   Support of new processes: Applications should connect to operations creating an automated delivery pipeline. This will include increased automation as well as tools and practices to enhance collaboration. IT teams should be maturing beyond project-level agile practices into end-to-end enterprise agile delivery.

  •   New business needs: New modes of customer interaction delivered by devices and sensors will require new skills and context-aware frameworks.

  • New software models: Leverage cloud application development  tools and application stores to unleash application innovation from software vendors, consultants and internal developers. Adopt citizen developer strategies to enable business users to create some of their own solutions in partnership with IT.

 

For Gartner's full report.


Microsoft introduces new partners site: PMA

We are proud to announce that we are now partners in the new Microsoft partners site: PMA - Platform Modernization Alliance.



The Platform Modernization Alliance (PMA) is a group of companies that are working together to help customers migrate and modernize their non-Microsoft business critical and mission critical workloads to the Microsoft Application Platform.
The companies that make up the Alliance share the goal of reducing the cost of business critical and mission critical applications in the enterprise.  They share a common interest in making Platform Modernization and Application and Database Migrations easier and more efficient for customers.

The Platform Modernization Alliance is a web-community designed to support five objectives:
  • To provide a community where members can collaborate in order to make Application and Database migrations to the Microsoft Application Platform easier and more effective.
  • To help businesses learn about and choose the Microsoft Application Platform as the foundation of their business critical and mission critical applications.
  • To provide co-marketing, sales and support to partners and customers as they modernize existing non-Microsoft business critical and mission critical applications.
  • To provide a central point for publishing Platform Modernization related information, and developing discussions around Platform Modernization and Application and Database Migration topics.
The PMA Website

pma.png

Congrats, moving off client / server to the Web, Mobile and Cloud?

But What About Application and Network Security?!

Application and network security have become prominent concerns for IT organizations, large and small.  The threat of data loss, of business and customer data falling into the wrong hands, and of disruption to business processes caused by malicious attacks are all center stage in IT priorities.

Mind the gaps! 
Figure 1: There is a lot to consider where security is concerned...


In the context of application modernization, security concerns can often pose a risk factor delaying or complicating the decision to move desktop-based client/server applications to modern platforms, especially to the web or cloud.  Securing what are now considered traditional applications was much easier before applications started to be delivered over public networks using general-purpose browsers.  Yet in a recent publication, Gartner estimates that “Through 2020, 99% of vulnerabilities exploited will continue to be ones known by security and IT professionals for at least one year”.  In other words, almost all harmful attacks have been, and will continue to be, preventable if up-to-date best practices are employed.

Risk or opportunity?

So why is security perceived as a risk factor instead of an opportunity when considering the modernization of a legacy application?  After all, getting rid of older operating systems and often unsupported platforms such as VB6, PowerBuilder and WinForms should have clear security-related benefits.  The answer is that when organizations consider modernization, they usually think in terms of re-development, or of a manual re-write.  The manual re-write process is a lengthy, risky and resource-intensive undertaking.  Security standards and best practices are usually added at the end of the re-write process, employing static analysis tools which point to vulnerabilities and require very precise, across-the-board modifications to the new application, as well as a whole separate testing cycle as a result.

What if there was …

What if there was a process to modernize an application, faithfully capture all the business rules in which the organization has invested major development efforts over decades, and produce a modern application that has up-to-the-minute security best practices baked in?  This is really only possible if the re-write is done by a machine that re-architects and re-writes the application for the new platform.

Well, there is!

Gizmox Transposition is the leader in machine-based modernization of legacy desktop applications.  The company’s innovative, patent-protected technology produces migrated applications that have best-practice security procedures and features inherently built into the code base, as opposed to ‘bolted-on’ security wrappers that are much more vulnerable to attack.

Gizmox Transposition’s migration technology is based on complete code understanding of the source application, followed by re-architecting to a new solution on a target platform - desktop, web and cloud, including mobile experiences with a unified codebase.  The process reads in the source application, automatically builds an intermediate representation of the source code in Transposition Studio, creates refactoring and mapping rules for all source objects (both automatically and by the architect driving the tool), and then automatically applies those rules to create a re-architected application with all business rules intact and a UI that is appropriately equivalent to the legacy UI, depending on the target platform.

And, in more detail

Several features of the migrated application on the new platform specifically address security concerns inherent to web and cloud environments.  These include:

 

  • SQL injection resistance

All legacy SQL statements are migrated to parameterized ADO.NET or Entity Framework commands.  Any string manipulation that may have existed in the legacy implementation is automatically rewritten as command operations, removing the threat of malicious input attacks (a small sketch of the pattern follows the figure below).
Figure 2: The dreaded SQL injection exploit
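To make the idea concrete, here is a minimal, hand-written sketch of the parameterized pattern the migrated code ends up using - it is not Transposition's actual generated output, and the table, column and method names are hypothetical:

```csharp
using System.Data.SqlClient;

static class CustomerQueries
{
    // Legacy pattern (concatenated SQL, vulnerable to injection):
    //   "SELECT City FROM Customers WHERE Name = '" & txtName.Text & "'"
    // Parameterized ADO.NET equivalent:
    public static string GetCustomerCity(SqlConnection connection, string customerName)
    {
        const string sql = "SELECT City FROM Customers WHERE Name = @name";
        using (var command = new SqlCommand(sql, connection))
        {
            // The user input travels as a typed parameter and is never spliced
            // into the SQL text, so malicious input cannot alter the query.
            command.Parameters.AddWithValue("@name", customerName);
            return (string)command.ExecuteScalar();
        }
    }
}
```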

  • All services are authenticated at all times. 

This is achieved by automatically creating the required code and attribute decoration, so that developer error can never result in vulnerable code being created.  Identity authentication can be generated automatically to connect with standard (e.g. LDAP) or proprietary identity systems, using custom mapping rules during the migration process.  A hedged sketch of what attribute decoration looks like on the target platform follows below.
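For illustration only - the controller and action here are hypothetical examples, not the tool's actual output - attribute decoration on an ASP.NET MVC target can look like this:

```csharp
using System.Web.Mvc;

// The [Authorize] attribute is applied at class level, so every action on
// this service requires an authenticated caller and no individual action
// can be exposed by accident.
[Authorize]
public class InvoiceController : Controller
{
    public ActionResult Details(int id)
    {
        // Only authenticated users ever reach this point; unauthenticated
        // requests are rejected by the framework before the action runs.
        return View();
    }
}
```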

 

  • Similarly, Identity Spoofing is thwarted

This is done by including authentication on all services and generating the corresponding secure code on the client, during the automatic code creation of the re-architected application.

 

  • Cross Site Scripting prevention

Prevention requires escaping all user-supplied content wherever it is rendered, as well as validating all inputs.  Because applications may use different browsers and client frameworks, automatic static analysis will not always find these vulnerabilities, and developers are hard pressed to close all the cracks.  The automatic code generation of Gizmox Transposition is the perfect solution here too (see the small sketch after the figure below).

Figure 3: A cross-site scripting attack
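As a minimal sketch of the escaping idea - using the standard .NET encoder, with a hypothetical helper that is not part of Transposition's output:

```csharp
using System.Web;

static class CommentRenderer
{
    // The user-supplied value is HTML-encoded before being written to the
    // page, so an embedded <script> tag is rendered as inert text instead
    // of being executed by the victim's browser.
    public static string Render(string userComment)
    {
        return "<div class=\"comment\">" + HttpUtility.HtmlEncode(userComment) + "</div>";
    }
}
```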

  • Double Input Validation

Because any data that is entered or influenced by application users should be treated as untrusted, the migrated application validates everything twice: on the client (mostly for convenience and the elimination of unnecessary roundtrips) and on the server, for security as well as for data integrity.  It is important to note that not only data directly entered by users is validated, but also information in HTTP headers, cookies, and GET and POST parameters, including hidden fields, all of which can be a source of vulnerabilities (a sketch of the pattern follows the figure below).
Figure 4: If data is not validated...
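A rough sketch of the double-validation idea on an ASP.NET MVC target - the model, fields and controller here are hypothetical examples, not generated code:

```csharp
using System.ComponentModel.DataAnnotations;
using System.Web.Mvc;

// The same declarative rules drive client-side validation (for convenience)
// and are re-checked on the server (for security and data integrity).
public class TransferModel
{
    [Required, StringLength(34)]
    public string AccountNumber { get; set; }

    [Range(0.01, 1000000)]
    public decimal Amount { get; set; }
}

public class TransferController : Controller
{
    [HttpPost]
    public ActionResult Submit(TransferModel model)
    {
        // Server-side validation is never skipped, even if the client-side
        // checks were bypassed or the request was crafted by hand.
        if (!ModelState.IsValid)
            return View(model);

        // ... process the transfer ...
        return RedirectToAction("Confirmation");
    }
}
```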

  • Data protection

Gizmox Transposition has two complementary approaches to data protection.  For ASP.NET MVC applications, the VisualTree layer keeps to a minimum both the data transmitted between browser and server and the code running (and visible) on the client.  Encryption can be added automatically to protect especially sensitive information.

  • Standard Logging

Application logging should not be an afterthought or limited to debugging and troubleshooting.  Gizmox Transposition automatically includes logging for all or for selected activities, for monitoring, intrusion detection, compliance verification and auditing purposes.  As the migrated application is the result of automatic code generation, rules can be written at varying granularities to automatically add logging at many levels (a rough sketch follows below).
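As an illustration of what rule-driven logging can generate - a minimal sketch using the standard System.Diagnostics tracing API, with a hypothetical helper name:

```csharp
using System;
using System.Diagnostics;

static class AuditLog
{
    // A generated wrapper like this can be injected around selected activities
    // (data access, authentication, configuration changes) so that monitoring,
    // intrusion detection and auditing all draw on one consistent log stream.
    public static void Record(string activity, string user)
    {
        Trace.TraceInformation("{0:u} | {1} | {2}", DateTime.UtcNow, user, activity);
    }
}
```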

 

  • Error handling & exceptions

Several attack methods rely heavily on causing exceptions, stack overflows and the like.  The challenge with applications migrated from legacy frameworks such as VB6 or PowerBuilder is that exception handling in those frameworks is non-standard and in many cases does not adhere to best practices.  Translating business logic which includes such error handling to modern languages such as C# or Java may cause one or both of the following: the business logic will not be reproduced faithfully, and some errors will not be handled, introducing potential unhandled exceptions into the migrated code.  Gizmox Transposition automatically translates legacy error handling patterns to precise and secure patterns in the target language, thus preventing this potential vulnerability from being introduced.

An additional challenge with error handling in migrated legacy applications is that, often, critical parts of the application have simply not been correctly protected from possible exceptions.  Gizmox Transposition allows the automatic addition of standard error handling and logging in vulnerable areas of the application such as I/O (a rough sketch follows below).
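To make the pattern concrete - a hand-written illustration rather than the tool's actual output, with hypothetical names - a legacy `On Error GoTo` handler maps conceptually onto structured, logged exception handling such as:

```csharp
using System.Diagnostics;
using System.IO;

static class LegacyFileAccess
{
    // Legacy pattern (VB6):
    //   On Error GoTo ErrHandler
    //   Open "data.txt" For Input As #1
    //   ...
    //   ErrHandler: MsgBox "File error"
    //
    // Structured, logged equivalent in the migrated code:
    public static string ReadFirstLine(string path)
    {
        try
        {
            using (var reader = new StreamReader(path))
            {
                return reader.ReadLine();
            }
        }
        catch (IOException ex)
        {
            // The legacy handler's intent is preserved, and the failure is
            // logged rather than silently swallowed.
            Trace.TraceError("File error reading {0}: {1}", path, ex.Message);
            return null;
        }
    }
}
```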


Or in short, the new-gen practice: secure while modernizing!   


In summary, probably the most unique security-related feature of a Gizmox Transposition migrated application is that all of the above is not added after the fact, or ‘bolted on’ to the application as the result of a vulnerability audit performed after it has been migrated or written from scratch.  All the previous points are an integral part of the code generation process, with the added benefit that the process is both automatic and configurable, so specific practices, standard or proprietary, are implemented across the board, with the confidence that only a machine-based process can provide.

How to answer the question: Where do Software patents come from?

(and in particular, could they support a claim for a sole supplier status in bids?)



First there was an idea.  Even before that there was a need.  Some of us started to see a growing need for modernizing enterprise-scale desktop applications based on the RAD tools of the 1990s such as VB6 and PowerBuilder.  These applications may be performing core business tasks in insurance, government finance or manufacturing, but compared to current web-based, mobile-ready applications they were starting to show their age.

We can go even further back, when legacy application modernization had a very clear meaning – migrating mainframe applications, usually to open systems and with modern databases.  But legacy applications are a moving target and now we were faced with re-architecting monolithic or Client/Server desktop applications to a completely different architecture – that of a modern web application.



And then they start reading…



(Semantic code reading & DOM creation is unique…)

Guy Peled, CTO and founder here at Gizmox Transposition, has already described in detail the unique approach to application modernization he has chosen, which can be summed up as Semantic Code Understanding.

Semantic Code Understanding uses compiler technology to let our tool build an internal representation of the source application which in turn allows the tool to re-create a new well architected version of the application on an entirely different platform while retaining the original functionality.  See here and here for a more detailed discussion of the actual process.

So Gizmox built a migration tool based on Semantic Code Understanding and started to perform migrations with it.  This was already quite unique, and the quality of the re-architected applications was quite appealing.  But we saw that, unsurprisingly, when delivering a project the tool users still had ways to improve the migrated applications.  So the next step was adding an IDE which would not only use Semantic Code Understanding to re-architect a legacy application, but would also harness its power to interactively guide, enhance and safeguard the work of the architect performing the job.

And this is how Gizmox Transposition Studio came to be - a suite of architect tools that harness the power of advanced algorithms to digitally re-write and re-architect a legacy desktop application into a modern web or cloud-based application with full support for mobile experiences.  All of this while keeping the decades-long investment in business rules and logic, which is the valuable IP our clients need to carry forward through application modernization.

Then graduation… how exciting!

(the claims that support sole supplier status for application modernization)




And, the unique claims


This combination has proven to be so unique that we have just received a US patent - quite an achievement given the difficulty of patenting software processes.  The unique tool features result in the following advantages of the Gizmox Transposition process:

  • Semantic code understanding that allows re-writing to the target platform and framework with faithful functional equivalence and a best-practices-based architecture.
  • Code quality and maintainability resulting from the retention of the original code structure and business objects (unlike the completely new code which is the result of a re-write).
  • The opportunity to apply custom architectural changes such as adding layers or services, with the confidence and power of the tool based process (block operations).
  • The possibility of revising and improving security in an automatic, repeatable and reliable way – essential especially if migrating to the web.
  • Minimal code freeze requirement allowing continued development until several weeks before the start of QA – a result of the mapping rules mechanism.
  • Automatic translation of dynamically constructed SQL to parameterized LINQ – uniquely made possible through semantic understanding.
  • Last in the list but very significant - minimal requirement on the client’s resources, mostly because of the elimination of the re-specification phase and the reduction in re-training because of functional equivalence.

 


Well, surprise, surprise: is Modernization ROI higher than that of an off-the-shelf package?

By Navot Peled

"My management has asked me to recommend one of the 3 alternatives; rewrite, off the shelf, or modernization but I have no clue where to start…" is a sentence we modernization experts hear oftentimes when we first engage with an organization. Well, we usually respond, you are not alone with this dilemma and it is never an easy and clear-cut recommendation. It involves many parameters some of which should be decided upon at management strategic level.



Having been around for quite some time now, I am always impressed when I stumble upon the ability to simplify and model such a decision. I think the Standish Group guys have managed to do it quite nicely. The article might be a few years old, but the model and the way it calculates ROI are as good and valid now as when it was written, and I reckon they will continue to be so for many years to come.
The complete article can be found HERE. For those of you who want a quick glance before investing some more time in reading it, here is my short and fast review and some more comments from my experience: 

  • The first thing I would do is try to model the article into a set of questions and answers, and the first question I would ask is: how much more do we pay by using old infrastructure and application frameworks vs. a "commodity" option? A commodity infrastructure, in my humble understanding, is an open standard or a widely used alternative.

Standish breaks it down into two major categories, hardware and software, but one may add cost parameters to fit one's specific use cases.  And instead of a mainframe application, this could well be an older Client/Server application such as a VB6-based system running on older OSs and servers.  PowerBuilder or Classic ASP, as well as a long list of other legacy Client/Server migrations to modern Cloud, Web and Mobile based platforms, can also be considered. The basic assumption here is that the new platforms are more efficient than the old frameworks and infrastructures, and therefore there should be a good and fast ROI if you choose modernization.


 

Source: Standish Group - modernization, clearing a pathway to success

 

  • The second question to answer in our new model might be related to the cost of each of the methods. It is a tricky question because we need a common denominator to compare apples to apples. The industry has commonly focused on cost per LOC (Line Of Code) as a measurement. In this specific case, the way to pre-estimate the cost was to ask for bids from 3 vendors for each method (9 vendors altogether), calculate the average, and then compare the methods. So here is what the article finds, plus my calculation per line of code, shown in matrix format (a quick check of the per-line arithmetic follows the table below):


Item            Average bid cost ($)   Average bid time (years)   Average cost per line of code bid ($)
Rewrite         10M                    3                          5
Off-the-shelf   5M                     2                          2.5
Modernization   3.5M                   1.5                        1.75

Scope: 2 million lines of code
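As a quick sanity check of the per-line figures above - a simple sketch, with the numbers taken straight from the table:

```csharp
using System;

static class RoiCheck
{
    public static void Main()
    {
        // Cost per line of code = average bid cost / project scope (2M LOC).
        const double scopeLoc = 2_000_000;

        var bids = new (string Method, double Cost)[]
        {
            ("Rewrite",       10_000_000),
            ("Off-the-shelf",  5_000_000),
            ("Modernization",  3_500_000),
        };

        foreach (var (method, cost) in bids)
        {
            // Prints 5.00, 2.50 and 1.75 respectively.
            Console.WriteLine("{0}: ${1:F2} per line of code", method, cost / scopeLoc);
        }
    }
}
```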

 

  • Now it is time to refer to actual benchmarks (statistics) in order to calculate the ROI. The question here is: what will the actual cost and time of such a project be with each of the compared methods?  One may note that modernization has significant and clear advantages over both alternatives, being the lowest in cost and the shortest in time overrun.

Here is the matrix:

Tables: actual project cost and time overrun benchmarks by method (source: Standish Group)



  • And finally, based on the parameters above (and more in the article), the inevitable question: which of the 3 methods demonstrates the highest ROI?

The conclusion is quite clear: Modernization delivers the highest ROI - 5 times higher than a manual rewrite and almost twice as high as an off-the-shelf package!



Table: ROI comparison by method (source: Standish Group)

For more details, download the free Standish Group article HERE.
