1997, I am back in Southern Ontario, doing project work for my new insurance company employer, mostly forgettable stuff like how to restructure life insurance policies that had been sold on the promise of "Vanishing Premiums". I did escape the whole Y2K thing, so I can be thankful for that.
One interesting thing I worked on was ETL tools. I had done some reading on Data Warehousing, especially Bill Inmon's seminal book on the topic. A lot of it was/is dependent on getting data from transaction systems into the DW in summarized, time-dependent structures. This was “Extract, Transform and Load”(ing) of data, ETL.
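Just to illustrate the idea, here is a minimal sketch of that extract-summarize-load flow in modern Python. The records and field names are invented for illustration; the real work was done against mainframe files and commercial tools, nothing like this.

```python
from collections import defaultdict
from datetime import date

# Extract: raw transaction records as they might come off a source system.
# (Hypothetical data and field names.)
transactions = [
    {"policy": "A100", "amount": 50.0, "posted": date(1997, 1, 15)},
    {"policy": "A100", "amount": 50.0, "posted": date(1997, 2, 15)},
    {"policy": "B200", "amount": 75.0, "posted": date(1997, 1, 20)},
]

# Transform: summarize into a time-dependent structure (premium per policy
# per month), the kind of shape a warehouse table wants.
summary = defaultdict(float)
for t in transactions:
    period = (t["posted"].year, t["posted"].month)
    summary[(t["policy"], period)] += t["amount"]

# Load: just printed here; in practice, inserted into the warehouse.
for (policy, (year, month)), total in sorted(summary.items()):
    print(f"{policy} {year}-{month:02d}: {total:.2f}")
```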
A different need for ETL came up at my new employer. Its growth had been fueled by acquiring smaller competitors. When you buy another insurance company, you get all its in-force (premium generating) policies, and all the computer systems that company had been using. These systems were usually pretty unique, so my employer would end up having to run several systems for the same kind of insurance product, often variations of the same dominant insurance package system of the time. This was expensive and wasteful of resources, so the inherent desire was to migrate all the policies on all the acquired systems onto the latest/greatest system being used for new business.
That ain’t easy. Insurance is a non-physical product, so actuaries can dream up almost anything they can sell and make money on, and systems have to be customized to support some very unusual products. So the first hard thing was to figure out what the products/policies on the old systems really were, and whether the new system could support them. That might mean the new system needed to be changed. After that, you have to get the data out of the old system and into a format the new system could read and use to add policies. That could mean a lot of complicated and time-consuming programming, creating code that would only be used once, when the conversion from old to new system was done. My feeling was there had to be a better way, and a quick check of the marketplace showed that commercial ETL products were now available. I wanted something where the interface matched source data to the target database, accepted transformation rules, handled different types of files and databases, and generated the code to do the ETL process.
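The core idea behind those tools: describe the source-to-target mapping as transformation rules, and let generic (or generated) code apply them, instead of hand-writing a one-shot conversion program for every acquired system. A rough sketch of that approach, again in Python with invented field names (the actual tools generated code from mappings like these, they did not look like this):

```python
# One rule per target field: (source field, target field, transform).
# All names here are hypothetical.
RULES = [
    ("POL_NO",   "policy_number",  str.strip),
    ("PREM_AMT", "annual_premium", lambda v: round(float(v), 2)),
    ("ISS_DT",   "issue_date",     lambda v: f"{v[:4]}-{v[4:6]}-{v[6:]}"),
]

def convert(old_record: dict) -> dict:
    """Apply the mapping rules to one record from the old system."""
    new_record = {}
    for src, target, transform in RULES:
        new_record[target] = transform(old_record[src])
    return new_record

# Example: one record extracted from an acquired system's file.
old = {"POL_NO": " L123456 ", "PREM_AMT": "0412.50", "ISS_DT": "19890315"}
print(convert(old))
# {'policy_number': 'L123456', 'annual_premium': 412.5, 'issue_date': '1989-03-15'}
```

The point of keeping the rules as data is that each new conversion only needs a new rule set, not a new program.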
I found many such tools, and recommended we investigate them to acquire one to save time on ETL programming, as we had so many conversions to do and probably more coming in the future. I was in a line area, and most tech evals were done centrally, so I contacted the central area; they said they had no time and told me to go ahead myself. So I contacted people in other line areas who agreed this might help them too, got a part-time team together, generated requirements, and went out and evaluated tools.
The best one was a product from an Austin company called ETI. It had emerged from a research think-tank supported by major vendors and universities, and it did exactly what I described above.
I did up a full cost-benefit analysis and showed it was a good buy; it had to go up to my VP's boss for approval because of all the other areas involved.
So we get the tool in, another analyst and I get trained, and we are lined up to do the next project. Then the push-back begins. The lead programmer decides she doesn't want to get trained because she would have to travel, and she doesn't like travel, and the PM says OK. It becomes apparent that despite a good cost-benefit case and management approval, programmers and some analysts don't think much of the ETL tool; generating code meant no programming, and who wants that? They could not see that it meant no boring, repetitive programming, freeing up resources for new, interesting work.
So the whole thing stalled, which can happen when trying to implement change that frightens some people. It was too bad; the tool really worked as advertised. It was a sign...