Ok, now I am trained up to be a Business Analyst, and an opportunity to be one was about to present itself. But first, a little about changing environments.
When I joined, and for the first few years I was there, Crown was a PL/1-IMS mainframe shop with all systems developed in-house. I don't know how this came about, as so many companies used COBOL and VSAM files. The fact that Crown Life did not use COBOL was one of the reasons I accepted their job offer in 1979. The only old war story I heard from some company veterans was about when the first computer was installed at the company in the mid-60's. It was a big deal, all about "turning the big switch" to move the company on to using a computer.
Development of new systems moved in cycles, of course, depending on what needed to be built or replaced, allocation of budgets, etc. After the development of the new Stocks/Bonds Admin system was done, the focus moved to the Group Insurance business. Crown Life did a lot of business in the United States, like most Canadian insurance companies of any real size, and it was still possible to make money in group health business as well as life, before health costs sky-rocketed and HMOs took over.
So, the company had a number of group sales offices around the United States doing business, and they needed a new system. Something about the cost of using billable mainframe cycles probably played a role in the decision to create a system based on a mini-computer located in each sales office, which employees would use all day, and each mini would then feed the day's transactions to a master system on the mainframe in Toronto for consolidated processing; sounded reasonable, I guess. Adding minis to our mainframe environment was new, and probably should have been treated as a risk, but I was 26 years old and not on that project, so can't say I worried about it much myself.
The team for this project was located next to the department I was in. I recall that I did not know many of these people directly, but they seemed like a good enough bunch. Reporting about project status to people outside the project always presented things in a good light, but that kind of reporting usually does, no matter what is actually happening.
Because something bad was happening on that project. I found out about it like most of the rest of the company: I came to work one day and the area with that project team was empty, and stayed empty. The mini-mainframe mix had not worked out (more on that later), so the project was cancelled and the whole team up to director level was let go, fired. I don't believe the 'down-sizing' euphemism had been invented yet, but this was the first one I ever saw like this, quick and brutal. I think this was the cause of me realizing what a lot of other people were realizing as the 80's progressed: companies could just not afford to be completely loyal to their employees, so it was time to start managing your career yourself, start looking out for #1. Obviously this is when employee loyalty to their employers started to disappear too, and management magazines over the next 10 years or so had the gall to print articles asking "why?".
Anyway, the Group Business division still needed a system, so senior management brought in a new team that went right out and bought a Group Insurance COTS package. It was totally mainframe, so whoever wanted to avoid mainframe costs was ignored (or had been one of those who was fired), and it was COBOL-CICS-VSAM.
This was the first package I had seen purchased at Crown Life, so the developing of all systems in-house could not be assumed anymore; and the pure PL/1-IMS environment was no more. Business realities would now drive what systems and technologies would be used.
PS: About the reason why the mini-mainframe mix failed... about 5 years later, I was on a training course near Washington DC, and the direct flight back to Toronto for myself and a co-worker who was also attending was canceled. So, we ended up on a travel nightmare, taking a flight that stopped 3 times, and we had to change planes once. A couple of other travelers were doing the same thing, so we got to talking (especially in the bar between planes), and it turned out they were AT&T techies. When I said we were from Crown Life, they rolled their eyes and said that we might not want to hang out for the rest of the trip.
Why? It turns out that for the minis in those Group offices to communicate each night with the mainframe, they would need special or dedicated lines (any network techies reading this will probably know why). In those days, those lines were not easy to get, and Crown Life ended up on a waiting list measured in years. When that situation became apparent to all, it wasn't too long after that I came into work and found that project team gone.
PPS: Crown Life apparently sued AT&T over this. They had a contract, of course, and they claimed the delays breached the contract or something like that. As a Crown Life employee, I never heard about this lawsuit, but the AT&T guys we were traveling with sure had; I don't know how the suit ended up, but Crown Life was not a favorite of AT&T for a long time. But us foot soldiers who had met at random decided we would have a few more beers and not let company issues bother us; we just wanted to get home that night.
So, when I started on the next big project, all assumptions had changed.
Tuesday, September 29, 2009
Monday, September 28, 2009
Memories of IT - 1984 - Structured Techniques
So, this many posts in and I am still less than 5 years into my career... but an important shift in the story is imminent.
As I came off pure programming work like the new development project I have been describing, the Analyst part of my Programmer/Analyst job title started to dominate. Some of it was the work I moved on to, but also a parallel path of training that the company paid to have me attend. Everyone has a complaint or two about anywhere they have worked, but my first employer did understand that training was vital to the development and overall happiness of their employees.
Some of the training was general skills, like public speaking and giving presentations. On the IT side of things, structured techniques were all the rage, broken into the parts of Structured Analysis and Structured Design. As I was still primarily a programmer, I recall I attended a Structured Design course first; it was given by an external vendor. The actual diagrams and techniques have faded from memory, but I do recall it is where I first learned about "high cohesion" and "low coupling". The course also showed how it used the results of Structured Analysis as input to Structured Design.
So, I continued along, doing enhancement work on existing systems. Understanding what the business wanted out of the enhancement, and then determining the impact it would have on a system, was the majority of the work; the actual programming work needed could often be minimal. So, it seemed like maybe I was turning into an Analyst who programs a little. I think the company did have the Business Analyst job title already, so I veered my career path towards that title.
That meant I got to attend the next offering of Structured Analysis training. The course used one of the era's gurus as a basis for the course: DeMarco or Constantine or Yourdon or whomever. I should have kept my course material, it would be a classic now!
It was on this training that I was first introduced to the Data Flow Diagram, or DFD. The idea of diagramming what your system should do really appealed to me, even if the diagrams were hand-drawn and hard to change. Pencils and erasers are what I remember of the course and subsequent work back at the office. That would have caused a lot of people to use it less or stop using it, but not me. My future was being defined right there in that course.
Next time - the next big project
Wednesday, September 16, 2009
Memories of IT - unexpected consequences of system development
(Actually, I need to wrap up a few things and start a few others before I get back to PCs...)
So, I came off of the big PL/1-IMS development project as it wound down in 1983 or so. The system from my viewpoint was brilliant, supporting all the stock and bond investment business of the company. I worked a lot on look-up screens, starting with a definition of a security and breaking it down to all the lowest levels of investments the company had in that security. A friend of mine was quite proud of a program he wrote to allocate bond income across the company's complete portfolio after one hit of the enter key. However, it still faced two challenges:
1. the easy-to-program architecture of the online system did require more of the system to be loaded in memory at a time than an architecture based on a single screen structure. As a result, the online system always had lowest priority among all the online IMS systems, and so response to the users was always slow.
2. the actual securities traders had been doing business over the phone and writing down their trades on scrips/paper, then handing these to clerks to code up transactions to feed the existing batch systems. A major proposed benefit of going to an on-line system was that the traders would now enter their trades directly into the system, giving real-time updates and removing the clerical effort. So, our BA/PM showed a test version of the system to management and the traders, to demonstrate just how this was all going to work for them. Apparently the traders weren't aware of this benefit, and they reacted very negatively. Typing anything was for clerks and secretaries, so using a system that required typing was beneath them. So, when the system went live, traders continued to write their trades down and handed them to the clerks, whose jobs continued but as the new users of the online system.
The system did go on to have a useful life of about 10 years. I did not work on it again, so I don't know if the traders ever warmed to it. I do recall some outside consultants doing reviews of our existing systems later in the decade, and they said something stupid about this one; given its volumes compared to, say, our core individual insurance systems, they wanted to know why we had not developed it on a mini-computer. I think the architect's head probably almost exploded. I suppose not owning a mini-computer did not mean anything, nor the fact that the company's core IT skills at the time of development were all on mainframes. By the time of this review, however, other changes had occurred; packages were being purchased, which meant COBOL and CICS were invading our previously pristine PL/1-IMS world. The architect left the company not too much later. Breck Carter, wherever you are, if you see this, drop me a line.
Monday, September 14, 2009
Memories of IT - PC enters the home, then the office
In the early 80's, I was one of the younger people in my department, even with regular new hires coming on. It would actually annoy some people...
However, we would also have co-op students, who went to university for a term, then worked at a company like ours for a term, then went back to school. The good ones would be invited back for more work terms and often came on permanently after graduating. It was at a work-party hosted by one of these students that I saw my first personal computer. I think it had to be an early Apple, but can't be sure now. Anyway, the main thing it was running and people were trying out was one of those Alien Invasion games: bad aliens dropped from the top of the screen and you moved your weapon left and right on the bottom of the screen shooting upwards to kill the aliens before any got to the ground. Well, I thought this looked like fun, and took my turn to play, but as my previous post said, eye-hand coordination is not my strong point, and you had to use certain keys to move and shoot which I wasn't familiar with, so I lasted about 30 seconds before I lost. The surrounding young folks hooted and basically told me I sucked, so I left the game room, returned to the rest of the party and got a beer. If that was personal computing, I said to myself, then they can keep/stuff it. When the young folks weren't playing the game, they were going on about programming the thing, in Basic I guess, which I thought was Mickey Mouse; I mean, I programmed an IBM mainframe for a living, you could stuff your toy programming too.
So, all in all, the arrival of the PC was not something I was promoting any time soon; but one day, an IBM PC was delivered and set up in our department: character-based screen, two floppy drives (A: and B:), and a daisy-wheel printer attached. I have to think that the amount of mainframe cycles and laser printing we were using for documents was seen to be costing too much money, so what about this PC thing for doing that?
First off, many people thought the printer was horrible, still using fold-attached paper fed through a wheel with a print quality barely above that of crayons, so people stuck to their PDS members and mainframe laser printing. The thing that got one person using the PC was VisiCalc, the first PC spreadsheet. It was the BA/Manager of the big development project we were all on, and she had to do either a business case or status report with a lot of numbers to calculate and add up; well, she thought VisiCalc was even better than sliced bread, and I can see why. There was nothing like it in all our mainframe programs and utilities; she might have been the first person to eventually get her own dedicated PC.
Meanwhile, the rest of us are trying out the PC as encouraged by our managers, so you sit down with at least three 5.25 inch diskettes. The first is DOS; insert it in drive A: and turn the PC on. The machine would boot, and I don't recall it taking too long (long boots were still in the future). Once you get the A: prompt, take out the DOS disk and insert your MultiMate disk, it being the first PC word processor of any note. So, enter MM or something at the A: prompt and it loads. It was green screen, no graphics, and so you created a document, a blank space to type in. When you wanted to save your work, insert a blank floppy disk in drive B: and save it there… as long as you had remembered to format it using DOS beforehand. When you are done, take the third disk with you with your work on it, and leave the other disks for the next user.
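For anyone who never lived the floppy shuffle, the whole session went something like this (a reconstruction from memory; the exact commands and the MM launch name are my best guess):

```text
A> FORMAT B:       (prepare a blank data diskette in drive B: first)
   -- swap the DOS diskette in A: for the MultiMate diskette --
A> MM              (load the word processor from drive A:)
   ...type your document, save it to the diskette in B:...
   -- take your B: diskette with you; leave DOS and MultiMate
      behind for the next user --
```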
Given all this, the PC did not really get a lot of users, but my use of a PC was about to increase dramatically.
Friday, September 11, 2009
Memories of IT - could computing be fun?
The time periods covered in the previous posts overlap in some cases, so the PC did not just appear in a poof of smoke where I was working, totally unexpected. Let's pause and talk about computing for fun.
I am guessing the first CRT most people of a certain age used was on a video arcade game. Before that, there was pinball and other physical games like baseball (the machine pitched a ball and you hit it with a bat that worked the same way as pinball flippers). I played these games in many places as a youngster. A bowling alley on a Lake Huron beach comes to mind, and a few trips to the UK where I saw games where you dropped a coin on top of a whole lot of other coins in the hope it would be the one that tipped the whole pile into a slot that delivered them to you; my first exposure to gambling, I suppose. And there was skee-ball, trying to win coupons to cash in for cheap toys/trinkets. (I mentioned earlier playing Adventure on a teletype at university, but that was really only available to students at the time.)
Then the first CRT-based game(s) appear. Was it Pong? Wikipedia would tell us, I guess. What I remember from the 80's was the arcade in the mall next to the office where I first worked, and it had about 20 machines lined up. Here is where I first learned that my eye-hand coordination skills were average or worse. All these games were based around scoring a level of points to get free games (taken from pinball I suppose) or move to the next level. I could get a free game or two, but would crash and burn not long after that point. Some of my co-workers could play a game for hours on one quarter. They also set up tournaments for co-workers, which I did not enter.
The main game I recall of this period was the first Star Wars game. It was a first-person flyer-shooter: you were Luke in your fighter attacking the Death Star; you started by battling Imperial fighters, then moved to the surface to get past tower cannons, and if you survived to that point, you flew into the trench to shoot the bomb down the hole while Darth Vader tried to shoot you down.
If you succeeded, you would start all over again at an increased difficulty level. I recall I managed to get through the first level and into the next, but never farther. Then a guy I worked with would step up and go thru 3 or 4 or 5 levels, and might just walk away before losing, because lunch hour was over. I felt SO inadequate as a game-playing male.
(This reminds me that at this point, really eccentric characters were still something you could be working with --- at the company, they were programming whizzes or knew stuff nobody else knew; so long hair or dread-locks was OK, plus guys in weird suits with fedoras, and more, and they were all great game players; but they all moved on at some point, maybe to develop games.)
Anyway, the graphics on the game were neon lines on a black background, simple but effective for a game set in space. The soundtrack was the familiar movie score, with clips of things like Obi-Wan saying "Luke, use the force!". It was fun, I played it a lot, but not once did I walk away because my lunch was over...
Thursday, September 10, 2009
Memories of IT - early 80's - my own terminal, plus email and laser printers
So, I and several other programmers are assigned full-time to the development project. To be productive, management figured out we needed our own dedicated terminals, so we got them, and so did everyone else in the department over time.
A major long-term impact of this was that now we could move from semi-open floor space, where terminals could be shared, to full-on cubicle farms, and that's what happened. It's funny now, because lots of offices are trying to get back to open space, and being trapped in a cubicle is treated as torture, but we were all thrilled when we each got our own three and a half walls. I think this has carried over into my choices in housing; open concept and high ceilings mean wasted space. I want walls, doors, and each floor of my house to cover all the available space.
What else was going on about this time? We got internal, mainframe-based email. Up till then, we were still using triplicate memo forms to hand-write messages and send them by inter-office snail mail; can't say I missed that very much. But email use was charged for internally because of its mainframe cycles, so some departments refused to use it on account of that cost, and those usually turned out to be my users. The email itself worked well, though, just text on a green screen and only within the company.
Around this time the company bought its first laser printer for the mainframe, producing excellent quality printing on standard letter and legal paper. What you would do is insert printer commands in your code to produce reports that looked good. We also started typing memos and documents in TSO PDS members, and you would add printer codes that specified font, bold/italics, spacing and more. I think this is when my typing started to get faster because of constant use, although I have still never learned how to type like a typist would. (They didn't teach typing to boys in high-school back in the 70's, that was for girls, who also learned shorthand, so they could go right out there and be a secretary! I have to think shorthand really has to be a lost skill by now.)
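To give a flavour of what those printer codes looked like — a hedged reconstruction in the spirit of IBM's SCRIPT/DCF control words of the era, not the exact codes we used, which are long forgotten:

```text
.ce 1              center the next line
MEMO TO: ALL STAFF
.sp 2              skip two lines before the body
The memo body, typed into a TSO PDS member, with control
words on their own lines telling the formatter (and the
laser printer) about fonts, bolding and spacing.
```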
But even with mainframe email and laser printers to dazzle us, lurking out there was that next great paradigm-shift: the IBM PC.
Tuesday, September 08, 2009
Memories of IT - A New In-House Development Project
So, a couple of years into being a Programmer-Analyst... it's still the early 80's, back when jobs had numbers after them, so you would start as a PA 1, progress to PA 2, and so on. Progression also meant raises along with annual increases. I would say this was the period of my career, unique to the time, where my salary grew the fastest. No one had invented bonuses based on company success, you made what you made without concern for company profits, while realizing that low or no profit was not good for job security.
So, a couple of years in, the Corporate Systems department gets a new development project! This is full in-house development, PL/1, IMS DB/DC. The company had been using a system similar to the mortgage system I have described, but for securities: stocks, bonds, money markets, etc. I was not in on the process that led to the decision, still being a junior programmer busy on current projects, but looking back now it must have been a reaction to the realization that every-other-night batch processing was not going to be good enough for stocks and bonds trading. So, this was going to be an online system, where the trades on the markets would be captured right after they occurred. It was not a trading system itself, nor was it connected to one, so the traders were going to record what they did in this new securities administration and control system (more on how that worked out, later).
As I mentioned earlier, the core Individual Insurance Systems were already online using this technology, but it would be the first such system in a Corporate area. Our senior programmer/designer/architect took on the challenge of building a trial online system first, to show it could be done in Corporate Systems. The structure he came up with was easy to develop.
Most online systems have one program for each screen used in the system, handling both output to the screen and receiving input from the user. I saw this structure as awkward, as it could be tough to discern which part of a program was being used at any one time. Our architect devised a master control program that would, for example, discern which screen had just been used and pass its input to a program built to process input from that screen. That program would do all the editing and, if there were input errors, it would pass that information to a program built to send data and messages to that screen, which would do just that. The user sees the response, does what they need to do, hits Enter, and the whole process starts again. Once the input program is OK with the data it has received, it will then do one of many possible things defined for it; for example, the user may have indicated they want to navigate to another screen, so the input program would pass control (and any needed data) to the output program for that other screen, which does what it does and sends data out to the user, and so on.
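The control-program pattern described above can be sketched in modern terms. This is only an illustration of the dispatch idea, in Python rather than the PL/1 of the actual system; all the screen names, fields, and routines here are invented for the example.

```python
# Hypothetical sketch: one control program routes each screen's input to
# an "input program" (edits/validation), which either bounces errors back
# through the same screen's "output program" or hands off to the next
# screen's output program. Names are illustrative, not from the real system.

def edit_trade_entry(data):
    """Input program for a trade-entry screen: validate, choose next screen."""
    errors = []
    if not data.get("security_id"):
        errors.append("Security ID is required")
    if errors:
        # Errors: re-display the same screen with messages.
        return ("TRADE_ENTRY", {"fields": data, "messages": errors})
    # Clean input: navigate on to a confirmation screen.
    return ("TRADE_CONFIRM", {"fields": data, "messages": []})

def show_trade_entry(payload):
    """Output program: format data and messages for the screen."""
    return f"TRADE_ENTRY screen | {payload['fields']} | {payload['messages']}"

def show_trade_confirm(payload):
    return f"TRADE_CONFIRM screen | {payload['fields']} | {payload['messages']}"

INPUT_PROGRAMS = {"TRADE_ENTRY": edit_trade_entry}
OUTPUT_PROGRAMS = {"TRADE_ENTRY": show_trade_entry,
                   "TRADE_CONFIRM": show_trade_confirm}

def control_program(last_screen, user_data):
    """Master control: route input to the right edit program, then call
    the output program for whatever screen comes next."""
    next_screen, payload = INPUT_PROGRAMS[last_screen](user_data)
    return OUTPUT_PROGRAMS[next_screen](payload)
```

The appeal of this split is visible even in the sketch: each routine does one thing (edit one screen's input, or build one screen's output), so you never have to puzzle out which half of a combined program is running.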
Just want to remind the reader that this is still a mainframe, green screen system, so next time I will recall how we actually built it, and the good and bad things that happened.
Thursday, September 03, 2009
Memories of IT - early 80's - Computing Goes Online
So, I am in a department of 25 people, all working with coding sheets, until one day, a couple of terminals appear. They are put in the middle of the department, so each person has to schedule time to use one. This is when I first used TSO seriously, using both the command line and the menu-based SPF function. I learned what Partitioned Data Sets were, and how to create files online and allocate space. The main reason to use them was that you could extract a program from the Librarian tool, edit it in TSO and then put it back in Librarian; look ma, no punch cards! This led to the first "religious war" I saw: if you used Librarian in batch mode, it would track all the changes made to the code, providing a history of changes over time. However, if you used TSO, that updated code would be placed back in Librarian like it was a new program, no history recorded. Well, people were aghast that the history was going to be lost, how could you debug a program without knowing that history, etc. etc.
The fact that I don't remember if I cared one way or the other is the first indication that I have almost never "chosen sides" in debates like this. I might express an opinion, or actually be the analyst defining the pros and cons, but once a decision was made, I was never the person who would be whining 6 months later "we should have done it the other way..." You gotta go with what's been decided, otherwise it's time to move on.
In this case, TSO updating won out and Librarian was phased out over a couple of years. The corollary technique for change history was commenting your program code, inserting text at regular intervals to describe what the program was doing; so, it was decided that significant changes be noted as comments too.
What this led to was a need for more terminals. Over time, we got enough to have one for every two people. It sat on a swivel table between desks, so you could move it to use it when you had something to enter; the other person would do other work until their turn with the terminal came up again. This would still be in the early 80’s, just before a big new project that would change how we worked some more...
Next Time: A new development project
Wednesday, September 02, 2009
Memories of IT - early 1980s - Starting to work...
So what was my actual first work? Maintenance, making changes here and there in the Mortgage system. Sometimes there would be a need for a new report, so that might be written from scratch, but often it was a matter of taking an existing report/program, copying it and changing it. Less often, I got to create something totally new; I recall an accounting reconciliation function I developed, matching detail mortgage transactions to GL entries; it was big, intricate, and I was quite proud of it. All of this was done under the direction of an experienced programmer, a nice woman whose name I really wish I could remember, because she got me off to a good start, but left the company not too much later.
Looking back, the maintenance work is where I first started doing more analysis than coding. Figuring out what the system was doing and how to change it to do the new thing being asked for took more time and effort than the actual coding changes.
I was also lucky to start out in an area that literally supported more systems than there were people in the department, so I got to work on many applications over time, from Real Estate Management to Shareholder Reporting to IT Chargeback. I contrast this with the company's main individual life insurance system, which was huge and had a whole department just for its care and feeding.
Other things to note about this period, first half of the 80s:
- All the systems in use had been developed in-house
- Batch systems were on their way out; the life system I mentioned above was on-line, using IMS DB and DC. This led to the first down/out-sizing I saw in my career; the internal keypunch group was phased out. The current staff were offered positions with an outside company that continued to do the same work for Crown while it was still needed, but with the obvious expectation that it would be needed less and less over time. At some point, the cards themselves were phased out, with the data being entered in files that mimicked the cards, and those files being used as input to batch jobs.
- The technology/tools used by programmers were also changing; more on that next time.
Tuesday, September 01, 2009
Memories of IT - 1980 - My First System
So, about 6 weeks of PL/1 training, coding some pretty big programs (what they actually were/did, I don't remember) and I am ready for some work. The system I start on is for Mortgage administration; the company loaned money to both individuals and businesses, although the individual business was wound down by the mid-80's. This also meant staff mortgages as a benefit, which I and my wife eventually took advantage of. (oh yeah, got married in 1979 too; she was not in and never has been in IT, a good thing in the long run I think).
So, the Mortgage System, MTG for short. It was a batch system, in PL/1 but not with flat files or VSAM. Someone earlier in the decade had written a little DBMS that this and other systems used. It was hierarchical, so each mortgage was a root record and had child records of various types, like for the payments collected; this was the Master File. Transactions for the system were written on custom input sheets that went to the keypunch group. All the transactions were gathered up and run against the Master File overnight. The input transactions would be sorted by Mortgage Number/ID, and the main batch program would process Master and transactions in Number order. It would skip past Mortgages that had no transactions, and then apply the transactions against Master records that did. It ran every other night, which gave the business time to look at output one day and then code new transactions the next day.
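That sorted merge of transactions against the master is the classic sequential-update pattern, and it can be sketched in a few lines. This is a minimal illustration in Python (the real system was PL/1 against a hierarchical master file); the record shapes and numbers are invented.

```python
# Illustrative sketch of the every-other-night batch run: both the master
# file and the day's transactions are sorted by mortgage number, and one
# sequential pass applies transactions only to masters that have any.
# Record layouts and values are invented for the example.

def apply_transactions(masters, transactions):
    """masters and transactions: lists of dicts, both sorted by 'mtg_no'.
    Returns the mortgage numbers that were updated (e.g. for new fiche)."""
    updated = []
    t = 0
    for master in masters:
        touched = False
        # Consume every transaction belonging to this mortgage number.
        while t < len(transactions) and transactions[t]["mtg_no"] == master["mtg_no"]:
            master["payments"].append(transactions[t]["amount"])
            touched = True
            t += 1
        # Masters with no transactions are skipped past untouched.
        if touched:
            updated.append(master["mtg_no"])
    return updated

masters = [
    {"mtg_no": 1001, "payments": []},
    {"mtg_no": 1002, "payments": []},   # no activity this run: skipped
    {"mtg_no": 1003, "payments": []},
]
transactions = [
    {"mtg_no": 1001, "amount": 450.00},
    {"mtg_no": 1003, "amount": 720.50},
    {"mtg_no": 1003, "amount": 15.25},
]
print(apply_transactions(masters, transactions))  # → [1001, 1003]
```

Because both files are in the same sort order, one pass over the master does the whole job, which is exactly what made the pattern so economical on tape- and batch-era hardware.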
As I look back on it now, it was a pretty good system. Since it was all batch, looking at info for a Mortgage meant printouts. To save paper, the system used microfiche. A printout for each mortgage was put on fiche to begin with, and then each time a mortgage was updated, all the updated ones would get a new printout on a new fiche. The fiche were numbered, and there was one fiche that held an index of which numbered fiche a mortgage printout could be found on. That index was recreated after each batch run, and would point to the new fiche as needed. Once a quarter, when a lot of new fiche had been added, the system would print a complete new set of fiche, and the process would start again. There were also monthly jobs for reports, and an annual run for more reports and some housekeeping, like purging paid-off mortgages from the master.
I suppose everyone has a soft spot for their first system, like their first car (mine was a 68 Ford XL) or other firsts. The system would not last forever, but that is another post...
Next time: Starting to Work
About Me
- David Wright
- Ontario, Canada
- I have been an IT Business Analyst for 25 years, so I must have learned something. Also been on a lot of projects, which I have distilled into the book "Cascade": follow the link to the right to see more.