Preface to the second edition
A Professional’s Guide to Systems Analysis was written in the mid-eighties, almost a decade ago. In the intervening time, Information Systems has undergone several revolutions, evolving from an industry dominated by centralized corporate data processing organizations operating large mainframe computing environments to one dominated by developers and end-users working on ever more powerful desktop personal computers and workstations. At the same time, the computer industry itself has evolved from one dominated by several large mainframe manufacturers to one dominated by several large personal computer software developers.
In many respects the world has changed over the past decade, and in some respects it has not. Today’s office is populated with personal computers of all types, some powerful, others not. The people who use these machines are more computer-literate, and they have a wider variety of software to work with than we could have dreamed of a decade ago. The capabilities of today’s machines and software have given users not only new options and opportunities but also more freedom.
The storage and memory capacity, graphics, and multimedia capabilities of these machines open up vast new possibilities for standard business applications. The availability of easy-to-use, powerful software for building your own applications further increases the user’s options. However, as much as these machines and software have evolved, the evolution appears to have only started.
There has been an enormous impact on the systems analyst as well. The tools and techniques available to perform the required analysis tasks have increased dramatically, as have the number and scope of the areas to be included in the analysis. Until a decade ago, systems were developed for big mainframes using mainframe languages and software. If personal workstations were included in the analysis, it was because they contained documentation or some spreadsheets. Rarely were applications resident on those machines. For the most part they ran character-based applications and software, and many were directly connected to a mainframe, serving as substitutes for dumb terminals.
These machines were small, by today’s standards, in terms of memory, speed, and disk capacity. By today’s standards the applications seem primitive, inflexible, and unfriendly to users. The availability of, and demand for, Graphical User Interface (GUI) software was still low. Each machine supported a single user (those lucky enough to have one) or was connected to a mainframe by low-speed, low-capacity, direct dial-up or leased-line connections. Again by today’s standards, connections between personal computers were scarce and primitive. Today’s machines are ubiquitous, have speed, storage capacity, and memory that rival their mainframe cousins, and are connected through intricate webs of networks interconnected by high-speed, high-capacity communications lines.
Almost all firms are building personal computer-based applications in which processing, data, and reporting occur almost exclusively within a network of interconnected personal computers. These networks of personal computers (clients) and common (server) workstations are used both to support dedicated applications and to be shared among multiple applications.
Applications which were not thought to be practical when the first edition of this book was written are routinely developed today. The proliferation of commercially available business applications is staggering. Software and hardware for personal computers, for both home and office use, are sold everywhere, in small specialty stores and in supermarket-like discount stores. The prices of software and machines have put this technology within the reach of almost everyone and every business.
Personal computers and computer-related subjects are a regular part of the curricula of schools from the elementary grades through graduate school. In an increasing number of schools computer training is mandatory. Commercial and cable TV stations regularly broadcast shows that discuss, explain, or are based on personal computing or network-related topics. Even our home appliances, cars, and toys are computer-based or have some computer-related aspect.
When this book was first written I indulged in the conceit that the material would be more or less timeless, and would not be subject to the changing winds of technology. I was not astute enough to foresee the impact of the “little machines” that I used to write the book. At that time I had a spreadsheet package, a database package, a word processor and a presentation graphics package installed, but did not see the need for either a modem or for any communications software. After all there were few places to communicate with, and most of them were mainframes.
My machine today is still small (in size) but has far more capability, and it is still not close to “state-of-the-art.” I have upgraded twice since the mid-eighties: RAM (Random Access Memory) has increased from 640 KB to 8 MB, which is still insufficient for some of what I want to do; my hard drive capacity has increased from 20 MB to 600 MB (about half full); and the processor has increased from 8 MHz to 33 MHz (sufficient for current needs but insufficient for longer-term plans).
My machine now has a modem (not the fastest available, but adequate), software that allows me to send and receive faxes, and, of course, a full complement of Internet software which allows me to connect to the Information Superhighway, to “cruise” or “surf the net,” to go sightseeing, to visit, and to talk to friends, relatives, and colleagues via e-mail, along with a variety of other software packages which were undreamed of, or in their infancy, a decade ago.
I impart this information because none of this hardware or software was difficult to obtain, nor was any of it expensive. I was able to buy some of it at my local computer supplies store, some of it at my local discount computer supplies store, and some of it was obtained free from the archives on the Internet. This same hardware and software availability has changed the analysis tasks substantially.
These changes have occurred in the last decade, and the next decade will bring orders of magnitude more change in software capability, flexibility, and availability; in hardware features and capability; in software development techniques; and in the interconnection between users, business, and government. These changes will revolutionize the way businesses operate, where and how their employees perform their functions, and what functions their employees perform. Technology will play an ever-increasing role in business life, because it will be uneconomic not to use it. Employees and management alike will become more computer-literate, more at home with these machines and software, and will have higher expectations for performance and function. These machines will become, and in some cases already have become, as much a part of the office as the phone, and in some cases they may even have replaced the phone.
These machines have changed the way companies do business, have added new products and services, and have introduced new considerations into the analysis mix.
However strong the impact of the PC has been, other forces have also changed the corporate environment, and thus the way companies do and will do business. The current trend is toward making companies leaner, more efficient, and more competitive. This is an ongoing effort that has employed processes variously called Business Process Reengineering, Corporate Restructuring, Total Quality Management, and Continuous Improvement to reexamine and rework the way the company and its processes function, and often which processes and functions should be performed at all. One side effect of this effort is the elimination of positions and a reduction in staffing requirements, which has been called Right-Sizing, Down-Sizing, and Restructuring. Companies employ natural attrition, voluntary and involuntary layoffs, or even lucrative early retirement offers to cut staff size.
One major impact of all this has been to remove from the environment large numbers of employees with long years of service, extensive institutional experience, and vast pools of embedded knowledge. Procedures, information sources, and, in many cases, critical corporate information itself have left with these departing employees. This lost knowledge includes the rationale behind business policies and rules, customer preference information, special processing methods, and undocumented information on the firm’s automated applications, or sometimes just the location of critical application components such as documentation, code, and procedures. This loss of knowledge will have far-reaching effects as companies attempt to upgrade, modify, or redesign their existing “legacy” systems.
The term “legacy system” originally referred to older-generation systems, inherited from prior developers and built using older, perhaps outdated, technology. These systems were usually poorly documented, fragile, and high on the list of systems to be replaced. Today the term can refer to any system from another platform, or even to any older system. The designation “legacy” implies that these systems are older, less useful, or less desirable, and are something to be salvaged rather than rebuilt or enhanced.
The rise of personal computing capability, the avalanche of user-built applications, and the availability of commercial off-the-shelf (COTS) software in the average business office have complicated the analysis process by adding many more sources of information, and many more forms that information can take. The proliferation of user-built applications, and of personally built and maintained files, makes the entire user population a target of inquiry and analysis. No longer can the analyst interview merely a representative sample of user personnel to ascertain current procedures and requirements.
The first edition was the result of ideas from many sources, and its success was due in no small part to the efforts of the production staff at McGraw-Hill, who took my rough material and polished it into a fine finished product. My thanks to the many people who read the published book and wrote to suggest topics for additional material. Almost no one suggested removing any old material. The book has received good notices and wide readership; for all of this I thank you. When I was approached to update the book for a second edition, I realized that what was missing was material reflecting the impact of the PC revolution. I have attempted to add material on those topics I felt were appropriate to the general scope of the book.
In this second edition I have modified some of the original language to reflect the changing emphasis of today’s technology. The new text plays down some of the mainframe aspects and highlights some of the PC aspects. I have attempted to change some of the language to reflect today’s usage, or at least to reflect where names and terminology have changed over time. This last effort may be an exercise in futility, since the language and terminology of IS seem to change continuously.
The role of the Systems Analyst has not changed, nor do I expect it to do so in the future. The tools and techniques, however, have changed and will continue to change, and new and improved tools will become available. This book is intended to help the analyst understand the current business environment, understand the changes in that environment, and effectively use both the old and the new tools and techniques.
MEM
1995
A Professional's Guide to Systems Analysis, Second Edition
Written by Martin E. Modell
Copyright © 2007 Martin E. Modell
All rights reserved. Printed in the United States of America. Except as permitted under the United States Copyright Act of 1976, no part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system, without the prior written permission of the author.