Information Technology Explored as a Corporate Asset
We are in the midst of a fundamental change in both technology and its application. Organizations today expect to get more value from their investments in technology. In the "post-scarcity era of computing," the availability of processing power is not a constraint. The cost of platform technology has become a minor factor in selecting among alternatives to build the business solution; the constraining factors are the organizational impact of reengineering the business process and the cost and time required for system development. In addition, the need to retrain personnel to the required level of expertise can be an extremely expensive undertaking. Open systems enable organizations to buy off-the-shelf solutions to business problems. Open systems standards define the formats in which data is exchanged, remote systems are accessed, and services are invoked. The acceptance of open systems standards supports the creation of system architectures that can be built from technology components. These standards enable us to do the following:
To build reusable class libraries for use in object-oriented design and development environments (see the sketch following this list).
To build functional products that interact with the same data, whether object-oriented or not, while maintaining full data integrity.
To compose a document at an individual desktop workstation that includes data, addressing, and graphics input from a word processor, a personal spreadsheet, a workgroup database, and an existing host application, and to send it by electronic mail anywhere in the world.
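As a minimal illustration of the first point, the sketch below shows what one element of a reusable class library might look like. The Address class and its formatting rule are hypothetical and not drawn from any specific product; the point is only that every application built on the library reuses the same component rather than reimplementing it.

    # A hypothetical reusable class-library component; names are illustrative only.
    class Address:
        """Shared business object reused by every application in the library."""

        def __init__(self, street: str, city: str, postal_code: str):
            self.street = street
            self.city = city
            self.postal_code = postal_code

        def mailing_label(self) -> str:
            # One formatting rule, maintained in one place, reused everywhere.
            return f"{self.street}\n{self.city} {self.postal_code}"

    # A word processor, a spreadsheet macro, or a workgroup database front end
    # could all call the same method instead of duplicating the logic.
    print(Address("100 Main St.", "Springfield", "01101").mailing_label())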
Contrary to the claims of groups ranging from the Open Software Foundation to the user consortium Open User Recommended Solutions, open systems are not exclusively systems that conform to OSF UNIX specifications. The client/server model makes the enterprise available at the desk. It provides access to data that the previous architectures did not. Standards have been defined for client/server computing. If these standards are understood and used, an organization can reasonably expect to buy solutions today that can grow with its business needs without the constant need to revise them. Architectures based on open systems standards can be implemented throughout the world, as global systems become the norm for large organizations. While a supportable common platform on a global scale is far from standardized, it certainly is becoming much easier to accomplish. From the desktop, enterprise-wide applications are indistinguishable from workgroup and personal applications. Powerful enabling technologies with built-in conformance to open systems standards are evolving rapidly. Examples include object-oriented development, relational and object-oriented databases, multimedia, imaging, expert systems, geographic information systems, voice recognition and voice response, and text management. These technologies provide the opportunity to integrate their generic capabilities with the particular requirements of a business to create a cost-effective and personalized business solution. The client/server model provides the ideal platform with which to integrate these enabling technologies. Well-defined interface standards enable integration of products from several vendors to provide the right application solution. Enterprise systems are those that create and provide a shared information resource for the entire corporation. They do not imply centralized development and control, but they do treat information and technology as corporate resources. Enterprise network management requires all devices and applications in the enterprise computing environment to be visible and managed. This remains a major challenge as organizations move to distributed processing. Standards are defined and are being implemented within the client/server model. Client/server applications give greater viability to worker empowerment in a distributed organization than do today's host-centered environments.
Opportunities are available to organizations and people who are prepared and able to compete in the global market, and there is no denying that a competitive global economy will ensure obsolescence and obscurity for those who cannot or will not compete. All organizations must look for ways to demonstrate value. We are finally seeing a willingness to rethink existing organizational structures and business practices. Organizations are aggressively downsizing even as they try to aggressively expand their revenue base. There is more willingness to continue improvement practices and programs to eliminate redundancy and increase effectiveness. Organizations are becoming market-driven while remaining true to their business vision. To be competitive in a global economy, organizations in developed economies must employ technology to gain the efficiencies necessary to offset their higher labor costs. Reengineering the business process to provide information and decision-making support at points of customer contact reduces the need for layers of decision-making management, improves responsiveness, and enhances customer service. Empowerment means that knowledge and responsibility are available to the employee at the point of customer contact. Empowerment ensures that product and service problems and opportunities are identified and acted on. Client/server computing is the most effective source for the tools that empower employees with authority and responsibility. The following are some key drivers in organizational philosophy, policies, and practices. Competitiveness is forcing organizations to find new ways to manage their business, despite fewer personnel, more outsourcing, a market-driven orientation, and rapid product obsolescence. Technology can be the enabler of organizational nimbleness. To survive and prosper in a world where trade barriers are being eliminated, organizations must look for partnerships and processes that are not restrained by artificial borders. Quality, cost, product differentiation, and service are the new marketing priorities. Our information systems must support these priorities.
Competition demands that information systems organizations justify their costs. Businesses are beginning to question the return on their existing investments, and centralized IS operations in particular are under the microscope. Product obsolescence has never been so significant a factor. Buyers have more options and are more demanding. Technology must enable organizations to anticipate demand and meet it. Quality and flexibility require decisions to be made by individuals who are in touch with the customer. Many organizations are eliminating layers of middle management. Technology must provide the necessary information and support to this new structure. If a business is run from its distributed locations, the technology supporting these units must be as reliable as the existing central systems. Technology for remote management of the distributed technology is essential in order to use scarce expertise appropriately and to reduce costs. Each individual must have access to all information he or she has a "need and right" to access, without regard to where it is collected, determined, or located. We can use technology today to provide this "single-system image" of information at the desk, whatever the technology used to create it. Standardization has introduced many new suppliers and has dramatically reduced costs. Competition is driving innovation. Organizations must use architectures that take advantage of cost-effective offerings as they appear. Desktop workstations now provide the power and capacity that mainframes did only a few years ago. The challenge is to effectively use this power and capacity to create solutions to real business problems. Downsizing and empowerment require that the workgroup have access to information and work collectively. Decisions are being made in the workplace, not in the head office. Standards and new technologies enable workstation users to access information and systems without regard to location. Remote network management enables experts to provide support and central, system-like reliability to distributed systems. However, distributed systems are not transparent. Data access across a network often produces unpredictable result sets, and performance on existing networks is often inadequate, requiring a retooling of the existing network infrastructure to support the new data access environment.
Standards enable many new vendors to enter the market. With a common platform target, every product has the entire marketplace as a potential customer. With the high rate of introduction of products, it is certain that organizations will have to deal with multiple vendors. Only through a commitment to standards-based technology will the heterogeneous, multiple-vendor environment effectively serve the buyer. Workstation power, workgroup empowerment, preservation of existing investments, remote network management, and market-driven business are the forces creating the need for client/server computing. The technology is here; what is missing is the expertise to apply it effectively. Organizational pressures to demonstrate value apply as much to the information systems function as to any other element or operating unit of the business. This is a special challenge because most IS organizations have not previously experienced strong financial constraints, nor have they been measured for success using the same business justification "yardstick" as other value-creating units within the business enterprise. IS has not been under the microscope to prove that the role it plays truly adds value to the overall organization. In today's world, organizations that cannot be seen to add value are either eliminated or outsourced. A survey of about 1,000 companies found that, on average, they spend 90 percent of their IS dollars maintaining existing systems. Major business benefits, however, are available only from "new" systems. Dramatic reductions in the cost of technology help cost justify many systems. Organizations that adapt faster than their competitors demonstrate value and become the leaders in their marketplace. Products and services command a premium price when these organizations are "early to market." As they become commodities, they attract only commodity prices. This is true both of commercial organizations wishing to be competitive in the market with their products and of service organizations wishing to demonstrate value within their department or government sector. "It only took God seven days to create the world because he didn't have an existing environment to deal with."3 Billions of dollars have been invested in corporate computing infrastructure and training. This investment must be fully used. Successful client/server solutions integrate with the existing applications and provide a gradual migration to the new platforms and business models.
To meet the goals of the 1990s, organizations are downsizing and eliminating middle-management positions. They want to transfer responsibility to empower the person closest to the customer to make decisions. Historically, computer systems have imposed the burden of data collection and maintenance on the front-line work force but have husbanded information in the head office to support decision making by middle management. Information must be made available to the data creators and maintainers by providing the connectivity and distributed management of enterprise databases and applications. The technology of client/server computing will support the movement of information processing to the direct creators and users of information. OLTP applications traditionally have been used in insurance, financial, government, and sales-related organizations. These applications are characterized by their need for highly reliable platforms that guarantee that transactions will be handled correctly, that no data will be lost, that response times will be extremely low, and that only authorized users will have access to an application. The IS industry understands OLTP in the traditional mainframe-centered platforms but not in the distributed client/server platforms. Organizations do (and will continue to) rely on technology to drive business. Much of the IS industry does not yet understand how to build mission-critical applications on client/server platforms. As organizations move to employee empowerment and workgroup computing, the desktop becomes the critical technology element running the business. Client/server applications and platforms must provide mainframe levels of reliability. Executive information systems provide a single-screen view of "how well we are doing" by comparing the mass of details contained in their current and historical enterprise databases with information obtained from outside sources about the economy and competition. As organizations enter into partnerships with their customers and suppliers, the need to integrate with external systems becomes essential in order to capture the necessary information for an effective EIS. Organizations want to use the EIS data to make strategic decisions. The DSS should provide "what if" analyses to project the results of these decisions. Managers define expectations, and the local processing capability generates decision alerts when reality does not conform. This is the DSS of the client/server model. Information is now recognized as a corporate resource. To be truly effective, organizations must collect data at the source and distribute it, according to the requirements of "need and right to access," throughout the organization. Workgroups will select the platforms that best meet their needs, and these platforms must integrate to support the enterprise solution. Systems built around open systems standards are essential for cost-effective integration. Los Angeles County issued a request for information stating simply that its goal was "to implement and operate a modern telecommunications network that creates a seamless utility for all County telecommunications applications from desktop to desktop." The United States government has initiated a project, the National Information Interchange, with the simple objective of "making the intellectual property of the United States available to all with a need and right to access."
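The decision-alert idea mentioned above (managers define expectations, and local processing flags results that do not conform) can be sketched very simply. The metrics and thresholds below are invented for illustration; a real DSS would draw them from the enterprise databases and external feeds described earlier.

    # A minimal sketch of a local "decision alert" check; metric names and
    # targets are hypothetical.
    expectations = {"daily_sales": 50_000.0, "customer_complaints": 25}
    actuals = {"daily_sales": 43_250.0, "customer_complaints": 31}

    def decision_alerts(expected: dict, actual: dict) -> list:
        alerts = []
        for metric, target in expected.items():
            value = actual.get(metric)
            if value is None:
                continue
            # Sales below target or complaints above target both warrant an alert.
            too_low = metric == "daily_sales" and value < target
            too_high = metric == "customer_complaints" and value > target
            if too_low or too_high:
                alerts.append(f"ALERT: {metric} is {value}, expectation was {target}")
        return alerts

    for line in decision_alerts(expectations, actuals):
        print(line)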
"Computers will become a truly useful part of our society only when they are linked by an infrastructure like the highway system and the electric power grid, creating a new kind of free market for information servic es. The feature that makes the highway and electric power grids truly useful is their pervasiveness. Every home and office has ready access to these services; thus, they are usedwithout thoughtin the normal course of living and working. This pervasive accessibility has emerged largely because of the adoption of standards for interconnection. If there were no standards for driving, imagine the confusion and danger. What if every wall plug were a different shape, or the power available on every plug were random? If using a service requires too much thought and attention, that service cannot become a default part of our living and working environment. "Imagine the United States without its highways. Our millions of cars, buses, and trucks driven in our own backyards and neighborhood parking lots, with occasional forays by the daring few along uncharted, unpredictable, and treacherous dirt roads, full of unspeakable terrors."7 The parking lot analogy illustrated in Figure 1.1 re presents the current information-processing environment in most organizations. It is easy and transparent to locate and use information on a local area network (LAN), but information located on another LAN is almost inaccessible. End-user access to enterprise data often is unavailable except for predefined information requests. Although computersfrom mainframes to PCsare numerous, powerful, flexible, and widely used, they are still used in relative isolation. When they communicate, they usually do so ineffectively, through arcane and arbitrary procedures. Information comes with many faces. As shown in Figure 1.2, it can take the form of text, drawings, music, speech, photographs, stock prices, invoices, software, live video, and many other entities. Yet once information is computerized, it becomes a deceptively uniform sequence of ones and zeros. The underlying infrastructure must be flexible in the way it transports these ones and zeros. To be truly effective besides routin g these binaries to their destinations the infrastructure must be able to carry binaries with varying degrees of speed, accuracy, and security to accommodate different computer capabilities and needs. Because computers are manufactured and sold by vendors with differing views on the most effective technology, they do not share common implementation concepts. Transporting ones and zeros around, however flexibly, isn't enough. Computers based on different technologies cannot comprehend each other's ones and zeros any more than people comprehend foreign languages. We therefore need to endow our IS organizations with a set of widely understood common information interchange conventions. Moreover, these conventions must be based on concepts that make life easier for humans, rather than for computer servants. Finally, the truly useful infrastructure must be equipped with "common servers"computers that provide a few basic information services of wide interest, such as computerized white and yellow pages.
Technological innovation proceeds at a pace that challenges the human mind to understand how to take advantage of its capabilities. Electronic information management, technological innovation in the personal computer, high-speed electronic communication, and digital encoding of information provide new opportunities for enhanced services at lower cost. Personal computers can provide services directly to people who have minimal computer experience. They provide low-cost, high-performance computing engines at the site where the individual lives, works, or accesses the service, regardless of where the information is physically stored. Standards for user interface, data access, and interprocess communication have been defined for the personal computer and are being adopted by a majority of the vendor community. There is no reason to accept solutions that do not conform to the accepted standards. Most large organizations today use a heterogeneous collection of hardware, software, and connectivity technologies. There is considerable momentum toward increased use of technology from multiple vendors. This trend leads to an increasingly heterogeneous environment for users and developers of computer systems. Users are interested in the business functionality, not the technology. Developers rarely are interested in more than a subset of the technology. The concept of the single-system image says that you can build systems that provide transparency of the technology platform to the user and, to the greatest extent possible, to the developer. Developers will need sufficient knowledge of the syntax used to solve the business problem, but they will need little or no knowledge of the underlying technology infrastructure. Hardware platforms, operating systems, database engines, and communications protocols are necessary technological components of any computer solution, but they should provide services, not create obstacles to getting the job done. Services should be masked; that is, they should be provided in a natural manner without requiring the user to make unnatural gyrations to invoke them. Only by masking these services and by using standard interfaces can we hope to develop systems quickly and economically. At the same time, masking (known as encapsulation in object-oriented programming) and standard interfaces preserve the ability to change the underlying technology without affecting the application. There is value in restricting imagination when you build system architectures. Systems development is not an art; it is an engineering discipline that can be learned and used. Systems can be built on the foundations established by previous projects.
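The masking idea can be made concrete with a short sketch. The interface and the two stores below are hypothetical; what matters is that the application codes against the standard interface, so the underlying technology can change without affecting the application logic.

    # Sketch of "masking" (encapsulation): the application sees only the
    # standard interface, never the platform behind it. All names are illustrative.
    from abc import ABC, abstractmethod

    class CustomerStore(ABC):
        """Standard interface the application is written against."""

        @abstractmethod
        def lookup(self, customer_id: str) -> dict:
            ...

    class MainframeStore(CustomerStore):
        def lookup(self, customer_id: str) -> dict:
            # A real implementation would call the host; stubbed for the sketch.
            return {"id": customer_id, "source": "mainframe"}

    class WorkgroupDatabaseStore(CustomerStore):
        def lookup(self, customer_id: str) -> dict:
            return {"id": customer_id, "source": "workgroup database"}

    def display_customer(store: CustomerStore, customer_id: str) -> None:
        # Identical application logic, whichever platform provides the service.
        print(store.lookup(customer_id))

    display_customer(MainframeStore(), "C-1001")
    display_customer(WorkgroupDatabaseStore(), "C-1001")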
Within the single-system image environment, a business system user is totally unaware of where data is stored, how the client and server processors work, and what networking is involved in gaining connectivity. Every application that the user accesses provides a common "look and feel." Help is provided in the same way by every application. Errors are presented and resolved in the same way by every application. Access is provided through a standard security procedure for every application. Each user has access to all services for which he or she has a need and a right to access.
The security layer is invisible to the authorized and impenetrable to the unauthorized.
Navigation from function to function and application to application is provided in the same way in every system. New applications can be added with minimal training, because the standard functions work in the same way, and only the new business functions need be learned. It is not necessary to go to "boot camp for basic training" prior to using each new application. Basic training is a one-time effort because the basics do not change.
The complexity of a heterogeneous computing platform will result in many interfaces at both the logical and physical level. Organizations evolve from one platform to another as the industry changes, as new technologies evolve that are more cost-effective, and as acquisitions and mergers introduce other installed platforms. All these advances must be accommodated. There is complexity and risk when attempting to interoperate among technologies from many vendors. It is necessary to engage in "proof of concept" testing to distinguish the marketing version of products and architectures from the delivered version. Many organizations use a test lab concept called the technology competency center (TCC) to do this "proof of concept." The TCC concept provides a local, small-scale model of all the technologies involved in a potential single-system, interoperable image. Installing a proposed solution using a TCC is a low-cost means of ensuring that the solution is viable. These labs enable rapid installation of the proposed solution into a proven environment. They eliminate the need to set up from scratch all the components that are necessary to support the unique part of a new application. Organizations such as Merrill Lynch, Health Canada, SHL Systemhouse, BSG Corporation, Microsoft, and many others use such labs to do sanity checks on new technologies. The rapid changes in technology capability dictate that such a resource be available to validate new products. The single-system image is best implemented through the client/server model. Our experience confirms that client/server computing can provide the enterprise to the desktop. Because the desktop computer is the user's view into the enterprise, there is no better way to guarantee a single image than to start at the desktop. Unfortunately, it often seems as if the number of definitions of client/server computing depends on how many organizations you survey, whether they're hardware and software vendors, integrators, or IS groups. Each has a vested interest in a definition that makes its particular product or service an indispensable component. Throughout this book, the following definitions will be used consistently:
Client: A client is a single-user workstation that provides presentation services and the appropriate computing, connectivity, and database services and interfaces relevant to the business need.
Server: A server is one or more multi-user processors with shared memory providing computing, connectivity, and database services and interfaces relevant to the business need.
Client/server computing is an environment that satisfies the business need by appropriately allocating the application processing between the client and the server processors. The client requests services from the server; the server processes the request and returns the result to the client. The communications mechanism is message-passing interprocess communication (IPC) that enables (but does not require) distributed placement of the client and server processes. Client/server is a software model of computing, not a hardware definition. This definition makes client/server a rather generic model and fits what is known in the industry as "cooperative processing" or "peer-to-peer." Because the client/server environment is typically heterogeneous, the hardware platform and operating system of the client and server are not usually the same. In such cases, the communications mechanism may be further extended through a well-defined set of standard application program interfaces (APIs) and remote procedure calls (RPCs). The modern diagram representing the client/server model was probably first popularized by Sybase. Figure 1.4 illustrates the single-system image vision. A client user relies on the desktop workstation for all computing needs. Whether the application runs totally on the desktop or uses services provided by one or more servers, be they powerful PCs or mainframes, is irrelevant. Effective client/server computing will be fundamentally platform-independent. The user of an application wants the business functionality it provides; the computing platform provides access to this business functionality. There is no benefit, yet considerable risk, in exposing this platform to its user. Changes in platform and underlying technology should be transparent to the user. Training costs, business processing delays and errors, staff frustration, and staff turnover result from the confusion generated by changes in environments where the user is sensitive to the technology platform.
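To make these definitions concrete, the following sketch pairs a client and a server over plain TCP sockets. The "service" (uppercasing a message) is deliberately trivial and purely illustrative; the point is the shape of the exchange: the client requests a service, and the server processes the request and returns the result.

    # Minimal client/server exchange using message-passing IPC (TCP sockets).
    # The uppercase "service" is an invented example.
    import socket
    import threading

    HOST, PORT = "127.0.0.1", 5050

    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind((HOST, PORT))
    listener.listen()

    def serve_one_request() -> None:
        conn, _ = listener.accept()
        with conn:
            request = conn.recv(1024).decode()   # receive the client's request
            result = request.upper()             # process it (the "service")
            conn.sendall(result.encode())        # return the result

    threading.Thread(target=serve_one_request, daemon=True).start()

    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as client:
        client.connect((HOST, PORT))
        client.sendall(b"please process this order")   # request the service
        print(client.recv(1024).decode())              # PLEASE PROCESS THIS ORDER

    listener.close()

Note that nothing in the client depends on where the server runs; moving the server process to another machine changes only the HOST value, which is the essence of the distributed placement the definition allows.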
It is easily demonstrated that systems built with transparency to the technology, for all users, offer the highest probability of solid ongoing return on the technology investment. It is equally demonstrable that if developers become aware of the target platform, development will be bound to that platform. Developers will use special features, tricks, and syntax found only in the specific development platform. Tools that isolate developers from the specifics of any single platform assist developers in writing transparent, portable applications. These tools must be available for each of the three essential components in any application: data access, processing, and interfaces. Data access includes the graphical user interface (GUI) and stored data access. Processing includes the business logic. Interfaces link services with other applications. This simple model, reflected in Figure 1.5, should be kept in mind when following the evolution to client/server computing. The use of technology layers provides this application development isolation. These layers isolate the characteristics of the technology at each level from the layers above and below. This layering is fundamental to the development of applications in the client/server model. The rapid rate of change in these technologies and the lack of experience with the "best" solutions imply that we must isolate specific technologies from each other. This book will continue to emphasize and expand on the concept of a systems development environment (SDE) as a way to achieve this isolation. Developer tools are by far the most visible component. Most developers need to know only the syntax of these tools to express the business problem in a format acceptable to the technology platform. With the increasing involvement of noncomputer professionals as technology users and application assemblers, technology isolation is even more important. Very few, perhaps none, of an organization's application development staff need to be aware of the hardware, system software, specific database engines, specific communications products, or specific presentation services products. These are invoked through APIs and message passing, and are generated by tools or by a few technical specialists. As you will see in Chapter 6, the development of an application architecture supported by a technical architecture and systems development environment is the key to achieving this platform independence and, ultimately, to developing successful client/server applications.
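A small sketch of this layered separation follows. The in-memory SQLite table stands in for stored data access purely for illustration; only the data-access layer knows it exists, the processing layer is pure business logic, and the interface layer handles presentation.

    # Sketch of the data access / processing / interface split; the orders
    # table and amounts are invented for illustration.
    import sqlite3

    # Data-access layer: the only code aware of the storage technology.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 120.00), (2, 80.50)])

    def fetch_order_amounts() -> list:
        return [row[0] for row in conn.execute("SELECT amount FROM orders")]

    # Processing layer: business logic with no storage or presentation knowledge.
    def total_outstanding(amounts: list) -> float:
        return sum(amounts)

    # Interface layer: presentation only.
    def show_total() -> None:
        print(f"Outstanding orders: {total_outstanding(fetch_order_amounts()):.2f}")

    show_total()   # Outstanding orders: 200.50

Replacing SQLite with a host database engine would touch only the data-access layer; the processing and interface layers remain unchanged, which is exactly the isolation the layering is meant to provide.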
As organizations increase the use of personal productivity tools, workstations become widely installed. The need to protect desktop real estate requires that host terminal capabilities be provided by the single workstation. It soon becomes evident that the power of the workstation is not being tapped, and application processing migrates to the desktop. Once most users are connected from their desktop workstation to the applications and data at the host mainframe or minicomputer, there is significant cost benefit in offloading processing to these powerful workstations. The first applications tend to be data capture and edit. These simplify, but still use, the transaction expected by an already existing host application. If the workstation is to become truly integrated with the application, reengineering of the business process will be necessary. Accounting functions and many customer service applications are easily offloaded in this manner. Thus, workgroup and departmental processing is done at the LAN level, with host involvement for enterprise-wide data and enforcement of interdepartmental business rules. In the "dumb" terminal emulation environment (IBM uses the euphemism nonprogrammable to describe its 327x devices), all application logic resides in the minicomputer or mainframe. Clearly, a $5,000 or less desktop workstation is capable of much more than the character display provided by a $500 terminal. In the client/server model, the low-cost processing power of the workstation will replace host processing, and the application logic will be divided appropriately among the platforms. As previously noted, this distribution of function and data is transparent to the user and application developer.
The mainframe-centric model uses the presentation capabilities of the workstation to front-end existing applications. The character-mode interface is remapped by products such as Easel and Mozart. The same data is displayed or entered through the use of pull-down lists, scrollable fields, check boxes, and buttons; the user interface is easy to use, and information is presented more clearly. In this mainframe-centric model, mainframe applications continue to run unmodified, because the existing terminal data stream is processed by the workstation-based communications API. This protects the investment in existing applications while improving performance and reducing costs. Character-mode applications, usually driven from a block-mode screen, attempt to display as much data as possible in order to reduce the number of transmissions required to complete a function. Dumb terminals impose limitations on the user interface, including fixed-length fields, fixed-length lists, crowded screens, single or limited character fonts, limited or no graphics icons, and limited windowing for multiple application display. In addition, the fixed layout of the screen makes it difficult to support the display of conditionally derived information. In contrast, the workstation GUI provides facilities to build the screen dynamically. This enables screens to be built with a variable format based conditionally on the data values of specific fields. Variable-length fields can be scrollable, and lists of fields can have a scrollable number of rows. This enables a much larger virtual screen to be used with no additional data communicated between the client workstation and the server. Windowing can be used to pull up additional information such as help text, valid value lists, and error messages without losing the original screen contents. The more robust GUI facilities of the workstation enable the user to navigate easily around the screen. Additional information can be encapsulated by varying the display's colors, fonts, graphics icons, scrollable lists, pull-down lists, and option boxes. Option lists can be provided to enable users to quickly select input values. Help can be provided, based on the context and the cursor location, using the same pull-down list facilities. Although it is a limited use of client/server computing capability, a GUI front end to an existing application is frequently the first client/server-like application implemented by organizations familiar with the host mainframe and dumb-terminal approach. The GUI preserves the existing investment while providing the benefits of ease of use associated with a GUI. It is possible to provide dramatic and functionally rich changes to the user interface without host application change.
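A hedged sketch of this remapping idea follows: fixed-position fields are pulled out of a character-mode screen and handed to GUI widgets as named values. The screen layout and field positions are invented for illustration; real products such as Easel worked from actual host screen maps.

    # Remapping a character-mode screen into named fields a GUI could bind to.
    # The layout and positions below are hypothetical.
    SCREEN = "ACME ORDER ENTRY   CUST: 00012345  STATUS: OPEN   TOTAL: 00019995"

    # (field name, start column, end column) for each fixed-length field.
    FIELD_MAP = [("customer", 25, 33), ("status", 43, 47), ("total_cents", 57, 65)]

    def remap(screen: str) -> dict:
        fields = {name: screen[start:end].strip() for name, start, end in FIELD_MAP}
        # The GUI can show a formatted amount instead of the raw numeric field.
        fields["total"] = f"${int(fields.pop('total_cents')) / 100:.2f}"
        return fields

    print(remap(SCREEN))   # {'customer': '00012345', 'status': 'OPEN', 'total': '$199.95'}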
The next logical step is the provision of some edit and processing logic executing at the desktop workstation. This additional logic can be added without requiring changes in the host application and may reduce the host transaction rate by sending up only valid transactions. With minimal changes to the host application, network traffic can be reduced and performance can be improved by using the workstation's processing power to encode the data stream into a compressed form. A more interactive user interface can be provided, with built-in, context-sensitive help, extensive prompting, and user interfaces that are sensitive to the user's level of expertise. These options can be added through the use of workstation processing power. These capabilities enable users to operate an existing system with less intensive training and may even provide the opportunity for public access to the applications. Electronic data interchange (EDI) is an example of this front-end processing. EDI enables organizations to communicate electronically with their suppliers or customers. Frequently, these systems provide the workstation front end to deal with the EDI link but continue to work with the existing back-end host system applications. Messages are reformatted and responses are handled by the EDI client, but application processing is done by the existing application server. Productivity may be enhanced significantly by capturing information at the source and making it available to all authorized users. Typically, if users employ a multipart form for data capture, the form data is entered into multiple systems. Capturing this information once to a server in a client/server application, and reusing the data for several client applications, can reduce errors, lower data entry costs, and speed up the availability of this information.
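A sketch of this desktop edit logic follows. The field names, validation rules, and use of zlib compression are assumptions made for illustration; the pattern is simply that invalid transactions never leave the workstation, and valid ones travel to the host in compressed form.

    # Desktop edit logic: validate locally, send only valid transactions,
    # and compress the data stream. Field names and rules are hypothetical.
    import json
    import zlib

    def validate(txn: dict) -> list:
        errors = []
        if not txn.get("account", "").isdigit():
            errors.append("account number must be numeric")
        if txn.get("amount", 0) <= 0:
            errors.append("amount must be greater than zero")
        return errors

    def prepare_for_host(txn: dict):
        errors = validate(txn)
        if errors:
            print("Rejected at the desktop:", "; ".join(errors))
            return None                     # nothing is sent; host load is reduced
        return zlib.compress(json.dumps(txn).encode())   # compressed data stream

    prepare_for_host({"account": "12A", "amount": 0})             # rejected locally
    payload = prepare_for_host({"account": "123456", "amount": 42.50})
    print(len(payload), "bytes sent to the host application")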
There is no delay while the forms are passed around the organization. This is usually a better technique than forms-imaging technology, in which the forms are created and distributed internally in an organization. The use of workflow-management technology and techniques, in conjunction with imaging technology, is an effective way of handling this process when forms are filled out by a person who is physically remote from the organization. Intelligent character recognition (ICR) technology can be an extremely effective way to automate the capture of data from a form, without the need to key the data. Current experience with this technique shows accuracy rates greater than 99.5 percent for typed forms and greater than 98.5 percent for handwritten forms.
Rightsizing and rationalizing are strategies used with the client/server model to take advantage of the lower cost of workstation technology. Rightsizing and upsizing may involve the addition of more diverse or more powerful computing resources to an enterprise computing environment. The benefits of rightsizing are reduction in cost and/or increased functionality, performance, and flexibility in the applications of the enterprise. Significant cost savings usually are obtained from a resulting reduction in employee, hardware, software, and maintenance expenses. Additional savings typically accrue from the improved effectiveness of the user community using client/server technology. Eliminating middle layers of management implies empowering the first level of management with the decision-making authority for the whole job. Information provided at the desktop by networked PCs and workstations integrated with existing host (such as mainframe and minicomputer) applications is necessary to facilitate this empowerment. These desktop-host integrated systems house the information required to make decisions quickly. To be effective, the desktop workstation must provide access to this information as part of the normal business practice. Architects and developers must work closely with business decision makers to ensure that new applications and systems are designed to be integrated with effective business processes. Much of the cause of poor return on technology investment is attributable to a lack of understanding by the designers of the day-to-day business impact of their solutions. Downsizing information systems is more than an attempt to use cheaper workstation technologies to replace existing mainframes and minicomputers. Although some benefit is obtained by this approach, greater benefit is obtained by reengineering the business processes to really use the capabilities of the desktop environment. Systems solutions are effective only when they are seen by the actual user to add value to the business process. Client/server technology implemented on low-cost standard hardware will drive downsizing. Client/server computing makes the desktop the user's enterprise. As we move from the machine-centered era of computing into the workgroup era, the desktop workstation is empowering the business user to regain ownership of his or her information resource. Client/server computing combines the best of the old with the new: the reliable multiuser access to shared data and resources with the intuitive, powerful desktop workstation.
Object-oriented development concepts are embodied in the use of an SDE created for an organization from an architecturally selected set of tools. The SDE provides more effective development and maintenance than companies have experienced with traditional host-based approaches. Client/server computing is open computing. Mix and match is the rule. Development tools and development environments must be created with both openness and standards in mind. Mainframe applications rarely can be downsized, without modifications, to a workstation environment. Modifications can be minor, wherein tools are used to port existing mainframe source code, or major, wherein the applications are rewritten using completely new tools. In porting, native COBOL compilers, functional file systems, and emulators for DB2, IMS DB/DC, and CICS are available for workstations. In rewriting, there is a broad array of tools, ranging from PowerBuilder, Visual Basic, and Access to larger-scale tools such as Forte and Dynasty. Micro Focus has added an object-oriented (OO) option to its workbench to facilitate the creation of reusable components. The OO option supports integration with applications developed under Smalltalk/V PM. IBM's CICS for OS/2, OS/400, RS/6000, and HP/UX products enable developers to directly port applications using standard CICS call interfaces from the mainframe to the workstation. These applications can then run under OS/2, AIX, OS/400, HP/UX, or MVS/VSE without modification. This promises to enable developers to create applications for execution in the CICS MVS environment and later to port them to these other environments without modification. Conversely, applications can be designed and built for such environments and subsequently ported to MVS (if this is a logical move). Organizations envisioning such a migration should ensure that their SDE incorporates standards that are consistent for all of these platforms.
These products, combined with the economical processing power available on the workstation, make the workstation LAN an ideal development and maintenance environment for existing host processors. When an organization views mainframe or minicomputer resources as real dollars, developers can usually justify offloading the development in only three to six months. Developers can be effective only when a proper systems development environment is put in place and provided with a suite of tools offering the host capabilities plus enhanced connectivity. Workstation operating systems are still more primitive than the existing host server MVS, VMS, or UNIX operating systems. Therefore, appropriate standards and procedures must be put in place to coordinate shared development. The workstation environment will change. Only projects built with common standards and procedures will be resilient enough to remain viable in the new environment.
The major savings come from new projects that can establish appropriate standards at the start and do all development using the workstation LAN environment. It is possible to retrofit standards to an existing environment and establish a workstation with a LAN-based maintenance environment. The benefits are smaller because retrofitting the standards creates some costs. However, these costs are justified when the application is scheduled to undergo significant maintenance or if the application is very critical and there is a desire to reduce the error rate created by changes. The discipline associated with the movement toward client/server-based development, and the transfer of code between the host and client/server, will almost certainly result in better testing and fewer errors. The testing facilities and usability of the workstation will make the developer and tester more effective and therefore more accurate. Business processes use database, communications, and application services. In an ideal world, we pick the best servers available to provide these services, thereby enabling our organizations to enjoy the maximum benefit that current technology provides. Real-world developers make compromises around the existing technology, existing application products, training investments, product support, and a myriad of other factors. Key to the success of full client/server applications is selecting an appropriate application and technical architecture for the organization. Once the technical architecture is defined, the tools are known.
The final step is to establish an SDE that defines the standards needed to use the tools effectively. This SDE is the collection of hardware, software, standards, standard procedures, interfaces, and training built up to support the organization's particular needs. Many construction projects fail because their developers assume that a person with a toolbox full of carpenter's tools is a capable builder.
To be a successful builder, a person must be trained to build according to standards. The creation of standards to define interfaces to the sewerage, water, and electrical utilities, and to road, school, and community systems, is essential for successful, cost-effective building. We do not expect a carpenter to design such interfaces individually for every building. Rather, pragmatism discourages imagination in this regard. By reusing the models previously built to accomplish integration, we all benefit from cost and risk reduction. The introduction of a whole new generation of object-oriented, technology-based tools for client/server development demands that proper standards be put in place to support shared development, reusable code, interfaces to existing systems, security, error handling, and an organizational standard "look and feel." As with any new technology, there will be changes. Developers can build application systems closely tied to today's technology, or they can use an SDE and build applications that can evolve along with the technology platform.