Structured Programming Research Paper Starter


This article explains the concept of structured programming and examines some of the key aspects of the structured programming process. The impact of the information systems development life cycle on structured programming is also examined. The use of Function Point Analysis as a tool to measure the complexity of a computer program is explained, along with the basic process of applying function point analysis. The issue of improving quality in software development is examined, and various methods that can be employed to improve software quality are investigated.

Keywords Function Point Analysis; Information systems development life cycle (ISDLC); Management Information Systems; Requirements analysis; Software Development Productivity; Structured Development; Structured Programming

Business Information Systems: Structured Programming


There are two major aspects of structured software development. The first is the overall method of managing software over its life cycle. The second is the style and method in which computer programs are actually written. Software consists of abstract sets of rules that govern the creation, transfer, and transformation of data. Initially existing solely as an idea, software is iteratively refined becoming completely visible only at its completion. This invisibility is compounded for large software projects, for which logical complexity cannot be maintained in one person's mind, and for which development must be partitioned into a number of tasks that are assigned to different programmers. As task descriptions are only models of the intended abstraction, and as the individual performing a task interprets these descriptions through a unique worldview, most software errors occur at the interfaces of modules written by different programmers (Zelkowitz, 1978).

How software is developed and managed over its lifecycle varies considerably from organization to organization. In large organizations that use numerous and varied applications software packages, a disciplined and structured approach to software management is generally followed. The Information System Development Life Cycle (ISDLC) is an established concept in the MIS arena. The traditional approach to the ISDLC is that a development project has to undergo a series of phases where the completion of each is a prerequisite to the commencement of the next and where each phase consists of a related group of steps. The general scheme for the ISDLC is similar almost everywhere. It typically contains four major phases consisting of several steps each:

  • The Definition Phase: Consisting of preliminary analysis, feasibility study, information analysis, and system design.
  • The Construction Phase: Consisting of programming, development of procedures, unit testing, quality control, and documentation.
  • The Implementation Phase: Consisting of user training, conversion of old systems to new systems, thorough field testing, and then a move to full operations.
  • The Maintenance Phase: After the system is in full operation, updates are made to assure continued operations as new equipment or upgrades to operating systems occur. Enhancements to the system can also be made to meet changing user requirements.

The traditional approach to software management advocates a rigid ISDLC in order to assure control over the development process. In practice, however, development processes are not that rigid. They vary with respect to the complexity of the system under development, the importance attached to that system, and the user's environment (Ahituv, Hadass & Neumann, 1984). The various steps of the ISDLC are usually performed on all projects, but not necessarily in the traditional order. For example, the testing, quality control, and documentation steps may not occur until everybody involved is satisfied with data models or with prototypes of systems.

Structured Design

Structured design is a set of general program design considerations and techniques for making coding, debugging, and modification easier, faster, and less expensive by reducing complexity. The extent to which structured programming methods are followed varies from organization to organization. In general, the more complex a system is the more likely it is that structured design and programming methods will be applied to the development process.

Simplicity is the primary method for evaluating alternative designs to reduce debugging and modification time. Simplicity can be enhanced by dividing the applications software into separate pieces in such a way that pieces can be considered, implemented, fixed, and changed with minimal consideration or effect on the other pieces of the software. Observability (the ability to easily perceive how and why actions occur) is another useful consideration that can help in designing programs that can be changed easily. Consideration of the effect of reasonable changes is also valuable for evaluating alternative designs.


Structured programming and simplicity guidelines call for developing software in modules. The term module is used to refer to a set of one or more contiguous program statements having a name by which other parts of the system can invoke it and preferably having its own distinct set of variable names. Examples of modules are PL/I procedures, FORTRAN mainlines and subprograms, and, in general, subroutines of all types. Considerations are always with relation to the program statements as coded, since it is the programmer's ability to understand and change the source program that is under consideration.
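As a brief sketch of this idea in a modern language (all names here are illustrative, not from the source), a module can be as small as a named function with its own distinct local variable names that other parts of the system invoke solely through its interface:

```python
# A minimal "module": a named, invocable unit with its own
# distinct set of variable names (names are illustrative).

def compute_order_total(prices, tax_rate):
    """Invocable by name; uses only its own local variables."""
    subtotal = sum(prices)        # local name, invisible elsewhere
    tax = subtotal * tax_rate     # local name, invisible elsewhere
    return subtotal + tax

# Other parts of the system invoke the module by name and see
# only its interface: the parameters and the return value.
total = compute_order_total([10.0, 5.0], 0.25)
print(total)  # 18.75
```

Because the caller depends only on the name and the parameter list, the body can be understood, fixed, or rewritten without reference to the rest of the program.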

While it is conceptually useful to discuss dividing whole programs into smaller pieces, the techniques for designing new, independent modules from the start are relatively simple. It can be far more difficult, on the other hand, to divide an existing program into separate pieces without increasing its complexity, because of the amount of overlapped code and other interrelationships that usually exist.

Module Connections

The fewer and simpler the connections between modules, the easier it is to understand each module without reference to other modules. Minimizing connections between modules also minimizes the paths along which changes and errors can propagate into other parts of the system. This helps to eliminate disastrous "ripple" effects, where changes in one part cause errors in another, necessitating additional changes elsewhere that often give rise to new errors. The widely used technique of using common data areas (or global variables or modules without their own distinct set of variable names) can often result in an enormous number of connections between the modules of a program.
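A short sketch may make the contrast concrete (the names and the 25% rate are illustrative, not from the source). With a common data area, every module that touches the shared structure is implicitly connected to every other module that touches it; with an explicit interface, the only connection is the parameter list:

```python
# Common-area style: a shared structure couples every module
# that reads or writes it (illustrative example).
shared = {"rate": 0.25, "total": 0.0}

def add_tax_common():
    # Depends on, and silently modifies, the common area.
    shared["total"] = shared["total"] * (1 + shared["rate"])

# Explicit-interface style: the only connection between caller
# and callee is the visible parameter list and return value.
def add_tax(total, rate):
    return total * (1 + rate)

shared["total"] = 100.0
add_tax_common()
print(shared["total"])        # 125.0
print(add_tax(100.0, 0.25))   # 125.0
```

Both produce the same result, but a change to the layout or meaning of `shared` can ripple into `add_tax_common` and every other module using it, while `add_tax` can only be affected through its declared parameters.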

The complexity of a system is affected not only by the number of connections but by the degree to which each connection couples (associates) two modules, making them interdependent rather than independent. Coupling is the measure of the strength of association established by a connection from one module to another. Strong coupling complicates a system since a module is harder to understand, change, or correct by itself if it is highly interrelated with other modules. Structure can be improved and complexity reduced by designing systems with the weakest possible coupling between modules.

The degree of coupling established by a particular connection is a function of several factors, and thus it is difficult to establish a simple index of coupling. Coupling depends (1) on how complicated the connection is, (2) on whether the connection refers to the module itself or something inside it, and (3) on what is being sent or received.

Coupling increases with complexity or obscurity of the interface. Coupling is lower when the connection is with the normal module interface than when the connection is with an internal component. Coupling is lower with data connections than with control connections, which are in turn lower than hybrid connections (modification of one module's code by another module).
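The difference between data and control connections can be sketched as follows (function names and the flag values are illustrative assumptions, not from the source). With data coupling, the caller passes only the data to be operated on; with control coupling, the caller passes a flag that steers the callee's internal logic, so the caller must know something about how the callee works inside:

```python
# Data coupling: the connection carries only data (illustrative).
def total_price(prices):
    return sum(prices)

# Control coupling: the `mode` flag directs the callee's internal
# branching, tying the caller to the callee's implementation.
def process(prices, mode):
    if mode == "sum":
        return sum(prices)
    elif mode == "max":
        return max(prices)
    raise ValueError(mode)

print(total_price([3, 4]))     # 7
print(process([3, 4], "max"))  # 4
```

A hybrid connection, where one module modifies another module's code, has no safe equivalent in most modern languages, which is itself an argument for why it couples modules most strongly of all.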

Every element in the common environment, whether used by particular modules or not, constitutes a separate path along which errors and changes can propagate. Each element in the common environment adds to the complexity of the total system. Changes to, and new uses of, the common area can potentially impact all modules in unpredictable ways. Data references may become unplanned, uncontrolled, and even unknown.
