an industry term for a large computer...
by Viking Waters
by Lance H.
Reprinted here with his permission
Having read John Campbell's "What is a Mainframe", and
having been asked this question myself many times, I would like to propose a
more illuminating definition. First, however, some very brief biographical
info. I first became interested in computing machines as a teenager. In those
days the 2nd generation was rapidly drawing to a close and System/360 was about
to change the computing landscape. My first programming experience was in high
school, where my class had access to a very fast IBM 7094-II (and before you
ask, no, my high school did not have its own 7094; we were allowed limited use
of one of MIT's systems). In college I majored in math, primarily because
computer science as a major was still about 4 years in the future.
Nevertheless, my first love has always been computing machines, and I have
invested a lifetime of study and labor in this industry. I have worked with all
platforms except vector processing based supercomputers. My favorite has always
been, and remains to this day, the mainframe.
One might suppose that it would be easy to define a
mainframe, but such is not the case. Some definitions are so broad that they
include all computing platforms. Others seek to concentrate on some particular
aspect of mainframe computing (such as the operating systems which run on a
mainframe) and declare that a mainframe is that which runs or supports this
computing aspect. This latter definition suffers from two problems: 1) it is
completely unenlightening; and 2) it is misleading. For example, the FLEX/ES
simulator allows one to run OS/390, VM, and VSE/ESA on a fast Intel processor.
Yet most people who have worked with both classes of machine would intuitively
consider the Intel PC to be the opposite of a mainframe.
Moreover, in the debate between client/server-oriented
computing and mainframe-based solutions, the inability to clearly define the
latter has cost more than one data center its mainframe. The "new paradigm"
proclaimed that a clustering of small, limited architecture machines,
interconnected by elaborate topologies, was the wave of the future. Lost to a
nontechnical senior management was the fact that in implementing this new
computational model they were at the same time eliminating the most powerful,
comprehensive, and sophisticated class of computing platforms ever brought to market.
So what is a mainframe? In order to answer this question I
sat down one weekend and reviewed the history of mainframe computing,
concentrating on those elements that are unique to the mainframe world. The
result of this effort was the following definition, which has the dual
advantages of being both concise and precise. It also invites elaboration and
serves as the starting point for an in-depth discussion of the issues it raises.
"A mainframe is a continually evolving general purpose
computing platform incorporating in its architectural definition the essential
functionality required by its target applications."
Some additional comments about this definition are in order.
One of the most fundamental features of the mainframe world is the rapid and
apparently endless evolution of the product line. From 16 general and 4
floating point registers of System/360, to the control register additions in
the early 370s, to the access registers of the latter 370s, to the full
complement of floating point registers of System/390 and the full 64 bit
implementation offered by the z800/900 models; from 6 selector channels to 16
block multiplexing channels to 256 high speed optical channels; from 142
instructions to over 500 instructions; from real addressing to virtual
addressing to virtual machines; from the simple 8 bit memory of the 360/30
through generations of development to the multiported, multilevel caching,
multiprocessor supporting memory of the z900, the entire hardware domain of the
mainframe world has been characterized by an unmatched, and indeed unprecedented, pace of evolution.
During much of the first 20 years of the modern mainframe
era (which began on April 7, 1964) individual models of the mainframe line were
targeted by competitive systems heavily optimized to provide a superior
price/performance product within a well defined niche market. As the mainframe
evolved through product refresh cycles and new product announcements, the niche
advantage offered by these special purpose competitors was marginalized, and
their ability to compete in a market that demanded an ever greater general
purpose capability was simply overwhelmed.
The most critical defining element of the mainframe paradigm
is that the solutions it provides are implemented primarily in hardware,
including microcode, an approach (contrary to what many users of other
platforms might imagine) that is truly unique to the mainframe world. From the
early RPQs of the 360 era, to the numerous "assists" of the primary 370 era, to
the full-blown architectural enhancements of the late 370 and 390 periods, the
mainframe has been a hardware test bed of unmatched scope and versatility. By
way of comparison, you may recall that a few years ago Intel added a half dozen
instructions to its line of Pentium processors to facilitate graphics
processing. Their announcement took a certain pride in noting that this was the
first change to the PC's instruction set in the previous 13 years!
One of the most striking elements of mainframe computing,
when viewed over time, is the extent to which the architecture changes to
accommodate user requirements. One of the early selling points of System/360
was its stand-alone emulation of 2nd generation systems. By the time System/370
came along, stand-alone emulation was replaced by integrated emulation, a
critical user requirement. Hundreds of RPQs have been made available over the
years to satisfy one user requirement or another. Some of these solutions were
limited time offerings; others became a permanent part of the architecture. One
of my favorites from the former group was the High Accuracy Arithmetic Facility
(HAAF) available on the IBM 4361. This mainframe, marketed as a supermini, was
targeted at university math and physics departments. With installation of the
HAAF one could do floating point arithmetic without carrying a characteristic
in the floating point number. Moreover, all errors introduced by fraction
(mantissa) shifting were eliminated. This facility permitted floating point
arithmetic to be analyzed for accuracy under a wide range of computational
conditions, a stunning capability for the math and physics users.
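The kind of accuracy loss the HAAF addressed can be illustrated with ordinary IEEE 754 doubles (a sketch in Python; the HAAF itself operated on System/370 hexadecimal floating point, which this does not reproduce): before two operands of differing exponents can be added, the smaller operand's fraction is shifted right to align them, and the shifted-out bits are simply lost.

```python
# Illustration only: IEEE 754 binary64, not the S/370 hex format the HAAF targeted.
# A double carries a 53-bit fraction. Near 1e16 the spacing between adjacent
# representable values is 2.0, so aligning a 1.0 addend shifts its fraction
# entirely out of range and the addend vanishes.
a = 1e16
print(a + 1.0 == a)         # True: the 1.0 is lost during fraction alignment

# The same shifting makes floating point addition non-associative:
left = (a + 1.0) + 1.0      # each 1.0 is discarded in turn
right = a + (1.0 + 1.0)     # 2.0 survives: it matches the spacing near 1e16
print(left == right)        # False
```

Pinpointing exactly where such shifts discard significant bits, across a wide range of operand patterns, is the sort of analysis a facility like the HAAF made tractable.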
In summary, the essential characteristics of a mainframe
are: rapid and continuing evolution, general purpose orientation, hardware
implemented solutions, and the criticality of user input to all of these elements.