Department of Computer and Information Science
New Jersey Institute of Technology
Could you tell us a little about your educational
and professional background and how you came to be interested in computers
as a career field?
When I entered Physics graduate school at Brandeis in 1958, they
decided (quite arbitrarily) to give me the IBM fellowship. Along with
that they gave me a FORTRAN manual and told me to go visit MIT, sit in
some seminars, and learn to program so I could do whatever any of the
faculty wanted programmed. I became the computer "guru" of the department.
When my thesis advisor went on a sabbatical for a year I took a year off
to work full time for IBM and then while doing my thesis I worked part
time for IBM. When I got my degree in theoretical astrophysics I went
to a non-profit as a physicist but they had so many problems dealing with
computers and the related area of command and control that I started doing
more and more work on computer problems. Then five years later I went
into a rather unique government situation as a senior operations research
analyst and had a chance to work with the very latest technology in time
sharing systems. That is where I had a chance to build the first Groupware
system that had ever been created (1970), although we called it Computer
Conferencing in those days. My primary interest in those days had not been
computers but the problems of humans dealing as groups with complex
problems that required input from a wide range of different talents. This
was based upon work I did during that period on the Delphi Method as a
group communication method to tackle complex problems. Today people would
call this Group Decision Support Systems in computer jargon. To me, the
computer was a device for structuring group communications.
I was very interested in your online handouts,
especially the "Folklore
of Interactive Systems" group. You seem to be saying that users and
designers of interactive systems have very different perspectives of the
system. Is that true and, if so, what are the implications for those of
us in the design business?
For a long time computers were very expensive and therefore it was common
to sacrifice the human user to reduce hardware and software costs. A lot
of people trained in this philosophy don't even realize they have been
trained to do this and are just not conscious of real user considerations.
Now that computers are so common we have a slightly different problem.
Most of the software is being built for the masses. If you want to play
music very easily you buy a CD player. But an expert musician would rather
pick up a violin. Too much of the software out there is designed for the
CD-player listeners and not for real experts, even in areas like writing.
The word processors may have thousands of functions but not much to help
the writers of books or large documents.
I was also really intrigued by your statement that
"Value Will Overcome Poor Interface." That seems to run counter to a lot
of current thinking. Do you still stand by that statement and, if so,
how do you define value?
In the early days of word processors on personal computers we had a group
of magazine writers on the EIES system at NJIT. They were reviewing word
processors for the first edition of the Whole Earth Software catalog.
Basically they all sort of said "my word processor is great" and were
exhibiting the basic phenomenon that the best word processor is the one
you learn first, because of the effort needed to learn a new one with a
different interface. (Why doesn't everyone use the more optimal keyboard
design?) At one point someone said they had tried a new word processor
and it did not have any of the real formatting power of what they were
using (WordStar, I think) but that they were going to change to it. Over
the next two months almost every one of the 25 writers changed to that one
word processor, which had far fewer functions than any of the others. That
was "ThinkTank," the first system to allow writers to do outlining and fill
in the outlines. Most of these professional writers learned to write with
outlining, and it gave them a much higher level of cognitive support. They
were willing to give up the fancier editors for the one with this ability,
which was much more valuable to their way of writing than the fancy formatting
features. The real secret of good design is understanding the problem
solving processes employed by your users to solve their problems. I can
pinpoint many examples in systems today where we do that poorly. As
long as our users are typified by novices we are not delivering systems
that are valuable to the experts. Philosophically and as an academic it
is bothersome that we are not really designing the sort of systems
that would allow novices to migrate to becoming experts. Look at all the
professional papers on the fact that users have all sorts of hidden errors
in spreadsheets leading to who knows how many wrong decisions by organizations
and groups. The spreadsheet may be easy to use, but it does not allow
a user to carefully and systematically error-check the results. It is
not that the users are somehow incapable, but that we have not
designed systems through which they can learn more sophisticated debugging
procedures.
You said "the real secret of good design is understanding
the problem solving processes employed by your users to solve their problems.
" Can you suggest some ways that designers can do this (or describe an
innovative way that you have seen it done "in real life")?
There are a number of standard methods in the evaluation literature. In
the cognitive sciences there is the method of protocol analysis. In anthropology
there is the method of participant observation which means you become
a user for a while and learn to do the things the user does - live with
the tribe. But even to use these methods you have to become trusted by
the users. Other related methods are using the Delphi method and/or Focus
Groups to gather information on requirements from a wider sample of the
user community and avoid relying on the squeaky wheels that are the first
to come forward. The problem is providing the user community a meaningful
mechanism of involvement without putting control of the design in their
hands. Most users are not able to understand the difference between what
they believe they want now and what they will really want after using
a system for a period of time. The designer is usually in a better position,
if he knows the field, to project future needs than the users themselves.
Sometimes it is clear, even in their current problem solving processes
that the computer can provide improvements they don't realize are possible.
In the real world it is difficult to get the resources to do the requirements
development and design the right way. In this situation the designer is
faced with using a lot of his intuition and awareness of what is going
on with the efforts at the forefront of the particular or related application
areas.
One of your major accomplishments is to develop
the Policy Delphi technique. Could you give us a brief overview of Delphi,
in general, and Policy Delphi, specifically? Are these techniques that
researchers in instructional technology could make use of?
The original concept of the Delphi method as first evolved was to aid
a large heterogeneous group to reach consensus on various complex problems
like projecting when computers would obtain a major new processing speed
through the introduction of a new technology (e.g. technological forecasting).
It is a written communication structure where people are sent an initial
round that summarizes the obvious things and asks them to fill in what
is not so obvious. After this first round of exploring and elaborating
on the topic they will vote and try to reach some consensus. On the third
round they are shown the disagreements and asked to give justifications
for disagreements or to change their views. They are all anonymous so
no one loses face by changing their mind. The Policy Delphi recognized
that on many policy issues people could not agree because they represented
certain unique interests, values, and/or organizational commitments. The
Policy Delphi was designed to allow people to express the strongest possible
disagreements (e.g. a Hegelian Inquiry process) so that the sponsor or
final decision maker(s) could see what the best information was for any
of the alternative resolutions to the policy issue. It was the first Delphi
designed to promote disagreement and encourage it rather than fostering
a consensus. Some proprietary Delphis in organizations have been used
as a mechanism to gather requirements from users as a group communication
process. They have a chance to see what everyone wants and express group
oriented priority voting for the benefit of the designers.
There is a good introductory article on Delphi on my homepage.
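The three-round structure described above can be sketched in code. This is purely an illustrative sketch and not any software mentioned in the interview: the function names, the 1-to-5 rating scale, and the use of the interquartile range as a disagreement test are my own assumptions.

```python
from statistics import median, quantiles

def round_one(contributions):
    """Round 1: pool anonymous contributions on the topic, de-duplicated."""
    return sorted(set(contributions))

def round_two(items, votes):
    """Round 2: each participant rates every item (say 1-5);
    report the median rating per item as a first consensus picture."""
    return {item: median(v[item] for v in votes) for item in items}

def round_three(items, votes, spread=1.0):
    """Round 3: flag items whose interquartile range shows real
    disagreement, so participants can justify or change their views."""
    flagged = []
    for item in items:
        ratings = sorted(v[item] for v in votes)
        q1, _, q3 = quantiles(ratings, n=4)  # quartile cut points
        if q3 - q1 > spread:
            flagged.append(item)
    return flagged

# Participants are identified only by list position, keeping votes anonymous.
items = round_one(["fund option A", "fund option B", "fund option A"])
votes = [{"fund option A": 5, "fund option B": 1},
         {"fund option A": 4, "fund option B": 5},
         {"fund option A": 5, "fund option B": 2}]
print(round_two(items, votes))    # median rating per item
print(round_three(items, votes))  # items still in dispute, fed back for justification
```

In a Policy Delphi the flagged-disagreement step would be the point, not a stage on the way to consensus: the justifications gathered for the disputed items are themselves the product delivered to the decision maker.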
Your online paper
"Futures of Distance Learning: The Force and the Darkside," provides
some sobering thoughts on the changing nature of tenure for higher education
faculty. Will tenure as we know it today still be available in 10 years?
Is technology being used by administrators to erode tenure?
I am very concerned that state institutions are going to gradually be
forced to give up tenure and that the quality of education will suffer
considerably. For some reason the general public has lost sight of the
college education mission of state governments. Part of this is the money
that the states have wasted on political decisions to put community colleges
in every county and the huge resulting maintenance bill for the operation
of too many second-rate institutions. As a result some of the first-rate
state universities are going to suffer or are already suffering. There
will always be the elite Universities with tenure and a quality education
for those that can afford them. In the past decade the actual income distribution
in this country has gotten worse (more wealth to fewer people) so that
the net result is going to be less quality education for those that cannot
afford it. The opening of web based learning delivery will also result
in numerous diploma mills that will make people think they are getting
an education for low cost. Our most recent evaluation studies show that
attempts to deliver an automated course where the student acts as an individual
learner with predefined materials and assignments and is just graded on
his or her assignments are clearly inferior to face-to-face classes, but
that the use of collaborative learning approaches with groups of students
is just as good. All the administrators have these dreams of 1,000 students
with a team of graders doing the cheap individual approach. Many students
coming out of working class families and less educated family backgrounds
have no perception of the differences between accredited programs and
non-accredited programs. I hope for the emergence of organizations like
a consumer union for degree programs so that the consumer gets some impartial
idea of what he or she is getting. The whole problem could be improved
considerably if accreditation agencies ever realized they need to accredit
the individual teachers and not just a program.
Are there 2-3 books, maybe not directly related
to computers, that you found especially interesting personally or especially
helpful in your career?
C. W. Churchman's book The Design of Inquiring Systems
made the connection between philosophy and the design of Information-
Communication Systems for me and has had a great deal of influence on
how I view the application of computers. Ludwig von Bertalanffy's book
on General Systems Theory (1968) helped firm up
for me an interdisciplinary view of the world that has yet to be realized
on most college and university campuses. Torgerson's Theory and Methods
of Scaling gave me a real appreciation for the problems of dealing
with human judgments and showed meaningful theoretical foundations for
such a subject. All the above books are very relevant to anyone who is
interested in designing group communication systems that operate through
a computer network.
Could you describe a development project that you
have been involved in that was especially interesting (or especially horrible)
and what made it so?
I think we have all been in some horrible ones and it is particularly
horrible when the nature of the organization is to promise more than can
ever be delivered within the scope of the available effort. My most satisfying
development and design effort was the original EIES (Electronic Information
Exchange System) effort in 1974-1978 under NSF sponsorship. There were
no constraints, and a sort of complete freedom to innovate. Too many R&D
sponsors today seem to have gone to the extreme of wanting to feel that
the "R&D effort" will be assured of 100% success so that they can be insulated
from possible criticism. There is some sort of paradox about assuming
that the R&D has to be shown to be successful before it is attempted.
More and more efforts want consortiums including industry commitment which
is another way of stating: "Look we already know we are going to have
a product!" True scientific R&D is an art of doing multiple risk investment
with an expectation of like 1 out of 10 investments paying off a hundred
fold. As a result I do worry about the state of innovative R&D efforts
coming out of traditional sources of R&D funding.
What are your professional plans for the next few years?
It is time to get a group oriented Decision Support tool kit
available on the WEB and I am encouraging some students to turn their
efforts in that direction. Things like Java and Visual Basic 5 are going
to make it rather easy to prototype much more sophisticated facilities
than have been possible in the past. We are entering another shift in
the industry where the economics are beginning to return to making software
rather than always buying it. I still want to see the web based automation
of the Delphi processes, which incorporates the truly graphic-oriented communication
structures that were traditionally used via paper and pencil and many
of the more sophisticated data analysis tools that are possible for scaling
human judgments and to improve the ability of humans to truly understand