Neurocomputing: some possible implications for human-machine interfaces.



© Stephen Plowright
Computing and Communications Division, University of Western Sydney, Nepean

ABSTRACT

Recent research into neural computation has led to the development of a range of commercially available data analysis tools. Because the simulation of an array of many parallel elements is very processor-intensive for conventional computers, the full potential of neural processing awaits a hardware implementation. When such hardware becomes commercially viable, adaptive interfaces will revolutionise human-machine systems, forcing human biologists to redefine such concepts as human, machine, intelligence, and language. This paper discusses some of the possible implications.

INTRODUCTION

Research since the mid-1980s into computation performed by arrays of neuronlike elements has revealed some of the emergent properties of such 'neural nets'.

The power of neural net processing lies in its ability to learn to map given data sets to required output patterns, and to distinguish between and categorise input patterns without explicit instructions. As a result, untrained nets can be constructed which are then computationally structured by their information environment. Such models are said to be adaptive (Jubak, 1992).

Recent work on adaptive machine interfaces has focused on incorporating intelligent agents into the interface software. These agents act as autonomous helpers which remember and assist with routine command sequences. Some of these agents use neural net processing to adapt to user actions (Slagle and Wieckowski, 1994).

Another area of advance in interface design has been the development of virtual reality technology. Most of this technology has been aimed at allowing the immersion of the user in a simulated environment. There has been some recent research into abstract virtual environments as an aid for scientific discovery (ADASS, 1993).

Current research in artificial intelligence, cognitive science, neural networks, and neurophysiology points the way toward the development of virtually transparent human-machine interfaces. New methods of data presentation also suggest the possibility of new and augmented sense modalities.

RESEARCH AND APPLICATIONS

Neural Nets

A neural net is a simulation of an array of interconnected neuronlike components. Each neuron computes a weighted sum of its input values, which determines its output, and sends that output to many other neurons (see Figure 1) (Hecht-Nielsen, 1990; Hertz et al, 1990).
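The basic element can be sketched in a few lines. This is a minimal illustration of a weighted-sum unit with a threshold activation; the function name, the weights, and the threshold are ours, not from the paper.

```python
def neuron_output(inputs, weights, threshold=0.0):
    """Weighted-sum unit: fire (output 1) if the sum of the
    weighted inputs exceeds the threshold, otherwise stay silent."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# A two-input unit whose weights and threshold make it behave
# like a logical AND gate: both inputs must be active to fire.
print(neuron_output([1, 1], [0.6, 0.6], threshold=1.0))  # → 1
print(neuron_output([1, 0], [0.6, 0.6], threshold=1.0))  # → 0
```

A real net simply wires many such units together, feeding each unit's output to the inputs of many others.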

An adaptive network changes the weighting on each of its inputs according to its input history. In this way computational structures and heuristics emerge without explicit programming, and feature maps are constructed by the information environment (Hertz et al, 1990; Allinson & Johnson, 1989). Information is processed across a large number of elements simultaneously, a form of computation often termed parallel distributed processing.
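The weight-adjustment idea can be sketched with Hebb's rule, under which a connection is strengthened whenever its input and the unit's output are active together. The function name and learning rate below are illustrative choices, not taken from the paper.

```python
def hebbian_update(weights, inputs, output, rate=0.1):
    """Hebb's rule: strengthen each weight in proportion to the
    correlation between its input and the unit's output."""
    return [w + rate * x * output for w, x in zip(weights, inputs)]

# Repeated exposure to the pattern [1, 0] strengthens only the
# first connection: the weights come to reflect the input history.
w = [0.0, 0.0]
for _ in range(5):
    w = hebbian_update(w, [1, 0], output=1)
print([round(v, 2) for v in w])  # → [0.5, 0.0]
```

This is the sense in which the net is "structured by its information environment": no programmer set the final weights; the input history did.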



Figure 1. Neurons with weight vectors Wi = (Wi1, Wi2, ..., Win).


Like their biological counterparts, neural nets arrive at unique solutions to categorisation problems and, once trained, are particularly suited to pattern recognition tasks. They also achieve remarkable feats of data compression and extraction.

Although computer simulations of neural systems have supported useful research, and even working commercial data analysis tools, some problems remain. Experimental neural chips have been produced, but the present limitation is the interconnection of layers (Jubak, 1992). Full one-way interconnection between two layers of a mere 10 neurons each requires 100 wires; between two layers of 1,000 neurons each, one million. A modest three-layer chip on that scale would need millions of wires in the forward direction alone (assuming no connections feeding backwards, or from layer 1 directly to layer 3, as would be found in more realistic systems). Indeed, the human brain is composed mostly of connecting fibres, the white matter, while the neurons, or grey matter, occupy a fraction of the volume. Such three-dimensional circuitry contrasts with modern CPU chip designs, which arrange more than two million elements in a two-dimensional circuit connected only at the edges.
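The interconnect arithmetic can be checked directly. This throwaway calculation (the function name is ours) counts wires for full forward connection between successive layers:

```python
def forward_wires(layer_sizes):
    """Count the wires needed for full forward interconnection
    between each pair of successive layers (no skip or feedback
    connections, matching the simplifying assumption in the text)."""
    return sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))

print(forward_wires([10, 10]))       # two 10-neuron layers → 100
print(forward_wires([1000, 1000]))   # two 1,000-neuron layers → 1000000
```

The quadratic growth in each term (a × b per stage) is exactly why two-dimensional chip layouts, connected only at the edges, cannot accommodate realistic layer sizes.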

Biological Systems

Modern computing power has made it possible to analyse and simulate some of the complex processes observed in biology. Biomimesis has provided a rich source of solutions to practical design problems. Millions of years of trial and error has produced refinements in efficiency not understood until supercomputer analysis revealed their complexities. Much of the current neural net research is inspired by known features of biological systems.

The general organisation of biological sensory systems is fairly well understood. A sense organ consists of a structure in which a large number of transducer cells produce signals when exposed to an appropriate stimulus. The signals undergo pre-processing at the earliest stages, and the data path is typically reduced by orders of magnitude. For instance, the human visual system has 100 million detector cells feeding into one million optic nerve fibres (Churchland and Sejnowski, 1992). This data compression illustrates the ability of neural nets to compress and pre-process a large amount of data in a small number of steps. Most of the processing and recognition tasks of the brain must take place within approximately 1 second, which limits the number of steps in a neuron chain to around 100 (Rumelhart, 1989). Operating within this 100-step constraint is only possible with massively parallel processing.
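Rumelhart's timing argument is simple arithmetic. The figures below are the commonly assumed orders of magnitude, not measurements from the paper:

```python
# Neurons operate on a timescale of roughly 10 ms per step, while
# whole recognition tasks complete in about 1 second, so any serial
# chain through the brain can be only about 100 neurons deep.
neuron_step_ms = 10
task_budget_ms = 1000
max_serial_steps = task_budget_ms // neuron_step_ms
print(max_serial_steps)  # → 100

# The visual system's front-end compression, for comparison:
detector_cells = 100_000_000
optic_fibres = 1_000_000
print(detector_cells // optic_fibres)  # → 100-fold reduction
```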

Further along the processing path, the extracted features are themselves data for similar feature detection and categorisation. At the highest levels, sensory input from all sources is integrated. In this way, data from a large array of cells is processed within an organ to produce data representing features of the original pattern, which are then input for further neural net processing. At each stage a more global integration of sensory data occurs which eventually encompasses the entire information environment.

The implications of this type of processing are, first, that a very large amount of data can be reduced rapidly to a manageable form and, second, that pattern processing is possible across data from all sources; at the highest levels of processing, all of the sensory inputs together can be viewed as a single sensory array. This may help explain the phenomenon of synaesthesia, in which sensory modalities become mixed and sounds acquire colour, or tastes are seen as shapes.

The high level integration of sensory data is well illustrated by the importance of visual and auditory data matching in speech comprehension. As Movellan (1995) points out, there is a need for current psychological models to address the known importance of audio-visual synchrony in speech perception.

Skilled Performance

An insight into the adaptive power of biological neural systems can be gained by considering skilled performance. Even long after the period of maximum neural plasticity, humans can learn to perform complex and seemingly transparent translation tasks. A good example is the motorcyclist who must perform many complex interactions per second with the machine. With practice, the rider is no longer aware of the actual muscle movements required and can attend to the larger picture. The machine becomes an extension of the body, the interface becomes transparent.

By utilising a wide range of sensory input, humans are able to learn to perform a number of concurrent tasks with an effective combined bandwidth far exceeding that of current human-computer interfaces.

Metaphorical Mapping

The information processing in neural nets is more accurately described as mapping rather than coding or translation. The nets learn to map features of input patterns to particular output patterns. In this way patterns can be categorised according to the learned associations. New input patterns are associated with the results of the closest previous input patterns. Useful predictive value can arise in trained nets exposed to new situations.
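The idea that a new input is associated with the closest previously learned pattern can be illustrated with a nearest-neighbour lookup, which here stands in for the net's learned mapping. This is a deliberate simplification, and all names in it are ours:

```python
def closest_category(pattern, known):
    """Return the label of the stored pattern nearest to `pattern`,
    standing in for a trained net's learned association."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(known, key=lambda label: dist(pattern, known[label]))

# Two learned prototype patterns with hypothetical labels.
known = {"edge": [1, 0, 0], "blob": [0, 1, 1]}

# A new, slightly noisy input is mapped to its closest association,
# which is the source of the predictive value described above.
print(closest_category([1, 0.2, 0], known))  # → edge
```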

Neural net research has offered some insights into the way information is processed and stored in the human brain. The simple computer analogy of input, output, and storage is no longer appropriate as a model for the brain. Lakoff (1987) argues that language is processed by spatial metaphorical mapping: abstract concepts are understood in terms of spatial metaphors. This could have profound implications for the evolution of language, as discussed in a previous paper (Plowright, 1994).

The large data structures known in neural net research as schemas can be thought of as metaphorical maps. Data patterns can be reduced to features which are then shunted to the appropriate metaphorical maps. In this way, features of the input pattern activate metaphors previously associated with similar patterns. Thus new information can be understood in terms of existing metaphors. This has obvious survival value as it gives most new situations some predictability.

Memory storage and retrieval in neural systems is also quite different from that used in computers. While a computer stores binary code in a matrix of memory addresses and can duplicate the stored data exactly, neural systems store information in a more diffuse, less accurate, but more efficient manner. Information is stored in terms of the associations and relationships between features of the new information and the existing schemas. Rather than retrieving a whole memory sequence as a computer would, a neural system reconstructs an approximation of the required memory from old schemas and the salient features of the specific memory sequence required. A great deal of efficiency is gained from the fact that only the relevant or unusual features of an event need be remembered, the rest of the scene can be filled in from knowledge contained in existing schemas. The less fortunate side of this efficiency is that often the assumed or filled in parts of memory are believed to be accurate.

The Human-Machine Interface

The Graphical User Interface, first popularised by Apple Computer's Macintosh and now emulated on 80% of all personal computers, was a revolution in interface design. The GUI uses the desktop metaphor: elements of the interface behave in a manner analogous to objects in an office environment. This makes use of very basic existing knowledge, drastically reducing the amount of learning required to operate the machine. The interface creates a virtual environment in which the user can interact with virtual objects; it translates user actions into machine commands and displays information in a form that is quickly and easily understood.

While the GUI is customisable to best suit the needs of the user, current interface development centres around active agents within the interface which adapt intelligently to the user's needs (Slagle and Wieckowski, 1994).

Although interface software has come a long way, mainstream interface hardware has hardly changed in decades, with the exception of the mouse. Virtual reality devices have been produced for three-dimensional games and simulations, but these are still bulky and expensive. VR devices are also being used to investigate new graphical, tactile, and audio methods of data presentation (ADASS, 1993).

DISCUSSION

Advances in neural net research offer many opportunities for practical applications in human-machine interface design. Insights into human sensory processing and perception will allow an improved integration of sensory media in the presentation of information. Virtual reality devices will allow a greater range of metaphors by presenting a more familiar and complete information-scape. This will in effect increase the bandwidth of the interface substantially.

Another opportunity for neurocomputing will be in fast hardware implementations which allow the machine side of the interface to adapt to the human interaction. This will allow the interface to rapidly tune in to features of the user's input from a broad sensor array. Neural processing will enable the interface to learn to interpret data collected from the user's sounds, eye movements, even direct from nerve impulses. Interface attributes such as audio volume and tone, background colour and brightness, and response speed could be adjusted to best fit the state of the user as reflected by the patterns detected by an array of interface sensors. The interface would learn to make such adjustments so that they increase the efficiency of interaction as measured by user productivity, stress, or other criteria.
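A minimal sketch of such an adjustment loop, assuming a hypothetical comfort score and using simple hill-climbing in place of a neural net (all names and figures below are illustrative assumptions):

```python
def comfort(delay_ms):
    """Hypothetical user-comfort measure, as might be inferred from
    interface sensors; assumed best at a 120 ms response delay."""
    return -(delay_ms - 120) ** 2

# Hill-climbing: repeatedly nudge the interface parameter, keeping
# the current direction while the measured score improves and
# reversing it when the score gets worse.
delay, step = 200.0, 10.0
prev = comfort(delay)
for _ in range(40):
    new_delay = delay + step
    score = comfort(new_delay)
    if score < prev:
        step = -step          # got worse: reverse direction
    delay, prev = new_delay, score
print(delay)  # oscillates close to the 120 ms optimum
```

A real adaptive interface would replace the single parameter and hand-written score with a net mapping a broad sensor array to many interface attributes at once, but the feedback structure is the same.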

The implication of improvements in sensor and presentation technology, along with adaptive neural processing is that, in the near future, machines will be integrated far more seamlessly and transparently into human lives. Information environments will become as familiar and real as physical ones. Information-scapes will use familiar metaphors to allow the user to quickly learn to interpret the messages coming from the machine. Realtime feedback will enable the user to become unaware of the hardware and focus on the task, much like the motorcyclist who feels the machine as an extension of the body.

Adaptive neural net processing of signals to and from prosthetic devices should greatly improve feedback resolution and motor control. Bionic devices will be able to learn to map data appropriately in both directions providing a vast improvement in functionality. Such advances will pave the way for affordable augmentation devices for the non-disabled public. Sensory, motor, and information system enhancements are then likely to become as affordable and ubiquitous as the wristwatch.

The social and psychological implications will need to be considered with some urgency, given the rate at which new technology is adopted by the public. For example, the effects of the adoption of alternative personae among internet users have raised some concerns (Wozencroft, 1995). There is also the question of the way people identify with their accessories. When accessories and clothing start to incorporate adaptive enhancement and information technology, users will no longer be aware of a separation between themselves and their technology. Optical fibre can provide the bandwidth necessary for real-time multi-modal sensory communication at a distance which goes well beyond video conferencing.

Within the next two decades we face the probability that cyberspace will be as real to the user as physical space, and bionic enhancements will become fashionable, and later indispensable, items as transparent to use as contact lenses. Items currently separate can be combined or connected remotely. The functions of the mobile phone, organisers, desktop computers, and public information systems can be accessed with a single communication device. The technology is under development now. The implications for human biology will be far reaching.

Human biologists will play a vital role both in the development of hardware interfaces and software metaphors, and in the assessment of the physiological, psychological, and sociological effects of new technology. Even today, it is apparent that human-machine systems can no longer be understood in a purely reductionist paradigm. The interaction with the machine brings about unique behavioural adaptations in the human, and the new generation of machines will adapt uniquely to the human. Study of the interaction at the interface will be vital in the understanding of the effects on the human component. Just as important will be the study of human-machine systems as whole entities. These entities exist within communication networks which are rapidly evolving. Human biologists will be hard pressed to keep pace with the cyborg psychology and cyberspace sociology which has already started its evolution.

REFERENCES

ADASS Conference (1993). Towards an Astrophysical Cyberspace: The Evolution of User Interfaces. http://guinan.gsfc.nasa.gov/ACS.html

Allinson N and Johnson M (1989). Realisation of self organising neural maps in {0,1}^n space. In: Taylor J and Mannion C (eds). New Developments in Neural Computing. New York, Adam Hilger, pp 79-85.

Churchland P and Sejnowski T (1992). The Computational Brain. Massachusetts Institute of Technology, pp 142-149.

Hecht-Nielsen R (1990). Neurocomputing. New York, Addison-Wesley, pp 63-70.

Hertz J, Krogh A, and Palmer R (1990). Introduction to the Theory of Neural Computation. New York, Addison-Wesley, pp 217-237.

Jubak J (1992). In the Image of the Brain. Boston, Little, Brown & Co, pp 170-175.

Lakoff G (1987). Women, Fire, and Dangerous Things: What categories reveal about the mind. Chicago, University of Chicago Press, pp 283-303.

Movellan J (1995). In search of the Statistical Brain. http://cogsci.ucsd.edu:80/~movellan/crl.essay.html

Plowright S (1994). Neural net processing: A key factor in the evolution of language. ASHB News, Vol 6 No 2, pp 7-12.

Rumelhart D (1989). Brain-style computation: Mental processes emerge from interactions among neuron-like elements. In Brink J and Haden C (eds). The Computer and the Brain: Perspectives on human and artificial intelligence. North-Holland, Elsevier Science Publishers, pp 111-121.

Slagle J and Wieckowski Z (1994). Ideas for Intelligent User Interface Design. ftp://ftp.cs.edu/dept/users/wieckows/Ideas_for_Intelligent_Us

Wozencroft J (1995). Losing yourself on the Net. New Scientist, Vol 147 No 1994, p 42.



First published by the Australasian Society for Human Biology.

