The Next (R)evolution in Computing

Modern computing has come a long way from the days of Babbage's Difference and Analytical Engines, which are thought by many to be the true ancestors of the modern computer, even though neither machine was ever built as a full-scale working model. The first production computers were mechanical devices, in many ways similar to Babbage's Analytical Engine. These mechanical computers evolved into electro-mechanical devices, hybrids of electrical relays and mechanical parts, which in turn became electronic analog devices, then electronic digital devices, which finally gave way to the semiconductor-based digital computers we have today.

Even though the guts of the computer have changed over the years, its main function hasn't. Babbage created his Difference Engine to calculate logarithmic tables, and ENIAC was created to calculate artillery trajectories. Modern computers can do all sorts of things besides calculating tables of logarithms, but at their core they are still overgrown calculators.

The principles of computing, which began to be developed as early as the 1930s and became practical on a large scale in the 1950s, are still in use today. The new frontiers of computing technology, such as quantum-based CPUs and holographic memory, may increase the speed and capacity of computers, but they will not change the fundamental nature of computers; they will still be large (or small) calculators.

The problem with computers is that, for all of their gigabytes of memory and fancy HD displays, they are hardware-centric engineering devices, designed to solve engineering problems. Sure, attempts have been made to make computers more user-friendly by layering operating systems and programs on top of the hardware, but the user is still at the mercy of the computer engineer and the software designer to develop the tools needed to use the computer. And even though it is much easier to use a computer now than it was in the past, the available software is still hardware-centric, making the user conform to the rules of the machine rather than making the machine conform to the rules of the user.

Making computers faster or increasing their capacity won't change the fundamental paradigm of machine-centered computing. The next evolutionary step in computing, the next revolution in computers, will be the shift from hardware-centric computing to human-centric computing. Instead of using a computer to solve a problem, we will interact with a computer to solve the problem together. What is needed to make this paradigm shift possible? Computers will need to understand human languages.

This can be shown with an example familiar to all of us as programmers: writing a program. To write a program, you start with a programming language of your choice. Once you have written the source code, it is fed into a compiler or interpreter that converts it into machine code, which is then executed by the computer. Notice the process: we write a human-readable document that must be converted into a machine-readable one before the computer can understand what we want accomplished.

Source code is a strictly human construct. A computer has no idea what Dim a As Integer means from a language perspective. To make the computer understand what we want done, the code must be translated into machine language. The machine forces this step-wise process on us because we must conform to the rules of the machine. To make matters worse, we as programmers must also conform to the rules of the language designer, even though the rules of the language may not be what we really need in the first place.
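To make that step-wise process concrete, here is a small sketch in Python (chosen for illustration only; the original snippet above is Visual Basic). Python happens to expose the translation step to the programmer through its built-in compile function and the dis module, so even a trivial human-readable statement can be watched being converted into a machine-oriented form before the computer acts on it:

    import dis

    # A trivial, human-readable statement; the machine cannot act on this text directly.
    source = "a = 1 + 2"

    # Step 1: translate the human-readable source into a machine-oriented form (bytecode).
    code_object = compile(source, "<example>", "exec")

    # Step 2: only now can the machine execute it.
    namespace = {}
    exec(code_object, namespace)
    print(namespace["a"])  # prints 3

    # The translated form is what the machine actually understands;
    # it is the programmer, not the machine, who has done the conforming.
    dis.dis(code_object)

The particular language does not matter: whatever notation we start from, it is our document that gets reshaped to fit the machine, never the other way around.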

It makes more sense, from a human perspective, that to solve a problem we simply tell the computer the problem and, together, build a solution. If the computer understood our language, we could explain to it what needed to be done. If what we are asking is ambiguous or not understood, language provides the mechanisms necessary to clarify concepts and ideas and arrive at a common understanding. Human-centric computing means making the machine conform to human rules; that is, changing computing from calculation to language recognition.

Consider the following:

1. Understanding language implies interaction. If a computer understood human language, it would have the facility to be an interactive partner in problem solving, because language is the basis for human interaction.

2. Understanding language implies learning. The only way to develop language skills is to learn the language. For a computer to understand language, it has to learn the language, and it must have the capacity to learn the language of its user.

3. Understanding language implies intelligence. Language, being a human construct, is filled with subtleties. If a computer were able to understand language, it would also have the intelligence necessary to understand those subtleties.

4. Understanding language frees the computer and user from the constraints of the engineer and software designer. If a computer can understand that I need to build a database of names, it will also have the capability to build and manage that list of names (a sketch of what that request costs the user today follows this list).

5. Understanding language implies a universal machine. If language is the medium of computer interaction, then the concept of a specific operating system, or even of specific programs, is no longer necessary. Language-enabled computers would be universal, able to interact not only with humans but with other computers as well, in a humanistic fashion.

6. Understanding language implies evolutionary change. Languages change over time, and the evolutionary progress of man can be measured by the changes in human language. If a computer understood language, it too would evolve right along with humans.
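As promised in point 4, here is a rough sketch of what "build a database of names" demands of the user today. It is written in Python using the standard sqlite3 module; the schema and the names themselves are invented purely for illustration. The point is that the user's one-sentence request must first be restated in the machine's terms:

    import sqlite3

    # "I need a database of names" has to become a schema, a query language,
    # and an API -- none of which the user actually asked for.
    connection = sqlite3.connect(":memory:")  # an in-memory database, for illustration
    connection.execute("CREATE TABLE names (id INTEGER PRIMARY KEY, name TEXT)")

    # Adding and retrieving names means still more translation into the machine's language.
    for name in ("Ada", "Charles", "Grace"):
        connection.execute("INSERT INTO names (name) VALUES (?)", (name,))

    for (name,) in connection.execute("SELECT name FROM names ORDER BY name"):
        print(name)

    connection.close()

A language-enabled computer would accept the original sentence and carry out this translation itself, instead of leaving it to the user.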

While all of this may sound nearly impossible, the shift from calculation computing to language computing does not require a new set of computing principles. Rather, it is a reformulation of those principles, the change of perspective that needs to come about in order for the paradigm shift to occur. Calculation-oriented computing has undeniably changed the world. Language-oriented computing, the next (r)evolutionary step in computing, may have an even bigger impact.