Of microcomputers, the abacus and geniuses
About four decades ago, in the early 1970s, the first microprocessor, also called a “computer on a chip,” was born. Today, there are more microprocessors than there are people on earth.
In the early 1980s, CNN used an Onyx computer system with a network of 30 terminals, with one on each anchor desk to bring late-breaking stories to the newscaster. The first accounts of the Mount St. Helens volcanic eruption were read to the viewing audience right off the computer terminal’s screen — probably at the same moment (less the speed of microwaves) they were being typed in at the other end.
During those times, in the early 1980s, Xerox used to claim in its advertising that “today’s office has changed very little.” And so it did seem. Most executives in the most advanced and sophisticated business environments still wrote their letters in longhand on yellow legal-size pads; offices still used typewriters with keyboards designed to slow typing, since the first machines jammed when struck too fast. And tons upon tons of paperwork died the death of the unread in mile after mile of file cabinets.
Thus, one of the hottest uses for computers then became those two great words: “office automation.” Word processing, teleconferencing, computerized telephone systems, intelligent photocopiers, networks and executive computers — all these became part of the push for the paperless office. A well-known office automation consultant from Pennsylvania, USA, announced in 1982 that “The paperless office will arrive at about the same time as the paperless toilet.” And boy, oh, boy, that’s going to be tough!
Actually, computers began as simple counting machines. What is interesting and amazing are all the ways we have learned to put these ultra-fast counting machines to work for us. If we want to be accurate, we will all have to agree that the first computer was the abacus, whose origins date back to 500 BC in Egypt. The abacus as we know it today came into use in China about 200 AD. It was called the “suan-pan,” or computer tray. In Japan it is called a “soroban.”
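For readers curious how the suan-pan actually encodes a number, here is a minimal sketch in Python, entirely my own illustration: each rod carries two “heaven” beads worth five and five “earth” beads worth one, so any decimal digit can be stored as a pair of bead counts.

```python
def digit_to_beads(d):
    """Return (heaven_beads, earth_beads) pushed toward the beam for digit d."""
    assert 0 <= d <= 9
    return d // 5, d % 5  # one heaven bead counts as 5; earth beads count as 1

def beads_to_digit(heaven, earth):
    """Read a digit back off a single rod."""
    return 5 * heaven + earth

def number_to_rods(n):
    """Encode a non-negative integer as bead settings, one tuple per rod."""
    return [digit_to_beads(int(c)) for c in str(n)]

print(number_to_rods(1946))  # [(0, 1), (1, 4), (0, 4), (1, 1)]
```

The function names are mine, not standard terminology; the point is only that a row of rods is a positional decimal register, which is why skilled operators could race calculating machines with it.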
A Chinese-Filipino friend of mine gave me a great-looking abacus as a gift sometime in the ‘80s, when I first worked for government, wishing me good luck with it. Perhaps the good luck indeed worked. In 1987, when we first started to de-monopolize and liberalize the country’s telecommunications environment, a political process that had to be done without fear or favor, it may well have brought us the luck we needed. That industry has since become the most dynamic and profitable sector of the country. In granting authorizations for the new technologies, however, the authorities and the powers that be must be very careful to observe due process and to respect the collegial nature of the powerful agency that regulates the sector.
The collegiality of said agency was not just a dream of mine. This has already been mandated by the Supreme Court of our country. It is definitely not a one-man-signing affair.
Leonardo da Vinci (1452-1519), the consummate Renaissance man, dabbled in science but was never known to be interested in calculating machines. Yet in 1967, two volumes of his notebooks were discovered in the National Library of Spain in Madrid, detailing a digit-counting machine. A model interpreted from his drawings is at IBM.
Blaise Pascal (1623-62) was a child prodigy. Before he turned 13, he had proven the 32nd proposition of Euclid, and discovered an error in René Descartes’ geometry. At 16, Pascal began preparing to write a study of the entire field of mathematics, but his father required his time to hand-total long columns of numbers. Pascal began designing a calculating machine, which he finally perfected when he was 30.
The “Pascaline,” a beautiful, handcrafted brass box about 14x5x3 inches, the first accurate mechanical calculator, was born. The Pascaline was not a commercial success in Pascal’s lifetime; it could do the work of six accountants and people feared it would create unemployment, not unlike modern sentiments about computers, robots and now, of course, the Internet.
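The Pascaline’s trick was carry propagation: each geared brass wheel held one decimal digit, and a complete turn of a wheel nudged its neighbor one step. The little simulation below is my own hedged sketch of that principle, not a reconstruction of Pascal’s mechanism; the class and method names are invented for illustration.

```python
class Pascaline:
    """Toy model of a decimal wheel adder with ripple carry (illustrative only)."""

    def __init__(self, wheels=6):
        self.digits = [0] * wheels  # digits[0] is the ones wheel

    def add(self, n):
        """Add a non-negative integer by turning wheels and propagating carries."""
        i, carry = 0, 0
        while n or carry:
            if i >= len(self.digits):
                raise OverflowError("number exceeds wheel capacity")
            total = self.digits[i] + n % 10 + carry
            self.digits[i] = total % 10
            carry = total // 10  # a full revolution pushes the next wheel one step
            n //= 10
            i += 1

    def value(self):
        return int("".join(map(str, reversed(self.digits))))

machine = Pascaline()
machine.add(784)
machine.add(316)
print(machine.value())  # 1100
```

This is exactly the mechanism that let a box of gears do the work of six accountants: addition reduces to turning wheels, and the carry takes care of itself.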
Pascal was dismayed and disgusted by society’s reactions to his machine, and completely renounced his interest and work in science and mathematics, devoting the rest of his life to God.
He is best known for his collection of spiritual essays, Les Pensées, even though the basic design of the Pascaline lived on in mechanical calculators for over 300 years. As a counting machine, the Pascaline was not superseded until the invention of the electronic calculating machine. “The arithmetical machine produces effects which approach nearer to thought than all the actions of animals,” wrote Pascal in the Pensées, “but it does nothing which would enable us to attribute will to it, as to the animals.” Pascal, to my mind a genius by any measure, died of a massive brain hemorrhage at the young age of 39. In 1970, Niklaus Wirth named a programming language, Pascal, after him.
Indeed, Blaise Pascal was a genius. Someone said he was just talented — not a genius. A great many, including myself, strongly disagree. I have heard it said that genius is the highest type of reason while talent is merely the highest type of understanding; talent is what you possess, but genius is what possesses you.
Winston Churchill said it best when he said that “True genius resides in the capacity for evaluation of uncertain, hazardous and conflicting information.” And as John Keats said, “Works of genius are the first things in the world.”
You don’t have to be a genius to use an abacus adeptly and expertly. Learning to use an abacus is easy … it is still in daily use in many parts of the East. In 1946, a contest was staged between Army Private Thomas Wood, then the most skilled electric desk calculator operator in the United States, and Kiyoshi Matsuzaki of the then Japanese Ministry of Postal Administration.
They were given 50 problems in addition, subtraction, multiplication and division. Matsuzaki, wielding his soroban (abacus), won in everything except multiplication. Lee Kai Chen, a Chinese professor, beat computers with his abacus in Seattle, New York and Taipei in 1959. Some abacus experts become so adept at calculating that they eventually set the abacus aside and do it all in their heads. This, by the way, is talent, not genius.
There was another counting machine devised by a Spaniard named Magnus around 1000 AD. It was a machine made of brass resembling a human head with figures instead of teeth. The thing looked so diabolical — must have been inspired by Satan — that the Spanish priests destroyed it with clubs.
Let’s fast-forward to the machine of today where everything possible is in one thin, slick, handsome device. We are in the 21st century.
Today’s computer brain lives in this slick little device. The computer brain went through living in a city of transistors, resistors, capacitors and connectors, where the city limits stretched only a quarter inch and the width of its streets, or wires, was only 2.5 microns, dozens of times thinner than a strand of human hair.
And the actors today are a new breed: young, bright, energetic, hopefully infused with a vision, not running around clutching their umbilical cords, unsure of what to do and where to go. I see them in my kids, the kids of my friends and the kids of the world, that generation after us, infused with a vision: microcomputers stacked miles high, offices awash in binary code, kids welded to computers.
The old breed is rich. Some have been the movers and shakers; others are the moved and shaken. Some have gone from rags to riches; others, especially in today’s world, from riches to rags.
Some are gurus; some are frauds. Some were trained in the discipline, while others fell into it from other occupations. A few, it sometimes seems, have come from outer space.
Most are a tantalizing combination who find themselves in the intriguing world of the Internet today.
* * *
Thanks for your e-mails sent to jtl@pldtdsl.net.