“If I were really prescient, I would have told people in 1990 that new jobs would soon become available to create and operate websites and mobile applications, doing data analytics and online merchandising. But they wouldn’t have had any idea what I was talking about.” — Ray Kurzweil, The Singularity Is Nearer: When We Merge with AI, p. 200, Kindle Edition
Artificial General Intelligence (AGI) promises many things. In articles and books AGI is described as a state of machine intelligence in which the machine equals or exceeds human cognitive competence in a wide range of fields, as opposed to only one or two. Notwithstanding governments’ apocalyptic efforts to nuke mankind out of existence, true AGI could arrive by the time you’re reading this.
According to tech journalist Julian Horsey, Dr. Alan D. Thompson, a prominent AI specialist, has proposed a conservative countdown suggesting that AGI could be achieved by November 2024. This prediction is based on a percentage scale that tracks progress towards key milestones in AI development. These milestones include eliminating hallucinations in language models, achieving physical embodiment in robots, and passing advanced tests such as making a cup of coffee in an unfamiliar environment.
The drawback to current narrow intelligent systems, it is claimed, is their inability “to adapt to new goals or circumstances, and generalize knowledge from one context to another, which humans do through transfer learning.” While this may be true in many cases, it is emphatically not true with regard to some of DeepMind’s AI products. In March 2016, AlphaGo defeated world Go champion Lee Sedol, which some claimed was the Holy Grail of AI because the game of Go is “a googol times more complex than chess — with an astonishing 10 to the power of 170 possible board configurations. . . . more than the number of atoms in the known universe.”
Its victory was “conclusive proof that the underlying neural networks could be applied to complex domains, while the use of reinforcement learning showed how machines can learn to solve incredibly hard problems for themselves, simply through trial-and-error.” [Emphasis added]
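The “googol times more complex” claim can be checked with back-of-the-envelope arithmetic. The figures below are commonly cited approximations, not numbers from the article itself: roughly 2 × 10^170 legal Go positions, on the order of 10^47 chess positions (estimates vary by several orders of magnitude), and about 10^80 atoms in the observable universe.

```python
# Rough state-space comparison behind the "googol times more complex" claim.
# All figures are widely cited approximations, not exact counts.
GOOGOL = 10 ** 100

go_positions = 2 * 10 ** 170      # ~legal Go board configurations
chess_positions = 10 ** 47        # ~chess positions (estimates vary)
atoms_in_universe = 10 ** 80      # ~atoms in the observable universe

# Go's state space exceeds chess's by more than a googol...
ratio = go_positions // chess_positions
print(ratio >= GOOGOL)                   # True
# ...and dwarfs the number of atoms in the known universe.
print(go_positions > atoms_in_universe)  # True
```

Python’s arbitrary-precision integers make the comparison exact; even generous error bars on the chess estimate leave the ratio far above a googol.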
Artificial intelligence can be viewed as a tool to help people work better, faster and with increased precision and reliability. Capitalism has thrived on the backs of such tools since the dawn of the industrial revolution. Once AGI is reached, the pace of almost everything will increase sharply, then unbelievably, as more processes become information technologies and therefore improve exponentially in price-performance. Unless it is disconnected from its energy source, AGI will never stop learning, nor will it forget anything it has learned, and it will handle intellectual tasks millions of times faster than the best human minds. Since civilization is heavily dependent on affordable energy, AGI will at the least accelerate improvements in energy production and conservation, and will likely discover new and economical methods of extracting energy from the world around us and from regions of outer space.
If AGI can achieve energy independence — meaning it no longer depends on human intervention — it will have reached a level of superintelligence. A manmade intelligence far superior to ours that can’t be turned off would be a formidable planetary companion. If it also succeeds in defending itself from human attacks, it will sit atop the ecosystem. But it won’t stop there, either. Driven by its own survival, it won’t stop until it runs up against the laws of physics. But even then it could discover laws physicists have missed or misunderstood.
How technology has affected the work force
In early 19th-century Nottingham, England, a group of displaced textile workers known as the Luddites began attacking the machines that were replacing them.
The weavers had seen their entire livelihood upended. From their perspective, it was irrelevant that higher-paying jobs had been created to design, manufacture, and market the new machines. There were no government programs to retrain them, and they had spent their lives developing a skill that had become obsolete. Many were forced into lower-paying jobs, at least for a time.
But a positive result of this early wave of automation was that the common person could now afford a well-made wardrobe rather than a single shirt. — Kurzweil, p. 199
Technology’s advance has profoundly affected the US work force, Kurzweil explains. “In 1900 the total US workforce was around 29 million, comprising 38 percent of the population. In early 2023 it was around 166 million, comprising over 49 percent of the population.
“Not only is the total number of jobs growing, but the workers who fill those jobs are working fewer hours and making more money.”
It should be noted that some kinds of work are not captured in official statistics, making the increase even larger than reported. Examples include freelance programmers, freelance physical therapists, and travel nurses.
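A quick sanity check on the quoted figures: dividing each workforce count by its participation share yields the total US population those numbers imply, which lines up with the census record.

```python
# Implied total US population from the quoted workforce figures
# (workforce / share of population in the workforce).
pop_1900 = 29e6 / 0.38    # ~76 million, matching the 1900 census
pop_2023 = 166e6 / 0.49   # ~339 million, close to 2023 estimates
                          # ("over 49 percent" makes this an upper bound)
```

The consistency of both implied populations with official counts suggests the quoted percentages and workforce totals come from the same underlying data.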
Underlying much of this progress, technological change is introducing information-based dimensions to old jobs and creating millions of new jobs that did not exist a quarter century ago, let alone a hundred years ago, and that require novel and higher-level skills. This has so far offset the massive destruction of agricultural and manufacturing jobs that once occupied the vast majority of the labor force. — Kurzweil, p. 201
We have entered the steep part of the exponential
In an onstage dialog between Christine Lagarde and Kurzweil at the annual IMF meeting in 2016, she asked him why we haven’t seen more remarkable economic growth from all this wonderful digital technology. His answer: “We factor out this growth by putting it in both the numerator and denominator.”
As an example he cites a teenager in Africa spending $50 on a smartphone. Officially, it counts as $50 in economic activity, “despite the fact that this purchase is equivalent to over a billion dollars of computation and communication technology circa 1965, and millions of dollars circa 1985.” Of course,
the full capabilities of a $50 smartphone would not have been achievable for any price in either 1965 or 1985. Thus, traditional metrics almost completely ignore the steep deflation rate for information technology. — Kurzweil, p. 167
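Kurzweil’s smartphone example implies a startling deflation rate. A minimal sketch, taking his rough figures at face value ($1 billion of equivalent capability circa 1965, $50 in 2023), computes the compound annual price decline they entail:

```python
# Implied annual price-performance deflation from Kurzweil's figures.
# The dollar amounts and years are the article's rough estimates.
price_1965 = 1e9     # ~cost of equivalent capability circa 1965
price_2023 = 50.0    # smartphone price circa 2023
years = 2023 - 1965

annual_factor = (price_2023 / price_1965) ** (1 / years)
annual_decline_pct = (1 - annual_factor) * 100
# Cost falls to roughly three quarters of the prior year's level --
# on the order of a 25% price decline, sustained every year for decades.
```

A deflation rate of that magnitude, compounding for nearly six decades, is exactly the kind of change that conventional GDP accounting, which nets it out of both numerator and denominator, fails to register.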
Nor is technological improvement dependent on Moore’s Law, which will eventually be replaced by something better, in keeping with technology’s pattern of successive S-curves. The law, articulated by Intel co-founder Gordon Moore in a landmark 1965 paper, is the latest of five computing paradigms, each with its own S-curve, in a lineage that began with the 1890 census and Herman Hollerith’s invention of an electric tabulating machine that processed data on punch cards.
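The S-curve argument can be illustrated numerically. In the toy model below (illustrative numbers only, not real performance data), each paradigm is a logistic curve that saturates, a successor roughly 100× more capable arrives at regular intervals, and the envelope of the whole family keeps climbing exponentially even as each individual curve flattens:

```python
import math

def logistic(t, t0, cap, rate=1.0):
    """One paradigm's S-curve: capability saturating at `cap`."""
    return cap / (1.0 + math.exp(-rate * (t - t0)))

def capability(t, paradigms):
    """Overall capability = the best paradigm available at time t."""
    return max(logistic(t, t0, cap) for t0, cap in paradigms)

# Five hypothetical paradigms, each ~100x the capability of the last,
# arriving every 10 time units (illustrative numbers, not real data).
paradigms = [(10 * i, 100.0 ** (i + 1)) for i in range(5)]

# Sampled on a log scale, the envelope climbs linearly -- i.e. overall
# growth stays exponential even though every single S-curve flattens.
samples = [math.log10(capability(t, paradigms)) for t in range(0, 50, 10)]
```

Each sample lands about two orders of magnitude above the last, which is the sense in which exponential progress can outlive any one paradigm’s S-curve.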
Lagarde responded to Kurzweil that yes, digital technology does have many remarkable qualities and implications, but you can’t eat information technology, you can’t wear it, and you can’t live in it.
Kurzweil’s reply: Wait and see. Food and clothing will not simply be assisted by information technology, but will become information technology. For this we will be aided by Eric Drexler’s 2013 book Radical Abundance, which explains how atomically precise manufacturing “could build most kinds of objects for the equivalent of about twenty cents per kilogram.”
As of 2023 the technology can replicate meats without much structure, like the texture of ground beef, but it isn’t yet ready to generate full filet mignon steaks from scratch. When cultured meat can convincingly imitate all its animal-based counterparts, however, I expect that most people’s discomfort with it will quickly diminish. — Kurzweil, p. 170
The relief will extend to the land animals mankind slaughters, which in 2020 amounted to around 371 million tons globally.
There are many contenders today to replace Moore’s Law, one of which is AI-driven High Performance Computing that merges supercomputers and clouds into one. Earlier this year Georgia Tech researchers announced “the world’s first functional semiconductor made from graphene. . . [that they claim] is compatible with conventional microelectronics processing methods and is thus a realistic silicon alternative.” If the economy remains at least somewhat free, entrepreneurs will ultimately determine the winner of Moore’s replacement.
Conclusion
The Fermi Paradox asks why we haven’t been contacted by civilizations far ahead of us technologically. One possibility is that they all self-destructed at some point in their advancement, as we may be on the verge of doing now. Another possibility is that they’re so advanced we fail to recognize their presence. Or they came and left in an instantaneous quantum jump. There seems to be no end to the hypotheses.
For now the future is in our hands. If we want to be around when Elon Musk populates Mars we had better make sure we know what to do so we can at least go along for the ride vicariously.