We need to innovate just to run in place. A constant stream of innovations in energy, materials, automation, and other areas is required just to maintain our current standard of living and quality of life. My guess is that the fraction of all innovation going toward maintaining our position is actually rising. Whether that is true is a very important question.
This need to innovate just to maintain our current living standard seems to get little attention. So I'd like to explain some of the reasons why the need exists and why I suspect a rising fraction of our total innovation must be going toward just maintaining previous gains.
Let us start with population increases. As population rises so do the damaging side effects of human activities (what economists call external costs). For example, 100 million people can emit 3 times as much pollution per person as 300 million people while producing the same total pollution. Therefore when population increases the amount of pollution allowed per person has to decline in order to maintain the same level of air and water quality. Worse, the cost of emissions reduction is not linear: cutting tailpipe pollution 50% is much less than half as expensive as cutting it 100%. We need innovations that lower the costs of emissions reduction or else population increases will translate into higher pollution, lower living standards, or both.
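The arithmetic behind that example can be sketched in a few lines (the total emissions budget here is an arbitrary illustrative number, not a real figure):

```python
# Illustrative sketch: under a fixed total pollution budget, the emissions
# allowed per person fall in inverse proportion to population size.
def allowed_per_capita(total_budget, population):
    """Per-person emissions permitted under a fixed total budget."""
    return total_budget / population

budget = 300.0  # arbitrary units of total annual emissions

# A population of 100 million gets 3x the per-person allowance of 300 million
small_pop = allowed_per_capita(budget, 100e6)
large_pop = allowed_per_capita(budget, 300e6)
print(f"100M people: {small_pop:g} units each; 300M people: {large_pop:g} units each")
```

Triple the population, and each person's allowance must be cut to a third just to hold air and water quality constant; the nonlinear cost of each further cut is what makes this an innovation treadmill.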
Population increases also mean more demand for water, oil, copper, zinc, manganese, and other minerals. Even as demand increases the marginal costs of minerals go up due to rising demand from larger industrialized populations, declining ore quality, and higher energy costs. Therefore we need innovations in ore extraction and energy production to compensate for both higher demand and lower quality supply. Plus, we need innovations that enable us to use substitutes. How many of our innovations are going toward developing needed substitutes?
Look at the cost of fish. Overfishing has raised the cost of finding fish. Whereas fish used to be easy to find without going very far, they are now harder to find and require ships to travel greater distances, at greater expense and fuel use, in order to catch them. This trend has gone so far, and fish prices have risen so high, that aquaculture has been developed as an alternative. Researchers work to lower the pollution effects of aquaculture and to reduce infections and other problems on aquaculture farms. Fish production now requires scientists who study how to manage fisheries in the wild and how to produce feed and suitable conditions for aquaculture farms.
Look at energy production. Innovation in oil extraction has not progressed fast enough to prevent a large rise in extraction costs as the easier-to-get oil reserves have been depleted. Offshore drilling now requires drilling rigs that cost hundreds of millions of dollars along with support ships and helicopters to ferry out workers. Large numbers of scientists and engineers toil away, at considerable expense, trying to make photovoltaics and other alternative energy sources cheaper. Still other scientists and engineers work toward lowering the cost of electric vehicle batteries in order to enable a migration away from increasingly expensive oil as a fuel for vehicles.
Here is a partial list of depleting resources, external costs, and other changes that require innovation to compensate:
What I'd like to know: How to measure what fraction of innovation goes toward breaking even, basically running in place? If we could measure that we could also measure whether the amount of innovation devoted toward civilization maintenance is rising, falling, or staying the same.
Update: It is difficult to predict future rates of innovation. One reason why: Key discoveries can enable many derivative innovations. So, for example, it would be an understatement to say that the transistor enabled quite a few other innovations. Ditto the laser which has revolutionized communications.
Whether we can generate innovations faster than we create conditions (e.g. depleted mines or depleted aquifers) that require innovations is hard to know. In theory we have huge potential for advances in a number of fields including computing, nanomaterials, and fusion energy. But it is hard to forecast, for example, when fusion energy will become commercially practical.
I think the overall rate of technological progress seems faster than it really is because the computer and communications revolution has done so much to increase the flow of stimuli to people. They experience videos and web sites and buzzing sounds indicating that new text messages have arrived and it all seems very fast paced. But we need advances in areas that are more basic such as in materials and energy production in order to stay out of the Malthusian Trap. So far those advances haven't come easily. We still don't have nanobot manufacturing devices or fusion energy for example.
Perhaps manufacturing nanobots will make nuclear power, photovoltaics, and long range lithium car batteries a reality 20 or 30 years hence. We might really be approaching some huge enabling advances that speed up the rate of innovation. But right now the rate of innovation doesn't seem to be keeping up with the rising demands for resources.
That was Kurzweil's real secret, and back in 1965 nobody guessed it. Maybe not even him, not yet. But now, 46 years later, Kurzweil believes that we're approaching a moment when computers will become intelligent, and not just intelligent but more intelligent than humans. When that happens, humanity — our bodies, our minds, our civilization — will be completely and irreversibly transformed. He believes that this moment is not only inevitable but imminent. According to his calculations, the end of human civilization as we know it is about 35 years away.
The underlying assumption here is that artificial intelligence will speed up the rate of technological innovation by orders of magnitude. But is that really true? Look at the trend today: Computer power is still following the pattern of doublings, albeit with more difficulty. The shift toward use of more CPU cores to achieve this does not work for all problems.
A contrarian view is that in spite of the repeated doublings of computer power the rate of technological innovation slowed starting in the early 1970s, the low-hanging technological fruit has all been picked, and we are being held back by depleting natural resources. For signs of the latter look at the big shift in the long run trend in natural resource prices: the Julian Simon-Paul Ehrlich bet on natural resource prices stopped coming out in Simon's favor starting in 1994.
The rosier Singularity view seems based on a few assumptions:
While I expect the computer industry will create artificial intelligences I'm less sure we will survive once they become far smarter and more powerful than us. Yes, I expect they'll be more productive.
What I'm least sure about: How much computing power do we need to turn currently extremely hard problems into easily solved problems? Consider that Moore's Law of computer power doubling every couple of years has been running for decades. Yet, for example, it hasn't enabled plasma physicists to come up with workable fusion reactor designs. Nor has it enabled the solar photovoltaics industry to suddenly drop the cost of photovoltaics by orders of magnitude. Nor has it prevented the big run-up in commodities prices as Asia industrialized and ore concentrations declined.
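To make concrete just how large the cumulative effect of those doublings is, here is a trivial sketch (the two-year doubling period is the commonly quoted approximation, not a precise figure):

```python
# Cumulative growth factor from repeated doublings of computing power.
def compute_growth(years, doubling_period=2.0):
    """Multiplicative increase after `years` at one doubling per period."""
    return 2 ** (years / doubling_period)

# Four decades at one doubling every two years is 20 doublings,
# roughly a million-fold increase in computing power.
print(compute_growth(40))
```

A roughly million-fold increase in computing power over four decades, and yet fusion reactor design and cheap photovoltaics remain unsolved: that is the puzzle for anyone assuming raw compute translates directly into innovation.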
Some substantial (and I suspect growing) portion of all innovation goes to basically trying to keep running in place. How to grow enough food as world population grows? How to manage water more carefully as aquifers get drained? How to use oil more efficiently as the remaining oil is in harder to reach places?
Maybe things get worse for years until the Singularity suddenly ushers in a Golden Age. But even if true we have to live thru the intervening years before we reach the Singularity.
Writing at The Atlantic, Megan McArdle takes a look (with a home kitchen video for demonstration) at just how much time modern appliances save us in the kitchen.
When my grandmother was growing up in the 1920s, the average woman spent about 30 hours a week preparing food and cleaning up. By the 1950s, when she was raising her family, that number had fallen to about 20 hours a week. Now, according to the U.S. Department of Agriculture, women average just 5.5 hours—and those who are employed, like me, spend less than 4.4 hours a week. And that’s not because men are picking up the slack; they log a paltry 15 minutes a day doing kitchen work. One market-research firm, the NPD Group, says that even in the 1980s, 72 percent of meals eaten at home involved an entrée cooked from scratch; now just 59 percent of them do, and the average number of food items used per meal has decreased from 4.4 to 3.5. That’s when we’re home at all: by 1995, we consumed more than a quarter of all meals and snacks outside the home, up from 16 percent two decades earlier.
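A quick calculation with the figures quoted above shows how steep the decline has been (I use 4.4 hours as an upper bound, since McArdle says employed women spend "less than" that):

```python
# Weekly hours of food preparation and cleanup, from the figures quoted above.
hours = {
    "1920s": 30.0,
    "1950s": 20.0,
    "now, all women": 5.5,
    "now, employed women": 4.4,  # "less than 4.4" in the quote; upper bound
}
baseline = hours["1920s"]
for era, h in hours.items():
    drop = 100 * (1 - h / baseline)
    print(f"{era}: {h} h/week ({drop:.0f}% below the 1920s)")
```

Even measured against the 1950s, kitchen labor for the average woman has fallen by nearly three quarters.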
Her accompanying video shows some of the kitchen innovations that we might think have always been around. Some are simple and yet had their origins only in the 20th century. 19th century cooking was no fun. The 18th century? Time travel would be no fun.
Go back to the year 1900 or earlier and the labor needed for food preparation was even greater. Food storage was a much bigger problem, with home refrigerators first making it to market in the 1910s and a much larger roll-out in the 1920s. Before that, ice boxes were used in areas where ice could be stored from winter into summer or traded along coasts.
The further back we look, the more of the food processing steps were done at home and the more manual the labor for doing those steps. In England, when did the majority of harvested grain begin to be processed by specialized laborers called millers? When did butchers become the venues thru which most meat flowed?
Technological advances made the women's liberation movement possible. Men were going to do lots of manual labor outside of the house and women were going to do lots of manual labor inside the house until machines freed women from the kitchen and washroom. The movement of women into commercial workplaces was enabled not just by freeing them from kitchen labor but also by machines in factories and other job sites that reduced the need for muscles to do most commercial work.
So what about the future? What technological advances are going to cause changes in human labor on the same scale as industrial food processing, refrigerators, home cooking appliances, frozen dinners, and pizza delivery? So far modern communications technologies (cell phones, the internet) have not caused changes as fundamental as those which occurred in the 20th century using primarily mechanical technologies. For example, the living standards gains from personal computers have been relatively small. In my view this supports the argument put forth by Tyler Cowen that the rate of fundamental innovation has slowed. For his argument see his Kindle book The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better. As for eventually feeling better: Only once full body rejuvenation becomes possible.
Here are some excerpts from IBM's predictions for the next 5 years. What do you think of these predictions?
You'll beam up your friends in 3-D
In the next five years, 3-D interfaces – like those in the movies – will let you interact with 3-D holograms of your friends in real time. Movies and TVs are already moving to 3-D, and as 3-D and holographic cameras get more sophisticated and miniaturized to fit into cell phones, you will be able to interact with photos, browse the Web and chat with your friends in entirely new ways.
Scientists are working to improve video chat to become holography chat - or "3-D telepresence." The technique uses light beams scattered from objects and reconstructs from them a picture of that object, a technique similar to the one human eyes use to visualize our surroundings.
3-D telepresence will do more for business than for personal communication. The trend for socializing is toward more chatting by typing than by talking. The ratio of typed to spoken cell phone conversations keeps going up. Think about it: Do you spend more time in chat rooms, instant messaging, email, and Facebook? Or do you spend more time on the phone?
Better batteries using air.
Batteries will breathe air to power our devices
Ever wish you could make your laptop battery last all day without needing a charge? Or what about a cell phone that powers up by being carried in your pocket?
Battery improvements will certainly keep coming. But will they have their biggest impact on hand-held devices? Or on cars? My guess: cars. Oil is too expensive and the remaining oil is deep offshore or in other places hard to reach. In the United States 94% of transportation energy comes from oil. Even that number understates the dependency since corn ethanol (shown as renewable energy in that graph) requires so much oil to produce it.
Some feel-good pap about how we can all save the planet with personal technology. Is this practical?
You won’t need to be a scientist to save the planet
While you may not be a physicist, you are a walking sensor. In five years, sensors in your phone, your car, your wallet and even your tweets will collect data that will give scientists a real-time picture of your environment. You'll be able to contribute this data to fight global warming, save endangered species or track invasive plants or animals that threaten ecosystems around the world. In the next five years, a whole class of "citizen scientists" will emerge, using simple sensors that already exist to create massive data sets for research.
One idea: Imagine putting camera collars on big cats. Suppose the cameras could be powered from the movement of the cats (I'm reaching). Far more people would get off on virtually riding along with the cats on hunts than would get off on shooting the cats. The cameras might help deter poachers (and then again, maybe not). But habitat destruction wouldn't be stopped by cameras aimed at stopping poachers. Habitat destruction due to growing human populations and industrialization is the root problem. I see continued deterioration.
Computers will tell you how to commute to avoid traffic. Ho hum.
Your commute will be personalized
Imagine your commute with no jam-packed highways, no crowded subways, no construction delays and no worrying about being late for work. In the next five years, advanced analytics technologies will provide personalized recommendations that get commuters where they need to go in the fastest time. Adaptive traffic systems will intuitively learn traveler patterns and behavior to provide more dynamic travel safety and route information to travelers than is available today.
I doubt this will help much. What will help: 3-D holograms that enable you to work from home and yet still do really high quality meetings. What will help longer term: Cars that drive themselves. That'll enable closer packing of cars on the road, higher speed travel, lower accident rates, and time to read and type at a computer while the car computer does the driving.
IBM expects more of the waste heat from computer data centers will get used for useful purposes like heating buildings in winter.
Computers will help energize your city
Innovations in computers and data centers are enabling the excess heat and energy that they give off to do things like heat buildings in the winter and power air conditioning in the summer. Can you imagine if the energy poured into the world's data centers could in turn be recycled for a city's use?
I question the potential for this idea to do much. Data centers are often located where electric power is cheaper. Really expensive cities (e.g. Manhattan) have rental costs that tend to push data centers out to the suburbs or beyond. So knowledge workers in expensive skyscrapers interact with cloud computers in other states and countries.
What should have made it to IBM's list for the next 5 years? I can think of a few things off the top of my head:
So what else do you see in the next 5 years? How will faster computing power and cheaper internet bandwidth change our lives? Will biotechnological advances have much of an impact in the next 5 years? Biotech's big impacts seem longer term. Certainly we'll see amazing biotech advances in the 2020s. But will we see any major disease cures in the next 10 years?
Update: Also check out IBM's 2007 predictions for the following 5 years. Note they had cell phones acting like credit/debit cards. That's already the case in Japan and now the Nexus S phone from Google has circuitry to help do that. So this is happening, albeit more slowly. Also, they predicted more active control of cars by car computers. That's happening very gradually with adaptive cruise control and other electronic assists making some driving decisions.
Update II: An article about 3-D and augmented reality reminds me of some areas of continued big advances in the next 5 years: