July 28, 2012
Moore’s Law– Intel co-founder Gordon Moore’s assertion that the number of transistors on integrated circuits doubles approximately every two years*– is one of the best-known axioms of our time, a rule of thumb that helps explain the explosion of technological capability over the last several decades even as it reassures us of advances to come. It’s attained this status the old-fashioned way: by being largely right– which is to say, reasonably accurate in its predictions.
But as IEEE Spectrum reports, recent research at the Santa Fe Institute suggests that the broader concept on which Moore’s Law was founded, the Experience Curve, is actually a better predictor of technological progress than Moore’s refinement.
Bruce Henderson and BCG tend to get credit for the idea of the Experience Curve (or the Learning Curve)– the notion that the costs of technological items drop with their cumulative production. BCG certainly did make hay with the concept back in the late 60s. But the concept dates back to the 19th Century and the work of German psychologist Hermann Ebbinghaus. Then in 1936, Theodore P. Wright observed the phenomenon in aircraft manufacture (“Factors Affecting the Cost of Airplanes” and “Learning Curve,” both in the Journal of the Aeronautical Sciences), and the effect he described has come to be known as “Wright’s Law.”
Moore’s Law seems to be a special case of Wright’s Law; and in fact, Wright’s Law seems to describe technological evolution a bit better than Moore’s—not just in electronics, but in dozens of industries.
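A rough way to see the relationship (the notation below is mine, not the paper’s): write the two laws for unit cost C, with x cumulative production and t calendar time,

    C_{\text{Wright}}(x) = C_1\, x^{-w}, \qquad C_{\text{Moore}}(t) = C_0\, e^{-mt}.

If cumulative production itself grows exponentially– x(t) = x_0 e^{gt}, a fair approximation for chips– then substituting gives

    C_{\text{Wright}}(x(t)) = C_1\, x_0^{-w}\, e^{-wgt},

which is exactly Moore’s exponential form, with decay rate m = wg. When production growth is steady, the two laws coincide; when it wobbles, they come apart– and there, apparently, Wright’s formulation tracks costs the more faithfully.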
A new Santa Fe Institute (SFI) working paper (Statistical Basis for Predicting Technological Progress, by Bela Nagy, J. Doyne Farmer, Quan M. Bui, and Jessika E. Trancik) compares the performance of six technology-forecasting models against constant-dollar historical cost data for 62 different technologies—what the authors call the largest database of such information ever compiled. The dataset includes stats on hardware like transistors and DRAMs, of course, but extends to products in energy, chemicals, and a catch-all “other” category (beer, electric ranges) during the periods when they were undergoing technological evolution. The datasets cover spans of 10 to 39 years; the earliest dates to 1930, the most recent to 2009.
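To make the comparison concrete, here’s a minimal sketch of the kind of fit both models boil down to– log-linear regression against time (Moore) or against log cumulative production (Wright). The numbers and variable names are illustrative inventions of mine, not the paper’s code or data:

    import numpy as np

    # Illustrative, made-up data (NOT from the SFI dataset): ten years of
    # cumulative units produced (millions) and unit cost (constant dollars).
    years = np.arange(2000, 2010)
    cumulative = np.array([1.0, 2.1, 4.0, 7.8, 15.0, 29.0, 55.0, 110.0, 210.0, 400.0])
    cost = np.array([100.0, 82.0, 65.0, 54.0, 43.0, 35.0, 29.0, 23.0, 19.0, 15.5])

    # Moore's Law: log(cost) is linear in calendar time.
    m_slope, m_icept = np.polyfit(years, np.log(cost), 1)

    # Wright's Law: log(cost) is linear in log(cumulative production).
    w_slope, w_icept = np.polyfit(np.log(cumulative), np.log(cost), 1)

    # Compare in-sample fit via RMS residuals in log space. (The paper's own
    # test is stricter: "hindcasting," i.e., fitting on early years and
    # scoring predictions on held-out later ones.)
    m_rms = np.sqrt(np.mean((np.log(cost) - (m_slope * years + m_icept)) ** 2))
    w_rms = np.sqrt(np.mean((np.log(cost) - (w_slope * np.log(cumulative) + w_icept)) ** 2))

    print(f"Moore : cost ~ exp({m_slope:+.3f} * year), RMS log-error {m_rms:.4f}")
    print(f"Wright: cost ~ cumulative^{w_slope:+.3f}, RMS log-error {w_rms:.4f}")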
It turns out that high technology has more in common with low-tech than we thought. The same rules seem to describe price evolution in all 62 areas.
Read the whole story at “Wright’s Law Edges Out Moore’s Law in Predicting Technology Development.”
* “Two years” became, in common understanding, “18 months” when Moore’s colleague David House revised the estimate to account for faster chips contributing to the acceleration of further development.
December 5, 2010
The GFAJ-1 strain of Halomonadaceae bacteria is able to use arsenic– an element considered poisonous to all previously known life-forms– in its internal structure (source)
On the heels of the last post, “Dykes, Leaks, Fingers…,” comes news of more change that’s on the way whether we’re ready for it or not– NASA’s announcement that scientists have discovered organisms that can swap phosphorus, a basic building block of life as we’ve known it on earth, for arsenic… and flourish.
It could be that arsenic-based life (or life based on some other surprising element) will turn up somewhere else in the universe; as Randall Munroe quips in the xkcd panels above, that’s what NASA-watchers were waiting to hear. But in any case, the announcement is a clear signal that insofar as the arena of biology is concerned, the ropes are down. The “givens” aren’t necessarily given; the limits… well, there may not be limits, at least none anywhere near where we thought they were. This amounts, as lead researcher Felisa Wolfe-Simon observed, to “cracking open the door and finding that what we think are fixed constants of life are not.” Indeed, as Caleb Scharf, a Columbia University astrobiologist, told The New York Times, “It’s like if you or I morphed into fully functioning cyborgs after being thrown into a room of electronic scrap with nothing to eat.”
When the silicon revolution exploded the barriers to faster computation, then communication, large academic/research organizations and mammoth companies took part in the exploration of the new terrain that was opened. But famously– and critically– so did individuals and small groups of hobbyists, hackers, and ultimately entrepreneurs. Precisely the same pattern is emerging in the exploration of the expanding frontiers of biology.
Huge incumbent institutions like NASA are at work– and so is an already large, and growing, community of “biohackers.” As Nature reports:
…Would-be ‘biohackers’ around the world are setting up labs in their garages, closets and kitchens — from professional scientists keeping a side project at home to individuals who have never used a pipette before. They buy used lab equipment online, convert webcams into US$10 microscopes and incubate tubes of genetically engineered Escherichia coli in their armpits. (It’s cheaper than shelling out $100 or more on a 37 °C incubator.) Some share protocols and ideas in open forums. Others prefer to keep their labs under wraps, concerned that authorities will take one look at the gear in their garages and label them as bioterrorists.
For now, most members of the do-it-yourself, or DIY, biology community are hobbyists, rigging up cheap equipment and tackling projects that — although not exactly pushing the boundaries of molecular biology — are creative proof of the hacker principle. Meredith Patterson, a computer programmer based in San Francisco, California, whom some call the ‘doyenne of DIYbio’, made glow-in-the-dark yogurt by engineering the bacteria within to produce a fluorescent protein. Others hope to learn more about themselves: a group called DIYgenomics has banded together to analyse their genomes, and even conduct and participate in small clinical trials. For those who aspire to change the world, improving biofuel development is a popular draw. And several groups are focused on making standard instruments — such as PCR machines, which amplify segments of DNA — cheaper and easier to use outside the confines of a laboratory, ultimately promising to make DIYbio more accessible…
Meredith Patterson, developing genetically altered yogurt bacteria that will glow green to signal the presence of melamine (source)
Biohacking has a long, if not altogether respectable, pedigree: plastic surgery, performance-enhancing drugs… but then, the earliest tech hackers were often considered outliers– if not, indeed, outlaws. The respectability they gained over the years was a function of the establishment of the new fields– and new markets– they helped build. And as Freeman Dyson observes, that’s sure to happen in the biosphere as well:
… I see a bright future for the biotechnology industry when it follows the path of the computer industry, the path that von Neumann failed to foresee [for computers], becoming small and domesticated rather than big and centralized…
Domesticated biotechnology, once it gets into the hands of housewives and children, will give us an explosion of diversity of new living creatures, rather than the monoculture crops that the big corporations prefer. New lineages will proliferate to replace those that monoculture farming and deforestation have destroyed. Designing genomes will be a personal thing, a new art form as creative as painting or sculpture.
Few of the new creations will be masterpieces, but a great many will bring joy to their creators and variety to our fauna and flora. The final step in the domestication of biotechnology will be biotech games, designed like computer games for children down to kindergarten age but played with real eggs and seeds rather than with images on a screen. Playing such games, kids will acquire an intimate feeling for the organisms that they are growing. The winner could be the kid whose seed grows the prickliest cactus, or the kid whose egg hatches the cutest dinosaur. These games will be messy and possibly dangerous. Rules and regulations will be needed to make sure that our kids do not endanger themselves and others. The dangers of biotechnology are real and serious…
[Read the whole essay, “Our Biotech Future,” in The New York Review of Books. And do click through to the letters and responses at the end– an amazing colloquy.]
As Dr. Dyson observes, there are certainly attendant dangers. But as analogs from the silicon revolution (and indeed, all the way back to the beginning of the Enlightenment and the Scientific Revolution) demonstrate, “civilian” participation can speed the development of technologies and multiply the ways in which those technologies can be used. Indeed, the only major technology I can quickly think of that lacked meaningful enthusiast/tinkerer involvement is the development and exploitation of nuclear weapons/power (and arguably even that’s not far off); others– even capital- and research-intensive tech like aviation, telecom, et al.– were lousy with it.
So, are biohacking and the ever-democratizing biotechnologies that enable it a good thing or bad? Wrong question. History teaches us that technologies aren’t good or bad, they simply “are.” We experience them positively or negatively as a function of the way that they are used. So surely the better question– given that (like the technological capability that spawned Wikileaks) biohacking is here to stay– is what we can do to assure that its impact is, on balance, good.
This provocative book introduces a brand-new view of technology. It suggests that technology as a whole is not just a jumble of wires and metal but a living, evolving organism that has its own unconscious needs and tendencies. Kelly looks out through the eyes of this global technological system to discover “what it wants.” Kelly uses vivid examples from the past to trace technology’s long course, and then follows a dozen trajectories of technology into the near future to project where technology is headed.
This new theory of technology offers three practical lessons: By listening to what technology wants we can better prepare ourselves and our children for the inevitable technologies to come. By adopting the principles of pro-action and engagement, we can steer technologies into their best roles. And by aligning ourselves with the long-term imperatives of this near-living system, we can capture its full gifts.
[From the Viking 2010 catalog]
One doesn’t have to buy (as, FWIW, I do) Kevin’s identification of technology as a “living, evolving organism,” even as a metaphor, to appreciate the wisdom of his conclusions: we need to understand emerging technologies; we need to engage them, to steer them in directions that are safe and productive– we need, jiu jitsu-like, to turn their power to our collective good.
Attempts to deal with the unfamiliar and often uncomfortable implications of new technologies by outlawing the new technologies themselves pretty routinely fail. But worse, they distract from the need– and the opportunity– to learn how to make use of their new capabilities: e.g., while the RIAA insisted on playing Whack-a-Mole with P2P sites, Apple figured out how to use the new technology to reconfigure the music market with iTunes.
More recently, governments have gone ballistic over Wikileaks, using every direct and indirect means at their disposal to silence the site. But as The Economist observes, “short of imposing Chinese-style firewalls and censorship, free countries cannot consistently stop their citizens finding out…” Nor, one might argue, should they– given that what citizens are “finding out” is the range of things being done in their name and on their dime. But in any case, unless they head for totalitarian extremes, governments will have to find a way to behave that’s not so vulnerable to disclosure: the genie has left the bottle.
About 12 years ago I gave a talk to the senior management of one of the largest multiple-media conglomerates in the world, outlining the technological forces (then) in play, and suggesting ways in which they might disrupt their (then) current business models. At the end, I asked if there were any questions; the manager of one of the older, more traditional businesses threw up his hand. “Yeah, I’ve got a question: How do we stop this?”
The answer, history confirms: “You don’t.”
[Apologies to Rudy Rucker for appropriating the title of this post from his terrific novel.]
Filed in Competition and Industry Structure, Economic, Entrepreneuring, Environmental, Information Industry, Media and Entertainment, Political, Scenario Planning, Social, Technological
Tags: Apple, arsenic, arsenic-based life, bacteria, biohacking, biology, biotech, Caleb Scharf, Felisa Wolfe-Simon, Freeman Dyson, GFAJ-1, Halomonadaceae, iTunes, Joel Garreau, Kevin Kelly, Meredith Patterson, NASA, Nature, P2P, phosphorus, Radical Evolution, RIAA, Rudy Rucker, technological progress, technological threat, technology, What Technology Wants, wikileaks, xkcd
January 11, 2010
As Americans worry about our nation’s competitive edge, observers note that there’s a crisis in recruitment for the fundamental and applied sciences that are the foundation of a country’s technical prowess. Science graduates are demanding that potential employers “show me the money.” Lawrence Krauss observes in New Scientist,
…young people interested in a productive career [have had] two choices: take a technically demanding route and become an engineer or scientist, guaranteed to earn a respectable middle-class income, or go into the financial world, where the long hours are taxing but the intellectual demands much lighter, and the potential pay-off far greater. In the market-driven First World, is it any wonder that hard-working students are choosing the latter route?
This position contrasts with that of bright, young people in the Third World, where it is clear that the path to prosperity is a scientific or technical education. I recently lectured at the Indian Institute of Technology in Kharagpur, where only one out of every 121 applicants is admitted – a smaller percentage than at Harvard. Enrollment virtually guarantees a job as an engineer at a multinational corporation, with the possibility of starting up your own company a little further down the road.
And there’s the rub. Investment bankers and venture capitalists manage and help create wealth by building on ideas, but ultimately it is to developments in science and engineering that about half the growth in US GDP per capita over the last half-century can be attributed, according to the National Academy of Sciences’ report Rising Above the Gathering Storm, published last year.
Where then will we find ourselves a generation from now? In a world that is increasingly technological and increasingly “flat” – free of barriers to trade or capital flows – how long can a country thrive on managing wealth rather than creating it?
(See also this recent study.)
And then there are those who argue that the problem isn’t what these budding young scientists aren’t doing, it’s what they are– those who blame these Wall Street “rocket scientists” for the financial crisis through which we still struggle. For example, Sebastian Smith in PhysOrg suggests that
…there’s a reason Wall Street resembles a rocket experiment gone wrong: rocket scientists helped make it happen. Known as quants, these are the mathematicians and physicists who devised the financial instruments and computer programs fueling stock markets’ spectacular rise and collapse.
And while in good times they became financial rock stars, quants — short for quantitative analysts — are now being cast as villains of an industry that abandoned its values.
“They thought they could make it easier to make money,” one New York investment manager, speaking on condition of anonymity, told AFP. “They thought you don’t need to do your homework anymore.”
Calvin Trillin put it more directly (and more humorously) in The New York Times, quoting “a man sitting three or four stools away from me in a sparsely populated Midtown bar”:
“The financial system nearly collapsed,” he said, “because smart guys had started working on Wall Street”…
“Two things happened. One is that the amount of money that could be made on Wall Street with hedge fund and private equity operations became just mind-blowing. At the same time, college was getting so expensive that people from reasonably prosperous families were graduating with huge debts. So even the smart guys went to Wall Street, maybe telling themselves that in a few years they’d have so much money they could then become professors or legal-services lawyers or whatever they’d wanted to be in the first place. That’s when you started reading stories about the percentage of the graduating class of Harvard College who planned to go into the financial industry or go to business school so they could then go into the financial industry. That’s when you started reading about these geniuses from M.I.T. and Caltech who instead of going to graduate school in physics went to Wall Street to calculate arbitrage odds.”
“But you still haven’t told me how that brought on the financial crisis.”
“Did you ever hear the word ‘derivatives’?” he said. “Do you think our guys could have invented, say, credit default swaps? Give me a break! They couldn’t have done the math.”
“Why do I get the feeling that there’s one more step in this scenario?” I said.
“Because there is,” he said. “When the smart guys started this business of securitizing things that didn’t even exist in the first place, who was running the firms they worked for? Our guys! The lower third of the class! Guys who didn’t have the foggiest notion of what a credit default swap was. All our guys knew was that they were getting disgustingly rich, and they had gotten to like that. All of that easy money had eaten away at their sense of enoughness.”
But in fact, Trillin’s drinking buddy notwithstanding, the tradition of brilliant mathematicians and physicists “going over” to Mammon is centuries old. Indeed, arguably the greatest scientific mind of all time, Isaac Newton, capped his extraordinary career with stints as Warden, then Master of the Mint.
It was a job that Newton took deadly seriously. At the time of his first appointment in 1696, the British currency had been so seriously debased by clipping and counterfeiting during the Nine Years’ War that all English silver coinage was recalled. Newton’s extraordinary knowledge of chemistry and math saw him– and the English Mint– through the crisis. Building on that success, Newton oversaw the recoinage in Scotland, which resulted in a single currency for the UK; then spent the rest of his life– he kept the Master’s post until his death in 1727– protecting the coin of the realm from the bogus and the bent. (For a taste of his zeal, see Thomas Levenson’s terrific Newton and the Counterfeiter: The Unknown Detective Career of the World’s Greatest Scientist.)
So mathematicians and scientists have worked with– and for– money since the Enlightenment. But one huge difference does jump out: The rocket scientists of today are (to quote Trillin) “securitizing things that didn’t even exist in the first place”– pushing the limits of money and value. From the specious (CDOs) to the nefarious (front-running), the best minds of our time have devoted themselves to finding ways to conjure return out of (what turns out all too often to be) thin air… and as we all know, if it seems too good to be true, it is.
Newton, conversely, devoted himself to protecting the economy by policing the currency that fueled it– to assuring that the system was stable and trustworthy.
So while our society could surely do with a renaissance of interest in actually practicing science and technology, the fact that many whiz kids want to enter finance isn’t in itself a problem. The problem, as Newton’s example reminds us, is how those prodigies want to use their gifts.
Filed in Competition and Industry Structure, Economic, Entrepreneuring, Information Industry, Scenario Planning, Social, Technological
Tags: Calvin Trillin, CDO, competitiveness, engineering, Financial Crisis, financial engineering, front-running, Isaac Newton, Master of the Mint, mathematics, Rocket scientists, science, Sebastian Smith, technology, Thomas Levenson, Wall Street, Warden of the Mint