Carlson vs. Moore
In one corner, we have Moore’s law. In the other corner, there is Carlson’s curve.
Moore’s law—named after Gordon Moore, co-founder of Intel—famously predicted over 40 years ago that the transistor density of integrated circuits would double about every two years. So far, it’s been right.
Carlson’s curve—named after biologist Rob Carlson—refers to a graph showing the declining cost per base of sequencing DNA over time. Like transistor density, sequencing capacity has grown exponentially, and it shows no signs of slowing down.
Of course, neither of these is a fundamental law of nature; both are merely empirical observations, and reality will inevitably deviate from the prediction someday.
So one naturally wonders: if it were a contest, which would hold steady the longest?
If one assumes the conventional mathematics of exponential growth, Moore’s law will be repealed first simply because it started first (around 1965, versus roughly 1990 for sequencing), and such a curve always levels off eventually.
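The head-start argument can be made concrete with a toy calculation. The start years and the two-year period are the rough figures quoted above, not precise measurements, and the helper function is purely illustrative:

```python
# Toy comparison of the two exponential trends. All constants are
# illustrative assumptions drawn from the rough dates in the text.

def doublings_since(start_year, year, period=2.0):
    """Number of doublings (or halvings) a trend with the given
    doubling period has accumulated by `year`."""
    return (year - start_year) / period

# Moore's law: transistor density doubling since ~1965.
# Carlson's curve: sequencing cost per base halving since ~1990.
trends = [("Moore (density doublings)", 1965),
          ("Carlson (cost halvings)", 1990)]

for name, start in trends:
    print(f"{name}: {doublings_since(start, 2025):.1f} by 2025")
```

The point of the sketch is only that, at the same cadence, the older curve has already spent far more of whatever runway physics allows it.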
Deviations from a standard exponential would most likely come from new discoveries or technological innovation. For Moore’s law, the physical constraint of moving electrons in circuits is certainly limiting, as is the near certainty that a transistor can’t be smaller than a single atom. However, quantum computing or DNA computing might overcome these inherent limits of silicon chips.
For DNA sequencing (or DNA synthesis for that matter), the cost of chemical reagents and the ability to physically resolve DNA fragments cheaply are clearly limiting at some point. However, rapid advances in robotics, miniaturization, and “lab-on-a-chip” technologies can be expected to continue for the foreseeable future.
Based on nothing but the intuition that cost is easier to overcome than physics, my bet is that the halving of DNA sequencing’s price tag will outlast the doubling of integrated circuits. Doubtless, readers can present counter-arguments, which I of course welcome and encourage.