Tag Archives: infinity

Infinity – Infinite Energy

Power. Electricity. The Holy Grail of modern technology.

I say this because the information revolution completely depends on electricity, whether it’s batteries, hybrid motors, or the grid. Everything we do depends on converting some naturally occurring resource into power to drive our lives.

I was thinking about power recently while watching an episode of Star Trek: The Next Generation. Everything they do depends on an infinite (or nearly so) source of energy. Their warp core powers the ship for a 20-year mission. Each device they have is self-powered. From what? Do they need recharging? I imagine not, but it’s been a while since I’ve read the technical manual.

In any case, much of that world (and other Sci-Fi worlds) depends on powerful, long-lasting, disconnected energy sources. For one example, think of the energy required to power a laser-based weapon. And it has to fire more than once.

The truth is that having such a power source is more than world-changing. It has the potential to completely rebuild society from the ground up. If you think about it, much of the world’s conflict is over sources of energy. Authority and power are derived from who controls the resources. If energy were infinitely available, it would be infinitely cheap (at least in some sense). I almost think it would shift society’s focus from worldly gain to the pursuit of knowledge, enlightenment, and improvement. We wouldn’t have to worry about how to get from one place to another, or who has more oil, or which industries to invest energy resources in. So much would come free.

When I speak of “infinite” power, don’t take it literally. What I mean is “So much to be practically unlimited.”

Of course there are different types of infinities:

  1. Infinite magnitude – Can produce any amount of power you desire. Not very likely. Something like this would be dangerous. “Ok, now I want Death Star phasers. ok. Go.” Boom.
  2. Infinite supply – There’s a maximum magnitude in the amount of power it can generate, but it can continue “forever” (or at least a reasonable approximation of forever). This is the useful one.

And there are a few other requirements we should consider:

  1. Non-destructive: to the environment, to mankind, etc.
  2. Highly efficient.
  3. Contained and controlled. Obvious.
  4. Portable. Sometimes microscopically so.

It’s nice to dream about such things…

  • Cell phones and laptops that never need recharging
  • Tiny devices everywhere that never need an external power source (GPS, sensors, communications devices, robots, etc.)
  • Cars that never need fuel. Ever. We’d probably keep them a lot longer. They could do more: be larger, more efficient, faster, safer.
  • Vehicles that can expand the boundaries of their current form. How big can you make an airplane if you don’t have to worry about using up all its fuel? (not to mention the weight)
  • Easier to get things into orbit–space program suddenly becomes much more interesting. Maybe we can develop engines that produce enough power to escape gravity, without using propellant (a truly ancient technology).
  • Devices that can act more intelligently, and simply do more than current devices. Think of how your iPod turns itself off after a few minutes of inactivity. That scenario would be a thing of the past.
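On the orbit point above, a back-of-the-envelope calculation shows how modest the ideal energy requirement actually is. This is a sketch in Python; the 400 km altitude and the loss-free assumptions (no drag, no gravity losses, no propellant mass) are mine, not the post’s:

```python
# Minimum (ideal) energy to put 1 kg into a circular low Earth orbit.
MU = 3.986e14       # Earth's gravitational parameter G*M, m^3/s^2
R_EARTH = 6.371e6   # mean radius of Earth, m

def energy_to_orbit_per_kg(altitude_m=400e3):
    """Loss-free energy in joules to take 1 kg from the surface to a circular orbit."""
    r = R_EARTH + altitude_m
    kinetic = MU / (2 * r)             # v^2/2, with circular speed v = sqrt(MU/r)
    lift = MU * (1/R_EARTH - 1/r)      # gravitational potential difference
    return kinetic + lift

e = energy_to_orbit_per_kg()
print(f"{e/1e6:.0f} MJ/kg, or about {e/3.6e6:.0f} kWh/kg")  # roughly 33 MJ/kg, ~9 kWh/kg
```

Real launches spend an order of magnitude more because of drag, gravity losses, and the weight of the propellant itself, which is exactly why a compact, long-lived power source would be so transformative here.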

With such a power source, the energy economy of devices that we have to pay such close attention to now goes out the window. Who cares how much energy a device uses if there’s an endless amount to go around? (And since we’ve already established that the energy source is non-destructive and highly efficient, environmental factors don’t enter in.) There would be no need for efficiency until you started bumping up against the boundaries of how much power you needed.



Check out my latest book, the essential, in-depth guide to performance for all .NET developers:

Writing High-Performance .NET Code, 2nd Edition by Ben Watson. Available for pre-order:

Infinity – Infinite Storage

Anybody who’s taken high school or college mathematics knows how phenomenal exponential growth is. Even if the growth rate is very, very small, it eventually adds up. With that in mind, look at this quick-and-dirty chart I made in Excel, plotting the growth in hard drive capacity over the years. [source: http://www.pcguide.com/ref/hdd/hist-c.html]

Hard Drive Capacity Graph

OK, it’s ugly, but notice a few things:

  1. The pink denotes the data points from the source data or what I put in (I added 1000 GB in 2007).
  2. The scale is logarithmic, not linear. Each y-axis gridline represents a ten-fold increase in capacity.
  3. At the current rate of growth, by 2020, we’ll have 1,000,000 GB hard drives. That’s 1 petabyte (1PB). (by the way, petabyte is not in Live Writer’s spelling dictionary–get with the times Microsoft!)
  4. The formula, as calculated by Excel, says that the drive capacity should double roughly every 2 years.
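That extrapolation is easy to sketch in code. The 1,000 GB starting point in 2007 is from the chart above; the doubling period is a parameter, since Excel’s fitted curve is only roughly a two-year doubling:

```python
def capacity_gb(year, base_year=2007, base_gb=1000, doubling_years=2.0):
    """Extrapolated drive capacity, assuming it doubles every `doubling_years`."""
    return base_gb * 2 ** ((year - base_year) / doubling_years)

for year in (2010, 2020, 2100):
    print(f"{year}: {capacity_gb(year):,.0f} GB")
```

Note that a strict two-year doubling from 1,000 GB puts the petabyte milestone closer to 2027 than 2020; the trend line fitted to the full history evidently grows a bit faster than that.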

Also, this doesn’t really take into account multiple-hard drive storage schemes like NAS, RAID, etc. Right now, it’s quite easy to lash individual storage units together into packages such as those for more space, redundancy, etc. I’ll ignore that ability for now.

So 2020: that’s 12 years from now. We can expect to have a petabyte in our computers. That’s a LOT of space. Imagine the amount of data that can be stored. How about every book ever written? How about all your music, high-def DVDs, ripped with no lossy compression?

Tools such as Live Desktop and Google Desktop take on a whole new level of importance when faced with the task of cataloging petabytes of information on your home PC. Because, let’s face it, you’ll never delete anything. You’ll take thousands of pictures with your digital camera and never delete any of them. You’ll shoot hours of high-def footage and never watch or edit it, but you’ll want to find things in it (with automated voice recognition and image analysis, of course). Every e-mail you get over your entire lifetime can be permanently archived.

What if you could get a catalog of every song ever recorded? That would probably require more than a few petabytes, even compressed, but we’re heading that way. I don’t think the amount of music in the world is increasing exponentially, is it? Applications like iTunes and Windows Media Player, not to mention devices like iPods, would need a carefully designed interface to handle organizing and searching that much music. I think Windows Media Player 11 is incredible, but I don’t think it could handle more than about 100,000 songs without choking–has anyone approached any practical limits with it?
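Some rough arithmetic on the “every song ever recorded” question. The per-track sizes here are my assumptions (roughly 30 MB for a losslessly compressed track, 5 MB for a typical lossy rip), not figures from the post:

```python
PB = 10**15   # one petabyte, in decimal bytes
MB = 10**6

def tracks_per_petabyte(track_size_mb):
    """How many tracks of the given average size fit in one petabyte."""
    return PB // (track_size_mb * MB)

print(f"lossless (~30 MB/track): {tracks_per_petabyte(30):,} tracks")
print(f"lossy    (~5 MB/track):  {tracks_per_petabyte(5):,} tracks")
```

So one petabyte holds tens of millions of lossless tracks; a catalog of every recorded song plausibly fits in a handful of petabytes, as guessed above.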

What about the total information in the world–that probably is increasing exponentially.  Will we eventually have enough storage so that everyone can have their own local, easily searchable copy of the vast sum of human knowledge and experience? (Ignoring the question of why we would want to)

Let’s extrapolate this growth out 100 years to the year 2100. I won’t show the graph, but it approaches 1E+20 GB by the year 2100.

How do the economics of digital goods change when you can have an effectively infinite number of them? It’s the opposite of real estate, an ever-scarcer good.

On my home PC, for the first time, I have a lot of storage that isn’t being used. I have about 1 TB of storage, with about 300 GB free. I suppose I could rip all my DVDs and re-rip all my music with lossless compression (it’s currently all WMA at 192 Kbps).

The rules of the game can change quickly when that much storage is available. It will be interesting to see what happens in the coming decades. Of course, all this discussion is completely ignoring the increasingly connected, networked world we live in.




The Benefits of Having Too Much Processing Power

Do an experiment: keep Task Manager or any other CPU activity monitoring program up on your screen for a few hours or days, glancing at it every so often. Do you EVER see it above zero (other than momentary spikes)?

Here’s mine, from a Google sidebar gadget:

mycputime

I’ve got a Dual Core and 2GB RAM. Currently I have open two copies of Visual Studio 2005, Word 2003, Outlook 2007, Paint.Net, RSS Bandit, Adobe Reader, IE, MSDN help, Windows Live Messenger, and Google Deskbar.

So that’s using just over 1 GB of RAM. And ZERO CPU. I’m watching this. The CPU meter goes up a little when I type, open a new program, compile my source code, etc., but most of the time it’s zero, even when I think I’m actually working.

I used to eschew running apps like Google Deskbar, wallpaper helpers like Display Fusion, or other system utilities that continually run. But I had a realization–it doesn’t matter! I could run many more utilities concurrently and still not come anywhere close to creating a slowdown on my computer.

Of course, I’m only talking about non-interfering, non-processor-intensive programs. This immediately excludes anti-virus programs, which continually interrupt other processes to examine system behavior, and obvious hogs like running video compression (duh) in the background.

But things like desktop searching, system monitoring (if it’s not too intrusive), utilities, and any other independent process–yeah, just throw them on. They won’t make a dent.

The key word in that last paragraph is independent. Independent means they don’t depend on or interfere with other processes.


