Monthly Archives: September 2007

Getting Green Off the Grid

Going green is something I am slowly becoming more interested in. I’m not really sure exactly what steps we need to take–I don’t think we have an inordinate impact on the environment, and to be honest, right now my pocketbook is far more important. That said, I do drive a Honda Civic that I’ve been able to get more than 42 mpg out of. We try to use everything we buy, and we give away, recycle, sell, or otherwise dispose of everything we don’t need. We try to walk places where we can.

Thank you to Eric for his contribution to BuyMeALego. He has a genuinely interesting site. Getting Green Off the Grid is a blog about both more sustainable living and living independently.

About the site:

This is a journal of my research into becoming more independent, away from the power grid. My goal one day is to live out in the middle of nowhere, dependent upon none but myself and my family. That dream is a long way away, but every little step counts.

I think that is a very enviable position to be in–completely independent. Generating your own power in particular fascinates me, or better yet, being able to sell power back to the power company.

I don’t think it’s possible to turn off our dirty technologies or habits all at once, but having people like this who do the research, who advocate, who publicize the next big clean technology is absolutely vital. We need to start down the path and have smart people working on it hard. We’ll get there, eventually.

Anyway, I think I will subscribe to his blog for a while and check it out–the posts I’ve read are interesting and he links to some good stuff.


A Visual Studio that’s easier on the eyes

After you’ve looked at Visual Studio all day for a few days in a row, the brightness of the white background can really start to bother you, especially as LCD monitors get brighter and brighter. That’s why I’ve become a big fan of Dave Reed’s Dark Side theme for Visual Studio 2005. It took me a few hours to get used to it, but now I’m hooked.

The only thing I changed was the font. I really like Consolas, size 13. Courier New is Courier-Old-and-No-Longer-Used.

At work, I still have to use VS2003 for a project, and keeping it in the older, white background really helps me distinguish which environment I’m in.

My Wife’s Logic (or Women’s Logic Explained?)

For all of you who learned boolean algebra in your CS courses in college, I am sorry to be the bearer of bad news: your education was incomplete. The list of boolean tautologies and truth tables that you may have memorized or learned over time was wrong, with some startling and glaring errors.

To rectify this, I present some new truth.

First, an example from real life, which really happened. For context: Leticia, my wife, has beautiful olive skin, dark brown eyes, and hair in various shades of brown, while the little girl in question was as white as can be.

Leticia: Look at that little girl–she’s so beautiful! Do you think we’ll ever have a girl who looks like her?

Me: No.

L: So you think our daughter will be ugly!

M: Uhhh……no. I don’t think she’ll be white.

Transforming this little conversation into boolean logic:

A: This little girl is beautiful

B: Our future daughter will look like this girl

C: Our future daughter will be beautiful

So my wife says that AB –> C, and that since I said !B, it follows that !B –> !C. I always knew the first part, but what I didn’t know is that the other options I thought existed aren’t actually valid (i.e., that !B –> C is also possible, or in other words, that C can be true regardless of the value of B). Who knew! So I present corrected truth tables below. Wikipedia, take note.

Standard Truth Table for Implication

X | Y | X –> Y
F | F | T
F | T | T
T | F | F
T | T | T

Improved Truth Table for Implication

X | Y | X –> Y
F | F | T
F | T | F
T | F | F
T | T | T

So you see that the correct form of implication when dealing with this logic is the same as equality.
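
For the programmers in the audience, a tiny C# snippet makes the difference concrete: classical implication is just (!x || y), while the improved operator turns out to be plain equality (x == y).

using System;

class TruthTables
{
    // Print booleans the way the tables above do.
    static string B(bool b) { return b ? "T" : "F"; }

    static void Main()
    {
        Console.WriteLine("X  Y  X->Y  X==Y");
        foreach (bool x in new bool[] { false, true })
        {
            foreach (bool y in new bool[] { false, true })
            {
                bool implies = !x || y;  // classical implication
                bool equal = (x == y);   // the "improved" operator: plain equality
                Console.WriteLine("{0}  {1}  {2}     {3}", B(x), B(y), B(implies), B(equal));
            }
        }
    }
}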

Now you know, beware.

The users are in control

I really enjoyed and appreciated this essay from Raganwald about the experience users get at work versus what they have with their home PCs (among other topics).

I particularly liked the point:

And meanwhile, the very same users could walk across the street and buy themselves a much better PC for less money than we pay and take it home the same day.

Ain’t that the truth. I put together my Core 2 Duo system for the same price as my crappy Pentium 4 hyperthreaded number at work. The time frames were not that far apart. The Core 2 runs circles around this sick puppy.

A company’s philosophy should be to get users (especially developers like me!) whatever hardware/software they need immediately. Within minutes or hours, not days or weeks. Of course, then you have to trust your employees to make good requests. But if you don’t trust them to know what they need, why trust them to do their job at all?

The essay goes on to talk about writing applications that take advantage of modern PC horsepower. I think I’m doing an OK job of this at work now. For example, we have a database of assets that is continually growing. It used to be that we could view all of the assets on a single page, which took about 30 seconds to load off-site.

Now that list takes several minutes to bring up. Yeah, we’re growing. So we need tools to help manage all of that information. One thing I’m building right now (as soon as I’m done writing this, as a matter of fact) is quick filtering functionality in a desktop app that talks to the database. The list of assets is filtered as you type, taking advantage of the fast PCs we have these days.
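
A rough sketch of the idea, in case it’s useful (the control names and the string-based asset list are just for illustration, not the actual work app): the whole list lives in memory, and every keystroke re-filters it.

using System;
using System.Collections.Generic;
using System.Windows.Forms;

// Filter-as-you-type over an in-memory list of asset names.
public class AssetFilterForm : Form
{
    private readonly TextBox filterBox = new TextBox();
    private readonly ListBox resultsBox = new ListBox();
    private readonly List<string> allAssets;

    public AssetFilterForm(List<string> assets)
    {
        allAssets = assets;

        filterBox.Dock = DockStyle.Top;
        resultsBox.Dock = DockStyle.Fill;
        Controls.Add(resultsBox);
        Controls.Add(filterBox);

        // Re-filter on every keystroke; a linear scan over tens of
        // thousands of strings is plenty fast on current hardware.
        filterBox.TextChanged += delegate { ApplyFilter(filterBox.Text); };
        ApplyFilter(string.Empty);
    }

    private void ApplyFilter(string text)
    {
        resultsBox.BeginUpdate();
        resultsBox.Items.Clear();
        foreach (string asset in allAssets)
        {
            if (asset.IndexOf(text, StringComparison.OrdinalIgnoreCase) >= 0)
                resultsBox.Items.Add(asset);
        }
        resultsBox.EndUpdate();
    }
}

The point isn’t the specific controls; it’s that once the data is local, re-running the whole filter on every keystroke is perfectly affordable.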

That’s just one example. I can think of others that are immediately useful in business apps:

  • better visualization – it takes time and thought to develop good data visualization, but the results are usually worth it
  • drag & drop support – does it make sense to drag assets from one customer to another? I don’t know, maybe.
  • dynamic views – use all that processing power to show something more interesting than fields on a scrolling form: graphical views that change in response to context
  • track history, undo/redo – might make sense in some contexts
  • attach more meaningful information – pictures, videos, documents, whatever; with stuff like WPF, it’s easier than ever to display varied content


Don’t ignore naive or "stupid" algorithms — hardware is cheap and fast

I just had a nice reality check. Sort of pleasant, in that I realized I could save a LOT of memory (like from 35 MB down to 9 MB), but also aggravating, because I had spent probably 10-20 hours developing a clever algorithm designed for speed.

Lesson learned. I should have built the naive version first. Instead, I wrote two successively more “brilliant” versions that jumped through all sorts of hoops to squeeze out the most speed. Of course, to do this, they took up all sorts of memory with indexes, and index creation was starting to take about 10 seconds or longer. I should have just built the naive version.

I just wrote the naive version and realized I could have done that in about 5 minutes and saved many hours of tweaking. The component is a type of indexing component, so there were three metrics: index creation time, lookup time, and index size. Here’s a rough comparison just to give an idea:

  Metric              | Clever Algorithm | Naive Algorithm
  Index Creation Time | 10s              | 0.3s
  Lookup Time         | 0.0001s          | 0.005s
  Index Size          | 35 MB            | 9 MB

  (# of items in both cases: ~27,000)

Pretty impressive speed numbers, aren’t they? That clever algorithm really rocks, and it would be awesome if I were doing a lot of consecutive searches. But the searching in my app is tied to the UI, and thus to the user, so in reality 0.005 seconds is not that different from 0.0001 seconds. <sigh>

The numbers above are from my main machine, which is a Core 2 Duo. Just to be safe, I tested the naive algorithm on my 4-year-old Pentium 4 laptop to validate that it still has acceptable performance on an older machine. The creation takes 0.05 seconds, but lookup time isn’t much slower, if at all.

And a 9 MB index is MUCH better than a 35 MB index.
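
To make the point concrete, here’s the kind of dumb-but-good-enough approach I mean: not my actual indexing component, just a plain linear scan over made-up data, timed with Stopwatch.

using System;
using System.Collections.Generic;
using System.Diagnostics;

class NaiveLookupDemo
{
    static void Main()
    {
        // Fake data set roughly the size mentioned above (~27,000 items).
        List<string> items = new List<string>();
        for (int i = 0; i < 27000; i++)
            items.Add("Asset number " + i);

        // No cleverness at all: scan every item and compare.
        Stopwatch sw = Stopwatch.StartNew();
        List<string> hits = new List<string>();
        foreach (string item in items)
        {
            if (item.Contains("1234"))
                hits.Add(item);
        }
        sw.Stop();

        Console.WriteLine("{0} matches in {1:F3} ms",
            hits.Count, sw.Elapsed.TotalMilliseconds);
    }
}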

In summary, lessons learned:

  1. Hardware is cheap and fast. Don’t waste time optimizing for speed if you don’t have to. While there are signs the raw speed of a processor is plateauing as multiple cores become more important, in general, speed is always increasing.
  2. If you’re running something in response to user input, raw speed isn’t critical (as long as it’s faster than human response time)
  3. Every application is different, so measure and think critically. If my app needed to run the search 100 times per second, the clever algorithm would definitely be better.
  4. There is almost always a tradeoff between speed and size. Which is more important depends on the app.
  5. Write the dumb algorithm first. It might be good enough and you’ll save yourself hours of development and debugging time.


Farewell, Robert Jordan

According to his blog, Robert Jordan passed away yesterday. He fought a tough illness for quite a while. I became a big fan of The Wheel of Time a few years ago and forced myself to stop reading the books until the final one comes out.

I loved the books because they were immense, detailed, complex, and very engaging. He was in the middle of writing the final book, and he left notes and gave an oral narration of the end of the series, but it just won’t be the same.

The Fountain

We just got Darren Aronofsky’s The Fountain in the NetFlix mail today, and we loved it. Definitely worth watching: a thinking movie and a feast for the eyes. The use of light was spectacular. It was in the same realm as What Dreams May Come (though I liked that one better), but it also made me think of Orson Scott Card’s Speaker for the Dead. I’m afraid that if I say why, it will give too much of the movie away. It just needs to be seen and experienced.

See it–a wonderful movie.

How to measure memory use in .Net programs

In developing an important component (which I will discuss soon) for my current personal project, there were a number of different algorithms I could use to attack the problem. I wasn’t sure which one would be better, so I decided to implement each of them and measure their time and memory usage. I have a series of articles planned on those, but first I need to mention how I did the measurement.


For the timing, I simply wrapped the API functions QueryPerformanceCounter and QueryPerformanceFrequency in a C# class. There are plenty of examples out there on the net for doing this.
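
For reference, a bare-bones version of such a wrapper might look like this sketch (the class name is mine, and real code would want error checking on the P/Invoke calls):

using System.Runtime.InteropServices;

// Thin wrapper around the Win32 high-resolution performance counter.
public class HighResTimer
{
    [DllImport("kernel32.dll")]
    private static extern bool QueryPerformanceCounter(out long count);

    [DllImport("kernel32.dll")]
    private static extern bool QueryPerformanceFrequency(out long frequency);

    private readonly long frequency;
    private long startCount;
    private long stopCount;

    public HighResTimer()
    {
        // Counts per second; fixed for the lifetime of the system.
        QueryPerformanceFrequency(out frequency);
    }

    public void Start()
    {
        QueryPerformanceCounter(out startCount);
    }

    public void Stop()
    {
        QueryPerformanceCounter(out stopCount);
    }

    // Elapsed time of the last Start/Stop pair, in seconds.
    public double ElapsedSeconds
    {
        get { return (stopCount - startCount) / (double)frequency; }
    }
}

(In .NET 2.0 and later, System.Diagnostics.Stopwatch wraps the same counters, so rolling your own is mostly a matter of taste.)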

The memory usage is even simpler. The function you need is GC.GetTotalMemory(bool forceFullCollection). It returns the number of bytes the runtime currently believes to be allocated. The little program below demonstrates how to use it.

using System;

namespace MemUsageTest
{
    class Program
    {
        static void Main(string[] args)
        {
            // Snapshot the heap, make the allocation we want to measure,
            // then snapshot again.
            long memStart = GC.GetTotalMemory(true);
            Int32[] myInt = new Int32[4];
            long memEnd = GC.GetTotalMemory(true);

            // Keep the array reachable so the forced collection above
            // can't reclaim it before the second reading is taken.
            GC.KeepAlive(myInt);

            long diff = memEnd - memStart;

            Console.WriteLine("{0:N0} bytes used", diff);
        }
    }
}

The output on my 32-bit system is “40 bytes used”–16 bytes for the four integers and 24 bytes of array overhead.

Passing true to GetTotalMemory is important–it gives the garbage collector a chance to reclaim all unused memory. There is a caveat, though. If you read the documentation, you’ll notice it says the method attempts to determine the number of bytes currently in use, and that setting forceFullCollection only gives the collector an opportunity to run, waiting a short time before returning. It does NOT guarantee that all unused memory is reclaimed.
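
If you want to nudge it a little harder before taking a reading, one common pattern (not something GetTotalMemory requires, just extra insurance) is to force a couple of full collections and wait for finalizers first:

// Encourage as complete a collection as possible before measuring.
GC.Collect();
GC.WaitForPendingFinalizers();
GC.Collect();
long bytesInUse = GC.GetTotalMemory(true);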

As far as my work goes, I’ve noticed that it does a pretty reliable and consistent job.


On the cover of Wired magazine

OK, it’s a bit old now, but I thought I’d show myself on the cover of Wired magazine. Cool, isn’t it? This was part of a promotion by Xerox where they printed 5,000 (more?) custom covers. This was the July issue. I mostly like how we’re obviously on the water in the picture, but not according to the map. Cool, anyway. Definitely a keepsake.

Ben and Leticia on the cover of Wired
