Ok, I'm in a bit of a quandary now.

I've got a solution that fully optimises all CDs.

It takes about 1 hour 40 minutes on the machine I use for work, so I'm not really interested in running it 100 times just to generate exactly the same results; take one run and multiply by 100 for a worst-case run time.

Now, do I start cutting the code to get a lower byte count, or do I start adding code to get a faster-executing script?

Or do I swap the algorithm for one that produces a less accurate result but gives a smaller script or a faster run time?
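For what it's worth, assuming the task here is packing items onto fixed-capacity CDs (i.e. a bin-packing problem — that's my guess, not something stated above), the classic "less accurate but much faster" option is a greedy heuristic like first-fit decreasing. A minimal sketch, with illustrative sizes and capacity:

```python
def first_fit_decreasing(sizes, capacity):
    """Approximate bin packing: place each item (largest first)
    into the first CD that still has enough free space."""
    free = []     # remaining free space on each CD
    packing = []  # parallel list: items placed on each CD
    for size in sorted(sizes, reverse=True):
        for i, space in enumerate(free):
            if size <= space:
                free[i] -= size
                packing[i].append(size)
                break
        else:
            # no existing CD fits this item; start a new one
            free.append(capacity - size)
            packing.append([size])
    return packing

# Hypothetical example: item sizes in MB onto 700 MB CDs
print(first_fit_decreasing([650, 400, 300, 300, 100, 50], 700))
# → [[650, 50], [400, 300], [300, 100]]
```

It runs in roughly O(n²) rather than the exponential time an exhaustive search needs, but it can use more CDs than the optimum in adversarial cases, which is exactly the accuracy-for-speed trade being weighed up.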