I know this sounds laughable in this age of freezes, comments that never appear, lost orders, and just plain irritating stupid computer tricks, but some researchers say it might be wise to quit trying to be cyberly-perfect if it means burning too much energy.
The trade-off: energy-saving, but somewhat error-prone. What do you think?
Researchers at the University of Washington are designing a programming language called EnerJ, which lets benign errors slip through but blocks the big, horrible ones.
It's called approximate computing.
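The published EnerJ papers describe the idea as type qualifiers: data marked approximate may live in cheap, error-prone storage, while data marked precise gets the full-power guarantee. Here is a rough sketch of that idea in plain Java, with stand-in annotations defined locally; the names @Approx and @Precise come from the EnerJ papers, but this is an illustration, not the real EnerJ checker.

```java
import java.lang.annotation.*;

// Stand-in marker annotations mimicking EnerJ's published
// @Approx / @Precise type qualifiers. Plain Java ignores them;
// the real EnerJ type checker would enforce the separation.
@Target(ElementType.TYPE_USE)
@interface Approx {}

@Target(ElementType.TYPE_USE)
@interface Precise {}

public class ApproxDemo {
    public static void main(String[] args) {
        // A pixel's brightness: a flipped low bit is invisible,
        // so it could live in low-power, error-prone memory.
        @Approx int pixel = 200;

        // An array index must never be wrong: a bad index
        // crashes the program instead of dimming a pixel.
        @Precise int index = 0;

        // The rule of thumb: approximate data may feed other
        // approximate data, but it can't silently become precise.
        @Approx int dimmed = pixel / 2;

        System.out.println("dimmed pixel = " + dimmed + ", index = " + index);
    }
}
```

The point is that the programmer, not the hardware, decides which values are allowed to be a little wrong, so the errors stay in places where nobody notices.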
Now--they say--they need to think of a use for it. Any come to mind?
I say almost anything would do, since the errors are slipping through now anyway. But that would be mean, wouldn't it?
Still, I had to switch to Chrome just to write this post; everything else works fine in the AOL browser. This is exactly what I mean.