This article on Apple and skeuomorphic design explains that it’s time to move on, and I agree. We long ago moved past the need to have our software represent real hardware, and there are many reasons for that. The article suggests that real-world metaphors were once helpful in teaching people how to use software, but I think it’s more than just that. Making things look like real things made interfaces look better when screens were lower resolution; it was hard to design visual forms that worked well when everything still looked a lot like pixels.
But now we’ve reached a point where purely digital UI is also intuitive. In fact, interfaces can be more intuitive than reality now, because they don’t require us to follow real-world constraints in order to reach the results we want.
I had not thought about this before: many of the conventions that programmers follow come out of a time when computer interfaces were new, home computers equally so, and many people would never learn much code just to use their machines. I remember learning BASIC, Pascal, and HyperCard’s scripting language around the time of the early-nineties Apple LC. We did not grow up with computers everywhere and touch screens as the norm.
Interfaces and programs that mimic things like Polaroid cameras, reel-to-reel tape recorders, Rolodexes, and the like are nicely retro and kitsch, but they’re an anachronism that will someday give way to a better style of UI design, unless most people keep growing up using and relating to those objects.
Thanks
Greg