Wednesday, October 31, 2007

Don Norman: "The Design of Future Things"

Today I got the new book by Don Norman, "The Design of Future Things". Having read the book tonight, I have a few words about it. As usual, the book is typical Norman. It is easy to read, full of good stories and great examples. It also addresses issues that are clearly becoming some of the most interesting emerging design challenges in our society, at least when it comes to the technological aspects of our everyday reality.

The basic question in the book is based on the assumption that we are at a point when we are able to develop "intelligent" and autonomous devices. The question is then how to live and interact with such devices. What if the car "takes care of us" and makes decisions that override our own actions as a driver? How can we establish a symbiotic relationship (a concept that Norman uses and likes) with the smart car? The book is full of examples of smart technology that in most cases just seems annoying and terrible to live with. Norman gives very few examples of designs where the symbiotic relationship is designed in such a way that it works. His favorite example is the relation between a horse and a rider. He suggests that such a relationship is what we should strive for. He also proposes a number of Design Rules that he says "designers and engineers can implement [them] in the innards of machines".

The book is of course stimulating to read. Just the number of examples and Norman's ability to create a good story around the use of a specific technology or device is reason enough to read the book. He raises issues that are at the forefront of interaction design today. Of course, there are also aspects of the book that are not fully in line with my way of thinking. For instance, I think he focuses too much on the notion of "intelligence" and on the idea of "smart machines". I prefer to see most of the issues he raises less as a question and consequence of "smartness" than as a question of intentional design of interactivity. Any move towards "smartness" and autonomy must be dealt with in the design of the interaction. Many of the examples of smart devices in the book are (also in the eyes of Norman) bad and lead to situations that no one would see as desirable. I am not sure that these examples are consequences of the level of smartness; instead, I see them as examples of simply bad design. We have the same situation with non-smart devices, i.e., regular physical things. Some designs fit perfectly well in their environment and people enjoy them, while others are just annoying and create frustration (see Norman's earlier books). We do not see this as a consequence of the device's non-smartness; instead, we see it as a question of how well the designer can shape its functionality, form and appearance in relation to its purpose and environment (and of course in relation to its "smartness"). Anyhow, this is not a criticism of Norman's ideas; it is more an example of the kind of thinking and discussion the book will lead to.

I recommend it to anyone who is involved in the design and development of any kind of interactive artifact. It is fun and stimulating reading!

2 comments:

Yen-ning said...

This post and Peter Kahn's presentation make me think more about "smart" things, such as smart homes and robots. They also remind me of a post by David Royer on our interaction culture blog, which has some good comments on smart designs.
http://interactionculture.wordpress.com/2007/10/23/interesting-commentary-on-smart-homes-design-technology-humanity/

I am just curious about why people design smart things. Are we eager to see objects act like human beings? Or do these smart designs really contribute to a better life?

Norman's book seems interesting to me.

Erik Stolterman said...

Hi Yen-ning

This is a very good comment, and it is similar to the one I made about Don Norman's book (another post). I think it is not always "smart" to focus on the "smartness" of things :-) I think this makes us focus on the wrong thing, which is why I prefer the notion of interaction as the core concept instead of smartness or robots. Whether I am right, I don't know, and whether it actually matters, I don't know either :-)