Today's Reading

2018 Nov 12th:


"Towards Design Principles for Visual Analytics in Operations Contexts" by Conlen, Stalla, Jin, Hendrie, Mushkin, Lombeyda, Davidoff

This is an unusual and interesting example of HCI research. The authors (quite a few of them) have been engaged in developing a visual analytics tool for the NASA Jet Propulsion Lab. The challenge was to design a tool that could manage a large number of tasks and a complex informational environment. The tool is made for "analyzing predictions about the amount of data Opportunity can transmit to overpassing satellites under changing conditions". However, the tool itself is not discussed much in the paper; instead, the authors decided, based on their design experience, to extract a number of design principles that are meant to "inform the design of visual analytics systems in operations contexts. We offer these principles as a step towards understanding the complex task of designing these systems." The authors make the case that these principles are applicable to a wide range of complex operations design challenges. I find this decision to be an unusual but highly interesting way of presenting this kind of research. Usually, we are presented with the final design and maybe some tests that measure how well the tool works. In this case, the extracted principles are the contribution. This is definitely a step towards more theory-building research instead of presenting interesting but highly individual and particular aspects of one specific case. I wish we could see more of this kind of research.


Reference:
Conlen, M., Stalla, S., Jin, C., Hendrie, M., Mushkin, H., Lombeyda, S., & Davidoff, S. (2018, April). Towards Design Principles for Visual Analytics in Operations Contexts. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 138). ACM.

xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

2018 Nov 9th:

"Evaluation Strategies for HCI Toolkit Research" by David Ledo, Steven Houben, Jo VermeulenNicolai Marquardt, Lora Oehlberg and Saul Greenberg  

A large part of HCI research is devoted to the development of methods and tools for design. Often this work leads to more complex tools, sometimes in the form of toolkits, that is, sets of complementary tools aimed at supporting design for a particular purpose. In many cases, the creators of toolkits also try to evaluate their creations. This is obviously a seriously difficult task. In this paper, the authors examine how HCI researchers go about doing toolkit evaluations (they analyzed 68 papers). Based on this, they present a categorization of evaluation approaches and discuss the pros and cons of each. This is a quite simple and straightforward paper. Even though it does not really add any deeper theoretical knowledge about the evaluation of tools in general, it is highly valuable as an empirical study of how HCI researchers approach this topic, and it gives real insights to anyone who is about to embark on tool evaluation.

Reference:
Ledo, D., Houben, S., Vermeulen, J., Marquardt, N., Oehlberg, L., & Greenberg, S. (2018, April). Evaluation Strategies for HCI Toolkit Research. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 36). ACM.


xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx


2018 Nov 8th:
"Interactive Guidance Techniques for Improving Creative Feedback" Ngoon, Fraser, Weingarten, Dontecheva, Klemmer 

It is widely recognized that feedback and critique are crucial to establishing well-functioning design processes. The traditional studio teaching approach has critique as one of its core pedagogical activities. In this paper, the authors address this activity and have developed a system, CritiqueKit, that is aimed at supporting feedback and critique. The system is interesting, even though it is not easy to see how exactly it works just by reading about it. I do agree that there are ways of increasing the quality of critique; whether a system like this can actually do it, I am not sure. One question that the authors do not discuss is to what extent any system narrows the form of critique that will or can be given. What forms of critique are not possible to capture in a system, and could that lead to less creative feedback? Anyway, interesting...



Reference:
Ngoon, T. J., Fraser, C. A., Weingarten, A. S., Dontcheva, M., & Klemmer, S. (2018, April). Interactive Guidance Techniques for Improving Creative Feedback. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (p. 55). ACM.

xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx


2018 Nov 7th:
"Customizing Hybrid Products" by Benford, Koleva, Preston, Angus, Thorn and Glover
One of the exciting aspects of the ongoing digital transformation is the growth of hybrid things, that is, things that are not only physical or digital but have distinct qualities related to both. Of course, most digital things have physical aspects, but in most cases, these serve only as containers of whatever technology is inside. Hybrid things are those where both the physical and digital qualities are crucial to the functionality and use of the thing.

In HCI education, we have seen a drastic growth in the inclusion of physical design in courses. In the CHI paper "Customizing Hybrid Products", the authors experiment with the design, use, and appropriation of hybrid things. They build a hybrid advent calendar, deploy it, and investigate its use. As a result, they propose some concepts that could support the design of hybrid products, especially related to customization. Anyone working with hybridity, customization, or appropriation might find the paper useful.

Reference:
Benford, S., Koleva, B., Preston, W. W., Angus, A., Thorn, E.-C., & Glover, K. (2018, April). Customizing Hybrid Products. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM.


xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx


2018 Nov 6th:  
"Impact of Interaction Paradigms on Full-Body Interaction Collocated Experiences for Promoting Social Initiation and Collaboration" by Ciera Crowell, Joan Mora-Guiard & Narcis Pares (ref below)

I picked this article based on the notion of "interaction paradigm" in the title. Since I am working on interactivity and interaction, it seemed interesting. The authors present "first-person and third-person interaction paradigms, and the corresponding theoretical approaches when designing and developing collocated experiences for children with Autism Spectrum Disorders (ASDs)." (from Abstract).

The notion of the two paradigms is not new; it is commonly used when it comes to games, but applying it in this field is quite interesting. The authors do a good job of presenting the basic ideas and the consequences they have on design.

I would recommend this article to anyone who is involved in designing collaboration in VR.

Reference:
Ciera Crowell, Joan Mora-Guiard & Narcis Pares (2018) Impact of Interaction Paradigms on Full-Body Interaction Collocated Experiences for Promoting Social Initiation and Collaboration, Human–Computer Interaction, 33:5-6, 422-454, DOI: 10.1080/07370024.2017.1374185 
