Report on Fundamentals of Ubicomp course

During the spring 2010 semester I taught a new course entitled Ubiquitous Computing Fundamentals. The term ubiquitous computing refers to the model of computing in which computers are embedded in everyday objects and become part of everyday activities. As the name implies, this course was designed as an introduction to this exciting field of study.

In this course I used the excellent new ubicomp textbook [1] edited by John Krumm. I highly recommend this book to anyone starting out in the field of ubicomp. Specifically, I like two aspects of the book. First, the team of contributors assembled by John provides a comprehensive introduction to the myriad topics that make up the ubicomp field. The fact that ubicomp is an interdisciplinary field is exciting, but getting an overview of the field may seem like a daunting task. The textbook provides this overview. Second, paraphrasing Aaron Quigley’s assessment of his chapter [2], the book provides “an entry point” to the world of conducting research in general, and ubicomp research in particular. The contributors discuss the tools used in various aspects of ubicomp research, from prototyping, to user studies, to data processing. The individual chapters help the reader formulate research questions and steps, and provide valuable tips on how to report on results.

The course covered three topic areas:

  • History of ubicomp. The semester started with Weiser’s seminal paper [3] and with a textbook chapter introducing ubicomp by Roy Want, one of Weiser’s collaborators at Xerox PARC.
  • Building ubicomp systems. We discussed various aspects of creating ubicomp systems, from writing always-on software, to privacy, to conducting laboratory and field experiments.
  • The user experience. As this is my research focus, we spent a considerable amount of time discussing user interactions with ubicomp systems, from speech interactions, to multi-touch tables, to tangible user interfaces.

I found that an excellent way to discuss ubicomp topics is to take advantage of research videos posted online. We viewed many such videos and this led to productive discussions. We also benefited from excellent talks by Marko Popovic, Bret Harsham and Albrecht Schmidt.

I felt that the course was a success. Students indicated that they liked the course and thought that it was useful. The course also allowed students to express themselves creatively through the course project. The results were impressive and I’ll end this post with an example. The video below is the work of UNH ECE seniors Amy Schwarzenberg and Kyle Maroney (both graduated in May). Amy and Kyle explored user interactions with a Microsoft Surface multi-touch table.


[1] John Krumm (editor), “Ubiquitous Computing Fundamentals,” CRC Press, 2010

[2] Aaron Quigley, “From GUI to UUI: Interfaces for Ubiquitous Computing,” in John Krumm (editor), “Ubiquitous Computing Fundamentals,” CRC Press, 2010

[3] Mark Weiser, “The Computer for the 21st Century,” Scientific American, pp. 94-104, September 1991

Return visit to Budapest University of Technology and Economics (BUTE)

On June 7 and 8, 2010 I visited the Budapest University of Technology and Economics (BUTE) for the second time in ten months. As during my last visit, I went to discuss the BUTE-CEPS exchange program.

During this visit I met six people who have been involved in organizing different aspects of the exchange program. My host was Eszter Kiss, the Program Director of the Information Center for Engineering Programs in English (ICEPE). For UNH/CEPS students, staff and faculty, she is the Hungarian face of the exchange program. Eszter and I primarily talked about the fact that, starting in 2011, UNH ECE exchange students will spend the spring semester in Budapest. Other CEPS students will remain on the fall-in-Budapest schedule.

Eszter organized two meetings for me with BUTE leaders. The first one was with Dr. Peter Moson, Vice-Rector for International Relations (the Vice-Rector position at BUTE is equivalent to the Vice President position at a US university). Ildiko Varga, the head of the BUTE Erasmus and Exchange Office, was also present at this meeting. Dr. Moson expressed his full support for a vibrant relationship between BUTE and CEPS. On a personal note, it was great to see Dr. Moson, whom I met during his visit to UNH last year. It was also nice to talk to Ms. Varga, who went to graduate school and taught mathematics at Purdue.

The second meeting organized by Eszter was with Dr. Gabor Stepan. Dr. Stepan, a member of the Hungarian Academy of Sciences (HAS), is the Dean of the BUTE Faculty of Mechanical Engineering, the ICEPE’s parent unit. Dr. Stepan expressed his full support for the BUTE-CEPS exchange program. Again on a personal note, it was exciting for me to visit the BUTE Faculty of ME where my father received his BS ME a long time ago. Dr. Stepan also spent some time telling me about BUTE’s history, including facts and anecdotes about BUTE’s Nobel-prize winning alumni.

While the meetings with Drs. Stepan and Moson and with Ms. Varga primarily dealt with the overall BUTE-CEPS relationship, I also had a chance to work on issues related to UNH ECE directly with the BUTE unit that hosts ECE students. Specifically, Dr. Moson introduced me to Dr. Balint Kiss, the person in charge of English-language education at the Faculty of Electrical Engineering and Informatics. This is the BUTE unit that hosts UNH ECE exchange students, and Dr. Kiss will be my primary contact in determining courses for our students to take while at BUTE. The meeting with Dr. Kiss was also an opportunity to catch up with Dr. Peter Arato. Dr. Arato, who is also a HAS member, has strong ties to the UNH ECE department, having collaborated extensively with UNH ECE professor Andrzej Rucinski.

In addition to all these productive meetings I had a chance to give a talk to BUTE students interested in the exchange program. Seven prospective students attended, several of them interested in coming to the UNH ECE department – I hope we’ll see them here soon.

I would like to thank Eszter Kiss for organizing my visit (on very short notice). I would also like to thank the BUTE faculty, staff and students who took time to meet with me. Finally, I would like to acknowledge the UNH ECE Department and the CEPS Dean’s office who jointly funded this visit.

For pictures from my trips to Budapest, visit my Flickr page.

Visit to FTW, Vienna

On June 4, 2010 I visited the Telecommunications Research Center Vienna (FTW). My host was Peter Froehlich, Senior Researcher in FTW’s User-Centered Interaction area of activity. Peter and I met at the CHI SIG meeting on automotive user interfaces [1] that I helped organize.

Peter and his colleagues are investigating automotive navigation aids and are currently preparing for an on-road study. I’m happy to report that this study will utilize one of our eye trackers. My visit provided an opportunity for us to discuss this upcoming study and how the eye tracker may be useful in evaluating the research hypotheses. Part of this discussion was a Telecommunications Forum talk I gave – see the slides below:

I want to thank Peter and his colleagues at FTW for hosting me and I’m looking forward to our upcoming collaboration. I also want to thank FTW for providing funding for my visit.


[1] Albrecht Schmidt, Anind L. Dey, Andrew L. Kun, Wolfgang Spiessl, “Automotive User Interfaces: Human Computer Interaction in the Car,” CHI 2010 Extended Abstracts

Albrecht Schmidt visit to UNH

Last month (April 16) Albrecht Schmidt visited UNH and the Project54 lab. Albrecht gave an excellent talk introducing some of the research problems in pervasive computing and specifically touching on the latest results from his lab, which were just published at CHI 2010 [1, 2]. I was especially interested in the work on helping users quickly find the last place of interest on a map. Albrecht and colleagues track the user’s gaze, and when the user looks away, they place a marker (or gazemark) on the map. When the user looks back at the map she can start where she left off: at the place of the marker. Clearly this could be very useful when looking at GPS maps in a car. In such a situation the driver has to keep going back and forth between the map and the road, and wants to minimize the time spent looking at the map (the road being the more important thing to look at!). The gazemarks introduced by Albrecht’s group may help. It would be interesting to conduct a driving simulator study with gazemarks.
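The gazemark mechanism as I described it above can be sketched in a few lines of code. This is purely an illustrative sketch of the idea; the class and method names are my own invention, not the implementation from the CHI 2010 paper:

```python
# Illustrative sketch of the gazemark idea (my own naming, hypothetical):
# while the user looks at the map we remember the latest fixation; when
# the gaze leaves the map we place a marker (the gazemark) at the last
# remembered position, so the user can resume there on looking back.

class GazemarkTracker:
    def __init__(self):
        self.last_fixation = None  # (x, y) of the most recent on-map fixation
        self.marker = None         # gazemark shown while the user looks away

    def update(self, gaze_on_map, fixation_xy=None):
        """Process one eye tracker sample; return the marker to display."""
        if gaze_on_map:
            self.last_fixation = fixation_xy
            self.marker = None     # no marker needed while the user is looking
        elif self.marker is None and self.last_fixation is not None:
            self.marker = self.last_fixation  # drop the gazemark on look-away
        return self.marker

tracker = GazemarkTracker()
tracker.update(True, (120, 45))   # user fixates the map
print(tracker.update(False))      # gaze moves to the road -> prints (120, 45)
```

In a real system the marker would presumably be removed (or faded out) shortly after the gaze returns to the map; the sketch above simply hides it as soon as the user looks back.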

After the talk Albrecht spent about an hour with students from the Project54 lab and those in my Ubicomp Fundamentals course. This was a more intimate setting for conversations about Albrecht’s research. Finally, Project54 staff and students spent a couple of hours discussing Project54 research with Albrecht – our work on handheld computers, on driving simulator-based investigations of in-car user interfaces and our budding efforts in multi-touch table interaction.

I am grateful to the UNH Provost’s Office for helping to fund Albrecht’s visit through a grant from the Class of 1954 Academic Enrichment Fund.


[1] Dagmar Kern, P. Marshall and Albrecht Schmidt, “Gazemarks: gaze-based visual placeholders to ease attention switching,” CHI 2010

[2] Alireza Sahami Shirazi, Ari-Heikki Sarjanoja, Florian Alt, Albrecht Schmidt, and Jonna Häkkilä, “Understanding the impact of abstracted audio preview of SMS,” CHI 2010

MERL gift

I’m happy to report that I received a gift grant in the amount of $5,000 from Mitsubishi Electric Research Laboratories (MERL). The gift is intended to support my work on speech user interfaces and it was awarded by Dr. Kent Wittenburg, Vice President & Director of MERL.

This gift comes in the context of ongoing interactions between researchers at MERL and my group at UNH. Kent and Bent Schmidt-Nielsen hosted me several years ago for a demonstration of the Project54 system (I drove to Boston in a police SUV, which was fun), and I also gave a talk at MERL last fall. In 2009 my PhD student Zeljko Medenica worked as a summer intern at MERL under the direction of Bret Harsham (Bret recently gave a talk at UNH on some of this work – see picture below). Zeljko is headed back to MERL this summer and he will work under the direction of Garrett Weinberg.

I greatly appreciate MERL’s generous gift and I plan to use it to help fund a graduate student working on speech user interfaces. I hope to report back to Kent, Bent, Bret and Garrett on the student’s progress by the end of this summer.

Project54 on front page of New York Times

In a front page article of the March 11, 2010 edition of the New York Times, Matt Richtel discusses in-vehicle electronic devices used by first responders. Based on a number of interviews, including one with me, Matt gets the point across that interactions with in-vehicle devices can distract first responders from the primary task for any driver: driving. The personal accounts from first responders are certainly gripping. Thanks, Matt, for bringing this issue to the public.

Enter Project54. According to Matt “[r]esearchers are working to reduce the risk.” He goes on to describe UNH’s Project54 system which allows officers to issue voice commands in order to interact with in-car electronic devices. This means officers can keep their eyes on the road and their hands on the wheel. The article includes praise for the Project54 system by Captain John G. LeLacheur of the New Hampshire State Police. The Project54 system was developed in partnership with the NHSP and almost every NHSP cruiser has the Project54 system installed.

Both the print and the online versions of the article begin with a picture of the Project54 in-car system. This great picture was taken by Sheryl Senter and it shows Sergeant Tom Dronsfield of the Lee, NH Police Department in action.

Alex Shyrokov defends PhD

Two weeks ago my student Alex Shyrokov defended his PhD dissertation. Alex was interested in human-computer interaction for cases when the human is engaged in a manual-visual task. In such situations a speech interface appears to be a natural way to communicate with a computer. Alex was especially interested in multi-threaded spoken HCI. In multi-threaded dialogues the conversants switch back and forth between multiple topics.

How should we design a speech interface that will support multi-threaded human-computer dialogues when the human is engaged in a manual-visual task? In order to begin answering this question Alex explored spoken dialogues between two human conversants. The hypothesis is that a successful HCI design can mimic some aspects of human-human interaction.

In Alex’s experiments one of the conversants (the driver) operated a simulated vehicle while the other (an assistant) was only engaged in the spoken dialogue. The conversants were engaged in an ongoing spoken task as well as an interrupting one. Alex’s dissertation discusses several interesting findings, one of which is that driving performance is worse during and after the interrupting task. Alex proposes that this is due to a shift in the driver’s attention away from driving and to the spoken tasks. The shift in turn is due to the perceived urgency of the spoken tasks – as the perceived urgency increases, the driver is more likely to shift her attention away from driving. The lesson for HCI design is to be very careful in managing the driver’s perceived urgency when interacting with devices in the car.

Alex benefited tremendously from the help of my collaborator on this research, Peter Heeman. Peter provided excellent guidance throughout Alex’s PhD studies, for which I am grateful. Peter and I plan to continue working with Alex’s data. The data includes transcribed dialogues, videos, and driving performance as well as eye tracker data. I am especially interested in using the eye tracker’s pupil diameter measurements to estimate cognitive load, as we have done in work led by Oskar Palinko [1].


[1] Oskar Palinko, Andrew L. Kun, Alexander Shyrokov, Peter Heeman, “Estimating Cognitive Load Using Remote Eye Tracking in a Driving Simulator,” ETRA 2010

Automotive user interfaces SIG meeting to be held at CHI 2010

There will be a special interest group (SIG) meeting on automotive user interfaces at CHI 2010. The lead author of the paper describing the aims of the SIG [1] is Albrecht Schmidt and the list of coauthors includes Anind Dey, Wolfgang Spiessl and me. CHI SIGs are 90 minute scheduled sessions during the conference. They are an opportunity for researchers with a common interest to meet face-to-face and engage in dialog.

Our SIG deals with human-computer interaction in the car. This is an exciting field of study that was the topic of a CHI 2008 SIG [2] as well as the AutomotiveUI 2009 conference [3], and the AutomotiveUI 2010 CFP will be posted very soon. In the last several years human-computer interaction in the car has increased for two main reasons. First, many cars now come equipped with myriad electronic devices such as displays indicating power usage and advanced driver assistance systems. Second, users (drivers and passengers) bring mobile devices into cars. The list of these brought-in mobile devices is long, but personal navigation devices and MP3 players are probably the most common ones.

At the SIG we hope to discuss user interface issues that are the result of having all of these devices in cars. Some of the questions are:

  • How can we reduce (or eliminate) driver distraction caused by the in-car devices?
  • Can driver interactions with in-car devices actually improve driving performance?
  • Can users take advantage of novel technologies, such as streaming videos from other cars?
  • How do we build interfaces that users can trust and will thus actually use?
  • How can car manufacturers, OEMs, brought-in device manufacturers and academia collaborate in envisioning, creating and implementing automotive user interfaces?

The 2008 CHI SIG [2] attracted over 60 people and we’re hoping for similar (or better!) turnout.


[1] Albrecht Schmidt, Anind L. Dey, Andrew L. Kun, Wolfgang Spiessl, “Automotive User Interfaces: Human Computer Interaction in the Car,” CHI 2010 Extended Abstracts (to appear)

[2] D. M. Krum, J. Faenger, B. Lathrop, J. Sison, A. Lien, “All roads lead to CHI: interaction in the automobile,” CHI 2008 Extended Abstracts

[3] Albrecht Schmidt, Anind Dey, Thomas Seder, Oskar Juhlin, “Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 2009”

Estimating cognitive load using pupillometry: paper accepted to ETRA 2010

Our short paper [1] on using changes in pupil diameter to estimate cognitive load was accepted to the Eye Tracking Research and Applications 2010 (ETRA 2010) conference. The lead author is Oskar Palinko and the co-authors are my PhD student Alex Shyrokov, my OHSU collaborator Peter Heeman and me.

In previous experiments in our lab we have concentrated on performance measures to evaluate the effects of secondary tasks on the driver. Secondary tasks are those performed in addition to driving, e.g. interacting with a personal navigation device. However, as Jackson Beatty has shown, when people’s cognitive load increases their pupils dilate [2]. This fascinating phenomenon provides a physiological measure of cognitive load. Why is it important to have multiple measures of cognitive load? As Christopher Wickens points out [3], this allows us to avoid circular arguments such as “… saying that a task interferes more because of its higher resource demand, and its resource demand is inferred to be higher because of its greater interference.”

In a driving simulator-based experiment conducted by Alex, we found that performance-based and pupillometry-based (that is, physiological) cognitive load measures show high correspondence for tasks that lasted tens of seconds. In other words, both driving performance measures and pupil size changes appear to track cognitive load changes. In the experiment the driver is involved in two spoken tasks in addition to the manual-visual task of driving. We hypothesize that different parts of these two spoken tasks present different levels of cognitive load for the driver. Our measurements of driving performance and pupil diameter changes appear to confirm the hypothesis. Additionally, we introduced a new pupillometry-based cognitive load measure that shows promise for tracking changes in cognitive load on time scales of several seconds.
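To give a feel for how a pupillometry-based index might work, here is a hypothetical sketch under my own simplifying assumptions (it is not the measure from our paper): smooth the pupil diameter samples with a moving average to suppress noise, then express them as percent change relative to a low-load baseline.

```python
# Hypothetical sketch (not the measure from the ETRA 2010 paper):
# a simple cognitive load index computed as the percent change of the
# smoothed pupil diameter relative to a low-load baseline diameter.

def moving_average(samples, window):
    """Trailing moving average over a list of floats."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        chunk = samples[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def pupil_load_index(diameters_mm, baseline_mm, window=5):
    """Percent change of smoothed pupil diameter relative to baseline.

    diameters_mm: pupil diameter samples from the eye tracker (mm)
    baseline_mm:  mean diameter during a low-load reference period (mm)
    """
    smoothed = moving_average(diameters_mm, window)
    return [100.0 * (d - baseline_mm) / baseline_mm for d in smoothed]

# Example: diameters drifting upward during a demanding spoken task
samples = [3.0, 3.0, 3.1, 3.2, 3.4, 3.5, 3.5]
index = pupil_load_index(samples, baseline_mm=3.0, window=3)
print([round(x, 1) for x in index])
# -> [0.0, 0.0, 1.1, 3.3, 7.8, 12.2, 15.6]
```

In practice the baseline, the smoothing window and the handling of blinks and lighting changes all matter a great deal; the sketch only illustrates the basic idea of a baseline-relative, smoothed dilation measure.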

In Alex’s experiment one of the spoken tasks required participants to ask and answer yes/no questions. We hypothesize that different phases of this task also present different levels of cognitive load to the driver. Will this be evident in driving performance and pupillometric data? We hope to find out soon!


[1] Oskar Palinko, Andrew L. Kun, Alexander Shyrokov, Peter Heeman, “Estimating Cognitive Load Using Remote Eye Tracking in a Driving Simulator,” ETRA 2010

[2] Jackson Beatty, “Task-evoked pupillary responses, processing load, and the structure of processing resources,” Psychological Bulletin. Vol. 91(2), Mar 1982, 276-292

[3] Christopher D. Wickens, “Multiple resources and performance prediction,” Theoretical Issues in Ergonomic Science, 2002, Vol. 3, No. 2, 159-177

Promoting the CEPS-BUTE Exchange Program

In an effort to promote the CEPS-BUTE exchange program I gave the following presentation to two similar audiences here at UNH. Last Monday Kent Chamberlin hosted me in his ECE 401 class (the introductory ECE course) and I had a chance to talk to about 75 ECE freshmen. Today I gave the presentation to Bob Henry’s TECH 400 students (TECH 400 introduces the CEPS majors to CEPS undeclared students).


My main point was this: spending a semester abroad gives students a competitive advantage because it proves that they can adapt to change. Of course, spending a semester in Europe also allows students to travel, and I spent some time promoting my favorite travel guide author, Rick Steves 🙂

Associate Professor, Electrical and Computer Engineering, University of New Hampshire