
Zeljko Medenica defends dissertation

Last November Zeljko Medenica defended his dissertation [1]. Zeljko explored new performance measures that can be used to characterize interactions with in-vehicle devices. The impetus came from our research on personal navigation devices. Specifically, in a study published in 2009 [2] we found fairly large differences in the time drivers spend looking at the road ahead (more for voice-only turn-by-turn directions, less when a map is also displayed). However, commonly used driving performance measures (the variance of lane position and of steering wheel angle) did not indicate differences between these conditions. We suspected that driving might still be affected, and Zeljko’s work confirms this hypothesis.

Zeljko is now with Nuance, working with Garrett Weinberg. The two collaborated during Zeljko’s internships at MERL in 2009 and 2010, where Garrett worked before joining Nuance.

I would like to thank Zeljko’s committee for all of their contributions: Paul Green, Tim Paek, Tom Miller, and Nicholas Kirsch. Below is a photo of all of us after the defense. See more photos on Flickr.

Tim Paek (left), Zeljko Medenica, Andrew Kun, Tom Miller, Nicholas Kirsch, and Paul Green (on the laptop)


References

[1] Zeljko Medenica, “Cross-Correlation Based Performance Measures for Characterizing the Influence of In-Vehicle Interfaces on Driving and Cognitive Workload,” Doctoral Dissertation, University of New Hampshire, 2012

[2] Andrew L. Kun, Tim Paek, Zeljko Medenica, Nemanja Memarovic, Oskar Palinko, “Glancing at Personal Navigation Devices Can Affect Driving: Experimental Results and Design Implications,” Automotive UI 2009

Zeljko Medenica advances to candidacy

Last week my PhD student Zeljko Medenica advanced to candidacy. Zeljko plans to create a driving performance measure that is sensitive to short-lived and/or infrequent degradations in driving. In previous driving simulator-based studies [1, 2] we found that glancing away from the road is correlated with worse driving performance. Importantly, this is true even when performance averages over the length of the entire experiment are unaffected. Thus, Zeljko plans to explore the use of cross-correlation in creating a new, highly sensitive driving performance measure.
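The dissertation itself will define the actual measure; purely as an illustration of why a windowed cross-correlation can catch what whole-experiment averages miss, here is a minimal Python sketch. The signal names, window lengths and normalization below are my assumptions for the example, not definitions from Zeljko’s work.

```python
import numpy as np

def windowed_xcorr(glance, lane_pos, fs, win_s=5.0, step_s=1.0):
    """Peak normalized cross-correlation between a glance-away signal
    and lane position, computed over short sliding windows so that
    brief, localized degradations are not averaged away.

    glance   -- 1-D array, e.g. 1.0 while the driver looks away, else 0.0
    lane_pos -- 1-D array of lane position samples (same sampling rate)
    fs       -- sampling rate in Hz
    """
    win, step = int(win_s * fs), int(step_s * fs)
    peaks = []
    for start in range(0, len(glance) - win, step):
        g = glance[start:start + win] - glance[start:start + win].mean()
        p = lane_pos[start:start + win] - lane_pos[start:start + win].mean()
        denom = np.sqrt(np.sum(g ** 2) * np.sum(p ** 2))
        if denom == 0:  # no glances, or perfectly steady lane keeping
            peaks.append(0.0)
            continue
        xc = np.correlate(g, p, mode="full") / denom
        peaks.append(np.max(np.abs(xc)))
    return np.array(peaks)
```

A window in which glances and lane deviations co-occur produces a correlation peak even if lane position variance over the whole drive looks normal – exactly the kind of short-lived event an averaged measure hides.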

Zeljko’s PhD committee includes Paul Green (UMTRI), Tim Paek (Microsoft Research), Nicholas Kirsch (UNH) and Tom Miller (UNH). Thanks to all for serving!

References

[1] Andrew L. Kun, Tim Paek, Zeljko Medenica, Nemanja Memarovic, Oskar Palinko, “Glancing at Personal Navigation Devices Can Affect Driving: Experimental Results and Design Implications,” Automotive UI 2009

[2] Zeljko Medenica, Andrew L. Kun, Tim Paek, Oskar Palinko, “Augmented Reality vs. Street Views: A Driving Simulator Study Comparing Two Emerging Navigation Aids,” to appear at MobileHCI 2011

2011 opportunities for UNH CS students: multi-touch surface interaction

I am seeking UNH CS students (individuals or teams) interested in developing a user interface on a multi-touch table. The interface would allow a human operator to control a fleet of unmanned aerial vehicles (UAVs). This project will be part of a collaborative effort with WPI on creating a fleet of UAVs. Students at WPI will focus on building the UAVs. Students at UNH will work on communication issues (with Professor Nicholas Kirsch) and on user interface issues (with me).

What should the user interface do?

The operator should be able to view and manipulate data sent out by the UAV fleet. Data types of interest include images, video, sounds and outputs from various sensors (temperature, pressure, accelerometers, etc.). Data manipulation will require some simple processing, such as setting beginning and end points for sounds, zooming images, etc. It will also require more complex processing of data, e.g. filtering.
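As a concrete (if simplified) example of the “simple processing” above, here is a minimal Python sketch of trimming a sound to operator-selected begin and end points. The array-based audio representation and the sample rate are assumptions for illustration; the actual interface will be built on the Surface platform.

```python
import numpy as np

def trim_clip(samples: np.ndarray, fs: int,
              begin_s: float, end_s: float) -> np.ndarray:
    """Return the portion of an audio clip between two points the
    operator marked on the surface, given in seconds from clip start."""
    begin = max(0, int(begin_s * fs))
    end = min(len(samples), int(end_s * fs))
    return samples[begin:end]

# e.g. keep seconds 2.5 through 7.0 of a 16 kHz clip:
# snippet = trim_clip(clip, fs=16000, begin_s=2.5, end_s=7.0)
```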

What are the data sources?

Eventually, the data will come from UAVs. However, as a first step, data will be generated through games, similar to the work done by Jatin Matani and Trupti Telang. Thus, we might use cell phones to capture images, webcams to capture video, and Arduino boards to generate sensor data (e.g. temperature).

What platform will be used?

The project will leverage Project54’s Microsoft Surface multi-touch table. Here is a video by UNH ECE graduate student Tim April introducing some of the interactions he has explored with the Surface.

Is this a job, a project, or something else?

CS students would be able to use this effort as a senior project (details to be worked out with appropriate CS faculty). An independent study might also be a possibility. Finally, I am interested in hiring students for academic year and/or summer jobs.

Can CS and ECE students collaborate?

Collaboration is not a requirement. However, some aspects of this work might benefit from the involvement of one or more UNH ECE students. For example, ECE students could work on some of the data processing aspects of the project, as well as on creating data sources (e.g. deploying wireless sensor networks). I am actively recruiting ECE students for multi-touch projects, and you are welcome to talk to your friends in ECE.

What are the required skills? And what new skills will I acquire?

For CS students, work on this project will center on the object-oriented programming needed to control the multi-touch table. You will apply these skills to the design of surface user interfaces as well as to experiments with human subjects – after all, we will have to systematically test your creation!

Interested? Have questions, ideas, suggestions?
Email me.

2011 Senior Project topics: multi-touch surface interaction

I am seeking students (individuals or teams) for two senior projects. Both projects would leverage a multi-touch surface to create a natural user interface for pervasive computing applications.

Pervasive computing problems and ideas are often introduced using videos. An excellent example is the Microsoft Health Future Vision video (download, watch on YouTube).

Let’s focus on three themes from the video that are relevant to the senior projects: interactions with multi-touch interfaces, interactions with tangible user interfaces, and data manipulation/fusion. Multi-touch surfaces appear throughout the video: in Sabine’s home, in the doctor’s office, and in the hospital lobby. Several of the multi-touch interfaces, such as Sabine’s remote control, and her virtual wallet (used in the lobby), are tangible interfaces. Finally, Dr. Kemp manipulates/fuses data when interacting with Alex (patient in bed) and especially during the meeting with Sabine and Wei Yu.

The two senior projects will leverage Project54’s Microsoft Surface multi-touch table. Here is a video by UNH ECE graduate student Tim April introducing some of the interactions he has explored with the Surface.

With all this in mind, here are the specifics on the two proposed projects.

Project 1: Mobile data fusion

This project will explore fusing data, such as images, video, sounds and outputs from various sensors (temperature, pressure, accelerometers, etc.). Data fusion will require some simple processing, such as setting beginning and end points for sounds, zooming images, etc. It will also require more complex digital signal processing of data, e.g. windowing and filtering (topics covered in ECE 714). Consequently, work on this project will focus on data processing as well as on the object-oriented programming needed to control the multi-touch table.
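To give a flavor of the windowing and filtering involved, here is a minimal Python/SciPy sketch that smooths a noisy sensor stream with a windowed FIR low-pass filter. The sensor, sampling rate and cutoff are illustrative assumptions; the actual processing requirements will depend on the data the UAVs send.

```python
import numpy as np
from scipy import signal

def smooth_sensor_stream(samples, fs, cutoff_hz=1.0, numtaps=31):
    """Low-pass filter a noisy sensor stream (e.g. temperature readings
    from an Arduino board) before plotting it on the multi-touch table.

    The FIR taps come from a Hamming-windowed design -- the windowing
    plus filtering combination mentioned above.
    """
    taps = signal.firwin(numtaps, cutoff_hz, window="hamming", fs=fs)
    # filtfilt applies the filter forward and backward (zero phase),
    # so plotted features stay aligned in time with the raw data
    return signal.filtfilt(taps, [1.0], samples)

# e.g. a temperature stream sampled at 10 Hz, smoothed to below 0.5 Hz:
# smoothed = smooth_sensor_stream(raw_temps, fs=10.0, cutoff_hz=0.5)
```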

This project will be tied to a collaborative effort with WPI on creating a fleet of UAVs. Thus, eventually, the data to process and display on the multi-touch table will come from the UAVs. However, as a first step, data will be generated through games, similar to the work done by Jatin Matani and Trupti Telang.

Project 2: IR wallet

The Microsoft Surface uses infrared illumination and cameras to recognize interactions with its surface. It can also recognize 2D barcodes if they are visible in the IR part of the spectrum. The “IR wallet” project would result in a tangible user interface, similar to Sabine’s virtual wallet, that can display 2D barcodes in IR. These in turn will be picked up by the Microsoft Surface. Work on this project will focus on microcontroller-based design (e.g. with an Arduino board) and object-oriented programming for the Surface.

Interested? Have questions, ideas, suggestions? Email me.

MERL gift

I’m happy to report that I received a gift grant in the amount of $5,000 from Mitsubishi Electric Research Laboratories (MERL). The gift is intended to support my work on speech user interfaces and it was awarded by Dr. Kent Wittenburg, Vice President & Director of MERL.

This gift comes in the context of ongoing interactions between researchers at MERL and my group at UNH. Kent and Bent Schmidt-Nielsen hosted me several years ago for a demonstration of the Project54 system (I drove to Boston in a police SUV, which was fun), and I also gave a talk at MERL last fall. In 2009 my PhD student Zeljko Medenica worked as a summer intern at MERL under the direction of Bret Harsham (Bret recently gave a talk at UNH on some of this work – see picture below). Zeljko is headed back to MERL this summer and will work under the direction of Garrett Weinberg.

I greatly appreciate MERL’s generous gift and I plan to use it to help fund a graduate student working on speech user interfaces. I hope to report back to Kent, Bent, Bret and Garrett on the student’s progress by the end of this summer.

Project54 on front page of New York Times

In a front page article in the March 11, 2010 edition of the New York Times, Matt Richtel discusses in-vehicle electronic devices used by first responders. Based on a number of interviews, including one with me, Matt gets the point across that interactions with in-vehicle devices can distract first responders from the primary task of any driver: driving. The personal accounts from first responders are certainly gripping. Thanks, Matt, for bringing this issue to the public’s attention.

Enter Project54. According to Matt, “[r]esearchers are working to reduce the risk.” He goes on to describe UNH’s Project54 system, which allows officers to issue voice commands in order to interact with in-car electronic devices. This means officers can keep their eyes on the road and their hands on the wheel. The article includes praise for the Project54 system by Captain John G. LeLacheur of the New Hampshire State Police. The Project54 system was developed in partnership with the NHSP, and almost every NHSP cruiser has the system installed.

Both the print and the online versions of the article open with a picture of the Project54 in-car system. This great picture, taken by Sheryl Senter, shows Sergeant Tom Dronsfield of the Lee, NH Police Department in action.