Category Archives: senior project

2015 senior project ideas

Are you a UNH junior looking for an exciting senior project? Are you interested in driving research and/or eye tracking research? Would you like to publish your work at a conference (three recent senior projects resulted in publications: pupil diameter, navigation, driver authentication)? Would you like to design new interaction techniques, such as an LED-based augmented reality navigation aid?

If so, here is a list of ideas for 2015 senior projects:

  1. Collision warning systems. Collision warning systems issue auditory, visual, or multimodal warnings when a collision is imminent. But do drivers pay attention to these warnings? Do these systems reduce braking reaction time? These are some of the questions the senior project team will explore through driving simulator-based studies (one way to compute braking reaction time from simulator logs is sketched after this list).
  2. Intelligent agent controller for automated vehicle. Automated vehicles are of great interest to the automotive industry. The senior project team will develop an intelligent agent to control a simulated vehicle. In future work the intelligent agent will be used in exploring HCI issues related to automated driving.
  3. Intelligent human-computer interaction that supports reengagement in driving. A central question in automated driving is: how will the driver reengage in the driving task once the automation needs assistance? The senior project team will design strategies for alerting the driver, as well as methods to evaluate how fully the driver has reengaged in the driving task.
  4. Using Apple Siri while driving. With the support of Apple engineers we are setting up Siri in our driving simulator. The senior project team will design experiments to assess the safety of interacting with Siri while driving.
  5. Eye tracking for early detection of Alzheimer’s. Alzheimer’s disease is devastating. Early detection of the disease, and a subsequent early intervention, might improve the odds of successful treatment. The senior project team will explore the use of eye behavior and pupil diameter as measures for early detection.
  6. Comparing Prezi and slides. Prezi presentations are exciting. The senior project team will explore the strengths and weaknesses of this presentation style compared to traditional slide presentations.
  7. Your ideas. Do you have a senior project idea in the general areas of driving and eye tracking? Let us know – send an email to Andrew Kun.
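
For idea 1, here is a minimal Python sketch of how braking reaction time could be computed from driving simulator logs. The log layout (parallel arrays of timestamps and brake pedal positions) and the pedal threshold are assumptions for illustration, not our simulator’s actual format.

```python
import numpy as np

def braking_reaction_time(timestamps, brake_pedal, warning_time, threshold=0.1):
    """Return the time from warning onset to the first brake press, in seconds.

    timestamps   -- log timestamps (seconds)
    brake_pedal  -- brake pedal positions (0 = released, 1 = fully pressed)
    warning_time -- timestamp at which the collision warning was issued
    threshold    -- pedal position counted as the start of braking (assumed)
    """
    timestamps = np.asarray(timestamps)
    brake_pedal = np.asarray(brake_pedal)

    # Only consider samples logged after the warning was issued
    pressed = (timestamps >= warning_time) & (brake_pedal > threshold)
    if not pressed.any():
        return None  # the driver never braked in the logged window
    return timestamps[pressed][0] - warning_time

# Made-up log: warning at t = 5.0 s, driver starts braking at t = 5.8 s
t = np.arange(0, 10, 0.1)
brake = np.where(t >= 5.8, 0.6, 0.0)
print(braking_reaction_time(t, brake, warning_time=5.0))  # roughly 0.8
```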


2012-2013 senior project ideas

Here is a list of senior project ideas I would be very interested in working on. I’m looking for teams or individuals from the ECE and CS departments.

  1. Augmented reality (AR) on the cheap. Work in our lab has shown that in-vehicle AR navigation aids can effectively guide drivers, that they do not distract, and that drivers like them [1] – see video below. However, they are expensive to make. In this senior project students will build a device that augments the speech output of a personal navigation device/app (e.g. Google Navigation) with LED displays indicating upcoming turns (a rough sketch of this idea follows the list). The device will be tested in driving simulator experiments.

  2. Instrumented steering wheel. Today’s vehicles have myriad buttons, many on the steering wheel [2]. This project will build on our work with a push-to-talk glove [3, 4] to explore how drivers could interact with in-vehicle devices by tapping the steering wheel. Additionally, sensors on the steering wheel will provide information about the driver’s state (e.g. a stressed driver might squeeze the steering wheel much harder than a relaxed driver). Multiple driving simulator experiments will validate the design of the instrumented steering wheel.
  3. Video call. Work in our lab has shown that video calling can be a real distraction from driving [5] – see video below. This project will explore how different topics of conversation (e.g. playing word games vs. arguing), different relationships between conversants (e.g. friends vs. strangers), and different driving conditions (e.g. city vs. highway) influence drivers’ ability to operate a simulated vehicle while video calling.
  4. Tangible user interfaces that support exploring large, time-sequence data sets. The Environmental Response Management Application (ERMA) is a web-based data visualization application. It visualizes geo-coded time series without requiring users to know how to access specialized databases or how to overlay data from these databases on virtual maps. ERMA was developed at UNH under the guidance of the Coastal Response Research Center (CRRC). Building on the experiences of Nancy Kinner, co-director of the CRRC, she and I are interested in exploring how a tangible user interface utilizing a multi-touch table could be used to access and manipulate geo-coded time series. In this project students will develop a user interface on a multi-touch table. The interface will allow a human operator to access remote databases, manipulate the data (e.g. by sending it to Matlab for processing), and display the results on a virtual map or a graph.
  5. Tangible user interfaces for children. How can we entertain and teach kids using technologies such as the Microsoft Surface and tangible interfaces? Students working on this project would seek opportunities to collaborate with other researchers on the UNH campus to further explore this question.
  6. Your ideas related to user interfaces in vehicles and on multi-touch tables. Do you have an idea you’d like to explore? Tell me more about it!
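
For idea 1, here is a hypothetical Python sketch of the glue logic between a navigation app and the LED display. It assumes the turn instructions can be intercepted as text and that the LEDs are driven by a microcontroller listening on a serial port; the port name, baud rate, and single-character protocol are all invented for illustration.

```python
import serial  # pyserial

# Assumed single-character protocol understood by the LED controller firmware
LED_COMMANDS = {
    "turn left": b"L",
    "turn right": b"R",
    "continue straight": b"S",
}

def forward_instruction(port, instruction):
    """Translate a navigation instruction into an LED command."""
    instruction = instruction.lower()
    for phrase, command in LED_COMMANDS.items():
        if phrase in instruction:
            port.write(command)
            return
    port.write(b"0")  # unrecognized instruction: turn the LEDs off

# Placeholder port name and baud rate for whatever hardware we end up using
with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as port:
    forward_instruction(port, "In 200 feet, turn left onto Main Street")
```
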
If this sounds interesting, send me an email and let’s talk.

References

[1] Zeljko Medenica, Andrew L. Kun, Tim Paek, Oskar Palinko, “Augmented Reality vs. Street Views: A Driving Simulator Study Comparing Two Emerging Navigation Aids,” MobileHCI 2011

[2] Dagmar Kern, Albrecht Schmidt, “Design Space for Driver-based Automotive User Interfaces,” AutomotiveUI 2009

[3] Oskar Palinko, Andrew L. Kun, “Prototype Wireless Push-to-Talk Glove,” IET 2008

[4] Oskar Palinko, Andrew L. Kun, “Comparison of the Effects of Two Push-to-Talk Button Implementations on Driver Hand Position and Visual Attention,” Driving Assessment 2009

[5] Andrew L. Kun, Zeljko Medenica, “Video Call, or Not, that is the Question,” to appear in CHI ’12 Extended Abstracts

2011 opportunity for UNH CS students: multi-touch surface manipulation of geo-coded time series

When I think back to the recent BP oil spill in the Gulf of Mexico, the images that come to mind are of wildlife affected on beaches, idle fishing vessels, and a massive response that involved thousands of people across multiple states.

How can such a massive response be managed? There is no single answer. However, one thing that can help is to make data about various aspects of the disaster, as well as the response effort, accessible to those conducting the response activities. This is the role of the Environmental Response Management Application (ERMA). ERMA is a web-based data visualization application. It visualizes geo-coded time series without requiring users to know how to access specialized databases or how to overlay data from these databases on virtual maps. ERMA was developed at UNH, under the guidance of the Coastal Response Research Center (CRRC).

Nancy Kinner is the co-director of the UNH Coastal Response Research Center. Building on Nancy’s experiences with ERMA, she and I are interested in exploring how a multi-touch table could be used to access and manipulate geo-coded time series.

Seeking UNH CS student

To further our effort, we are seeking a UNH CS student interested in developing a user interface on a multi-touch table. The interface would allow a human operator to access remote databases, manipulate the data (e.g. by sending it to Matlab for processing), and display the results on a virtual map or a graph. This work will be part of a team effort, with two students working with Nancy on identifying data and manipulations of interest.

What should the user interface do?

The operator should be able to select data, e.g. from a website such as ERMA. Data types of interest include outputs from various sensors (temperature, pressure, accelerometers, etc.). Data manipulation will require some simple processing, such as setting beginning and end points for sensor readings. It will also require more complex processing of data, e.g. filtering.
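
As a rough illustration of the simple and more complex manipulations described above, here is a Python/NumPy sketch. The data layout (parallel arrays of timestamps and readings) and the choice of a moving-average filter are assumptions made for the example, not part of ERMA.

```python
import numpy as np

def trim(timestamps, readings, start, end):
    """Keep only the readings between the selected beginning and end points."""
    timestamps = np.asarray(timestamps)
    readings = np.asarray(readings)
    keep = (timestamps >= start) & (timestamps <= end)
    return timestamps[keep], readings[keep]

def moving_average(readings, window=5):
    """Smooth a sensor trace with a simple moving-average filter."""
    kernel = np.ones(window) / window
    return np.convolve(readings, kernel, mode="same")

# Made-up temperature trace: one reading per minute for two hours
t = np.arange(120)
temp = 20 + 0.5 * np.sin(t / 10) + np.random.normal(0, 0.2, t.size)
t_sel, temp_sel = trim(t, temp, start=30, end=90)
smoothed = moving_average(temp_sel)
```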

What platform will be used?

The project will leverage Project54’s Microsoft Surface multi-touch table. Here is a video by UNH ECE graduate student Tim April introducing some of the interactions he has explored with the Surface.

What are the terms of this job?

We are interested in hiring an undergraduate or graduate UNH CS student for the 2011-2012 academic year, with the possibility of extending the appointment for the summer of 2012 and beyond, pending satisfactory performance and the availability of funding. The student will work up to 20 hours/week during the academic year and up to 40 hours a week during the summer break.

What are the required skills? And what new skills will I acquire?

Work on this team project will require the object-oriented programming that is necessary to control the multi-touch table. You will explore the application of these skills to the design of surface user interfaces as well as to experiments with human subjects – after all, we will have to systematically test your creation! Finally, you will interact with students and faculty from at least two other disciplines (civil/environmental and electrical/computer engineering), which means you will gain valuable experience working on multi-disciplinary teams.

Interested? Have questions, ideas, suggestions?
Email me.

2011 opportunities for UNH CS students: multi-touch surface interaction

I am seeking UNH CS students (individuals or teams) interested in developing a user interface on a multi-touch table. The interface would allow a human operator to control a fleet of unmanned aerial vehicles (UAVs). This project will be part of a collaborative effort with WPI on creating a fleet of UAVs. Students at WPI will focus on building the UAVs. Students at UNH will work on communication issues (with Professor Nicholas Kirsch) and on user interface issues (with me).

What should the user interface do?

The operator should be able to view and manipulate data sent out by the UAV fleet. Data types of interest include images, video, sounds and outputs from various sensors (temperature, pressure, accelerometers, etc.). Data manipulation will require some simple processing, such as setting beginning and end points for sounds, zooming images, etc. It will also require more complex processing of data, e.g. filtering.

What are the data sources?

Eventually, the data will come from UAVs. However, as a first step, data will be generated through games, similarly to work done by Jatin Matani and Trupti Telang. Thus, we might utilize cell phones to get images, webcams to get video, and Arduino boards to generate sensor data (e.g. temperature).
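
As one concrete illustration, here is a hedged Python sketch that collects temperature readings from an Arduino over a serial connection. The port name and the one-value-per-line output format are assumptions about how we would program the board, not an existing setup.

```python
import serial  # pyserial

def read_temperatures(port_name="/dev/ttyACM0", baud=9600, count=10):
    """Read `count` temperature readings from a board that prints one value per line."""
    readings = []
    with serial.Serial(port_name, baud, timeout=2) as port:
        while len(readings) < count:
            line = port.readline().decode("ascii", errors="ignore").strip()
            if not line:
                continue  # timed out or blank line; keep waiting
            try:
                readings.append(float(line))
            except ValueError:
                pass  # skip garbled lines
    return readings

print(read_temperatures())
```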

What platform will be used?

The project will leverage Project54’s Microsoft Surface multi-touch table. Here is a video by UNH ECE graduate student Tim April introducing some of the interactions he has explored with the Surface.

Is this a job, a project, or something else?

CS students would be able to use this effort as a senior project (details to be worked out with appropriate CS faculty). An independent study might also be a possibility. Finally, I am interested in hiring students for academic year and/or summer jobs.

Can CS and ECE students collaborate?

Collaboration is not a requirement. However, some aspects of this work might benefit from the involvement of one or more UNH ECE students. For example, ECE students can work on some of the data processing aspects of the project, as well as on creating data sources (e.g. deploying wireless sensor networks). I am actively recruiting ECE students for multi-touch projects and you are welcome to talk to your friends in ECE.

What are the required skills? And what new skills will I acquire?

For CS students, work on this project will require the object-oriented programming that is necessary to control the multi-touch table. You will explore the application of these skills to the design of surface user interfaces as well as to experiments with human subjects – after all, we will have to systematically test your creation!

Interested? Have questions, ideas, suggestions?
Email me.

2011 Senior Project topics: multi-touch surface interaction

I am seeking students (individuals or teams) for two senior projects. Both projects would leverage a multi-touch surface to create a natural user interface for pervasive computing applications.

Pervasive computing problems and ideas are often introduced using videos. An excellent example is the Microsoft Health Future Vision video (download, watch on YouTube).

Let’s focus on three themes from the video that are relevant to the senior projects: interactions with multi-touch interfaces, interactions with tangible user interfaces, and data manipulation/fusion. Multi-touch surfaces appear throughout the video: in Sabine’s home, in the doctor’s office, and in the hospital lobby. Several of the multi-touch interfaces, such as Sabine’s remote control, and her virtual wallet (used in the lobby), are tangible interfaces. Finally, Dr. Kemp manipulates/fuses data when interacting with Alex (patient in bed) and especially during the meeting with Sabine and Wei Yu.

The two senior projects will leverage Project54’s Microsoft Surface multi-touch table. Here is a video by UNH ECE graduate student Tim April introducing some of the interactions he has explored with the Surface.

With all this in mind, here are the specifics on the two proposed projects.

Project 1: Mobile data fusion

This project will explore fusing data, such as images, video, sounds and outputs from various sensors (temperature, pressure, accelerometers, etc.). Data fusion will require some simple processing, such as setting beginning and end points for sounds, zooming images, etc. It will also require more complex digital signal processing of data, e.g. windowing and filtering (topics covered in ECE 714). Consequently, work on this project will focus on data processing as well as object-oriented programming that is necessary to control the multi-touch table.
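
As a rough sketch of the windowing and filtering steps (written in Python for brevity, rather than in the code that will actually drive the Surface), here is what the signal processing side might look like; the sampling rate, cutoff frequency, and signal are placeholders.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def windowed_spectrum(signal):
    """Apply a Hann window and return the magnitude spectrum."""
    windowed = np.asarray(signal) * np.hanning(len(signal))
    return np.abs(np.fft.rfft(windowed))

def lowpass(signal, cutoff_hz, sample_rate_hz, order=4):
    """Zero-phase low-pass filtering of a sensor trace."""
    b, a = butter(order, cutoff_hz / (sample_rate_hz / 2))
    return filtfilt(b, a, signal)

# Made-up accelerometer trace sampled at 100 Hz: 2 Hz motion plus noise
fs = 100
t = np.arange(0, 5, 1 / fs)
accel = np.sin(2 * np.pi * 2 * t) + 0.3 * np.random.randn(t.size)
spectrum = windowed_spectrum(accel)
smoothed = lowpass(accel, cutoff_hz=5, sample_rate_hz=fs)
```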

This project will be tied to a collaborative effort with WPI on creating a fleet of UAVs. Thus, eventually, the data to process and display on the multi-touch table will come from the UAVs. However, as a first step, data will be generated through games, similarly to work done by Jatin Matani and Trupti Telang.

Project 2: IR wallet

The Microsoft Surface uses infrared illumination and cameras to recognize interactions with its surface. It can also recognize 2D barcodes if they are visible in the IR part of the spectrum. The “IR wallet” project would result in a tangible user interface, similar to Sabine’s virtual wallet, that can display 2D barcodes in IR. These in turn will be picked up by the Microsoft Surface. Work on this project will focus on microcontroller-based design (e.g. with an Arduino board) and object-oriented programming for the Surface.

Interested? Have questions, ideas, suggestions? Email me.