On Thursday I participated in a Phase II National Science Foundation Small Business Innovation Research (NSF SBIR) panel. While I’ve been to Phase I panels before, this was my first Phase II panel. In Phase I, companies can request up to $150,000 for six months to a year. A company that receives a Phase I award and successfully delivers on its grant is eligible to compete in Phase II with a proposal for up to $500,000 over two years.
The one thing that always strikes me at SBIR panels is that proposals have to make a good business case. Panels include both technical experts and business experts, and a proposal has to clear the bar with both groups in order to be recommended for funding. I’ve always taken it for granted that an NSF proposal (SBIR or scientific) should make a good argument for why the technology or scientific innovation is worth funding. However, before my involvement in the SBIR review process, I didn’t think much about the business case to be made when requesting funding for a business venture. In this respect I’m hardly alone: engineers usually don’t spend much time exploring the business side of running a business. At the UNH ECE department we’re looking into alleviating this problem by involving Brad Gillespie in our senior project courses. Brad is a UNH ECE alumnus, Microsoft veteran and business strategy consultant. Read about Brad’s last visit to UNH ECE and check back for more on this in a future post.
So, if you’re a technical person planning to submit an SBIR proposal (note that many federal agencies run SBIR programs, not just the NSF), my advice is this: bring in people who can help you think through (and coherently present in the proposal) a business plan for your venture. Without a compelling business plan your proposal will not be funded.
On Wednesday and Thursday, Oskar Palinko, Mark Taipan and I participated in the NIJ CommTech Technical Working Group meeting. On Wednesday I gave the presentation below reporting on our lab’s progress.
On Thursday we participated in the meeting’s demo session. Using a single-computer driving simulator and a radio setup, we demonstrated the advantage of controlling a police radio with voice commands over using the radio’s buttons. The first driving simulator experiment we published investigated this effect. We also demonstrated accessing a remote database using the Project54 system running on a Symbol handheld computer. We expect that, once we get approval from the NH State Police to deploy such devices (NHSP is responsible for data access for all officers in the state), they will be a big hit with local departments.
One of the many people we had a chance to talk to at the TWG meeting is Gil Emery, Communications Manager at the Portsmouth, NH PD. Gil was interested in the handhelds and we may be able to work with him on using these handhelds as cameras that allow tagging pictures on the spot and then using a cellular network to transmit them to headquarters. This work would build on Michael Farrar’s MS thesis research.
Our group presented two posters at last week’s Ubicomp 2009. Oskar Palinko and Michael Litchfield were on hand to talk about our multitouch table effort (a great deal of the work for this poster was done by Ankit Singh). Zeljko Medenica introduced a driving simulator pilot, work done in collaboration with Tim Paek, that deals with using augmented reality for the user interface of a navigation device.
Oskar (center) and Mike (right)
Oskar, Mike and I are working on expanding the multitouch study. We plan to start with an online study in which subjects will watch two videos, one in which a story is presented using the multitouch table and another with the same story presented using a simple slide show. Zeljko will head up the follow-on to the pilot study – take a look at the video below to see (roughly) what we’re planning to do.
Take a look at other pictures I took at Ubicomp 2009 on Flickr.
After my trip to Automotive UI 2009 I flew to Budapest, Hungary. The UNH College of Engineering and Physical Sciences has an exchange program with BUTE and I went to promote this program to BUTE students. I also got a chance to meet two people responsible for implementing the program “on the ground” in Budapest, Eszter Kiss and Máté Helfrich. Eszter is the person who looks after the UNH students (and many others from all over the world) from the time they arrive in Budapest, so I was very happy to meet her and express UNH’s gratitude for all of her efforts.
Eszter organized a talk in which I presented some of the reasons why a semester at UNH would be beneficial to BUTE students (see the slides). The discussion that followed my presentation was excellent, with students asking questions about many aspects of the exchange program, as well as a new summer internship program. The discussion was in Hungarian, which was fun, as I don’t use this language for work very much 🙂
You can see more pictures from my visit on Flickr.
Last Monday and Tuesday I was in Essen, Germany, at the Automotive User Interfaces 2009 conference. This was the first Automotive UI conference and it was quite successful with around 60 participants, according to conference chair Albrecht Schmidt. Here’s Albrecht welcoming us to AutoUI ’09 and the University of Duisburg-Essen:
I gave a talk at the conference about our latest navigation study that investigated the influence of two personal navigation devices on driving performance and visual attention. This was collaborative work with Tim Paek of Microsoft Research. For more information on our findings check out the paper or take a look at the slides:
As part of my visit I saw the MERL driving simulator, which is an excellent adaptation of a computer game for research purposes (read more about it in Garrett and Bret’s Automotive UI 2009 paper). I really like the driving courses they can use (e.g. winding mountain roads and narrow village streets), and I’m impressed with the performance of the simulator’s chair, which shakes and tilts.
After the simulator tour I gave a talk on our latest navigation study, which compared driving performance and visual attention when using two personal navigation aids: one that displays a map and provides spoken instructions and another that provides spoken instructions only. The talk was based on our Automotive UI 2009 paper.
Finally, I had a chance to talk to Fatih Porikli, who showed me some great videos of his work on recognizing pedestrians. We also discussed possible collaboration on learning grammars for using voice commands to tag photos. More about this in another post.
Associate Professor, Electrical and Computer Engineering, University of New Hampshire