2016–2017 Masters Student Research Projects

Apply now for Coursework Masters Research Projects in the Microsoft Research Centre for Social Natural User Interfaces (SocialNUI).

The projects listed below are available to enrolled University of Melbourne students undertaking research projects in Information Systems, Computer Science and Software Development within the following degrees:

  • Graduate Diploma of Computer Science
  • Master of Science (Computer Science)
  • Master of Information Technology
  • Master of Information Systems
  • Master of Engineering in Distributed Computing
  • Master of Software Systems Engineering
  • Master of Science (Information Systems)

Note: Projects are open only to current students enrolled at the University of Melbourne in the Department of Computing and Information Systems.

How to apply

If you are interested in any of these projects, please contact the listed project supervisor(s) with:

  • a current CV
  • University transcripts
  • a brief statement of why you are interested in undertaking the project.

Projects

  • Augmented Learning Environment for Physiotherapy Education

    Annotation for Spatial Augmented Reality System

    We aim to develop a spatially augmented learning environment using Microsoft Kinect and projectors. The system provides an innovative way for instructors and students to facilitate learning, especially in the area of physiotherapy education. We propose augmented feedback via live annotations (text and graphics) projected onto a moving human body. The project will develop software that enables annotation of the scene via a touchscreen tablet. This project will suit a student with an interest and background in augmented reality, human computer interaction, computer vision, and computer graphics, as well as touchscreen app development.
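
    As a rough illustration only (not part of the project brief), the sketch below shows how a text annotation might be kept attached to a tracked body joint by mapping the joint's camera-image position into projector coordinates. It assumes joint positions are already supplied by the tracking SDK and that a camera-projector homography has been computed offline; all names and values are placeholders.

    ```python
    import numpy as np
    import cv2

    def project_annotation(joint_xy, H, offset=(20, -20)):
        """Map a joint position (camera pixels) to projector pixels and
        offset the label so it sits beside the body part."""
        pt = np.array([[joint_xy]], dtype=np.float32)      # shape (1, 1, 2)
        proj = cv2.perspectiveTransform(pt, H)[0, 0]       # projector coordinates
        return int(proj[0] + offset[0]), int(proj[1] + offset[1])

    # Hypothetical calibration homography and a tracked elbow position.
    H = np.eye(3, dtype=np.float32)                        # placeholder calibration
    canvas = np.zeros((768, 1024, 3), dtype=np.uint8)      # projector frame buffer
    x, y = project_annotation((312.0, 240.5), H)
    cv2.putText(canvas, "Keep the elbow at 90 degrees", (x, y),
                cv2.FONT_HERSHEY_SIMPLEX, 0.7, (255, 255, 255), 2)
    ```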

    Supervisor: Thuong Hoang thuong.hoang@unimelb.edu.au

    Natural User Interfaces for Controlling Sensor-Enabled Spaces

    With the rise (and rise) of the Internet of Things, the question of how best to interface with these systems comes to the fore. If the spaces of the future are sensor-enabled on a large scale, how can users be empowered through natural user interfaces to use that information to change or modify their public and private spaces? In this project the student will develop a natural user interface for a rule-based system that allows a user to modify an environment (lights, screens) based on incoming sensor data. This project would likely interest students with a background in NUI, HCI, interface design, prototyping, physical computing and pervasive computing.
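
    As a sketch of the kind of rule-based core such an interface would drive (the sensors, actuators and thresholds below are illustrative assumptions, not specified by the project), each rule pairs a condition on incoming sensor readings with an action on an actuator; the natural user interface would let users create and edit these rules.

    ```python
    from dataclasses import dataclass
    from typing import Callable, Dict

    @dataclass
    class Rule:
        name: str
        condition: Callable[[Dict[str, float]], bool]   # evaluated on a sensor snapshot
        action: Callable[[], None]                      # fired when the condition holds

    def dim_lights():
        print("actuator: lights -> 20%")

    def wake_screen():
        print("actuator: screen -> on")

    rules = [
        Rule("dim when bright", lambda s: s.get("ambient_lux", 0) > 500, dim_lights),
        Rule("wake on presence", lambda s: s.get("presence", 0) > 0.5, wake_screen),
    ]

    def on_sensor_update(snapshot: Dict[str, float]) -> None:
        """Run every rule against the latest sensor snapshot."""
        for rule in rules:
            if rule.condition(snapshot):
                rule.action()

    # The NUI front end (gesture, voice, touch) would create and edit `rules`;
    # here we simply simulate one incoming reading.
    on_sensor_update({"ambient_lux": 620.0, "presence": 0.0})
    ```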

    Supervisors: Eduardo Velloso & Travis Cox eduardo.velloso@unimelb.edu.au

  • Interactive Digital Technology in the Zoo

    Human & Animal Computer Interaction at the Zoo

    Zoos are increasingly leveraging digital technologies to enhance animal monitoring and animal welfare, and to improve the visitor experience. In this project, the student will develop a novel technology that satisfies one of these aims, and evaluate it in the context of the zoo. The student will ideally build on the prior work we have conducted developing digital enrichment for Melbourne Zoo’s Orangutans. The project will interest students with a background in NUI & HCI, and an interest in animals and their wellbeing.

    Supervisor: Marcus Carter marcus.carter@unimelb.edu.au

  • Eye Gaze

    Implicit and Fun Eye Tracker Calibration

    We are now starting to see an increasing number of eye trackers available on the market. Even though they are more accurate and easier to use than ever, they still require a tedious calibration procedure, which involves staring at a sequence of points on the screen. This project will explore how to embed the calibration procedure into the application so that users do not even realise that the tracker is being calibrated. Possible strategies include building calibration mini-games, deriving where the user should be looking from their eye behaviour, or using probabilistic models based on the appearance of the application. This project will suit students with a strong programming background and an interest in Data Science, Machine Learning, and/or Human Computer Interaction.
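
    As a minimal sketch of one of the strategies above (purely illustrative, with fabricated sample data): collect pairs of raw tracker output and the point the user is presumed to be looking at during normal use, such as an object they just clicked in a mini-game, then fit an affine mapping from raw gaze to screen coordinates by least squares.

    ```python
    import numpy as np

    # (raw_x, raw_y) from the tracker and the (screen_x, screen_y) target the
    # user was presumably fixating (e.g. the object they just clicked or shot).
    raw = np.array([[0.11, 0.09], [0.52, 0.12], [0.90, 0.48], [0.49, 0.88], [0.13, 0.51]])
    target = np.array([[120, 80], [960, 110], [1750, 520], [940, 1000], [140, 560]])

    # Fit screen = [raw_x, raw_y, 1] @ A for each axis (affine model).
    design = np.hstack([raw, np.ones((len(raw), 1))])
    A, *_ = np.linalg.lstsq(design, target, rcond=None)

    def calibrate(raw_xy):
        """Map a raw gaze sample to screen coordinates with the fitted model."""
        return np.array([*raw_xy, 1.0]) @ A

    print(calibrate((0.5, 0.5)))   # estimated screen position for a new sample
    ```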

    Supervisor: Eduardo Velloso eduardo.velloso@unimelb.edu.au

  • SocialNUI in Immersive Gaming Environments

    Gesture in Games

    The Microsoft Kinect was released in 2010 and has enabled gesture-based interaction in a wide variety of games. We are interested in exploring the use of gesture in games, and its capacity to generate new playful experiences not present in other modalities. There is no comprehensive review or history of the use of gesture in games, nor a synthesis of the many studies of game-based gesture user experience. A minor thesis project could fill this gap. A larger project could then further involve developing a novel gesture-based game using the Microsoft Kinect. The project will interest students with a background in NUI & HCI, and an interest in game design.

    Supervisor: Marcus Carter marcus.carter@unimelb.edu.au

    Eye Tracking for Game User Research

    The eyes offer a powerful window into users’ cognitive processes. The goal of this project is to design an optimal methodology for evaluating games using eye tracking. Though there is some understanding of how to evaluate conventional systems using eye tracking data, little is known about how it can be useful for evaluating games. The project will suit students with strong data analysis skills who are interested in games and in conducting user studies. It will involve recording eye tracking data from novice and experienced players and analysing how the data can inform game design.

    Supervisor: Eduardo Velloso eduardo.velloso@unimelb.edu.au

    Eye Tracking as a Game Controller

    Modern eye trackers, such as the Tobii EyeX and the SteelSeries Sentry, are being marketed specifically to gamers. In this context, creating novel game mechanics and experiences that benefit from gaze information is a largely untapped gold mine. The goal of the project is to design and build these novel game experiences involving the players’ gaze. The project will suit students with strong programming skills (C# and Unity), who are interested in building games.

    Supervisor: Eduardo Velloso eduardo.velloso@unimelb.edu.au

  • Evaluation of NUI in Search Technology

    Towards conversational IR: navigating to a particular Wikipedia page one question at a time

    Imagine a user of a low-bandwidth interface (such as limited voice interaction, or selecting from a limited number of options on a smart phone where typing isn’t an option) wanting to navigate to a particular Wikipedia page to satisfy an existing information need. There are many interesting aspects to pursue: What input modalities should one consider? How can you define a minimal interface that is highly efficient? What is the quickest way to navigate to an individual page? How can we bias the system if we have context (such as a conversation that is concurrently being held, a search history, or a selection of recently written emails)? What data structures and algorithms are appropriate to facilitate such a search?

    This project would suit a student with an interest in search engine technologies and could lead to further academic work in this area (postgraduate studies) or employment opportunities in the search engine industry.
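
    One possible algorithmic starting point, sketched under assumed data (a handful of pages tagged with hand-made boolean attributes, which is not how real Wikipedia data looks), is a greedy twenty-questions strategy: repeatedly ask the yes/no question that splits the remaining candidate pages most evenly.

    ```python
    pages = {
        "Ada Lovelace":    {"person": True,  "living": False, "computing": True},
        "Melbourne":       {"person": False, "living": False, "computing": False},
        "Serena Williams": {"person": True,  "living": True,  "computing": False},
        "Tim Berners-Lee": {"person": True,  "living": True,  "computing": True},
    }

    def best_question(candidates):
        """Pick the attribute whose yes/no split over the candidates is most balanced."""
        attrs = next(iter(candidates.values())).keys()
        def imbalance(attr):
            yes = sum(1 for tags in candidates.values() if tags[attr])
            return abs(2 * yes - len(candidates))
        return min(attrs, key=imbalance)

    def narrow(candidates, answer_fn):
        """Ask questions until a single page remains; answer_fn returns True/False."""
        while len(candidates) > 1:
            q = best_question(candidates)
            ans = answer_fn(q)
            candidates = {p: t for p, t in candidates.items() if t[q] == ans}
        return next(iter(candidates))

    # Simulated user whose target page is "Tim Berners-Lee".
    target = pages["Tim Berners-Lee"]
    print(narrow(dict(pages), lambda q: target[q]))
    ```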

    Supervisor: Bodo Billerbeck bodob@microsoft.com

  • Augmented Fitness

    A Review of Quantified Self Systems for Sports and Fitness

    As wearable sensors become increasingly popular and inexpensive, the number of available systems on the market increases every day. Fitbit, Microsoft Band, Apple Watch: all offer some kind of fitness functionality. However, upon closer inspection, we see that these systems are very different, coming in various shapes and forms and offering very different types of functionality. The goal of this project is to understand the landscape of fitness trackers currently on the market, as well as how these systems relate to the existing academic literature. The project will suit students with an interest in Wearable Computing, Fitness, and/or Human Computer Interaction. It will require strong writing skills and the ability to synthesise concepts from the literature, and will involve substantial qualitative research.

    Supervisor: Eduardo Velloso eduardo.velloso@unimelb.edu.au

    Wearable Support for Weight Lifting Activities

    Though there are many fitness trackers available on the market nowadays, few of them aim at providing qualitative feedback for weights training. The goal of this project is to investigate different approaches for providing feedback to weight lifters. We will investigate how different feedback modalities (e.g. audio, smart watch, smart phone, web) can operate at different temporal (during the exercise, after each repetition, after each set, after the whole programme) and spatial (at the gym, on the go, at home) levels. The project will suit students with a strong programming background who are interested in building cross-device interactions (i.e. integrating several devices, such as watches, phones and laptops) and who are able to build functional and aesthetically pleasing interfaces.
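
    As one small, illustrative building block for per-repetition or per-set feedback (the signal and thresholds below are synthetic assumptions, not taken from the project): count repetitions in a set by peak-picking a single accelerometer axis, which could then drive feedback on a watch, phone or web dashboard.

    ```python
    import numpy as np
    from scipy.signal import find_peaks

    # Synthetic vertical-acceleration trace: 8 repetitions at ~0.5 Hz, 50 Hz sampling.
    fs = 50
    t = np.arange(0, 16, 1 / fs)
    accel = np.sin(2 * np.pi * 0.5 * t) + 0.1 * np.random.randn(len(t))

    # Each repetition produces one prominent peak; enforce a minimum spacing of
    # one second between peaks to suppress noise-induced double counts.
    peaks, _ = find_peaks(accel, prominence=0.5, distance=fs)
    reps = len(peaks)
    durations = np.diff(peaks) / fs                  # seconds per repetition

    print(f"counted {reps} repetitions")
    if durations.size and durations.std() > 0.3:     # illustrative threshold
        print("feedback: repetition tempo is uneven - try to keep a steady pace")
    ```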

    Supervisor: Eduardo Velloso eduardo.velloso@unimelb.edu.au

    Natural Language Processing for Human Motion Analysis

    For many activities, such as weight lifting and physiotherapy exercises, determining whether a performance is correct or incorrect is crucial for ensuring safe and optimal outcomes. However, specifying what a correct movement looks like is still a challenge. The goal of this project is to derive human motion models (e.g. how a weight lifting exercise should be executed in terms of the movement of the joints of the body) from a textual description (e.g. the exercise instructions in a weight lifting book). The project will suit students with a strong programming background who are interested in Machine Learning and Natural Language Processing.
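
    As a purely illustrative sketch of the kind of output such a model might target (the pattern below is hand-written; a real system would need proper NLP rather than a regular expression): turn a sentence of exercise instructions into simple joint-angle constraints.

    ```python
    import re

    PATTERN = re.compile(
        r"(?P<joint>knee|elbow|hip|shoulder)s?\s+(?:are\s+|is\s+)?(?:bent|flexed)\s+"
        r"(?:at|to)\s+(?P<angle>\d+)\s*degrees",
        re.IGNORECASE,
    )

    def extract_constraints(text):
        """Return (joint, target_angle_degrees) pairs found in instruction text."""
        return [(m.group("joint").lower(), int(m.group("angle")))
                for m in PATTERN.finditer(text)]

    instruction = ("Lower the bar until the elbows are bent at 90 degrees, "
                   "keeping the knees flexed to 15 degrees throughout.")
    print(extract_constraints(instruction))
    # -> [('elbow', 90), ('knee', 15)]
    ```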

    Supervisor: Eduardo Velloso eduardo.velloso@unimelb.edu.au

  • Ageing Bodies and Embodied Interactions

    Avatars as Agents that Affect Behaviour

    Experiments by social psychologists have demonstrated that showing college students avatars of themselves as older people impacts their attitudes towards saving for the future. The impact is so significant that major banks have started to incorporate the technique into their retirement planning tools. We are interested in how avatars that depict older people as younger versions of themselves may similarly impact their sense of self and, by extension, affect their ability to complete tasks in virtual environments. The project will suit students who are interested in human-computer interaction, prototyping, field work and gaming.

    Supervisor: Steven Baker steven.baker@unimelb.edu.au