COMP241 Project Ideas

Cruising Along

Keywords: Mobile app; Tourism; Maps; Events/Calendaring; Location Aware
Project Managers: Nathan Rendle; Ben Robinson
Team members: Lulwah Alduraiei; Alex Merz; Catherine Sirett; Ben Stewart; Kurt Sweeney

We take our mobile phones with us everywhere these days, and going on holiday is no exception; if anything, our dependency on such devices probably increases at times like these. It really is a little miracle: calendaring, contacts, music, games, e-commerce, GPS-based navigation. Oh yes, and you can even use it to keep in touch with people!

The aim of this project is to harness features like these to develop "the killer-app" for use by people taking a cruise-ship holiday. This is a sector in the New Zealand economy that is really booming:

Cruise is the fastest growing sector in tourism. Over the last five years, the size of New Zealand's cruise sector doubled, and forecasts show no sign of this growth abating. The 2014-2015 cruise season is worth $436 million in value added to the New Zealand economy and is forecast to grow to $543 million next season. (From cruisenewzealand.org.nz)

Ideas abound. When a ship docks in port, a key feature of the app would be providing location-based information tailored to the interests of the person using it (there is linkage here to the TIPPLE research project). Think of the app being used in a place like Napier. You would want to know about:

  • Places of interest
  • Where to eat
  • Possible excursions (half-day, full-day)
  • Theatre, shows, and other seasonal events
  • ...

All keyed to your interests, hobbies, food preferences, etc. (a small sketch of this idea appears after the list below). Other abilities could include:
  • A rendezvous feature, to let people with the app who have temporarily split up to do different things co-ordinate meeting again (maybe at one of those nice cafes?!?). The idea could be further generalized to a "share information" feature, with a broader set of information shared, not just your location.
  • A multimedia travel diary/blogging component that combines audio and video with text entry: something to do while you're waiting for others to join up with you, or else when back on the cruise-ship.
  • Live updates of information, from numerous information providers (such as i-sites).
  • Support for both a multilingual interface and multilingual content.
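
To make the "keyed to your interests" idea concrete, here is a minimal sketch (in Python) of filtering nearby points of interest by how well their tags match a user's profile. The data layout, the 2 km radius, and the Napier co-ordinates are illustrative assumptions, not part of any real data feed.

    from math import radians, sin, cos, asin, sqrt

    def km_apart(a, b):
        """Haversine distance between two (lat, lon) points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
        h = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * asin(sqrt(h))

    # Hypothetical points of interest, e.g. sourced from an i-site feed
    pois = [
        ("Art Deco walk", (-39.4903, 176.9176), {"history", "architecture"}),
        ("Seafood cafe",  (-39.4910, 176.9200), {"food"}),
    ]

    def suggest(user_pos, interests, radius_km=2.0):
        """Nearby POIs, ranked by how many interest tags they share with the user."""
        nearby = [(n, p, t) for n, p, t in pois
                  if km_apart(user_pos, p) <= radius_km]
        return sorted(nearby, key=lambda x: len(x[2] & interests), reverse=True)

    for name, _, tags in suggest((-39.4928, 176.9120), {"food", "history"}):
        print(name, tags)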

Back on board the ship, the app could be "wired in" to the events programme provided while travelling out at sea; provide additional information about the services on board (e.g., a doctor, if needed); and have an advance-planner feature for you to sort out what you'd like to do at the next destination.

As easy as that!

Some cross-platform mobile development environments and tools to consider:

 

Spotted You

Keywords: Internet of Things (IoT); iBeacons; Object proximity detection (cars, bikes, walls, power-poles!)
Project Managers: Aaron Davison; Ryan Jones
Team members: Chris Becker; Malakai Curulala; Jacob Hobbs; Christopher Symon;

It has been said that we should stop thinking of a car as a combustion engine that has grown increasingly sophisticated through the digital technology incorporated into its design, and instead think of it as a highly advanced computer system that just happens to have a combustion engine strapped to it! This is a project in that space, with a focus on detecting potential collisions before they happen: hence the project name, Spotted You.

Taking a DIY approach, this project aims to develop a range of proof-of-concept capabilities that demonstrate different ways a driver could be alerted to an impending collision, or to a dangerous situation developing. Suggestions are:

  • Reversing to park (proximity to stationary objects)
  • Reversing out of a busy car-park (hard to see approaching cars)
  • Joining a main road from a T-junction (watch out for motorbikes and bikes)
  • Approaching a dangerous corner too fast (power-pole location, known black-spot, etc.)

A range of sensor devices will be bought for the project, such as ultrasound sensors that detect distance, and iBeacons whose position can be triangulated. The project is likely to involve programming boards such as the Arduino micro-controller and the Raspberry Pi single-board computer.
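
As a flavour of what the coding side might look like, below is a hedged sketch (Python on a Raspberry Pi, using the RPi.GPIO library) that polls an HC-SR04-style ultrasound sensor and prints an alert when an object gets too close. The pin numbers and the 50 cm threshold are assumptions to be tuned on the actual hardware.

    import time
    import RPi.GPIO as GPIO

    TRIG = 23       # assumed BCM pin wired to the sensor's trigger line
    ECHO = 24       # assumed BCM pin wired to the sensor's echo line
    ALERT_CM = 50   # assumed alert threshold, in centimetres

    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)

    def distance_cm():
        # A 10 microsecond pulse on TRIG fires one ultrasonic ping
        GPIO.output(TRIG, True)
        time.sleep(0.00001)
        GPIO.output(TRIG, False)
        # ECHO stays high for the duration of the ping's round trip
        start = time.time()
        while GPIO.input(ECHO) == 0:
            start = time.time()
        end = time.time()
        while GPIO.input(ECHO) == 1:
            end = time.time()
        # Speed of sound is roughly 34300 cm/s; halve for the round trip
        return (end - start) * 34300 / 2

    try:
        while True:
            d = distance_cm()
            if d < ALERT_CM:
                print(f"ALERT: object {d:.0f} cm away")
            time.sleep(0.1)
    finally:
        GPIO.cleanup()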

 

Lights, Computer, Action ...

Keywords: Natural Language Processing (NLP); Information Visualization; Time-line
Project Managers: Thye Way Phua; Sheena Mira-Ato
Team members: Michael Jang; Quentin Quaadgras; Cory Sarzotti; Corey Sterling;

Imagine the work that went into developing the screenplay for the Lord of the Rings trilogy, or for that matter any movie sourced from a novel. Even if you had an unlimited budget and no restriction on how long the resulting movie was, transforming the text descriptions of characters, locations, and events into visual form is a complex task. Now add in all the other constraints the development of a screenplay needs to factor in, and a difficult problem just got a whole lot harder!

Enter Lights, Computer, Action ..., where the idea of the project is to develop a software tool that assists the writer in developing a screenplay. Features this tool could include are:

  • Software assisted visualization of characters and locations
  • Tracking of appearance of characters through the text
  • Tagging of location
  • Time-line of events
  • Linked/consistent manipulation of entities (e.g., characters, locations, events)
  • Undo/Rollback

There is a wide range of constraints a movie has to work within, and the tool could assist the writer in keeping track of those.
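
As one sketch of the character-tracking feature, the snippet below uses the off-the-shelf spaCy NLP library (an assumption; any toolkit with named-entity recognition would do) to record where each person is mentioned in a passage of text. It assumes spaCy and its small English model are installed (pip install spacy, then python -m spacy download en_core_web_sm).

    from collections import defaultdict
    import spacy

    nlp = spacy.load("en_core_web_sm")

    def character_mentions(text):
        """Map each PERSON entity to the character offsets where it appears."""
        doc = nlp(text)
        mentions = defaultdict(list)
        for ent in doc.ents:
            if ent.label_ == "PERSON":
                mentions[ent.text].append(ent.start_char)
        return mentions

    sample = "Frodo left the Shire. Sam followed Frodo to Rivendell."
    for name, offsets in character_mentions(sample).items():
        print(name, offsets)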

 

Haven't I been here before?

Keywords: Graphical Environment Scripting; Multi-platform OS
Project Managers: Vincent Lamont; Jeremy Symon
Team members: Saquib Azmi; Harrison Connell; Vladimir Ilic; Charles Shum; Wenrui Xu;

Computers are meant to be good at repetitive tasks, right? So why is it that, numerous times a week, I find myself at the start of a lecture doing the same thing: after logging in, I start up Panopto for video recording (providing log-in details); then I start up a browser (providing log-in details again, this time to the web proxy-server) and head to the Moodle site for the course (more log-in credentials needed); finally, I access Google Drive and navigate my way to where the PowerPoint slides for the lecture are located.

Another repetitive situation is after security updates on my Windows laptop (with its obligatory reboot). There is a fairly set sequence of things I go through after this, to get things the way I find most useful. It is not always the same, though: if I'm at home, what I do differs from what I might do at work. At work in particular there is a finer level of granularity to what I do, as there are various research projects I'm involved with, each with a particular set of applications to launch, command-line windows to open to particular places, and environment variables to set. Some even involve being remotely logged in to other computers. When I switch to working on one of these projects, the first thing I need to do is go through the "same old routine" to get things going.

There's even the catch that if I haven't done this for a while, for a given project, I might not remember all the steps necessary. In the case of an experimental branch of the spatial hypermedia project, Expeditee, I need to remember to launch Eclipse from the command-line, having first set some environment variables. The environment variables to set are all sitting nicely in a script file to run, but if I don't remember to do this, I have to quit Eclipse and begin the start-up procedure again. For the music digital library work I do with the University of Illinois, the work involves spinning up web services on a couple of servers located in the US. And so on ...

The aim of this project is to develop a scripting solution that can be used to capture such set-up sequences: both graphical and keyboard input. Exploring different approaches, and assessing their various strengths and weaknesses, would be the starting point. The ideal end-result would be a solution that can operate across all the main operating systems (Windows, MacOS, Linux). This could be achieved by some sleight of hand (separate software solutions developed for each OS), or else solved at a more fundamental level, for example by "fooling" the computer into thinking the software solution is actually a keyboard and mouse plugged into the computer, generating a stream of events for it to follow.
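
For a feel of one possible approach, here is a minimal sketch of the "record" half using the cross-platform pynput library (an assumption; pip install pynput). It logs timestamped mouse clicks and key presses that a separate replay component could later play back.

    import time
    from pynput import keyboard, mouse

    events = []   # (seconds since start, event description)
    t0 = time.time()

    def on_click(x, y, button, pressed):
        if pressed:
            events.append((time.time() - t0, ("click", x, y, button.name)))

    def on_press(key):
        if key == keyboard.Key.esc:   # pressing Esc ends the recording
            return False
        events.append((time.time() - t0, ("key", str(key))))

    with mouse.Listener(on_click=on_click) as ml, \
         keyboard.Listener(on_press=on_press) as kl:
        kl.join()   # record until Esc is pressed
        ml.stop()

    for t, ev in events:
        print(f"{t:6.2f}s {ev}")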

For something that is highly likely to have sensitive information, such as passwords, stored in it, security and/or encryption of data will be an important aspect of the project. I quite like the idea of having the solution in some sort of handy, portable, tangible form: say, my phone. That way I'll have the graphical scripting ability wherever I go. Just plug it in, and have it show me a list of scriptable options sorted into a ranked order, based on location and time of day.
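
On the security point, here is a hedged sketch of encrypting a recorded script before it is stored, using the Fernet recipe from Python's cryptography package (pip install cryptography); how the key itself is protected is left open, and the script bytes are illustrative.

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()      # in practice, derive/store this securely
    f = Fernet(key)

    script_bytes = b"click 120 340\nkey ctrl+l\ntype admin\n"  # made-up script
    token = f.encrypt(script_bytes)  # this is what would be written to disk
    assert f.decrypt(token) == script_bytes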

It's unlikely that a person will get the sequence of mouse moves and keyboard input exactly right when in recording mode, and so some sort of editing capability should be included. Maybe model a script as a series of segments that can be individually tweaked, overwritten, etc. Robustness to things like a script being played back in an environment with a different screen resolution should be factored in, as should labelling the scripts (and segments) to make it easy for the user to access them.
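
One way the segment idea could be modelled is sketched below; the class and field names are illustrative only, not a fixed design.

    from dataclasses import dataclass, field

    @dataclass
    class Segment:
        label: str                                   # e.g. "log in to Panopto"
        events: list = field(default_factory=list)   # recorded input events

    @dataclass
    class Script:
        name: str
        segments: list = field(default_factory=list)

        def replace(self, label, new_events):
            """Overwrite one segment's events, leaving the rest untouched."""
            for seg in self.segments:
                if seg.label == label:
                    seg.events = new_events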

Some starting places for you to consider for potential solutions are:

 

Stuber (Student Uber): "Oh, isn't that a bit like car pooling?"—Elizabeth, age 12

Keywords: Mobile app; Scheduling; Cyber-privacy; Web Technologies
Project Managers: Greg Cannell; Pritpal Chahal
Team members: Lindsen Cruz; Artiqeeiyan Jayshearn Jayagopie; Andrew Milson; Luke Schwarz; Jack Turpitt;

Notwithstanding the university's introduction of paid parking, there are many other economic and environmental reasons to car-pool to get to and from the university campus. In a 9am-5pm workplace, car-pooling is normally a fairly straightforward thing to arrange, and can happen quite organically: a couple of people at work discover through chatting that they live in a similar area of the city, and decide to start taking turns over who drives to work, swinging by the other person's place to pick them up and drop them off. Applying this idea to the needs of students travelling to and from the university campus is an order of magnitude harder: there are many more individuals involved, the times and places they need to be vary considerably, and there won't be any regularity in what is needed from one day to the next. Fortunately, there are also elements to the specific problem of car-pooling in a student environment that can be exploited to offset these complexities: every student has a fixed set of courses they are enrolled in per semester, and the lecture and tutorial schedule has a regular weekly pattern. The aim of this project is to leverage these domain-specific aspects to develop a custom-crafted car-pooling mobile app.
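
To make the "leverage the timetable" idea concrete, here is a hedged sketch that pairs students who live in the same suburb and whose first and last lectures on a given day are within half an hour of each other. All the names, data shapes, and thresholds are made up for illustration.

    students = {
        # name: (home suburb, {day: (first lecture starts, last lecture ends)})
        "Ana":  ("Hillcrest", {"Mon": (9, 15), "Wed": (10, 16)}),
        "Ben":  ("Hillcrest", {"Mon": (9, 15), "Tue": (11, 17)}),
        "Caro": ("Dinsdale",  {"Mon": (9, 15)}),
    }

    def matches(day):
        """Yield pairs of students who could plausibly car-pool on `day`."""
        names = list(students)
        for i, a in enumerate(names):
            for b in names[i + 1:]:
                (sub_a, tt_a), (sub_b, tt_b) = students[a], students[b]
                if sub_a != sub_b or day not in tt_a or day not in tt_b:
                    continue
                (in_a, out_a), (in_b, out_b) = tt_a[day], tt_b[day]
                if abs(in_a - in_b) <= 0.5 and abs(out_a - out_b) <= 0.5:
                    yield a, b

    print(list(matches("Mon")))   # -> [('Ana', 'Ben')]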

 

Pitt Street

Keywords: Google Street View; Foot motion detection (smartphone sensors); Internet of Things (IoT)
Project Managers: Peter Oomen; Jordan Crawford
Team members: George Hewlett; Taoqi Li; Andrew Kyle Simmons; Tycho Smith; Zeyang Xue;

"The Pit" is an 8-sided open-shell sloped wooden structure built by Bill Rogers of the Computer Science department, large enough and strong enough for a person to stand in it. To help visualize The Pit, imagine an Octagonal bipyramid where you have horizontally cut it in half and kept the bottom half; now horizontally cut the piece you kept, again half-way, and this time keep the top off. What you have left is basically the shape of The Pit. But back to the person standing in The Pit. With sensors strapped to the lower part of each of their legs (3D digital compass and gyroscope), the intention is for the person standing in The Pit to make various stepping motions, sliding their feet down the sloped sides of the shell. The sensor data is communicated to a software program that interprets this data to determine what the foot gestures are. This is then fed into a software application as a form of input.

Bill's use of this set-up, to date, has been with an Oculus Rift virtual reality environment, to allow a user to physically express the act of movement (walking) while playing a game. For this project, the intention is to repurpose the set-up to work with Google StreetView. StreetView is an amazing resource; however, use it for any length of time and you quickly discover that mouse input for moving around this 3D environment can at times be quite clumsy: particularly panning (note, not rotating) left and right. That is, moving left or right visually in StreetView while still looking in the same direction. This is something that is easy to achieve in the physical world by side-stepping. Hence the idea of using The Pit to control movement around Google StreetView.

The core of this project has two parts: coming up with a set of foot movements/gestures that are intuitive for a user, and writing software that can detect those movements from the sensor data (the sensors being, by the way, two well-spec'd Android phones strapped to the user's legs with large stretchy wristbands). It is likely to be an iterative process! Another significant part will be learning how to use the detected gestures to control StreetView.
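
As a taste of the detection problem, here is a minimal threshold-based sketch: it flags a "step" whenever the gyroscope magnitude from one leg-mounted phone spikes and then settles. The thresholds and the sample stream are assumptions; real values would come out of the iterative tuning mentioned above.

    from math import sqrt

    STEP_THRESHOLD = 2.5   # rad/s, assumed; tune against recorded data
    REST_THRESHOLD = 0.5   # below this, the leg is considered settled

    def detect_steps(samples):
        """samples: iterable of (gx, gy, gz) gyro readings; yields step indices."""
        in_step = False
        for i, (gx, gy, gz) in enumerate(samples):
            mag = sqrt(gx * gx + gy * gy + gz * gz)
            if not in_step and mag > STEP_THRESHOLD:
                in_step = True
                yield i
            elif in_step and mag < REST_THRESHOLD:
                in_step = False   # ready to detect the next step

    fake_stream = [(0, 0, 0.1), (0.2, 3.0, 0.5), (0, 2.8, 0), (0, 0.2, 0), (3.1, 0, 0)]
    print(list(detect_steps(fake_stream)))   # -> [1, 4]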

If development has gone well, Bill is prepared to break out his "Screen Henge" for the latter part of the project. Screen Henge consists of 8 lounge-sized flat-screen TVs, oriented vertically and positioned in an octagon shape. The 8 screens are all hooked up to a single PC, and the person stands in the middle of the octagon and interacts with the desktop environment that is displayed all around them. I think you can see where this is going ... let's now put The Pit in the middle of this, and run Google StreetView around the 8 monitors ... with a bit of fine-tuning, it should make for an awesome demo!

 

Play it again, Expeditee and/or Touchingly Expeditee

Keywords: Spatial Hypermedia; Audio Processing
Project Managers: Alexander Steel; Aflah Bhari
Team members: Ward Beehre; Christopher Chen; Nicholas Humphries; Yuze Liu;

Expeditee is an open source spatial hypermedia system (developed here at Waikato under the leadership of Rob Akscyn) quite unlike any other information system you are likely to have encountered. It can be a word processor, a mind-map tool, a graphics visualization system, and many other things besides! Music is one of the forms supported by Expeditee, in addition to text, vector graphics, and images. The aim of this project is to enrich the range of features it supports for entering, displaying, and working with audio (Play it again, Expeditee).
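
As one example of a display-oriented audio feature, the sketch below (standard-library Python; "clip.wav" is a placeholder file name) computes a peak-amplitude envelope from a WAV file, of the kind that could be drawn as a waveform beside an audio item in Expeditee.

    import array
    import wave

    with wave.open("clip.wav", "rb") as w:
        assert w.getsampwidth() == 2        # 16-bit, mono audio assumed
        samples = array.array("h", w.readframes(w.getnframes()))

    BUCKET = 1024   # samples per envelope point
    envelope = [max(abs(s) for s in samples[i:i + BUCKET])
                for i in range(0, len(samples), BUCKET)]
    print(len(envelope), "envelope points; first few:", envelope[:5])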

Expeditee makes heavy use of mouse interaction, particularly the use of mouse buttons, where it is not uncommon for a user to need to click two buttons (say middle and right) at the same time to effect a particular action (deleting an item, in this case). One aspect to explore within this project, then, could be how to support touch input in Expeditee when it is run on a multi-touch screen. The digital library group just so happens to have such a screen, which they would make available for use in this project if this aspect were to be investigated.

 

Arcade Mashup

Keywords: 2D Graphics;
Project Managers: Vishnu Mallela; Brad Hansen;
Team members: Aaron Joshua Castro; Meecah Cahayon; Alasdaire Key; Steve Malcolm; Zachary Thompson;

Inspiration for this project came from the arcade game mashup Pac-a-Pong. The intended idea: come up with your own game mashup. The idea has evolved a bit since the inception of the project: it would be acceptable for a non-mashup game to be proposed, and one class member has indeed proposed this. They will present the idea for their game to the class (Wed 23rd March). For such a project to run, there would have to be sufficient people expressing interest in it. The majority of teams for the group projects in COMP 241 this year will need to have 5 team members, some 6.

Another way a game project could run is with a smaller number of members allocated to it (say 2), if it is part of a larger project, such as the game element suggested in Cruising Along, especially if the game has a tie-in with the parent project (for example, the map used in the game being based on the town/district the user is currently visiting).
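
For anyone wondering where to start, a bare-bones 2D game loop in pygame (an assumption; pip install pygame, and any 2D graphics library would do) looks something like this: one window, one loop, a paddle that follows the mouse.

    import pygame

    pygame.init()
    screen = pygame.display.set_mode((640, 480))
    clock = pygame.time.Clock()

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:   # window close button
                running = False
        x, _ = pygame.mouse.get_pos()
        screen.fill((0, 0, 0))
        # Draw a paddle centred under the mouse, near the bottom of the window
        pygame.draw.rect(screen, (255, 255, 255), (x - 40, 440, 80, 10))
        pygame.display.flip()
        clock.tick(60)   # cap at 60 frames per second

    pygame.quit()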