COMP241 Project Ideas

Tannoy Ahoy!

Project Manager: Mohammed Al-Rahbi
Team members: Shashwat Sinha; Isa Ibrahim; Jordan Schroder; Mitchell Grout; Anthony Meehan
Ever been in an airport, railway station, or similar public space and been frustrated that you can't clearly make out the announcements made over the Tannoy (public address, PA) system? How crazy is that, when so many people these days are carrying in their pocket a handy portable device perfectly designed to be held to your ear so you can hear audio clearly—your smartphone! Enter Tannoy Ahoy!: a smartphone app designed to solve this very problem, along with packing in as many spin-off perks as possible. Ideas include:
  • Alerts you when an announcement is about to be made
  • Repeat announcement
  • History of previous announcements
  • Noise cancelling feature on playback?
  • Announcements keyed in as text, then synthesized as audio (a minimal sketch of this follows the list)
  • Register for announcements based on text/tags
  • Incentivized crowd-sourced translation
  • The Mosquito effect used to broadcast that a Tannoy Ahoy service is available in this area?
  • (backup options: iBeacons (e.g., Estimote), a "Tipple"-like GPS solution that triggers as you approach the building, or a listing on the web site)
  • Staff can use their own phones to enter announcements into the central system
  • Off-site access to the announcement site: e.g., you want to know about issues while heading to the railway station
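
For the "keyed in as text, then synthesized as audio" idea, here is a minimal sketch, assuming an Android app and Android's built-in TextToSpeech engine; the AnnouncementSpeaker class and its usage are illustrative only, not a committed design.

    // Minimal sketch (assumes an Android app): speaking an announcement that
    // arrives as text, using Android's built-in TextToSpeech engine.
    import android.content.Context;
    import android.speech.tts.TextToSpeech;

    import java.util.Locale;

    public class AnnouncementSpeaker {

        private TextToSpeech tts;

        public AnnouncementSpeaker(Context context) {
            // The engine initialises asynchronously; only speak once it is ready.
            tts = new TextToSpeech(context, status -> {
                if (status == TextToSpeech.SUCCESS) {
                    tts.setLanguage(Locale.UK);
                }
            });
        }

        public void speak(String announcementText) {
            // QUEUE_ADD keeps announcements in order rather than cutting each other off.
            tts.speak(announcementText, TextToSpeech.QUEUE_ADD, null, "announcement");
        }

        public void shutdown() {
            tts.shutdown();
        }
    }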

Distinct software development roles that occur to me:

  1. A suite of methods for detecting you've entered a Tannoy Ahoy! active zone (a minimal interface sketch follows this list)
  2. The announcer system (audio + text), split between being played over the speakers and pushed to users with smartphones in the zone
  3. The mobile app with core functionality
  4. Added extras for the app: e.g., crowdsourcing translations
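
For role 1, a purely illustrative sketch of how several detection methods could sit behind one interface, so the rest of the app does not care how a zone was detected; all names here are hypothetical.

    // Illustrative only: one interface that several detection methods
    // (iBeacon ranging, a GPS geofence, a check against a published web
    // listing) can implement, letting the team split the work cleanly.
    public interface ZoneDetector {

        interface Listener {
            void onEnteredZone(String zoneId);
            void onLeftZone(String zoneId);
        }

        void startDetecting(Listener listener);

        void stopDetecting();
    }

    // Example implementations the team might divide between members:
    //   class BeaconZoneDetector implements ZoneDetector { ... }     // e.g., Estimote iBeacons
    //   class GeofenceZoneDetector implements ZoneDetector { ... }   // GPS as you approach the building
    //   class WebListingZoneDetector implements ZoneDetector { ... } // poll the published site list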

Leapmotion Hero (combined with Popup Videos to the Max)

Project Idea: Develop an environment where people can have some fun miming playing a given instrument in a song, even if they have never learnt to play a musical instrument. If the gestures they are making are approximately right, then they are rewarded by hearing their instrument added into the song. In class the example of Waterloo by ABBA was given, which has a very distinctive piano part featuring high-note chords to low-note chords played in rapid succession. If the person mimes this right, then they are rewarded by hearing that part of the song mixed in with the rest of the instruments. If not, then that part is missing from what is being played.

Fine-grained hand and finger movement can be captured using Leapmotion (a USB-pluggable device). This would be the key input source for the envisioned project. MIDI files would make another good starting point for audio that can be manipulated in a way that boosts or reduces the volume of particular instrument parts (reduced to 0 if you want to). MIDI music, however, is purely instrumental (no singing) and so starts to get a bit irksome after a while, especially for well-known pop songs. A more advanced step would be to try and develop a way of combining the MIDI with an MP3 version of the song.
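
As a starting point for the MIDI side, here is a minimal sketch using the standard javax.sound.midi API to play a file while muting or unmuting one track; the file name and track index are placeholders.

    // Minimal sketch: play a MIDI file and mute/unmute one track (e.g., the
    // piano part) depending on whether the mimed gestures are judged right.
    import java.io.File;
    import javax.sound.midi.MidiSystem;
    import javax.sound.midi.Sequencer;

    public class MimePlayback {

        private final Sequencer sequencer;
        private final int pianoTrack;

        public MimePlayback(File midiFile, int pianoTrack) throws Exception {
            this.pianoTrack = pianoTrack;
            sequencer = MidiSystem.getSequencer();
            sequencer.open();
            sequencer.setSequence(MidiSystem.getSequence(midiFile));
            sequencer.setTrackMute(pianoTrack, true);   // start with the part missing
            sequencer.start();
        }

        // Called by the gesture-recognition side whenever it decides the mime
        // is (or is no longer) approximately right.
        public void setMimingCorrect(boolean correct) {
            sequencer.setTrackMute(pianoTrack, !correct);
        }
    }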

This project could be framed as its own standalone piece of software, or else seen as a capability to develop within Apollo, the musical version of Expeditee.

Popup Videos to the Max

Project Manager: Ira Pascoe
Team members: Khaled Alrashed; Nathan Kelly; Nikhil Sethi; Aaron Dalusong; Sorrel Gomez

The idea of Popup Videos, but done as an interactive web site, and taken to the max! Many more layers to the information shown over the video, along with search and browse facilities (aka, a digital library). Some examples of layers are:

  • Trivia info layer (like before)
  • Mis-heard lyrics layer
  • Play-along guitar tunes
  • What instruments are playing
  • Play-along piano part: shows the keys to press in a visual at the appropriate time
  • Hook-line motifs, and when they are played
  • ...

In addition to developing a way to have this layered information displayed as a video (or music) file plays, for this project you will also need to develop the "input" side of things (essentially a pop-up editor) for how people add in their timeline-based information.

Potentially useful technologies:

  • User scripting (e.g., take a look at: VideoTube downloader)
  • Selenium might be a useful approach (but I'm beginning to think not)
  • Web audio API
  • Audio Fingerprinting
  • Alternative language subtitling for YouTube videos

A critical component of this project is a solution for displaying time-based events over the top of a playing video, where the items can be graphical if needed (rather than just text) and can themselves be hyperlinked.
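
One possible (purely illustrative) data model for this component: overlay items with a start and end time, plus a lookup of which items are active at the player's current position. All class and field names below are made up.

    // Illustrative data-model sketch: time-based overlay items (text, image
    // reference, or link) visible during a given interval of the video.
    import java.util.ArrayList;
    import java.util.List;

    public class OverlayLayer {

        public static class OverlayItem {
            public final double startSeconds;
            public final double endSeconds;
            public final String content;   // could equally be an image URL
            public final String linkUrl;   // optional hyperlink target

            public OverlayItem(double startSeconds, double endSeconds,
                               String content, String linkUrl) {
                this.startSeconds = startSeconds;
                this.endSeconds = endSeconds;
                this.content = content;
                this.linkUrl = linkUrl;
            }
        }

        private final List<OverlayItem> items = new ArrayList<>();

        public void add(OverlayItem item) {
            items.add(item);
        }

        // Called each time the video player reports its current position;
        // returns the items that should be drawn over the video right now.
        public List<OverlayItem> activeAt(double currentSeconds) {
            List<OverlayItem> active = new ArrayList<>();
            for (OverlayItem item : items) {
                if (item.startSeconds <= currentSeconds && currentSeconds < item.endSeconds) {
                    active.add(item);
                }
            }
            return active;
        }
    }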

Other software development roles that occur to me are:

  1. The layers "editor" as in: the web-based system a user visits to enter their pop-up information and/or change information already in a layer. And not just text: images, audio, symbolic audio ...
  2. Seeding the site automatically (or semi-automatically) from information on other sites and/or incentive scheme for getting people to enter information.
  3. Providing the end-user web site to search and browse the content that is there (suggest basing this around the Greenstone digital library software)

Mashup: Telling You Where To Go

Project Manager: Gerard De Leon
Team members: Sarah Alrabeah; Aysha Alhashami; Ahid Al-Zakwani; Hani Al-Bahri; Swikrit Khanal

Looking to go on a trip? Configure the system with your details (how many in your family, how far you are prepared to travel, what your interests are, etc.) and let it propose some recommendations. In making the decision, the software could factor in live weather forecasts (it would make a difference whether going to the beach is a good idea or not, current tides, that sort of thing), and trawl social media sites to build up a corpus of information that would help produce a recommendation suited to you.

Scope to apply Machine Learning (e.g., Weka)? Semantic Web technologies? Linked Data? Triple Stores?
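
If Weka is used, a minimal sketch might look like the following, assuming past trips have been logged in an ARFF file (trips.arff is a placeholder) whose last attribute records whether the trip was enjoyed.

    // Hedged sketch, assuming Weka is on the classpath: train a J48 decision
    // tree from logged trips, then ask it about a candidate destination.
    import weka.classifiers.Classifier;
    import weka.classifiers.trees.J48;
    import weka.core.Instances;
    import weka.core.converters.ConverterUtils.DataSource;

    public class TripRecommender {

        public static void main(String[] args) throws Exception {
            Instances data = new DataSource("trips.arff").getDataSet();
            data.setClassIndex(data.numAttributes() - 1);   // last attribute = class

            Classifier model = new J48();
            model.buildClassifier(data);

            // Classify the first instance as a smoke test; in the real system a
            // new Instance would be built from the scraped destination details.
            double prediction = model.classifyInstance(data.instance(0));
            System.out.println("Predicted class: "
                    + data.classAttribute().value((int) prediction));
        }
    }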

Software development roles that occur to me are:

  1. A systematic approach to page scraping (weather information, GPS location information, events information, price, whether coupons exist to bring the cost down, etc.); a minimal scraping sketch follows this list
  2. The "Store" to keep the located information in, along with an API for accessing it.
  3. End-user interface with a strong map-based feature
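
For role 1, a minimal scraping sketch using the jsoup library (one option among several); the URL and CSS selectors are placeholders, and every source site would need its own, which is exactly why a systematic approach is worth designing early.

    // Minimal jsoup sketch: fetch a page and pull out event listings.
    import org.jsoup.Jsoup;
    import org.jsoup.nodes.Document;
    import org.jsoup.nodes.Element;

    public class EventScraper {

        public static void main(String[] args) throws Exception {
            Document page = Jsoup.connect("https://example.org/whats-on").get();
            for (Element event : page.select(".event-listing")) {
                String title = event.select(".title").text();
                String date  = event.select(".date").text();
                System.out.println(date + " : " + title);
            }
        }
    }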

My Book App: Don't be a rip-off

Project Manager: Alena Choong
Team members: Jack Taylor; Langley Cavers; Tim Kuizinas; Zachary Carter; Campbell Maxwell
Project Idea: Develop a mobile app that lets you take pictures of pages in books, magazines, etc. (just a few pages, such as that mouth-watering recipe you've read seen in a magazine you're reading at a cafe!), crops the photo to just the page, looks to separate out the text and images from the background, and apply OCR to the text. The processed photos are then bundled up and turned into a "mini" book that is then stored with your other books on your smartphone/tablet etc.

Fits in with existing book software on phone (e.g., Kindle), or else provides its own book management system (e.g., Greenstone).

  • OCR: Tesseract for Android? (a hedged sketch follows this list)
  • OpenCV for image processing
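
A hedged OCR sketch, assuming the tess-two port of Tesseract for Android; the package name, data-path convention, and method calls are assumptions to check against whichever OCR library is actually adopted.

    // Assumed tess-two usage: OCR a page photo that has already been cropped
    // (e.g., with OpenCV) to just the page.
    import android.graphics.Bitmap;

    import com.googlecode.tesseract.android.TessBaseAPI;

    public class PageOcr {

        // dataDirectory is assumed to contain a "tessdata" folder with eng.traineddata.
        public String recognise(Bitmap croppedPage, String dataDirectory) {
            TessBaseAPI tess = new TessBaseAPI();
            tess.init(dataDirectory, "eng");
            tess.setImage(croppedPage);
            String text = tess.getUTF8Text();
            tess.end();
            return text;
        }
    }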

Variations: Have the content synced so you can access it from your other devices.

Late breaking: let's add iBeacons into the mix (located around your home, car and work). This means the MyBook app can start to have the most relevant books available to you based on your location (proximity to a particular iBeacon). It gets better over time, as it notices what books you tend to access when you are in which location. All books go into a personal DL so they can be browsed and accessed in different ways. Everything is synced on a PC, with books accessible from your phone, tablet, etc.

Expeditee Central

Project Manager: Alden Dalusong
Team members: Jonathon Gumbley; Lauren Nasmith; Jessica Xiao; Michaela White; Hakau Ballard; Caleb Millar
Expeditee is an open source spatial hypermedia system—developed here at Waikato under the leadership of Rob Akscyn—quite unlike any other information system you are likely to have encountered. It can be a word processor, a mind-map tool, a graphics visualization system, even an Integrated Development Environment (IDE) for programming. And that's really its problem. It can not only be all these things, but it can be all these things at the same time! This makes it hard to explain to people what it is, let alone convey to them just how useful it could be to them. We've come to refer to the difficulty as the "kitchen sink" problem, paying homage to the phrase "they packed everything, including the kitchen sink" when referring to a situation where nothing was left out.

The aim of this project is to develop a new capability within Expeditee (link only available within the university) to address this problem, through the idea of tailored "packs" that promote (within the uniform environment of Expeditee) a particular mode of use, such as PowerPoint-style presentations or a line-art drawing editor.

Core work to this project will see Java programming in combination with Expeditee's templating, overlay, and embedded Javascript capabilities.

A critical software component to get going early on is how the central site that stores the "packs" works and is accessed by someone using Expeditee. Beyond that, it's all generating different packs!
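
As a purely hypothetical sketch of that component, the client side might simply fetch a named pack (say, as a zip file) from an assumed central URL; none of the URLs or file layouts below come from the real Expeditee project.

    // Hypothetical sketch: download a pack archive from an assumed central site,
    // ready to be unpacked and registered with Expeditee.
    import java.io.InputStream;
    import java.net.URL;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class PackDownloader {

        private static final String CENTRAL_SITE = "https://example.org/expeditee-packs/";

        public Path downloadPack(String packName) throws Exception {
            URL packUrl = new URL(CENTRAL_SITE + packName + ".zip");
            Path target = Paths.get(packName + ".zip");
            try (InputStream in = packUrl.openStream()) {
                Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
            }
            return target;   // next step: unzip and register the pack with Expeditee
        }
    }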

'Till Dawn

Project Manager: Carl Lickfold
Team members: Tyler Hale; Logan Krippner; Xavier Simmons; Christian Anderson-Scott; Daniel Oosterwijk; Paul Jones

This will be an online multiplayer survival game, in 3D with a top-down or third-person view. Players will need to collect resources during the day to fight off the monsters at night. The fighting style will be hack and slash. The game will feature procedural generation of the landscape, with an emphasis on different items that enhance certain abilities, requiring players to specialize.
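
As one small sketch of what "procedural generation of landscape" could mean in practice, the following generates a smoothed random heightmap; whether the game uses something like this, Perlin noise, or an engine's built-in terrain tools is an open design choice.

    // Illustrative heightmap generator: coarse random values smoothed by
    // repeated neighbour-averaging with a shrinking amount of random jitter.
    import java.util.Random;

    public class Heightmap {

        public static double[][] generate(int size, long seed) {
            Random random = new Random(seed);
            double[][] height = new double[size][size];

            for (int x = 0; x < size; x++)
                for (int y = 0; y < size; y++)
                    height[x][y] = random.nextDouble();

            double jitter = 0.5;
            for (int pass = 0; pass < 4; pass++) {
                for (int x = 1; x < size - 1; x++) {
                    for (int y = 1; y < size - 1; y++) {
                        double average = (height[x - 1][y] + height[x + 1][y]
                                + height[x][y - 1] + height[x][y + 1]) / 4.0;
                        height[x][y] = average + (random.nextDouble() - 0.5) * jitter;
                    }
                }
                jitter /= 2.0;   // each pass adds less noise, giving smoother terrain
            }
            return height;
        }
    }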