
Nonparametric Methods for Unsupervised Semantic Modelling
Wray Buntine, Monash University, Melbourne, Australia
Monday 8 June 2015, 11:00 am, G.1.15
This talk will cover some of our recent work on extending topic models to serve as tools in text mining and NLP (and, hopefully, later in IR) when some semantic analysis is required. In some sense our goals are akin to those of Latent Semantic Analysis. The basic theoretical/algorithmic tool we have for this is nonparametric Bayesian methods for reasoning about hierarchies of probability vectors. The concepts will be introduced, but not the statistical detail. Then I'll present some of our KDD 2014 paper (Experiments with Nonparametric Topic Models), which describes what is currently the best-performing topic model by a number of metrics.


Scalable text mining with sparse generative models
Antti Puurula, Department of Computer Science, The University of Waikato, Hamilton
Monday 8 June 2015, 2:00 pm, G.1.15
The information age has brought a deluge of data. Much of this is in text form, insurmountable in scope for humans and incomprehensible in structure for computers. Text mining is an expanding field of research that seeks to utilize the information contained in vast document collections. General data mining methods based on machine learning face challenges with the scale of text data, posing a need for scalable text mining methods.
This thesis proposes a solution to scalable text mining: generative models combined with sparse computation. A unifying formalization for generative text models is defined, bringing together research traditions that have used formally equivalent models but ignored each other's parallel developments. This framework allows methods developed for different processing tasks, such as retrieval and classification, to be applied across text mining tasks, yielding effective solutions throughout. Sparse computation using inverted indices is proposed for inference on probabilistic models. This reduces the computational complexity of common text mining operations in proportion to the sparsity of the data, yielding probabilistic models with the scalability of modern search engines.
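The inverted-index idea above can be illustrated with a minimal sketch. This is a hypothetical example, not code from the thesis: it assumes a smoothed multinomial (naive-Bayes-style) class model, and all class names, counts, and parameters below are illustrative. Each class starts from a precomputed "background" score, and the inverted index supplies corrections only for (term, class) pairs with non-zero counts, so scoring cost scales with sparsity rather than with the full vocabulary-times-classes grid.

```python
from collections import defaultdict
from math import log

# Toy class models (illustrative data, not from the thesis):
# class label -> bag-of-words counts.
class_counts = {
    "sports":  {"ball": 4, "team": 3, "win": 2},
    "finance": {"bank": 5, "market": 4, "win": 1},
    "tech":    {"code": 6, "bug": 2},
}
totals = {label: sum(c.values()) for label, c in class_counts.items()}

# Inverted index: term -> [(class, count), ...]. Scoring only touches
# the (term, class) pairs with non-zero counts -- the sparse part.
index = defaultdict(list)
for label, counts in class_counts.items():
    for term, count in counts.items():
        index[term].append((label, count))

def score(query, alpha=1.0, vocab_size=10):
    """Smoothed multinomial log-likelihood per class, computed sparsely.

    Every class starts from its background score (as if it contained none
    of the query terms); the inverted index then adds corrections only
    where a class actually has a non-zero count for a query term.
    """
    bg = {l: log(alpha / (totals[l] + alpha * vocab_size)) for l in class_counts}
    scores = {l: len(query) * bg[l] for l in class_counts}
    for term in query:
        for label, count in index.get(term, []):
            # Replace the background contribution with the observed-count one.
            scores[label] += (log((count + alpha) / (totals[label] + alpha * vocab_size))
                              - bg[label])
    return scores
```

For a query like `["ball", "team"]`, only the postings lists for those two terms are traversed, so classes sharing no terms with the query keep their precomputed background score untouched.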
The proposed combination provides sparse generative models: a solution for text mining that is general, effective, and scalable. Extensive experimentation on text classification and ranked retrieval datasets is conducted, showing that the proposed solution matches or outperforms the leading task-specific methods in effectiveness, with an order-of-magnitude decrease in classification times for Wikipedia article categorization with a million classes. The developed methods were further applied in two 2014 Kaggle data mining prize competitions with over a hundred competing teams, earning first and second places.
