Local Topic Models

The n-gram model is widely used to capture the local ordering of words, but its exploding feature space often causes estimation problems. Local Topic Models introduce a new notion of locality, the local context, which yields rich word representations and, in turn, locally coherent topics and document representations. To handle the resulting large feature space, we employ a nonprobabilistic topic modeling technique, sparse coding, which finds topics and representations efficiently using fast optimization methods such as greedy coordinate descent. The model is useful for discovering local topics, tracing the semantic flow of a document, and building predictive models.
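As a rough illustration of the sparse coding step, the following is a minimal NumPy sketch of coordinate descent for the lasso subproblem that sparse coding solves per document (or per local context): minimizing 0.5·‖x − Da‖² + λ‖a‖₁ over the code a, given a fixed topic dictionary D. The dictionary, vocabulary size, and data here are toy assumptions for illustration, not the paper's actual setup.

```python
import numpy as np

def soft_threshold(x, lam):
    """Elementwise soft-thresholding operator, the proximal map of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_code(D, x, lam=0.1, n_iter=100):
    """Coordinate descent for min_a 0.5*||x - D a||^2 + lam*||a||_1.

    D: (n_features, n_topics) dictionary; x: (n_features,) data vector.
    Maintains the residual incrementally so each coordinate update is O(n_features).
    """
    n_topics = D.shape[1]
    a = np.zeros(n_topics)
    residual = x.copy()              # residual = x - D @ a, with a = 0
    col_norms = (D ** 2).sum(axis=0)
    for _ in range(n_iter):
        for k in range(n_topics):
            residual += D[:, k] * a[k]        # remove coordinate k's contribution
            rho = D[:, k] @ residual          # correlation with the partial residual
            a[k] = soft_threshold(rho, lam) / col_norms[k]
            residual -= D[:, k] * a[k]        # restore with the updated value
    return a

# Toy example: 3 topics over a 6-word vocabulary (hypothetical data).
rng = np.random.default_rng(0)
D = rng.normal(size=(6, 3))
D /= np.linalg.norm(D, axis=0)       # unit-norm topic columns
x = 0.8 * D[:, 0] + 0.1 * rng.normal(size=6)
a = sparse_code(D, x, lam=0.05)
```

The resulting code `a` is sparse: most topics get exactly zero weight, which is what makes the per-document representation interpretable despite the large feature space.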

Publication

Seungyeon Kim, Joonseok Lee, Guy Lebanon and Haesun Park. Local Context Sparse Coding. Proceedings of the 29th AAAI Conference on Artificial Intelligence (AAAI). 2015. pdf

