Wiki-NLP Tutorial
SMWCon Spring 2014

Adding Natural Language Processing Support to Your (Semantic)MediaWiki

Talk details

Description:
Speaker(s): René Witte, Bahar Sateli
Slides: see here
Type: Tutorial
Audience: Everyone, Developers, Admins
Length: minutes
Video: not available
Keywords:
Title: Adding Natural Language Processing Support to Your (Semantic)MediaWiki
Audience: Wiki users, maintainers, and developers, without prior knowledge in Natural Language Processing (NLP)
Presenters: René Witte and Bahar Sateli - Semantic Software Lab, Concordia University, Montréal, Canada
Abstract
Wikis have become powerful knowledge management platforms, offering high customizability while remaining relatively easy to deploy and use. With the majority of their content in natural language, wikis can greatly benefit from automated text analysis techniques. Natural Language Processing is a branch of computer science that employs various Artificial Intelligence (AI) techniques to process content written in natural language. NLP-enhanced wikis can support users in finding, developing, and organizing the knowledge contained in the wiki repository. Rather than relying on external NLP applications, we developed an approach that brings NLP as an integrated feature to wiki systems, thereby creating new human/AI collaboration patterns, where human users work together with automated "intelligent assistants" on developing, structuring, and improving wiki content. This is achieved with our open source Wiki-NLP integration, a Semantic Assistants add-on that makes it possible to incorporate NLP services into the MediaWiki environment, enabling wiki users to benefit from modern text mining techniques.
The proposed tutorial has two main parts: In the first part, we will give an introduction to NLP and text mining, as well as related frameworks, in particular the General Architecture for Text Engineering (GATE) and the Semantic Assistants framework. Building on these foundations, we will then look into the Wiki-NLP integration and show how you can add arbitrary text processing services to your (Semantic) MediaWiki instance with minimal effort. Throughout the tutorial, we illustrate the application of NLP in wikis with a number of application examples from various domains, developed in our research over the last decade, such as cultural heritage data management, collaborative software requirements engineering, and biomedical knowledge management. These showcases of the Wiki-NLP integration highlight a number of integration patterns that will help you adopt this technology for your own domain.
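The round trip the abstract describes (wiki content goes to an NLP service, and results come back into the wiki) can be sketched in a few lines. This is a minimal illustration of the pattern only, not the actual Wiki-NLP API; the callables stand in for the MediaWiki API, the remote NLP pipeline, and the write-back step:

```python
def annotate_page(fetch_text, run_nlp, save_results, title):
    """Orchestrate one NLP round trip for a single wiki page.

    fetch_text, run_nlp, and save_results are injected callables,
    standing in for the wiki read API, the NLP service, and the
    write-back step of a (hypothetical) integration layer.
    """
    text = fetch_text(title)          # 1. pull the page text from the wiki
    annotations = run_nlp(text)       # 2. send it through the NLP pipeline
    save_results(title, annotations)  # 3. write the results back to the wiki
    return annotations

# Usage with stand-in callables instead of real network calls:
store = {}
result = annotate_page(
    fetch_text=lambda t: "GATE is an NLP framework.",
    run_nlp=lambda txt: [{"type": "Acronym", "text": "GATE"}],
    save_results=lambda t, anns: store.update({t: anns}),
    title="Main Page",
)
```

Injecting the three steps keeps the orchestration logic independent of any particular wiki engine or NLP backend, which is the essence of the integration pattern discussed in the tutorial.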
Outline
Part 1 - Foundations (by René Witte)
- Introduction to Natural Language Processing (NLP) and Text Mining
- NLP Pipelines and Frameworks: The General Architecture for Text Engineering (GATE)
- Existing components, pipelines, and services for ready-to-use wiki systems
- Publishing NLP pipelines as Web Services through the Semantic Assistants (SA) server
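As a rough analogy to the GATE-style serial pipelines covered in Part 1, the following sketch (plain Python, not the GATE API) runs a list of processing resources in order over a shared document, with each stage adding annotations that later stages can build on:

```python
def run_pipeline(text, resources):
    """Apply each processing resource in order to a shared document,
    mimicking a serial pipeline controller."""
    doc = {"text": text, "annotations": []}
    for resource in resources:
        resource(doc)
    return doc

def tokenizer(doc):
    # Naive whitespace tokeniser: one 'Token' annotation per word,
    # recorded as (type, start offset, end offset).
    pos = 0
    for word in doc["text"].split():
        start = doc["text"].index(word, pos)
        doc["annotations"].append(("Token", start, start + len(word)))
        pos = start + len(word)

def sentence_splitter(doc):
    # Naive splitter: a single 'Sentence' annotation up to the first period.
    end = doc["text"].find(".") + 1 or len(doc["text"])
    doc["annotations"].append(("Sentence", 0, end))

doc = run_pipeline("Wikis love NLP.", [tokenizer, sentence_splitter])
# doc["annotations"] now holds three Token spans and one Sentence span
```

Real GATE pipelines work on far richer document and annotation models, but the control flow, a corpus document passed through an ordered list of processing resources, is the same idea.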
Part 2 - The Wiki-NLP Integration (by Bahar Sateli)
- The Wiki-NLP integration architecture
- The Semantic Assistants extension for Semantic MediaWiki
- Consuming NLP web services within a wiki environment
- Case studies: Biomedical Literature Curation (IntelliGenWiki) and Software Requirements Engineering (ReqWiki)
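One integration pattern the case studies build on is writing NLP results back into the wiki as semantic markup, so they become queryable. The sketch below turns a list of annotations into Semantic MediaWiki property syntax (`[[Property::value]]`); the `has<Type>` property naming is an assumption for illustration, not fixed by the extension:

```python
def annotations_to_smw(annotations):
    """Render NLP annotations as Semantic MediaWiki property
    assertions ([[Property::value]]), one wikitext bullet per result."""
    lines = []
    for ann in annotations:
        # each annotation: {"type": ..., "text": ...}
        lines.append("* [[has{0}::{1}]]".format(ann["type"], ann["text"]))
    return "\n".join(lines)

markup = annotations_to_smw([
    {"type": "Entity", "text": "GATE"},
    {"type": "Entity", "text": "Semantic Assistants"},
])
# markup is wikitext ready to be appended to a wiki page
```

Once such markup is saved, the extracted results can be retrieved with ordinary SMW inline queries, which is what makes NLP output useful beyond the page it was generated on.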