|SMWCon Spring 2013|
|Current status and next steps for SMW|
|Description:||Presenting the Semantic Assistants Wiki-NLP integration architecture for automatically deriving SMW metadata from wiki content through Natural Language Processing (NLP), and its application in IntelliGenWiki for genomics research and in the ReqWiki system for Software Requirements Engineering.|
|Event start:||2013/03/22 03:00:00 PM|
|Event finish:||2013/03/22 03:45:00 PM|
Wiki-NLP Integration is the first comprehensive open source solution for bringing Natural Language Processing (NLP) to wiki users, in particular for wikis based on the well-known MediaWiki engine and its Semantic MediaWiki (SMW) extension. It can run any NLP pipeline deployed in the General Architecture for Text Engineering (GATE), brokered as web services through the Semantic Assistants server. This allows you to bring novel text mining assistants to wiki users, e.g., for automatic structuring of wiki pages, natural language question answering, quality assurance, entity detection, and summarization. The results of the NLP analysis are written back to the wiki, allowing humans and AI to work collaboratively on wiki content. Additionally, semantic markup understood by the SMW extension can be automatically generated from the NLP output, enabling semantic search and query functionality.
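To illustrate the last step, here is a minimal sketch of how detected entities could be turned into SMW property annotations and written back into the page text. The entity list, property names, and helper function are illustrative assumptions for this example, not the actual Semantic Assistants output format:

```python
# Hypothetical sketch: converting NLP entity-detection output into SMW
# semantic markup. Property names like "hasGene" are made up for illustration.

def to_smw_markup(text, entities):
    """Wrap each detected entity in an SMW property annotation,
    e.g. 'BRCA1' -> '[[hasGene::BRCA1]]', so SMW can index it for
    semantic search and inline queries."""
    for surface, prop in entities:
        text = text.replace(surface, "[[%s::%s]]" % (prop, surface))
    return text

page = "The BRCA1 gene is linked to breast cancer."
entities = [("BRCA1", "hasGene"), ("breast cancer", "hasDisease")]
print(to_smw_markup(page, entities))
# prints: The [[hasGene::BRCA1]] gene is linked to [[hasDisease::breast cancer]].
```

Once such annotations are stored, SMW inline queries (e.g., `{{#ask: [[hasGene::BRCA1]]}}`) can retrieve all pages mentioning that gene.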
In my presentation, I will explain the Wiki-NLP Integration architecture and how NLP capabilities can be integrated into a given wiki environment. As application scenarios, I will demonstrate two applications based on our Wiki-NLP Integration that are used in real-world projects, one from the Software Engineering domain and one from the Biomedical domain.