Semantic MediaWiki and Neuroscience Data - A Blue Brain Perspective

Talk details
Description: Presents "DataSpace/LabSpace/KnowledgeSpace", a tool that makes it possible to integrate existing data and knowledge on a global level.
Speaker(s): Martin Telefont
Type: Talk, Demo
Audience:
Event start: 2013/10/29 04:00:00 PM
Event finish: 2013/10/29 04:30:00 PM
Length: 30 minutes
Keywords: data integration, tool

With three large neuroscience projects kicking off in the next couple of months, one question people have asked is how we will be able to integrate existing data and knowledge in neuroscience on a global level. I will present DataSpace/LabSpace/KnowledgeSpace, a tool we developed to address this problem. One component of this system is Semantic MediaWiki, which makes it easy for experimenters to "see" their data, while allowing modelers to access the same data on a much larger scale for modeling neurocircuitry. The flexibility of the wiki platform allows experimenters, data analysts, and modelers to continually refine our understanding of elements of the neural system in a globally collaborative way.
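As a rough sketch of what that larger-scale access could look like (the category and property names below, such as "Has protocol" and "Has animal age", are hypothetical and not taken from the actual DataSpace wiki), a modeler might gather comparable recordings across all experiment pages with an inline Semantic MediaWiki query:

  {{#ask: [[Category:Experiment]] [[Has protocol::Whole-cell patch clamp]]
   |?Has animal age   <!-- print the recorded animal age -->
   |?Has raw data     <!-- print the link to the raw recording -->
   |format=table
  }}

The same annotations that drive such a query are what experimenters see rendered on their own pages, which is what lets both groups work from a single source.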

Since its start in 2005, the BlueBrainProject has tried to build realistic models of human neuronal networks by setting up a modelling workflow that allows the easy integration of novel lab data. The workflows have been developed on a data set that was experimentally obtained in an electrophysiological laboratory over a period of some 15-20 years. The vast majority of the data was obtained under virtually identical experimental conditions (same animal age, sex, processing protocol, stimulation protocol...). Thanks to the refinement of the workflow over time and the increase in the computational power of the hardware used to build and simulate in-silico circuits, the BlueBrainProject can today build biologically realistic models with thousands of neurons, replicating a wide range of structural and physiological properties consistent with biological tissue properties.

We have started to use Semantic MediaWiki technology to keep an inventory of metadata about experimental protocols, the data artefacts generated, and the analyses performed on them. The wiki provides links to raw data, analysis artefacts, and models stored in a distributed file system called DataSpace. On top of this, it is also a place where information extracted from the literature by a UIMA natural language processing engine is made accessible to researchers, driving the topical data integration process with information streams derived from data, models, and literature.
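For illustration only, an experiment page in such a wiki might carry its metadata as semantic annotations, with the raw-data link pointing into the distributed file system (the property names and URL here are invented for this sketch, not the project's actual schema):

  [[Category:Experiment]]
  [[Has protocol::Whole-cell patch clamp]]   <!-- experimental protocol used -->
  [[Has animal age::P14]]                    <!-- age of the animal recorded -->
  [[Has raw data::http://dataspace.example.org/recordings/cell-001]]

Once pages are annotated this way, the metadata becomes queryable with #ask, as sketched above.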

If you would like more detail, go to the following webpages: