Thread:Talk:SMWCon Fall 2013/Semantic MediaWiki and Neuroscience Data - A Blue Brain Perspectiv/Feedback on the talk/reply

Hi,

I'll try my best to answer without going into too much detail; I can still talk your ears off in Berlin if you would like me to.

Since its start in 2005, the Blue Brain Project has worked to build realistic models of human neuronal networks by setting up a modelling workflow that allows easy integration of novel lab data. The workflows were developed on a data set obtained in an electrophysiological laboratory over a period of some 15-20 years. The vast majority of the data was obtained under virtually identical experimental conditions (same animal age, sex, processing protocol, stimulation protocol, ...). Thanks to the refinement of the workflow over time and the increasing computational power of the hardware used to build and simulate in-silico circuits, the Blue Brain Project can today build biologically realistic models with thousands of neurons, replicating a wide range of structural and physiological properties consistent with those of biological tissue.

We have started to use Semantic MediaWiki technology to keep an inventory of metadata about experimental protocols, the data artefacts generated, and the analyses performed on them. The wiki provides links to raw data, analysis artefacts, and models stored in a distributed file system called dataspace. On top of this, it is also a place where information extracted from the literature by a UIMA natural language processing engine is accessible to researchers, driving the topical data integration process with information streams derived from data, models, and literature.
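To make the inventory idea a bit more concrete, here is a rough sketch of how such metadata could look as Semantic MediaWiki annotations. The property names, values, and the dataspace path are invented for illustration and are not the actual Blue Brain schema:

```
An in-vitro recording performed on [[Recording date::2013-05-14]] using the
[[Protocol::Whole-cell patch clamp]] protocol. The raw traces are stored in
dataspace under [[Raw data path::dataspace://lab/recordings/r0815]].
```

Pages annotated this way can then be collected with an inline query, e.g.:

```
{{#ask: [[Protocol::Whole-cell patch clamp]]
 | ?Recording date
 | ?Raw data path
}}
```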

I hope that is enough detail. If you would like more, I can refer you to the following web pages:
 * Blue Brain Project
 * dataspace
 * UIMA