14 - OSSR: ESCAPE Repository and Catalogue [ID:20093]

So first, I'd like to apologize to those who were in Brussels.

The slides I will present now are very much the same as the ones I presented there.

It's an introduction to what we are trying to achieve with the Escape repository.

So the goals of the ESCAPE repository, as we define them in the proposal, are to maximize software re-use in our community, to foster co-development, to identify open standards for software releases, to instantiate data-mining tools and new analytics techniques, and to expose community-specific data services under the FAIR principles.

And to do that, the idea is to develop an open-access repository to share all the products of scientific development: software, data sets, documentation, tutorials, and training activities (also with the ESCAPE School), and in the end all of that should be included in the EOSC catalogue.

A short but important note for the rest of the presentation: when I talk about a software repository, I do not mean a development platform.

So the repository is something different from GitHub, for example.

We often talk about GitHub as a repo, but I think we should make the distinction.

So why is it important to develop this repository?

To avoid reinventing existing tools and analyses, of course, but more importantly to combine efforts and go further in the multi-messenger analysis domain.

And also, of course, to promote and ensure reproducible science and open science, by encouraging the implementation of the FAIR principles.

To start with, I'd like to show an example of an open-science project which is a really good one and should be a showcase of what we are trying to do.

This is the Crab bundle: a multi-instrument gamma-ray analysis of the Crab Nebula at very high energy.

So the scientific content is not what is important for this presentation.

Maybe Cosimo will go further into the content of this analysis next week in this workshop.

What is important to me is how they published this.

So first, it is a multi-messenger, multi-instrument analysis: they had to combine a lot of data from different instruments, so everything is already put together.

They then published the analysis, first on GitHub for development and then on Zenodo.

On Zenodo you can find all the source code and the data to reproduce the analysis, and of course, if you know Zenodo, you know that you can cite the analysis, see who cited it, and find links to the project and the article in the deposit.

So you have all the information to reproduce this analysis and to build upon it.

They also provide a Binder link, so you can test and run the complete analysis online with the Binder service, and they provide a container, a Docker container, to ensure the reproducibility of the analysis over time.

So to me, here we have the complete recipe for open science and reproducible science.
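To make that recipe concrete, here is a minimal sketch in Python (not part of the talk) of how one could pull such a deposit from the public Zenodo REST API before re-running it. The record ID is a hypothetical placeholder, and the exact layout of the returned JSON may differ between Zenodo API versions, so treat this as an illustration of the idea rather than a definitive client.

```python
# Minimal sketch, assuming the public Zenodo records endpoint and a
# hypothetical record ID; the JSON layout of the file list may differ
# between Zenodo API versions.
import requests

RECORD_ID = "1234567"  # hypothetical Zenodo record ID

# Fetch the published record: metadata, DOI, and the list of archived files.
record = requests.get(f"https://zenodo.org/api/records/{RECORD_ID}", timeout=30)
record.raise_for_status()
meta = record.json()

# Title and DOI are enough to cite the original analysis in a new publication.
print(meta["metadata"]["title"])
print("DOI:", meta.get("doi"))

# Download each archived file (source code, data, container image, ...).
for entry in meta.get("files", []):
    name = entry["key"]
    url = entry["links"]["self"]
    with requests.get(url, stream=True, timeout=60) as resp:
        resp.raise_for_status()
        with open(name, "wb") as out:
            for chunk in resp.iter_content(chunk_size=1 << 20):
                out.write(chunk)
    print("downloaded", name)

# With the deposit on disk, an archived Docker image (if one is provided)
# could be loaded and run to reproduce the analysis in its original
# environment, e.g.:
#   docker load -i analysis-image.tar && docker run --rm -it analysis-image
```

The Binder link and the Docker container serve the same purpose interactively: they pin the software environment so the analysis can be re-executed unchanged long after publication.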

So I'd like you now to imagine this as a standard for all our science analyses, and to imagine that we could have it in a completely integrated environment.

With a single login you would be able to run any analysis, or part of any analysis, to rerun it with another data set for example, and to easily publish your new results if you, say, apply the same analysis to a new data set and obtain new results, while automatically giving credit to the original analysis, the original data set, the workflows, and so on: all the tools you used for your new publication.

And I think this is really the goal we are aiming at in ESCAPE.

So how to do that?


Access: Open Access

Duration: 00:20:12 min

Recorded: 2020-07-24

Uploaded: 2020-07-24 19:46:20

Language: en-US

Speaker: Thomas Vuillaume, LAPP, CNRS

Content: Introduction to the development of the Open-source Scientific Software and Service Repository (OSSR)

The Workshop

The Workshop on Open-Source Software Lifecycles (WOSSL) was held in the context of the European Science Cluster of Astronomy & Particle Physics ESFRI infrastructures (ESCAPE), bringing together people, data and services to contribute to the European Open Science Cloud. The workshop was held online from 23rd to 28th July 2020, organized at FAU.

Copyright: CC-BY 4.0
