Changes between Version 4 and Version 5 of Evaluations/20121002


Timestamp: Oct 30, 2012 9:00:51 AM
Author: David van Enckevort

= October 2012 Evaluation GoNL =
People present: Morris Swertz, David van Enckevort, Paul de Bakker, Lennart Karssen, Kai Ye, Tom Visser

The goal of the evaluation was to learn from our experiences in this project, both from 'What Went Well' (www) and what we have to 'Take A Look At' (tala). We tried to identify which items we have to address immediately in the project and which lessons we have learned for the next project. The evaluation was done as a brainstorm session in which each person could first write down five items that went well and five items to take a look at. We categorised these items and discussed them in the group.

== Evaluation ==

In total we had 29 www and 37 tala items. These items could be divided into two major groups: organisational and technical, which left only a few other uncategorised items. We discussed these items and tried to distill actions and lessons from them.

In general the project evaluated quite positively; if we had to rate ourselves we would give the project a 7. We identified several things that we feel we should try to keep and implement again in follow-up projects:

 * Weekly Skype calls;
 * Mailing list;
 * Open communication and a low threshold to find each other;
 * Sharing of best practices nationally and internationally;
 * Forming the group;
 * Access to international collaboration;
 * Sharing knowledge and code via wiki+svn;
 * Self-organization in working groups along sensible lines;
 * Using pragmatic solutions and getting started;
 * The involvement of NBIC !BioAssist was instrumental in establishing the group.

=== Technical ===
==== File management & replication ====
 * General backup strategy and restore?
 * What is where (ToC of files)?
 * Is the file in hand the same as in the ToC (checksum)? See the sketch below.
 * What version is this file (e.g. multiple alignment runs)?
 * Does the researcher have the file available on site?
 * Data freeze: can we mark data sets?
 * Data librarian: who is responsible for keeping the lists?
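
A minimal sketch of how the checksum question could be answered, assuming the ToC is a plain md5sum-style manifest with one "checksum  path" pair per line; the manifest format and file names are only illustrative, not an agreed standard:

{{{#!python
import hashlib
import sys

def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 of a file in chunks, so large BAM/VCF files fit in memory."""
    md5 = hashlib.md5()
    with open(path, 'rb') as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b''):
            md5.update(chunk)
    return md5.hexdigest()

def verify_toc(toc_path):
    """Compare the files on disk against the checksums listed in the ToC."""
    all_ok = True
    with open(toc_path) as toc:
        for line in toc:
            expected, filename = line.split(None, 1)
            filename = filename.strip()
            try:
                actual = md5sum(filename)
            except IOError:
                print('MISSING  %s' % filename)
                all_ok = False
                continue
            if actual != expected:
                print('CHANGED  %s' % filename)
                all_ok = False
    return all_ok

if __name__ == '__main__':
    # Usage: python verify_toc.py release.md5
    sys.exit(0 if verify_toc(sys.argv[1]) else 1)
}}}
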
===== Action items =====
 * Create a series of user stories describing the practical issues we encountered during the project, to share with SARA and BigGrid.
 * Version individual files, not the whole set, because the whole set is too big (+ index, etc.).
 * Have an overview of who wants what.
 * Create small files we can release as a whole, e.g. SNP releases.
 * Sort out the backup strategy: what to keep, how to distribute it over the resources, and make it automated (see the sketch below).
 * Make people responsible for data management.
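
A minimal sketch of what an automated distribution of a data freeze over the resources could look like, assuming rsync access to the other sites; the host names, accounts and paths are made up for illustration:

{{{#!python
import subprocess

# Illustrative mirror targets; the real sites, accounts and paths would have to be agreed on.
MIRRORS = [
    'gonl@lisa.example.org:/projects/gonl/releases/',
    'gonl@cluster.umcg.example.org:/groups/gonl/releases/',
]

def mirror_release(release_dir):
    """Push one release directory to every mirror; --checksum repairs silently corrupted copies."""
    for target in MIRRORS:
        subprocess.check_call(['rsync', '-a', '--checksum', release_dir, target])

if __name__ == '__main__':
    # E.g. run nightly from cron for the current data freeze.
    mirror_release('/data/gonl/freeze5/')
}}}
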
==== Distribution of the analysis ====
 * Where do you compute what? There was no clear plan for the usage of the resources; the simple queueing and scheduling of resource usage caused some projects to get into trouble.
 * Can we really distribute analyses over multiple sites?
 * Currently most work was done on the LISA (imputation/GWAS) and UMCG (SV/alignment) clusters; only some indel calling was done on the Grid, and more could have been done there.
 * What pipelines do we want to distribute and why, and what are the barriers?
===== Action items =====
 * Reduce dependency on single resources (see the sketch below):
   * Make pipelines distributed: deploy pipelines on multiple clusters.
   * Make the dependent executables available on other clusters.
   * Make the data available on other clusters.
   * This is taken up within RP2 and the eBioGrid project.
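
A minimal sketch of the kind of bookkeeping that "deploy pipelines on multiple clusters" implies, assuming a simple submit wrapper; the cluster and pipeline names follow the notes above, but the mapping and the submit commands are only illustrative, not the actual configuration:

{{{#!python
import subprocess

# Which clusters currently have the data and executables for each pipeline.
# Adding a cluster here should be the only change needed to run elsewhere.
PIPELINES = {
    'alignment':     ['umcg'],
    'sv-calling':    ['umcg'],
    'imputation':    ['lisa'],
    'indel-calling': ['grid', 'lisa'],
}

# Per-cluster submit commands (illustrative; each cluster uses its own scheduler).
SUBMIT = {
    'umcg': ['qsub'],
    'lisa': ['qsub'],
    'grid': ['glite-wms-job-submit', '-a'],
}

def submit(pipeline, job_script):
    """Submit a job script to the first cluster listed for this pipeline."""
    clusters = PIPELINES.get(pipeline)
    if not clusters:
        raise RuntimeError('No cluster configured for pipeline %s' % pipeline)
    cluster = clusters[0]
    command = SUBMIT[cluster] + [job_script]
    print('Submitting %s on %s: %s' % (pipeline, cluster, ' '.join(command)))
    return subprocess.call(command)
}}}
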
==== QC and tracing of errors ====
 * Robustness of the analysis.
 * How do we make certain that data analyses are used?
===== Action items =====
 * Clear but pragmatic QC steps, e.g. compare the number of uniquely aligned reads (see the sketch below).
 * Verification of pipelines across sites using the overlap samples.
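
A minimal sketch of such a pragmatic check, assuming both sites have aligned the same overlap sample and samtools is available; counting reads with mapping quality >= 1 is used here as a rough proxy for uniquely aligned reads, and the 1% tolerance is an arbitrary starting point:

{{{#!python
import subprocess

def unique_read_count(bam, min_mapq=1):
    """Count reads with mapping quality >= min_mapq as a proxy for uniquely aligned reads."""
    output = subprocess.check_output(
        ['samtools', 'view', '-c', '-q', str(min_mapq), bam])
    return int(output.strip())

def compare_sites(bam_site_a, bam_site_b, tolerance=0.01):
    """Check that two sites' alignments of the same overlap sample agree within a tolerance."""
    count_a = unique_read_count(bam_site_a)
    count_b = unique_read_count(bam_site_b)
    difference = abs(count_a - count_b) / float(max(count_a, count_b))
    print('site A: %d  site B: %d  relative difference: %.4f' % (count_a, count_b, difference))
    return difference <= tolerance
}}}
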
=== Organizational ===
 * Coordination: communication problems
   * Overview of external GoNL projects.
   * Very good that we have an SC member (Cisca) on the call all the time.
   * Foreign contributors are nice, but it seems like they take nice projects away; we need better communication.
   * Who is responsible for what?
   * Decentralized management (we cannot boss the other locations).
   * It's not always clear who is paid by the project and who is a volunteer; you can only kindly ask the volunteers to do tasks.
   * It was approached as a scientific project, which meant there was not always a clear direction from above.
 * Organisation:
   * It's not always clear which human resources are actually available.
   * The SV team has too little manpower to do the work (largely volunteers, hard to stimulate people).
   * Some groups could use some strengthening from one or more experienced people (Pheno, Imputation).
   * Not clear what should go into which paper, or who is responsible for the papers.
===== Action items =====
 * Communication:
   * At every Steering Committee meeting, have one of the subprojects report its results to the Steering Committee.
 * Organisation:
   * Ask the Steering Committee about the available human resources (do the GoNL members get the time they need?).
   * Make a group responsible for a rolling one-year roadmap (to get from the Steering Committee).
   * Have more bioinformaticians in the Steering Committee, and recognition of that.
   * The technical people should get appreciation for their scientific contribution!
   * We need an experienced person for each working group (SV is okay; imputation and pheno are a bit light because Yurii left).
 * Science / Roadmap:
   * Paper plan.
   * Get general, very broad directions from the Steering Committee on what we can or should do next with the data (under the GoNL flag, or just using it).