Categories for review
The URC will score GUPs in five (5) categories.
(i) Aims & Impact: scientific impact – scientific and technological importance.
(ii) Feasibility & Data: scientific feasibility – fit as a cryoEM project.
(iii) Proposed Experiments: technical feasibility – ability to be completed within a defined amount of resources/time.
(iv) Goals & Expectations: NCCAT resources requested – appropriateness of the amount of NCCAT resources requested for the proposal.
(v) Expertise & Resources: geographical demographics and need – resources available at the home institution and geographical proximity to resources similar to those requested.
The URC will score TPs on the same five (5) categories, but with the review focused on cross-training.
(i) Aims & Impact: training goals – areas and scope of the requested training;
(ii) Feasibility & Data: training plan – specific objectives and milestones for the proposed training, with details on the potential for growth in the EM areas outlined;
(iii) Proposed Experiments: resources requested – the period of time requested for training and the instrumentation that may be required;
(iv) Goals & Expectations: user EM background and history – a brief history, including current expertise level and familiarity with equipment;
(v) Expertise & Resources: geographical demographics – proximity to potential training opportunities and cryoEM practitioners in the applicant's area, and the expected impact if the training is successful.
Note: Geographical diversity, the need for access by under-served or under-represented institutions, the extent of requested support, and other options available to the users or trainees will be taken into account in setting priorities.
These category scores will be combined and averaged into a final proposal score from 1 (excellent) to 5 (poor).
Note: Scored proposals that have not been assigned time will automatically be forwarded to the next cycle with a score improvement of 0.2. Proposals will be forwarded a maximum of two (2) times, for a maximum improvement of 0.4.
We issue a quarterly call for applications, which are reviewed and ranked by the User Review Committee (URC), a peer review committee. Projects requiring multiple days and many sessions will receive extra scrutiny and must meet a high standard for preliminary data.
User proposals are scored in the following categories: (i) Aims & Impact; (ii) Feasibility & Data; (iii) Proposed Experiments; (iv) Goals & Expectations; and (v) Expertise & Resources. The URC scores each category on a scale of 1 to 5 and may also provide additional comments.
Training applications will be scored on the same categories, but additional topics to consider when reviewing these proposals are: a) current scientific and technical skills; b) clearly stated goals and plans for the training period; c) the potential long-term value of the training; and d) follow-up plans after the training period.
A request-for-access proposal will expire after three (3) cycles or when the amount of time recommended by peer review has been used, whichever comes first. General user projects will expire after two (2) years. After that point, users must submit a new project before submitting further request-for-access applications.