
Melanie Kidd <Melanie.Kidd@stfc.ac.uk> 30 January 2013 14:40
Reply-To: Melanie.Kidd@stfc.ac.uk
To: ASTROCOMMUNITY@jiscmail.ac.uk

ASTRONOMY GRANTS PANEL 2012

Report by the Chairman to Community

Following the pattern of the last two years, here is a short informal report on behalf of AGP panel members - as ever, official news and policy statements will come from STFC itself. My apologies that this report is a little later than last year's.

KEY POINTS

* The 2012 round was not as badly oversubscribed as 2011 (which required mitigation through transfer of funds into the round), but still fiercely competitive, and we were not able to fund all of the excellent science proposed

* Levels of Investigator time awarded (FEC) are worryingly low, but we cannot change this without losing considerable RA support

* STFC will be reviewing the Consolidated Grants scheme in 2013

* The next deadline is Feb 13th 2013

 

BACKGROUND

This was the second year of the new scheme. The community is getting used to the new process, so we deliberately made no big changes in policy or procedure. Further down this report, there are a few words on how the panel feels the new scheme has been going.

GRANTS RECEIVED

This round was not as large as 2011, but still intensely competitive. We received 27 applications (cf 35 last year). These 27 applications contained 196 distinct projects, including 170 RA and Technician posts, and requesting a total of £63M, a factor of roughly 2.5 more than STFC could afford. As last year, the overbidding factor compared to existing baseline had a large range, roughly from 1 to 4. So no sign yet of overbidding calming down!

HOW IT WORKS

We allocated projects to one or more of the sub-panels - Astronomy Observation (AO), Astronomy Theory (AT), Solar Science (SS), and Planetary Science (PL).  As in previous years, every application had multiple reviewer reports, and an "Introducer", who writes an initial report before the panel meetings, and co-ordinates additional "Assessor's Questions".  The sub-panels then met in two sessions - AO/AT together, then SS/PL together - lasting a total of 5 days. The AGP chair and deputy chair attended all the meetings. Before the meetings, all sub-panel members provided initial scores, but only the chairs saw these. At the meetings, each project was discussed in turn, and then given a final score and ranking around the table. As you might expect, when it became obvious that a project was near the bottom or near the top, discussion was accelerated, so that we could spend most of the time near the middle - but every project was considered fully.

A subset of the AGP then met as the "merging panel" for a further two days. Selected projects were cross-read by members from the other pair of panels in order to agree matching points in the two initial lists. This process resulted in a small but definite re-scaling of the rankings.

The final output from the AGP was a recommended ranked ordering of projects, which STFC could work down through once they knew their budget situation. To help us arrive at that ranked list we used, as in previous years, a scoring system based on the categories mandated STFC-wide, and explained in the official guidelines. Each category was scored 0-5, with each possible grade defined by a standard wording such as "competitive with the best science funded worldwide". This temporary scoring system is only used for internal discussion; the actual result is an agreed ranking order for projects, and recommendations on overall award components for a group. (Some things people request cut across projects).

It is important to realise that AGP judged investigator-time-only cases on exactly the same basis as RA cases - is the science excellent? Is it clear what the investigator will actually do?

BUDGET and FINANCE ISSUES

The budget situation was similar to the year before. The AGP does not actually hold a budget; we produce a recommendation based on a ranked list, and STFC work down this until they run out of money. However, we are traditionally advised on roughly what STFC expects to have available, which at the moment is approximately £9M per year of new commitment, so usually we know very roughly but not precisely where the funding cut-off will be.

STFC staff have to rely on a frankly inadequate RCUK-wide grants system, and so do additional manual calculations to actually deduce the financial consequences of each grant. Calculating the effect of changes is therefore non-trivial. Nonetheless we achieved our aim this year of notifying applicants of the outcome in November.

RESULTS

In total we recommended 84 posts, including RAs and Technicians. Compared to existing holdings, groups obtained roughly level funding on average. This is better than the 2011 round, in which most groups suffered a 30% cut. The difference is partly because this round had a lower volume of demand, and partly because the groups proposing in 2011 had last been funded at a historical maximum in grant generosity, whereas the 2012 applicants had last been funded in a round that was already tougher than the year before. Based on our absolute quality scores, the fraction of "world class" proposals we are able to fund has stayed about the same.

We also recommended 20.9 FTEs of Investigator time. We followed our published guidelines of recommending 20% FTE on average for a major involvement in a project, and smaller amounts for secondary involvements. There were also some successful "FEC only" requests. The net result was that the recommended Investigator time was on average 24.8% of the recommended RA+Tech time.

At the request of Science Board, we made a careful analysis of results by science area, geographical area, and facility use. There were no obvious trends, but we did note that some unsupported or soon to be unsupported facilities are still popular with researchers! We also looked at the effect of group size. There is no consistent statistical effect, but of course at the small group end, simple statistics means that there is a greater chance of receiving no funding.

This year there were five New Applicant proposals, of which three went forward to full review and two were funded. Two others were considered under urgency and one was funded. There were two consortium proposals and they were both partly successful.

HIGH PERFORMANCE COMPUTING

There was considerable confusion this round over HPC requests, essentially because the DiRAC initiative, which should for the medium-term future satisfy the majority of community HPC requirements, was announced only after people had submitted their grants. AGP reviewed requests on a case-by-case basis, but tended to be more supportive of year-1 requests. For the future, STFC's guidance to AGP is that use of DiRAC-II should be the expected route for HPC activities, but AGP has still not received a clear description of the funding model, so DiRAC time may or may not require explicit funding. Applicants are strongly encouraged to contact the Office for the latest guidance.

NEXT ROUND

The deadline for the 2013 round is Feb 13th, and of course many of you are already beavering away towards this. Our schedule is similar to last year's - during April applicants will be replying to reviewers' questions; in July and August there will be a chance to respond to Panel questions; the panel meetings will be in August and September; somewhere around October the provisional outcome will be discussed by Science Board; and the results should be announced by November. The general pattern of sub-panels and meetings will be the same as in the last two years. The categories against which proposals are judged will be the same as in previous years.

THE FEC ISSUE

Even more than last year, we are keenly aware that the levels of Investigator time we are able to award are not fulfilling the original aims of the FEC scheme. On average we award 0.25 FTE of Investigator time per FTE of RA time, and for most astronomers this is their only source of "FEC".  If we had awarded the 0.5 FTE level needed to replace QR, the fall in RA numbers would have been even more catastrophic - Investigators cost twice as much as RAs. Worse, many research-active staff have not received any support. We must all make very clear to our University administrations that in the current climate, FEC support is NOT an indicator of whether a staff member is research active.
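The trade-off above can be sketched numerically. This is an illustrative calculation in arbitrary budget units, not official STFC figures; the only inputs taken from this report are the 0.25 and 0.5 FTE levels and the statement that Investigator time costs twice as much per FTE as RA time:

```python
# Illustrative sketch only - arbitrary budget units, not official STFC
# figures. Inputs taken from the report: ~0.25 FTE of Investigator time
# awarded per FTE of RA time, the 0.5 FTE level needed to replace QR,
# and Investigators costing twice as much per FTE as RAs.

def ras_supported(budget, ra_cost_per_fte, inv_fte_per_ra_fte):
    """RA FTEs a fixed budget can support, if each RA FTE carries
    inv_fte_per_ra_fte of Investigator time charged at twice the RA rate."""
    cost_per_ra_fte = ra_cost_per_fte * (1.0 + 2.0 * inv_fte_per_ra_fte)
    return budget / cost_per_ra_fte

budget, ra_cost = 100.0, 1.0                     # arbitrary units
current = ras_supported(budget, ra_cost, 0.25)   # present practice
qr_level = ras_supported(budget, ra_cost, 0.50)  # QR-replacement level

# Cost per RA FTE rises from 1.5 to 2.0 units, so RA numbers would
# fall by a further 25% (qr_level / current == 0.75).
```

In this toy example, moving from 0.25 to 0.5 FTE of Investigator time per RA FTE cuts the number of RAs the same budget supports by a quarter, on top of the cuts already described.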

REVIEW OF CONSOLIDATED GRANT SCHEME

We understand that during the coming months Science Board will review the entire Consolidated Grants scheme.  Do be on standby to make constructive comments. For your information, the points that we have already made to Science Board were as follows:

(i) Management of expectation. Previously, rolling grants were oversubscribed by a factor of 1.3-1.5, and standard grants by a factor of 6-7. Of course, a large fraction of standard grant applications would be repeated every year with modifications, so their real oversubscription was perhaps more like a factor of 2-3. The first year of the new scheme was oversubscribed by a factor of 2.5, and this year by a factor of 2.1. So the net management of expectation has shown little change, but in fact the ex-rolling-grant groups have been led to overbid by more.

(ii) Research concentration. Large groups have declined at about the same rate as small groups, on average. Some small groups have failed to win anything. However, some have won resource, and some new groups have appeared. There is no obvious net drift either towards or away from major groups. Of course, it may simply be too early to tell.

(iii) Use of flexibility powers. This will only become clear by the second cycle, but there is some anecdotal evidence. Large groups are able to juggle money at the margins, but even in these groups this has limited effect. The behaviour is likely to be very different from that of PP groups, because the grants are facility-exploitation focused and individual-PI centred, rather than group-project focused. Applicants take their proposed science plans very seriously; they are not seen as a kind of convenient fiction to win money.

(iv) Simplification. From the AGP point of view, we have saved administrative costs - by reducing panel size, by removing applicant presentations, and by reducing the number of meeting days. However, the burden on panel members has increased. We have maintained the principle of rigorous peer review of specific proposed science, rather than, for example, a "light-touch" review of past group performance. It seems clear that any further simplification/cost saving can only be obtained by sacrificing that rigorous peer review of specific science cases.

(v) Co-ordinated research versus individual projects.  The result of the new scheme has been not so much an expanded rolling-grant-like system as a giant standard grants round, but without the opportunity to try again the next year.

 

Andy Lawrence

Edinburgh January 2013

 

 

 

