Ms Emma Creasey

HEFCE

Northavon House

Coldharbour Lane

BRISTOL BS16 1QD

Direct Line: +44 (0)1509 222720

E-mail: I.C.Morison@lboro.ac.uk

19 December 2001

 

 

Dear Ms Creasey

 

Information on Quality and Standards of Teaching and Learning – Proposals for Consultation

 

I am happy to respond to this consultation document on behalf of Loughborough University.

 

Loughborough’s specific responses to the proposals are set out below.  I should first, however, make the more general comment that the whole Consultation appears to be dominated by full-time undergraduate provision, with little consideration apparently given to taught postgraduate activity.

 

Question 1: Do you agree with the purposes and principles of information collection and publication set out in paragraphs 18 – 20?  Are there others that should be added?

 

Yes.  However, we are unclear what is meant by “replicable” in the context of surveys of student opinion, and we feel that the “resistant to manipulation” criterion will be difficult to satisfy, as HEIs will inevitably seek to present themselves in the most favourable light.

 

Question 2: Do you agree that the classification system for the collection and publication of information at the intermediate level between the whole institution and the individual programme should be the 19 JACS subject areas?

 

No.  We have analysed how the JACS codes would relate to our provision, and in most areas the outcomes would be meaningless.  Under the ESR methodology we have been able to negotiate groupings of programmes with QAA so that reviews have covered coherent aspects of provision, and the published outcomes have been useful both to the University and to external stakeholders.  Under JACS, three of our Departments are split three ways, and nine further Departments two ways.  Two codes – Biological Sciences and Engineering – cover three separate Faculties.  Geography is partially aligned with Physics and Chemistry, and partially with Social Sciences.  Humanities refers only to the History of Art, whilst our English programmes are misleadingly co-located with Modern Languages.  We would prefer to retain the 42 subject areas; this would also make it possible and meaningful to compare the published information with previous Self Evaluation Documents and ESR Reports.

 

Question 3: Do you agree with the portfolio concept – a set of defined categories of information within which each HEI collects information to suit its own needs and circumstances?

 

Yes, provided that the exact contents of each section of the portfolio can be determined by the HEI in accordance with its own internal needs, traditions and practice.

 

Question 4: Do you agree that there should be four main categories for collecting information about quality and standards, as set out in paragraph 26?

 

Yes.

 

Question 5: Are the documents listed in paragraph 29 the right ones to describe the institutional context for quality and standards?

 

Yes.

 

Question 6: Are the data in paragraphs 30 and 32 the right ones to have available internally, and discuss as part of institutional audit, in relation to student admission, progression, completion and employment?

 

The proposed data focus on undergraduate provision.  It is unclear to us what information is proposed for use in respect of postgraduate taught provision.

 

Para 30 (b)

 

If the indicator of socio-economic background is to be the postcode, we would regard this as a flawed mechanism.  It would certainly distort figures for postgraduate provision, as the home addresses given by final-year undergraduates when applying would frequently be in inner-city student areas.

 

Para 32

 

Experience with ESRs indicates that raw figures for progression and retention are unreliable.  Allowance has to be made for transfers between programmes within the HEI, which often account for the major part of apparent “wastage”.

 

Question 7: Are the headings in paragraph 37 the right ones to use for information on assurance of academic quality and standards?

 

Partly.  Paragraphs 37 d, e and j relate to the same thing, whilst the issues raised under student satisfaction etc. form part of our Annual and Periodic Programme Review documentation anyway.  We would not expect to produce a separate report of periodic reviews of assessment methods, as these form an integral part of Programme Review.  Similarly, the outcomes of student satisfaction surveys should be incorporated within Programme Review.

 

We currently analyse external examiners’ reports in the context of programmes, and would not expect to do so at the modular level.

 

We remain concerned that Paragraph 37 h is too vague and, unless clarified, could lead to pressure to recreate the Base Room.

 

Question 8: Are the items listed in paragraph 39 the right ones for HEIs to consider in internal reviews of quality and enhancement of learning opportunities?

 

These and other issues are already included in Annual and Periodic Programme Review, and Programme Specifications, as appropriate.  We would specifically include student feedback and external examiners’ reports.

 

Question 9: Are the data in paragraph 42 the appropriate quantitative indicators to include in the published data set?

 

The data in paragraph 42 are appropriate.  We are, however, concerned about paragraph 44.  Expenditure on Library and Computer Services is incurred at both Departmental and Central level.  The use of HEMS reports by the Press to compile league tables gave the impression that Loughborough’s provision was under-resourced, simply because the data only reflected central costs.  We would not wish such a misapprehension to be created again.

 

Question 10: Do you agree with the Task Group recommendation that option b in paragraph 49 should be adopted – namely that summaries of external examiners’ reports should be published?

 

No.  Both options a and b would require the external examiner to serve two masters, and would create a tension which would not be in anyone’s interest.  Institutions would be tempted to negotiate with external examiners about the content of their reports, thus threatening the reliability and validity conditions set out in paragraph 20.

 

It should be sufficient for the Institutional Audit to demonstrate the integrity of the University’s internal quality assurance mechanisms without requiring the publication of all the evidence.  We need to avoid information overload.

 

A further consideration is that external examiners themselves might not wish their reports to be published.

 

Question 11: Do you agree with the Task Group recommendation that option d in paragraph 51 is the best way forward – that it should be an option for HEIs to publish commentaries on external examiners’ reports if they wish?

 

No, for the same reason as in our response to Question 10.  A robust internal quality assurance mechanism incorporating programme review should be sufficient.

 

Were this to become a requirement, it would necessitate considerable work to determine how to structure such a commentary, and HEIs would inevitably seek to manipulate the commentaries to their best advantage.  Application to modular programmes would be particularly problematic.

 

Question 12: Do you agree with the Task Group recommendation that the combination of options in paragraph 57 is the best way forward – that is, to use the FDS to collect the views of recent graduates, with HEIs continuing to conduct their own surveys but on a more consistent basis?

 

It is unclear whether the data from FDS surveys will be at programme or institutional level.  Nor is it clear how such surveys will differentiate between graduates from vocational and non-vocational programmes.  Questionnaires seeking to establish the relevance of University programmes to a graduate’s current employment are unreliable, given that the outcomes inevitably tell one more about the employment than about the degree programme.

 

As far as student surveys at programme level are concerned, individual institutions need to be able to tailor feedback to their particular circumstances, so any common criteria need to be very general to facilitate this.

 

Once again, these proposals are clearly directed towards full-time undergraduate provision.

 

Question 13: Do you agree with the Task Group recommendation that option a in paragraph 64 – publication of learning and teaching strategy statements at the whole institution level – is the preferred approach?

 

Yes.  In Loughborough’s case, this already happens, and our learning and teaching strategy is published on the web.

 

Question 14: Do you agree with the Task Group recommendation that statements of learning and teaching strategies should be published to dovetail with the cycle for institutional audit?

 

Yes.

 

Question 15: Do you agree with the Task Group recommendation that option a in paragraph 68 should be pursued, namely that the results of annual monitoring and periodic major programme reviews should be summarised at programme level in association with programme specifications?

 

Yes.  This is largely in line with our current procedures.

 

Question 16: Do you agree with the Task Group’s recommendation that both options a and b in paragraph 70 should be pursued: that is, that material on employer views should be included in HEI learning and teaching strategy statements and programme specifications; and that further consideration should be given to whether a useful national survey of employer views could be designed to supplement other forms of information (option c)?

 

No.  We think that the risks acknowledged in Paragraph 70 c are such that a national survey of employer opinion would be of limited value.  Experience with ESR indicates that there is such disparity between subject areas in terms of graduate destinations that any attempt to summarise employer feedback would be unreliable.

 

Yours sincerely

 

 

 

 

Professor I C Morison

Pro-Vice-Chancellor (Teaching)