Does research degree supervisor training work? The impact of a professional development induction workshop on supervision practice

ABSTRACT Supervisor induction and continued professional development programmes constitute good practice and are enshrined in institutional policies and national codes of practice. However, there is little evidence about whether they have an impact on either supervisors’ learning or day-to-day practice. Set in a discussion of previous literature, this article unpacks the concepts ‘impact’ and ‘evaluation’ and assesses the medium- and longer term impact of the University of South Australia's doctoral supervisor induction programme, Supervising@UniSA. It suggests that the workshop leads to the acquisition of understanding and knowledge and, for the majority of attendees, also has an impact on supervisory practice.


Introduction
Formal professional development for the supervisors of research degree students has increasingly been mandated by universities over the last two decades, driven by national codes of practice and good practice guidelines developed by bodies such as the UK's Quality Assurance Agency for Higher Education (QAA, 2004, 2015) and the Australian Deans and Directors of Graduate Studies (DDoGS, 2014). Supervisor responses to professional development vary. Some view it as an obstacle to be overcome in order to gain institutional permission to supervise, or as a waste of time and/or an affront to their professionalism and expertise (Guerin & Green, n.d.). Many others, on the other hand, approach it positively, as a natural and valuable part of their professional practice. Despite its increasing prevalence, there is little research into whether professional development activities enhance the learning and knowledge of supervisors around policies, procedures or pedagogies of supervision, and a paucity of literature exploring impact on individual supervisors' supervisory practice. Consequently, the question 'why, apart from its being mandated, do we do it?' remains largely unanswered. This paper takes some first steps towards addressing this question in the context of an analysis of the medium- and longer term impacts on participants of attending Supervising@UniSA, a day-long supervisor induction workshop offered by the University of South Australia (UniSA) for its research degree supervisors. The article draws on data from an anonymous online questionnaire distributed to participants who attended the workshop between 2009 and 2012.
The aim is to examine the extent to which a research degree supervisor induction workshop has an impact on participant learning and professional practice. The article is structured into five sections. The first briefly outlines the origins of professional development programmes for research degree supervisors and reviews the literature relevant to assessing the impact of academic development on practice. The second offers a simple typology of supervisor development before outlining the context and content of Supervising@UniSA and situating it within the typology. The third reports on the development and form of the questionnaire and the characteristics of the respondents. The fourth presents an analysis of the data, and the fifth offers conclusions.

Professional development, research degree supervisors, impact and evaluation
Professional development for supervisors has been on university agendas for some time now, Moses (1985) and Cryer (1997) being pioneers in this field in, respectively, Australia and the UK. Early developments in the UK included modules on supervision in teaching in higher education programmes at Leeds Metropolitan and Staffordshire Universities. A full postgraduate certificate in research degree supervision was introduced by the then Edge Hill College of Higher Education in 1999 (Cryer, 2000). These extended courses in supervisor training followed earlier national initiatives involving one-off workshops for institutional leaders in research degree education (e.g., Council for National Academic Awards, 1991). In Australia, Griffith University was the first to introduce supervisor training programmes for its staff (Conrad, 1998). Supervisor development is now promoted through national guidelines (see DDoGS, 2014; QAA, 2004, 2015) and is also embodied in the European Universities Association's Salzburg Principles, which state that '[p]roviding professional development to supervisors is an institutional responsibility' (EUA Council for Doctoral Education, 2010, Section 2.3).
Alongside the emergence of supervisor training in higher education institutions, a descriptive literature on the need for, and nature of, supervisor professional development has been published (e.g., Bills, 2004; Cryer, 1997; Grant, 2007). A recent exemplar is the work of Bitzer and colleagues (2013), reporting the collaborative work of senior academics from three countries seeking to improve doctoral education by enhancing doctoral supervisor professional development across 13 Southern African universities. There are many 'How to … ' books on doctoral education addressing supervision issues (Stokes & McCulloch, 2006), and there are accounts of specific programmes and workshops (e.g., for Australia see Manathunga, 2005; Pearson & Kayrooz, 2004; for South Africa see Quinn, 2003). However, very little attention has been paid to the crucial question of impact. The academic development literature on impact largely ignores both supervisor development programmes and their typical modes of delivery (Rust, 1998, 2000; Stes, Clement, & Van Petegem, 2007; Stes, Coertjens, & Van Petegem, 2010). This literature dates back to the 1970s and offers some pointers (see, for example, the discussion in Kreber & Brook, 2001; and Chalmers, Stoney, Goody, Goerke, & Gardiner, 2012), but it is apparent that, as discussions get closer to an assessment of 'impact', the focus on evidence becomes less clear and the methodological challenges more apparent.
When we said earlier that 'little attention has been paid' to impact, we were not suggesting that supervisor development activities are not evaluated. The distinction between evaluation and impact assessment is important because, while academic developers tend to be conscientious to a fault regarding post-event evaluation, 'mere attendance does not guarantee learning' (Manathunga, 2005, p. 20), and we use the notion of evaluation in a specific way. Some of the difficulty of engaging with the impact and evaluation literature lies with the latter concept, which can refer to a wide range of activities: from the examination of the extent to which an entire organisational unit (organisational evaluation) or a programme or suite of activities (programme evaluation) has achieved its stated goals or objectives, to post-event evaluation intended to reveal participants' immediate responses to an activity, its delivery and organisation (referred to as the 'happy sheet' by Chalmers & Gardiner, 2015, p. 4). Our view is that impact requires some form of learning or change in behaviour to take place (McAlpine & Harris, 2002), while an evaluation can take place without any reference to learning or changes in behaviour. Consequently, while impact assessment can be an important part of an organisational or programme evaluation, evaluation does not necessarily involve the measurement of impact.
The linkage between activity and 'behaviour, action or change' underpins the report, The impact framework 2012: Revisiting the rugby team impact framework (Bromley & Metcalfe, 2012), and also the influential work by Kirkpatrick and Kirkpatrick (2006) which provides the intellectual underpinning for the report. The impact framework identifies four levels of impact which can result from a training activity: reaction, learning, behaviour, and outcomes. The first (reaction) encapsulates post-workshop evaluation and, in our terms, should not be included as a category of impact. To avoid the type of slippage across the notions of impact and evaluation seen in Quinn's 2003 article on a South African Postgraduate Certificate in Higher Education and Training, this article uses the term evaluation only in the first sense referred to above, that is, to the gathering of feedback on an event or activity either during or shortly after its conclusion. Impact is used to refer to learning or changes in behaviour which occur as a result of an activity.
Using impact in this way raises questions about how change can be identified or measured. Gray and Radloff (2008) examine impact through a reflection on the Academic Development Unit where they had formerly played leadership roles. The resulting focus is not, however, on change but rather on surfacing some of the issues involved in using the notion of impact, and while they conclude that '"effectiveness" is a better term to use' than impact (p. 104), they actually tell us very little about either the impact or effectiveness of specific interventions.
Others working in the field of supervisor development have suggested looking for secondary indicators of change. One example is the Reflective Supervisor Questionnaire (RSQ) developed by Pearson and Kayrooz (2004), which is administered to research students in order to determine student satisfaction with the supervision they receive. Unfortunately, there is no examination of how the results of the questionnaire impacted, or did not impact, upon supervisory practice, and post-publication references to the RSQ and its use or effectiveness are extremely hard to find. Reid and Marshall (2009) also adopt an indirect approach in their presentation of a model of supervisor development based on a programme entitled the 'Colloquium for Research Supervision'. In their case, rather than asking research students for their opinion, impact was assessed by asking senior researchers for their perceptions of the impact of the programme on faculty staff practice. As it is the perceptions of senior researchers that are collected and analysed, to the neglect of reports from the programme participants themselves, the paper contains very little data or discussion of the Colloquium's impact on the individual behaviour of participants.
Indirect methods relying on third-party reports cannot demonstrate the extent to which an activity involves learning and whether this learning is integrated into the day-to-day practice of the participant. They are, however, attractive because of the genuine difficulties associated with measuring impact directly. These difficulties are exemplified in The impact framework 2012 discussed earlier (Bromley & Metcalfe, 2012). Produced by a national organisation (Vitae) dedicated to delivering impact in the area of professional development for researchers and research students, it has its roots in Research Councils UK's action plan for supporting the implementation of the national Concordat to support the career development of researchers (RCUK, 2008). The five levels used in the report comprise a 'logical progression of cause and effect of investment in researcher training and development' (Kirkpatrick & Kirkpatrick, 2006, p. 5) and mirror this article's concern with distinguishing evaluation (reaction to) from impact (learning and behavioural change). The impact levels are: 0 (Foundations: institutional investment in and provision of development opportunities); 1 (Reaction); 2 (Learning: 'the extent to which participants change their attitudes, improve knowledge, and/or increase skill'); 3 (Behaviour change); and 4 (Outcomes: the 'final results of the training and development activity'). However, with progression through the levels, measurement becomes increasingly problematic as the relationship between an activity and impact becomes more attenuated. As impact levels move from 1 to 4, the volume and quality of evidence marshalled in the report become scantier and more indirect. This can be seen in an appendix mapping 'suggested possible outcomes' from an earlier 2008 report against actual reported examples of outcomes included in 2012 (Bromley & Metcalfe, 2012, Appendix 4).
The cited examples fail to demonstrate an evidential link between development activities (whether general or specific) and reported outcomes.
To find examples of work where the focus is on linking specific pieces of academic development to participant learning or behaviour change, it is necessary to move outside the supervisor development space. Conceptual and behavioural change as a consequence of participation in programmes involving workshops delivered over several months is reported in a series of articles by Rust (1998, 2000) and Stes and colleagues (Stes et al., 2007, 2010). Rust included participants from one supervisor development workshop in his 1998 questionnaire survey, but none of those interviewed had attended it; Stes focused exclusively on teaching. Chadha (2015) examined the impact of a Graduate Certificate in Academic Practice through a small number of interviews (nine) with doctoral students and other very early career researchers, and an examination of evidence-based portfolios, finding an increase in knowledge and skills, increased reflection and 'a slight shift from a teacher-centered approach to a more student-centered one' (pp. 51-54). Gibbs and Coffey report on 'a three-year international study of the training of university teachers … concerned with identifying any changes in teachers' behavior and approaches to teaching and their students [sic] approach to learning, which could be attributed to the training' (2004, p. 89). They adopted an experimental approach involving a very small control group, finding support for the conclusion that training 'increase(d) the extent to which teachers adopt a "Student Focus"; "improve a number of aspects of teachers' teaching"; and, change teachers such that their students' [sic] improve their learning' (p. 98). They conclude, however, that they 'are still not in a position to demonstrate that it was the training itself that resulted in the positive changes, merely that those institutions that had training also had teachers who improved' (p. 99).
A final category of work on the impact of academic development activities focuses on 'impact frameworks'. Chalmers and Gardiner (2015) point to three, those of Kirkpatrick and Kirkpatrick (2006), Stes and colleagues (2007) and Guskey (2002), saying that 'they provide limited guidance on sources of evidence, contextual factors or the time frame in which impact or change can be expected' (p. 4). They then offer a discussion of one part of an Academic Development Professional Development Framework developed as part of a project funded by the Australian Government's Office for Learning and Teaching (OLT). The element of the Framework reported in the article was designed for use with formal university teacher preparation programmes (e.g., 'accredited Graduate Certificates or Foundations programmes'). A second element, designed to be used with 'Informal programs … (which) include short (one to three hour) workshops or seminars, on-line courses or resources, special events such as teaching week or visiting scholars, communities of practice and ad hoc events … ', can be found in the OLT project report (Chalmers et al., 2012, p. 29). The Framework proposes four types of indicator (input, process, output and outcome) mapped against a number of foci. Both the report (Chalmers et al., 2012) and the article (Chalmers & Gardiner, 2015) use the concepts 'evaluation' and 'impact' without distinguishing between them. They do identify a number of sources of evidence which can be used for assessing the impact of formal programmes, but only one, 'Staff perception of their teaching skills following participation in TPPs', for assessing impact in informal programmes (the usual form taken by supervisor development activities; Chalmers et al., 2012, Appendices C and D). Professional development for research degree supervisors is not addressed in either publication.
Overall, it is hard to disagree with Chalmers and colleagues' comment that 'there appears to be little systematic investigation of the impact of programmes over time' (2012, p. 4). For a variety of reasons (simple neglect, difficulty in measurement and an associated lack of evidence, and slippage between evaluation and impact), the literature on the impact of professional development in the area of higher degree supervision remains extremely limited both in scale and in scope. The next section presents a typology of supervisor development and situates the Supervising@UniSA workshop within that typology before moving to the study of its impact.

Contexts of supervisor learning and Supervising@UniSA
To position the Supervising@UniSA workshop in the broader field of research degree supervisor professional development, a two-dimensional typology is developed. The first dimension is identified by Wisker and Kiley (2012, p. 2), who highlight a tension between the emphasis on initial development or induction and the longer term, evolving nature of the development of improved supervisory practice. They write that the induction phase involves the orientation of an individual or group of individuals towards their institution and their future practice within the institution, while the development phase involves the improvement of practice through learning and reflection throughout an academic career. We characterise these two phases as 'initial development' and 'continued professional development' (CPD).
The second dimension relates to the subject matter or content of supervisor development activities which may be either 'narrow' or 'broad'. Narrow approaches focus on the introduction of supervisors to the immediate context of their supervisory practice, their institution's policies and processes, and on 'tips' about how to supervise. Broad approaches encompass all aspects of the narrow approach, but extend them through consideration of the changing higher education environment for doctoral education, the pedagogy of the research degree, and most commonly involve the use of case-study scenarios to encourage reflection on practice. The broad approach is encouraged in McCormack and Pamphilon's (2004) article identifying the common elements of supervisor development programmes. These are the provision of information and resources; the development of skills; and the 'scholarship of supervision pedagogy … [through] reflection/critical reflection' (p. 24).
The two dimensions produce the typology shown in Figure 1. Initial development or induction is the starting point and can be either narrow or broad in focus. In developmental terms, the aim should be to move staff from the initial development phase and towards CPD activities. While these can be either narrowly or more broadly focused, the objective of good development activities should be to encourage a culture and practice of learning such that development and reflection around all aspects of academic practice become a natural and taken-for-granted part of a career. Ideally, institutional policy should encourage individual academics to move from the bottom left category towards either or both of the top two categories.
As an induction workshop which is compulsory for all academic staff new to UniSA or new to supervision at UniSA, Supervising@UniSA has been designed to introduce participants to the broader higher education environment for doctoral education and the pedagogy of supervision and to promote development beyond the initial, narrow, phase. Participants are introduced to UniSA offerings in the broader area of supervisor development and encouraged to view the induction workshop as an opportunity to spend a day reflecting on this important part of their professional practice, rather than as a hurdle to be overcome or a session that will teach them 'how to supervise'. The workshop is explicitly positioned as an activity enhancing knowledge and learning about supervision and encouraging reflection on pedagogies of practice, something which challenges some participants' preconceptions. While the 'informational' elements necessarily take a more didactic approach, the development aspects are explicitly based on a constructivist paradigm of participant learning. Figure 2 situates Supervising@UniSA in the typology developed above.
Supervising@UniSA has been offered at UniSA since 2003. By 2009, it had developed and taken the shape it retained until the end of 2012, the workshop evolving in response to external policy drivers, feedback from evaluations and changes to the facilitating/delivering team. It is usually offered four or five times a year, runs for a full day and is fully catered to encourage participants to eat together and thereby engage with one another. The workshop is divided into four distinct elements. The first involves a brief introduction to the national context for doctoral education and to the environment, structures and processes relevant to research degrees at the University. This 'narrow' element is typically delivered by the Dean of Graduate Studies and staff from the University's Graduate Research team. The second 'broad' element considers the development of doctoral education from its origins in the apprenticeship model to the current emphasis on research training, the scholarship of research degree supervision, and acts to focus attention on the issues associated with the challenges facing supervisors and institutions in a period of increasing diversity in research degrees. This session is delivered by the first author of this paper.
Following lunch, the third element of the programme (titled 'From the Chalkface') involves one or more experienced UniSA supervisors discussing practical responses to issues they have faced in doctoral supervision. The fourth and final element involves participants working in small groups on a selection of case-study scenarios to develop, discuss and share strategies for helping students to overcome situations typical of those faced in doctoral study. This session is facilitated by the second author. The workshop is offered and badged as part of UniSA's Research Education Support Activities (RESA) programme which is the organisational 'home' for all UniSA's research degree development activities and support services for students and supervisors (see Bastalich et al., 2010). Programme evaluation sheets are completed by participants at the close of the workshop. These elicit immediate feedback on which elements of the programme had been most useful, which could be changed, and issues and topics that could be the subject of other CPD activities. A question about the overall satisfaction and usefulness of the workshop is also asked.

The questionnaire
The study was conducted between January and June 2013. A link to an anonymous online questionnaire was emailed to all staff who had attended Supervising@UniSA in the four years 2009-2012 and who had retained an employment or other professional relationship with the University into 2013. The project received ethics approval from the UniSA Human Research Ethics Committee. Background information was gathered on the UniSA Division in which respondents were based, their gender, whether they held a PhD and, if so, whether it had been awarded by an Australian institution or elsewhere. Respondents were also asked whether they had held a PhD at the time they attended Supervising@UniSA and, if so, whether at that point in time they had held it for more than five years. Year of attendance was asked, as was whether they had been supervising research students when they attended.
The second section of the questionnaire was designed to examine impact. Questions were asked to ascertain whether participation had contributed to individuals' knowledge and understanding of research degrees and of the UniSA resources provided to support research students and their supervisors. Respondents were asked whether they had drawn on the workshop in their subsequent practice, whether they had suggested to others that they should undertake development activities, and whether they had actually undertaken any themselves. Finally, they were asked whether, overall, Supervising@UniSA had proved a useful resource. It is important to note that the study is based on reported behaviour rather than observed or directly measurable change, but the data contained in this article concerning the impact of supervisor development workshops on participant learning and professional practice represent the best reported to date.
The data from the questionnaire were coded and entered into SPSS, which was used to provide the descriptive statistics reported in this article. Because the responses to the questionnaire generated ordinal data, where the analysis is comparative in nature it is reported in terms of 'more than' and 'less than' rather than being subjected to a more sophisticated statistical analysis, which would also have required a greater number of respondents. Where appropriate, that is where it adds to understanding, the numbers and percentages of those not answering any particular element in the questionnaire are reported. Where this is not important, as, for example, in Table 3, which reports the extent to which respondents had drawn on elements of Supervising@UniSA, non-responses are not reported.

The respondents
There were a total of 94 responses. When compared to the University's 2013 academic staff profile, females are slightly underrepresented. Response rates from the UniSA Business School, the Division of Education, Arts and Social Sciences, and the 'other', non-Divisional, category reflected the proportions of staff in post in 2013. The proportions of respondents drawn from the Divisions of Health Sciences, and Information Technology, Engineering and the Environment, were significantly below the proportions of attendees working in those Divisions. It may be that respondents from those two Divisions were less willing to identify their Division and may, therefore, represent the bulk of the one in five respondents who chose not to answer the question about location in the institutional structure. Table 1 shows the total numbers, the number of respondents attending the workshop in each of the years 2009-2012 and each year's response rate. It is acknowledged that the response rate is relatively low, particularly from the earlier cohorts. This reflects, at least in part, staff turnover: a participant who attended Supervising@UniSA in 2009 or 2010 was less likely still to be employed at UniSA in 2013, when the survey was undertaken, than a 2012 participant.
Since possession of a PhD is a pre-requisite for doctoral supervision at UniSA, the proportion of attendees holding a doctoral qualification was high at 94%. Three-quarters of these PhDs had been awarded in Australia and the remainder overseas. Almost three out of every five respondents held PhDs awarded in the five years prior to workshop attendance and two out of five had been awarded their PhD more than five years before. Respondents were almost equally divided between those who were supervising at the time of their attendance (with about three-quarters of these being experienced supervisors) and those who were not supervising.

Analysis of the responses
This section is organised under three headings: (1) contribution to understanding as a result of attending; (2) drawing on the workshop in academic practice; and (3) encouraging others to attend. A short discussion of overall satisfaction is also included.

Contribution to understanding
The extent to which respondents reported that attending Supervising@UniSA had contributed to their understanding of the different elements addressed is shown in Table 2. Overall, even when the 'not answered' responses are combined with the 'did not contribute [to my understanding]' responses, the maximum 'negative' response is of the order of a quarter. Approximately four-fifths of respondents reported a contribution (and in many cases a large contribution) to their understanding of the national context for and development of doctoral education, and the RESA programme for research students. The largest figure was for the contribution to their understanding of the University's policies and procedures (94%).

Drawing on Supervising@UniSA in academic practice
The question of the extent to which participants had drawn on Supervising@UniSA was addressed in a number of questions based on the various elements of the workshop. These elements and the responses are shown in Table 3; it can be seen that the sessions involving discussion with experienced supervisors and the introduction to the University's policies and procedures were those most drawn on by participants in their own supervisory practice. At the other end of the scale, fewer respondents reported having drawn on the national context and development of doctoral education elements or on their networking with other participants. That said, at least half of the respondents said that they had drawn on each of the elements in the period since attending the workshop. Information about grants and scholarships scored almost as highly as the experienced supervisor sessions, while information about online resources and the case studies lay somewhere in between. The most important conclusion to draw from Tables 2 and 3 is that each of the elements of Supervising@UniSA (which, taken together, represent a broad view of supervisor development) contributed to the participants' understanding of research degree supervision. These elements had also been drawn on in their subsequent practice by at least 50%, and by as much as 84%, of the participants.
Acting on Supervising@UniSA

Table 4 looks at two aspects of post-workshop behaviour, although it should be pointed out that, as there are no baseline data from which to assess change, these are as much an indicator of the perception of the worth of RESA and Supervising@UniSA as they are a measure of impact. One aspect was whether participants had recommended RESA activities and resources to either research students or fellow supervisors. Recommending an action is a conscious act on the part of the recommender and can be taken as an indicator of the perceived value of the action being recommended. The point that Supervising@UniSA is part of the RESA programme is highlighted during each workshop. This frequently generates discussion on the merits and perceived benefits of the broader RESA programme for research students, leading to the conclusion that most participants would know what they were recommending. Table 4 suggests that RESA (and by implication Supervising@UniSA) is perceived as having relatively high value.
In the realm of behaviour subsequent to the workshop, respondents were asked whether attending Supervising@UniSA had encouraged them to pursue further professional development in the area of supervision. One in five said that attendance had strongly encouraged them, almost half that it had encouraged them and just over a quarter that it had not encouraged them. Feeling encouraged to do something does not necessarily lead to action; however, in this case, one in five respondents had undertaken further voluntary professional development in the area of supervision since attending the compulsory induction workshop. The remaining four-fifths stated that they had not done so, or chose not to answer the question.

So, in retrospect, was it worth it?
When asked, in retrospect, whether attending Supervising@UniSA had proved useful, 72% agreed that it had (32% strongly) with about 20% disagreeing (14% strongly). Eight per cent of respondents did not answer. These figures are less positive than the post-workshop evaluations that typically return 'useful' ratings around the mid-90% range, but the figures are sufficiently high to suggest that the workshop remains highly regarded even with the passage of fairly considerable amounts of time.

Discussion and concluding comments
This research was undertaken because of the paucity of evidence on which to base judgements about the impact of supervisor development activities. The results strongly suggest that a single one-day induction workshop, designed both to induct staff into the institutional setting for doctoral education and to involve them in developmental activities relevant to research degree supervision, while also pointing them towards resources and development opportunities for themselves and their research students, can have a medium- to long-term impact (see the typology in Figure 2). This impact can be identified in terms of contribution to understanding, and also in the participants' behaviour and professional practice.
Moving beyond the measurement of immediate reaction, this longer term impact was identifiable in both learning and behaviour (Bromley & Metcalfe, 2012) as regards both understanding and the elements of the workshop which had been drawn on since attending. Impact was also seen in terms of recommendations being made to others (both supervisors and, to a greater degree, research students) to attend development activities.
Responses showed that the longer term impacts were evident across both the 'narrow' and the 'broad' dimensions of the typology of supervisor development, although the narrow and more immediately practical elements were drawn on most. Overall, the most highly valued elements were those concerned with the everyday practice of supervision, that is, the 'from the chalkface' session and the case studies and associated small group discussions. In the typology, these sit within the narrow/initial development quadrant, but there is sufficient evidence in the results to support the claim that the workshop also engenders learning addressing the broad and CPD dimensions. It is gratifying for the authors to find that, for participants looking back at the workshop between one and four years after attending, about three-quarters continued to view attendance as having been useful. As noted earlier, Supervising@UniSA receives very high post-event evaluations in terms of usefulness. While the percentage of respondents to the current survey reporting that participation had been useful was lower than that immediately post-workshop, at 72%, it remained very high. We also found high positive scores as regards contributions to learning and the elements drawn on since participation (Tables 2 and 3). These findings lead us to support Rust's conclusion that academic development 'workshops are successful tools of change in teaching practice and (that) post-workshop evaluations operate as indicators of impact' (1998, p. 79) in the specific area of supervisor development. The finding that post-workshop evaluation plays a role as an indicator of impact will cheer those who have assiduously collected and reported evaluation data over the years.
The project also permits us to claim to have made a methodological contribution to the measurement of the impact of supervisor development activities. We have shown that it is not necessary (indeed, we have argued that it is not helpful) to seek to examine impact simply by indirect methods. By unpacking the notion of impact and discussing it in terms of 'contribution to understanding' and the 'extent to which participants have drawn on' the event, and by examining participants' responses at a distance of between one and four years, we have been able to demonstrate the link between a specific development activity and changes in learning and behaviour. This addresses the second and third levels of impact discussed by Bromley and Metcalfe (2012). We are not able, however, to address the issue of 'outcome' (the fourth level) in the context of Supervising@UniSA due to the number of variables at play and the high degree of attenuation between the activity and any outcomes. This fourth level of impact remains to be addressed.
While much further work remains to be undertaken in the area, this article demonstrates that supervisor development activities can lead to longer term learning and do have an impact on supervisory practice. Furthermore, it shows that impact can be identified long after the event. The article began by asking the question, 'does research degree supervisor training work?' Our research allows us to conclude that it does and we make two recommendations. First, that universities should maintain and enhance their academic development provision in the area of research degree supervision. Second, that strong encouragement should be given to current and intending supervisors to engage regularly with professional development in this key area of academic practice. Given the importance of good supervision to the research degree student experience, we owe nothing less to our doctoral candidates.