HEA Annual Conference 2008

Plagiarism Policies: Looking for intra-institutional consistency
Jo Badge and Jon Scott.

School of Biological Sciences, University of Leicester, University Road, Leicester, LE1 7RH, UK.


Introduction
Plagiarism is said to be on the increase in higher education (Park, 2003; Carroll, 2004; Hart & Friesner, 2004; Duggan, 2006; Maurer, Kappe & Zaka, 2006). One response to this perceived increase has been the use of automated plagiarism detection methods (Culwin & Lancaster, 2000; Maurer et al., 2006). In the UK, over 80% of Higher Education Institutions (HEIs) now subscribe to one such system, TurnitinUK (Dawson, 2007). This system, when implemented fully across a whole school or institution and combined with effective teaching on good academic practice, can lead to a decrease in the incidence of plagiarised work (Badge, Cann, & Scott, 2007). Such systems provide a quick and objective measure of the proportion of text in a submission that matches one or more other sources, which may or may not be adequately referenced. The member of staff marking the work must, however, still assess the reports critically to judge whether plagiarism has occurred, taking account of both the quality of the referencing and the extent of the match. Whilst electronic detection provides a strong deterrent to students (Dahl, 2007) and can highlight cases more quickly, how these cases are dealt with once detected varies widely between institutions (Jones, 2006).
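
Purely as an illustration of the kind of "proportion of matching text" figure such systems report (TurnitinUK's actual matching algorithm is proprietary and is not described here), a naive word n-gram overlap between a submission and a single source can be sketched as follows; the example texts are invented:

# Illustrative sketch only: a naive word 5-gram overlap measure. TurnitinUK's
# matching algorithm is proprietary; this merely shows the kind of
# "proportion of matching text" figure that such systems report.

def ngrams(text, n=5):
    """Return the set of word n-grams in a piece of text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def match_proportion(submission, source, n=5):
    """Fraction of the submission's word n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)

# Toy example: a high figure still requires human judgement, since the matched
# text may be a properly referenced quotation rather than plagiarism.
essay = "the glycolytic pathway converts glucose to pyruvate in the cytoplasm of the cell"
source = "in most organisms the glycolytic pathway converts glucose to pyruvate in the cytoplasm"
print(f"Match: {match_proportion(essay, source, n=5):.0%}")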

Many institutions are revising their plagiarism policies in response to these new drivers. However, plagiarism policies are difficult to write and implement: they need to be flexible enough to deal with a very wide range of circumstances and yet provide clear guidance to staff and students on what constitutes good and bad practice. The penalty given can depend on the level of study of the student (development phase, undergraduate, postgraduate); the severity and extent of the plagiarism (large chunks of copy and paste, bought or ghost-written essays, minor paraphrasing problems or poor note taking); the type of work completed (coursework that counts towards a final module mark or is required for progression, or dissertation work where a single piece counts directly towards the final degree mark); and the previous history of the student in terms of plagiarism offences. Dealing with these difficult constraints has led to a range of different solutions, with some institutions operating blanket top-level policies and others devising complex tariff systems (Tennant, Rowell & Duggan, 2007).

The Independent Adjudicator for Higher Education has repeatedly called for an investigation into the consistency of penalties applied for cases of plagiarism across the sector (Baty, 2006; EducationGuardian.co.uk, 2006). In response to this call, the JISC Plagiarism Advisory Service conducted a survey of UK HEI plagiarism policies and penalties. The Academic Misconduct Benchmarking Research (AMBeR) Project examined the published policies and procedures of 91% of UK HEIs (Tennant, Rowell, & Duggan, 2007). Two scales were constructed to permit numerical analysis of the policies and penalties: a scale of offences and a scale of penalties. Substantial variation in both the penalties available and the regulations used across the sector was demonstrated (ibid.). The penalty of expulsion was almost universal (99% of HEIs citing it as a possible penalty), with assessment-level penalties, such as the assessment mark being reduced to zero or a fail (with or without resit), the next most common. Analysis of the penalty systems showed that institutions could be divided into three groups related to the type of institution. Group A comprised small specialist institutions with very open policies, allowing for any possible penalty at any possible level/severity. Group B comprised research-intensive institutions with loosely prescriptive policies. Group C comprised teaching-intensive institutions that commonly had stepped, highly prescriptive policies (ibid.). It was clear from this part of the study that similar offences could produce very different outcomes depending on the type of institution in which they occurred.

The second phase of the AMBeR study was published in May 2008 (Tennant & Duggan, 2008). The response rate was much lower (59%) but still representative of the sector. HEIs were asked to report the penalties recorded in cases of plagiarism during one chosen academic year from the last three. Many institutions were not able to provide the level of detail requested, particularly on the previous history or level of the offence. In the first part of the AMBeR study, 72% of institutions stated that previous history should be taken into account when considering the level of penalty for a piece of work (Tennant, Rowell & Duggan, 2007). However, just over a quarter (27.6%) of these same institutions could not supply information on the number of first or subsequent offences in the second part of the study. This highlights the real need for accurate and accessible records to be kept if these factors are to be considered when deciding on penalties.

The AMBeR studies and other studies (e.g. Jones' (2006) examination of law departments in Scottish institutions) point to a lack of inter-institutional consensus on plagiarism-related policies and procedures. Anecdotal evidence from many institutions and our own experience suggest that intra-institutional diversity also exists. As research in this area is limited, we set out to investigate whether the differences seen at a national level might also be reflected between departments or sections of a single institution.
Methods

A survey of local policies was carried out at the University of Leicester, a ‘Group B’ (research-led) institution in terms of the AMBeR study. Practical ‘operating units’ for learning and teaching within the institution were determined by close analysis of the university’s internal faculty websites. Clerical staff in each department, institute, centre or faculty were contacted to locate the members of staff responsible for dealing with plagiarism. The thirty staff responsible for dealing with plagiarism at a local level were then contacted individually by email with a request for copies of any plagiarism policies held locally.

The policies and responses received were analysed, and an online questionnaire was devised to gather standardised data on plagiarism practices. The questionnaire contained 18 questions and was split into five sections covering roles, policies, electronic detection, procedures for handling plagiarism cases, and penalties. Respondents could leave open comments after the majority of the questions to clarify their answers further. The online questionnaire was hosted on an internal content management system, and access required a university username and password. The survey was piloted with one member of staff responsible for devising plagiarism policy, and revisions were made to improve the clarity of some questions. The staff identified in the initial survey were contacted again individually by email and asked to take part in the online survey. The questionnaire data were analysed using Microsoft Excel.
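
The analysis itself was carried out in Microsoft Excel; as a minimal sketch of the equivalent tally for the multiple-choice questions (the file name and column layout below are hypothetical, not the study's actual export):

# Sketch of the kind of tally applied to the questionnaire responses. The study
# used Microsoft Excel; this Python equivalent assumes a hypothetical CSV export
# ("survey_responses.csv") in which each row is one respondent and a column such
# as "documents_used" holds a semicolon-separated multiple-choice answer.
import csv
from collections import Counter

def tally_multiple_choice(path, column, sep=";"):
    """Count how many respondents ticked each option in a multiple-choice column."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for choice in row[column].split(sep):
                if choice.strip():
                    counts[choice.strip()] += 1
    return counts

# Hypothetical usage (n = 22 undergraduate-facing respondents):
# for option, n in tally_multiple_choice("survey_responses.csv", "documents_used").most_common():
#     print(f"{option}: {n}/22")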


Results

The response rate to the online survey was 87% (26/30). The majority of respondents (22) had some responsibility for undergraduate students, though many also covered other levels of study and distance learners, whilst four were solely concerned with postgraduate students. The responses from the 22 participants involved in undergraduate studies were used for further analysis.


Respondents were asked some general questions about the practicalities of dealing with plagiarism in their department. For example, 18 departments responded that TurnitinUK was in use for electronic detection of matching text. Participants were asked to describe how TurnitinUK had been implemented in their department. The majority of respondents submitted all work for scanning (figure 1). Interestingly, no department offered students the facility to check their own work, and some commented that they were not aware that this was practicable.



Figure 1: Count of the responses to the question 'How do you select which student work to screen?' [with TurnitinUK]. n=22, respondents could select all the options relevant to them.
Given the difficulties involved in applying penalties outlined above, we asked whether any factors were taken into account when deciding on tariffs. The majority of departments responded positively (19/22), and many commented that they looked at each case individually, taking into account the previous history and the level and severity of the offence.

In common with other Group B institutions identified in the AMBeR study, the University of Leicester sets its plagiarism policy at an institutional level and requires other factors to be taken into account when a penalty is being considered. This top-level policy allows departments a degree of flexibility up to a maximum suggested penalty. For example, for a first offence, Boards of Examiners may impose a penalty of up to a zero mark for the module. This could encompass a wide range of penalties and still fall within the institutional policy.

An important part of the questionnaire was therefore to ask what documentation respondents used when dealing with plagiarism. The University’s Code of Practice on Plagiarism clearly takes precedence, and the Statement on Academic Honesty is used in departmental handbooks. There was great overlap in the documentation used, as shown in figure 2, but local variation was recorded. Nine respondents reported that their policies and practices had been revised or updated within the last two years, demonstrating that this is an evolving area.



Figure 2: Venn diagram showing the responses to the question 'Which documents does your department consult when dealing with plagiarism cases?' (n=22, respondents could choose all documents that applied).

Participants were asked about the penalties available for use in their department when dealing with plagiarism. A list of 19 penalties was constructed, in line with those used in the AMBeR survey, all of which fell within the institutional policy. These penalties were classified into broad areas of effect to provide a comparison with the AMBeR scales (see figure 3).






Figure 3: Diagram showing the potential penalties available for plagiarism within the institutional policy. Penalties were grouped into classes (warning, assessment, module, etc.).
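
One way such a classification can be represented, purely for illustration, is a simple mapping from each locally available penalty to one of the broad classes of figure 3; the penalty names below are hypothetical examples, not the study's full list of 19:

# Sketch of grouping individual penalties into the broad classes of figure 3 so
# that counts can be compared with the AMBeR penalty scale. The penalty names
# are illustrative examples only; the full list of 19 is not reproduced here.
from collections import Counter

PENALTY_CLASS = {
    "verbal warning": "warning",
    "written warning": "warning",
    "mark for the piece of work reduced": "assessment",
    "zero mark for the piece of work": "assessment",
    "resubmission required": "assessment",
    "zero mark for the module": "module",
}

def class_counts(selected_penalties):
    """Count how many of a respondent's selected penalties fall into each class."""
    return Counter(PENALTY_CLASS.get(p, "other") for p in selected_penalties)

# Hypothetical respondent choosing two penalties for a first offence:
print(class_counts(["written warning", "zero mark for the piece of work"]))
# Counter({'warning': 1, 'assessment': 1})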

Participants were asked to state which penalties would be available for a first offence of plagiarism. Respondents could choose multiple penalties from the list, and an open comment box enabled participants to expand on their choices. Most participants commented that, where more than one penalty had been selected as available, the choice of penalty would depend on the level of the student, the type of work and the degree of plagiarism. Some penalties may be applied together, such as a written warning and the downgrading of a piece of work, and some may come into effect as a consequence of other actions. For example, a fail mark for a piece of coursework may lead to a requirement for resubmission for the purposes of progression.

The most commonly cited penalties were assessment-class tariffs, and the second most commonly available penalty was some form of warning. Participants were also asked to choose the penalties available for second, third and subsequent offences. However, as several respondents reported that a second case of plagiarism had not been encountered in their department, the data for subsequent offences were incomplete.
Discussion

In light of the inter-institutional variation in policies and in the application of penalties for plagiarism observed through the AMBeR studies (Tennant, Rowell & Duggan, 2007; Tennant & Duggan, 2008), this research project set out to investigate practices and policies at an intra-institutional level. The study demonstrated that, even where a top-level institutional policy is in place, local practice varies in the penalties available for a first offence. This is perhaps to be expected, given the constraints of working with complex cases and the range of factors that must be taken into account when applying penalties to individual cases.

As the Office of the Independent Adjudicator has called for inter-institutional consistency, there is also a case for closely examining intra-institutional consistency. Just as students could rightly argue that differential treatment for similar offences between institutions is unfair, the same is true within institutions. The question that needs to be addressed in each institution is how practice across different faculties and subject areas can be given the flexibility required whilst maintaining a sustainable and consistent judgment process when tariffs are decided. One key to this question is accurate and accessible record keeping of offences and processes at a local and institutional level, enabling an informed comparison of the range of penalties being imposed.
References:

Badge, J. L., Cann, A. J., & Scott, J. (2007). To cheat or not to cheat? A trial of the JISC plagiarism detection service with biological sciences students. Assessment & Evaluation in Higher Education, 32(4), 1-7.

Baty, P. (2006). Inconsistent penalties raise risk of legal action, Deech says. The Times Higher Education Supplement, 23 June.

Carroll, J. (2004). Deterring, Detecting and Dealing with Plagiarism. Oxford Brookes University. Retrieved 14 January, 2008, from http://www.brookes.ac.uk/services/ocsd/2_learntch/plagiarism.html

Culwin, F., & Lancaster, T. (2000). A review of electronic services for plagiarism detection in student submissions. LTSN-ICS 1st Annual Conference, Heriot-Watt University.

Dahl, S. (2007). Turnitin(R): The student perspective on using plagiarism detection software. Active Learning in Higher Education, 8(2), 173-191.

Dawson, P. (2007). TurnitinUK - electronic plagiarism detection. Retrieved 10 January, 2008, from http://www.jiscpas.ac.uk/turnitinuk.php

Duggan, F. (2006). Plagiarism: prevention, practice and policy. Assessment & Evaluation in Higher Education, 31(2), 151-154.

EducationGuardian.co.uk. (2006). Conference to tackle university plagiarism problem. Retrieved 10 January, 2008, from http://education.guardian.co.uk/higher/news/story/0,,1924352,00.html

Hart, M., & Friesner, T. (2004). Plagiarism and poor academic practice – A threat to the extension of e-learning in higher education? EJEL, 2(1), 89-96.

Jones, M. (2006). Plagiarism proceedings in higher education – quality assured? Second International Plagiarism Conference, Gateshead. 123-130.

Maurer, H., Kappe, F., & Zaka, B. (2006). Plagiarism - A survey. Journal of Universal Computer Science, 12(8), 1050-1084.

Park, C. (2003). In other (people's) words: plagiarism by university students - literature and lessons. Assessment & Evaluation in Higher Education, 28(5), 471-488.

Tennant, P., Rowell, G., & Duggan, F. (2007). Academic misconduct benchmarking research project: Part I. The range and spread of penalties available for student plagiarism among UK higher education institutions. Northumbria: JISC Plagiarism Advisory Service. Available online at: http://www.jiscpas.ac.uk/AMBeR/index.php (last accessed 11 June 2008).


Tennant, P., & Duggan, F. (2008). Academic misconduct benchmarking research project: Part II. The recorded incidence of student plagiarism and the penalties applied. Academy JISC Academic Integrity Service. Available online at: http://www.heacademy.ac.uk/ourwork/learning/academic_integrity (last accessed 11 June 2008).

