
Massive Effort In Academic Integrity Research Finds Urkund Best Overall Plagiarism Tool Across 8 Languages, Top In 3



Updated on September 15. Urkund has acquired PlagScan to become Ouriginal. Learn more here.

If you are considering a plagiarism plugin, you might have a simple goal in mind: To crack down on cheating. A reasonable goal indeed. Unfortunately, it is one that might miss out on the complexities, as well as the opportunities, of online learning in the 2020s.

The leading players in the plagiarism, or better yet, “originality management” space have evolved fast in recent years. Some of them, however, have benefited from the limited information available to consumers, taking advantage of this asymmetry to downplay some of their risks. They clearly provide benefits and efficiencies: they can compare thousands of student deliverables against massive datasets and produce originality scores in a matter of seconds. But underneath this layer of information, several issues linger:

  • How good is the tool, really, in terms of error rates? That is, both undetected plagiarism and original work incorrectly deemed plagiarized (false negatives and false positives; a rough sketch of how such rates are computed follows this list).
  • How verifiable are its results?
  • How does it treat student data in order to achieve the results?
  • Is the tool easy to use overall, and are its results intuitive and easy to make sense of?
  • How good, efficient and reliable is the tool’s computational performance?
  • What kind of non-punitive mechanisms does it offer to encourage originality?
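To make the first question concrete, here is a minimal sketch of how false-positive and false-negative rates might be computed for a similarity checker. Everything in it is hypothetical for illustration: the scores, the labels and the 0.50 threshold are invented, not taken from the study.

```python
# Minimal sketch: error rates for a similarity checker.
# All scores, labels and the threshold are invented for illustration.

# Each entry: (similarity score reported by the tool, actually plagiarized?)
test_set = [
    (0.92, True),   # verbatim Wikipedia copy
    (0.15, False),  # fully original essay
    (0.62, False),  # original text with long, properly quoted passages
    (0.38, True),   # heavily paraphrased single source
    (0.71, True),   # patchwork of multiple sources
]

THRESHOLD = 0.50  # flag anything above this as suspected plagiarism

false_positives = sum(1 for score, plag in test_set if score > THRESHOLD and not plag)
false_negatives = sum(1 for score, plag in test_set if score <= THRESHOLD and plag)
originals = sum(1 for _, plag in test_set if not plag)
plagiarized = sum(1 for _, plag in test_set if plag)

print(f"False-positive rate: {false_positives / originals:.0%}")    # original work wrongly flagged
print(f"False-negative rate: {false_negatives / plagiarized:.0%}")  # plagiarism that slipped through
```

On this toy set, the quote-heavy original gets wrongly flagged (a false positive) while the careful paraphrase slips through (a false negative), which is exactly the trade-off the accuracy criteria try to capture.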

Looking to provide some answers to these questions, a nine-person team of researchers from universities in Czechia, Germany, Latvia, Mexico, Slovakia and Turkey took on a two-year, large-scale tool testing project named “Testing of support tools for plagiarism detection.” The project started in 2018 with funding from the European Union’s Erasmus+ Program, and the European Network of Academic Integrity (ENAI) released the first report on its outcomes last February.

Beyond text-matching: The ‘value chain’ of original authorship checking

A broad survey by the authors of the state of the art in evaluating and testing original-authorship tools, going back decades, suggests a critical starting issue: a seeming lack of academic accountability for these tools. While there are evaluation attempts dating as far back as 1999, few works provide methodologically sound testing frameworks. Given how fast these tools evolve, though, this might not be that big of a deal.

The main driver of innovation for these tools, in a way, has been student ingenuity. Each new iteration of tools was mainly the result of people finding more sophisticated ways to fool the algorithm:

  1. The tool searches for the student submission on an online database. Yes, it probably starts with Wikipedia.
  2. The tool looks for word alternatives within paragraphs.
  3. The tool accounts for technical loopholes such as homoglyphs: characters with different character codes that look exactly alike (see the normalization sketch after this list).
  4. The tool checks for combinations of multiple sources. And so on.
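As an illustration of the homoglyph loophole in item 3, here is a minimal sketch of the kind of normalization a checker might apply before matching. The mapping table is a tiny hypothetical sample; real tools would draw on a complete confusables dataset such as the one Unicode publishes.

```python
# Minimal sketch: normalizing homoglyphs before text matching.
# The mapping is a tiny hypothetical sample; production tools would
# use a full confusables table (e.g. Unicode's confusables data).

HOMOGLYPHS = {
    "\u0430": "a",  # Cyrillic а -> Latin a
    "\u0435": "e",  # Cyrillic е -> Latin e
    "\u043e": "o",  # Cyrillic о -> Latin o
    "\u0440": "p",  # Cyrillic р -> Latin p
    "\u0441": "c",  # Cyrillic с -> Latin c
}

def normalize(text: str) -> str:
    """Replace known look-alike characters with their Latin counterparts."""
    return "".join(HOMOGLYPHS.get(ch, ch) for ch in text)

# A submission with Cyrillic letters swapped in to dodge exact matching:
disguised = "Plagiarism d\u0435t\u0435\u0441ti\u043en"  # renders just like "detection"
print(disguised == "Plagiarism detection")             # False: fools naive matching
print(normalize(disguised) == "Plagiarism detection")  # True: caught after normalization
```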

The researchers make the case that “plagiarism,” while the most common term, is not technically the most appropriate, since under no circumstance can any of these tools authoritatively claim plagiarism. That role remains firmly in the hands of the teacher. The research therefore favors the term “similarity checker.”

The test

Of the 63 systems initially contacted, 15 accepted and were cleared as participants. Each system evaluated sets of documents in 8 different languages. Each set had 7 different documents, ranging from verbatim Wikipedia content to fully original texts, as well as combinations of sources and automatic or manual variations.

The evaluation used a 0-to-5 scale, 5 being the best, across several criteria including accuracy and false positives. The results were then averaged for each tool per language, as the sketch below illustrates. Usability was assessed independently. Testing took place between June 2018 and November 2019, with the latest release of each tool available at the time.
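As a rough illustration of that aggregation step, the sketch below averages hypothetical per-document scores into a per-language score for one tool. The numbers are invented for the example; they are not the study’s data.

```python
# Rough sketch of the per-language aggregation: a tool gets a 0-5
# score for each test document, and scores are averaged per language.
# All numbers below are invented for illustration.

scores = {  # language -> one tool's scores across that language's document set
    "English": [5, 4, 4, 3, 5, 4, 4],
    "Czech":   [3, 4, 2, 3, 4, 3, 3],
}

for language, doc_scores in scores.items():
    average = sum(doc_scores) / len(doc_scores)
    print(f"{language}: {average:.2f} / 5")
```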

The results

With all the results computed together, here are the top 3 tools per language, in order:

  • Czech: StrikePlagiarism.com, Urkund, PlagAware
  • English: PlagScan, Turnitin, Urkund
  • German: PlagAware, Turnitin, PlagScan
  • Spanish: Urkund, Turnitin, PlagScan
  • Italian: PlagiarismCheck.org, PlagScan, StrikePlagiarism.com
  • Latvian: PlagiarismCheck.org, Urkund, PlagScan
  • Slovak: Urkund, Unicheck, Turnitin
  • Turkish: Urkund, Turnitin, Akademia

The researchers recognize Urkund for its readiness on Wikipedia detection, paraphrasing, synonyms and translated works. Urkund also yields the best overall results in both single-source and multiple-source comparisons.

Regarding usability, Urkund shares the maximum score in “Workflow” with Docoloc (Docol©c), DPV, PlagScan and Unicheck, and also tops the “Presentation of results” rubric along with PlagScan.

Despite getting minuses for not having “clearly stated pricing” and not offering a free trial on its website, it complies with the rest of the “Other aspects” criteria, something only Unicheck and PlagiarismCheck.org match:

  • API
  • Moodle integration
  • Phone-based support (Hotline)
  • English service available
  • Perfect grammar on website
  • No ads
  • Offline side-by-side evidence

While the research is clear on the need to evaluate these tools for specific procurement scenarios, the researchers classify them into three groups: Useful, Partially useful, and Marginally useful. In the top group they include Docol©c, PlagScan, Turnitin, Unicheck and Urkund. They also name the best tools for specific languages:

  • PlagAware for German
  • PlagScan for English, Italian
  • PlagiarismCheck.org for Italian, Latvian
  • Strikeplagiarism.com for Czech, Italian
  • Urkund for Slovak, Spanish, and Turkish

The report leaves the door open for a debate on more “intrinsic” approaches to the issues of originality and academic integrity, particularly the motivations behind seeking reward for work that is not one’s own. It ends with a series of suggestions for the tools to improve their scores.

Disclaimers

The researchers disclose that Turnitin, Urkund, PlagScan and StrikePlagiarism.com are regular supporters of the Plagiarism across Europe and Beyond conference, which many of the study’s authors organize. One of them even received a “Turnitin Global Innovation Award.” Furthermore, licenses for all the tools compared were provided free of charge by each vendor, for research purposes only. The vendors were informed in detail about the testing process and were allowed to raise methodological issues, many of which were incorporated into the process. None of them, however, directly funded the research or the researchers, who declare there was no influence or conflict of interest in the study.

Unicheck and Urkund are sponsors of LMSPulse. Urkund is a bronze sponsor of the Elearning Success Summit.

