Can a Transformer Assist in Scientific Writing? Generating Semantic Web Paper Snippets with GPT-2


Research output: Chapter in Book / Report / Conference proceeding › Conference contribution › Academic › peer-review



The Semantic Web community has produced a large body of literature that is becoming increasingly difficult to manage, browse, and use. Recent work on attention-based, sequence-to-sequence Transformer architectures has produced language models that generate surprisingly convincing synthetic conditional text samples. In this demonstration, we re-train the GPT-2 architecture on the complete corpus of proceedings of the International Semantic Web Conference from 2002 to 2019. We use user-provided sentences to conditionally sample paper snippets, thereby illustrating cases where this model can help address challenges in scientific paper writing, such as navigating extensive literature, explaining core Semantic Web concepts, providing definitions, and even inspiring new research ideas.
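The conditional sampling described above can be sketched with the Hugging Face transformers library. This is a minimal illustration, not the authors' code: the generic "gpt2" checkpoint stands in for their model fine-tuned on ISWC proceedings, and the prompt is an invented example of a user-provided sentence.

```python
# Minimal sketch of conditional text sampling with GPT-2.
# Assumption: "gpt2" is used in place of the ISWC-fine-tuned checkpoint.
from transformers import pipeline, set_seed

set_seed(42)  # make sampling reproducible for the demo
generator = pipeline("text-generation", model="gpt2")

# A user-provided sentence conditions the generated paper snippet.
prompt = "The Semantic Web is"
samples = generator(prompt, max_new_tokens=30, num_return_sequences=2, do_sample=True)
for s in samples:
    print(s["generated_text"])
```

Each returned sequence continues the user's prompt; with a corpus-specific fine-tuned model, the continuations would resemble Semantic Web paper text rather than generic web text.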

Original language: English
Title of host publication: The Semantic Web
Subtitle of host publication: ESWC 2020 Satellite Events, Heraklion, Crete, Greece, May 31 – June 4, 2020, Revised Selected Papers
Editors: Andreas Harth, Valentina Presutti, Raphaël Troncy, Maribel Acosta, Axel Polleres, Javier D. Fernández, Josiane Xavier Parreira, Olaf Hartig, Katja Hose, Michael Cochez
Publisher: Springer Science and Business Media Deutschland GmbH
Number of pages: 6
ISBN (Electronic): 9783030623272
ISBN (Print): 9783030623265
Publication status: Published - 2020
Event: 17th Extended Semantic Web Conference, ESWC 2020 - Heraklion, Greece
Duration: 31 May 2020 – 4 Jun 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12124 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 17th Extended Semantic Web Conference, ESWC 2020


  • Natural language generation
  • Scholarly communication
  • Semantic Web papers


