"Process Mining; Enterprise Resource Management; Petri Nets"
"Industry; Web Services; Models"
"Process Mining; Enterprise Resource Management; Petri Nets"
"Industry; Web Services; Models"
Transformer
Subjective Content Description
Semantic annotation
Bidirectional Encoder Representations from Transformers (BERT)
Description:
An agent pursuing a task may work with a corpus of text documents associated with Subjective Content Descriptions (SCDs) [1]. SCDs provide additional location-specific data for documents and add value in the context of the agent's task. When searching for new documents to add to the corpus, the agent may come across documents without associated SCDs or documents in which content and SCDs are interleaved. Therefore, this paper presents approaches for estimating SCDs using the well-known BERT [2] language model. Furthermore, the paper presents approaches for separating SCDs from actual content interleaved in a document. An evaluation compares the performance of BERT with approaches by Kuhr et al. [1] and Bender et al. [3], which use an SCD-word distribution represented by an SCD matrix $\delta(\mathcal{D})$. The evaluation uses text documents annotated with additional textual definitions.
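To make the idea of associating SCDs with document passages more concrete, the following is a minimal sketch using the HuggingFace transformers library: it embeds a sentence and a set of candidate SCDs with BERT and ranks the candidates by cosine similarity. The model name, the similarity-based matching, and the example data are illustrative assumptions and do not reproduce the paper's actual approaches.

```python
# Sketch (assumptions, not the paper's method): rank candidate SCDs for a
# sentence by cosine similarity between BERT [CLS] embeddings.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    """Return the [CLS] token embedding of a text span."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state[:, 0, :].squeeze(0)

def rank_scds(sentence: str, candidate_scds: list[str]) -> list[tuple[str, float]]:
    """Rank candidate SCDs by similarity to the sentence embedding."""
    sentence_vec = embed(sentence)
    scored = []
    for scd in candidate_scds:
        sim = torch.nn.functional.cosine_similarity(sentence_vec, embed(scd), dim=0).item()
        scored.append((scd, sim))
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Example usage with made-up data:
sentence = "The Brandenburg Gate is located in Berlin."
candidates = ["A landmark in Berlin, Germany.", "A mathematical modelling language."]
print(rank_scds(sentence, candidates))
```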