IBA with three publications at WI 2021

Universität Ulm

Roland Graef, Kilian Kluge, and Steffen Zimmermann are taking part in the 16. Internationale Tagung Wirtschaftsinformatik (WI 2021) from March 9 to 11 and will present a total of three publications. One of them has been nominated for the "Best Practice-Oriented Paper Award" and another for the "Best Paper Award".

The WI21, which is being held virtually this year due to the COVID-19 pandemic, is as renowned as it is rich in tradition within the field of information systems (Wirtschaftsinformatik). A range of disciplines, united by a shared focus on information technologies, is covered in dedicated tracks. The three presented publications will be published in the conference proceedings.


Roland Graef
Leveraging Text Classification by Co-training with Bidirectional Language Models: A Novel Hybrid Approach and its Application for a German Bank
nominated for the Best Practice-Oriented Paper Award

Abstract: Labeling training data constitutes the largest bottleneck for machine learning projects. Text classification via machine learning in particular is widely applied and investigated, so companies have to label a considerable number of texts manually in order to build appropriate text classifiers. Labeling texts manually, however, costs time and money. Against this background, research has started to develop approaches that exploit the knowledge contained in unlabeled texts, either by learning sophisticated text representations or by labeling some of the texts automatically. However, integrated approaches that combine both types to further reduce the time and expense of labeling texts are still lacking. To address this problem, we propose a new hybrid text classification approach that combines recent text representations and automated labeling approaches from an integrated perspective. We demonstrate and evaluate our approach in the case of a German bank, where it was applied successfully.
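The automated-labeling idea the abstract mentions can be illustrated with a minimal self-training loop: train on a small labeled set, automatically label only those unlabeled texts the model is confident about, and retrain. The toy bag-of-words classifier, the data, and the threshold below are purely illustrative assumptions; the paper itself builds on bidirectional language models for the text representations.

```python
# Hypothetical sketch of pseudo-labeling (self-training); all data and the
# toy classifier are illustrative, not the paper's actual method.
from collections import Counter

def train(labeled):
    """Count word frequencies per class (a toy bag-of-words model)."""
    counts = {"pos": Counter(), "neg": Counter()}
    for text, label in labeled:
        counts[label].update(text.lower().split())
    return counts

def predict(counts, text):
    """Return (label, confidence) based on per-class word overlap."""
    words = text.lower().split()
    scores = {c: sum(counts[c][w] for w in words) for c in counts}
    total = sum(scores.values()) or 1
    label = max(scores, key=scores.get)
    return label, scores[label] / total

def pseudo_label(labeled, unlabeled, threshold=0.8, rounds=3):
    """Self-training loop: move confident predictions into the labeled set."""
    labeled = list(labeled)
    for _ in range(rounds):
        counts = train(labeled)
        remaining = []
        for text in unlabeled:
            label, conf = predict(counts, text)
            if conf >= threshold:
                labeled.append((text, label))  # labeled automatically
            else:
                remaining.append(text)         # still needs a human label
        unlabeled = remaining
    return labeled, unlabeled

seed = [("great service fast reply", "pos"), ("slow rude staff", "neg")]
pool = ["great fast service", "rude slow reply staff", "unrelated text here"]
auto, left = pseudo_label(seed, pool)
```

Only texts above the confidence threshold are added to the training set; ambiguous ones remain in the pool, which is where manual labeling effort is still spent.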


Kilian Kluge and Regina Eckhardt
Explaining the Suspicion: Design of an XAI-Based User-Focused Anti-Phishing Measure

Abstract: Phishing attacks are the primary cause of data and security breaches in businesses, public institutions, and private life. Due to inherent limitations and users’ high susceptibility to increasingly sophisticated phishing attempts, existing anti-phishing measures cannot realize their full potential. Against this background, we utilize methods from the emerging research field of Explainable Artificial Intelligence (XAI) for the design of a user-focused anti-phishing measure. By leveraging the power of state-of-the-art phishing detectors, our approach uncovers the words and phrases in an e-mail most relevant for identifying phishing attempts. We empirically show that our approach reliably extracts segments of text considered relevant for the discrimination between genuine and phishing e-mails. Our work opens up novel prospects for phishing prevention and demonstrates the tremendous potential of XAI methods beyond applications in AI.
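One common XAI technique for uncovering relevant words, which may serve as an intuition for what the abstract describes, is occlusion: remove one word at a time and measure how much the detector's score drops. The keyword-based scorer below is a stand-in assumption for illustration only; the paper applies XAI methods to trained state-of-the-art phishing detectors.

```python
# Illustrative occlusion-based word relevance; the scorer is a toy
# stand-in, not the paper's actual phishing detector.
SUSPICIOUS = {"verify": 0.4, "password": 0.3, "urgent": 0.2}

def phishing_score(words):
    """Toy stand-in for a trained phishing detector."""
    return sum(SUSPICIOUS.get(w.lower(), 0.0) for w in words)

def word_relevance(email):
    """Occlusion: relevance of a word = score drop when it is removed."""
    words = email.split()
    base = phishing_score(words)
    relevance = {}
    for i, w in enumerate(words):
        occluded = words[:i] + words[i + 1:]
        relevance[w] = base - phishing_score(occluded)
    return sorted(relevance.items(), key=lambda kv: -kv[1])

ranked = word_relevance("urgent please verify your password now")
```

The resulting ranking highlights the segments of the e-mail that drive the detector's decision, which can then be shown to the user as an explanation of the suspicion.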


Christoph Rohde, Alexander Kupfer, and Steffen Zimmermann
Explaining Reviewing Effort: Existing Reviews as Potential Driver
nominated for the Best Paper Award

Abstract: Online review systems try to motivate users to invest effort in writing a review, since their success crucially depends on the reviews’ helpfulness. However, other factors might influence future reviewing effort as well. We analyze whether existing reviews matter for future reviewing effort. Using a dataset from Google Maps covering 40 sights across Europe with over 37,000 reviews, we find that reviewing effort – measured by the propensity to additionally write a textual review and by (textual) review length – is negatively related to the number of existing reviews. Furthermore, the rating distribution of existing reviews also matters: if there is a large discrepancy between the existing ratings and a user’s own rating, we observe more additional textual reviews. Our findings have important implications for review system designers regarding the presentation of review metrics: changing or omitting the display of review metrics for potential reviewers might increase their reviewing effort.
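The "negatively related" finding is, at its core, a regression relationship. As a hedged sketch of the kind of computation involved, the snippet below fits a simple least-squares slope of review length on the number of existing reviews. The numbers are fabricated purely to show the mechanics and do not come from the study, which analyzes over 37,000 real Google Maps reviews.

```python
# Toy least-squares slope; all data points are illustrative placeholders,
# not results from the paper.
def ols_slope(x, y):
    """Slope b of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

# existing reviews -> textual review length (toy numbers, illustrative only)
existing_reviews = [10, 100, 1000, 5000, 10000]
review_length = [120, 95, 60, 40, 25]

slope = ols_slope(existing_reviews, review_length)
# a negative slope would mirror the reported pattern that reviewing effort
# declines as the number of existing reviews grows
```

In the toy data the slope comes out negative, mirroring the direction of the relationship the abstract reports; the study itself controls for further factors such as the rating distribution.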