The MOML Project

In many cases it is important that an autonomous system acts and reacts adequately from a moral point of view. There are some artifacts of machine ethics, e.g., GOODBOT or LADYBIRD by Oliver Bendel, or Nao as a care robot by Susan Leigh Anderson and Michael Anderson. But there is no standardization in the field of moral machines yet. The MOML project, initiated by Oliver Bendel, is trying to work in this direction. In the management summary of his bachelor thesis Simon Giller writes: “We present a literature review in the areas of machine ethics and markup languages which shaped the proposed morality markup language (MOML). To overcome the most substantial problem of varying moral concepts, MOML uses the idea of the morality menu. The menu lets humans define moral rules and transfer them to an autonomous system to create a proxy morality. Analysing MOML excerpts allowed us to develop an XML schema which we then tested in a test scenario. The outcome is an XML based morality markup language for autonomous agents. Future projects can use this language or extend it. Using the schema, anyone can write MOML documents and validate them. Finally, we discuss new opportunities, applications and concerns related to the use of MOML. Future work could develop a controlled vocabulary or an ontology defining terms and commands for MOML.” The bachelor thesis will be publicly available in autumn 2020. It was supervised by Dr. Elzbieta Pustulka. There will also be a paper with the results next year.
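Since MOML is XML-based, any standard XML tooling can process documents written in it. As a purely illustrative sketch (the element and attribute names below are invented for this example and do not reproduce the actual schema from Giller's thesis), a rule set transferred from a morality menu to an autonomous agent might be read like this in Python:

```python
import xml.etree.ElementTree as ET

# Hypothetical MOML document: tags and attributes are assumptions
# made for illustration, not the schema developed in the thesis.
MOML_DOC = """
<moml version="1.0">
  <agent id="vacuum-robot-01">
    <rule id="r1" active="true">
      <condition>detects_insect</condition>
      <action>stop_and_evade</action>
    </rule>
    <rule id="r2" active="false">
      <condition>detects_insect</condition>
      <action>continue</action>
    </rule>
  </agent>
</moml>
"""

def active_rules(doc: str) -> list[tuple[str, str]]:
    """Return (condition, action) pairs for all rules marked active."""
    root = ET.fromstring(doc)
    return [
        (rule.findtext("condition"), rule.findtext("action"))
        for rule in root.iter("rule")
        if rule.get("active") == "true"
    ]

print(active_rules(MOML_DOC))  # [('detects_insect', 'stop_and_evade')]
```

In a real deployment, documents like this would first be validated against the published XML schema before the agent adopts the rules as its proxy morality.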

Findings on Robotic Hugging

In the first part of the HUGGIE project initiated by Oliver Bendel, two students of the School of Business FHNW conducted an online survey with almost 300 participants. In the management summary of their bachelor thesis Ümmühan Korucu and Leonie Stocker (formerly Leonie Brogle) write: “The results of the survey indicated that people have a positive attitude towards robots in general as robots are perceived as interesting and useful rather than unnecessary and disturbing. However, only a minority of the participants stated that they would accept a hug from a robot. A possible reason for this could be that for the majority of participants, a hug is an act of intimacy with a deeper meaning attached to it which is only being shared with selected persons. For a robot to be perceived as an attractive hugging partner, a human-like design including a face, eyes, a friendly look as well as the ability to communicate verbally and non-verbally is desired. However, an appearance being too realistic has a deterrent effect. Furthermore, an in-depth analysis of the data in relation to age and gender of the participants resulted in the discovery of interesting facts and differences. Overall, the findings contribute to a clearer picture about the appearance and the features Huggie should have in order to be accepted as a hugging counterpart.” The bachelor thesis will be publicly available in autumn 2020. There will also be a paper with the results next year.

Ingenuity on Mars

The Perseverance rover, which is on its way to Mars, is carrying a drone called Ingenuity (photo/concept: NASA). According to NASA, it is a technology demonstration to test powered flight on another world for the first time. “A series of flight tests will be performed over a 30-Martian-day experimental window that will begin sometime in the spring of 2021. For the very first flight, the helicopter will take off a few feet from the ground, hover in the air for about 20 to 30 seconds, and land. That will be a major milestone: the very first powered flight in the extremely thin atmosphere of Mars! After that, the team will attempt additional experimental flights of incrementally farther distance and greater altitude.” (Website NASA) After the drone has completed its technology demonstration, the rover will continue its scientific mission. Manned and unmanned flights to Mars will bring us several innovations, including novel chatbots and voicebots.

Four-Legged Robots to Scout Factories

Ford is experimenting with four-legged robots to scout its factories. The aim is to save time and money. The Ford Media Center presented the procedure on 26 July 2020 as follows: “Ford is tapping four-legged robots at its Van Dyke Transmission Plant in early August to laser scan the plant, helping engineers update the original computer-aided design which is used when we are getting ready to retool our plants. These robots can be deployed into tough-to-reach areas within the plant to scan the area with laser scanners and high-definition cameras, collecting data used to retool plants, saving Ford engineers time and money. Ford is leasing two robots, nicknamed Fluffy and Spot, from Boston Dynamics – a company known for building sophisticated mobile robots.” (Website Ford Media Center) Typically, service robots (e.g., transport robots like Relay) scan buildings to create 2D or 3D models that help them navigate through the rooms. Shuttles use lidar systems to create live 3D models of the environment and to detect obstacles. The robots from Boston Dynamics are also mobile, and that is their great advantage (photo: Ford). Nothing can escape them, nothing can hide from them. The benefit could probably be increased further by including cameras in the building, i.e. by using robot2x communication.

Who can be My Companion?

Science fiction regularly portrays deep friendship or even romantic relationships between a human and a machine, e.g., Dolores and William (Westworld, 2016), Joi and K (Blade Runner 2049, 2017) or Poe and Takeshi Kovacs (Altered Carbon, 2018 and 2020). In fact, for several years now, there has been a development approach aiming to create artificial companions. The so-called companion paradigm focuses on social abilities and on systems that adapt their behavior to the user and his or her environment [1]. This approach enables a high degree of individualization and customization. The paradigm principally intends to reduce the complexity of innovative technology, which can otherwise lead to a lack of user-friendliness and frustrated users [2]. In 2015, Ulm University hosted the International Symposium on Companion-Technology (ISCT); the conference proceedings give a broad overview of the research issues in this field [3]. Some of the research questions discussed there concern data input modalities for recognizing emotional states and the user’s current situation, as well as dialog strategies that help an artificial companion build a trustworthy relationship. Although the paradigm is already being approached in an interdisciplinary manner, Prof. Hepp (2020) has recently called on communication and media scientists to participate more actively in these discussions, since human-machine communication in particular has so far been explored without pronounced participation of communication scholars [4]. In terms of perception, intriguing questions concern, for example, possible gradations among different companion systems and the effects these have on interaction and communication with the technology. Such questions have to be discussed not only by computer scientists but also by psychology and philosophy scholars, especially when it comes to the question of how human-machine relationships will develop in the long run. Will companion systems drift into unemotional and function-centric routines, as with other technologies, or can they become our forever friends?


References

[1] Wahl M., Krüger S., Frommer J. (2015). Well-intended, but not Well Perceived: Anger and Shame in Reaction to an Affect-oriented Intervention Applied in User-Companion Interaction. In: Biundo-Stephan S., Wendemuth A., & Rukzio E. (Eds.). (2015). Proceedings of the 1st International Symposium on Companion-Technology (ISCT 2015)—September 23rd-25th, Ulm University, Germany. p. 114-119. https://doi.org/10.18725/OPARU-3252

[2] Biundo S., Höller D., Schattenberg B., & Bercher P. (2016). Companion-Technology: An Overview. KI – Künstliche Intelligenz, 30(1), 11–20. https://doi.org/10.1007/s13218-015-0419-3

[3] Biundo-Stephan S., Wendemuth A., & Rukzio E. (Eds.). (2015). Proceedings of the 1st International Symposium on Companion-Technology (ISCT 2015)—September 23rd-25th, Ulm University, Germany. https://doi.org/10.18725/OPARU-3252

[4] Hepp A. (2020). Artificial companions, social bots and work bots: Communicative robots as research objects of media and communication studies. Media, Culture & Society. Advance online publication. https://doi.org/10.1177/0163443720916412


Chocolate Could Increase Acceptance

A multi-stage HUGGIE project is currently underway at the School of Business FHNW under the supervision of Prof. Dr. Oliver Bendel. Ümmühan Korucu and Leonie Stocker (formerly Leonie Brogle) started with an online survey. The aim was to gain insights into how people of all ages and sexes judge a hug by a robot. In crises and catastrophes involving prolonged isolation, such as the COVID-19 pandemic, proxy hugs of this kind could well play a role. Prisons and longer journeys through space are also possible fields of application. Nearly 300 people took part in the online survey. The evaluation is almost complete and the results are remarkable. Among other things, it was found that women want to be hugged by a robot that is bigger than them, and men want to be hugged by a robot that is smaller than them. Not only the size is relevant for the acceptance of robotic hugging: “An interesting input given by one of the participants was that it could be more pleasant to hug a robot if it smelled nicely, for example like chocolate.” (Draft of Bachelor Thesis) Whether this is a typically Swiss view remains to be investigated. The results of the survey and the conclusions drawn from them for the design of HUGGIE will be compiled in a paper in the course of the year.

Talking to Harmony

At the end of June 2020, DIE WELT conducted an interview with Prof. Dr. Oliver Bendel about sex robots and love dolls. It focused especially on their natural language skills. Owners and users who want to have a relationship with their artificial partners are particularly interested in conversations of all kinds, about God and the world, but also in the sense of “dirty talk”. Companies like Realbotix go very far in this respect. Harmony, for example, can talk to her partners for hours in a quite convincing way. The engineers experiment with GPT-2, but also with other language models. Kino Coursey, head of AI at Realbotix, deals with this topic in his article “Speaking with Harmony” for the book “Maschinenliebe” (“Machine Love”), which will be released in October. The interview with Oliver Bendel was published on 11 July 2020 in the printed edition of DIE WELT under the title “Intelligente Sexroboter sind begehrte Gesprächspartner” (it had already appeared the day before in the electronic edition, under the title “Was Sexpuppen können” …). In addition, an English version – “Intelligent sex robots are sought-after dialogue partners” – is available.

From 2D to 3D Codes

3D codes have been researched for over 15 years. Even 2D codes can store short texts or other information; however, codes that use color as a third dimension are far superior in this respect. They open up numerous fields of application and raise technical, economic and ethical questions. According to a press release, the JAB Code of the Fraunhofer Institute for Secure Information Technology SIT – JAB stands for “Just Another Barcode” – is on its way to becoming an international ISO standard. “Job references, training certificates and wills, but also proof of authenticity for products can be secured with JAB Code.” (Press release Fraunhofer SIT, 26 June 2020) Furthermore, longer texts can be stored. As early as 2010, Oliver Bendel published a book with QR codes from which haikus could be read offline. In the same and the following year, he dealt scientifically with 2D and 3D codes. A longer article on the topic from 2011 can be found here – the teaser and the link are included in this 3D code, which can be scanned via www.jabcode.org. A 2D code can also hold a text of this length, but it already becomes enormously complex. Waldemar Berchtold from Fraunhofer SIT explains: “With eight colours, readability is robust with the smartphones available on the market. With more than eight colours, reliable readability with older smartphones cannot be guaranteed across the board.” More about the Fraunhofer Institute project at www.sit.fraunhofer.de/de/presse/details/news-article/show/bunter-barcode-wird-iso-standard/.
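The gain from colour is easy to quantify: a code module that can take one of c colours carries log2(c) bits of information, so an eight-colour code such as JAB Code packs three times as much data into the same area as a black-and-white 2D code. A back-of-the-envelope sketch:

```python
from math import log2

def bits_per_module(colors: int) -> float:
    """Information capacity of one code module for a given palette size."""
    return log2(colors)

# A black-and-white 2D code stores 1 bit per module;
# an eight-colour code stores log2(8) = 3 bits per module.
print(bits_per_module(2))   # 1.0
print(bits_per_module(8))   # 3.0
```

This also explains Berchtold's trade-off: larger palettes raise capacity further, but the colour distances shrink, so reliable decoding on older smartphone cameras becomes harder.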


Towards a Proxy Machine

“Once we place so-called ‘social robots’ into the social practices of our everyday lives and lifeworlds, we create complex, and possibly irreversible, interventions in the physical and semantic spaces of human culture and sociality. The long-term socio-cultural consequences of these interventions is currently impossible to gauge.” (Website Robophilosophy Conference) With these words the next Robophilosophy conference was announced. It would have taken place in Aarhus, Denmark, from 18 to 21 August 2020, but due to the COVID-19 pandemic it is being conducted online. One lecture will be given by Oliver Bendel. The abstract of the paper “The Morality Menu Project” states: “Machine ethics produces moral machines. The machine morality is usually fixed. Another approach is the morality menu (MOME). With this, owners or users transfer their own morality onto the machine, for example a social robot. The machine acts in the same way as they would act, in detail. A team at the School of Business FHNW implemented a MOME for the MOBO chatbot. In this article, the author introduces the idea of the MOME, presents the MOBO-MOME project and discusses advantages and disadvantages of such an approach. It turns out that a morality menu can be a valuable extension for certain moral machines.” In 2018 Hiroshi Ishiguro, Guy Standing, Catelijne Muller, Joanna Bryson, and Oliver Bendel had been keynote speakers. In 2020, Catrin Misselhorn, Selma Sabanovic, and Shannon Vallor will be presenting. More information via conferences.au.dk/robo-philosophy/.
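The core mechanism of a morality menu is simple: the owner toggles moral rules, and the machine's behaviour follows those settings rather than a fixed built-in morality. The sketch below illustrates the idea in Python; the rule name and canned replies are invented for this example and are not taken from the actual MOBO-MOME implementation.

```python
from dataclasses import dataclass, field

@dataclass
class MoralityMenu:
    """Owner-defined moral settings, transferred to the machine."""
    settings: dict[str, bool] = field(default_factory=dict)

    def set_rule(self, rule: str, enabled: bool) -> None:
        self.settings[rule] = enabled

@dataclass
class Chatbot:
    """A chatbot whose reactions are governed by the morality menu."""
    menu: MoralityMenu

    def reply(self, utterance: str) -> str:
        if "you idiot" in utterance.lower():
            # The owner's setting decides how insults are handled:
            # the bot enacts a proxy morality, not its own.
            if self.menu.settings.get("tolerate_insults", False):
                return "No offence taken."
            return "Please don't insult me."
        return "Tell me more."

menu = MoralityMenu()
menu.set_rule("tolerate_insults", True)   # the owner's choice
bot = Chatbot(menu)
print(bot.reply("You idiot!"))  # No offence taken.
```

With `tolerate_insults` switched off, the same input would instead trigger the reproachful reply, showing how two owners can obtain two different "moralities" from one and the same machine.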

Show Me Your Hands

Fujitsu has developed an artificial intelligence system that could ensure that healthcare, hotel and food industry workers scrub their hands properly. This could support the fight against the COVID-19 pandemic. “The AI, which can recognize complex hand movements and can even detect when people aren’t using soap, was under development before the coronavirus outbreak for Japanese companies implementing stricter hygiene regulations … It is based on crime surveillance technology that can detect suspicious body movements.” (Reuters, 19 June 2020) Genta Suzuki, a senior researcher at the Japanese information technology company, told the news agency that the AI can’t identify people from their hands, but it could be coupled with identity recognition technology so companies could keep track of employees’ washing habits. Maybe in the future it won’t be our parents who show us how to wash ourselves properly, but robots and AI systems. Or they will save themselves this detour and clean us directly.