In a society where artificial agents such as personal assistants or domestic robots are commonplace, artificial and human actors have to show consideration for each other. Artificial agents have to learn culturally specific, socially appropriate manners, so that human actors are not the only ones showing consideration while technological process logic dictates socio-technical behaviour. These manners may be flexible in terms of time and space; over time, however, they solidify into behavioural cultural techniques that make interaction in specific situations more effective.

Cultural techniques, societal norms and conventions regulate the social adequacy of interventions in socially shared situations. For example, they determine the appropriate time for apologies, greetings, good wishes, reproaches or other social practices and rituals. Assistance systems are playing an increasingly prominent role in everyday situations. If human actors do not want to entirely submit to intervening assistants that impudently demand immediate attention, assistance systems have to learn social manners. Cultural techniques then become an essential aspect of artificial intelligence (AI) and of current research on assistance systems. In many contexts, socially appropriate behaviour can be considered polite, hence the name "poliTE (polite technology) – social adequacy for assistance systems".

Designing these culturally sensitive, "polite" assistance systems is an extremely complicated task that can only be accomplished with the help of initial research, research funding and networks of expertise. In this way, our cultural techniques influence the systems' design. In turn, successfully living with polite assistance systems influences how these hybrid communities behave. For example, do we collaborate with robots? Should humans and robots shake hands?

"In a society where artificial agents such as personal assistants or domestic robots are very prevalent, artificial and human actors have to show consideration for each other in everyday life."

(B. Gransche)

Culture – Technology – Cultural Techniques

Culture, technology and cultural techniques are constantly interacting in a complex, highly effective manner. We understand cultural techniques to be goal-oriented, physical processes (techniques) that deal with symbols. For instance, bowing, shaking hands or smiling are physical routines that manifest and symbolically communicate something abstract such as social recognition or friendliness (Maye 2010; Krämer 2004). These processes are essential to successful social interaction. However, before engineers can design socially sensitive agents, the humanities have to investigate a number of different factors: What cultural techniques in terms of behaviour, interaction, cooperation and communication do we expect and master? What scope do we consider socially acceptable? What individual factors influence how we rate social adequacy? This can only be investigated for specific contexts of action, which is why the challenge of "polite assistance systems" can only be tackled on an interdisciplinary scale.

Assistance systems are playing an increasingly important role in our everyday lives: Personal assistants capable of learning, AI agents that run on smartphones (Siri, Cortana), on assistants such as Echo (Amazon 2016) and Jibo (Jibo 2016), or so-called social robots such as Pepper (Aldebaran Robots 2016) are on the cusp of becoming an everyday phenomenon in society, as is already the case with mobile phones. At the same time, these new technological systems are assigned more roles than just that of an assistant: butler, coach, friend, companion (Böhle and Bopp 2014), protector or guardian angel (GA Project 2012), playmate, training partner, compassionate listener (Schroder et al. 2009), social or intimate partner, just to name a few.

All these systems influence the way and the context in which we act. In the future, if our everyday life is going to be penetrated by intelligent assistance technology, making this future "human" is going to be a question of technological manners. A technological assistant requires information on the current situation and has to be able to evaluate when and in what way interrupting a process would be appropriate. Whether it finds the right moment, tone, and mode all depends on evaluating the situation correctly.

How can we teach these manners to artificial assistants? After all, when it comes to being aware of those manners, there are enormous differences between individual people. In any case, the existing normal, silent and airplane modes are not going to be sufficient for future assistance systems. We are going to need more nuanced modes – but what exactly are they going to look like? At a formal event, at work amongst colleagues, at work with (international) clients, with family, with a partner, with friends, with good friends, with old friends, with acquaintances, etc.? For each of those modes, what are the requirements in terms of intervention, channel, timing, etc.? For example, should household robots shake hands with us, and if so, when there is a group of guests, whose hand do they shake first? In what order do we shake hands when meeting a group of people: someone we know or the most important person first, ladies first, or simply one person after another? May we, or a robot, refuse to shake hands with someone? Is an autonomous car allowed to act against traffic regulations and let a hesitant cyclist pass instead of putting them in a critical situation that would technically be in accordance with the rules?
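To make the idea of "more nuanced modes" concrete, the mapping from social context to intervention requirements could be sketched as a simple policy table. This is a minimal, purely illustrative sketch: all names here (SocialContext, InterruptionPolicy, choose_policy) are assumptions for this example and not part of any real assistance-system API, and a real system would have to learn such culturally specific mappings rather than hard-code them.

```python
from dataclasses import dataclass
from enum import Enum, auto


class SocialContext(Enum):
    """A few of the contexts named in the text (illustrative, not exhaustive)."""
    FORMAL_EVENT = auto()
    WORK_COLLEAGUES = auto()
    WORK_CLIENTS = auto()
    FAMILY = auto()
    CLOSE_FRIENDS = auto()


@dataclass(frozen=True)
class InterruptionPolicy:
    """Requirements per mode: intervention, channel, timing."""
    may_interrupt: bool       # is a non-urgent intervention acceptable at all?
    channel: str              # preferred channel: "silent", "visual" or "voice"
    max_deferral_s: int       # how long a non-urgent message may be deferred


# A hypothetical policy table; the concrete values are invented for
# illustration and would differ between individuals and cultures.
POLICIES = {
    SocialContext.FORMAL_EVENT:    InterruptionPolicy(False, "silent", 3600),
    SocialContext.WORK_COLLEAGUES: InterruptionPolicy(True,  "visual", 900),
    SocialContext.WORK_CLIENTS:    InterruptionPolicy(False, "silent", 1800),
    SocialContext.FAMILY:          InterruptionPolicy(True,  "voice",  300),
    SocialContext.CLOSE_FRIENDS:   InterruptionPolicy(True,  "voice",  60),
}


def choose_policy(context: SocialContext) -> InterruptionPolicy:
    """Look up how an assistant may intervene in the detected context."""
    return POLICIES[context]
```

Even this toy table shows why the three classic phone modes fall short: the decision is not a single on/off switch but a bundle of context-dependent requirements, and detecting the context correctly is the hard part.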

Answering these types of questions is extremely relevant to the success of human-technology interaction, but doing so rests on a number of prerequisites. Teaching manners, cultural techniques, or the ability to assess and consider social adequacy to future assistance systems, or letting them acquire these skills, is going to be an important step towards creating a hybrid society that consists of both people and technological agents interacting in complex contexts (Gransche et al. 2014) and that is still "human".

"PoliTE is looking at how to design polite and socially adequate technology so that people have a positive perception of human-technology interactions."

(R. Wullenkord)