Meta: standing up to racism

Former lawyer Said Haider has invented a chatbot for those seeking advice on anti-discrimination issues – Meta is the first chatbot of its kind in the world. In conversation with Qantara.de, Haider talks about the development of Meta, his own experience of discrimination and the project's future prospects.

By Schayan Riaz

Mr Haider, you’ve developed the world’s first ever anti-discrimination chatbot. What was your inspiration?

Said Haider: I went to university because I wanted to change the world. But I quickly realised that nothing changes. In the search for something new, I stumbled upon the Datteltaeter (The Date Assailants) – a German YouTube channel that uses humour to tackle prejudice. We wrote to our community and asked them to describe their worst experiences – we got so many submissions I couldn’t deal with them all. I thought: something needs to be done about this. We had more than 1,000 submissions in one day, and no one knew their rights. So I had the idea of setting up a low-threshold advisory service that people could access whenever they experience discrimination.

And how did the chatbot idea arise?

Haider: Well, a service like that needs staff, an office and know-how – which I didn’t have at the time. So I thought a chatbot could help. I assumed that such a simple idea would already exist in some form, but that wasn’t the case. Anti-discrimination experts liked the idea, so in early 2019 I launched a roundtable with a prototype. That summer we followed up with a hackathon, bringing together IT experts and the anti-discrimination team. The challenge was to digitalise the entire body of work against discrimination.

What exactly does a chatbot like this replace? There are so many places for people to go, after all…

Haider: Discrimination comes in a whole range of different forms, and these trigger different legal consequences. Previously, you’d have to search for information online to find the appropriate category. Meta saves you all that: you get introductory access to legal information. The other thing is that Meta currently runs on a website, but the idea is that in future it will also run on messaging platforms like WhatsApp. That’s another way it differs from a classic advisory service. Up to now, you would have to physically go and find an advice centre, and you couldn’t remain anonymous. Meta gives you the opportunity to get all the information you need without leaving your workplace or wherever else you might be.

Meta anti-discrimination chatbot: how can a chatbot help against discrimination and racism? (source: LinkedIn)

 

"My past is etched with my own experience of racism"

 

And are you working together with advice centres? Surely they’re going to take over at some point?

Haider: If Meta were a pie, then what we’re seeing now is just the crust. The challenge is going to be making the filling as precise and informative as possible. If for example you’re discriminated against on the housing market, then there are handouts detailing the steps you can take. At the moment, Meta can’t provide in-depth advice, but it can give an indication of what’s involved on the road ahead. The aim is to enable people to send their Meta chat transcript to advice centres, so that both sides can prepare accordingly for the case in question. One of our chief aims is to relieve the burden on information bureaus.

You are perceived as a non-white man within Germany's majority society. To what extent does your own experience permeate a project like this?

Haider: I come from Hamburg, and of course my past is also etched with my own experience of racism – both at school and out and about in Hamburg at night. That was a hard nut to crack back then, something I could never escape. At university, I noticed that my peers weren’t sensitised to these issues; they made light of a lot of things. During my internship period, I chose trainee positions at organisations that would give me valuable insights into post-migration issues. These included, for example, the integration authorities in Hamburg and the Federal Office for Migration and Refugees in Nuremberg. And I got a sense that we’re just not addressing problems directly enough. As soon as it’s about migrants, an othering takes place, a "them and us" kind of thing. An honour killing within a migrant family is perceived as something to be expected, whereas a relationship drama between Germans is just a one-off. My time as a student was saturated with negative experiences like this.

 

Meta's DNA is operating from an affected perspective

 

And how do you keep a cool head in such a situation?

Haider: By allowing people who know what they’re doing to take the lead. If you’re not affected yourself, then often there’s a lack of expertise; that intuition only exists in people who are affected themselves. At every turn we’re asking ourselves: would this benefit us somehow, are we becoming savvier, would that set us back? With Meta too, it’s about operating from an affected perspective – that’s the project’s DNA. It’s the same for the Datteltaeter, by the way: content made by Muslims, from within the affected group. That approach has turned out to be really useful; it’s shown that there’s evidently a language to use, and subjects that are perhaps best dealt with by those who are affected.

But you’re also hoping your chatbot will reach people who maybe don’t identify with the reality of your life. Your team is made up of three women who all wear hijab, and yourself.

Haider: Exactly. People look at our team and might think we only deal with discrimination cases concerning religion. But that would be discrimination in itself. Our approach is one of intersectionality. There are enough people with a migrant background who experience repeated discrimination because of their age or gender. To think that we’re just "three hijabis and a kanaka" doing their "thing" – that’s rubbish. Because we know what it’s like to be affected, we want to protect others. A person with a disability from a non-migrant background experiences societal marginalisation to the same extent, and I hope more than anything that everyone, whether or not they are from a migrant background, can benefit from Meta.


Another question concerning the digital aspect: don’t chatbots like Meta lack the human touch? If you’ve suffered discrimination, maybe you’d prefer to speak to a real person rather than a robot, and have a proper conversation? Aren't you unintentionally excluding people?

Haider: Meta doesn’t aim to substitute for a personal consultation and isn’t competing with analogue services for those seeking support. It’s much more that Meta sees itself as an add-on – enhancing the visibility of other services, but also improving the quality of consultations through better access to information. Meta aims to provide advice with a heart. Meta isn’t a person, but it has personality. The fact that we’ve created a virtual personality makes it possible to create a different kind of awareness, to get hold of information. Our communication is based on dialogue, not just scant pieces of information. On the subject of exclusion: if certain people have difficulties with digitalisation, the question that arises is, aren’t we as a whole society changing the way we communicate? My dad uses emojis too; it’s really caught on with him. Our communication has changed in the digital age, and the anti-discrimination movement needs to keep pace.

Interview conducted by Schayan Riaz

© Qantara.de 2021

Translated from the German by Nina Coon