Bilal Zafar Wants to Make AI Fairer

"I am an academic with all my heart," states Bilal Zafar. After several years in industry, the computer scientist has returned to university, advocating for diversity and fairness.

Bilal Zafar has been appointed as the new Professor for Computing and Society.


Artificial intelligence (AI) systems like ChatGPT are the future. However, these models are trained on historical data, which inevitably contains biases. "It's common knowledge that there was discrimination in the past," notes Prof. Dr. Bilal Zafar. "AI language models, including ChatGPT, often exhibit biases. Sometimes they deliver responses that are racist or sexist, but other, more subtle biases occur as well." Bilal Zafar and his team want to tackle this issue.

The newly appointed Professor for Computing and Society at Ruhr University Bochum focuses on biases in language models and how they can be prevented. Since April 1, 2024, Zafar has been a member of the Faculty of Computer Science at Ruhr University Bochum as well as of the Cluster of Excellence CASA and the Research Center Trustworthy Data Science and Security. At the Research Center, scientists from Bochum, Dortmund, and Duisburg-Essen collaborate across disciplines as part of the University Alliance Ruhr.

"In Bochum, I immediately noticed how friendly everyone is here and that hierarchies don't matter" highlights Bilal Zafar.

"The diversity at the Research Center is very important to me," states Bilal Zafar. "I firmly believe that interdisciplinary collaboration fosters groundbreaking research." Moreover, the Pakistani-born researcher feels very comfortable in the Ruhr region. "In Pakistan, people are very helpful and hospitable, and that is very important to me." Zafar explains. "In Bochum, I immediately noticed how friendly everyone is here and that hierarchies don't matter. Even famous professors are friendly and approachable, and that helped a lot with settling in."

Before his professorship, Bilal Zafar worked in industry for five and a half years. "I returned to the academic world because mentoring and supporting young minds in their education is a personal priority," emphasizes Zafar, remembering well how crucial that was for him. "Thanks to a scholarship, I was able to attend a university in Pakistan that I could never have afforded otherwise. I realized how much education and good teachers can influence a life. I benefited from it, and now it's my turn to give back."

Fairness and Diversity as a Matter of the Heart
His research field is no coincidence: Fairness and diversity are a matter of the heart for Bilal Zafar. Upon learning of AI's discriminatory tendencies, he was resolute: this had to change. He draws the motivation for his research from the topic's proximity to everyday life. "But science requires more: It means delving deep, even when the subject becomes more abstract," he says. Somewhere in the depths of language models, Bilal Zafar hopes to find the key to fostering fairness in AI.

Bilal Zafar's Career Path

  • 2011: Ericsson-PTA Mobile Excellence Award from the Pakistan Telecommunication Authority
  • 2012: Bachelor in Computer Science, LUMS SBA School of Science and Engineering, Lahore, Pakistan
  • 2015: Master in Computer Science, Saarland University
  • 2012 to 2018: Doctoral Researcher, Max Planck Institute for Software Systems, Saarbrücken
  • 2019: PhD, Saarland University
  • 2018 to 2020: Research Scientist, Bosch Center for Artificial Intelligence, Renningen
  • 2020 to 2024: Applied Scientist, Amazon Web Services, Berlin
  • 2021: Otto Hahn Medal from the Max Planck Society for exceptional scientific contributions by young scientists
  • Since 2024:
    • Professor for Computing and Society, Ruhr University Bochum & Research Center Trustworthy Data Science and Security
    • Member of the Cluster of Excellence CASA and the Horst Görtz Institute for IT Security

General note: Where gender-specific terms are used, they are meant to include all people who identify with that gender, regardless of their biological sex.