Laura Hollink appointed as Professor of Responsible AI in Culture and Media

Utrecht University has appointed Laura Hollink as professor of Responsible AI in Culture and Media, effective 1 March 2026. Her chair focuses on uncovering bias in AI systems, such as recommendation algorithms, and on developing inclusive AI within cultural and media organizations.

The rapid advancements in artificial intelligence raise important questions for the cultural and media sectors, explains Hollink. “Enormous strides have been made, particularly in language models, but also in personalization, recommendation systems, and decision support. Many organizations are now asking themselves: does the way we apply these technologies align with the values and norms we hold dear?”

There are legitimate concerns that AI systems may reinforce biases, lack privacy safeguards, and reduce human oversight. “Diversity and inclusion are core pillars of the cultural sector; it is crucial that everyone feels seen. While the sector has a wealth of knowledge in this area, the technology has not yet caught up.”

Libraries

Hollink cites recommendation systems in libraries as an example. In such systems, authors who receive more clicks become more visible to the public, a well-known phenomenon called popularity bias. “We found that this reduces the cultural diversity of recommendations. Libraries aim to serve their readers and are not helped by systems that steer them towards a less diverse selection.”
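The feedback loop Hollink describes can be sketched in a few lines of Python. The setup below (ten hypothetical authors, exposure proportional to accumulated clicks) is purely illustrative and not taken from her research; it only shows how a small initial advantage can lock in the top of a click-ranked list.

```python
import random

random.seed(0)

# Ten hypothetical authors, each starting with one click;
# one author gets a small head start.
clicks = {f"author_{i}": 1 for i in range(10)}
clicks["author_0"] = 2

for _ in range(1000):
    # The system shows only the three most-clicked authors;
    # a user clicks one of them, in proportion to current click counts.
    top3 = sorted(clicks, key=clicks.get, reverse=True)[:3]
    chosen = random.choices(top3, weights=[clicks[a] for a in top3])[0]
    clicks[chosen] += 1

# The seven authors outside the initial top three are never shown,
# so they never gain a single click: the ranking has ossified.
ranking = sorted(clicks, key=clicks.get, reverse=True)
```

Because exposure feeds clicks and clicks feed exposure, the long tail of authors stays invisible no matter how many rounds run, which is exactly the diversity-reducing dynamic described above.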

News media also use AI, for instance to generate headlines, short descriptions, and summaries. “Even in these ‘small’ tasks, there is often a systematic bias in the models,” says Hollink. “Consider, for example, a more negative, stereotypical portrayal of people with queer identities. Organizations need to be aware of this when deploying AI.”

Hollink’s chair is not primarily about preventing bias, but about exposing it. Hollink: “What bias exists, and where in the chain? In the dataset, the algorithmic model, or the user interface? Can we identify where it occurs and how much it influences the outcomes?”

Another question concerns the definition of diversity and inclusion. “For a book recommendation system, does diversity mean a variety of authors, a range of subjects, or better representation of minorities? In previous research, we asked people in these sectors what they mean when they talk about diversity and inclusion, and we received different answers each time.”

Systematic analysis of bias

Together with her PhD students, Hollink will investigate how to detect and analyse hidden biases in language models. To do this, they will compile research datasets that can be used for large-scale, systematic analysis of bias in text generated by language models.

Hollink will combine her chair with her work as a researcher at the Centrum Wiskunde & Informatica (CWI) in Amsterdam. “As a national research institute, the connection with the university is very important for CWI. This allows us to contribute to education and establish collaborations within a broader network.”

Her first impression of her new workplace in Utrecht? “At Utrecht University, making an impact is not just a glossy layer over research, it is genuinely valued. For me, research is not just about collecting publications, but about making a difference in society. Moreover, there are excellent colleagues here in fields adjacent to mine. Whatever I am working on, there will always be someone to exchange ideas with, not only in the Faculty of Science, but also in the Humanities and Social Sciences. That was the deciding factor for me to start working here.”

Text: Utrecht University
Pictures: Harold van de Kamp
