
Blind Data

How automated and unbiased are automated processes? Is #data a synonym for exactness? Listen to this talk I gave at the University of Luxembourg #scienceslam organised by LuxDoc A.S.B.L.


Last October, I took part in the #ScienceSlam competition held at the Neimënster Abbey in Luxembourg. I talked about a topic that truly interests me and informs my research; beyond that, as a digital citizen, it is a topic close to my heart: the impact of the digital on research and society, which is far-reaching and ever-growing. We are surrounded, in fact overwhelmed, by Artificial Intelligence (#AI) and #data, but a question still remains: how much can we trust computational and digital resources? Indeed, the alluring and reassuring promises of data neutrality, objectivity, fairness and accuracy have turned out to be illusory, and fully data-driven methods are even more biased than the interpretative act itself. So what have we forgotten?

The alluring and reassuring promises of data neutrality, objectivity, fairness and accuracy have turned out to be illusory

To understand why an algorithm operates the way it does, one needs to look at it in combination with the data it uses, or better yet, at how the data must be prepared for the algorithm to function. This is because, in order for #algorithms to work properly, that is, automatically, humans have to render information into data, i.e., categorise information into database records. What the categories are, what belongs in a category and what does not, but also who decides what is what, who is in and who is out: these are all powerful assertions about how things are and are supposed to be.
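To make this concrete, here is a minimal sketch of my own (not part of the talk) of what such a categorisation step might look like in practice. Everything in it is hypothetical: the field names, the to_record function, and above all the ALLOWED_LANGUAGES list, which is precisely the point, because someone had to write that list.

# A minimal, hypothetical illustration: the category list below is an
# editorial decision made by a person, not a neutral fact about the world.
ALLOWED_LANGUAGES = {"English", "French", "German"}

def to_record(raw):
    """Render a free-form description into a database record.
    Anything outside the predefined categories is flattened to "Other":
    that information is not lost by accident, it is excluded by design."""
    language = raw.get("language", "Unknown")
    return {
        "title": raw.get("title", ""),
        # The category set determines what the algorithm can ever "see".
        "language": language if language in ALLOWED_LANGUAGES else "Other",
    }

records = [
    to_record({"title": "Article A", "language": "French"}),
    to_record({"title": "Article B", "language": "Luxembourgish"}),
]
print(records)
# [{'title': 'Article A', 'language': 'French'},
#  {'title': 'Article B', 'language': 'Other'}]

Whoever writes ALLOWED_LANGUAGES decides who is in and who is out; the "automatic" processing downstream simply inherits that decision.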

Data is information, knowledge; and knowledge is constructed, taken, interpreted.

And that is why AI magnifies the biases already present in society: it shows us what makes us human, for better and for worse. Thus, data, databases, and algorithms must always be critically assessed if we want to make sure that biases are not perpetuated further and that marginalised communities no longer suffer the repercussions of naïvely thinking of algorithms as impartial and of data as unbiased. I conclude the talk by saying that, now that all information is digital, humanity in fact has the responsibility to do so.


Watch the full video of the talk HERE


Follow the presentation slides HERE


Department of Art and Culture, History, and Antiquity

Faculty of Humanities

Vrije Universiteit Amsterdam (VU)
De Boelelaan 1105, 1081 HV Amsterdam, Netherlands
 


© Copyright Lorella Viola and www.lorellaviola.me.uk, 2024. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited.
