Module VII·Article II·~2 min read

Data Ethics and Digital Ethics

Applied Ethics in the 21st Century


Data — the New Oil or the New Nuclear Weapon?

“Data is the new oil” is a popular metaphor that deserves critical scrutiny. Oil extraction destroys ecosystems; data “extraction” breaches privacy. Oil is a scarce resource; data is abundant, and there is ever more of it. Oil belongs to whoever owns the land it is extracted from. But data about you: who owns it? You? The company? The state?

Data about people is a special type of resource: it carries consequences for the lives of those it describes. Data on health, finances, behavior, movements, and preferences is power over a person. Whoever controls this data controls the possibilities open to that person.

GDPR and the Rights of Data Subjects

The European General Data Protection Regulation (GDPR, in force since 2018) is the most comprehensive legal response to questions of data ethics. Key rights of data subjects: access to one’s own data, rectification, erasure (“the right to be forgotten”), portability, and objection to automated decision-making.

GDPR rests on principles: lawfulness (there is a legal basis for processing), purpose limitation (data is collected for a specific, declared purpose), data minimization (only the data that is necessary), accuracy, storage limitation (kept no longer than necessary), security (integrity and confidentiality), and accountability.

Fines reach up to €20 million or 4% of worldwide annual turnover, whichever is higher. This made GDPR the de facto standard for any global business serving European users.
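The “whichever is higher” rule means the cap scales with company size. A minimal sketch of that arithmetic (not legal advice; the function name and example turnovers are invented for illustration):

```python
# Illustrative sketch of the GDPR Art. 83(5) fine ceiling: the GREATER of
# EUR 20 million or 4% of worldwide annual turnover. Not legal advice.

def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a fine for the most serious infringements."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# A mid-size firm hits the flat EUR 20M floor; a tech giant hits the 4% cap.
print(max_gdpr_fine(100_000_000))        # 4% = 4M, so the 20M floor applies
print(max_gdpr_fine(1_000_000_000_000))  # 4% = 40B, far above the floor
```

For a firm with €100M turnover the flat €20 million dominates; for a trillion-euro business, 4% means a ceiling of €40 billion, which is why global platforms treat GDPR compliance as non-optional.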

Algorithmic Discrimination and Fairness

Algorithms make decisions about credit, insurance, hiring, parole, and the allocation of medical resources. These decisions affect people’s lives. If an algorithm discriminates, systematically treating women, minorities, or the poor worse, this is an ethical problem.

Sources of algorithmic discrimination: training data reflecting historical biases (Amazon’s experimental recruiting algorithm, trained on resumes submitted mostly by men, downgraded women’s applications); proxy variables (a zip code standing in for race); and false “objectivity” (the algorithm appears neutral but reproduces the bias of its data).
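One simple way to make such bias visible is to compare positive-decision rates across groups, a check known as demographic parity. A minimal sketch, with invented decision data; real audits use richer metrics and real outcomes:

```python
# A minimal fairness check: the demographic parity difference, i.e. the
# gap in positive-decision rates between two groups. Data is hypothetical.

def positive_rate(decisions: list) -> float:
    """Share of 1s (positive decisions) in a list of 0/1 decisions."""
    return sum(decisions) / len(decisions)

def demographic_parity_diff(group_a: list, group_b: list) -> float:
    """0.0 means equal selection rates; larger gaps warrant an audit."""
    return abs(positive_rate(group_a) - positive_rate(group_b))

# Hypothetical hiring decisions (1 = invited to interview).
men = [1, 1, 1, 0, 1, 1, 0, 1]    # 6 of 8 invited
women = [1, 0, 0, 1, 0, 0, 0, 0]  # 2 of 8 invited
print(f"selection-rate gap: {demographic_parity_diff(men, women):.2f}")
```

A gap of 0.50 between groups does not by itself prove discrimination, but it is exactly the kind of red flag that a resume screener trained on historically skewed data would produce.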

“Algorithmic transparency” and “explainable AI” are the technical and regulatory answers. GDPR grants data subjects the right to meaningful information about the logic behind automated decisions.
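What “the logic behind a decision” can look like in practice: for a linear scoring model, each feature’s contribution to the final score can be shown directly to the person affected. A hedged sketch; the weights and features are invented, and real credit models are more complex:

```python
# Sketch of an "explainable" automated decision: a linear score whose
# per-feature contributions can be disclosed. All values are invented.

WEIGHTS = {"income_ratio": 2.0, "late_payments": -1.5, "account_years": 0.3}

def explain_score(applicant: dict) -> dict:
    """Return each feature's contribution (weight * value) to the score."""
    return {name: WEIGHTS[name] * applicant[name] for name in WEIGHTS}

applicant = {"income_ratio": 1.2, "late_payments": 2, "account_years": 5}
contributions = explain_score(applicant)
print(contributions)                # per-feature breakdown for the applicant
print(sum(contributions.values())) # the total score
```

Here the applicant could be told that two late payments cost 3.0 points while income added 2.4, which is the kind of explanation opaque models cannot easily provide.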

Question for reflection: Which algorithms influence decisions about you—in banking, insurance, employment? Do you know how they work? Do you have the right to challenge their decisions?
