
- Metaethics: where our ethical principles come from (for example, social construction? the will of God?) and what they mean
- Applied ethics: examining specific areas (for example, business ethics) and specific controversial issues (for example, abortion, capital punishment)
- 1) Difficulty of proving supernatural existence
- 2) Religious people can be immoral.
- 4) Different religions promote different ethical systems.
- In Aristotle’s ethics, virtue (arete) means “excellence” of various types.
- Virtue ethics is about character (agent-centered)
- 1) Psychological egoism
- 2) Ethical egoism
- Values of traditional society:
- Impartiality and equality
- Intensity
- In other words, with each choice a person sets an example of what he or she thinks is the right thing to do
- Niccolò Machiavelli
- Is the corporation a moral agent?
- Favored by just-cause advocates: legally.
- Favored by at-will advocates: through the promotion of a vibrant labor market in which jobs are frequently created and readily available.
- It can create a climate of support for attitudes that harm women
- Issues in euthanasia:
- Voluntariness and non-consequentialism
- Bioethics: stem cell research
- 1953: Watson and Crick determine the molecular structure of DNA
- 2000: Human Genome Project
- Individuals with rare genetic disorders
- In 1992 in Orlando, Florida, 5% of the drivers were black or Hispanic, but they accounted for 70% of those who were stopped and searched.
- Information, computer, and roboethics
INTELLECTUAL PROPERTY
Intellectual property rights on software ownership
Patenting computer algorithms, and mathematicians’ complaint that such patents remove algorithms from the public domain and threaten the development of science
GLOBALIZATION
Global Laws: a state’s laws on the internet do not apply to the rest of the world, even though websites cross borders
Global Cyberbusiness: the problem of the technological infrastructure gap
Global Education: internet access as a source of information, or of disinformation?
ROBOETHICS
Isaac Asimov’s Laws of Robotics (1942, I Robot):
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Zeroth Law (added later): A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
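The laws above form a strict priority ordering: each law yields to the ones before it. As a purely illustrative sketch (not from the source, and with hypothetical predicates no real robot could compute), the hierarchy can be modeled as choosing, among candidate actions, the one whose worst violation sits lowest in the priority order:

```python
# Hypothetical sketch: Asimov's laws as a strict priority ordering.
# Each candidate action is tagged with the set of laws it would violate;
# the robot picks the action whose worst violation has the lowest priority.

# Law priorities: lower number = higher priority (more serious to violate).
LAWS = {"zeroth": 0, "first": 1, "second": 2, "third": 3}
NO_VIOLATION = len(LAWS)  # sentinel: the action violates nothing

def severity(violations):
    """Priority of the worst law an action violates (lower = worse)."""
    return min((LAWS[law] for law in violations), default=NO_VIOLATION)

def choose_action(candidates):
    """candidates maps action name -> set of violated laws.
    Returns the least objectionable action under the hierarchy."""
    return max(candidates, key=lambda action: severity(candidates[action]))

# A robot ordered to harm a human: obeying violates the First Law,
# refusing violates only the Second -- so the hierarchy makes it refuse.
choice = choose_action({
    "obey order (harm a human)": {"first"},
    "refuse the order": {"second"},
})
```

The point of the sketch is only the precedence structure: the Second Law's "except where such orders would conflict with the First Law" clause falls out of comparing severities rather than being hard-coded per law.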
ROBOTS IN WAR
From Asimov’s idea that robots should be designed not to harm humans, to the idea of using robots in war (and the justifications added for it)
56 Nations are developing robotic weapons
Human-controlled robots: drones (Predator aircraft), mine detectors, sensing devices
Autonomously operating robots: the robot makes its own decisions about the use of force on the field, without requiring human approval at the moment of action
Three reasons are given by the military for using robots in war (as of 2007; source: Arkin):
Force multiplication (reducing the number of soldiers needed)
Expanding the battle space (conducting combat over larger areas)
Extending war fighters’ reach (allowing individual soldiers to strike farther)
Problems of human-controlled robots:
Operators’ lack of real experience of war may make waging war more probable
No risk for operators: this lowers the barriers to warfare and could lead to a new arms race
Civilians at greater risk: they are already difficult to distinguish on the battlefield, and it is even harder when the device is remotely operated
Is the operator killing a human being, or merely deleting a shadow on a screen thousands of kilometers away?
Problems of autonomous robots:
Recognizing civilians
Recognizing wounded soldiers and soldiers willing to surrender
Deciding when to shoot
No emotions: lack of empathy
Undermining human responsibility