Dec 10, 2006

the moral status of the corporation

Here's a useful resource for the January / February LD topic, "Resolved: the actions of corporations ought to be held to the same standards as the actions of individuals." [Update: for you LD researchers, this is my primary link.]

It's an article titled "The Moral Status of the Corporation," by R.E. Ewin, from the Journal of Business Ethics, Volume 10, Issue 10 (1991). (Use ProQuest or a university library to access the entire article--that is, unless you're liberal with your cash and want to pay $30 for the privilege.)

The conclusion gets to the heart of the matter:
Because they are artificial and not "natural" people, corporations lack the emotional make-up necessary to the possession of virtues and vices. Their moral responsibility is exhausted by their legal personality. Corporations can have rights and duties; they can exercise the rights through their agents, and they can in the same way fulfill their duties. If necessary, they can be forced to fulfil [sic] their duties. The moral personality of a corporation would be at best a Kantian sort of moral personality, one restricted to the issues of requirement, rights, and duties. It could not be the richer moral life of virtues and vices that is lived by the shareholders, the executives, the shop-floor workers, the unemployed, and "natural" people in general.
There are two ways this could run. For an affirmative running Kantian morality, the argument shows that corporations must be held to the same standards--"requirement, rights, and duties." For a negative running virtue ethics (or, perhaps, some other non-Kantian moral framework), it lines up well with the argument that corporations cannot be held to the same standards, since they cannot be praised or blamed for virtues and vices the way individuals can.

As a tangent, one must question the "natural / artificial" dichotomy that grounds Ewin's argument. Would an enhanced human--with artificial limbs, neurological prosthetics (such as a computer chip that aids memory), or other "artificial" characteristics--still be considered a "natural" person? Why wouldn't a robot programmed to obey Asimov's three laws be morally culpable for breaking them?

7 comments:

  1. Anonymous, 3:12 PM

    Assuming that a robot acted in a manner that caused harm, it would probably be from a lack of information--e.g., the robot accidentally throws a piano onto a human it didn't see in order to save two people beneath it. Did the robot act properly? Well, yes, considering what it knew and its intent.

  2. 1. I'm not sure what you're getting at--couldn't the same analysis be applied to a "natural" person?

    2. Is it possible that a software corruption could produce an "evil" robot that intentionally violates Asimov's laws?

    3. Would we be justified in declaring that robot's actions immoral?

  3. Jim, would you remind us clueless surfers-by if we're welcome to offer thoughts on these schoolish-seeming topics?

  4. Anonymous, 11:23 PM

    I had a response, but it wouldn't post. I am unsure why. Will retort tomorrow.

  5. murky, of course you can. In fact, if I get the time, I'm going to try and arrange a mini-carnival of entries by some of my smart blogging neighbors, all related to this topic.

  6. Anonymous, 8:04 PM

    I don't understand how this shows that a corporation must be held to that standard ("requirements, rights, and duties")...

  7. If we adopt a Kantian moral framework, and if we argue that the corporation is an "artificial person" within that framework, then it ought to be held to Kantian moral standards. The explanation would form your contentions.
