About Me

Australian philosopher, literary critic, legal scholar, and professional writer. Based in Newcastle, NSW. My latest books are The Tyranny of Opinion: Conformity and the Future of Liberalism (2019) and At the Dawn of a Great Transition: The Question of Radical Enhancement (2021).

Wednesday, July 26, 2006

Supping with Leviathan - emerging biotechnologies and the state

The state has a role in regulating the new biotechnologies, but what should that role be?

Thoughts of a good Millian
I'm a good enough Millian to be suspicious of attempts by the state (or by the pressure of public opinion, if it comes to that) to interfere in the ways that competent adults choose to live their own lives. Mill's "harm principle" may not be justified all the way down into some kind of philosophical bedrock, but it does seem to be a reasonable one for modern democratic societies to respect in a wide range of cases where the state might otherwise be tempted to overreach. Some such principle is needed to protect us from Leviathan's attempts to control our lives, or else its powers become something to contemplate with fear as much as gratitude.

Even a good Millian, though, can accept that the state has certain legitimate roles here. First, the state has a role in regulating "how we get there from here" - the manner in which emerging biotechnologies are researched and developed. I hope that no one wants to see research programs that are cruel, exploitative of vulnerable people, or likely to produce shocking outcomes such as the birth of congenitally deformed children. There is a role here for regulators in ensuring that any research is conducted in ways that meet normal ethical requirements.

I think we should accept that this will sometimes create an impasse, whether temporary or permanent. For example, we don't have a safe technology for human reproductive cloning, and for the moment at least there's no obvious prospect of developing it without conducting unethical experiments. That does not entail that human cloning would be an intrinsically wrong act, or that it should be stigmatised as seriously reprehensible or anti-social by the enactment of draconian laws (such as those applying in Australia). The problem is just that there is no obvious ethical way of getting to safe human reproductive cloning from where we now are. If someone ever finds such a way, then I suggest that their actions should not fall foul of the law.

Second, the state has a role in developing standards to protect consumers from products or procedures that are not safe for their purpose. In areas such as this, where a high degree of expertise is required to make judgments, some degree of state paternalism is surely allowable before we begin to feel that it is offensive to us as competent adults. There comes a point at which paternalistic legislation is, indeed, offensive, but the state should be given some margin of appreciation to work within. More generally, the state has a role in regulating the safety and efficiency of industries within its jurisdiction, including the biotech industry. This might, for example, include establishing licensing schemes to ensure the competence and ethical responsiveness of practitioners and corporations.

What about the children?
So far, so good.

But does the state have a legitimate role that goes beyond these - possibly into the area of making moral judgments about what technologies people should use in the process of shaping their own lives and those of their children?

Where children are involved, there is often a double standard: attempts by parents to shape the development of children's capacities and personalities through "natural" means, such as exposing them at an early age to dubious moral or religious beliefs, are seldom criticised (except by the bold and forthright Richard Dawkins). Yet postulated attempts to use "unnatural" means are routinely castigated, even if they involve measures that might actually increase the ability of children to think rationally and autonomously as they grow older. Something odd is going on here - perhaps the good old "yuck factor" at work.

That is not to say that we adults should be able to do anything at all to shape children's personalities and capacities. Perhaps, indeed, we should be critical of parents in a wider range of situations than has historically been the case. Whether or not that is so, the same moral standard should apply to all attempts by parents to shape and socialise their children. That does not necessarily mean that the same legal standards should apply, since there may be factors to do with which laws are enforceable, or with which kinds of behaviours are so deeply rooted in human nature that attempts to prohibit them would simply be cruel. However, it is significant that no special moral issue arises merely because the means used are technological rather than "natural".

At first blush, the following suggestion looks plausible: before we prohibit a parental practice on the ground that we must protect children's interests, it should be a necessary condition, even if not a sufficient one, that the practice really is likely to be harmful to children. Sounds reasonable, yes? At least, we should be able to defend a claim that the practice presents a risk to children's flourishing in some publicly explicable sense. Many such claims are quite implausible when gazed at with a cold eye.

Supping with the sea monster
But should some technologies be suppressed even if they could be developed ethically, used safely, and applied to purposes that do no obvious harm to others (including to the flourishing of growing children)? This is where any role for the state becomes highly controversial. The first thing that should be said is that we need to be very wary of legitimising Leviathan's actions in making judgments about whether we are acting in a manner that deviates from the most valuable forms of life, or that shows a lack of moral virtue, or expresses undesirable attitudes. These are just the sorts of judgments that would, quite rightly, horrify a liberal like Mill. If we're supping with the sea monster, here is where we need a very long spoon.

However, I do think that Leviathan has a legitimate stake if there is some prospect that fundamental elements of the social order found in modern liberal-democratic societies could be threatened by emerging technologies. This might mean, for example, that we don't want the state to permit the creation of beings who might be unusually dangerous to the rest of us, even if the harm to us is merely potential and the beings concerned will arguably flourish in their fashion (consider a happy, superhumanly powerful psychopath). Again, we don't want the recreation of a pre-democratic, hierarchical kind of society in which some people flourish only at the expense of others who are subordinated. The state has an interest in trying to prevent this, even if it is likely to come about only gradually, with no direct harms inflicted during the process.

Well, what should Leviathan do?
It's one thing to say that Leviathan has a legitimate interest, but how should it actually act? Surely that depends on many things, including the magnitude of the risk identified, the probability of its occurrence, the difficulties that might be experienced in taking various actions, the utility of not taking action and letting events run their course, the most advantageous action that would have a good prospect of avoiding the risk, and so on. In other words, I suggest that the state should respond to these sorts of (sometimes quite intangible) risks in a manner not unlike the common law "calculus of negligence" that applies to civil claims for damages arising from breach of a duty of care. The analogy can perhaps be pressed further: for example, the state should not take action based on conjecture about some far-fetched or unforeseeable outcome. We know that any action, or any failure to act, may have far-reaching implications that no one could reasonably be expected to foresee. That is not a good reason for paralysis.
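
To make the analogy concrete (and only as a loose illustration, not a precise decision procedure): the calculus of negligence is often summarised, at least in its American formulation, by the Learned Hand formula, which finds a breach of duty where the burden of taking precautions is less than the expected loss that those precautions would avert:

    B < P \times L

where B is the burden of the precaution, P is the probability of the harm eventuating, and L is the gravity of the loss if it does. The state's weighing of intangible social risks against the costs of intervention would be similar in spirit, though far less tidy in practice.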

If, as I believe, emerging biotechnologies have an obvious prospect of bringing great benefits, the state should not suppress them on the basis of mere speculation about the social harms that they might also bring. An attempt should be made to assess what the realistic risks are, and to see what can be done to reduce those risks while minimising the loss of likely benefits. As I've written elsewhere, it would be deeply regrettable if some beneficial technologies had to be suppressed or severely regulated. If this ever happens, it should be met with regret and a sense of humility, rather than with moralistic posturing.

Still, I can imagine scenarios that might prompt us to take some public action to affect not only the order in which certain technologies are developed but also the order in which a whole range of problems - scientific, technological, and social - are tackled. For example, there could be scenarios in which we must tackle poverty and inequality before we dare permit the sudden availability of a particular technology.

That is not actually my preferred policy option with any technology. The main one to which such thinking might apply is germ-line genetic enhancement, if this became unexpectedly flexible and powerful. Even here, however, I suspect that the practical reality will be that whatever is developed in the foreseeable future will not require suppression in order to avoid the recreation of hierarchical societies.

Nonetheless, it is legitimate for the state to assert a degree of control, partly because the viability of the social order is at least conjecturally at stake, and all the more so because there are so many other aspects of technological development and distribution that clearly fall within Leviathan's proper purview - e.g. the above-mentioned matters of research ethics, consumer safety, etc. If obtaining control involves a rational program of differential problem solving, I can live with it, perhaps, though I'd want my say about what problems should be tackled immediately. (Hint: we can't go putting off everything else until we solve all the intractable problems of social and global justice; some far-reaching biotechnological research will have to be pursued at the same time.)

Drawing out Leviathan with a hook while keeping the candle burning
There has to be a limit somewhere as to what we embrace as legitimate action by the state.

Less legitimate action is exemplified by such knee-jerk (over-)reactions as the enactment of sweeping and harsh criminal prohibitions of human cloning, genetic manipulation of human embryos, and so on. Given that draconian overreactions have been so common in many jurisdictions, with no end in sight, perhaps the best thing that philosophically minded people like me can do is simply try to keep alight the candle of reason, dissenting from the approaches that have been so dominant to date. Some of us need to keep reminding others that all these complex matters to do with emerging technologies need to be considered coolly, without pre-emption by the yuck factor, or Kassian repugnance, or even by the instinctive impulse that we all sometimes feel to attempt to use the law to impose our personal views of what is virtuous, or what is a valuable way of life.

That's a pretty tough job. Of course, someone's got to do it.
