Friday, April 1, 2011

Are Researchers Blinded by the Value of Their Own Work?

The term “dual-use research” carries an inherent tension: it implies a dark twist on the honorable institution of scientific research and on the actors within it. If honest research can be tainted and misused for harm, is any science safe from the grip of the evildoers who would impose that harm? Moreover, the kind of iniquity that turns research against its purpose can create other forms of harm as well: social, ethical, and political. Given the odds that sound research will produce harmful unintended results, how is the researcher equipped to foresee these events before they unfold, so that he can better protect his work from misuse? In her book “Science, Policy, and the Value-Free Ideal,” Heather Douglas examines the moral responsibility of researchers to reasonably predict the consequences of their work. In chapter four she breaks down the types of unintended consequences and surveys the measures taken to preserve the purity of scientific discovery. Her larger objective, however, is to deconstruct the notion of the morally autonomous scientist by exposing how tightly science and society are bound together. While she allows that the responsibility of foresight and prediction for unanticipated outcomes can be distributed to groups outside science, such as public participants or ethics review boards, she maintains that the scientist, as the specialist, is in the best position to recognize and appreciate the implications of his work and its potential for harm. Yet while it is certain that the scientist best understands the science itself, if her claim is that science and society are intertwined, then it cannot be the sole moral responsibility of the scientist to evaluate every implication of his work.
A person's ability to recognize the flaws in his own work, or its inherent potential to be turned to an alternate purpose, is often lost when he becomes completely absorbed in that work. The more focused a researcher is on a goal or on completing a research interest, I would argue, the less likely he is to recognize the possibility of dual use or unintended consequences. Moreover, the unforeseeable consequences are often a product of the research itself and may not even have been possible until the research was conducted and reported. Consider methamphetamine. The drug was dispensed to soldiers in World War II for its stimulant effects, a great benefit at the time to armies that needed alert soldiers for long battles. We all know how this drug has since been re-synthesized and misused in today's society. Now consider pseudoephedrine, marketed as a nasal decongestant that acts on the alpha-adrenergic system. Millions of people have benefited from this drug, yet millions have also been harmed by its dual use as a precursor for producing methamphetamine, a harmful and extremely addictive stimulant. From my chair, I see no way the researchers and producers of pseudoephedrine could have foreseen or predicted this drug's potential for harm or misuse; their goal was a vasoconstrictor that reduced nasal inflammation, and their focus on that applied project was precisely what undermined their ability to anticipate negative uses of their work. It appears more likely that the misuse derives from negative social applications of research, making the unintended consequence a matter of social moral responsibility more than of the researcher's. I agree with the author's stance that science is integrated with society and does not act as a morally autonomous agent.
However, the moral responsibility to predict dual-use applications of research should not fall primarily on the researcher's shoulders, given his blindness to his own work's weaknesses and society's unpredictable propensity to misuse science.

Comments:

  1. I agree that the moral responsibility to predict should not fall on the researchers. We can barely predict the weather, let alone how society will use research for its own purposes (good or bad).

    Nonetheless, then comes the question of the scientists who worked on the atomic bomb. They knew what its purpose was (a weapon), even though many saw a potential for its use in energy. The question then is whether the scientists had a moral obligation to say or do something. Or did the potential good outcomes outweigh the bad?

  2. I think the scientists felt that it was their moral obligation to do something: build a bomb so intense it would forever protect the American dream and the society that favored and fought for it. I suppose it was not until they reflected on their actions, post-Hiroshima, that the researchers realized the consequences. Perhaps the inventors of methamphetamine from pseudoephedrine will feel the same way, although they probably died from an overdose before they could feel remorse.
