
Sunday, January 30, 2011

The inherent tensions of science and military complexes.

As children, we are taught the history of the world wars and the tension between the Allies and the “enemies.” Like most children, we come away deeply conflicted: proud of the victory, yet full of empathy for those who died. The growth of knowledge has not resolved these fundamental tensions; it has only enlarged the grounds on which they are felt.

Dominique Pestre states that during World War I there was no clear policy for integrating the sciences into the military complex. This should not come as a surprise: at the very same moment in history, governance over everyday hazards such as pharmaceuticals was still not standard, and no regulations yet prohibited ingredients like mercury or ethylene glycol in drug therapy. It would seem odd for the United States government to have had enough involvement to implement policies governing relations between the military and science.

A hands-off approach to warfare tactics may be quite handy in wartime, as it would seem contradictory to draft policies restricting the use of mustard gas when that gas may be the best way to end the war. The same may be said of atomic weapons. While society may be quick to cover its eyes to the evils that are created, there is little doubt the world would be a much different place without those very evils. This expresses the internal tension surrounding ethics in military-science advancement: do we really want governance over technologies that can end the suffering of war?

Pestre also states that “it was the physicists who pushed the project forward,” in reference to the creation of the atomic bomb. This was only after they contemplated the ethics and chose to advance the military initiative with open eyes. It wasn't until after the moment of creation that regret became more powerful than the need to compete and overcome the “enemy.” I don't find this situation unique: the creator of mustard gas also advanced his discovery with full knowledge of its capacity to do harm. It is worth considering that the regret may stem from personal involvement in the creation rather than from the creation of the weapon itself. These evils existed in part because there was no policy on the role of science in the military; yet had there been policy that inhibited them, the world might be a much different place, for better or for worse.

Friday, January 28, 2011

Ops Research as Social Science

Looking back on the development and uses of science in the 20th century, one theme that emerges is the growing ability to scale the creation and application of science. Before the industrial revolution, science was often bound up in craft-based professions and experiential systems for transferring knowledge from generation to generation. Agriculture, mining, metalwork, and the other early scientific endeavors were built on small groups of individual craftsmen and artisans. The educational systems of pre-industrial civilization reserved skills such as mathematics, writing, philosophy, and critical thinking for the privileged, while accumulated knowledge such as farming and animal husbandry was passed on through apprenticeships or years of performing tasks.

As cultures came to depend more on technology, science had to scale the transfer of knowledge to wider circles of society and develop methods of recording and educating ever more people. The history of science is in many ways the history of our ability to institutionalize what has already been learned. At the start of the industrial age, invention was mostly the work of individual inventors who accumulated enough cross-disciplinary knowledge to make breakthroughs in technology; once a breakthrough had been made, industry could learn to apply it and distribute it to society. Carlson describes the work of Edison and Bell as examples of such inventors. As they created new technologies, they then focused on the operational aspects of manufacturing equipment, which became the artifacts that influenced the greater society.

In Mendelsohn and Porter, we were shown how government and society co-opt these ideas for the benefit of the greater society. Mendelsohn touches on the invention of Operations Research, where the scientific method is applied to systems of behavior and human endeavor, not just to natural phenomena or physical inventions. At this point the scientific study of science itself begins to take shape. I believe the work done on Operations Research in World War II laid the foundations for the modern social scientists who do Science and Technology Studies and investigate the interaction of social systems and new technologies. The application of science to concepts of warfare is certainly more application-specific than the current questions of how new technologies drive social change and how they sit at the intersection of society and science, but the study of how war was waged using new technologies, and how even newer solutions could counter those tactics, harks back to modern ideas of co-production and social construction. Personally, I have always been fascinated by the operations work done on Design of Experiments during this era, which gave us the first systems of multi-variable analysis and helped the war effort through economic and production optimization. These tools allowed science to develop systems that accelerate the formulation of questions and the identification of key gaps in knowledge, driving more rapid achievement. Science and Technology Studies can take these lessons and apply them in the current environment of complex interdisciplinary research and social interaction with emerging technologies.
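To make the flavor of that wartime production optimization concrete, here is a minimal sketch of the kind of problem early Operations Research tackled, posed as a linear program and solved with SciPy's linprog. The products, profits, and resource limits are invented for illustration, not drawn from any historical case.

# Minimal linear-programming sketch of a production-optimization problem
# in the spirit of wartime Operations Research. All numbers are illustrative.
from scipy.optimize import linprog

# Profit per unit: product A = 40, product B = 30.
# linprog minimizes, so we negate the objective to maximize profit.
objective = [-40, -30]

# Resource usage per unit of A and B, one row per scarce resource.
A_ub = [[1, 2],   # machine-hours: 1*A + 2*B <= 40
        [3, 1]]   # labor-hours:   3*A + 1*B <= 60
b_ub = [40, 60]

result = linprog(objective, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
print(result.x)     # optimal plan: produce 16 of A and 12 of B
print(-result.fun)  # maximized profit: 1000.0

George Dantzig's simplex method, which grew directly out of this kind of military planning work, made such problems routinely solvable and remains a backbone of production and logistics optimization today.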

Innovation, but why?

Ancient peoples worshiped many gods, but modern civilization bows before a single principle: Innovation. As President Obama said in Tuesday's State of the Union address, “In America, innovation doesn't just change our lives. It is how we make our living.” He went on to use the word ten more times, making innovation the major theme of his speech. Innovation is more than a word: its influence can be seen in the ways that major institutions, such as business and the military, have reorganized themselves around a state of permanent innovation. In what follows, I will examine two paths to this state and its consequences for the scientific community and society at large.


Carlson traces the development of the corporate research and development lab. The first innovators were inventors, craftsmen who improved devices increment by increment. But as a systematic source of innovation, these small inventors typical of the Industrial Revolution were hobbled by a lack of capital and by the limits of any one person's knowledge. While tinkering with existing devices and principles was within the reach of many ambitious craftsmen, truly novel principles, and the means to bring advanced technologies to market, were out of reach.


Carlson traces the dawn of institutional innovation to the telegraph. As Western Union spread across the country, competing with local firms, railroads, financiers, and anti-trust lawyers, it became apparent that the difference between profit and extinction lay in harnessing the latest electrical technology, usually by buying patents from private inventors. Thomas Edison parlayed his success as an inventor into an immense private workshop, but it was General Electric and its chief scientist, Elihu Thomson, that created the modern model of corporate R&D in 1900. Frustrated by the coordination among scattered factories required to build an experimental car, Thomson convinced the GE board to create a permanent lab conducting basic research.


At first, the purpose of the lab was purely defensive: to protect GE products from superior competitors. But as time passed, industrialists realized that new knowledge could be used offensively, to create new markets, to trade with competitors, and to improve public standing. Compared to the 'random genius' of inventors, management preferred scientific innovation because it seemed predictable and controllable. This basic pattern, with the added details of intra-industry collaboration and federal support of risky technologies, has continued into the 21st century. In real terms, though, large R&D labs have been responsible for surprisingly few breakthroughs; much of the most creative work comes from smaller companies, a model best demonstrated in biotech and computing, where small start-ups holding one piece of very valuable IP are purchased and developed by larger conglomerates.


A second side of institutional innovation is the military, which supports up to half of the basic research conducted in America. War and technology have long been closely intertwined, as brilliantly explored by William McNeill in The Pursuit of Power. Perhaps the first noteworthy institutionalization of innovation was the British shipbuilding industry circa 1900, where an “Iron Triangle” of shipyards, admirals, and hawkish liberal politicians pushed steel to its limits with ever more powerful battleships. But it was not until WW1 that innovative warfare had its first chance to shine. Innovation was applied haphazardly, in the form of machine guns, poison gas, aircraft, tanks, submarines, and anti-submarine warfare, but there was little coordination between scientists and soldiers. Each new weapon would make an initial splash, then quickly add to the stalemate. The war was eventually decided by a German economic collapse.


Many of the scientific institutions of WW1 were dismantled in the interwar years, but WW2 was above all a war won by cutting-edge science. Radar, operations research, airpower, and of course the atomic bomb were all products of Allied scientific knowledge, while jet fighters and rockets rolled off Nazi production lines at the close of the war. Federally supported labs, and defense companies that sold solely to the government, proliferated, too many to name. With an obvious and immediate clash between the Allies and the Soviet Union at hand, neither side disarmed its scientific apparatus. Both sides sought to avoid qualitative defeat, or worse, technological surprise, investing ever larger sums in military R&D and producing the domineering “military-industrial complex” of President Eisenhower's farewell address.


For scientists, these twin processes have been a mixed blessing. On the one hand, science has obtained a great deal of funding from industrial and military sources, orders of magnitude more than the pure 'pursuit of truth' ever attracted. Yet scientists have lost their autonomy, tied either to market forces or to military imperatives. Biomedicine has improved healthcare, but it has also sharply increased its costs; the process of introducing a new drug is now more akin to marketing than to science or medicine. Through the military, science “has known sin,” to paraphrase Oppenheimer's haunting remark about physicists. Where for a period from about 1850 to 1945 the scientist could truly claim to represent a universal humanity, working toward the ends of destruction has permanently damaged scientific prestige and credibility. The values of science are subordinated to petty, nationalist ends.


For society, the pursuit of innovation has led to the threat of man-made extinction through nuclear war. The action-reaction process of the arms race brings us ever closer to the brink of annihilation. From the market side, the permanent churning of the basic constituents of society has created immense dislocation: skills and jobs can become obsolete in less than a decade. With new-found material wealth came a crass materialism. The objects around us change constantly, their principles of operation ever more opaque. The deep sense of unease pervading American society might reasonably be traced to chronic future shock. Innovation is a god, but it has become Moloch, concerned solely with profit and military might.


So, to return to the State of the Union. I've read it several times, and I feel conflicted. It's a good speech, certainly, and I agree with many of the specific policies the President outlines for continued investment in innovation, yet there is a certain hollowness to it, a failure to grapple with the crux of why we innovate. The main drive to innovate is material: the jobs of the 21st century should be located in America. Yet we don't know that innovation will bring back jobs; at best we know from the lessons of the past that a failure to innovate will mean the loss of more jobs. The ultimate hollowness came at the end. President Obama made a deliberate callback to the space race with the phrase “Sputnik moment,” but President Kennedy knew where we were going: the Moon, within ten years.


Obama's answer to Kennedy: “I'm not sure how we'll reach that better place beyond the horizon, but I know we'll get there. I know we will.”


That's certainly true. We'll definitely make it to the future the old-fashioned way, by living it, one day at a time. But that's no guarantee that the future will be any place we want to live. Right now, all we have is a notion that America must be wealthier than China. As individuals, as a nation, and as a species, we must decide what is beyond that horizon, and we must build the institutions of governance to take us there.

Thursday, January 27, 2011

The Creation of a Uniform Identity

Who are you? What are you? These sound like the existential questions philosophers have wrestled with since the dawn of self-awareness. Yet we are asked them whenever we apply for new jobs or programs, and every ten years the Constitution mandates that we face these daunting metaphysical questions in the census.

However, it is not the questions themselves that are intriguing. The established answers people must choose from, taken in conjunction with the questions asked, produce consequences (unintended or not):
  • Which age group are you in?  Under 19, 20-29, 30-39, 40-49, 50-59, 60+
  • What is your gender? Male or Female
  • What is your race? Caucasian, Hispanic, Black, Asian, Other
  • Which social class do you place yourself and your household in? Poor, middle class, or upper class
In his chapter in Science in the 20th Century, edited by John Krige and Dominique Pestre, Theodore M. Porter describes how census and social data created and reinforced labels and identities. Along the same line, Michel Foucault discussed how analysis (like the census) is a form of political power with normalizing force. People are forced to identify themselves with one of the provided groups (and to choose only one!). Such distinctions imply that there are relevant social, economic, and political differences between the groups. Then there is the ambiguous “other” category: when people do not identify with any of the offered labels, they are forced to identify themselves as ‘different’ and not worth a label.

This idea of identity and social labels being created and reinforced by science intrigued me. As an identical twin, I spent years in internal existential debate trying to figure out who I am as an individual, separate from someone who shares 99.99% of my DNA. (Yes, my dad bought a test to learn how identical we are; the company that produced and marketed these tests now has my DNA frozen somewhere for some unknown future research.) Would this search have been quicker if I had just looked at government surveys and my genome? Take a DNA test to figure out who you are physiologically, then take a government survey to figure out who you are socially... done.

What propels us to do this? Taking a historical perspective, Daniel Kevles provides a timeline of heredity and genetics research as it relates to eugenics and genetic manipulation. Gaining momentum in the 20th century, this ‘science’ (whether it was a proper science is a discussion beyond the scope of this blog) was used to explain scientifically how those outside the promoted norm came to be. Science was then used to justify pushing the “outsiders” further from the norm, if not eliminating the group altogether.

So is it actual fear, or just a lack of understanding, that leads us to do this? How big a role does prejudice play? In our capitalist world, where does money come into play? Trying to come up with a universal answer is as fruitless as trying to procure a universal reason why people created and joined the Tea Party movement. Each individual has their own logic and reasoning, grounded in their own history and education.

Nonetheless, Kevles does describe how eugenic scientists tried to move away from perceived and actual prejudice, focusing instead on specific aspects of heredity and genes. The Human Genome Project ignited fears that science would be used to promote bigotry. Those fears never materialized.

Yet we must ask ourselves whether we are facing a new wave of potential abuses with the growth of DNA testing. If so, the masses have not noticed, or are choosing to agree or stay silent. The Genetic Information Nondiscrimination Act (GINA), signed in May 2008 and effective a year later, bans discrimination by health insurers and employers on the basis of genetic information. I myself do not feel protected enough by this act. It is built to be reactive: punish those who have already committed the offense. Proactive policy is needed, but in a legal and governmental system built to be reactive, that will require the masses to do what they were supposed to do after the Human Genome Project: educate themselves, unite their voices, and push for action.