Monday, January 31, 2011

The Human dimension of technology: Individual Vs. Machine.



Survival: that is the very origin of all the technology we know as a human race. Since the beginning of human history, the evolution of the species has been deeply interconnected with a variety of methods, processes, and tools that gave us the ability to survive: the weather, through knowledge of stars and seasons; shortages of resources, through agriculture; threats to our safety, through weapons; and the need of the race to endure through time, through society.


It is essential to understand that human technology is not limited to material artifacts. In an anthropological sense, an artifact is anything man-made, and there exists a complex variety of non-material artifacts that govern our lives, since we base our reality on technology: borders, race, social hierarchies, morals, languages, economic models, and so on, all of them human technologies conceived to simplify or overcome significant complications throughout human history.


Although technology is usually understood in a material sense, when viewed through the lens of human needs it turns out to be much more than devices, gadgets, and machines; all human artifacts rest on the most important technology ever created by man: society.


Every other sort of technology, material or intellectual, has been constructed upon that fundamental apparatus. Thus, when tracing back the rationale behind any given technology, it is society that gives that technology its meaning and use. That is the primary human dimension of all technology, which is valued and enhanced according to the commands, stipulations, requirements, and practices embedded in society.


When technology is in need of governance, it takes its principles from social ideas and constructions; in this way, policies on technology are a reflection of what society deems advantageous and functional.


Theorizing technology as a social construction makes me wonder: how does society, constituted by individuals, transform into technology? In other words, how does the social machine create the technology that keeps it going? Is that even observable?


An example: in a society where capital is the fuel that keeps the machinery dynamic, the priority would be to keep that fuel flowing in order to survive, which, as stated before, is the highest motive for technology to exist. What, then, would the system (the society) do to endure through time? It would create technology to keep the fuel circulating. Bearing this in mind, let us trace back the origin of that fuel: capital must be present in every single process within society. If somebody is sick, in other words not capable of contributing to the system, society will generate the proper mechanisms to make that individual healthy again, so he can be reinserted into the societal mechanism. There are clearly requirements for being part of the system, the most important being to keep the organism resilient enough to subsist; consequently, medical artifacts will appear. Again the question: does technology manifest because of the need of the individual to be healthy, or because of his potential inability to be part of the system?

Is society making us parts of a machine that is itself struggling to survive?

Are we as individuals still in control?


If the thesis of society as the main driving force is sound, then we are artifacts of society; we live as social technology capable of performing certain activities, and only those. As in any other system, if we are not able to perform properly we become obsolete and are no longer useful. A clear example of this phenomenon would be the loss of the capacity to generate fuel for the system; without this ability we are not artifacts worthy of the system, and we become, in the best scenario, a malfunctioning part, but eventually useless technology.


The ultimate question to answer is: are we part of a functioning system, and is the system in which we function coherent with our individuality?

Sunday, January 30, 2011

The inherent tensions of science and military complexes.

As children, we are taught the history of the world wars and the tension between the Allies and the “enemies.” And, as with most children, a sense of deep conflict ensues between pride in the victory and empathy for those who died. It seems that the impetus of knowledge has not altered these fundamental tensions, but instead enhanced the grounds on which the tensions are felt.

Dominique Pestre states that during World War I there was no clear policy for the integration of the sciences into the military complex. I would like to think that this should not come as a surprise, considering that at that very same moment in history, governance over everyday hazards such as pharmaceuticals was still not standard, and policy had not been developed to generate regulations preventing things like mercury or ethylene glycol from being included in drug therapy. It would seem odd, then, to expect the United States government to have had enough involvement to implement policies to continue or govern the relations between the military and science. Having little policy concerning warfare tactics may be quite handy in times of war, as it would seem somewhat contradictory to make policies restricting the use of mustard gas when that gas may be the best way to end the war. The same may also be said of the use of atomic weapons. While society may be quick to cover its eyes to the evil that is created, there is little doubt that the world would be a much different place without those very evils. This idea expresses the internal tension surrounding the notion of ethics in military science advancements. Do we really want governance over technologies that can end the suffering of war? Pestre also states “it was the physicists who pushed the project forward” in reference to the creation of the atomic bomb. This was only after they contemplated the ethics and chose to advance the military initiative with open eyes. It wasn’t until after the moment of creation that regret became more powerful than the need to compete and overcome the “enemy.” I don’t find this situation very unique, as the creator of mustard gas also advanced his discovery with full knowledge of its capacity to do harm. It is worth considering that the regret may stem from the personal involvement in the creation and not from the actual creation of the military device. All of these evils existed in part because there was no policy on the role of science in the military; and yet, had there been policy that inhibited those evils, then maybe the world would be a much different place, for better or worse.

Friday, January 28, 2011

Democratic plurality or enlightened despotism?

In a globally connected and technologically complex world the process of democratic decision making becomes increasingly difficult.  Never have so many known so little about so much.  In this light, how should a democratic society process and handle technical decisions when outside the core of decision making are people who are cut off, don’t understand, and wouldn’t understand even if they got in to find out? 

 

Since the beginning of the Enlightenment era, science and scientific ideologies have been entangled with politics and used by the state to regulate society.  Chapter 4 of “Historical Perspectives on Science and the State…” provides a pertinent historiography on the interaction of the sciences with the political world and the state.  The author, Dominique Pestre, discusses ways the sciences can be “sites of power” by contributing to the management of the state and society, and how they are an integral part of larger techno-political systems.  As such, scientific experts and specialists have long been affecting decision making on scientific matters, thereby, to some degree, replacing democracy with technocracy.

 

Pestre points out that this reliance on experts requires a “conviction that expertise does exist”, “…the certainty that social and natural facts can be assessed authoritatively by science.”  He continues, “…science has thus become an authority to legitimate public action, to ‘technicalize’ public action, to ‘de-politicize’ it, to render it impersonal, thus bypassing the democratic rules of accountability.”  He suggests this leads to an “instrumentalization of politics through use of specialists”, which gives to political decisions the force of necessity, and therefore a substitution of “…competence and technical knowledge for the affirmation of will and of values deliberately chosen.”  At the extreme, “…‘scientific politics’ converges with political practice reduced to certitudes of a well managed leviathan state”, thus a technocratic management of society.

 

The notion of the few making decisions for the many is obviously problematic, but when decision making becomes so complex that it requires esoteric knowledge, the democratic process breaks down.  The question then is, does the use of experts merely constitute a form of enlightened despotism?  Or is there a better -- more democratic -- way to achieve a genuine plurality and to effectively combine a confluence of interests and opinions that can competently address complex scientific decisions?  A difficult dilemma indeed!

 

Ops Research as Social Science

In looking back on the development and uses of science in the 20th century, one of the themes that emerges is the growing ability to scale the creation and application of science. Science before the industrial revolution was often tied up in the craft-based professions and experiential systems of transferring knowledge from generation to generation. Agriculture, mining, metalwork and the other initial scientific endeavors were built on small groups of individual craftsmen or artisans. The educational systems of pre-industrial civilization reserved skills such as mathematics, writing, philosophy and critical thinking for the privileged, while accumulated knowledge such as farming and animal husbandry was passed on through apprenticeships or years of performing tasks. As cultures began to depend more on technology, science had to learn to scale the transfer of knowledge to wider circles of society and start developing methods of recording and educating more and more people. The history of science is in many ways tied to our ability to institutionalize what has already been learned. At the start of the industrial age, we learned that invention was mostly the work of individual inventors who accumulated enough cross-disciplinary knowledge to make breakthroughs in technology. Once the breakthrough had been made, industry could learn to apply it and distribute it to society. Carlson describes the work of Edison and Bell as examples of these inventors. As they created new technologies, they then focused on the operational aspects of manufacturing equipment, which became the artifacts that influenced the greater society.

In Mendelsohn and Porter, we were shown how government or society co-opts these ideas for the benefit of the greater society. Mendelsohn touches on the invention of the science of Operations Research, where the scientific method is applied to systems of behavior and human endeavor, not just to natural phenomena or to physical inventions. At this point the study of science begins to take shape. I believe the work done in World War II on Operations Research laid the foundations for the modern social scientists who do Science and Technology Studies and investigate the interaction of social systems and new technologies. The application of science to concepts of warfare is certainly more application-specific than the current questions investigating how new technologies will drive social changes and how new technologies sit at the intersection of society and science, but the study of how war was executed using new technologies, and how even newer solutions could be used to counter those tactics, harks back to modern ideas of co-production and social construction. Personally, I have always been fascinated by the Operations work done in Design of Experiments during this age, which gave us the first systems of multi-variable analysis and helped the war effort through economic and production optimization. These tools allowed science to develop systems that accelerate the formulation of questions and the identification of key gaps in knowledge that drive more rapid achievement. Science and Technology Studies can take these lessons and utilize them in the current environment of complex interdisciplinary research and social interaction with emerging technologies.
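To make that fascination a bit more concrete, here is a minimal sketch of the style of multi-variable analysis that Design of Experiments introduced: a two-factor, two-level factorial design with main-effect estimates. The factors, levels, and yield figures below are hypothetical stand-ins of my own, not data from any wartime study.

```python
# A toy 2x2 factorial design: both factors are varied together in one small
# experiment, and the "main effect" of each factor is estimated from all runs.
import itertools

# Coded factor levels: -1 = low, +1 = high (e.g., temperature and catalyst amount).
design = list(itertools.product([-1, +1], repeat=2))

# Hypothetical measured responses (e.g., production yield) for the four runs.
yields = {(-1, -1): 52.0, (+1, -1): 60.0, (-1, +1): 55.0, (+1, +1): 68.0}

def main_effect(factor_index: int) -> float:
    """Average response at the factor's high level minus the average at its low level."""
    high = [yields[run] for run in design if run[factor_index] == +1]
    low = [yields[run] for run in design if run[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

for i in range(2):
    print(f"Main effect of factor {i + 1}: {main_effect(i):+.1f}")

# Varying the factors together, rather than one at a time, is what made these
# designs so efficient for wartime production and economic optimization work.
```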

Revisiting Nuclear and Childhood Reflections


Above is a photo of a presentation I attended a few winters back about the Manhattan Project. Pictured is a collection of rocks gathered from nuclear test sites across the globe. This lecture was held in an art gallery as part of a broader initiative entitled the Body Cartography Project: "1/2 Life", which sought to “address the environmental problems of nuclear residue and indestructible plastics”. Speaking was a professor of physics at the University of Minnesota who worked on the project and witnessed the testing. Hans told the story of his recruitment into the project, stating that he had had one semester of science (I don’t think it was even a physics course) when he enlisted in the army. He was of course a grunt-worker in the lab performing menial tasks, and he gradually gathered information about the project on his own. He was cut off from friends and family completely for the duration of his time there. In describing his viewing of the Trinity test, you could see the memory in his eyes, and his only response to the question of what it was like was, “Awesome”.

This story relates to the interconnections between universities, the military, governments, and scientific pursuit discussed in most detail in “Science, Scientists, and the Military”. Hans’s “need to know” experience could hardly be an anomaly in continued sensitive research. Could he be held partially responsible for the bomb for ensuring the day-to-day logistics of the base? The moral debate surrounding the supposed innocence of scientists is still relevant today. In discussing the responsibility of scientists while conducting research, my physicist roommate made the distinction between science and technology: scientists create knowledge, and then others decide what to do with it. This hands-off approach is clearly problematic, as the quote from the paper suggests: “It was physicists who saw the explosive implications and it was physicists who pushed the project forward after briefly considering in 1939 the idea of keeping the work secret and perhaps achieving a moratorium at least on publication, if not on further experimentation” (Mendelsohn, 189). I have this debate often with my “hard science” friends. They usually mention the drive and joy of solving a problem as the catalyst for research. But how could one knowingly detach such results from the work that one does? Even as a social scientist, I find most war technology to be awe-inspiring, interesting, and all the rest. Yet milliseconds later the realization of what it is used for creeps in and ends those emotions. The impetus for technological innovation on military grounds has been incredible, yet why can’t we channel that energy into, say, reinvesting in renewables? These are some questions I hope to explore over the next few years.

On a different note, and one that I am sure to return to, the opening paragraph of the Carlson piece has the reader imagine a child’s response when asked to describe the image of the scientist. He then goes on to say that children are a mirror for cultural change. I have a nephew going on 4, and I am continually astonished at some of the reflections of society enacted through his play. I will keep my analysis to one example for the time being. Originally Tommy began to explore what I will call “name and find” games through books. (“Can you find the blue butterfly?”) More recently, he has been playing interactive games on PBSkids.org (and will adamantly demand to play them at any time, which could throw me into a long discussion about technology, instant gratification and a host of other issues, but I digress). One day over winter break, he said out loud, “Let’s find the toy-box,” pointed at the toy-box and exclaimed, “click!” My mom becomes irritated with me when I attempt to analyze societal issues through my nephew, but I couldn’t help but find this enactment to be a perfect example of the blurring between technological, or more specifically online, behaviors and real-life interactions. I suppose I wouldn’t label this a negative outcome of learning games online; it is simply indicative of the way in which technology changes our actions.

The language of war and the rise of the medical-industrial-complex

The narratives that political leaders and stakeholders use to tell a story can have many components. Socio-economic statistics, religious and nationalistic images all may contribute to the narrative soup, with varying degrees of persuasion. One of the most common and effective narrative tools has been the language of conflict and war. Images of war have been used by poets and presidents to bring tears of pride or tears of sadness to mass audiences. Although these days, Presidents usually use professional bards or speech writers to help tailor their message, the language of war remains. Warlike analogies have been used to rally soldiers and scientists alike to the defense of hearth and home. Scientists were key players in Allied dominance during the World Wars and their contributions did not go unnoticed.

With the rise of the National Institutes of Health (NIH) over the past several decades, we see similar language become endemic within the system. The NIH currently receives more basic research funding than any other agency, including the Department of Defense. It should be no surprise that the imagery of war is used to inspire and defend budgetary allocations within what might be called the medical-industrial complex.

Just as the role of scientists was fundamentally transformed by two World Wars, so was the role of medical professionals. While there is a distinction between medical practitioners and medical researchers, they share a common language when speaking of conquering disease or other public health concerns. The wars against cancer, obesity, and heart disease have raged just as fiercely as any other geo-political conflict. Instead of using rifles and body armor, they use statistics and a sense of public health urgency to fight their particular medical battles. The language of war helps get the conflict funded, but who benefits in the end?

As private funding for research and development has increased, one might ask where the societal safeguards have gone. Do the best practices that were developed with government oversight still exist? After the Second World War, the role of scientists shifted more toward applied research. Government involvement in the research funding process inherently carried an assumption of public stewardship. Scientists didn’t have to worry about the application of their research because someone in government already had that figured out for them.

Combine the language of war and medical statistics together, then put them on stage with market-driven frameworks and societal good becomes endangered. These powerful tools may be used outside the realm of good governance for purely profit-driven motives. With networks of unregulated private capital increasingly funding research, scientists may come to another defining moment.

Unlike the Sputnik moment recently resurrected in the State of the Union address, we have a more subtle, pivotal transformation taking place in the medical science sector. The language of war is still present in the battle against xyz disease, but the tactics are changing. Just as the military uses more technological solutions in warfare, so has the medical industry come to rely upon technical solutions to promote public health. The incentive to produce a profitable pill, medical machine or diagnostic device drives healthcare innovation just as much as actual warfare drives military defense research. So it’s no wonder that the warlike language of persuasion is reflected in both realms.

Innovation, but why?

Ancient peoples worshiped many gods, but modern civilization bows before a single principle: Innovation. As President Obama said in Tuesday's State of the Union address, “In America, innovation doesn't just change our lives. It is how we make our living.” He went on to use the word innovation ten more times, making it the major theme of his speech. Innovation is more than just a word; its influence can be seen in the ways that major institutions, such as business and the military, have re-organized themselves around a state of permanent innovation. In the following, I will examine two paths to this state, and its consequences for the scientific community and society at large.


Carlson traces the development of the corporate research and development lab. The first innovators were inventors, craftsmen who improved devices increment by increment. But as a systemic source of innovation, these small inventors typical of the Industrial Revolution were hobbled by a lack of capital and the limitations of human knowledge. While tinkering with existing devices and principles was within the reach of many ambitious craftsmen, truly novel principles and the means to bring advanced technologies to market were out of reach.


Carlson traces the dawn of institutional innovation to the telegraph. As Western Union spread across the country, competing with local firms, railroads, financiers, and anti-trust lawyers, it became apparent that the difference between profit and extinction lay in harnessing the latest in electrical technology, usually by buying patents off of private inventors. Thomas Edison parlayed his success as an inventor into an immense private workshop; however, it was General Electric and its chief scientist, Elihu Thomson, that created the modern model of corporate R&D in 1900. Frustrated by the amount of coordination between scattered factories required to build an experimental car, Thomson convinced the GE board to create a permanent lab conducting basic research.


At first, the purpose of the lab was purely defensive, to protect GE products from superior competitors. But as time passed, industrialists realized that new knowledge could be used offensively: to create new markets, to trade with competitors, and to improve public standing. Compared to the 'random genius' of inventors, management preferred scientific innovation because it seemed predictable and controllable. This basic pattern, with the added details of intra-industry collaboration and federal support of risky technologies, has continued into the 21st century, although in real terms large R&D labs have been responsible for surprisingly few breakthroughs; much of the most creative work has come from smaller companies, a model best demonstrated in biotech and computing, where small start-ups with one piece of very valuable IP are purchased and developed by larger conglomerates.


A second side of institutional innovation is the military, which supports up to half of the basic research conducted in America. War and technology have long been closely intertwined, as brilliantly explored by William McNeill in The Pursuit of Power. Perhaps the first noteworthy institutionalization of innovation was the British shipbuilding industry circa 1900, where an “Iron Triangle” of shipyards, admirals, and hawkish liberal politicians pushed steel to its limits with ever more powerful battleships. But it was not until WW1 that innovative warfare had its first chance to shine. Innovation was applied haphazardly, in the form of machine guns, poison gas, aircraft, tanks, submarines and anti-submarine warfare, but there was little coordination between scientists and soldiers. A new weapon would make an initial splash, then quickly add to the stalemate. The war was eventually decided by a German economic collapse.


Many of the scientific institutions of WW1 were dismantled in the interwar years, but WW2 was above all a war won by cutting-edge science. Radar, operations research, airpower, and of course the atomic bomb were all products of Allied scientific knowledge, while jet fighters and rockets rolled off of Nazi lines at the close of the war. Federally supported labs, and defense companies that sold solely to the government, proliferated, too many to name. With an obvious and immediate clash between the Allies and the Soviet Union at hand, neither side disarmed its scientific apparatus. Both sides sought to avoid a qualitative defeat, or worse, technological surprise, investing ever larger sums in military R&D, and leading to the domineering “military-industrial complex” of President Eisenhower's farewell address.


For scientists, these twin processes have been a mixed blessing. On the one hand, science has obtained a great deal of funding from industrial and military sources, orders of magnitude more than the pure 'pursuit of truth'. Yet, scientists have lost their autonomy, tied either to market forces or military imperatives. Biomedicine has improved healthcare, but also exponentially increased costs. The process of introducing a new drug is more akin to marketing than science or medicine. Through the military, “Science has known sin,” to paraphrase Oppenheimer's haunting phrase. Where for a period from about 1850 to 1945, the scientist could truly claim to represent a universal humanity, working towards the ends of destruction has permanently damaged scientific prestige and credibility. The values of science are subordinated towards petty, nationalist ends.


For society, the pursuit of innovation has led to the threat of man-made extinction through nuclear war. The process of action-reaction in the arms race brings us ever closer to the brink of annihilation. From the market side, the permanent churning of the basic constituents of society has created an immense dislocation. Skills and jobs can become obsolete in less than a decade. With new-found material wealth came a crass materialism. The objects around us change constantly, their principles of operation becoming ever more opaque. The deep sense of unease pervading American society might reasonably be traced to chronic future shock. Innovation is a god, but it has become Moloch, concerned solely with profit and military might.


So, to return to the State of the Union. I've read it several times, and I feel conflicted. It's a good speech, certainly, and I agree with many of the specific policies he outlines for continued investment in innovation, yet there is a certain hollowness to it, a failure to grapple with the crux of why we innovate. The main drive to innovate is material: the jobs of the 21st century should be located in America. Yet we don't know that innovation will bring back jobs; at best we know from the lessons of the past that a failure to innovate will mean the loss of more jobs. But the ultimate hollowness came at the end. President Obama made a deliberate callback to the space race with the phrase “Sputnik moment,” but President Kennedy knew where we were going: the moon, in ten years.


Obama's answer to Kennedy, “I'm not sure how we'll reach that better place beyond the horizon, but I know we'll get there. I know we will.”


That's certainly true. We'll definitely make it to the future the old-fashioned way, by living it, one day at a time. But that's no guarantee that the future will be any place we want to live. Right now, all we have is a notion that America must be wealthier than China. As individuals, as a nation, and as a species, we must decide what is beyond that horizon, and we must build the institutions of governance to take us there.

Thursday, January 27, 2011

Applied Science: Inevitable results?

Fitzgerald’s Mastering Nature and Yeoman explores the effect that applied science has had on agricultural practices in the developed, and increasingly the developing, world.  Attention is paid to technological and organizational advancements in crop breeding, animal rearing, and pest control, with particular consideration toward the increasing use of industrial-type processes in the food industries.  Rather than being just a passive receiver of techniques, agriculture has been the impetus for growth in a number of scientific disciplines.

Although seeming to avoid normative judgments, the author paints an image of an agricultural sector dominated by industrial organization and advanced technological processes that tends toward a feeling of "look what science hath wrought".  Fitzgerald mentions the lack of taste of Hanna's genetically-engineered tomatoes, the economic and health effects of animal feed additives, and the deleterious effects of pesticide use throughout the nineteenth and twentieth centuries in a way that indicates disdain for modern agricultural practices, making no attempt to separate "applied science" from these industrial techniques in her opening paragraph.  Although a subtle distinction is made between the science practiced in "land grant systems" and that practiced by "private companies", the author then declares those differences are "not as stark as one might imagine", further cementing the idea that all ways of applying science to agricultural practices manifest in much the same way.

I wouldn’t disagree with Fitzgerald concerning her identification of the negative outcomes that can result from the industrial agricultural methods that are currently in use, but I would disagree with her unstated (or at least not contradicted) assumption that science applied to agriculture inevitably manifests in the same types of practices.  Although a relatively emerging market and field at the time of the chapter’s writing, organic farming makes use of scientific practices which, advocates would argue, emphasize quality of food over quantity (a consistent goal of industrial agriculture).  Organic advocates point to the ill effects of industrial agriculture identified by Fitzgerald and assert that organic practices can alleviate many of those ills through judicious application of the scientific method.  Much university and private sector research has focused on both the benefits and hurdles of small- and large-scale organic farming, continuously using experimentation and other methods to enhance knowledge of crop growth, pest control, transportation, and other techniques that adhere to organic guidelines and notions of sustainable practices. 

The key factor lies in the general goal of the research; in this case of agriculture, should production be geared toward quantity and lower prices, as with current industrial practices, or quality and higher prices, as with current organic practices?  This isn't the only dichotomous configuration that agricultural research goals can take.  One can imagine a world in which the dominant question would be whether to develop green or purple crops, or sweet-smelling or sour-smelling milk, using the same scientific practices that Fitzgerald implies inevitably cause the ills outlined above.  In this respect, science is essentially goalless.  The only goal that applied science can be said to have is to control something, but the question of "controlling what?" is left up to the researcher.  Too often in the public's mind, and evidently in the author's, science is strongly connected with industrialized practices.  Science, however, is strictly a tool.  Like all other tools it can be used in a number of ways by a number of users, with the achieved result being dependent on the aims of the one using the tool.

From liberal humanism to technocratic fascism.

Dominique Pestre, in his chapter "Science, Political Power and the State," looks at the development of the political-military-industrial-academia complex in the United States and Europe. Or, put in a less cumbersome manner, Pestre is interested in examining the political mobilization of science in the 20th-century. Though, as he argues, science at many times sits in for political processes as an authority unto itself:
Since science is a discourse that claims not to depend on partisan decisions, it enables one to 'technicalize' public action, to 'de-politicize' it, to render it impersonal, to bypass the democratic rules of accountability. This mode of action leads to an instrumentalization of politics through the use of specialists, it gives to political decisions the force of necessity, and it comes to substitute competence and technical knowledge for the affirmation of will and of values deliberately chosen.
Scientific and technocratic decisions thus come drenched with the scent of neutrality, which covers over the inherently political processes operating just beneath the surface. In other words, good science fools us into thinking that we needn't deliberate about its implications; it gets to pass go and collect $200. As Pestre points out, this was especially true in the transition period after the war, when the sciences and social sciences that had been mobilized to win the war found themselves taking that ethos back to post-war laboratories. The Cold War, of course, presented a perfect opportunity to stress the continued importance of ongoing technological and scientific advancements to win the new war. And today this ethos is carried on in the ever-more-ubiquitous war on terror. As with most who talk about these "cold wars," Pestre underplays the severity of the "contained local conflicts" simply because they happen to be folded into the ambiguous boundaries of the greater Cold War.

Pestre emphasizes how the instrumentalization of science plays a central role in the transformation of what it means to do science. Thus scientific and technological gadgets, including devastating weapons, took center stage during the Second World War. And this emphasis carried on after the war, displacing the more abstract philosophical questions that science had been equally interested in tackling. The delineation is clearly present in the uproar that ensues around stem-cell research (i.e., science playing God) and the absolute lack of concern around the proliferation of information technology and personal computing devices. The implications of the latter for civilization are no less ethically profound, but they pass along into our hands with no concern whatsoever. We don't even think to politicize the production of the iPhone or iPad; that's just "good science" at work put towards a seemingly practical end. Unfortunately, the massive landfills of "out-of-date" computer devices in China tell a different story and challenge the goodness of the science. However, we typically ignore such issues because a distance has been placed between science and politics. Science does things and politics does ideas, or something like that. A wedge has been driven between the humanism of the Enlightenment project and the practice of science, where science, when it invents things like the atomic bomb, ends up embodying something like a technocratic fascism. In what other way could we describe something as monstrous as the creation and use of the atomic bomb?

The Creation of a Uniform Identity

Who are you? What are you? These sound like existential questions philosophers have been trying to address since the moment of self-actualization. Yet we are asked them whenever we apply for new jobs or programs. Every ten years, the Constitution mandates that we face these daunting metaphysical questions.

However, it is not the questions themselves that are intriguing. The established answers people must choose from, taken in conjunction with the questions asked, produce consequences (unintended or not).
  • Which age group are you in?  Under 19, 20-29, 30-39, 40-49, 50-59, 60+
  • What is your gender? Male or Female
  • What is your race? Caucasian, Hispanic, Black, Asian, Other
  • Which social class do you place yourself and your household in? Poor, middle class, or upper class
In his chapter in Science in the 20th Century, edited by John Krige and Dominique Pestre, Theodore M. Porter described how census and social data created and reinforced labels and identity. Along the same line, Michel Foucault discussed how analysis (like the census) is a form of political power with normalizing effects. People are forced to identify themselves with one of the provided groups (and to choose only one!). Such distinctions imply that there are relevant social, economic, and political differences between the groups. Then there is the ambiguous “other” category. When people do not identify with one of the provided labels, they are forced to identify themselves as ‘different’ and not worth a label. (A toy sketch of this forced classification follows below.)
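As an aside, the mechanism Porter describes can be made concrete with a trivial bit of code. The sketch below is purely illustrative; the category list and the helper name are my own hypothetical choices, not anything taken from Porter or from an actual census form: a fixed schema admits exactly one label per person, and anything that does not fit is collapsed into "Other".

```python
# A toy illustration of how a fixed survey schema forces a single label and
# lumps every non-matching self-description into a residual "Other" bin.
RACE_CATEGORIES = ["Caucasian", "Hispanic", "Black", "Asian"]

def census_label(self_description: str) -> str:
    """Return the one label the form allows a respondent to keep."""
    return self_description if self_description in RACE_CATEGORIES else "Other"

print(census_label("Asian"))               # -> "Asian"
print(census_label("Black and Hispanic"))  # -> "Other": a mixed identity vanishes into the residual bin
```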

This idea of identity and social labels being created and reinforced by science intrigued me. As an identical twin, I spent years in internal existential debates trying to figure out who I am as an individual, separate from another person who shares 99.99% of my DNA. (Yes, my dad bought a test to learn how identical we were, and the company that produced and marketed these tests now has my DNA frozen somewhere for some unknown future research.) Would this search have been quicker if I had just looked at government surveys and my genome? Take a DNA test to figure out who you are physiologically, then take a government survey to figure out who you are socially... done.

What propels us to do this? Taking a historical perspective, Daniel Kevles provided a timeline of heredity and genetics research in regard to eugenics and genetic manipulation. Gaining momentum in the 20th century, this ‘science’ (whether it was a proper science is a discussion outside the scope of this blog) was used to explain scientifically how those outside the promoted norm came to be. Science was then used to justify pushing the “outsiders” further away from the norm, if not eliminating the group altogether.

Thus, is it actual fear or just a lack of understanding that leads us to do this? How big a role does prejudice play? In our capitalistic world, where does money come into play? Trying to come up with a universal answer is just as fruitless as trying to procure a universal reason why people created and joined the Tea Party movement. Each individual has their own logic and reasoning, based on their own history and education.

Nonetheless, Kevles does describe how eugenic scientists tried to move away from perceived and actual prejudice, focusing instead on specific aspects of heredity and genes. The Human Genome Project ignited fears about the potential use of science to promote bigotry. Those fears never materialized.

Yet we must ask ourselves whether we are facing a new wave of potential abuses with the growth of DNA testing. If so, the masses have not noticed, or are choosing to agree or stay silent.  The Genetic Information Nondiscrimination Act (GINA), signed in May 2008 and in effect a year later, bans discrimination by health insurers and employers on the basis of DNA. I myself do not feel protected enough by this act. The act is built to be reactive: it punishes those who have already committed the offense. Proactive policy is needed, but in a legal and government system built to be reactive, this will require the masses to do what they were supposed to do after the Human Genome Project: educate themselves, unite their voices, and push for action.

Saturday, January 22, 2011

A few basic instructions

Someone asked how to make a post on here. Here are a few basic directions:

First, set up your account (Mike can help w/ that). Next, click the link that Mike sent us, which will take you to the class blog and add it to your functionality.

Here's what you do to view & post to that STTP (Science, Technology, Policy, Power) blog for our class:

From your blog Dashboard, click on the Science, Technology, Policy blog button that says "view blog".

That takes you here which you can bookmark directly in your browser or reader, if you prefer.

You can also choose to follow this or any other enabled blog by clicking the blue "Follow" button on the very left in the blue bar at the top of your screen. That's another cool feature which is explained here:

Ok, so you should now see the STPP blog page. From this point, look in the upper right hand corner of your screen, in the blue bar at top. Click on the "New Post" button, and you're off and running.

For those who are wondering, to comment on someone else's post, click on the blue text that follows their post which says "# comments." If nobody's commented yet, it'll read "0 comments".

If you copy and paste text from your blog to post somewhere else, be sure to re-create all the HTML links. I learned this the hard way. *blush*

Hope this helps without seeming too patronizingly simple. Again, thanks to Mike for setting this all up. He's the true wizard behind the green curtain.

-Jason

Friday, January 21, 2011

So You Want to be a Blog Super Star

Blogging is pretty easy, but there are still some tricks to it.

So, what you'll want to do is register for a google account (you should have been sent a link, check your spam folder). This lets you post, edit the blog, and all that fun stuff. The editor is fairly easy to use, with WYSIWYG and HTML editing modes.

Writing blog posts is interesting, because stylistically it sits somewhere between journalism, a diary, and for us, academic papers. It has to be current, personable, and smart. We need hyperlinks, and multimedia. Really, it's an art, and the best way to learn is to do. The second best way is to see how other blogs work. Here are some of my favorites.

Roger Pielke Jr is great at using his climate blog as an extension of his academic career.
Kyle Munkrittrick has a similar blog, with a more casual style.
Science Progress has great policy position type articles from a variety of scholars.
The Bubble Chamber is a philosophy of science blog run by the University of Toronto. Very scholarly, very solid.
Charles Stross is a science-fiction author, who writes an amazing blog on his travels, personal experience, and whatever he's thinking about. Make sure to check the comments.

Happy blogging, all!