Science as an Unavoidably Double-Edged Sword
Many people mistrust science, even fear it, and this seems to be a natural but unfortunate part of human psychology.
Those with a mindset that favors certainty, utility, and a dislike of change often speak of science as dangerous and untrustworthy, imposing a false dichotomy: either science must be totally safe and immediately useful in a practical sense, or it must be rejected outright as impractical at best, even as irredeemably evil and dangerous, especially when it offends them emotionally, politically, religiously, or ideologically.
This is ridiculous, since no human activity has ever shown itself to be without consequences.
Religion and politics are two good examples: areas of human interaction historically well known to spawn dangerous extremist movements.
But like religion and politics, science has more than one side; it too is double-edged. It is our best present means of discovering and testing knowledge, something no other human endeavor does quite as well, because only science has built-in ways of finding out when it is wrong, of correcting its mistakes, and of reaching the closest approximation of the truth the tools at hand allow. Science can be both a good servant and a poor master; it is up to us to decide which.
We can use and we can misuse the knowledge it gives us.
We cannot have the good of science without the potential for the bad. The limits of logic, of the human mind, and of the laws of the universe constrain our ability to foresee the countless, sometimes initially inconceivable, consequences of any human activity, including our use of the process and the products of science.
We cannot know everything.
We cannot expect to foresee all possible dangers of science and its resultant technologies, to know and immediately account for every harm they might cause. Nor should we argue that if this cannot be done, then all research, all intellectual curiosity, is simply not worth it because we cannot absolutely guarantee its complete safety.
I’m reminded of the current state of nuclear physics and some of its present applications, both positive and negative. We use it to power our cities, to successfully treat deadly cancers, and to power many of our space probes, like the old Voyager and Pioneer craft, and at the same time we hold ourselves and the rest of the world hostage to the problems of nuclear waste disposal, reactor accidents, and the threat of possible nuclear conflict.
Consider DDT and chlorofluorocarbons: both were brilliant developments, beneficial and useful in their own time, and both carried long-term dangers that were later discovered to outweigh their benefits.
But it was not politics, nor economics, nor religion which warned us of these dangers. Yes, it was science which created these substances, but it was also science which alerted us to the hazards that these substances posed.
The point is that we cannot have perfectly risk-free science or perfectly risk-free technology. We cannot have science as a cornucopia of blessings without chancing the bad with the good, without minding ourselves and using it carefully and wisely, or else paying the cost of our folly.
To expect science to be perfect and infallible, and to demand that it entail no dangers whatsoever when no human enterprise is without risk, is to set ourselves up for major disappointment when we find that we must take the responsibility to think before using our gifts; and it is fallacious to decry science as fatally flawed because we must do so.
We can only solve the problems caused by the misuse of science and technology with better science, not by embracing ignorance and returning to the caves of our Paleolithic ancestors. The efreet has already been released from the bottle, and it will not simply go back because we tell it to.