Did space exploration sow the seeds of its own demise?
by Bob Mahoney
In many instances, the primary effects on society of these advances were positive and anticipated. ([6]’s discussion of the London sewers offers an excellent illustration; [7]’s history of the NYC subway highlights the multifaceted challenge—and consequent impact on society—of implementing a government-directed, technically complex transportation system. Sound familiar?)
Unfortunately, some introductions of new technology, while they successfully accomplish their immediate objective, precipitate unintended—and sometimes extremely deleterious—consequences. (See [8] for a disconcerting but intensely thought-provoking perspective.) While we certainly don’t have giant ants colonizing the Los Angeles sewer system [9], it’s a bit hard to miss that the dark specter of nuclear weapons affected our society and culture in unexpected—and not always positive—ways.
And unexpected consequences need not be so earth shaking. Consider that innocuous pop-top beverage can opener. The original toss-away design introduced during the 1960s was a major improvement in convenience over the need to use triangular punch-blade (‘church key’) openers on the old steel cans (twice: don’t forget the air hole). Unfortunately, the convenience of removal from the can begat the convenience of disposal… on the sidewalks, on the lawns, and especially on the beaches. It didn’t take long for all those improperly tossed pop-tops to become their own mini-environmental disaster, contributing to litter everywhere while slicing untold numbers of bare feet (not to mention lips, tongues, and digestive tracts—some conscientious consumers dropped them into the can before drinking).
Granted, after a few iterations industry solved the pop-top problem to a reasonable degree of satisfaction, even if along the way the hardware upgrades precipitated unmistakable changes in human behavior. Specifically, each iteration altered the precise way we placed our lips to the can top and how far we had to tilt our heads to ensure adequate in-flow of displacing air while not catching our noses—and to recover those last few drops of beverage. Minor adjustments, yes, but we should not dismiss the lesson attached: a technical advance of even minor degree can (and often does) bring changes in our everyday behavior.
Now, about that oft-touted child of the US space (and missile) programs, microelectronics. I don’t feel it necessary to cite any references when I suggest that our society has been significantly transformed, from the heights of global commerce through the nature of warfare to the actions we execute hourly as individuals, by the pervasive infiltration of the microchip. Of course the benefits of this technology are legion, and few would choose to walk away from the enhancements and conveniences these wonders have introduced to our lives. But one need only consider the SUV wandering into one’s lane of traffic, driven by a cell-phone-gabbing soccer mom (to cite the cliché), to recognize that not all the impacts of this revolution in technology have been positive. And can any of us forget how that memory-guzzling e-staple of today’s boardrooms, lecture halls, and classrooms—PowerPoint—may have contributed to at least one space disaster? [10]
At least one author decades ago accurately anticipated many of the positive changes comprising this incredible transformation [11], even if he overestimated their rapidity. But he again was speaking of society, on both grand and local scales. Could our extensive, immersive e-universe—infested so thoroughly with cell phones, personal assistants, the Internet, Wi-Fi, videogames, laptop computers, and virtual reality—be modifying us? While some might characterize such a concern as knee-jerk Luddite alarmism, some researchers are already finding correlations between certain cognitive problems and the use of digital media [12, 13].
And while such assessments point toward the direct impact of e-tech use on mental and physiological processes, my worries address effects a bit more subtle, perhaps second- or third-tier. And I’m not worried so much about us middle-aged fogies who came of age alongside the first waves of ever-advancing electronics (except perhaps those cell-phone-wielding SUV drivers). No, my concern addresses the mental development of the Alices and Alecs of today and tomorrow who have spent or will spend nearly their every waking moment immersed in this marvelous, pervasive e-Wonderland.
Consider—those of you in your thirties, forties, and above—how you came to know and become attached to the wonders of the Microelectronics Age. You appreciated every advance, every step up in convenience, because you had grown up during a time when, at best, calculators weighed pounds (or at least measured an inch in their smallest dimension), if you used calculators at all. Back in high school, college, or the workplace, you personally experienced the transition to desktop PCs from the dark ages of battling mainframe computers via remote workstations or even punch card keyboards and readers (if not prior, when “regular folk” did no computing at all).
When you made your first cell phone call, typed your first e-mail message, or brought up your first web page, you were already an adult or nearly so. You essentially came of age—you certainly developed your basic intellectual skills—before microelectronics really infiltrated everyday life. (Mind you, we late-stage baby-boomers were spoiled; neither of my parents, growing up in a large city, even had a telephone in the house.) Whatever generation you’d care to assign yourself to, you did not come of age in the modern e-world but in years long past, before society had sailed over the horizon into the midst of the cyber sea.
Now consider children born within the past few years who have never known a time—not even a day—without near-constant interaction with sophisticated computer and communications technology. When riding in the family vehicle, their parents permit—even encourage—them to keep their eyes focused on a fold-down flat screen or on their individual hand-held videogames, their ears isolated by headphones. They watch 1000-channel widescreen HD TV at home, interact inside countless multi-layered DVD landscapes of entertainment and “learning”, and listen to—no matter where they happen to be—vast personalized music selections via earplugs attached to wafer-thin devices nearly small enough to swallow. They partake at their fingertips of a vast dynamic ocean of text, music, and video that stretches orders of magnitude beyond (in both content and quality—especially quality) the mere puddles of material offered by the libraries that our generations grew up with—libraries, mind you, that we had to actually travel to, sometimes in cars wherein we had nothing to do but (gasp!) watch the world pass by outside while listening to our parents’ music. And they play amazingly authentic (and sometimes disturbingly violent) videogames in a virtual reality creeping ever-closer to the Feelies of Huxley’s Brave New World.
Today’s kids are in near-constant communication (usually with cryptic text exchanges in place of the spoken word) via cell phones, social websites, and any number of other platforms ready at hand. Various server hosts and device limitations force upon these exchanges a hyper-abbreviation of the written word, creating a chopped-up micro-thought shorthand that is inevitably creeping into their schoolwork [14]. Not that they actually swap much substantial information in all this staccato-burst interaction. Most of what zips across the ubiquitous electromagnetic web doesn’t even rise to the caliber of what one would term twaddle. Perhaps twiddle better fits its typical vapidity.
And with all of it, from the videogames to the DVDs to the Internet to the cell phones, they expect near-instantaneous response to all queries, messages, and requests. Waiting for something—anything—has become the exception in their world, not the expected.
Remember the not-so-old joke about how to blow your kids’ minds? Don’t tell them you had to walk five miles to school; tell them you had to walk across the room to change the channel. The paradigm shift underpinning this humor has been writ large across the consciousness of an entire generation.
One can only tremble (either in awe or terror) at how far things may go as virtual reality and our electronic companions become ever more capable—and invasive. The kids of today (and especially tomorrow) will spend an unbelievably large proportion of their lives immersed in an electronically generated un-reality, communicating with their peers in abbreviated fragments of thought focused almost exclusively on their personal immediacies, all the while considering a response delay of more than five seconds—for obtaining or accomplishing anything—to be an unacceptable burden.
I can’t help but wonder if kids growing up inside such an instant-gratification, truncated-interaction, artificial-reality universe might have difficulty with—even a disdain for—any and all aspects of life that require longer-term and deeper investments of both effort and especially thought. Not that I’d expect most teenagers today to be keen on spending time contemplating Plato’s Republic—I suspect many in Plato’s day didn’t want to either—but just how challenged are the minds of kids who expect cash registers to always tell them their change due and who presume that their electronic companions will automatically—assuming they attempt to write in complete sentences—correct their spelling and grammar?
I fear that certain core human abilities, in particular some of our best intellectual capacities, may atrophy or fail to develop in this new instant-access puréed-thought virtual world that microelectronics has begotten. When these kids—and then their kids—reach adulthood and must face the greater challenges that the real world will inevitably throw at them—personal, community-wide, national, and global—will they even have the rudiments of the intellectual and emotional tools necessary for grappling with those challenges?
Those of us who did have to cross the living room to change the channel, or go to the library to research our term papers, or find a wall- or table-mounted telephone to let our parents know we were coming home late, we developed certain fundamental abilities and skills because some tasks—even down at the level of playground games and everyday chores—still required that we ourselves exercise modicums of sequential critical thought, perseverance, and creativity. We carried these basic intellectual and emotional skills and qualities—I hope—into adulthood, allowing us to take on what life tossed at us even as they helped to shape our personal outlooks and our wider worldviews. (And our parents before us, who faced even more challenging times with fewer conveniences at their disposal, even more so.) Without an entourage of electronic companions bursting with downloaded apps ready at hand to help us along or to entertain us, we had to mature in these skills and abilities because life, such that it was, demanded as much.
Folks may scoff and categorize me with those who had forecast dire societal plagues to follow such valuable innovations as type-set printing, railroads, and automobiles. And of course the over-the-top humor in the films Wall-E and Idiocracy comes to mind, with, respectively, cyber-coddled couch-potato humans aboard a space ark so oblivious they’re unaware they have swimming pools available, and a future populace so dumbed down they don’t know that crops need water to grow [15, 16].
But if people are so immersed even today in their private e-worlds that they’re walking into shopping mall fountains and off subway platforms onto tracks [17, 18], is it so far-fetched to suggest that some of their higher intellectual abilities might be at risk? And what of their children and grandchildren, whose e-world another generation or two hence will likely be even more immersive and more accommodating of their immediate needs and wants? Just how deep and contemplative will the hearts and minds of those folks be?
As I have described previously (see “Space for improvement: re-engaging the public with the greatest adventure of our time”, The Space Review, February 5, 2007), the primary rationale for space exploration is its value as entertainment, entertainment in the sense that it satisfies our intellectual curiosities, feeds our yearning for adventure, and appeals to some of our more noble emotions. As human activities go, it is one of the more elevated mental pursuits of actual physical objectives because it has little immediate practical value but nonetheless serves to fulfill our higher aspirations toward self-realization and improvement (including the acquisition of new knowledge), it offers the genuine satisfaction garnered from accomplishing unique (and uniquely difficult) tasks, and it generates camaraderie by taking on challenges that demand well-executed team effort (at both small and large scales).
All other space rationales are either derivative (e.g., spin-offs, national prestige, international goodwill) or are so long-term in both their substantial required investment and ultimate pay-off (e.g., entrepreneurial return-on-investment, asteroid deflection, power generation) that they too rely on satisfying the same intellectual and emotional yearnings that call us to the adventure and discovery of pure exploration.
In other words, space exploration’s foundational raison d’être resides solely in the highest tiers of our intellect and in the deepest wells of our most noble passions. It is a pursuit so difficult to defend on its own merit because its true justification is woven into the very fabric of what makes us human. Since time immemorial the human intellect, driven by a curiosity and fired by emotions that collectively defy explanation, has sought to expand its domain and discover—and master—everything within its ever-expanding scope. Why? The clichés are rampant but nevertheless true since they all reveal an underlying characteristic of the human mind: it wants to explore because it finds exploring fun. All the benefits that accrue from that inexorable yearning are merely gravy.
Which is why I wonder if the space program may have done itself in by ushering in the Age of Microelectronics. Will the intellects of our children and their children, shaped as they will be by their immersion in the instant-gratification, fractured micro-thought, unreality-based cyber universe, have any such yearnings to explore out there in the real solar system or beyond? Can we even predict the value system of those who will come of age in such an environment?
Why would someone who can go to Mars from inside the confines of their own home (virtually and near-instantaneously, the way that they will have experienced many of the other “travels” in their life) feel any calling to actually journey there for real (much less pay to send someone else), especially when such a physical journey would require years of investment in both time and resources—and won’t even offer any aliens to zap along the way? When a maturing person’s experience base is composed largely of cyber-vistas populated by artificial (but visually and physically enhanced) avatars executing activities that easily weave back and forth between the possible and the fantastic, will he or she derive any compelling merit or satisfaction from experiencing mere reality?
While one can find some children today, especially in primary school, who speak enthusiastically of wanting to go to Mars or beyond, I can’t help but question if they can distinguish between, on the one hand, the marvelous images and PowerPoint shows presented to them by their teachers or visiting NASA personnel and, on the other, the thousands of other fantastic images and scenarios they experience through their TVs, games, or the Internet. (Such a question is all the more ironic given that during the past half-decade those classroom presentations likely discussed a program—Constellation—that no longer exists. What, indeed, constitutes “real” for these kids?) Will their enthusiasm remain when they come to fully understand the actual years-long commitment and effort required—from both them and others—to achieve any such goal in the genuine physical solar system?
Some data may suggest no: a poll taken in 2006 among 18–25-year-olds showed that a majority of them considered NASA (and by proxy its specific space exploration efforts) irrelevant to their lives [19]. Granted, this may have much or everything to do with NASA’s pitiful public engagement efforts (as suggested by the reference’s author; also see again my “Space for improvement” essay). However, it is a curious coincidence that all those surveyed were born after IBM introduced its PC and became teenagers during the decade when cell phones and the Internet began their inexorable sweep across the landscape.
Perhaps the new generations will be satisfied with sending physical avatars—robotic probes—that will beam back sufficient data to make their “experience” of distant destinations nearly the same as being there while demanding less cost and commitment than that required to send people. Some have encouraged just such a paradigm shift in exploration strategy for decades.
But I fear that even that sort of exploration is at risk. If the minds and hearts that come of age in today’s and tomorrow’s cyber-world become stunted by instant-access conveniences and the absorbing e-distractions around them, will they ever hear the call of that same yearning that brought humanity through all its explorations in the past? Given the cozy environment to which their minds and hearts will have grown accustomed, why would they bother with sending anything at all…anywhere?
I am well aware that my supposition may be without basis. My fears are likely derived as much from trends I perceive around me (among both adults and children) as from my consideration of the historical antecedents and behavioral propositions I have presented here.
I must duly concede that those higher elements of the human intellect and the deeper wells of our noblest emotions that embrace exploration and compel us to pursue greater knowledge and expanding horizons may transcend these cyber-induced modifications to human society and behavior just as they weathered previous world-changing transformations. And perhaps the innovations wrought by the microchip will continue to leverage the opening of doors to vast new vistas of creativity, industry, and thought, just as printing and the steam engine did centuries ago. But we must remember that the world-shaking revolutions precipitated by those previous technical advances, while they bestowed immense improvements, also generated unanticipated negative fallout, the remnants of which we still wrestle with today.
Consider a more recent development. Somewhere along the road that carried us from the post-World War II explosion in consumerism through Sputnik and into the radicalism of the 1960s (each due partly to advances in various technologies), large numbers of US colleges and universities began dumping in-depth philosophy and logic courses from their core curricula. They did this purportedly to make room for more discipline-specific lessons aimed at keeping their students better attuned to the ever-advancing outside world, even as the all-too-common “We’re smarter today” attitude likely contributed. (See [20] for one pertinent commentary.) Many high schools followed, pruning their own curricula of content geared toward teaching kids how to think in favor of the facts-and-skills-based content that taught them what to think.
And now, decades later, an inordinate percentage of the population swallows wholesale a stream of bumper-sticker sound-bite pretzel-logic drivel—and far too often flagrant distortions of reality—as insightful political discourse and decisive scientific argument. And how about that crisis in STEM (Science, Technology, Engineering, Math) education? Does anyone think it isn’t about… thinking?
I believe it behooves us to be extremely careful regarding any possible threat to whatever lingering intellectual abilities we as a species still possess.
Humanity, having been inspired by curiosity and fired by emotions through ages past to explore and (imperfectly) master the world, now stands on the threshold of being able to stretch forth its intellect and its abilities to investigate and tame the entire solar system for its collective benefit. Accomplishing this phenomenal challenge and wisely reaping its bounteous rewards will require substantial sustained effort and brilliant innovative thinking over the span of decades, perhaps even centuries.
Will the children of the fractured-thought instant-response artificial-world microchip society that Apollo hath (partly) wrought be up to the task? Will they even feel any desire to contemplate taking on the task?
mite it alrdy b 2 L8 2 ask the ?
[1] Ceruzzi, Paul E., Beyond the Limits: Flight Enters the Computer Age, The MIT Press, 1989.
[2] Hall, Eldon C., Journey to the Moon: The History of the Apollo Guidance Computer, American Institute of Aeronautics & Astronautics, 1996.
[3] Petroski, Henry, The Evolution of Useful Things, Vintage Books, 1992.
[4] Castleden, Rodney, Inventions that Changed the World, Chartwell Books, 2007.
[5] Smith, Merritt Roe, & Marx, Leo (editors), Does Technology Drive History? The Dilemma of Technological Determinism, The MIT Press, 1995.
[6] Cadbury, Deborah, Seven Wonders of the Industrial World, Harper Perennial, 2003.
[7] Hood, Clifton, 722 Miles: The Building of the Subways and How They Transformed New York, The Johns Hopkins University Press, 1993.
[8] Burke, James, & Ornstein, Robert, The Axemaker’s Gift: A Double-Edged History of Human Culture, Grosset/Putnam, 1995.
[9] Them!, Warner Brothers Pictures, 1954.
[10] Marcus, Ruth, “PowerPoint: Killer App?”, The Washington Post, August 30, 2005.
[11] Evans, Christopher, The Micro Millennium, The Viking Press, 1979.
[12] Richtel, Matt, “Digital Devices Deprive Brain of Needed Downtime”, The New York Times, August 24, 2010.
[13] DeGaetano, Gloria, Visual Media and Young Children’s Attention Spans, 1996.
[14] Fieldman, Chuck, “Teachers, Students See Texting Lingo Popping Up in School Writing”, The Chicago Sun-Times, March 31, 2011 (updated May 2, 2011).
[15] Wall-E, Pixar Animation Studios & Walt Disney Pictures, 2008.
[16] Idiocracy, Twentieth Century Fox Film Corporation & Ternion Pictures, 2006.
[17] CBS News, “Texting While Walking Woman Walks into Fountain”, January 20, 2011.
[18] KTLA News, “Distracted Boy Falls onto Subway Tracks While Gaming”, February 3, 2011.
[19] Dittmar, Mary Lynne, “Engaging the 18–25 Generation: Educational Outreach, Interactive Technologies, and Space”, Dittmar Associates, 2006.
[20] Kreeft, Peter, “Why Study Philosophy and Theology”, from The Newman Guide to Choosing a Catholic College, The Cardinal Newman Society, 2009.