What Are You Doing Next Saturday?

By | Jesan Sorrells Blog, Old Posts

6th Annual Conference on Applied Ethics:
Technology and Ethics
April 4-5, 2014, at SUNY Broome Community College
  • What are the ethics of data mining, genetic screening and hydrofracking?
  • What is the significance and future of neuroethics?
  • Can there be ethical guidelines for the production and use of chimeras?
  • Is there a right to technological connectivity?
The keynote speaker for this year’s conference will be Dr. David Sloan Wilson, Distinguished Professor in the Departments of Biology and Anthropology at Binghamton University. He is a prolific author and a frequent speaker at conferences around the world. His address, on Friday, April 4th at 7 p.m., will be “Ethics, Technology and Evolution.”
 

All That Happens Must Be Known

By | Jesan Sorrells Blog, Old Posts

Given the revelations of internet data surveillance, what concerns should be raised about the possibility of brain monitoring devices?

All this week on the HSCT Communication Blog, we are answering questions put forth by the folks running the upcoming SUNY Broome 6th Annual Interdisciplinary Conference, being held on April 4th and 5th at SUNY Broome Community College.
This week’s question was posed by the plot of Dave Eggers’ most recent novel, The Circle, and was not definitively answered by the end of the book.
Well, we here at HSCT have three primary concerns about brain monitoring devices. And the NSA didn’t make the top three.
  • The first is around marketing and the idea of “opting-out” rather than a mandatory “opt-in.”
The most annoying moment on the internet or social media is waiting for the commercial at the front of a YouTube video to load, with the countdown going before the viewer can “skip this ad.”

As customers (you and I) have gained more control over blocking attempts to sell to us, marketers and advertisers have had to come up with more clever (and more blunt) ways to command our valuable time and attention, with confusing and frustrating results for all parties involved.

Now imagine if marketers had access to the most intimate space on the planet: Your private brain space. There would be no “option to opt-out,” even though all the legalese would say that there would be.

Which gets us to point number 2…

  • The second concern that we have is that, increasingly, the desire not to participate in social communication is seen as a sign of social ineptitude at best and as something dangerous at worst.
Case in point: Whenever a school shooting happens, the first thing that the media does is breathlessly report whether or not the perpetrator possessed a social media account.
If he (and it’s usually a ‘he’) does, then there is breathless data mining that goes on in a search for pathology, motive, and aberration.

In other words, the nature of the aberrant act itself is no longer enough to create outrage; the lack of social participation becomes the primary driver of the outraged response. This leads to concern number 3…

  • The third concern is that we have long sought—as individuals, societies, and cultures—to control people under the guise of freeing them from Plato’s Cave.

Brain monitoring devices won’t be used to give us freedom, collaboration, and connection. Instead, they will be used to take away freedom, encourage and inflame false fracturing and individualization, and destroy connections between people.

In other words, the criminalization of thought will happen along the same powerful continuum, running from social sanction to outright illegality, that has banned smoking from restaurants, banned trans fats from New York City restaurants, and gotten the White House cook to quit.
The inevitability of technological progress demands that we think about the ramifications of power and control, not only from government and corporations but also by and from each other.
So, HSCT’s conflict engagement consultant, Jesan Sorrells, will be presenting on the issue of online reputation maintenance in a world where virtue and ethics are not often addressed.
Register for this FREE event at http://www.sunybroome.edu/web/ethics and stay for the day.
We would love to see you there!

Wisdom in the Machine

By | Jesan Sorrells Blog, Leadership Philosophy, Leadership Theology, Old Posts

When the astronaut Dave powers down the rebellious HAL 9000 computer in 2001: A Space Odyssey, and again, more recently, in the 2013 film Her, starring Joaquin Phoenix, we determine through pop culture what machine “death” looks, and feels, like.

The concept of murder follows from the concept of life, and from the ideas and philosophies that we hold, as individual humans and as collective societies, about what traits constitute life.

In the case of a machine, I take the position that a machine cannot overcome the limitations of its creator.

Life is defined not only by self-sustaining processes (we were asked, while writing this post, whether it would be murder to power down a machine created by another machine) but also by the wisdom that is attained through life experience.

The crux of wisdom lies at the intersection of common sense, insight, and understanding.

HAL 9000 may have had one, or even two, of those things, such as insight and understanding, but “he” (see how we anthropomorphized an inanimate object there?) completely lacked the third trait: common sense.

Just like Skynet in Terminator or the machines and computer programming networks of The Matrix, HAL 9000 was unable to negotiate in good faith with his creator.

“He” made an “all or nothing” decision about Dave’s presence, Dave’s mission, and Dave’s motives and then took extreme action.

The same way that the machines did in The Matrix and Terminator.

The ability to negotiate with others in good faith, and to honor those agreements, is a human trait based on knowledge, experience, common sense, and insight, not just a happy byproduct of a conscious mind.

And until machines have the ability to negotiate not only with their environments, in the rudest sense of the term, but also with their creators, we should feel free to power them up, or down, at will.

After all, our Creator does the same thing.

Right?