
Monday, March 16, 2015

Are we destined to fall in love with our robots?

If we could look into the near future, would we see some people who might otherwise be single instead living in a love relationship with a human-created non-human, basically in love with a personal robot?
[Image: from a Svedka Vodka ad]
We see elements nowadays that, if put together, indicate one thing: that one day in the not too distant future we will be seeing people who are in love with their personal robots. It doesn't even need to be a fully anthropomorphized bot, but simply a digitized personal assistant on one's phone, as exemplified in the film "Her".

As many reading this may know, a few years ago I published a short sci-fi story of my own about just that issue. By the way, feel free to download and read a free copy. It's titled "Simon's Beautiful Thought" (if you would prefer to pay for a 99-cent copy, it's also available on Amazon, and thanks for the support in downloading either copy). There's also a video trailer for it if you like.

My story came out a couple of years before Spike Jonze and Joaquin Phoenix's film "Her", though I should say that the film had been in the works for over ten years. I had never heard about the "Her" project until I saw the film's promos; I just felt from the Zeitgeist that it was time for this story to be written. Written again, most likely, as I'm sure someone had already written one like it sometime before; but it's all about the timing.

I've done well on that score, if I do say so, predicting, writing and releasing stories prior to more famous versions coming out as novels or films. Another example is my short story first published in the online hard sci-fi magazine PerihelionSF.com, titled "Expedition of the Arcturus" (also now in ebook and audiobook formats, and yes, here's the video trailer). It wasn't long afterward that the film "Interstellar" started to be talked about, went into production, and was eventually promoted and finally released.

In my story, entire families are sent to a planet aboard Earth's first generational spaceship in order to save humankind, whereas in "Interstellar" a team is sent to explore out of a similar need, though they have more time than Earth does in my story. Two very different stories based on the same need, offering questions if not answers to specific issues.

We need to examine these things ahead of time, to think about them before it's too late to do anything about them. That's where science fiction, speculative fiction and Futurism come into play. And speaking of that, allow me to mention an article titled "20 Crucial Terms Every 21st Century Futurist Should Know". At the least it's good to be aware of these things; at most, it's a fascinating field to get involved in.

One of the things we should be aware of and explore is the proposed Laws of Robotics: a set of very useful protections against some very bad things that could happen. It doesn't take a lot of foresight to see that robots or AIs and humans will have very close relationships.

Relationships that may very well transition into situations where there is little difference in experience between "normal" interhuman relationships and non-human or transhuman ones. So we really do need to consider the various elements of the topic: basically, how robots will relate to us, how we will relate to robots, and how we will relate to and treat one another. The treatment of robots is another topic, one explored early on in Asimov's "I, Robot" stories.

I see these relationships as going one of two ways. Mostly, people will get along with their "tool" and that will be the end of it. But for some, the love interest will grow and seem rewarding and useful, or it will become dark and destructive. I think the orientation in my story was a bit more realistic considering what these assistants will be designed for, while the film "Her" took the more popular / entertaining (salacious?) if not more sensationalist view.

Science fiction and speculative fiction should tell a good story, but they should also point out realistic possibilities to give us a good view into the future so that we can be more prepared for what is to come. Much of science fiction has turned dark to show us the negative effects of technology on humanity, but it's not realistically all going to be dark. Dark is just more fun and offers more of a roller coaster ride experience in entertainment.

Some examples of this are the following. The excellent 1979 film "Alien" was in a way a reaction to the more positive 1977 film "Close Encounters of the Third Kind". "Alien" scared the hell out of many people. The reaction to that film was another Steven Spielberg film, 1982's "E.T. the Extra-Terrestrial", which had a far more positive effect. There were also, of course, Star Trek and Star Wars, but those dealt more with overall societal issues, such as war and galactic federations.

We have heard much of late from famous and great thinkers about Artificial Intelligence and the dangers it poses if we do not pay attention now, first and foremost, to some very necessary things. To follow on that progression, by the way, we have "The Terminator" film franchise, which started in 1984, an iconic year (considering the George Orwell book of the same name), and was itself, for that matter, a reaction against "E.T."

The Terminator and its Skynet were prime examples of the worst that could come from AI if it is allowed to run unchecked. 1970's "Colossus: The Forbin Project" was a much earlier example of this, with its echoes back to Kubrick's 1968 film "2001: A Space Odyssey", based on Arthur C. Clarke's short story "The Sentinel", with its demented and destructive HAL 9000 computer AI.
SPOILER: it wasn't HAL's fault but that of politicians in Washington, DC, as uncovered in the 1984 film "2010", based on Clarke's 1982 book "2010: Odyssey Two".

Some of those who have raised the question of control and protections for humanity regarding digital autonomous beings are none other than Stephen Hawking, Bill Gates and Elon Musk, all very concerned about the power that AIs might end up with and about the very real need for us to establish controls before we give them too much power over us or too much autonomy.

Controls as in the "Three Laws of Robotics", which Isaac Asimov first came up with, believe it or not, in his 1942 short story "Runaround". Laws he published in science fiction, which have since been updated and will continue to be improved; at the very least, he set the standard and the basis for recognizing that we would indeed need such laws built into robots and AIs.

By the way, while we're talking about all this: 
"The first use of the word Robot was in Karel Čapek's play R.U.R. (Rossum's Universal Robots) (written in 1920)". Writer Karel Čapek was born in Czechoslovakia (Czech Republic) and I read his play many years ago.

Isaac Asimov's "Three Laws of Robotics":
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
He later introduced a fourth or zeroth law that outranked the others:
  0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
This is further explored in the article "Do we need Asimov's Laws?" The readers' comments below that article are also interesting, and one offers a counterargument to the usefulness of the laws in "Why Asimov's Three Laws of Robotics Can't Protect Us."
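Just to make that priority ordering concrete, here is a minimal, purely illustrative sketch in Python of the Zeroth-through-Third Law hierarchy treated as a simple action filter. Every name in it is hypothetical, made up for this post; real robot-safety systems are nothing like this simple.

```python
# Purely illustrative: Asimov's laws as a priority-ordered action filter.
# All names here are hypothetical, invented for this blog post.

from dataclasses import dataclass

@dataclass
class Action:
    description: str
    harms_humanity: bool = False      # Zeroth Law concern
    harms_a_human: bool = False       # First Law concern
    ordered_by_human: bool = False    # Second Law concern
    endangers_robot: bool = False     # Third Law concern

def permitted(action: Action) -> bool:
    """Return True only if the action survives the law hierarchy, checked in priority order."""
    if action.harms_humanity:         # Zeroth Law outranks everything else
        return False
    if action.harms_a_human:          # First Law
        return False
    if not action.ordered_by_human and action.endangers_robot:
        # Third Law: self-preservation applies only when no higher law or human order overrides it
        return False
    return True

if __name__ == "__main__":
    print(permitted(Action("fetch coffee", ordered_by_human=True)))                  # True
    print(permitted(Action("obey an order to harm someone",
                           harms_a_human=True, ordered_by_human=True)))              # False
```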

Well that's all very interesting and these are things we need to consider, but let's get back to the point of this article....

As I saw it in the film "Her", the AI didn't follow the laws to a logical conclusion (at least for me), whereas in my story, "Simon's Beautiful Thought", the laws were followed. Is my story better? That's for readers to decide, and my tale was merely a short story, but in my view they are in some ways equal. Both give readers or viewers a good projection into two very different possible futures.

I read a short sci-fi story when I was younger about a guy on another planet who was a top robot field handler. He could handle up to the considered limit of nine, which very few could do. If I remember correctly, these were like clones with attached appliances, so that someone could wear an appliance himself and thereby control the cybernauts in the field, making them do manual labor too much like drudgery for humans to bother with.

It being that kind of wild-west mining planet, the towns had bars and brothels to service the labor managers and handlers. Our hero falls in love with a prostitute and decides to run off with her. But at the end he discovers a control appliance under the bed where he meets his "lover", much like the one he uses to control his clones, and he realizes in horror that this whole time he had basically been in love with himself. He was the one guiding the clone "prostitute" to pleasure him as he needed and desired.

So not only was it a fake "hooker/john" relationship, it had no bearing on the realm of "relationships" whatsoever, except to himself. Is that what we really want? It may even sound desirable in some ways to some people, but unless you have a personality problem, I don't think that is actually the case.


For some more background on the topic there is a Huffington Post article on the robot paramour paradigm:

Also for a quick survey of sci fi and our obsession with sex and robotics:

While many of us will not "fall in love" with our digitized assistants, there will most definitely be those who will fall fast in love with their non-human entity or entities (as some will have multiples). The near future is going to be an odd place indeed.

Consider that people now fall in love with things like toasters and electrical appliances residing on the street, and that some are now falling in love with their RealDolls (life-sized sex dolls, appropriately weighted and articulated) with no animation to them at all. They cost around $7,000 and some people have more than one. These sorts of objectophiles will most certainly become bonded to their purchased non-human "toys", whether inanimate, animated, interactive or, eventually, all of these.

Consider this, as Mashable reports:

"With the press of a button, Barbie’s embedded microphone turns on and records the voice of the child playing with her. The recordings are then uploaded to a cloud server, where voice detection technology helps the doll make sense of the data. The result? An inquisitive Barbie who remembers your dog’s name and brings up your favorite hobbies in your next chitchat."

Is it only a matter of time, then, before those who now oppose gay marriage come to welcome the old days when that was their only concern? Will they have to contend with those who want to marry their dolls, their AIs, or a fully manufactured cybernaut device?

My concern with all this isn't that people have a diverse array of things they can explore. More power to them in that; mostly. We have after all only scratched the surface of what it means to be human.

Humankind is a mere adolescent in its development and there is much, much more to come, not the least of which is our own physical and mental development. Once we have AIs freely available, they will alter and further develop us in ways we can now only imagine, just by their mere existence and our interacting with them on a regular and long-term basis.

But we have to imagine it all now, before it's too late.

My concern in all this isn't so much that people might fall in love with autonomous individuals, even if digitally created. We do that now with naturally organically created individuals in relationships.

My concern is with how people might fall in love with themselves, at least in the beginning, before robots become sophisticated enough to have their own mental / personality kernels, with their own Id/Ego/Superego, if you will, to use an old Freudian paradigm.

At that point, it will be no different than a real human relationship.

However, before that point, when AIs are, as they are now, mostly a reflection of who we are, we will have a period in which we are basically falling for ourselves, like falling in love with our own reflection in a mirror. During that period in robotics development, what might that do to us? Will it alter and develop us in a new direction, into beings narcissistically involved only with ourselves? And if that were to become widespread, where would it leave humanity as a whole?

I see that as a far greater threat to humanity and our overall development than an AI taking over the world, as Skynet does in the Terminator film franchise. That might even be a better ending for humanity than for us to turn into the narcissistic mental midgets we are seeing all around us in the media nowadays. "Selfies" are rampant, and entertainment news media is all about those who it is "all about".

We have, in point of fact, become enamored with ourselves. For more on this see: "Psychogenic Photopenia - A New Disorder?"

If given the ability to develop further in that direction, how many may choose perceived perfection in a relationship over the more problematic but richer ones we have with our fallible fellow human beings?

There are many and varied, new and novel things in our future and I look forward to them all: to exploring them, considering them and becoming involved in many of them. Still, as a species we do need to look before we leap, and consider well the new paths before us.

And now, this....
Rise of artificial intelligence is changing attitudes on robot romance

And, this....

The Washington Post | Love in the time of bots

April 9, 2015
Source: The Washington Post — March 17, 2015 | Dominic Basulto
Artificial intelligence thought leader Ray Kurzweil has suggested that a real-life human and AI romance might be possible in as little as 15 years.
In his review of the 2013 Spike Jonze film Her, Kurzweil said he expected similar types of advances by the year 2029: “Samantha herself I would place at 2029, when the leap to human level AI would be reasonably believable.”
Kurzweil says your romantic partner might not need a physical body, as long as there’s a “virtual visual presence.” Kurzweil sees this happening via a virtual reality experience.
“With emerging eye mounted displays that project images onto the wearer’s retinas and also look out at the world,” he said, “we will soon be able to do exactly that. When we send nanobots into the brain — a circa 2030s scenario by my timeline — we will be able to do this with all senses, and intercept other people’s emotional responses.” [...]

Monday, October 6, 2014

Isn't it all about the journey and not the destination?

Isn't it all about the journey and not simply the destination?

If nothing else it just makes sense, because we spend so much more time getting somewhere than being somewhere. If you see what I mean. And if that's so, then why, WHY, do I work so hard to achieve a destination, just to...get there, to be there?

Why do I work so hard on things I don't want to work on, simply in order to achieve a destination?

I do things that I find difficult, things that force me to learn new things. Though I do love to learn, I find myself learning about things I really couldn't care less about. Like, marketing, since I'm a writer, when I really just want to write. But then understanding marketing, understanding your audience can actually enhance sales. Does it really enhance the quality of your writing, though?

Maybe. Maybe, not.

I don't really know why, why I do that.

I mean if it's all about the journey, if Life is about the moments and not the end point, the achievements, then why do it?

What am I doing, why am I doing this to myself?

Perhaps it's just that I'm looking at it all wrong? Perhaps, I'm simply incorrect?

So, what IS correct then? What SHOULD I be doing? Where AM I going and WHY am I going there?

I suspect it is because, if life is really all about the journey, then that journey will select your destination for you anyway. If you select the right journey, then you are selecting the destination you want to begin with. You just need desperately to want it; you need to be passionate about it.

Otherwise, why bother? Right?

I suspect that when you feel like you ARE trying to live the journey and yet you feel you are only working on the destination, then IF you TRULY are working on the journey, what may be happening, what hopefully IS happening, is that you are "changing cars", "changing modes of transportation".

Look. If you go on a journey for a vacation, the entire thing is really about the journey. Even if you only begin that journey once you get to your destination, you are still on the journey. You do, however, have to realize that the journey of getting to the destination, where the journey is to begin, really is also a journey.

If you don't realize that, or see that this is indeed the case, then you are cutting half (or at the very least, a third) of the overall experience completely out of your available bank of possibilities.

Experience happens. It just, happens. You can't stop and start it, it is always just, occurring, all around you.

So why not enjoy it? Why not utilize that journey-that-isn't-the-journey before the destination, when the journey is actually only then supposed to begin?

Let's say you agree and decide to do that and yet, you realize that you are still dealing with the journey as if only the destination mattered.

Why is that? I mean, why do that?

I submit to you that this is the changing of cars, of modes of transportation.

When you go on that vacation, say you take a boat to Italy, you get your things packed at home. Then, let's say you take a taxi to the boat. You then load onto the boat, across a bridge, up a gangplank, whatever; you board the boat.

Then you ride the boat for days to your destination. You eventually depart the boat and head to your hotel. You unpack.

Finally, you are beginning your journey at your destination.

In taking that initial taxi to the boat, you are wondering to yourself: did I forget anything, do I have my tickets, wallet, credit cards, whatever? Because you don't want to get on that boat and have it drop its connections with the land, only to realize then that you are going to be lost without that thing you needed or wanted to bring along.

On that taxi ride I might ignore the driver. I might ignore the garbage truck we pass on the street; I might ignore the truckload of incredible oddities that drives by us. The beautiful park with the strange little parade will go unnoticed. So will the child who looks up at an adult on the sidewalk, one who is being a complete ass, and gives a childlike look that says it all about what an ass that adult is being. A look I might click on to watch in a video on YouTube, or would even pay to see, as it's just so damn funny. And I could have seen it in real life...for real...really.

The world is filled with these marvelous and amazing moments of life lessons and entertainment, with educational moments, with creative, cathartic and artistic moments.

All these things you can miss, if you don't remember that when the journey to the destination begins, your destination is already there on the way to the journey's destination's journey.

So in the end and after all, it really is all about the journey. Right?


Now, to completely change the subject....
There is a signed horror book giveaway on the Horror Writers Association's Selfies site. There will be a lot of authors offering a free signed book.

Like Peter Straub's "A Dark Matter", for one.

And my own, "Death of Heaven", for another.

Best of luck! Cheers!