
Androids - Mankind's Hamartia?

Started by Ryno, July 27, 2011, 01:37:05 PM


Ryno

I just got the August issue of National Geographic, and one article is about androids. There's been a lot of talk about the reality of robots performing everyday tasks for humans, like folding laundry, cooking, and even babysitting or tending to the elderly. What do you guys think about the concept of robots that are capable of reasoning, making decisions, and higher thinking similar to humans?

Personally, I'm both excited and terrified. There are military robots being made that could potentially decide for themselves who to shoot and when. That has to be one of the stupidest applications of human intelligence: give a robot free will and a gun. Don't we have enough problems with humans toting guns?

I think it would be cool to have a few machines with personalities to do some of our work, but we already have plenty of robots, controlled by humans, that do just about everything we need, for consumers, medicine, education, and military. Why should we give these robots the ability to perform tasks without human interference? That just spells danger.

First, dissociation. The more we leave to robots, the less connected we become with each other. There are already "lovebots" in the making that can perform romantic tasks with humans. Babysitting robots are being considered, which sure saves parents some money and removes the risk of kids being taught to smoke or drink, but it also leaves the kids in the care of a machine and diminishes social development. The same goes for the care of the elderly. These are people who need to be with other humans, who need to interact with others - not machines. With added personality traits and human-like features, people's emotional reactions mean they will probably befriend their machines, replacing human interaction with artificial interaction. We'll become even more reclusive than we already are with Xbox, iPods, and the internet.

It's a cool concept that makes great sci-fi shows, movies, and games, but I think in reality it's a horrible idea.

However, my opinion doesn't mean it's not going to happen. Androids in every home and school probably will happen, and probably within 10-15 years. So I guess I'll just have to hope for the best...

Sabriel Facrin

If a lot of people go with romance 'bots, it will certainly help quell a lot of excess human population when a lot of bloodlines suddenly run dry.  For that reason, I'm not...entirely against the idea. >>' <<' >>'
I think we have a long way to go before we create AI that would be effective, much less one compact enough to fit in a human head.  The modern concept of AI in practice is a glorified search engine. XD

That said...I think it's okay for humans to develop androids.  Androids can take on the big jobs that humans shouldn't be trusted with, and the dangerous jobs that humans shouldn't do.  Pseudocreatures with optimized bodies can cut away a lot of hard work.  The only issue is a sudden, severe rise in unemployment...

tekla

More people, fewer jobs, great idea.
FIGHT APATHY!, or don't...

Pinkfluff

Those military robots have nothing close to free will. Some degree of autonomy in UAVs (and ground and naval vehicles) helps them do their job better and actually be more discriminating in their targets. In all of the systems that I have seen, a UAV (or a group of them) that is not remotely piloted will be given a mid- or high-level directive (depending on the system's sophistication and level of autonomy) that it will execute. This could be anything from enforcing a no-fly zone in an indicated area and engaging anything without a friendly IFF to simply moving from one location to another. At any time they could also be given a lower-level command by a remote pilot. There are also plenty of safety systems to prevent everything from firing on the wrong target to dangerous malfunctions. The control systems are even designed to handle mechanical failure or battle damage (to an extent). Honestly, I trust a robot with a gun more than I do a human. Sure, they don't have compassion, but neither do they have hate, anger, fear, or stress. At least not emotional stress, since they don't have emotions.

I do also worry about robots putting humans out of jobs, but if we could evolve beyond a greed-based economy and social motivation for doing pretty much anything as well as solve overpopulation then it would cease to be a problem.