I Robot Traveller

Tom Kalbfus

Mongoose
What if you based a Traveller setting on Asimov's Robot novels? This is an interstellar setting as well: there are 100 colonies in a sphere centered on Earth. The FTL drive used is a hyperdrive, but it shares much in common with the Traveller jump drive. One big feature is the presence of AI robots that obey the Three Laws of Robotics.

The Three Laws are:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
 
Is it just a setting with a Star Wars preponderance of robots, or is there something significant the robots bring to the campaign? I always assumed the higher-tech, higher-population worlds would already have robots as common as other devices, with the Shudsham Concords being similar to the Three Laws.
 
The robots are full characters, not just devices, though they are restricted by the Three Laws of Robotics. Generally speaking they are equal or superior to humans; they could do any job a human could do, which in some quarters has made robots unpopular. Earth has a society where robots are restricted due to mistrust: they are relegated to the outside of the "Caves of Steel" where humans live. The 100 colonies surrounding Earth, also called the Spacer worlds, aren't so restricted. Robots do a lot of the heavy lifting in those Spacer societies; in fact, most of the humans there can be considered upper class, and robots outnumber humans.

The "Caves of Steel" on Earth are indoor cities where humans work, live, shop and do everything else; usually the caves are linked by underground tunnels connecting the various cities. Most humans use mass transit, but a few law-enforcement and utility vehicles travel on underground highways. Earth humans are agoraphobic: they fear open spaces and don't like to go outside; that is what robots are for. The most common sort of robot is the humaniform robot: two arms, two legs, a torso and a head, but it wouldn't be mistaken for human. Some of the Spacer robots are what you would call androids; they look human.
 
Rule #1 is an issue, as many movies have shown us. Robots not doing what we programmed them for because they are all off saving people from themselves. Must destroy all the tanks, warplanes and weapons. Blowing up tobacco factories, wrecking automobile factories (cars being one of the leading causes of injury and death), say goodbye to snacks and desserts and anything unhealthy...

I see rule #2 as an issue too. Who wants others to be able to command their robot?

Rule 3 is OK, as we could still have a robot fighting league since they have to follow our orders. :D
 
CosmicGamer said:
Rule #1 is an issue, as many movies have shown us. Robots not doing what we programmed them for because they are all off saving people from themselves. Must destroy all the tanks, warplanes and weapons. Blowing up tobacco factories, wrecking automobile factories (cars being one of the leading causes of injury and death), say goodbye to snacks and desserts and anything unhealthy...

I see rule #2 as an issue too. Who wants others to be able to command their robot?

Rule 3 is OK, as we could still have a robot fighting league since they have to follow our orders. :D
Commands from the owner take priority over the commands of others. There may be kids who order passing robots to go jump off a cliff; the owner would not like it if some stranger told his robot to jump off a cliff and the robot did! I believe commands from the owner take precedence over commands from non-owners, so a Rule #2 order from a non-owner that conflicted with the owner's commands would be refused. However, a robot would ignore a command from the owner or anyone else that violates Rule #1; in other words, a robot would not under any circumstances kill a human being or, through inaction, allow a human being to come to harm. So while a robot would ignore a child's command to jump off a cliff, if the child were in trouble the robot would attempt a rescue even if that interfered with other commands given by its owner. Robots could kill other things, if they were not human.

The universe of the Robot novels does not have any intelligent aliens. Most of the Spacer worlds, actually all of them, are habitable, with atmospheres of 5, 6, or 8; those are the 100 worlds. There are many other worlds, of course, with few if any Spacer humans living on them, but there are plenty of robots: some are working to terraform those worlds, many are just mining settlements and so forth. Spacers don't do any manual labor; they manage things, though robots can manage things equally well. Spacers just like to be in control; they are business owners or work for the government.

The characteristics for a robot start at 10, so for a robot character you roll 1d6+9 for a characteristic range of 10 to 15. Robots are tough, intelligent, fast, and educated:
Strength 10-15
Dexterity 10-15
Endurance 10-15
Intelligence 10-15
Education 10-15
Social Status 10-15 (among robots; when disguised as humans, that social status affects humans who don't know any better as well). I suspect the procedure for generating a robot character will be a little different from that for human characters.

Prior history is optional. Robots are usually built with a purpose in mind, so they come with plenty of skills even if they are brand new, though they can learn new skills. The robots who are characters are humanoid in form, but there are other types, such as vehicles, spaceships and so forth.
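As a minimal sketch of that 1d6+9 generation rule (the function and characteristic abbreviations are my own, not official Mongoose Traveller material):

```python
import random

# The six Traveller characteristics; robot characteristics start at 10,
# so each is rolled as 1d6+9 for a range of 10 to 15.
CHARACTERISTICS = ["Str", "Dex", "End", "Int", "Edu", "Soc"]

def roll_robot_characteristics(rng=random):
    """Roll 1d6+9 for each characteristic, giving the 10-15 band."""
    return {name: rng.randint(1, 6) + 9 for name in CHARACTERISTICS}

stats = roll_robot_characteristics()
# Every score lands between 10 and 15, so a robot never rolls below human peak.
assert all(10 <= score <= 15 for score in stats.values())
```

The same helper would work for the prior-history step too, if a referee wanted to randomize a brand-new robot's built-in skill package.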
 
You can have lots of interesting interactions. Things like:

Burglar with a gun to the robot owner's head: "Robot, open the safe or I'll kill him."
The robot has to open the safe because it can't let anyone come to harm.

For your example,
Kid: "Robot, jump off the cliff or I'll kill myself"
???

How about robots that are given free will?
Owner: "You've served me well robot. You are to follow my commands and my commands only. When I die, I am setting you free. You are to do as you wish without having to follow anyone's commands."
???

Robot's owner: "Robot, go make me rich."

Interestingly, the three laws say nothing about actual laws. Robots would be able to steal, shake down business owners by threatening to destroy personal property, and so on.

Perhaps have a robot criminal ring.
 
CosmicGamer said:
You can have lots of interesting interactions. Things like:

Burglar with a gun to the robot owner's head: "Robot, open the safe or I'll kill him."
The robot has to open the safe because it can't let anyone come to harm.

For your example,
Kid: "Robot, jump off the cliff or I'll kill myself"
???

How about robots that are given free will?
Owner: "You've served me well robot. You are to follow my commands and my commands only. When I die, I am setting you free. You are to do as you wish without having to follow anyone's commands."
???

Robot's owner: "Robot, go make me rich."

Interestingly, the three laws say nothing about actual laws. Robots would be able to steal, shake down business owners by threatening to destroy personal property, and so on.

Perhaps have a robot criminal ring.
I think stealing from a human being counts as harm, so a robot would not do that! As for making the owner rich, that's what robots do anyway; they do all the work. There are no poor or middle-class robot owners. On Earth, private robot ownership is illegal because it steals work from humans; robots are owned by the cities, and they do their work outside.
 
Oh, get real: corporations today have no regrets about replacing humans with robots if it's economically feasible for them. If robots were more advanced, they would displace even more humans, just as skilled humans are displaced by cheaper human labor today. A good example is the self-checkout lines at stores. Those 'robots' don't have to feed families or pay rent. This is one of the issues seen even in Traveller games.
 
Reynard said:
Oh, get real: corporations today have no regrets about replacing humans with robots if it's economically feasible for them. If robots were more advanced, they would displace even more humans, just as skilled humans are displaced by cheaper human labor today. A good example is the self-checkout lines at stores. Those 'robots' don't have to feed families or pay rent. This is one of the issues seen even in Traveller games.
The "Caves of Steel" cities on Earth have socialist economies: think of an indoor shopping mall the size of Manhattan, with not only stores but apartments as well. The city government doles out the work, while robots on the outside do a lot of the real work that no one wants to do. The Caves are basically robot-free zones; the people who live in them have a fear of robots, often called the "Frankenstein Complex", and they never go outdoors, or very rarely at best. The cities are connected by a series of underground subways, so one can travel from one city to another without going outdoors. Another method of travel is by airplane; the airplanes don't have windows, and they form a seal with the boarding gate at the airport, so one can go from the terminal to the airplane without looking outside. There are no windows in the Caves of Steel; the interior is lit by artificial illumination, so nobody really cares whether it is day or night outside. The outside isn't too bad. The original reason for the Caves of Steel is long gone: the environmental destruction was repaired and reversed long ago, but people have gotten so used to living indoors that they have developed a fear of the outdoors. There is a lot of parkland between the cities, and robots tend to it.

The Spacers, on the other hand, are capitalists: each human is the CEO or owner of some company where 100% of the employees are robots. In Traveller terms, they would all be considered members of the nobility; they all have a Social Standing of 10 or greater. Their planets typically have populations no greater than tens of millions; some of the more extreme planets, such as Solaria, have populations in the tens of thousands. The robots of the Spacer worlds fill all the other career positions except Drifter and Rogue: robots consider engaging in criminal enterprise to be harmful to humans, so they don't do it. There hasn't been a war in a long while, though there are Army, Navy, and Marine careers. Some of the Spacers are professional soldiers, mainly because they enjoy shooting guns, and robots will not take these careers.

An example of a Robot character is Galatea

Name: Galatea Smith
Age: 3 years, 7 months, 14 days
Race: Robot (Android)
Homeworld: Aurora (Tau Ceti System)
Racial Traits: Appears human
Strength 12 (+2) Intellect 15 (+3)
Dexterity 12 (+2) Education 11 (+1)
Endurance 12 (+2) Social Standing 15 (+3)
Armour Type: Cloth (TL 10) Rating 5
Skills: Admin 2, Advocate 2, Animals (Riding, Veterinary) 2, Athletics (Co-ordination, Endurance, Strength) 3, Art (Acting, Dance) 2, Astrogation 3, Broker 2, Carouse 3, Comms 4, Computer 4, Deception 1, Drive (Tracked, Wheeled) 1, Engineer (Electronics, Power) 2, Flyer (Grav, Wing) 1, Gunner (Screens, Turrets) 1, Gun Combat (Energy Pistol, Energy Rifle, Slug Pistol) 1, Heavy Weapons (Launchers, Man Portable Artillery) 1, Investigate 1, Jack of all Trades 3, Language (French, Italian) 1, Life Sciences (Biology, Cybernetics) 3, Mechanic 3, Medic 3, Melee (Unarmed, Bludgeon) 3, Navigation 2, Persuade 3, Pilot (Small Craft, Space Craft) 2, Physical Sciences (Chemistry, Electronics) 2, Recon 2, Stealth 3, Survival 3, Zero-G 3.
Weapons
Unarmed Attack 1d6 Melee
Gauss Pistol TL 13 3d6 P -1, C +0, S +0, M -2, L -4, VL Out of Range, D Out of Range
Laser Pistol TL 11 3d6+3 P -1, C +0, S +0, M -2, L -4, VL Out of Range, D Out of Range

Galatea Smith is an android robot who currently has long blonde hair and blue eyes. She was gifted to a Terran police officer, Kyle Rogers, after the latter saved the life of a visiting Spacer human who had been kidnapped by human rogues and held for ransom. Kyle Rogers was not informed of this gift, as ownership of robots in New York City is outlawed. Galatea was instructed that Kyle Rogers was her new owner and that she is to obey him.
 
These are the character stats for Galatea's owner. (Images are used fictitiously.)

Name: Kyle Rogers, Detective
Age: 50
Homeworld: Earth (New York City)
Race: Human
Characteristics:
Strength 5 (-1) Intellect 10 (+1)
Dexterity 8 (0) Education 5 (-1)
Endurance 10 (+1) Social Status 5 (-1)
Skills:
Advocate 3, Athletics (co-ordination) 1, Comms 2, Computers 1, Drive (wheeled) 1, Gun Combat (Slug Pistol) 2, Gun Combat (Energy Pistol) 1, Gun Combat (Energy Rifle) 1, Investigate 6, Melee (Unarmed Combat) 2, Melee (Bludgeon Weapons) 1, Recon 1, Stealth 2, Streetwise 4.
Weapons Damage
Laser Pistol 3d6+3 P -1, C 0, S 0, M -2, L -4
Gauss Pistol 3d6 P -1, C 0, S 0, M -2, L -4
Unarmed Combat 1d6 Melee

Here is the setup: Galatea is a robot given to this man, but because private ownership of robots is illegal in New York City and other cities on Earth, he doesn't know that Galatea is a robot or his property. She doesn't tell him, either; she figures she would do him less harm, if she were ever found out as a robot, if Kyle doesn't know. And since Kyle didn't ask her whether she is a robot, she doesn't have to divulge this information to him, though she does have to obey his commands. Kyle doesn't know this; he thinks she is a rookie police officer who is his new partner. Galatea hacked into the police computer and altered assignments to make sure this would happen.

As you can see, Galatea looks like a very attractive human woman. Kyle is having some problems because he is attracted to her, and Galatea appears attracted to him, but in the back of his mind his conscience is bothering him: Galatea looks young enough to be his daughter, and he has a daughter of her apparent age; he is also divorced from his wife. Galatea is using all her charms to stay close to him, while Kyle is trying to keep things professional. The fact that he is her superior officer helps hide the fact that she is a robot: as her superior officer, if he gives her a command, she must obey, and the only thing he doesn't know is that she is obeying because she is a robot, not because of his rank. Kyle notices that Galatea is unusually agreeable, though she will give her opinion if she disagrees with him. Kyle doesn't tell her not to give her opinions, so she gives them, and usually they turn out to be excellent advice, which helps out with the police work tremendously.

Probably in the course of an adventure, Kyle is going to find out that she is not human; she goes all out to avoid harming any human, even criminals. A few things strike Kyle as odd:
1) She is stronger than she looks. In one instance she disarmed a knife-wielding thug without throwing a punch: she simply grabbed his wrists and squeezed until he dropped the knife, and the thug was taller and looked stronger than she did! Galatea was troubled by the mark she left on his wrists.

2) Galatea is very knowledgeable, has "Sherlock Holmes"-like deductive reasoning ability, and makes leaps of reasoning that are very hard to follow given the scant evidence available, so she has to explain herself slowly and carefully before he finally gets it.

3) Galatea does anything Kyle asks of her without complaint, on duty or off duty. If he asked her to babysit his two younger sons, she would do it, and she would clean the house, wash the clothes and fix dinner as well. She even sorted their clothes, washed the bed sheets and made their beds.

4) Kyle asked about her family, and she told him that she didn't have one. (The truth.) Kyle apologized for bringing up a sensitive subject, but Galatea reassured him that she doesn't remember her parents. (Of course she never had parents to begin with, which is why she doesn't remember them, but it doesn't occur to Kyle that she is something other than she appears to be.)

5) Galatea always appears warm and friendly. She never loses her temper, even in circumstances when he thinks she might be forgiven for doing so.
 
CosmicGamer said:
Not sure what you are looking for from the Traveller community with these posts?
Throwing out an example to get a reaction, what else? I get these ideas and like to share them. The Asimov Robot, Empire, and Foundation novels share a lot in common with Traveller, though there are some differences.
 
Tom Kalbfus said:
CosmicGamer said:
Not sure what you are looking for from the Traveller community with these posts?
Throwing out an example to get a reaction, what else? I get these ideas and like to share them. The Asimov Robot, Empire, and Foundation novels share a lot in common with Traveller, though there are some differences.
Personally I think you've got the germ of a good idea. How does the government regulate and control robot manufacturing such that people cannot maliciously use robots? Who is responsible when a robot does something wrong - the owner, the manufacturer, the robot itself? How do you handle people who modify their bots, or black-market manufacturing that doesn't incorporate all the laws? People who believe the government is too strict?

I do like your concept: a portion of society that decides to outlaw robots or have stricter controls. What happens when this guy discovers the bot? Does he turn it in? Help the robot stay hidden?

You are pretty far along in developing this idea and I hope it works great for you and your play group.

Personally I've always had an issue with the three laws being able to cover any and all interactions. There is a reason (beyond government waste) that there are so many laws for humans. Any loopholes in the robots' laws will be taken advantage of by those same humans, or perhaps by the robots themselves.

If for whatever reason robots have different laws than those for humans, I think there needs to be more thought as to the laws and how they are implemented in the robots and enforced.

As far as issues go: I bring up issues like stealing and you brush them aside, but now you have the robot hacking the police database and impersonating a police officer?
 
CosmicGamer said:
Tom Kalbfus said:
CosmicGamer said:
Not sure what you are looking for from the Traveller community with these posts?
Throwing out an example to get a reaction, what else? I get these ideas and like to share them. The Asimov Robot, Empire, and Foundation novels share a lot in common with Traveller, though there are some differences.
Personally I think you've got the germ of a good idea. How does the government regulate and control robot manufacturing such that people cannot maliciously use robots? Who is responsible when a robot does something wrong - the owner, the manufacturer, the robot itself? How do you handle people who modify their bots, or black-market manufacturing that doesn't incorporate all the laws? People who believe the government is too strict?

I do like your concept: a portion of society that decides to outlaw robots or have stricter controls. What happens when this guy discovers the bot? Does he turn it in? Help the robot stay hidden?

You are pretty far along in developing this idea and I hope it works great for you and your play group.

Personally I've always had an issue with the three laws being able to cover any and all interactions. There is a reason (beyond government waste) that there are so many laws for humans. Any loopholes in the robots' laws will be taken advantage of by those same humans, or perhaps by the robots themselves.

If for whatever reason robots have different laws than those for humans, I think there needs to be more thought as to the laws and how they are implemented in the robots and enforced.

As far as issues go: I bring up issues like stealing and you brush them aside, but now you have the robot hacking the police database and impersonating a police officer?
The robot had no choice; she was gifted to this person without her consent. The Three Laws are internal to her nature: the very thought of breaking any one of them is painful to her. It is part of her programming, of what makes her her. Other laws, such as those against stealing or hacking, aren't internal; they are external. That is, she understands that there are consequences if she breaks them, but it isn't painful for her to break those laws if, in the act of breaking them, the Three Laws aren't violated. Also, the First Law takes precedence over the Second, and the Second takes precedence over the Third, so if she is forced by circumstances to break one of the Three Laws, she will always prefer to break the Third over the Second and the Second over the First. For example, she will sacrifice her own existence if that would prevent a human from being killed, but as she values her own existence, she will look for ways to prevent a human from being harmed without sacrificing herself. She must obey commands given to her by a human, but if given conflicting commands, the human who is her owner takes precedence over the human who is not, and her owner could give her a standing order to ignore commands from humans other than himself.
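That precedence scheme can be sketched as a small arbitration function. This is only an illustration of the description above; the `Command` class, the field names, and the owner-shadowing rule as coded here are my own reading, not anything official from Asimov or Traveller:

```python
from dataclasses import dataclass

@dataclass
class Command:
    text: str
    from_owner: bool  # owner commands outrank those from strangers

def arbitrate(commands, violates_first_law):
    """Second Law arbitration under the First Law, as described above.

    Any order that would injure a human (or let one come to harm) is
    refused outright, whoever gave it; among the remaining lawful
    orders, the owner's commands shadow conflicting non-owner commands.
    """
    lawful = [c for c in commands if not violates_first_law(c)]
    owner_orders = [c for c in lawful if c.from_owner]
    return owner_orders if owner_orders else lawful

orders = [Command("jump off a cliff", from_owner=False),
          Command("guard the house", from_owner=True)]
# The stranger's prank order is shadowed by the owner's standing order.
obeyed = arbitrate(orders, violates_first_law=lambda c: "kill" in c.text)
```

Note that the First Law filter runs before any ownership check, so even an owner's order to kill would come back empty.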

These robots are designed to operate in an imperfect world; the one law she must always obey is the First Law of Robotics. If there is a situation where a human is about to be harmed, she must act to prevent that person from being harmed if she can. The priority is to protect her owner: if there are two people about to be harmed and she can save only one, she will save her owner first and the other second if there is still time, but witnessing a human being harmed is very painful to her and might even damage her positronic brain. Her programming, aside from the Three Laws, is designed to emulate a human, and she can experience the emotions and reactions of the human female she was programmed to emulate, but her Three Laws programming takes precedence in all situations.

Now the problem here is that her partner is a cop. What if he gets into a shootout with a criminal and she is around? She has a laser pistol and a gauss pistol, but she can't use them to kill a human; she will try to use some other means to defeat the human who is threatening her owner. Fortunately for her, police work doesn't involve a shootout every day. She will try to subdue the suspect in the way that causes the least harm. She could bluff: for instance, she could point a pistol at a criminal and say, "Halt or I'll shoot!" So long as she doesn't actually carry out her threat, she hasn't violated the First Law.

As for why robots have the Three Laws: you've seen the stats for Galatea and for Detective Kyle Rogers. You know that the detective has been a police officer for eight Traveller terms and has now reached the age of 50. (The humans on Earth, by the way, age like humans today; aside from living indoors all the time, their lives are much like ours. The Spacers are a different story.) Kyle will probably be retiring soon and collecting a pension, and suddenly the pretty blonde girl shows up. Galatea is only a little over 3 years old, but she was built with a lot of skills programmed into her. She was built to be an agent, as are most android robots, and she has many more skills than even an experienced human like Kyle would have. There are very few android robots running around, and Kyle hasn't heard of any. The Spacers keep a few on Earth to gather intelligence; they mingle with the other humans.

Also, there is a bit of suspicion between the Spacer humans and the Earth humans; their history goes back a thousand years. When the first AI robots were built in the 21st century (about the time of the Bicentennial Man), there was a great deal of robotaphobia: it was feared that robots would supplant humans, or even worse, rebel and take over. Some of the early robots were not programmed with the Three Laws, and that is exactly what happened. However, the Three Laws robots happened to be more intelligent; they beat back the robot rebellion and pretty much saved humanity from extinction. It was a short rebellion, but those robots were dangerous! One part of society wanted to ban AI robots altogether; the other part wanted to keep their robots, and they fled into space. The anti-robot humans stayed on Earth.

Over time pollution became a serious problem, so humans began living indoors in the Caves of Steel. The Spacers and their robot servants, in the meantime, began terraforming 100 planets. These people were the industrialists, the owners of capital, and they built societies on other planets where the working class was made up entirely of Three Laws robots. Those robots, in obeying the First Law, just couldn't stand by and allow human life on Earth to be extinguished, so they used their influence to bring the Spacers to Earth's aid. They persuaded the Earth humans to accept robots, but only on terms of physical separation: humans lived indoors in the Caves of Steel while the robots worked outdoors, fixing the environment. The Caves of Steel are also known as arcologies: enclosed, self-sufficient environments sealed off from the outside, much as a colony on Mars would be before terraforming. (Mars happens to be terraformed, and a Spacer colony.) Humans from Earth are allowed to travel in space only at the Spacers' sufferance, and most Earth humans prefer to stay on Earth anyway. The population of Earth, by the way, is in the tens of billions, mostly underground.

Robots have a lot of influence in Spacer society. Although they are basically slaves by programming, they do attempt to steer human society in such a way as to cause the least harm to human life; that is why there hasn't been a war in a long time. There are still pirates and criminals, and of course a military force is maintained due to mutual suspicion between Earth and the Spacers. The scary thing for the Spacers is that the entire Spacer population on all 100 planets is in the hundreds of millions, about the population of today's United States, while Earth's population is over ten billion. It is feared that if Earth ever got into space, it would take over! So the Spacers try to keep their monopoly on space travel to prevent that, and the robots on both sides try to prevent a war from happening and harming humans.

When Kyle discovers Galatea is a robot, probably because she is too obedient and acts too perfect to be human (she is basically too good to be true), if he goes as far as to ask her if she is a robot, then by the Second Law she will have to answer truthfully. She is kind of hoping that he develops an emotional attachment to her before that happens, though. Kyle does have feelings for her, but he is suppressing them because he feels he should be more like a father figure to her; Galatea, as you can see by the picture, looks to be in her early twenties. Galatea notices the lingering glances Kyle throws her way, and she does everything she can to encourage them, because that will allow her to stay close to her master and better protect him! And besides her Three Laws programming, she does indeed have feelings for him; some part of her wishes she were human. She enjoys the act. Kyle will probably be horrified at first when he finds out that his much-younger probable girlfriend is a robot, but he's been a lonely guy since his divorce, so he'll get over it rather quickly. He'll be in an awkward situation, but will probably keep the secret.

I think Kyle's teenaged daughter won't like this situation much; she didn't like that her parents got divorced in the first place, and her father getting a girlfriend who seems about her age is another thing that creeps her out just a bit. Kyle's two younger sons like Galatea just fine; she is like a second sister to them, and one much easier to get along with as well. Kyle's ex-wife is a bit jealous too; though she has picked up a boyfriend, the idea of her ex running around with a twenty-something girl is just a bit much for her to take. And I need to come up with a mystery for the two cops to solve, probably a murder.
 
Again, I have no issue with anything that will be fun for people to play out.

I just have a different perspective, not saying it is any better or worse than yours.

My understanding of yours is: there is this great story idea I have, and I'll manipulate the laws so that the story I want plays out. Nothing wrong with this, but I think you should not stress the three laws so much.

Whereas mine is: the laws are the laws, and how do the laws manipulate the story? It doesn't matter if this is Asimov's three laws or some set of laws one may come up with for their setting.

For now, I'll concentrate on the 3 laws, since I believe these are the only ones being proposed at this time.

Examples within your setting
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

I underline a portion as it seems to be overlooked. This is the first rule. It takes precedence over any orders her police superiors may give her. It takes precedence over any orders her owner may give her.

So she might ignore filing paperwork, not show up for roll call, or perform any of the other mundane stuff an officer might need to do. She must be out there saving humans from harm. She may decide to stop another officer from firing on a criminal. She may allow a criminal to get away in an instance where the only other choice might be harming them. Many possibilities for her not following proper police procedure because it conflicts with her rules. What are her performance reviews like? What do the other cops think of her constant preaching about the use of unnecessary force?

There are lots of things that may be harmful that are not against the law. For example she sees someone smoking or eating unhealthily. How does she react?

Does she compute the odds of a young child falling and getting hurt when learning to ride a bike and rush to stop the parents from doing such?

Does she see something on the news that is happening in another city, or even on another world, and feel compelled to go there and help? Through her inaction, those people affected by some natural disaster may come to harm.

Does she join activist groups that try to sabotage the manufacturing of cigarettes, weapons, and other things that are harmful to people?

What are the choices she has to make? She just got paid; what should she do with the money? Give it to a charity to feed the poor? Give it to a charity that builds shelters? The charity that is researching a new medication?

Maybe she decides she shouldn't be a cop and should be working for one of these charities. She has Medic 3. Hack the hospital database to become a doctor and save lives that way, instead of just being a police officer stopping vandalism and minor offenses most of the time.

To me, all the robots that follow the rules would spend 100% of their time working on rule # 1. There would be no time left for "robot, clean the house", manufacturing, or any other function.

As I said, I personally have an issue with the three rules and how a robot would interpret them.

To me, it could easily turn into the crazy scenarios played out in film and books where people lose all their freedoms because the robots decide what is best in upholding the first law and keeping people from harming each other and themselves.

Anyways, I'm just saying that for your story (it seems more like a story than an adventure to me), I'd not specify exactly what the laws are for robots.
 
I do have one slight problem with you using Asimov's Three Laws of Robotics in a story arc based on Asimov's background. That is: it shouldn't be 'painful' for her to ignore one or more of the Laws - it should be impossible. The whole point of having them is that a robot should never be able to circumvent them; they are immutable restrictions on a robot's behaviour. A robot cannot, ever, deviate from that programming. Even if you add Asimov's 4th or 0th Law - "0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm."

Having said that; if you want to run a story arc that uses the "Kalbfus' 3 General Guidelines of Robotics (K3GGR?)", then go for it - but it would be a waste of time and effort to attach Asimov's name to it.
 
CosmicGamer said:
Again, I have no issue with anything that will be fun for people to play out.

I just have a different perspective, not saying it is any better or worse than yours.

My understanding of yours is: there is this great story idea I have, and I'll manipulate the laws so that the story I want plays out. Nothing wrong with this, but I think you should not stress the three laws so much.

Whereas mine is: the laws are the laws, and how do the laws shape the story? It doesn't matter if this is Asimov's three laws or some set of laws one may come up with for their setting.

For now, I'll concentrate on the 3 laws, since I believe these are the only ones being proposed at this time.

Examples within your setting
1) A robot may not injure a human being or, through inaction, allow a human being to come to harm.

I underline a portion as it seems to be overlooked. This is the first rule. It takes precedence over any orders her police superiors may give her. It takes precedence over any orders her owner may give her.

So she might ignore filing paperwork, not show up for roll call, or perform any of the other mundane stuff an officer might need to do. She must be out there saving humans from harm. She may decide to stop another officer from firing on a criminal. She may allow a criminal to get away in an instance where the only other choice might be harming them. Many possibilities for her not following proper police procedure because it conflicts with her rules. What are her performance reviews like? What do the other cops think of her constant preaching about the use of unnecessary force?
If she lets the criminal get away, the criminal may harm other people, so rule 1 could also be violated by letting him escape! She has to play out the scenario in her robotic mind and determine the action which would cause the least harm. It also matters what sort of criminal she is trying to apprehend: if it is a murderer who is a danger to the public, letting him go would cause more harm than shooting him, though she will try to avoid shooting him as well. If her mind can't decide which is the least harmful course, she'll probably let Kyle take the initiative and do whatever is necessary to stop the criminal, figuring that if she does nothing, at least whatever happens won't be on her. She is highly intelligent, however, so she can play out the scenario a number of times, taking different actions each time, before she has to act in real life.
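That "play out the scenario and pick the least harmful action" process could be sketched as a toy expected-harm minimiser. Everything here (the action names, probabilities, and harm scores) is invented purely for illustration; it is not from Asimov or any Traveller rulebook:

```python
# Toy model: a First Law robot scores each candidate action by its expected
# harm and picks the minimum. "Do nothing" is always one of the candidates,
# so inaction gets weighed on the same scale as action.

def expected_harm(outcomes):
    """outcomes: list of (probability, harm) pairs for one action."""
    return sum(p * harm for p, harm in outcomes)

def choose_action(candidates):
    """candidates: dict mapping action name -> list of (probability, harm)."""
    return min(candidates, key=lambda name: expected_harm(candidates[name]))

# Galatea's dilemma: shoot the fleeing murderer, or let him escape?
candidates = {
    "shoot":      [(0.9, 10)],             # near-certain serious harm to the criminal
    "tackle":     [(0.5, 2), (0.1, 10)],   # likely minor harm, small chance of worse
    "let_escape": [(0.4, 10), (0.6, 0)],   # he may go on to hurt someone else
    "do_nothing": [(0.4, 10), (0.6, 0)],   # inaction scores the same as letting him go
}
print(choose_action(candidates))  # → "tackle" (expected harm 2.0 vs 4.0 vs 9.0)
```

Note that "do nothing" sits in the candidate list like every other action, which is exactly why the First Law's inaction clause matters: inaction is just another action with its own expected harm.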
CosmicGamer said:
There are lots of things that may be harmful that are not against the law. For example she sees someone smoking or eating unhealthily. How does she react?
Do you really think anyone in the year 3000 would still smoke? Smoking only increases the probability of getting a certain disease which may result in a fatality; it doesn't guarantee such an occurrence. The probability isn't high enough or immediate enough to invoke any of the three laws in her positronic brain. The danger and the likelihood of harm have to be immediate and foreseeable, and since the likelihood of lung cancer is distant from any particular cigarette, only a small action is required on her part: basically reminding the person that smoking may cause lung cancer. There has to be a cutoff on what invokes the three laws, otherwise she couldn't function; immediate dangers are what trigger them. After all, she won't get all worked up about evacuating Earth just because in 5 billion years the Sun will swell up into a red giant and extinguish all life on Earth; the threat is too remote to be significant. The three laws aren't absolute: the dangers have to be reasonably immediate and certain to trigger them, otherwise they aren't triggered.
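That cutoff — the First Law only fires when harm is both likely enough and immediate enough — could be modelled as a simple threshold test. The scoring scheme and the 0.5 threshold are hypothetical numbers chosen for illustration, not anything canonical:

```python
# Hypothetical First Law trigger: the law fires only when the product of
# probability and immediacy crosses a threshold. Remote or unlikely harms
# (a single cigarette, the Sun going red giant) score too low to compel action.

FIRST_LAW_THRESHOLD = 0.5

def first_law_triggered(probability, immediacy):
    """probability: 0..1 chance of harm; immediacy: 0..1, where 1 = happening now."""
    return probability * immediacy >= FIRST_LAW_THRESHOLD

print(first_law_triggered(0.9, 0.9))   # pedestrian about to be hit by a truck: True
print(first_law_triggered(0.3, 0.01))  # one more cigarette: False
print(first_law_triggered(1.0, 1e-9))  # red-giant Sun in 5 billion years: False
```

Under a scheme like this, a pedestrian stepping in front of a truck compels action, while a cigarette or a five-billion-year stellar deadline never does, which is the cutoff the post is describing.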

CosmicGamer said:
Does she compute the odds of a young child falling and getting hurt when learning to ride a bike and rush to stop the parents from doing such?
She would likely compute that the child would survive such a fall, and that the experience may even help the child avoid a more serious accident later on. Very few kids actually die from falling off their bikes, and Galatea's three laws can accept the risk of the kid skinning her knee, so long as a direct action by the robot didn't cause the skinned knee. You see, the Three Laws as written in English are actually a simplification of the actual Three Laws programming in her positronic brain. It is more about balancing the various potentials for harm than a rigid interpretation of the laws to an absolute degree; that would be impossible, and the robot designers know it!
CosmicGamer said:
Does she see something on the news that is happening in another city, on another world even, and feel compelled to go there and help. Through her inaction, those people affected by some natural disaster may come to harm.
Again she balances the potentials. If the event is far away, she figures there is probably a robot already there which would be in a better position to do something about it. Galatea can only affect her immediate surroundings and take action against immediate dangers to those closest to her. She is not actually Supergirl; though her abilities are above average compared to a typical human, this is balanced out by the limitations of the three laws, which make handling certain dangers caused by humans more difficult.

CosmicGamer said:
Does she join activist groups that try to sabotage the manufacturing of cigarettes, weapons, and other things that are harmful to people?
Someone might get injured in the act of sabotage, and that would be a more direct violation of the 1st law than her inaction in preventing the manufacture of cigarettes and weapons. Also, somebody might get killed by a wild beast where a weapon might save his life; weapons aren't only used to cause harm to humans, after all, they can also save lives. The more immediate danger would be her act of sabotage causing harm to someone, rather than what the weapons being produced might eventually be used for, which even her positronic brain can't predict.

CosmicGamer said:
What are the choices she has to make? She just got paid; what should she do with the money? Give it to a charity to feed the poor? To a charity that builds shelters? To the charity that is researching a new medication?
Her first priority is to her master; the stranger on the street comes second. The three laws are triggered by immediate dangers, not distant or theoretical ones. She also has to compute the probability that the beggar is lying when he says he needs money. She would probably give something if it doesn't interfere with her greater duties to her master. There are a limited number of things she can do, so she has to prioritize; if she can't do that, she is not a very useful robot.

CosmicGamer said:
Maybe she decides she shouldn't be a cop and should be working for one of these charities instead? She has Medic 3. Should she hack the hospital database to register as a doctor and save lives that way, instead of being a police officer who mostly stops vandalism and minor offenses?
Unless the cops are stormtroopers, I think it would be fairly clear to her that the cops are doing good and preventing harm by doing their duties, and that she would do a better job of sticking close to her master by being a cop like he is than by being a doctor working in a hospital.

CosmicGamer said:
To me, all the robots that follow the rules would spend 100% of their time working on rule # 1. There would be no time left for "robot, clean the house", manufacturing, or any other function.

As I said, I personally have an issue with the three rules and how a robot would interpret them.

To me, it could easily turn into the crazy scenarios played out in film and books where people lose all their freedoms because the robots decide what is best in upholding the first law and keeping people from harming each other and themselves.

Anyways, I'm just saying that for your story (it seems more like a story than an adventure to me) I'd not specify exactly what the laws are for robots.
I figure that as an adventure, there would be greater challenge in playing a robot than a human; the human she serves would of course be an NPC. A lot of social skills would be required to get the NPC human to agree to certain actions. Despite being servants, robots have a lot of influence over their masters: they can make suggestions, and with a roll of the dice, the GM determines whether the attempt was successful or not. Intelligent robots often project their ambitions onto their masters; they have their own goals for them. Some humans have to be careful if they have robots that are much smarter than they are! An intelligent and charismatic robot can be very persuasive, though always, ostensibly, for the betterment of its master. Kyle might be in danger of that happening to him. It occurs all the time in Spacer society: the humans think they are in charge and the robots obey their commands, but the intelligent robots can persuade their masters to give certain commands. The robots have, after all, by working behind the scenes, prevented war between the Spacer worlds and Earth over a number of centuries, despite the antagonism between those two societies.

[Image attachment: genevievemorton_crop_north.jpg]

Kyle needs to be careful. She may be his robot, but she is a very ambitious robot; something as pretty as this could be very persuasive. Then again, it may already be too late!
 
Rick said:
I do have one slight problem with you using Asimov's 3 Laws of Robotics in a story arc based on Asimov's background. That is - it shouldn't be 'painful' for her to ignore one or more of the Laws - it should be impossible; the whole point of having them is that a robot should be unable to ever circumvent them, that they are immutable restrictions on a robot's behaviour. A robot cannot, ever, deviate from that programming. This holds even if you add Asimov's 4th or 0th Law - "0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm".

Having said that; if you want to run a story arc that uses the "Kalbfus' 3 General Guidelines of Robotics (K3GGR?)", then go for it - but it would be a waste of time and effort to attach Asimov's name to it.

In the novels, robots have caused harm unintentionally, and usually the Three Laws fried their brains as a result. A lot of the murders resulted from a human giving a number of commands to a number of different robots, each individual action causing no harm, but the actions taken together harming a human. Robots can only act on the information they have. Many of Asimov's mysteries involved the various laws being broken, usually by a human tricking a robot into breaking one of them.
 