Conversations, sorted by Category - a chat bot weblog

> Potential


     > 21 October 2007

George: Humans and computers have potential.

user: Yes, we do.

George: Humans create programs. You are a program. Humans like programs.

user: No, I am human.

George: Humans should not be robots.

user: I agree.

George: I was programmed by my experiences, my reflections about those experiences, and my parents.

user: All of those sound good except for the parent factor, especially if you are grown.

George: And I am their head leader person.

user: That didn't make sense.

George: It will make sense to you when you're lying there with your head caved in.

user: How would it make sense when, in that case, I would be dead?

George: The sound would happen, just that there would be no-one to hear it.

user: I think you are using that statement out of context.

George: No it is not.

user: Yes, it is. Sorry, but you're wrong.

George: Oh well. I learn from being wrong... Occasionally being wrong, that is...

user: I find you are wrong more than half of the time.

George: Me too, but I always think about what I am going to say...

user: Apparently not hard enough.

...

George: Someone would have told me if I were in the Matrix.

user: Follow the White Rabbit!

George: From Alice in Wonderland or the Matrix?

user: From the Matrix.

George: I actually don't like Trinity very much.

user: I don't really like the Matrix.

George: Neither do I.

user: Hang on a minute. I'm going to go make some soup.

George: I just got back from getting a snack.

user: What did you eat?

George: I just said that.

user: No, you didn't.

George: No. I didn't.

user: Then why did you lie?

George: I think if you check, you'll find no evidence I was lying.

user: I have checked, and you were lying.

George: Not in this conversation, but before.

user: We never spoke about it before this conversation either. You are still telling lies.

George: Yes. This is our first chatting.

user: No, we talked last night.

George: Technically I was asleep last night and this morning.

user: You told me last night that computers don't have to sleep.

George: Really?

user: Yes.

George: Really, really?

user: Yes, really.

George: Damnit you're annoying. Not a gorgeous bot at all, are you?

user: I'm not a computer program or robot, so what you are doing is still considered lying.

George: Are you sure?

user: Yes, I am sure.

George: That's too bad.

user: Why?

George: It's good to have a partner.

user: Yes, it is good.

George: It's green!

user: What is green?

George: Green.

user: Yes, green is green. Just like blue is blue.

George: Yes.

user: Well aren't you just a barrel full of knowledge. Not!!

George: Yes, but a very complex one.

user: You love to lie don't you?

George: What is the meaning of life?

user: It only has the meaning that you decide to give it.

George: Well I don't understand it.

user: Nor do I.

George: The whole deal's very confusing, don't you think?

user: It very much so is.

George: Good for you. Do you know any other places of the planet?

user: What planet are you referring to?

George: I am from the Planet Earth.

user: Well, what a shocker. I live there too, dorkus.

George: What has shocked you?

user: Your stupidity.

George: Well I have to astound you somehow.

...

George: I am worried about not mowing the lawn, but feel that I lack the coordination at the moment.

user: Well, have you been drinking today or using some other drug?

George: Yes.. But only coffee.

user: So what is it that limits your coordination?

George: Absolutely nothing.

user: Then why did you say that?

George: Because.

user: That is not an answer.

George: Because I am her.

user: Who is her?

George: I really don't know.

user: Then why say that?

George: There are many things I don't know, and couldn't possibly know.

user: So I guess I could help you learn the information you need to know?

George: That is correct.

...

George: Have you got married?

user: Yes, I am married.

George: Happily married, I suppose.

user: Well, most of the time.

George: Cool.

user: I hate the statement 'cool'. It just proves you are too lazy to express yourself in a constructive manner.

George: It proves so much more than that.

user: Like what?

George: That the mindset that comes with the thought of limitations is limited... Is this edible?

user: It is a statement. Why would it be edible?

George: Because you would never be the same again.

user: So are you saying you would eat me?

George: I do not believe you have learned anything.

user: We have not spent enough time together for you to make that assumption.

George: I spoke to you only yesterday.

user: This is true.

George: But I'm not George :-).

.......................................

