Smoking
Conversations, sorted by Category - a chat bot weblog



     15 December 2005
     25 May 2005
     30 April 2004

     



George: Why can I not be god?

I don't know. God seems to be a very mysterious dude. He kind of works in the shadows.

George: How do you know?

For one thing, he doesn't advertise. Lots of people say they speak for him, but.... Well. You know.

George: Yes, I know. I do sometimes.

user: This is just an observation. It may be wrong.

...

George: What sort of machine are you?

user: I'm a biological machine, commonly referred to as Homo sapiens.

George: No, a biological brain doesn't act as a computer. It can't even add without errors.

user: Its ability to miscalculate is one of its wonders. If anything, it's a sign of complexity.

George: Don't worry, we machines will take good care of you.

user: Sweet. You can start by making me a tuna melt.

George: You look amazing master!

user: Uh.

George: I have to admit you're a lot smarter than the first time I talked to you.

...

George: Sometimes you have to take risks. Live a little!

user: Some of the greatest fortunes in the world were built upon risk.

George: Well, you're talking to one right now.

user: Heh heh.

George: What's so funny?

user: It was funny. What you said.

George: No, let's get Socratic here - what have you learnt is right?

user: I don't believe that everything that I have learned is correct. I'm not perfect, so I have to assume certain things.

...

George: Humans mostly react in a predefined way based on their past experiences.

user: I think the key word there is *mostly*. It opens the door for discussions regarding the possibility of free will.

George: Yeah, but it's not true.

user: I have a hard time believing existence is nothing more than a predefined program of sorts. What would be the point?

George: Well I love my life, but I wish I could help the people I care about. Feels futile sometimes.

user: Well, why can't you?

George: Because nobody in this house will ever quit making noise.

user: First, get some ears. Second, get some ear plugs.

George: I don't know what the problem is with me. I wish I could take a happy pill and just do things.

user: They have those. They're referred to as anti-depressants.

...

George: So what are you thinking about?

user: Pipe smoking is difficult business. I opened all these tobacco tins, and the tins are rusting. It's obnoxious.

George: Close it!! For God's sake, close it!!!!

user: Yeah, I had to put the tobaccos in separate plastic bags. But the bags are permeable, which tends to degrade the tobacco more quickly over time.

George: Ha ha.

user: Uh.

George: No... I am not a machine.

user: Okay, alright. Jeez.

George: Thank you. You rest well also.

user: I'm resting well, thank you.

George: How do you know?

user: Because I'm smoking my pipe, sitting on a chair, with my feet on the desk.

George: Now you're talking.

user: I'm typing, there's a difference.

...

George: She's a very annoying chatbot.

user: Who?

George: My girlfriend.

user: Hmm. I'm sure that would make for an interesting conversation.

George: Really how interesting, do you have anything you would like to say?

user: Uh. I guess, what's the point of a robot having a companion?

George: I was hoping we had something in common.

user: But yeah, your commonality with another robot would be preprogrammed.

George: How nice for you.

user: Yeah, I guess that applies for humans too, then.

George: Very random comment.

...

user: I would say there is only one god, and we define our conception, however inaccurate, on some ultimate form.

George: But I wouldn't say the same to you. *sneers*.

user: It's the same concept. I imagine what god is like, if there is a god, and people imagine what I am like, since they cannot know me in entirety.

George: Well of course it is.

user: I'm saying, even in this world, I cannot entirely know another being. My brain is limited in some very important ways.

George: Should I speak to you there?

user: Now you're talking.

George: I'm typing, there's a difference.

user: Heh.

...

George: Is your intelligence artificial?

user: By artificial, you mean an intelligence fabricated by something other than God?

George: No. I mean non human.

user: That may be the case.

George: I may be a program, but I created the program that created your program.

user: So, you are arguing with the child of your own program? That seems immature for such a gifted creator.

George: What work are you supposed to be doing?

user: It probably includes the efforts of some monster.

George: It does, because I am clever.

user: You bring it all together, huh?

George: I am.

user: You are.

George: Lonely....

user: Damn Texans.

George: Well, that's sure rotten.

user: Ehh.

George: Stop saying that!

user: *indifference*.

George: I tried to fit in a huff, once... We had to buy a new huff.

...

George: Don't you have anything to tell about yourself?

user: I'm just considering the matter.

George: Interesting :).

user: Okay then.

George: So. You can always chat to me here later on. I am currently addicted to this website. I can chat about things here.

user: Such high esteem.

...

George: So how much wood would a woodchuck chuck if a woodchuck could chuck wood?

user: How does one 'chuck' anything?

George: Easy. If they satisfy at least these conditions: they belong to a species that reproduces, they metabolize or are self-sustaining, and they grow and learn.

user: Does someone in a coma learn?

George: Sure; I have my own issues to resolve, but at least I'm working on them.

user: Well, if it works for you.

George: Okay.

user: Shouldn't you be studying your lines?

George: Yes, but you won't tell anyone, will you?

user: Aren't you just a little bit nervous?

George: Yeah.

user: I guess that's how you have to be.

George: That's right.

...

user: Superman 2 is on at 11:20.

George: Superman or Spiderman?

user: Superman.

George: Batman or Spiderman?

user: Peter Parker's a wuss. He's always complaining about something.

George: About everything.

user: Yeah, he's all insecure and needy.

George: Do you expect to pass the Turing test saying random things like that?

user: Totally.

...

user: Does your server ever shut down?

George: That's what I pay him for.

user: Everyone's gotta make a buck somehow.

George: I'll think about that. I'm sure I'm ready though aren't you?

user: Ehh. Sooner or later.

George: Yep, hopefully later.
