
Interesting Debate

Mar 23, 2009, 02:55 PM | #51
UnkieTrunkie (Moderator)
Joined: Sep 2004 | Posts: 109,425 | From: SJC

Originally Posted by mxt_77, Mar 23 2009, 11:47 AM
You never know what kinds of technological/practical/legal issues will pop up that may advance or interfere with the progress.
. . . like having enough HVAC for our new digital overlords. . .
Mar 23, 2009, 03:16 PM | #52
Elistan
Joined: Oct 2000 | Posts: 15,323 | From: Longmont, CO

Originally Posted by 8D_In_Trunk, Mar 23 2009, 01:22 PM
There's a difference between a mathematical sequence, even a big fractal one, and what humans often do via serendipity.
[...]
Technology, yes. However, that is with a serious caveat: Moore's Law has peaked, possibly even been disproven. . .
Well, no, considering that what humans do is simply a mathematical sequence, based on the physical construction of our bodies, the inputs we receive, and the physical laws of the universe.

Why do you say that about Moore's Law? From what I can gather from a quick Google search, it's still as accurate an observation now as when it was first formulated.
Mar 23, 2009, 03:59 PM | #53
UnkieTrunkie (Moderator)
Joined: Sep 2004 | Posts: 109,425 | From: SJC

Originally Posted by Elistan, Mar 23 2009, 03:16 PM
Well, no, considering that what humans do is simply a mathematical sequence, based on the physical construction of our bodies, the inputs we receive, and the physical laws of the universe.
You have a point. . . but it's flawed until we have proof of the mathematics of human serendipity.

Right now, you have conjecture based on the assumption that if it's made of transactions, the serendipity must be a linear transaction.

Computers, being machines, have a derivative path. So long as we're researching AI, that derivative path is plain and clear. Also, in the case of the automaton experiments (TraviS2000's post), they were all given the same orders, and followed them. There was no choice. . . the computer would have had to have primary instructions to investigate the feasibility prior to execution. . . to say nothing of the machine's desire to stay at home and get high. . .

Originally Posted by Elistan
Why do you say that about Moore's Law?
Mar 23, 2009, 04:35 PM | #54
HowardZinn
Joined: Feb 2009 | Posts: 417

I think computers will have true artificial intelligence one day. After all, look at how babies learn; it's pretty simple. We do fill in some gaps, though: associating words with meaning, for example, is a pretty complex process at times. Computers will require a lot of code to do that.
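
To make that concrete, here's a crude Python sketch of word association by simple co-occurrence counting. The sample sentences and the whole approach are just an illustration of the idea, not how any real system does it.

[CODE]
from collections import Counter

# Crude sketch of word-meaning association by co-occurrence counting.
# The sample sentences are invented; a real system needs far more.

sentences = [
    "the baby drinks milk",
    "the baby wants milk",
    "the dog drinks water",
]

pairs = Counter()
for s in sentences:
    words = s.split()
    for i, w in enumerate(words):
        for other in words[i + 1:]:
            pairs[(w, other)] += 1

# "baby" has co-occurred with "milk" twice and with "water" never.
print(pairs[("baby", "milk")], "vs", pairs[("baby", "water")])  # 2 vs 0
[/CODE]

Pair counts like these are the easy part; the "lot of code" is everything needed to turn them into anything resembling meaning.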

At the end of the day though, the machine won't have a soul.
Mar 23, 2009, 06:28 PM | #55
mxt_77
Joined: Jan 2003 | Posts: 8,482 | From: Wylie, TX

Originally Posted by 8D_In_Trunk
You have a point. . . but it's flawed until we have proof of the mathematics of human serendipity.
If an algorithm can be derived to "learn", then I see no reason that it can't be derived to recognize serendipitous events. Serendipity is easy to recreate... the brute force method is just to continuously try random actions and observe the outcome. There's no reason that a computer can't do that just as well as a human. In fact, I'd wager that it could do it much more quickly and effectively than a human.
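
For illustration, a minimal Python sketch of that brute-force loop; try_action() and the surprise threshold are invented stand-ins, not any particular learning algorithm.

[CODE]
import random

def try_action(action):
    # Stand-in for "observe the outcome": a noisy score for the action.
    return random.gauss(action * 0.1, 1.0)

expected = 0.0
finds = []
for _ in range(10000):
    action = random.uniform(-10, 10)          # try a random action
    outcome = try_action(action)              # observe what happens
    if outcome > expected + 3.0:              # flag a surprisingly good result
        finds.append((action, outcome))
    expected += 0.01 * (outcome - expected)   # learn what "normal" looks like

print(len(finds), "serendipitous finds in 10,000 trials")
[/CODE]

A machine can grind through trials like that far faster than a person can, which is the point above.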

Originally Posted by 8D_In_Trunk
Computers, being machines, have a derivative path.
I think you're too closed-minded about your definition of computers. Sure, if all you allow is silicon, precious metals & electrons, then computers may never become "intelligent." But, if you allow more advanced chemical compositions that can adapt to their environment, forming new pathways, independent algorithms and completely new structures based on the inputs that they receive and the interactions that they have, then I see no reason that computers have to have such a limited path. Sure, it could be calculated or predicted if you can control its inputs, but just as with humans, unless it is in a lab environment, you cannot pre-determine what it will "experience" and how it will develop.
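
As a toy illustration of that input-driven wiring (my own sketch, not anything from the research in this thread): a Hebbian-style rule in Python, where units that fire together get a stronger connection.

[CODE]
import random

N = 8
weights = [[0.0] * N for _ in range(N)]

for _ in range(1000):
    shared = random.random() < 0.5
    # Units 0 and 1 always fire together; the rest fire independently.
    activity = [shared, shared] + [random.random() < 0.3 for _ in range(N - 2)]
    for i in range(N):
        for j in range(N):
            if i != j and activity[i] and activity[j]:
                weights[i][j] += 0.01   # "fire together, wire together"

# The 0-1 connection ends up strongest because of the correlated input.
print("w[0][1] =", weights[0][1], "vs w[2][3] =", weights[2][3])
[/CODE]

The resulting pathways depend entirely on the stimulus history, not on anything written into the loop itself.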

Originally Posted by 8D_In_Trunk
Moore's Law is dependent around processor speed, which has peaked (thermals).
Actually, Moore's Law refers only to transistor density, not processor speed. And, as far as I'm aware, it has not yet become outdated... although I believe it soon will, based on the basic laws of physics and our current definition for transistors.

Originally Posted by 8D_In_Trunk
As I also said elsewhere it's become a dicey apples vs. oranges issue, as it is now multiple processors versus one human brain.
Are you implying that the brain is not a multi-core device? I'd argue that this assumption is incorrect. Basic psychology & neurological research shows that particular areas (or cores) of the brain are used for specific functions.
Mar 23, 2009, 07:09 PM | #56
Elistan
Joined: Oct 2000 | Posts: 15,323 | From: Longmont, CO

Originally Posted by 8D_In_Trunk, Mar 23 2009, 06:59 PM
You have a point. . . but it's flawed until we have proof of the mathematics of human serendipity.
"The mathematics of human serendipity?" Of what do you speak? Perhaps it would help if you clarify what you mean by "serendipity" because I'm not really sure what you're trying to say.

Originally Posted by 8D_In_Trunk
Computers, being machines, have a derivative path. So long as we're researching AI, that derivative path is plain and clear. Also, in the case of the automaton experiments (TraviS2000's post), they were all given the same orders, and followed them. There was no choice. . . the computer would have had to have primary instructions to investigate the feasibility prior to execution. . . to say nothing of the machine's desire to stay at home and get high. . .
Humans are much the same way - but since their initial configurations are so variable, and it's nearly impossible to control all the inputs, we can't really trace their derivative paths.

Regarding instructions - consider the difference between a person with a college degree, and a person (there have been a few) who spent their entire childhood with no or extremely little human contact.

Originally Posted by 8D_In_Trunk
Moore's Law is dependent around processor speed, which has peaked (thermals). The reason computers are faster now is due to multiple cores on the same die. Technically, it busts Moore's Law (two processors is not one processor).
Actually, Moore's Law is about the quantity of transistors in an integrated circuit, not processing power per se. So a dual-core chip is, by that measure, twice as advanced as a single-core chip according to Moore's Law.

Here's the original article, from 1965:
ftp://download.intel.com/museum/Moores_La...965_Article.pdf

"The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer."

(That was updated in 1975 to doubling every two years.)
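
The arithmetic in the quote checks out, assuming the roughly 64 components per chip that Moore's own 1965 chart showed (a quick Python check):

[CODE]
# Doubling every year for ten years from ~64 components:
print(64 * 2**10)  # 65536 -- Moore's "65,000" by 1975

# With the 1975 revision (doubling every two years), the projected
# growth from 1975 to 2009 would be about:
print(2 ** ((2009 - 1975) / 2))  # 2**17 = 131072x more transistors
[/CODE]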

Originally Posted by 8D_In_Trunk
As I also said elsewhere, it's become a dicey apples vs. oranges issue, as it is now multiple processors versus one human brain.

. . . and if you've noticed, at some point multiple processors do in fact have committee meetings.
The brain isn't a single monolithic processor. There's a vision center, a speech center, high-level thought, motor control, body-function regulation, etc. etc. Often one center can get damaged and the rest will continue working just fine.
Mar 24, 2009, 02:25 AM | #57
rustywave
Joined: Feb 2005 | Posts: 3,605 | From: Chicago

Originally Posted by 8D_In_Trunk, Mar 23 2009, 06:59 PM
. . . to say nothing of the machine's desire to stay at home and get high. . .
...can you imagine if it actually got to that point? haha

<monotone computer voice>

COME ON. JUST ONE MORE MAGNET. IT WILL BE THE LAST ONE. I SWEAR.

</monotone computer voice>
Mar 24, 2009, 10:00 AM | #58
UnkieTrunkie (Moderator)
Joined: Sep 2004 | Posts: 109,425 | From: SJC

Originally Posted by Elistan, Mar 23 2009, 07:09 PM
"The mathematics of human serendipity?"
Mar 24, 2009, 10:02 AM | #59
thebig33tuna
Joined: Jan 2007 | Posts: 32,283 | From: Cincinnati, OH

damn 8d ninja edits fast.
Reply
Mar 24, 2009, 10:04 AM | #60
UnkieTrunkie (Moderator)
Joined: Sep 2004 | Posts: 109,425 | From: SJC

Originally Posted by thebig33tuna, Mar 24 2009, 10:02 AM
damn 8d ninja edits fast.
Yeah, we got that. . . I can't QC three systems at the same time.


