2009.07.11

computer says 42

recently i was reminded of a theory i heard during my studies - that intelligence is unable to create something more sophisticated than itself. this would mean that we cannot make beings (i.e. thinking software) more intelligent than we are ourselves.

sounds interesting, but what is the reasoning behind it anyway? i mean: why would such a limitation exist? over the centuries, especially the last ~200 years, we've developed many machines that allow us to do things better/faster/cheaper/safer/at all. can you imagine yourself building a semiconductor-based microprocessor with your bare hands? i doubt it. but you can imagine a machine that is able to do so using precisely controlled physical/chemical processes. if you can read this, you're using the product of exactly such processes on your own desk at this very moment.

continuing this train of thought - why shouldn't we be able to create artificial intelligence greater than ours? i don't believe it is impossible. if we haven't achieved it yet, perhaps we're doing it all wrong?

at the present moment the main focus is on creating exact solutions for particular problems. most AI you can see in real life lacks a very basic aspect of what we usually call “intelligence” – learning while “living” in an environment. notice that the solutions deployed in real life are dedicated to a very limited set of problems. one example is digit recognition that will fail to recognize the pattern if the font of the digits is dramatically changed. often there is even no feedback loop within the system to show that it works wrong! (see the sketch below.)
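to make this concrete, here is a minimal sketch (hypothetical code and data, not taken from any real system): a template-matching digit recognizer that has simply memorized 3x3 bitmaps of a few digits in one font. draw the same digit in a different font and it misclassifies - and since nothing feeds the error back into the system, it never learns it was wrong.

  # hypothetical example: a "digit recognizer" frozen at write time
  TRAIN_FONT = {  # 3x3 bitmaps, row by row, in the one font it knows
      "0": (1, 1, 1,
            1, 0, 1,
            1, 1, 1),
      "1": (0, 1, 0,
            0, 1, 0,
            0, 1, 0),
      "7": (1, 1, 1,
            0, 0, 1,
            0, 0, 1),
  }

  def recognize(bitmap):
      # pick the memorized digit whose bitmap differs in the fewest pixels
      return min(TRAIN_FONT,
                 key=lambda d: sum(a != b for a, b in zip(TRAIN_FONT[d], bitmap)))

  # the digit "1" drawn in a different font, with serifs at top and bottom
  serif_one = (1, 1, 1,
               0, 1, 0,
               1, 1, 1)

  print(recognize(serif_one))  # prints "0" - wrong, and no feedback loop
                               # ever tells the system it made a mistake

the point of the sketch: all the “knowledge” is fixed when the code is written, and the system never adapts to what it actually encounters in its environment.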

the problem surely lies within ourselves, but the issue is not the impossibility of achieving a “real AI level” greater than ours - maybe it's just that our usual approach is not generic or forward-looking enough? think big!
