>>11304
I've come up with a better explanation of my point. When you say "AI is just statistics", it's like saying "the brain is just chemical reactions", "a poem is just a sequence of letters", or "your favorite hentai is just a sequence of 0s and 1s on disk". Formally true, but complex and meaningful things can be combinations of many trivial elements.
> I would argue the parts we would describe as "intelligence" within A.I. are superimposed by the creator of the reward function(s) evaluating and guiding the pseudo random number variations. It is not inherent to the A.I.
In the case of humans, that would mean actual intelligence is "superimposed" by evolution rewarding us for survival and reproduction, and not inherent to us.
> You just get a watered down version of the intelligence of the developer, who uses tons of data points and iterations
No. If you can come up with a formula like "let the reward function be the log-probability of predicting the next token", it doesn't mean you have "concentrated knowledge" of the model you've trained. You may argue that the data has this knowledge, but then again, that brings you back to the question of whether our knowledge is inherent to us or just derived from the experience of the universe around us.
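To make the point concrete: the objective named above really is that short. Here's a minimal sketch of summed next-token log-likelihood under a toy bigram table; the table values and function name are invented for illustration, not taken from any real model:

```python
import math

def sequence_log_likelihood(tokens, next_token_probs):
    """Sum of log P(token[i+1] | token[i]) under a toy bigram model.

    This is the 'reward' in the sense above: maximizing it means
    making the observed next tokens more probable.
    """
    total = 0.0
    for prev, nxt in zip(tokens, tokens[1:]):
        total += math.log(next_token_probs[(prev, nxt)])
    return total

# Purely illustrative conditional probabilities P(next | prev).
probs = {("the", "cat"): 0.5, ("cat", "sat"): 0.8}
ll = sequence_log_likelihood(["the", "cat", "sat"], probs)
print(ll)  # log(0.5) + log(0.8)
```

The formula fits in one line, yet says nothing about what patterns a trained model ends up encoding; that comes from the data.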
> to get a very niche and limited statistical average, which can afterwards be applied to only similar data points the model has been trained with
A model generalizes patterns it sees in training data and can handle things it hasn't seen before by extrapolating from older experience. Of course it can't deal with something entirely new, but then again, can you? I think not. You'd have to try, to fuck around and find out, to gather some experience about the new phenomenon and then act according to that experience.
> Another point I would somewhat ascribe to intelligence would be creativity; the act of creating something, that hasn't been before - not mere combining of things which already have been. I do not see our current tools reaching this point any time soon.
This raises the question of whether humans can create something new out of thin air, or whether they just incrementally build new things by combining old ones until you can't recognize the initial inspirations in the final result.
> Another big problem I see: consciousness.
Why do you think AI doesn't have it? Consciousness is unverifiable in the physical universe. I can only be sure about my own consciousness, and since I don't think I'm special, I assume other humans have it too.
What modern AI lacks, in my opinion, compared to humans is logical thinking. Which is funny, because classical AI such as theorem provers has exactly this type of intellect. Maybe the future lies in combining the two approaches.
It also has poor short-term memory, but that's probably solvable by better architectures.