We are often going for mimicry, which is worrying. Turing tests are deeply flawed to begin with, and yet Turing's ideas are considered great; what does this tell you about our own abstractions? That they are insufficient for building an AI system and predicting its behavior at the same time. So we will keep refining mimicry and building specialized AIs for particular jobs, but artificial general intelligence (AGI) won't come until we figure out how our brains work. Consider the Gödelian argument against Turing-machine AGI and you will understand why there is no real AI yet.

It seems the most profitable approach would be to fund small groups of geniuses to work independently, unaware of each other and even of most mainstream CS theory, until a breakthrough appears. Otherwise a circlejerk starts: side projects multiply, and discussion and rhetoric take the place of thinking and writing. Consider that by simply writing a lot of interesting material one can get socially rewarded, and so people will often prefer writing interesting things over making real progress toward AGI.