The Philosophical Foundations of Growing AI Like A Child

15 February 2025
Dezhi Luo
Yijiang Li
Hokin Deng
Abstract

Despite excelling at high-level reasoning, current language models lack robustness in real-world scenarios and perform poorly on fundamental problem-solving tasks that are intuitive to humans. This paper argues that both challenges stem from a core discrepancy between human and machine cognitive development. While both systems rely on increasing representational power, the absence of core knowledge (the foundational cognitive structures present in humans) prevents language models from developing robust, generalizable abilities in which complex skills are grounded in simpler ones within their respective domains. The paper explores empirical evidence of core knowledge in humans, analyzes why language models fail to acquire it, and argues that this limitation is not an inherent architectural constraint. Finally, it outlines a workable proposal for systematically integrating core knowledge into future multi-modal language models through the large-scale generation of synthetic training data using a cognitive prototyping strategy.
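
To make the closing proposal concrete, the sketch below (in Python) shows one way a "cognitive prototype" could be templated into many synthetic training examples. It is not from the paper: the choice of concept (object permanence), the scene template, and every name in the code (OBJECTS, CONTAINERS, object_permanence_example, generate) are illustrative assumptions, not the authors' method.

# Hypothetical sketch: templating one core-knowledge concept (object
# permanence) into synthetic text QA examples. All names and parameters
# here are assumptions made for illustration, not drawn from the paper.
import json
import random

OBJECTS = ["ball", "cup", "key", "toy car"]
CONTAINERS = ["box", "drawer", "basket", "bag"]

def object_permanence_example(rng: random.Random) -> dict:
    """Instantiate the object-permanence prototype as one prompt/answer pair."""
    obj = rng.choice(OBJECTS)
    hide, move = rng.sample(CONTAINERS, 2)
    moved = rng.random() < 0.5  # sometimes the object is silently relocated
    scene = (
        f"A {obj} is placed inside the {hide}. The {hide} is closed. "
        + (f"While no one is watching, the {obj} is moved to the {move}. "
           if moved else "")
        + f"Where is the {obj} now?"
    )
    return {
        "concept": "object_permanence",
        "prompt": scene,
        "answer": f"in the {move}" if moved else f"in the {hide}",
    }

def generate(n: int, seed: int = 0) -> list[dict]:
    """Generate n synthetic examples from the prototype above."""
    rng = random.Random(seed)
    return [object_permanence_example(rng) for _ in range(n)]

if __name__ == "__main__":
    for row in generate(3):
        print(json.dumps(row))

Run directly, this prints a few JSON lines. In the spirit of the proposal, many such prototypes spanning different core-knowledge domains would be instantiated at scale and mixed into multi-modal training data.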

@article{luo2025_2502.10742,
  title={The Philosophical Foundations of Growing AI Like A Child},
  author={Dezhi Luo and Yijiang Li and Hokin Deng},
  journal={arXiv preprint arXiv:2502.10742},
  year={2025}
}