ICDL-22 Tutorial
14:00-18:00, London Time, September 19, 2022

Conscious Learning by Developmental Networks:

Vision, Audition, Natural Languages, Planning and Thinking

Juyang Weng1, 2

1Brain-Mind Institute

2GENISAMA
Okemos, MI 48864

http://cse.msu.edu/~weng/

https://youtu.be/K4bqjEsWqgQ

 

Keywords: machine learning, strong AI, consciousness, brain models, neural networks, robotics, vision, audition, natural language, APFGP (Autonomous Programming for General Purposes), planning, machine thinking

 

Duration: half day (4 hours)

Goal of the Tutorial:

Autonomous development needs a general-purpose theory, and experimental studies require one.  Toward general purposes, consciousness seems to be not a wishful add-on to intelligence but a necessary condition for acquiring it.  Unfortunately, consciousness has been largely overlooked or dodged in AI research, and this has left a major weakness in many neural networks for developmental AI.  Without partial consciousness on the fly, a learner, whether an infant or an adult, cannot generate the context and intents required to process each current sensory and hidden input.  This tutorial teaches basic knowledge about biologically inspired neural networks that enable on-the-fly learning for the three bottleneck problems in AI (vision, audition, and natural language), plus subjects that have been extremely challenging for neural networks but are necessary, such as planning and machine thinking.  All of these subjects are essential for conscious learning.  More up-to-date detail on conscious learning is available at
https://doi.org/10.21203/rs.3.rs-1700782/v2

Tutorial outline:

This tutorial first briefly explains what a Turing machine is, what a UTM is, why a UTM is a general-purpose computer, and why both Turing machines and UTMs are symbolic and handcrafted for a specific task.  In contrast, a developmental AI system must program itself throughout its lifetime instead of being programmed for a specific task.  The Developmental Network (DN) by Weng et al. is a new kind of neural network that avoids the controversial Post-Selection, the selection of networks after they have been trained.  A DN learns to become a general-purpose computer by learning an emergent UTM directly from the physical world, as a human child does.  Because of this fundamental capability, a UTM inside a DN emerges autonomously on the fly, realizing APFGP (Autonomous Programming for General Purposes), 3D-to-2D-to-3D conscious learning, and machine thinking.  3D-to-2D-to-3D means from the 3D world, to 2D images and 2D muscle actions, and back to the 3D world.  The three well-known bottleneck problems in AI (vision, audition, and natural language understanding) are all naturally dealt with in the DN experiments to be presented in the tutorial, along with planning and machine thinking.   Consciousness is a summation of all such skills and is necessary to acquire intelligence.
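
As a concrete aside (not part of the tutorial materials), the minimal Python sketch below hand-codes a tiny Turing machine for one specific task, incrementing a binary number.  The machine, its state names, and the table DELTA are illustrative assumptions made here, not taken from the tutorial or from the DN papers.  The sketch shows that the control of a handcrafted Turing machine is nothing more than a finite automaton: a static lookup table mapping (state, symbol) to (symbol to write, head move, next state), with the whole program symbolic and task-specific.

# A hand-coded Turing machine for one specific task: incrementing a binary number.
# The "control" is just a finite automaton: a static table mapping
# (state, symbol) -> (symbol to write, head move, next state).
# Hypothetical example machine; states and transitions are illustrative only.
DELTA = {
    # State 'right': scan to the right end of the input.
    ('right', '0'): ('0', +1, 'right'),
    ('right', '1'): ('1', +1, 'right'),
    ('right', '_'): ('_', -1, 'carry'),
    # State 'carry': add 1 at the least significant bit, propagating carries.
    ('carry', '1'): ('0', -1, 'carry'),
    ('carry', '0'): ('1', -1, 'done'),
    ('carry', '_'): ('1', -1, 'done'),
}

def run(tape_str, state='right', blank='_', max_steps=10_000):
    """Simulate the machine on a tape given as a string; return the final tape."""
    tape = dict(enumerate(tape_str))          # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in DELTA:      # no applicable rule: halt
            break
        write, move, state = DELTA[(state, symbol)]
        tape[head] = write
        head += move
    lo, hi = min(tape), max(tape)
    return ''.join(tape.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

print(run('1011'))   # -> '1100' (11 + 1 = 12 in binary)

Changing DELTA yields a different special-purpose machine, but the controller is always written down in advance as a symbolic table; the tutorial's claim, by contrast, is that in a DN the controller is not handcrafted but emerges from experience on the fly.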

History:
Progressive versions of this tutorial were given at IJCNN 2017, IJCNN 2020, ICDL 2020, MFI 2021, and ICCE 2022.   The IJCNN 2017 tutorial was given in person to a fully packed room; with no seats left, many participants stood along the side and back walls.

Prerequisite knowledge: general knowledge about AI and machine learning.

Target audience and interested groups: professors, industrial researchers, practitioners, postdoctoral researchers, graduate students, AI writers, news reporters, government AI policy makers, AI philosophers, and AI fans.

Contents:

1. Autonomous Development by Robots and Animals

2. Turing machines as special-purpose machines

3. Variations of Turing machines

4. Universal Turing machines as general-purpose machines

5. The control of any Turing machine is a finite automaton (new!)

6. Developmental Networks

7. Theorems of Developmental Networks: optimal with limited resources

8. How universal Turing machines emerge inside a DN

9. Vision

10. Audition

11. Natural language understanding

12. Autonomous Programming for General Purposes (APFGP)

13. Conscious AI and 3D-to-2D-to-3D conscious machine learning

Short Biography of the Presenter:

Juyang Weng: President of the Brain-Mind Institute and the startup GENISAMA, and a retired professor of the Department of Computer Science and Engineering, the Cognitive Science Program, and the Neuroscience Program at Michigan State University, East Lansing, Michigan, USA. He was also a visiting professor at Fudan University, Shanghai, China, from 2003 to 2014. He received his BS degree from Fudan University in 1982 and his MS and PhD degrees from the University of Illinois at Urbana-Champaign in 1985 and 1989, respectively, all in computer science.  From August 2006 to May 2007, he was a visiting professor at the Department of Brain and Cognitive Sciences at MIT.   His research interests include computational biology, computational neuroscience, computational developmental psychology, biologically inspired systems, computer vision, audition, touch, behaviors, and intelligent robots.  He is the author or coauthor of over 250 research articles.  He is an editor-in-chief of the International Journal of Humanoid Robotics and an associate editor of the IEEE Transactions on Autonomous Mental Development. He has chaired and co-chaired several conferences, including the NSF/DARPA-funded Workshop on Development and Learning 2000 (1st ICDL), the 2nd ICDL (2002), the 7th ICDL (2008), the 8th ICDL (2009), and INNS NNN 2008. He was the founding Chairman of the Governing Board of the International Conferences on Development and Learning (ICDLs) (2005-2007), the founding chairman of the Autonomous Mental Development Technical Committee of the IEEE Computational Intelligence Society (2004-2005), an associate editor of the IEEE Transactions on Pattern Analysis and Machine Intelligence, and an associate editor of the IEEE Transactions on Image Processing.  He was the General Chair of the AIML Contest 2016 and taught BMI 831, BMI 861, and BMI 871, which prepared contestants for the AIML Contest session at IJCNN 2017 in Alaska.  The AIML Contests have run annually since 2016.  He is a Fellow of the IEEE.  Web: http://www.cse.msu.edu/~weng/