Peer-Reviewed

Communication Between Neural Networks, and Beginning of Language

Received: 8 December 2021    Accepted: 21 December 2021    Published: 31 December 2021
Abstract

There is a view that the neural circuits of the brain are composed of combinations of identical modules that serve as basic functional units. Following that view, the author has previously presented a module (the Basic Unit) that performs parallel-serial mutual conversion, and has shown that a neural network that recognizes and generates arbitrary time-series data can be constructed by combining such modules. In Chapter 2, a neural network with the functions of federated learning and imitation, which enable the collective behavior of animals, is presented, adding a concrete circuit configuration to previously published papers. In Chapter 3, after a consideration of the fundamental role of language, a neural network is presented that has the same basic structure, is connected above the network of Chapter 2, and has functions closely related to language. This new network consists of a pair of neural networks that handle language and images, respectively. Each activated part is expressed using concepts from category theory: the entity of a category is a set of interconnected Basic Units together with the changes of their states. Activated categories are tied to the corresponding activated parts of the paired network, and conversion between the two is performed. The states of the Basic Units may be triggered by the sensory organs, but they evolve independently of the actuating behavior of the conventional neural networks connected below them. Humans can generate images of events that may have occurred in the past or may occur in the future, even when those events are not directly related to the situation before their eyes, and can share these images through dialogue. Dialogue consists of time-series data in response formats such as question and negation. The newly added neural network helps generate this shared recognition.
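
The abstract's core mechanism is a module that converts in both directions between a serial stream of symbols and a parallel activation pattern, and that can be combined to recognize and generate time series. The following is a minimal sketch of that idea only, under stated assumptions: the class name BasicUnit's internals here (a fixed-width window buffer over integer symbols) are illustrative assumptions, not the circuit published in the paper.

    # A minimal sketch, assuming the Basic Unit can be modeled as a fixed-width
    # window buffer over discrete symbols. The buffering scheme and the integer
    # alphabet are illustrative assumptions, not the published circuit.

    from typing import List


    class BasicUnit:
        """Toy module performing parallel-serial mutual conversion."""

        def __init__(self, window: int):
            self.window = window        # number of serial steps held as one pattern
            self.state: List[int] = []  # current parallel activation pattern

        def serial_to_parallel(self, stream: List[int]) -> List[int]:
            # Recognition direction: fold the most recent `window` serial inputs
            # into a single parallel pattern (a stand-in for short-term memory).
            self.state = stream[-self.window:]
            return self.state

        def parallel_to_serial(self) -> List[int]:
            # Generation direction: replay the stored parallel pattern as a
            # time series, step by step.
            return list(self.state)


    if __name__ == "__main__":
        unit = BasicUnit(window=3)
        pattern = unit.serial_to_parallel([1, 0, 1, 1])  # -> [0, 1, 1]
        series = unit.parallel_to_serial()               # regenerated sequence
        print(pattern, series)

Chaining such units, with each unit's parallel output feeding the serial input of the unit above it, is one way to read the abstract's claim that arbitrary time series can be recognized and generated by combining modules.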

Published in American Journal of Neural Networks and Applications (Volume 7, Issue 2)
DOI 10.11648/j.ajnna.20210702.13
Page(s) 38-44
Creative Commons

This is an Open Access article, distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution and reproduction in any medium or format, provided the original work is properly cited.

Copyright

Copyright © The Author(s), 2021. Published by Science Publishing Group

Keywords

Short-Term Memory, Long-Term Memory, Serial Parallel Conversion, Parallel Serial Conversion, Mirror Neuron, Prediction, Category Theory, Federated Learning

Cite This Article
  • APA Style

    Yanagawa, S. (2021). Communication Between Neural Networks, and Beginning of Language. American Journal of Neural Networks and Applications, 7(2), 38-44. https://doi.org/10.11648/j.ajnna.20210702.13


  • ACS Style

    Yanagawa, S. Communication Between Neural Networks, and Beginning of Language. Am. J. Neural Netw. Appl. 2021, 7(2), 38-44. doi: 10.11648/j.ajnna.20210702.13


  • AMA Style

    Yanagawa S. Communication Between Neural Networks, and Beginning of Language. Am J Neural Netw Appl. 2021;7(2):38-44. doi: 10.11648/j.ajnna.20210702.13


  • @article{10.11648/j.ajnna.20210702.13,
      author = {Seisuke Yanagawa},
      title = {Communication Between Neural Networks, and Beginning of Language},
      journal = {American Journal of Neural Networks and Applications},
      volume = {7},
      number = {2},
      pages = {38-44},
      doi = {10.11648/j.ajnna.20210702.13},
      url = {https://doi.org/10.11648/j.ajnna.20210702.13},
      eprint = {https://article.sciencepublishinggroup.com/pdf/10.11648.j.ajnna.20210702.13},
      year = {2021}
    }
    


  • TY  - JOUR
    T1  - Communication Between Neural Networks, and Beginning of Language
    AU  - Seisuke Yanagawa
    Y1  - 2021/12/31
    PY  - 2021
    N1  - https://doi.org/10.11648/j.ajnna.20210702.13
    DO  - 10.11648/j.ajnna.20210702.13
    T2  - American Journal of Neural Networks and Applications
    JF  - American Journal of Neural Networks and Applications
    JO  - American Journal of Neural Networks and Applications
    SP  - 38
    EP  - 44
    PB  - Science Publishing Group
    SN  - 2469-7419
    UR  - https://doi.org/10.11648/j.ajnna.20210702.13
    VL  - 7
    IS  - 2
    ER  - 


Author Information
  • Seisuke Yanagawa, OptID, Machida, Tokyo, Japan
