As reported by IT House on December 31, the 37th Annual Conference on Neural Information Processing Systems (NeurIPS) was recently held in New Orleans, USA. At the conference, a research team proposed that the Transformer model functions in a way analogous to the brain's NMDA receptor. Building on this observation, they also designed new nonlinear functions that allow the Transformer to mimic the receptor's dynamics.

The NMDA receptor is a "smart" ion channel that supports the formation of memories and skills. When glutamate is present in the brain, nerve cells become excited and deliver neurotransmitter to NMDA receptors. Magnesium ions act as gatekeepers: only when a magnesium ion is expelled from the receptor can the signal pass on to the next nerve cell. In the hippocampus, this selective gating by magnesium ions is what allows short-term memory to be consolidated into long-term memory.

In the new study, the scientists found that the Transformer contains a process analogous to this magnesium-ion gating, and that tuning the corresponding parameters of the model can enhance its memory capacity. The finding has prompted the researchers to ask whether, by studying the mechanism of magnesium-ion gating in the NMDA receptor, it might be possible to regulate the memory-consolidation process of a Transformer model and endow it with systematic long-term memory, or even consciousness.
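The report does not give the exact form of the team's NMDA-inspired nonlinearity. As a rough illustration of the idea, though, one can sketch an activation function in which a sigmoid gate plays the role of the magnesium ion, with a tunable threshold standing in for the "gatekeeping" parameter. The function `nmda_activation` and its parameters `alpha` and `beta` below are illustrative assumptions, not the paper's actual definition:

```python
import numpy as np

def nmda_activation(x, alpha=0.1, beta=1.0):
    """Illustrative NMDA-style nonlinearity (a sketch, not the paper's formula).

    A sigmoid gate mimics the magnesium ion: it stays near 0 for weak
    inputs (channel blocked) and approaches 1 for strong inputs
    (magnesium expelled, signal passes). `alpha` shifts the gating
    threshold; `beta` controls how sharply the gate opens.
    """
    gate = 1.0 / (1.0 + np.exp(-beta * (x - alpha)))
    return x * gate

# A weak input is mostly blocked; a strong input passes almost unchanged.
blocked = nmda_activation(-5.0)
passed = nmda_activation(10.0)
```

In this sketch, adjusting `alpha` changes how easily signals get through the gate, which is the kind of parameter tuning the researchers suggest could influence a model's memory behavior.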