Peter Blake Embracing AI

The esteemed British artist Peter Blake is embracing AI and is currently hosting an exhibition in Hong Kong that features it prominently. At the remarkable age of 91, Blake has embarked on a captivating collaboration with an AI-powered robot, expressing his enthusiasm for the artistic possibilities this "kind of magic" offers.

7 November 2023

Researchers from New York University and Spain’s Pompeu Fabra University have developed a technique called Meta-learning for Compositionality (MLC), which improves the ability of neural networks, such as those behind ChatGPT, to make compositional generalizations. Compositional generalization is the capacity to learn a new concept and then use it in novel combinations with concepts already known. MLC trains neural networks to become better at this kind of generalization through practice.
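To make the idea concrete, here is a toy sketch (not from the study) of the kind of task compositional generalization involves: after learning that an invented word like "zup" means JUMP, a learner should handle "zup twice" by reusing its existing knowledge of "twice". The nonsense words and the interpreter below are assumptions for illustration only.

```python
# Toy illustration of a compositional task (not the researchers' code).
# Primitive nonsense words map to actions; modifier words transform
# an action sequence. Handling "zup twice" after only seeing "zup"
# alone is the essence of compositional generalization.

PRIMITIVES = {"zup": "JUMP", "dax": "WALK"}   # assumed nonsense vocabulary
MODIFIERS = {
    "twice": lambda actions: actions * 2,
    "thrice": lambda actions: actions * 3,
}

def interpret(phrase: str) -> list[str]:
    """Turn a phrase like 'zup twice' into a sequence of actions."""
    words = phrase.split()
    actions = [PRIMITIVES[words[0]]]
    for word in words[1:]:
        actions = MODIFIERS[word](actions)
    return actions

if __name__ == "__main__":
    print(interpret("zup twice"))    # ['JUMP', 'JUMP']
    print(interpret("dax thrice"))   # ['WALK', 'WALK', 'WALK']
```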

The researchers demonstrated that MLC outperforms existing approaches and can match or exceed human performance in making systematic generalizations. They designed MLC as a novel learning procedure in which a neural network continuously improves its skills: it is given new words and must use them compositionally, producing combinations like “jump twice” or “jump around right twice.”
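The sketch below shows, in hypothetical form, how such practice episodes could be generated: each episode samples a fresh assignment of nonsense words to actions, presents a few study examples, and asks the learner to generalize to unseen compositions. The vocabulary, the make_episode helper, and the episode format are assumptions based on the description above, not the published MLC implementation.

```python
# Minimal sketch of episode generation for meta-learning practice
# (an assumption-based illustration, not the authors' training code).
import random

NONSENSE_WORDS = ["dax", "wif", "lug", "zup"]   # assumed vocabulary
ACTIONS = ["JUMP", "WALK", "RUN", "LOOK"]
MODIFIERS = {"twice": 2, "thrice": 3}

def make_episode(seed=None):
    """Build one episode: study examples plus compositional queries."""
    rng = random.Random(seed)
    # Each episode uses a fresh word-to-action mapping, so the learner
    # must infer meanings from the study examples rather than memorize.
    mapping = dict(zip(NONSENSE_WORDS, rng.sample(ACTIONS, len(NONSENSE_WORDS))))
    study = [(word, [action]) for word, action in mapping.items()]
    queries = []
    for word, action in mapping.items():
        mod = rng.choice(list(MODIFIERS))
        queries.append((f"{word} {mod}", [action] * MODIFIERS[mod]))
    return {"study": study, "query": queries}

if __name__ == "__main__":
    episode = make_episode(seed=0)
    print(episode["study"])   # e.g. [('dax', ['RUN']), ...]
    print(episode["query"])   # e.g. [('dax twice', ['RUN', 'RUN']), ...]
```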

To test MLC’s effectiveness, human participants performed the same tasks given to MLC: they had to learn the meanings of both real and nonsensical terms and apply them in different ways. MLC performed as well as the humans, and in some cases better, and it outperformed large language models such as ChatGPT and GPT-4 on this specific learning task.

The researchers believe that MLC can further enhance the compositional skills of large language models; although these models have improved in recent years, they still struggle with compositional generalization.
