Microsoft & Peking U Researchers Identify ‘Knowledge Neurons’ in Pretrained Transformers, Enabling Fact Editing | Synced

Large-scale pretrained transformers learn from corpora containing oceans of factual knowledge, and are surprisingly good at recalling it without any fine-tuning. In a new paper, a team from Microsoft Research and Peking University looks inside pretrained transformers, proposing a method to identify the "knowledge neurons" responsible for storing this knowledge and showing how these neurons can be leveraged to edit, update, or even erase relational facts. — Read on syncedreview.com/2021/04/27/deepmind-podracer-tpu-based-rl-frameworks-deliver-exceptional-performance-at-low-cost-6/
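The core idea — score each feed-forward neuron by how much it contributes to a fact, then edit the top-scoring neuron — can be sketched in miniature. The following toy example is an illustration only, not the paper's code: it uses a single linear "fact" logit over hypothetical FFN activations, an integrated-gradients-style attribution (gradient integrated while scaling an activation from zero to its value), and "erases" the fact by zeroing the most responsible neuron.

```python
def fact_logit(activations, weights):
    # A "fact" logit modeled as a weighted sum of FFN neuron activations.
    return sum(a * w for a, w in zip(activations, weights))

def attribution(activations, weights, steps=20):
    # Integrated-gradients-style score per neuron: average the numerical
    # gradient of the fact logit while scaling that neuron's activation
    # from 0 up to its full value, then multiply by the activation.
    scores = []
    for i, a in enumerate(activations):
        total = 0.0
        for k in range(1, steps + 1):
            alpha = k / steps
            eps = 1e-6
            scaled = list(activations)
            scaled[i] = alpha * a
            base = fact_logit(scaled, weights)
            scaled[i] += eps
            total += (fact_logit(scaled, weights) - base) / eps
        scores.append(a * total / steps)
    return scores

# Hypothetical activations and fact weights (all values invented here).
activations = [0.1, 0.9, 0.2]
weights     = [0.2, 2.0, 0.1]

scores = attribution(activations, weights)
top = max(range(len(scores)), key=scores.__getitem__)  # the "knowledge neuron"

edited = list(activations)
edited[top] = 0.0  # suppress the neuron to "erase" the fact
```

Here neuron 1 dominates the fact logit, so it receives the highest attribution score, and zeroing it sharply lowers the model's confidence in the fact — the same intuition, at toy scale, behind the paper's editing and erasing experiments.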
