
Abstracts: October 9, 2023

Abstracts is a podcast series that provides insights into the latest breakthroughs from the research community at Microsoft. In this episode, Dr. Sheng Zhang, a Senior Researcher at Microsoft Research, discusses his work on distilling large language models into smaller, more efficient models for named entity recognition (NER), a core task in natural language processing (NLP). Zhang and his coauthors present a mission-focused instruction tuning method, which their UniversalNER models use to achieve state-of-the-art NER performance. The work has the potential to make NLP capabilities more accessible, particularly in specialized domains such as biomedicine.

If you want to learn more, you can view the paper, visit the UniversalNER project website for a demo, check out the code on GitHub, or explore the dataset and models on Hugging Face through the links provided.
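
For readers who want to experiment with the released checkpoints, the sketch below shows one way to query a UniversalNER-style model for open NER using the Hugging Face transformers library. The model ID and the conversation-style prompt template are assumptions about the public release, not an official recipe; the project website and GitHub repository document the exact format.

```python
# Minimal sketch: prompting a UniversalNER-style checkpoint to extract entities
# of a given type from a passage. Model ID and prompt template are assumptions;
# check the UniversalNER project documentation for the exact format.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Universal-NER/UniversalNER-7B-type"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

text = "Mutations in BRCA1 increase the risk of breast cancer."
entity_type = "gene"

# Assumed conversation-style prompt: the model is shown the text once,
# then asked to list all spans matching the requested entity type.
prompt = (
    "A virtual assistant answers questions from a user based on the provided text.\n"
    f"USER: Text: {text}\n"
    "ASSISTANT: I've read this text.\n"
    f"USER: What describes {entity_type} in the text?\n"
    "ASSISTANT:"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)

# Decode only the newly generated tokens (the model's answer).
answer = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(answer)  # expected to be a list of extracted spans, e.g. ["BRCA1"]
```

Because the model is instruction-tuned for this single mission, the same prompt pattern can be reused with any entity type string, which is what makes the distilled model "universal" for open NER.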

Published on: October 9, 2023

Learn more
Microsoft Research Podcast

An ongoing series of conversations bringing you right up to the cutting edge of Microsoft Research.
