Seminar

12/02 2020
  • Title (Seminar): Can physics contribute to natural language processing?
  • Speaker: 王本友
  • Date: December 2, 2020 (Wednesday), 10:00-11:00 am
  • Venue: ITP South Building 6420
  • Abstract:

    CAS Key Laboratory of Theoretical Physics

    Institute of Theoretical Physics

    Chinese Academy of Sciences

     Seminar

    Title

    Can physics contribute to natural language processing?

    Speaker

    王本友

    Affiliation

    PhD student at the University of Padova, Italy, and EU Marie Curie Fellow.

    He received his master's degree from Tianjin University and has been a visiting researcher at the University of Copenhagen (Denmark), the University of Montreal (Canada), the University of Amsterdam (the Netherlands), and Huawei Noah's Ark Lab. He has been invited many times to give talks at research institutes and companies including MILA, Toutiao, Tencent, and Huawei. On the industrial side, he worked full-time at Tencent starting in 2017, where, as a lead algorithm designer, he built a robust intelligent customer-service system on Tencent Cloud from scratch, serving major clients such as the Bank of China and the Yunnan Provincial Tourism Bureau; a book co-authored with Tencent colleagues, 《推荐系统与深度学习》 (Recommender Systems and Deep Learning), was published by Tsinghua University Press. In his relatively short academic career, he has focused on building more robust and intelligent natural language processing systems that balance technical soundness with linguistic motivation. To date, he and his collaborators have received the Best Paper Honorable Mention at SIGIR 2017, a top international conference in information retrieval, and the Best Explainable NLP Paper award at NAACL 2019, a top international conference in natural language processing, and have published more than 20 papers at top international venues including ICLR, SIGIR, WWW, NAACL, AAAI, IJCAI, and CIKM.

    Date

    December 2, 2020 (Wednesday), 10:00-11:00 am

    Venue

    ITP South Building 6420

    Contact Person

    张潘

    Abstract

    The physics community has spent decades developing methods for processing data in high-dimensional spaces, and quantum computing also shows great potential to accelerate computation. Recently, natural language processing has been dramatically improved by large-scale pretrained language models (huge neural networks such as BERT and GPT) trained on massive, cheap, unstructured corpora. Neural networks that handle such large amounts of high-dimensional data may therefore benefit from principles and tools developed in the physics community. How physics can contribute to NLP (or vice versa) in the context of large-scale pretrained language models is worth investigating. This talk will discuss several aspects of natural language processing, e.g., efficiency, effectiveness, and interpretability, that could be a natural playground for well-designed tools invented by the physics community, such as tensor networks.
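    As a toy illustration of the idea sketched in the abstract (not taken from the talk itself), the simplest relative of a tensor-network compression is a truncated SVD of a model's embedding matrix: the low-rank factors store far fewer parameters than the original dense table.

    ```python
    import numpy as np

    # Hypothetical example: compress a random "embedding table" with a
    # truncated SVD, a rank-r matrix factorization (the simplest cousin
    # of the tensor network decompositions mentioned in the abstract).
    rng = np.random.default_rng(0)
    vocab, dim, rank = 1000, 128, 16

    E = rng.standard_normal((vocab, dim))          # dense embedding table
    U, s, Vt = np.linalg.svd(E, full_matrices=False)
    E_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]   # rank-16 approximation of E

    orig_params = vocab * dim                      # parameters in the dense table
    low_params = rank * (vocab + dim)              # parameters in the two factors
    print(orig_params, low_params)                 # 128000 18048
    ```

    The compressed form trades a small reconstruction error for roughly a 7x reduction in parameters here; tensor-train and related tensor-network formats apply the same principle recursively to higher-order tensors.
    
    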