KU NMT Group

  1. SOLAR 10.7B: Scaling Large Language Models with Simple yet Effective Depth Up-Scaling
    Dahyun Kim, Chanjun Park, Sanghoon Kim, Wonsung Lee, Wonho Song, Yunsu Kim, Hyeonwoo Kim, Yungi Kim, Hyeonju Lee, Jihoo Kim, Changbae Ahn, Seonghoon Yang, Sukyung Lee, Hyunbyung Park, Gyoungjin Gim, Mikyoung Cha, Hwalsuk Lee, Sunghun Kim (Equal Contribution (First Co-Authors); Corresponding Author)
    arXiv, 2023

  2. Self-Improving Leaderboard (SIL): A Call for Real-World Centric Natural Language Processing Leaderboards
    Chanjun Park, Hyeonseok Moon, Seolhwa Lee, Jaehyung Seo, Sugyeong Eo, Heuiseok Lim
    arXiv, 2023

  3. Language Chameleon: Transformation Analysis between Languages Using Cross-lingual Post-training Based on Pre-trained Language Models
    Suhyune Son (*), Chanjun Park (*), Jungseob Lee (*), Midan Shim (*), Chanhee Lee, Yoonna Jang, Jaehyung Seo, Heuiseok Lim
    arXiv, 2022

  4. There Is No Rose without a Thorn: Finding Weaknesses on BlenderBot 2.0 in Terms of Model, Data and User-Centric Approach
    Jungseob Lee (*), Suhyune Son (*), Midan Shim (*), Yujin Kim (*), Chanjun Park (*), Heuiseok Lim
    arXiv, 2022

KU NMT Group.

School of Computer Science

College of Engineering, Korea University

© 2024 KU NMT GROUP.

Contact Us

  • Group Leader Email
    bcj1210@naver.com
  • Address
    #311 Aegineung Student Center, College of Informatics, Korea University, 145 Anam-ro, Seongbuk-gu, Seoul, 02841, Korea