Paper Reviews
Dense Passage Retrieval for Open-Domain Question Answering
This post summarizes and organizes 'Dense Passage Retrieval for Open-Domain Question Answering', published in 2020 and available on arXiv. Paper: Dense Passage Retrieval for Open-Domain Question Answering — Open-domain question answering relies on efficient passage retrieval to select candidate contexts, where traditional sparse vector space models, such as TF-IDF or BM25, are the de facto method. In this work, we show that..
GPT Understands, Too
This post summarizes and organizes 'GPT Understands, Too', published in 2021 and available on arXiv. Paper: GPT Understands, Too — While GPTs with traditional fine-tuning fail to achieve strong results on natural language understanding (NLU), we show that GPTs can be better than or comparable to similar-sized BERTs on NLU tasks with a novel method P-tuning -- which employs trainable c.. Code (GitHub): THUDM/P..
Pitfalls in the Evaluation of Sentence Embeddings
This post summarizes and organizes 'Pitfalls in the Evaluation of Sentence Embeddings' from ACL 2019. Pitfalls in the Evaluation of Sentence Embeddings — Deep learning models continuously break new records across different NLP tasks. At the same time, their success exposes weaknesses of model evaluation. Here, we compile several key pitfalls of evaluation of sentence embeddings, a currently very popular NLP..
A Survey of Transformers - 中
This post summarizes and organizes 'A Survey of Transformers', uploaded to arXiv in January 2021. It continues from the previous post (A Survey of Transformers - 上). A Survey of Transformers — Transformers have achieved great success in many artificial intelligence fields, such as natural language processing, computer vision, and audio processing. Therefore, it is natural to attract lots of interest from academic and industry researchers..
A Survey of Transformers - 上
This post summarizes and organizes 'A Survey of Transformers', uploaded to arXiv in January 2021. A Survey of Transformers — Transformers have achieved great success in many artificial intelligence fields, such as natural language processing, computer vision, and audio processing. Therefore, it is natural to attract lots of interest from academic and industry researchers. Up to.. Transformers, used across many AI fields, are a..