
Sentence Generation with GPT-2 and BERT

Collection: https://github.com/teacherpeterpan/Question-Generation-Paper-List

Source: https://minimaxir.com/2019/09/howto-gpt2/ (see the sketch after this list)

Video: https://www.youtube.com/watch?v=ldxclhDwVw4

Paper: https://arxiv.org/pdf/1911.02365.pdf

More: http://jalammar.github.io/illustrated-gpt2/

More: https://github.com/minimaxir/gpt-2-simple

More: https://openai.com/blog/gpt-2-1-5b-release/

More: https://medium.com/analytics-vidhya/understanding-the-gpt-2-source-code-part-1-4481328ee10b

More: https://www.topbots.com/generalized-language-models-bert-openai-gpt2/

More: https://medium.com/@Moscow25/the-best-deep-natural-language-papers-you-should-read-bert-gpt-2-and-looking-forward-1647f4438797
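
The minimaxir how-to listed as the Source above walks through this generation workflow end to end, using the gpt-2-simple package that also appears in the list. As a rough orientation, a minimal sketch of that pattern might look like the following (the model size, prompt, and sampling values are illustrative choices, not figures from the how-to):

# Minimal text-generation sketch with gpt-2-simple,
# following the pattern shown in its README (linked above).
# pip install gpt-2-simple   (requires TensorFlow 1.x)
import gpt_2_simple as gpt2

model_name = "124M"  # smallest released GPT-2 checkpoint; illustrative choice

# Download the pretrained weights once; they are cached under ./models/124M.
gpt2.download_gpt2(model_name=model_name)

# Start a TensorFlow session and load the pretrained model into it.
sess = gpt2.start_tf_sess()
gpt2.load_gpt2(sess, model_name=model_name)

# Print a sampled continuation of the (illustrative) prompt.
gpt2.generate(sess,
              model_name=model_name,
              prefix="The secret to generating fluent sentences is",
              length=100,        # number of tokens to generate
              temperature=0.7,   # lower = more conservative sampling
              nsamples=1)

Lower temperature makes the samples more repetitive but more coherent; the how-to also covers fine-tuning on your own corpus with gpt2.finetune before generating.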

 

Transformer: https://arxiv.org/abs/1706.03762

 

GPT paper (PDF): https://s3-us-west-2.amazonaws.com/openai-assets/research-covers/language-unsupervised/language_understanding_paper.pdf

 

BERT: https://arxiv.org/abs/1810.04805

 

T5: https://arxiv.org/abs/1910.10683

 

Source code: https://github.com/paul-hyun/transformer-evolution

 

GPT: https://medium.com/huggingface/encoder-decoders-in-transformers-a-hybrid-pre-trained-architecture-for-seq2seq-af4d7bf14bb8
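
The Hugging Face post above discusses how decoder models such as GPT-2 fit into pretrained encoder-decoder (seq2seq) architectures. For comparison with the gpt-2-simple sketch earlier, here is a minimal sketch of the same kind of sampling through the transformers library (the "gpt2" checkpoint name, prompt, and sampling parameters are generic illustrative choices, not taken from the post):

# Minimal GPT-2 sampling sketch with Hugging Face transformers.
# pip install transformers torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # 124M-parameter checkpoint
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Encode an illustrative prompt and sample a continuation.
input_ids = tokenizer.encode("BERT reads text bidirectionally, while GPT-2",
                             return_tensors="pt")
output_ids = model.generate(
    input_ids,
    max_length=60,                        # prompt + generated tokens
    do_sample=True,                       # sample instead of greedy decoding
    top_k=50,                             # keep only the 50 likeliest next tokens
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))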
