- Help with pet project to learn - Running ChatGPT-2 at home https://github.com/openai/gpt-2 2 comments learnmachinelearning
Linking pages
- Better Language Models and Their Implications https://blog.openai.com/better-language-models/#content 207 comments
- GPT-2: 1.5B Release https://openai.com/blog/gpt-2-1-5b-release/ 166 comments
- GitHub - karpathy/minGPT: A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training https://github.com/karpathy/minGPT 133 comments
- Better Language Models and Their Implications https://openai.com/blog/better-language-models/ 99 comments
- GPT-2: 6-Month Follow-Up https://openai.com/blog/gpt-2-6-month-follow-up/ 96 comments
- Building Custom Deep Learning Based OCR models https://nanonets.com/blog/attention-ocr-for-text-recogntion/ 69 comments
- Researchers, scared by their own work, hold back “deepfakes for text” AI | Ars Technica https://arstechnica.com/information-technology/2019/02/researchers-scared-by-their-own-work-hold-back-deepfakes-for-text-ai/ 50 comments
- GitHub - karpathy/minbpe: Minimal, clean, educational code for the Byte Pair Encoding (BPE) algorithm commonly used in LLM tokenization. https://github.com/karpathy/minbpe 31 comments
- How To Make Custom AI-Generated Text With GPT-2 | Max Woolf's Blog https://minimaxir.com/2019/09/howto-gpt2/ 21 comments (see the generation sketch after this list)
- GitHub - jeffbinder/promptarray: Text generator prompting with Boolean operators https://github.com/jeffbinder/promptarray 15 comments
- GitHub - certik/fastGPT: Fast GPT-2 inference written in Fortran https://github.com/certik/fastGPT 13 comments
- GitHub - rish-16/gpt2client: ✍🏻 gpt2-client: Easy-to-use TensorFlow Wrapper for GPT-2 117M, 345M, 774M, and 1.5B Transformer Models 🤖 📝 https://github.com/rish-16/gpt2client 10 comments
- The Illustrated GPT-2 (Visualizing Transformer Language Models) – Jay Alammar – Visualizing machine learning one concept at a time. http://jalammar.github.io/illustrated-gpt2/ 8 comments
- GitHub - maziarraissi/Applied-Deep-Learning: Applied Deep Learning Course https://github.com/maziarraissi/Applied-Deep-Learning 6 comments
- GitHub - taishi-i/awesome-ChatGPT-repositories: A curated list of resources dedicated to open source GitHub repositories related to ChatGPT https://github.com/taishi-i/awesome-ChatGPT-repositories 5 comments
- GitHub - microsoft/DialoGPT: Large-scale pretraining for dialogue https://github.com/microsoft/DialoGPT 4 comments
- GitHub - alfianlosari/GPTEncoder: Swift BPE Encoder/Decoder for OpenAI GPT Models. A programmatic interface for tokenizing text for OpenAI ChatGPT API. https://github.com/alfianlosari/GPTEncoder 3 comments
- Verifiable Inference - by Bidhan Roy and Marcos Villagra https://blog.bagel.net/p/the-inference-interference 3 comments
- GitHub - daturkel/learning-papers: Landmark Papers in Machine Learning https://github.com/daturkel/learning-papers 1 comment
- GitHub - galois-autocompleter/galois-autocompleter: Galois is an auto code completer for code editors (or any text editor) based on OpenAI GPT-2. https://github.com/iedmrc/galois-autocompleter 0 comments
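Several of the entries above (the Max Woolf how-to, minGPT, and the gpt2client wrapper) revolve around the same basic workflow: load a pretrained GPT-2 checkpoint, tokenize a prompt with its BPE tokenizer, and sample a continuation. Below is a minimal sketch of that loop using the Hugging Face `transformers` library; this is an assumption for illustration, since the openai/gpt-2 repository itself ships TensorFlow 1.x scripts and the linked guides use their own tooling.

```python
# Minimal sketch: sample text from the smallest pretrained GPT-2 checkpoint.
# Assumes the Hugging Face `transformers` package (pip install transformers torch);
# the openai/gpt-2 repo and the linked how-tos use different code paths.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")   # BPE tokenizer for GPT-2
model = GPT2LMHeadModel.from_pretrained("gpt2")     # smallest released model
model.eval()

prompt = "Language models are unsupervised multitask learners because"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Top-k / nucleus sampling, roughly the settings common in GPT-2 demos.
output_ids = model.generate(
    input_ids,
    max_length=80,
    do_sample=True,
    top_k=40,
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The tokenizer step is the same Byte Pair Encoding scheme that the minbpe and GPTEncoder entries above implement from scratch; swapping `"gpt2"` for a larger checkpoint name (e.g. the 1.5B release) only changes the download size and memory footprint, not the code.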
Linked pages
- Better Language Models and Their Implications https://blog.openai.com/better-language-models/#content 207 comments
- GPT-2: 1.5B Release https://openai.com/blog/gpt-2-1-5b-release/ 166 comments
- GPT-2: 6-Month Follow-Up https://openai.com/blog/gpt-2-6-month-follow-up/ 96 comments
- https://d4mucfpksywv.cloudfront.net/better-language-models/language-models.pdf 0 comments