Hacker News
- Jagged Flash Attention Optimization https://www.shaped.ai/blog/jagged-flash-attention-optimization 3 comments
- Show HN: Flash Attention in ~100 lines of CUDA https://github.com/tspeterkim/flash-attention-minimal 39 comments
- llm.c: multi-GPU, bfloat16, flash attention, ~7% faster than PyTorch https://twitter.com/karpathy/status/1786461447654125625 10 comments
Reddit
- [R] Jagged Flash Attention Optimization https://www.shaped.ai/blog/jagged-flash-attention-optimization 15 comments machinelearning
- [P] Flash Attention in ~100 lines of CUDA https://github.com/tspeterkim/flash-attention-minimal 2 comments machinelearning
- Attention please: If your 950XL restarts unexpectedly (with or without a green flash), shuts down, or its screen goes black, it's most likely the battery. Tested and confirmed. http://uk.businessinsider.com/apple-statement-on-iphone-shutdown-issue-2016-12?r=US&IR=T 69 comments windowsphone
- TIL when you divide a square number by its factor, the calculator flashes to show you the result. Impressed by Apple’s attention to detail every day. https://streamable.com/92krc 5 comments apple
- To understand how brains pay attention, scientists studied 15 barn owls as lights flashed on a monitor. Owls' eyes are fixed in their sockets, so they turn their heads to look, tipping off scientists. Each flash triggered a group of brain cells called IMC neurons in an evolutionarily ancient brain region. http://blogs.discovermagazine.com/d-brief/2018/10/30/barn-owls-help-scientists-unlock-how-the-brain-pays-attention/#.w-i4z3pkgmi 107 comments science
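Several of the links above (the minimal CUDA implementation and the llm.c integration) concern FlashAttention, whose core trick is computing softmax attention tile by tile with a running maximum and normalizer, so the full score matrix is never materialized. Below is a minimal NumPy sketch of that online-softmax idea, not the linked repositories' actual code; the block size, single head, and lack of masking are simplifying assumptions for illustration.

```python
import numpy as np

def naive_attention(Q, K, V):
    # Reference implementation: softmax(Q K^T / sqrt(d)) V,
    # materializing the full n-by-n score matrix.
    d = Q.shape[-1]
    S = Q @ K.T / np.sqrt(d)
    P = np.exp(S - S.max(axis=-1, keepdims=True))
    P /= P.sum(axis=-1, keepdims=True)
    return P @ V

def flash_attention(Q, K, V, block=4):
    # Tiled online-softmax attention: process K/V in blocks, keeping a
    # running row max (m), running normalizer (l), and an unnormalized
    # output accumulator (acc). Only block-sized score tiles exist at once.
    n, d = Q.shape
    scale = 1.0 / np.sqrt(d)
    m = np.full(n, -np.inf)      # running max of scores per query row
    l = np.zeros(n)              # running softmax denominator
    acc = np.zeros((n, d))       # running sum of exp(score) * V
    for j in range(0, K.shape[0], block):
        Kj, Vj = K[j:j + block], V[j:j + block]
        S = Q @ Kj.T * scale                     # scores for this key block
        m_new = np.maximum(m, S.max(axis=-1))    # updated row max
        corr = np.exp(m - m_new)                 # rescales old accumulators
        P = np.exp(S - m_new[:, None])           # unnormalized block probs
        l = l * corr + P.sum(axis=-1)
        acc = acc * corr[:, None] + P @ Vj
        m = m_new
    return acc / l[:, None]

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((8, 16)) for _ in range(3))
assert np.allclose(flash_attention(Q, K, V), naive_attention(Q, K, V))
```

The real kernels additionally tile over the query dimension, fuse everything into one CUDA kernel to keep tiles in shared memory/registers, and recompute tiles in the backward pass; this sketch only shows why the streaming softmax gives the same result as the naive computation.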