Hacker News
- Nvidia Deep Learning Accelerator (NVDLA): free open inference accelerator (2017) http://nvdla.org 25 comments
- SparseDNN: Fast Sparse Deep Learning Inference on CPUs (Stanford) https://arxiv.org/abs/2101.07948 2 comments
- Deep Dive: Nvidia Inference Research Chip Scales to 32 Chiplets https://www.tomshardware.com/news/nvidia-msm-inference-chip,39780.html 5 comments
- Show HN: Try running deep learning inference on Raspberry Pi https://actcast.io 11 comments
- Nvidia Deep Learning Accelerator (NVDLA): a free and open inference accelerator http://nvdla.org/ 4 comments
- Baidu Deep Voice Explained: Part 1 – the Inference Pipeline https://medium.com/athelas/paper-1-baidus-deep-voice-675a323705df#.lbmwd3u9t 8 comments
- Baidu Deep Voice Explained: Part 1 – the Inference Pipeline https://medium.com/athelas/paper-1-baidus-deep-voice-675a323705df#.7obi5arpd 3 comments
- Show HN: Neuropod – Uber ATG's open source deep learning inference engine https://github.com/uber/neuropod 25 comments
- NovuMind is developing a deep learning chip to “do inference efficiently” http://www.eetimes.com/document.asp?doc_id=1332226 19 comments
- Amazon Elastic Inference – GPU-Powered Deep Learning Inference Acceleration https://aws.amazon.com/blogs/aws/amazon-elastic-inference-gpu-powered-deep-learning-inference-acceleration/ 3 comments
- The do's and don'ts regarding Swift compiler performance and type inference. I took a deep dive into compiler performance analyzing all kinds of type inference scenarios and I was pretty surprised by some results! 🤯 https://lucasvandongen.dev/compiler_performance.php 10 comments swift
- The do's and don'ts regarding Swift compiler performance and type inference. I took a deep dive into compiler performance analyzing all kinds of type inference scenarios and I was pretty surprised by some results! 🤯 https://lucasvandongen.dev/compiler_performance.php 4 comments iosprogramming
- Evaluating Deep Learning Techniques for Natural Language Inference https://www.mdpi.com/2076-3417/13/4/2577 2 comments science
- Meta AI Open Sources AITemplate (AIT), A Python Framework That Transforms Deep Neural Networks Into C++ Code To Accelerate Inference Services https://www.marktechpost.com/2022/10/09/meta-ai-open-sources-aitemplate-ait-a-python-framework-that-transforms-deep-neural-networks-into-c-code-to-accelerate-inference-services/ 2 comments machinelearningnews
- AITemplate, Meta's new GPU inference system for deep learning https://github.com/facebookincubator/AITemplate 2 comments programming
- Processing a batch of requests for deep learning inference on a rust server https://www.reddit.com/r/rust/comments/m1083p/processing_a_batch_of_requests_for_deep_learning/ 4 comments rust
- Accurate deep neural network inference using computational phase-change memory https://www.nature.com/articles/s41467-020-16108-9 7 comments science
- Deep Dive: Nvidia Inference Research Chip Scales to 32 Chiplets https://www.tomshardware.com/news/nvidia-msm-inference-chip,39780.html 3 comments hardware
- Deep dive into Convolutional Neural Network inference in C https://github.com/canyalniz/CNN-Inference-Didactic 10 comments learnmachinelearning
- Demo: Accelerate Deep Learning Inference on Raspberry Pi (2018 ver.) https://www.youtube.com/watch?v=DyR183n0ZXA 17 comments raspberry_pi
- Demo: Accelerate Deep Learning Inference on Raspberry Pi (2018 ver.) https://www.youtube.com/watch?v=DyR183n0ZXA 5 comments deeplearning
- Vathys.ai: Petascale deep learning inference on a single chip http://web.stanford.edu/class/ee380/Abstracts/171206.html 24 comments hardware
- Accelerate Deep Learning Inference on Raspberry Pi https://www.youtube.com/watch?v=R5niixLtf2Q 20 comments raspberry_pi
- Intel Launches New Xeon CPU, Announces Deep Learning Inference Accelerator http://www.tomshardware.com/news/intel-xeon-cpu-fpga-ai,33036.html 3 comments hardware
- AMD GPU Performance for LLM Inference: A Deep Dive https://valohai.com/blog/amd-gpu-performance-for-llm-inference/ 10 comments amd
- [Russell] An entry level job with the Rays now requires a "Deep understanding of the fundamentals of Bayesian Inference, MCMC, and Autocorrelation/Time Series Modeling." https://twitter.com/d_russ/status/1462840761213403149?s=21 371 comments baseball
- Reduced false positives in autism screening via digital biomarkers inferred from deep comorbidity patterns https://www.science.org/doi/10.1126/sciadv.abf0354 6 comments science
- Open-source library that takes a deep learning model as input and outputs a version that runs faster at inference. Now faster and easier to use (new release) https://github.com/nebuly-ai/nebullvm 3 comments programming
- [P] Open-source library to speed up deep learning inference by leveraging multiple optimization techniques (deep learning compilers, quantization, half precision, etc.) https://github.com/nebuly-ai/nebullvm 9 comments machinelearning
- Nebullvm is an open-source library that accelerates AI inference. You input an AI model and it outputs an optimized version that runs 5-20x faster. It leverages deep learning compilers, making it easy to use them in just a few lines of code. We spent nights building it and would love your feedback! https://github.com/nebuly-ai/nebullvm 12 comments programming
- For deep learning developers, there's a new open-source library to boost AI inference (250+ GitHub stars in the first day, 100+ usages right now) https://github.com/nebuly-ai/nebullvm 15 comments programming
- Blind And Sighted People Understand Colour Similarly. Blind individuals are able to draw upon a deep understanding of how colours function, and make inferences about totally new objects based on their category alone, in a way that closely resembles those with sight. https://digest.bps.org.uk/2021/09/30/blind-and-sighted-people-understand-colour-similarly/ 11 comments science
- Consciousness in active inference: Deep self-models, other minds, and the challenge of psychedelic-induced ego-dissolution | Neuroscience of Consciousness https://academic.oup.com/nc/article/2021/2/niab024/6360857 16 comments cogsci