- China's AI model glut is a 'significant waste of resources' due to scarce real-world applications for 100+ LLMs, says Baidu CEO https://www.tomshardware.com/tech-industry/artificial-intelligence/chinas-ai-model-glut-is-a-significant-waste-of-resources-due-to-scarce-real-world-applications-says-baidu-ceo 66 comments technology
Linking pages
- HP discontinues online-only LaserJet printers in response to backlash — Instant Ink subscription gets the boot, too | Tom's Hardware https://www.tomshardware.com/peripherals/printers/hp-discontinues-online-only-laserjet-printers-in-response-to-backlash 587 comments
- AMD now has better brand recognition than Intel — firm rides AI wave to win on Kantar’s BrandZ Most Valuable Brands report | Tom's Hardware https://www.tomshardware.com/tech-industry/amd-now-has-better-brand-recognition-than-intel 48 comments
- Qualcomm hires former AMD ray tracing expert for its GPU team | Tom's Hardware https://www.tomshardware.com/pc-components/gpus/qualcomm-hires-former-amd-ray-tracing-expert-for-its-gpu-team 41 comments
- Chinese GPU maker Moore Threads' MTLink fabric tech challenges Nvidia's NVLink, can now scale to 10,000 GPUs for AI clusters | Tom's Hardware https://www.tomshardware.com/pc-components/gpus/chinese-gpu-maker-moore-threads-can-now-scale-to-10000-processors-for-ai-clusters-mtlink-fabric-tech-competes-with-nvidias-nvlink 23 comments
- AMD engineer discusses firm's 'Layoff Bug' — infamous Barcelona CPU bug revisited 16 years later | Tom's Hardware https://www.tomshardware.com/pc-components/cpus/amd-engineer-discusses-amds-layoff-bug-infamous-barcelona-cpu-bug-revisited-16-years-later 13 comments
- Google claims new AI training tech is 13 times faster and 10 times more power efficient — DeepMind's new JEST optimizes training data for impressive gains | Tom's Hardware https://www.tomshardware.com/tech-industry/artificial-intelligence/google-claims-new-ai-training-tech-is-13-times-faster-and-10-times-more-power-efficient-deepminds-new-jest-optimizes-training-data-for-massive-gains 8 comments
- Research shows more than 80% of AI projects fail, wasting billions of dollars in capital and resources: Report | Tom's Hardware https://www.tomshardware.com/tech-industry/artificial-intelligence/research-shows-more-than-80-of-ai-projects-fail-wasting-billions-of-dollars-in-capital-and-resources-report 1 comment
- Intel preps Lunar Lake-optimized adaptive sharpening filter for Linux deployment | Tom's Hardware https://www.tomshardware.com/pc-components/gpus/intel-preps-lunar-lake-optimized-adaptive-sharpening-filter-for-linux-deployment 0 comments
Linked pages
- AI models that cost $1 billion to train are underway, $100 billion models coming — largest current models take 'only' $100 million to train: Anthropic CEO | Tom's Hardware https://www.tomshardware.com/tech-industry/artificial-intelligence/ai-models-that-cost-dollar1-billion-to-train-are-in-development-dollar100-billion-models-coming-soon-largest-current-models-take-only-dollar100-million-to-train-anthropic-ceo 726 comments
- Asus tests BIOS update enabling Microsoft Dynamic Lighting control — makes it easier to avoid Armory Crate software | Tom's Hardware https://www.tomshardware.com/pc-components/motherboards/asus-tests-bios-update-enabling-microsoft-dynamic-lighting-control-makes-it-easier-to-avoid-armory-crate-software 72 comments
- Biggest password database posted in history spills 10 billion passwords — RockYou2024 is a massive compilation of known passwords | Tom's Hardware https://www.tomshardware.com/tech-industry/cyber-security/biggest-password-leak-in-history-spills-10-billion-passwords 70 comments
- Ryzen AI 7 Pro 360 exposed in new benchmark — octa-core Zen 5 chip falls behind the Core Ultra 9 185H | Tom's Hardware https://www.tomshardware.com/pc-components/cpus/ryzen-ai-7-pro-360-exposed-in-new-benchmark 21 comments
- Google claims new AI training tech is 13 times faster and 10 times more power efficient — DeepMind's new JEST optimizes training data for impressive gains | Tom's Hardware https://www.tomshardware.com/tech-industry/artificial-intelligence/google-claims-new-ai-training-tech-is-13-times-faster-and-10-times-more-power-efficient-deepminds-new-jest-optimizes-training-data-for-massive-gains 8 comments
- Elon Musk's liquid-cooled 'Gigafactory' AI data centers get a plug from Supermicro CEO — Tesla and xAI's new supercomputers will have 350,000 Nvidia GPUs, both will be online within months | Tom's Hardware https://www.tomshardware.com/tech-industry/artificial-intelligence/elon-musks-liquid-cooled-gigafactory-data-centers-get-a-plug-from-supermicro-ceo-tesla-and-xais-new-supercomputers-will-have-350000-nvidia-gpus-both-will-be-online-within-months 5 comments
- ASRock preps AMD GPUs for AI inference and multi-GPU systems — Creator series GPUs with dual-slot, blower-type design and 16-pin power connectors | Tom's Hardware https://www.tomshardware.com/pc-components/gpus/asrock-preps-amd-gpus-for-ai-inference-and-multi-gpu-systems 1 comment