Hacker News
- Meet Nightshade, the new tool allowing artists to 'poison' AI models https://venturebeat.com/ai/meet-nightshade-the-new-tool-allowing-artists-to-poison-ai-models-with-corrupted-training-data/ 71 comments
Linking pages
- University of Chicago researchers seek to “poison” AI art generators with Nightshade | Ars Technica https://arstechnica.com/information-technology/2023/10/university-of-chicago-researchers-seek-to-poison-ai-art-generators-with-nightshade/ 6 comments
- Peak Data - by Kevin Zhang - East Wind https://eastwind.substack.com/p/peak-data 1 comment
Linked pages
- Generative AI's secret sauce, data scraping, under attack | VentureBeat https://venturebeat.com/ai/generative-ai-secret-sauce-data-scraping-under-attack/ 9 comments
- GitHub - CompVis/stable-diffusion: A latent text-to-image diffusion model https://github.com/CompVis/stable-diffusion 4 comments
- [2310.13828] Prompt-Specific Poisoning Attacks on Text-to-Image Generative Models https://arxiv.org/abs/2310.13828 1 comment