ryujin470@fedia.io to Technology@beehaw.org · 2 days ago
Researchers show that training on “junk data” can lead to LLM “brain rot” (arstechnica.com)
jarfil@beehaw.org · 2 days ago

> Using a complex GPT-4o prompt, they sought to pull out tweets that focused on “superficial topics”
Wait a moment… They asked an LLM to tell them what was “junk”, and then another LLM, trained on what the first LLM marked as junk, turned out to be a junk LLM?
The article talks about model collapse, but this smells like research collapse.