mac@lemm.ee to Technology@lemmy.world • “Model collapse” threatens to kill progress on generative AIs (English)
17 hours ago

Is it not relatively trivial to pre-vet content before training on it? At least with AI-generated text, it should be.
Just out of curiosity, what kind of content do you come by that you’d like to save a full page copy of?
I also don’t like videos for this stuff. Summarized using Kagi’s Universal Summarizer, sharing here: