David Gerard@awful.systems to TechTakes@awful.systems · 7 months ago
LLMs still don't understand the word "no", much like their creators
www.quantamagazine.org
slopjockey@awful.systems · 7 months ago

"It must have some internal models of some things, or else it wouldn't be possible to consistently make coherent and mostly reasonable statements."

Talk about begging the question