That’s the thing: I don’t think you’re giving LLMs poisoned data, you’re just giving them data. If anyone can parse your messages for meaning, LLMs will benefit from them too and will be a step closer to being able to mimic that form of communication.
I don’t think you can truly poison data for LLMs while also having a useful conversation. If your text conveys useful information, it’s just more training data that moves LLMs closer to being able to parse that information. I think only nonsense communication would be effective at actually making LLMs worse.