Of course. I know some open source devs who advise backing up the raw training data, LoRAs, and essentially the original base models used for fine-tuning.
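For anyone wanting to do the same, here is a minimal sketch of what backing up base model weights or adapters can look like, assuming they are hosted on the Hugging Face Hub. The repo IDs and backup path are placeholders, not a specific recommendation:

```python
# Hedged sketch: mirror model/adapter repos locally with huggingface_hub.
# Gated repos (e.g. Llama 2) require an accepted license and an auth token.
from huggingface_hub import snapshot_download

REPOS = [
    "meta-llama/Llama-2-7b-hf",      # example base model (gated)
    # "someuser/some-lora-adapter",  # hypothetical LoRA adapter repo
]

for repo_id in REPOS:
    # Download every file in the repo into a local backup directory.
    snapshot_download(
        repo_id=repo_id,
        local_dir=f"/mnt/backup/{repo_id.replace('/', '__')}",
    )
```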
Politicians sent out an open letter in protest when Meta released LLaMA 2. It is not unreasonable to assume they will intervene before the next release unless we speak out against this.