• 0 Posts
  • 11 Comments
Joined 10 months ago
Cake day: January 3rd, 2024





  • I personally don’t see much of an issue with people making “nudes” of others, since they’re fake anyway. I see an issue when they’re used for things like bullying, blackmail, etc. That is technically already illegal, just not well enforced for anything digital, and hasn’t been for a couple of decades now. Hence why I find the attention the LLM stuff gets exceptionally hypocritical and overblown: none of them really cared when someone simply got cyberbullied or blackmailed through classically edited images, let alone screamed for the outlawing of editing software or social media.


  • I’m saying there’s not really anything more I can do when it comes to EA. The company is already completely down the shitter for me, and they’re the ones who would have to earn my trust back. That’s the only possible development. The problem is that many people don’t have a similar spine when it comes to actual principles like this, and the majority of people simply don’t care at all. That’s why this rotten company is not just still a thing, but continues to do what it’s done for the last couple of decades.




  • to the extent that it doesn’t violate the law or other peoples’ rights

    Am I the only one who finds this so weird when we talk about LLMs? If someone makes a bot that resembles some specific person, that person’s rights aren’t really violated, and since it’s all fictional content, it is very hard to break actual laws through it. At that point we would also have to ban people’s weird fan fiction, no?

    I’m not arguing about what they do or don’t want on their platform, but the legal and alleged moral questions / arguments always weird me out a bit, because no one is actually getting hurt in any way by weirdos having weird chats with computers.

    The bigger issue is enforcement. Either you monitor an absurd amount of content, which is worse for privacy, or you straight up censor the models, which typically makes them overly restrictive even in valid cases / scenarios (other platforms went through this, with a consequent loss of users).



  • Erotic text messages could be considered pornographic work, I guess, like erotic literature. But I think they’re just starting to realize how many of their customers jailbreak GPT for that specific purpose, and how good the alternatives that allow this type of chat have gotten, such as NovelAI. Given how many other AI services started to censor things and how much that affected their models (like your chatbot partner getting stuck in consent messages as soon as you went into anything slightly outside vanilla territory), and how much drama that has caused throughout those communities, I highly doubt that “loosening” their policy is going to be enough to sway people towards them instead of the competition.