cross-posted from: https://lemmy.ml/post/15741608

They offer a thing they’re calling an “opt-out.”

The opt-out (a) is only available to companies that are Slack customers, not to end users, and (b) doesn’t actually opt you out.

When a company account holder tries to opt out, Slack says their data will still be used to train LLMs; the results just won’t be shared with other companies.

LOL no. That’s not an opt-out. The way to opt out is to stop using Slack.

https://slack.com/intl/en-gb/trust/data-management/privacy-principles

        • thepaperpilot@incremental.social

          For Matrix specifically, I recommend FluffyChat on mobile and Cinny for web/desktop. Most notably, they both support the not-yet-official spec for custom emojis and stickers, which I think is important for any Slack-like.

          For the server (since you want to self-host), you’d probably want Synapse - it can run with federation disabled and supports SSO. Also, it wasn’t mentioned by mp3, but XMPP is another protocol used by many large companies for internal chat systems.

      • jqubed@lemmy.world

        Does IRC still exist? I remember laughing when I first saw Slack and its early competitors: people were so excited about it, but when I finally used it I realized it was basically just IRC with a nicer interface. I’m assuming these alternatives offer improvements like encryption?

        • just another dev@lemmy.my-box.dev

          Nah, there are tons of features that Slack has over IRC. To start with, inline media (images, audio, video), but most importantly lots of out-of-the-box external integrations and webhooks.
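
For a sense of how low the bar for those integrations is: a Slack incoming webhook is just an HTTP POST with a JSON body. A minimal sketch in Python (the webhook URL below is a placeholder; a real one is generated per workspace in Slack’s app settings):

```python
import json
import urllib.request

# Placeholder URL - not a real webhook. Slack issues one per workspace/channel.
WEBHOOK_URL = "https://hooks.slack.com/services/T0000000/B0000000/XXXXXXXXXXXXXXXXXXXXXXXX"

def notify(text: str) -> None:
    """POST a plain-text message to the channel the webhook is bound to."""
    payload = json.dumps({"text": text}).encode("utf-8")
    request = urllib.request.Request(
        WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        response.read()  # Slack replies with a plain "ok" body on success

if __name__ == "__main__":
    notify("Build finished ✅")
```

Richer formatting can go in a "blocks" payload instead of plain "text", but a one-line "text" field is enough for most CI or monitoring pings.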

          • jqubed@lemmy.world

            Yeah, now there are, but I don’t think a lot of those features were there when I first used it over a decade ago. It became a lot more useful over the years.

            • Kushan@lemmy.world

              It didn’t require arcane commands just to sign up and log in. I love IRC and will always remember it fondly, but it wasn’t easy for a novice to use, and that’s why things like Slack and Discord took off.

        • ryguyflyguy@sh.itjust.works

          Unless you’re setting it up for a business, the free tier should be enough. They have options for both self-hosted and hosted versions.

          • tsonfeir@lemm.ee

            It’s a business with 300 users, but only about 50 of them would even use it. The others DO need accounts for the one time per month they log in. But with their pricing and SSO plan, that’s $3,000/month (rough math below).

            What… the… fuck.

            I am in the wrong business lol.
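
For context on that number, a back-of-the-envelope sketch; the per-seat price here is an assumption (roughly $10/seat/month on an SSO-capable plan), not a quoted Slack figure:

```python
# Rough seat maths; the per-seat price is an assumed figure, not Slack's published pricing.
total_accounts = 300      # everyone needs an account, even the once-a-month users
active_users = 50         # people who would actually use it day to day
price_per_seat = 10.00    # assumed USD per seat per month on an SSO-capable plan

monthly_cost = total_accounts * price_per_seat
print(f"${monthly_cost:,.0f}/month total")                            # $3,000/month
print(f"${monthly_cost / active_users:,.0f}/month per active user")   # $60/month
```

Because every account counts as a billable seat, the 250 once-a-month users cost exactly as much as the 50 daily ones.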

  • Gamers_Mate@kbin.run

    Instead of working on their platform to get Discord users to jump ship, they decide to go in the same direction. Also pretty sure training LLMs after someone opts out is illegal?

    • FaceDeer@fedia.io

      Also pretty sure training LLMs after someone opts out is illegal?

      Why? There have been a couple of lawsuits launched in various jurisdictions claiming LLM training is a copyright violation, but IMO they’re pretty weak and none of them has reached a conclusion. The “opting” status of the writer doesn’t seem relevant if copyright doesn’t apply in the first place.

        • Grimy@lemmy.world

          If copyright applies, only you and Slack own the data. You can opt out, but 99% of users don’t. No users get any money. Google or Microsoft buys Slack so only they can use the data. We only get subscription-based AI, and open source dies.

          If copyright doesn’t apply, everyone owns the data. The users still don’t get any money, but they get free, open-source AI built off their work instead of closed-source AI built off their work.

          Having the platform hold copyright over the content in the context of AI training would be a fucking disaster.

        • FaceDeer@fedia.io

          Nor is it up to you. But the fact remains: it’s not illegal until there are actually laws against it. The court cases that might determine whether current laws already prohibit it are still ongoing.

    • NovaPrime@lemmy.ml

      1. It’s not illegal. 2. “Law” isn’t a real thing in an oligarchy, except insofar as it can be used by those with capital and resources to oppress and subjugate those they consider their lessers, and to further perpetuate the system for self-gain.
  • mPony@lemmy.world

    The headline is patently false.

    FTA

    We do not develop LLMs or other generative models using customer data.

    If you want to act like you’re better than Reddit, you’re going to have to start actually being better.

  • ArkyonVeil@lemmy.dbzer0.com

    These techbros are some grade-A geniuses. Goodness, training LLMs on private data from Slack!? One of the most popular corporate messaging apps? Surely only good things can come from feeding corporate secrets into a braindead auto-complete model that will leak its oversized brain if you prod it just right.

    Of course, no private data has ever been leaked from an LLM, right?

  • Veraxus@lemmy.world

    The customer is never just a customer… even if you are paying. You are always the product now. Always.

  • Veraxus@lemmy.world

    Well, that’s just great. Another reason for corporations to force everyone into the miserable, counterproductive dumpster fire that is MS Teams.

  • just another dev@lemmy.my-box.dev

    Customers own their own Customer Data.

    Okay, that’s good.

    Immediately after that:

    Slack […] will never identify any of our customers or individuals as the source of any of these improvements to any third party, other than to Slack’s affiliates or sub-processors.

    You’d hope the owner would get a say in that.

    • stellargmite@lemmy.world

      So you sign up to confirm that your IP is yours while simultaneously agreeing to sell it off, but the source stays anonymous except to whoever it’s sold to, or anyone else Slack decides should know. These tech contracts and TOSes should just say “we will (try to) not do bad, but you agree to let us do bad, and if bad happens it’s your fault”.

  • Ultraviolet@lemmy.world

    Remember when every platform renamed PMs to DMs and everyone who pointed out that they were trying to remove the expectation of privacy was called “paranoid”?

  • ConfusedPossum@kbin.social

    I use Slack at work every day. I suppose this does feel off in some way, but I’m not sure I’m the right amount of upset about it? I don’t really mind if they use my data to improve my user experience, as long as the platform doesn’t reveal anything sensitive or personal in a way that can be traced back to me.

    Slack already does allow your admin to view all of your conversations, which is more alarming to me.

    • SeedyOne@lemm.ee

      The problem is where you said “as long as”, because we already know companies AND the AI itself can’t be trusted not to expose sensitive info inadvertently. At absolute best, it’s another vector to be breached.

      • ConfusedPossum@kbin.social

        It’s obvious when you say it like that. I don’t like the idea of some prompt hacker looking at memes I sent to my coworker.

  • Azzu@lemm.ee

    I’m sorry, can everyone not read the actual link? They specifically say they are not training generative AIs, i.e. LLMs.

    They are using the data to train non-generative AI models for stuff like emoji and channel suggestions, i.e. “you use this emoji a lot, so it’s displayed first” or “people who are in these channels of yours also join these other channels you’re not in yet” (toy sketch below).

    This is class A misinformation being spread here, good job. It’s unbelievable that I’m the first one to actually verify the claims in this post, because I wanted to share it further, and that’s what you do before sharing.
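
To make the distinction concrete, suggestion features like those can be built from simple counting rather than a generative model. A toy sketch (not Slack’s actual implementation, just the general shape of a frequency-based emoji ranker and a co-occurrence channel recommender):

```python
from collections import Counter

def rank_emojis(usage_log: list[str], top_n: int = 5) -> list[str]:
    """'You use this emoji a lot, so it's displayed first.'"""
    return [emoji for emoji, _ in Counter(usage_log).most_common(top_n)]

def suggest_channels(memberships: dict[str, set[str]], user: str, top_n: int = 3) -> list[str]:
    """'People in these channels of yours also join these other channels.'"""
    mine = memberships[user]
    scores: Counter = Counter()
    for other, channels in memberships.items():
        if other == user:
            continue
        overlap = len(mine & channels)
        if not overlap:
            continue
        for channel in channels - mine:
            scores[channel] += overlap  # weight suggestions by how similar the other member is
    return [channel for channel, _ in scores.most_common(top_n)]

print(rank_emojis(["👍", "🎉", "👍", "😂", "👍", "🎉"]))  # ['👍', '🎉', '😂']

memberships = {
    "alice": {"#general", "#dev", "#random"},
    "bob": {"#general", "#dev", "#design"},
    "carol": {"#general", "#design", "#random"},
}
print(suggest_channels(memberships, "alice"))  # ['#design']
```

Models like these only need aggregate counts, which is a very different risk profile from feeding raw message text into a generative model.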

    • gila@lemm.ee

      The posted screengrab also explicitly states that an opted-out user’s workspace won’t contribute to the underlying models. How would that be separate from using info from their workspace as training data for any kind of model? My interpretation is that the data would be used for inference against the models, not to train them.