Well, this article and thread of comments are specifically about porn and women as objects of sexual desire, the kind of desire that makes people want to chat with OnlyFans models. I don’t think that’s changed over the years, if you look at the body types featured in Playboy, Hustler, Perfect 10, or lad mags like Maxim, Stuff, and FHM, or even Sports Illustrated’s swimsuit issues. Pretty much across the board, from the ’70s through the 2000s, these magazines featured young women of what I’m assuming are the “in vogue” proportions alluded to in the article, and I assume those aren’t that different from the Raquel Welch poster featured in The Shawshank Redemption.
Speaking of posters, the ’90s gave us Baywatch and Pamela Anderson, who was on a lot more dorm room posters than Jennifer Aniston (who, by the way, wasn’t that far off from what I’m describing as the standard across multiple decades).
Not ownership. Just permission to copy and distribute freely, which is basically necessary to run a service like this, where user-submitted content is displayed.
It’s more of a fuzzy area, but simply by posting on a federated service you’re agreeing to let that service copy and display your comments, and to sync with other servers/instances so they can copy and display your comments to their users. It’s baked into the protocol that your content will be copied automatically all over the internet.
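To make “baked into the protocol” concrete, here’s a rough sketch of the kind of ActivityPub “Create” activity a Mastodon-style server pushes out when you post. The actor and URLs are made-up placeholders, but the shape follows the ActivityPub spec: your home server wraps the post in JSON like this and delivers it to every follower’s server, and each of those servers stores and re-displays its own copy.

```python
# Sketch of an ActivityPub "Create" activity (placeholder URLs/accounts).
create_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example.social/users/alice",           # hypothetical account
    "to": ["https://www.w3.org/ns/activitystreams#Public"],  # addressed to the public
    "object": {
        "type": "Note",
        "content": "<p>my comment text</p>",
        "attributedTo": "https://example.social/users/alice",
        "published": "2024-01-01T00:00:00Z",
    },
}
# The originating server POSTs this JSON to the inbox of every remote server
# with followers of the author; each remote server keeps and serves its copy.
```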
Does that imply a license to let software be run on that text? Does it matter what the software does with it, like displaying the content in a third-party mobile app? What about when it performs text-to-speech or braille conversion for accessibility, or indexes the page for a search engine? Does AI training make any difference at that point?
The fact is, these services have APIs, and those APIs allow for the efficient copying and ingestion of user-created content, along with its metadata, at scale. From a technical perspective, scraping is obviously easy. But from a copyright perspective, submitting your content into that technical reality is implicit permission to copy, maybe even for things like AI training.
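As a sense of scale: here’s a minimal sketch of pulling public posts, with author metadata, through Mastodon’s documented REST API. The instance URL is just an example, and this assumes the `requests` library; the point is only that bulk collection takes a few lines, no scraping tricks required.

```python
import requests

INSTANCE = "https://mastodon.social"  # example instance

def fetch_public_posts(pages: int = 3, per_page: int = 40):
    """Yield public statuses, newest first, paging via the documented API."""
    max_id = None
    for _ in range(pages):
        params = {"limit": per_page}
        if max_id:
            params["max_id"] = max_id  # continue backwards through history
        resp = requests.get(
            f"{INSTANCE}/api/v1/timelines/public", params=params, timeout=10
        )
        resp.raise_for_status()
        statuses = resp.json()
        if not statuses:
            break
        for status in statuses:
            yield {
                "id": status["id"],
                "author": status["account"]["acct"],
                "created_at": status["created_at"],
                "content": status["content"],  # HTML body of the post
            }
        max_id = statuses[-1]["id"]  # oldest status seen so far

for post in fetch_public_posts(pages=1):
    print(post["author"], post["created_at"])
```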