• definitemaybe@lemmy.ca

    It’s a win, regardless, but the response is important.

    The response should be:

    • Make dark patterns illegal (highlighting options that favor the platform over the user, making it harder to cancel, “opt in” rather than “opt out” for all non-core features, etc.)
    • Require a clear “click through” step in account creation warning that underage use has been shown to be harmful, contributing to anxiety and death (and then let parents make their own, informed choices)
    • Clear legal limits on data storage and retention, restricted to data necessary for the platform to function (e.g. mouse tracking and other invasive analytics would be illegal)
    • User options to delete all data, or all data older than a given rolling date window (e.g. only retain 1 year of data, up to and including deleting old posts/content)
    • Clear legal limits on data analytics
    • Open audits of algorithmic feeds to ensure they are reasonable and not encouraging “engagement” with harmful/controversial content at elevated levels
    • No sharing of any user details with any external “partners” (advertisers), beyond very broad categories (age, location at the granularity of a 1M+ population region, gender)
    • Data portability
    • Require platform interoperability (i.e. alternative front ends through API or website loading through an intermediary client app)
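
    The rolling retention window mentioned above could be sketched as a simple purge routine. This is a minimal illustration, not a real platform API: the record schema with a `created_at` field is hypothetical, and it assumes the platform actually honors server-side deletion.

    ```python
    from datetime import datetime, timedelta, timezone

    def purge_older_than(records, days=365, now=None):
        """Keep only records newer than a rolling retention window.

        `records` is a list of dicts with a `created_at` datetime
        (hypothetical schema for illustration only).
        """
        now = now or datetime.now(timezone.utc)
        cutoff = now - timedelta(days=days)
        # Anything at or past the cutoff is retained; older data is dropped.
        return [r for r in records if r["created_at"] >= cutoff]
    ```

    A user setting a 1-year window would have the platform run something like this over their posts, likes, and analytics records on a schedule.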

    Like, there’s nothing wrong with social media as a concept; it’s that profit-seeking + network effects + regulatory capture have incentivized harmful social media.