Protect me from what I want

tbray.org

188 points by lolsoftware 3 years ago · 224 comments

andrewmutz 3 years ago

I completely agree with the article. There is a vast soup of user generated content that could be displayed and there's always an algorithm of some sort employed, even if that algorithm is "show me only content from people I explicitly follow and sort it chronologically".

If you wanted to give the social media companies the benefit of the doubt, you would say "they just want to have a feed that the user enjoys, and the best metric they have to tune their algorithm is engagement". If you didn't want to give them the benefit of the doubt, you might say "they want to surface a feed that keeps you hooked so they make the most money on ads, so they use the metric of engagement".

I also completely agree with Tim that a marketplace of algorithms is the best way to solve this problem. If users can choose their algorithm, they can choose the "sugar high" content that is optimized to maximize engagement and time-on-platform if that's what they want. If they'd rather choose one that maximizes other metrics, they are free to do so. Examples of metrics that a given user might prefer: "a monthly survey that the user fills out that asks how good the recommendations are" or "a daily survey asking questions related to mood/mental health/etc".

  • omeze 3 years ago

    I think this “marketplace of algorithms” idea (aka algorithmic choice) is a bit naive. Imagine using an app for the first time. What algorithm do you choose? Now remove your technical context and imagine reading through the top 10 algorithms to choose which one to select. Twitter actually has quite a few knobs for curation (e.g. you can apply regexps to filter content), and I'd be very interested to know how many users actually use those features. I'd guess it's a rounding error.

    • andrewmutz 3 years ago

      I envision it not as a configurable app, but instead each approach to algorithmic feeds is a separate app/website. All the apps have all the underlying fediverse/activitypub data available for them to serve up, but each has a different approach to curating content. Users would just choose the type of experience they want by choosing different apps/websites.

    • hackerfromthefu 3 years ago

      It's very easy: you have a configuration setting with the list of algorithms and a short textual description of each. Users try them and quickly switch off the ones they don't like until they settle on the one they like best.

  • edanm 3 years ago

    There is in a sense a marketplace of algorithms, or at least, a marketplace of user experiences. That's the actual marketplace. It's the reason Facebook is losing marketshare to Tiktok.

gfaster 3 years ago

> And anyhow, those algorithms are just showing you what you want. Don’t try to deny it, if it wasn’t what you wanted you wouldn’t be doomscrolling so much, would you?

I believe this to be based on a faulty assumption: that people do what they want. While it may be true that at the reptile-brain level we do want what the Algorithm feeds us, that statement is equivalent to saying that someone who is addicted to opiates wants to feed their addiction. It's only true at the most shallow of levels.

The article at large seems to realize that, but I think what is needed is to distinguish between "want" as in what we, as rats in the Skinner box, want, and "want" as in what we, as humans who do not wish to be prey to that Skinner box, want. That is the promise of platforms without the Algorithm: just that it won't try to take advantage of users.

(As a side note, I am an advocate for using capital-'A' "the Algorithm" to refer to the content-aware, black-box recommendation engines that run social media sites. That lets us continue to use lowercase-'a' "algorithm" to refer to sorting and such.)

  • Ieghaehia9 3 years ago

    Even on a reptile-brain level, the article's use of wanting is sort of an equivocation. The brain has different systems for "wanting" and "liking". Cocaine affects the former, opioids the latter.

    Capital A Algorithms maximizing engagement are very much on the "wanting" side. Firing off an angry response to a political opponent's belittling post is rarely ever really /enjoyable/.

    • StrictDabbler 3 years ago

      TikTok's algorithm is firmly on the "wanting" and not liking side. That's why it creates so much ragebait.

      The moderation is very strict so the ragebait is limited to people ruining (cheap) wedding dresses with colored soap and screwing toilet-seats to the trunks of junked cars, or whatever. A vast flood of videos that only exist because anybody watching them will be disappointed and angry.

      If the TikTok mod team slacked up for a second you'd see nothing but culture-war political ragebait and hate speech. It hovers on the edge even now.

  • civopsec 3 years ago

    Skinner seems like a way too outdated reference to use.

mgerdts 3 years ago

An algorithm that I can tune by having private +1, -1, and "hell no" buttons would be helpful. If I follow too many people and/or hashtags, I end up with so many posts/toots that I spend a bunch of time reading through things that really aren't that interesting. If I had a way to easily mark content as interesting or not, an algorithm could construct my feed such that the things I find interesting are more likely to be read. There have been people I followed because they had an interesting post or two, and then another side of them started posting things that are not interesting to me.

As an example, one person posted that they were working on end-to-end encrypted messages. That was really cool, nearly instant follow. This person identifies as a furry and posts a lot more content related to their furry identity than E2E encrypted messages. I'm not really into the furry scene, even as a spectator. Nothing against this person - I'm happy they can be who they are. I just wish there was a way that I could tune into the once in a while updates about the tech project without the part that is of no interest to me. That person may be equally annoyed with my feed if I posted regular updates of my love for mac n cheese.

Importantly, the +1/-1 that I give should be used only for my feed, with no feedback to the poster. My -1 does not mean I disapprove, and I certainly don't want it to hurt anyone. I don't want it to influence others' feeds. A -1 needn't mean that I never see something like that again, but a "hell no" should.

Such a system would allow me to follow more people and be exposed to different ideas that come from more exposure. It's then up to me to tune my feed to match my interests and available time.
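A private-feedback ranker of this kind can be sketched in a few lines. This is a minimal illustration with invented names (`FeedRanker`, posts as (author, tags) pairs), not any platform's real API; the key properties are that votes stay local and that "hell no" is a hard filter while -1 merely demotes:

```python
from collections import defaultdict

class FeedRanker:
    """Ranks a feed from private +1 / -1 / "hell no" feedback.

    Votes are stored locally and never reported to the poster.
    """

    def __init__(self):
        self.weights = defaultdict(float)  # preference per author or tag
        self.blocked = set()               # "hell no": never show again

    def feedback(self, key, vote):
        # vote is +1, -1, or the string "hell no"
        if vote == "hell no":
            self.blocked.add(key)
        else:
            self.weights[key] += vote

    def rank(self, posts):
        # posts: list of (author, tags) pairs; a -1 only demotes a post,
        # while a blocked author or tag removes it entirely
        visible = [(author, tags) for author, tags in posts
                   if author not in self.blocked and not (self.blocked & tags)]
        score = lambda p: self.weights[p[0]] + sum(self.weights[t] for t in p[1])
        return sorted(visible, key=score, reverse=True)

ranker = FeedRanker()
ranker.feedback("e2e-project", +1)     # interesting: boost this tag
ranker.feedback("furry", "hell no")    # never show this tag again
posts = [("alice", {"furry"}), ("bob", {"pasta"}), ("alice", {"e2e-project"})]
ranked = ranker.rank(posts)            # e2e post first, furry post gone
```

Under this split, a -1 on uninteresting posts would merely push them down the feed, while a "hell no" removes them outright, matching the distinction drawn above.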

  • lonk11 3 years ago

    What you are describing is similar to how my hobby project works: https://linklonk.com

    When you upvote content (+1) you make your connection stronger to those who posted that content and also to those who upvoted it. When you downvote (-1), your connection to the poster and other upvoters becomes weaker. The content in your feed is ranked based on how strongly you are connected to those who upvoted it.

    For example, I upvoted the OP post on LinkLonk: https://linklonk.com/item/3406294698351951872. If you upvote that item then you will get connected to me and to the two RSS feeds that posted that content: the blog's feed (https://www.tbray.org/ongoing/ongoing.atom) and the HN feed of newest items (https://news.ycombinator.com/rss). As a result you will start to see other content from those RSS feeds and from my "main" channel more prominently in your recommendations.

    Users can post/submit links under different user channels. So if your hypothetical E2E person were to post E2E stuff under their "E2E" channel while keeping the furry stuff under their "furry" channel, then you would only get connected to their "E2E" channel if you only +1'd content from that channel.
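That connection-strength ranking rule can be sketched roughly as follows. This is only my reading of the description above, not LinkLonk's actual implementation, and all the names (`ConnectionFeed`, the endorser ids) are invented:

```python
from collections import defaultdict

class ConnectionFeed:
    def __init__(self):
        # how strongly I'm connected to each user/channel/RSS feed
        self.strength = defaultdict(float)

    def vote(self, endorsers, delta):
        # +1 strengthens my connection to everyone who posted or upvoted
        # the item; -1 weakens those same connections
        for endorser in endorsers:
            self.strength[endorser] += delta

    def rank(self, items):
        # items: {item_id: set of endorsers}; an item is ranked by the total
        # strength of my connections to those who endorsed it
        total = lambda item: sum(self.strength[e] for e in items[item])
        return sorted(items, key=total, reverse=True)

feed = ConnectionFeed()
# Upvoting the OP connects me to its poster and to the two RSS feeds
feed.vote({"lonk11", "tbray-atom", "hn-newest"}, +1)
items = {
    "another-tbray-post": {"tbray-atom"},
    "random-item":        {"stranger"},
    "doubly-endorsed":    {"lonk11", "hn-newest"},
}
ranked = feed.rank(items)  # items endorsed by stronger connections first
```

The appeal of the design is that there is no global score: two users who vote differently end up with different connection graphs and therefore different rankings of the same items.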

  • Karrot_Kream 3 years ago

    When RSS was hot stuff circa 2005, a friend on IRC and I were working on a bayesian classifier for RSS entries, and it worked just like this. +1 trained it as "ham" and -1 trained it as "spam". It failed (among other reasons) because we were both really young and didn't have enough mathematical/computational maturity. With modern ML techniques, this doesn't seem hard to do at all. But you'd need access to the actual data stream, which is something the big social networks don't seem willing to provide.
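The ham/spam scheme described is essentially naive Bayes over entry text. A toy sketch of the idea (my reconstruction, not the original project's code), with add-one smoothing so unseen words don't zero out a class:

```python
import math
from collections import Counter

class EntryClassifier:
    """Toy naive Bayes: +1 trains "ham", -1 trains "spam"."""

    def __init__(self):
        self.counts = {"ham": Counter(), "spam": Counter()}
        self.totals = {"ham": 0, "spam": 0}

    def train(self, text, label):
        words = text.lower().split()
        self.counts[label].update(words)
        self.totals[label] += len(words)

    def _log_score(self, text, label):
        # Laplace (add-one) smoothed log-likelihood of the words given the label
        vocab = len(set(self.counts["ham"]) | set(self.counts["spam"]))
        return sum(math.log((self.counts[label][w] + 1) /
                            (self.totals[label] + vocab))
                   for w in text.lower().split())

    def classify(self, text):
        return max(("ham", "spam"), key=lambda lbl: self._log_score(text, lbl))

clf = EntryClassifier()
clf.train("rust borrow checker deep dive", "ham")       # a +1 entry
clf.train("ten weird tricks celebrities hate", "spam")  # a -1 entry
```

A prior term for each label is omitted for brevity; with balanced training data it cancels out of the comparison.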

  • GauntletWizard 3 years ago

    I think it's actually important for you to see all the sick shit. I think it's worth rubbing your nose in it.

    It's amazing how many technologists have a strong preference for truth in so many areas, but choose not to understand the things they find disgusting and why they find them disgusting.

    • mgerdts 3 years ago

      If you are implying that I think furries are “sick shit” you are confused. I just don’t care, much like most people do not care about my hypothetical obsession with cheesy pasta. To be clear, in my example the posts that I’d prefer not to see seem to be friends wearing animal costumes and cartoonish images of animals. On the face of it, it was all quite tame.

      By being able to declutter my feed of things I don’t find interesting, I am likely to follow more people and hashtags, exposing me to more voices. With some effort and luck that includes more diverse voices.

      Alternatively, I could just not follow people that have a low signal to noise ratio (per my interests) and I could block or mute those that frequently post uninteresting things to hashtags that I follow. I think this approach exposes me to less diversity.

nostromo 3 years ago

Most Twitter users, as best I can tell, aren't actually complaining about what they see. They are complaining about what others see.

In their view, they have the "correct opinions" and have not been biased by any algorithms or moderation. Meanwhile, the people that disagree with them are the ones being duped by algorithms, bad moderation, and bad actors.

That's why there's a huge censorship movement right now in the US. People aren't saying, "protect me from what I want" -- they are saying, "protect others from what they want, but leave me alone." Which is, of course, entirely hypocritical.

  • ska 3 years ago

    > Most Twitter users, as best I can tell, aren't actually complaining about what they see. They are complaining about what others see.

    FWIW this doesn't match my experience at all, but I've hardly made a study of it. What I mean by that is that the limited complaints I have heard are mostly about a) bad quality advertisement targeting and b) political targeting, also mostly bad quality.

    "quality" here meaning accuracy of the targeting, not a comment on the content.

    • prawn 3 years ago

      Nor mine. My complaints are about what I see. I don't want an overload of ads, and especially not irrelevant ads. I don't want suggested content/topics. I don't want suggested follows. I just want to see tweets from the people I chose to follow. Same with Instagram.

      But I think the reason they push against this is that it risks people losing interest, as the sort of people I follow aren't strong/consistent content creators. They're just people I know personally and want to stay in touch with. Twitter wants to be a firehose of stuff that fires me up. Instagram wants to be endless TV channels of constant content. Whereas I like being able to get up to date on everything my friends have posted, and then go back to what I was doing.

    • stickfigure 3 years ago

      Oh come now. I constantly hear this concern from my friends (almost exclusively left-of-center).

      And they aren't totally wrong? I have a qanon-er in my family; she's carefully curated her media sources to guarantee a pure, steady stream of nutjob garbage. The thing is, censoring twitter/facebook/youtube makes no difference - these people will find their fringe no matter how hard they have to look. At least on youtube there's a chance the algorithm will show some non-bullshit.

      I don't know what "the answer" to any of this is, but I suspect we all need to be a bit more tolerant of online stupidity.

      Also: If your social media feeds are showing you crappy content, you are to blame. It's incredibly easy to like/dislike/hide-like-this content. My feeds are fantastic. I suspect the people complaining about their feeds actually like garbage and even more so like to complain.

      • twoWhlsGud 3 years ago

        Old joke: "If you owe the bank $1M it's your problem, if you owe the bank $1B it's theirs". New joke: "If a few citizens become radicalized/crazy in ways that threaten the very functioning of your society it's their problem ... "

        • pessimizer 3 years ago

          ...but if the majority of citizens act in ways that threaten the very functioning of your society, you may live in a monarchy or oligarchy.

          • toofy 3 years ago

            i’m not sure why this would be the only conclusion you come to.

            it’s much more likely that folks are concerned because the populism wishes to destroy pluralism.

      • ilyt 3 years ago

        > Also: If your social media feeds are showing you crappy content, you are to blame. It's incredibly easy to like/dislike/hide-like-this content. My feeds are fantastic. I suspect the people complaining about their feeds actually like garbage and even more so like to complain.

        I'd imagine most people have the reasonable assumption that YT will only show you more stuff that you like (as in, press the like button), not that any engagement, even downvotes, will continue to put more shit into your stream.

        Ever wondered why YTbers ask for comments and don't shy from saying "if you didn't like it, downvote and tell me in the comments"? That still drives engagement, and to the algorithm that's all that matters.

        Once you get it, it's easy to keep the shit you don't want out of your stream, but not everyone is well versed in algorithmic ways, so it's not that hard to fall into an algorithmic hole.

      • shadowgovt 3 years ago

        > these people will find their fringe no matter how hard they have to look

        This is likely true. Most of the discussion around "the algorithm" is mainly centered around whether there's something about the way the major online sharing channels are selecting what to present that acts as a funnel towards QAnon and other radical theories, not whether people already dedicated to finding such information can find it.

        • pixl97 3 years ago

          YT: "you searched WWII", recommended video is "how to become a nazi in 2022"

          I had to click do not recommend this content on a ton of videos.

          • shadowgovt 3 years ago

            Precisely. If I had to hazard a guess, I think their algorithm is hyper-optimizing for linkages, and when linkages are broad but tenuous, it hyper-optimizes for the ones with only the slightest dominance.

            Hypothetically: all sorts of people are interested in the history of World War II... it's a big war, it defined politics for over half a century, and it's taught every year in the history classes of at least a hundred nations. So you're looking at a topic with hundreds of "also likes" connections but no strong winner.

            Perhaps there is a 0.000001% chance that if you watch a WWII video, your next video will be one on modern Nazism (including recruitment videos). I wonder if YouTube's algorithm boosts that small fluctuation in a sea of noise into a single recommendation result (or even grabs the top 5, but the signal is so noisy that this one still shows up in that top 5)?
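That guess can be illustrated with a small simulation (all numbers here are invented, and this says nothing about YouTube's actual system): give one fringe topic a 0.1% edge over 200 otherwise-tied mainstream topics, add sampling noise to the co-occurrence counts, and count how often it sneaks into a top-5 recommendation cut:

```python
import random

random.seed(42)

# 200 mainstream "also watched" topics with near-identical counts,
# plus one fringe topic with a minuscule real edge
true_counts = {f"ww2-history-{i}": 1000.0 for i in range(200)}
true_counts["fringe-topic"] = 1001.0  # the 0.1% edge

def observed(counts, noise=30.0):
    # each logging period measures the counts with sampling noise
    return {k: v + random.gauss(0, noise) for k, v in counts.items()}

def top_k(counts, k=5):
    return sorted(counts, key=counts.get, reverse=True)[:k]

# Across 1000 simulated re-rankings, how often does the fringe topic
# make the top 5? With everything nearly tied, the cut is dominated by
# noise, so even a negligible edge gets surfaced to a slice of users.
hits = sum("fringe-topic" in top_k(observed(true_counts)) for _ in range(1000))
```

In this toy setup the fringe topic lands in a top-5 slot a few percent of the time, which is the "boosting a small fluctuation in a sea of noise into a recommendation" effect described above.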

      • SideburnsOfDoom 3 years ago

        > Also: If your social media feeds are showing you crappy content, you are to blame.

        Yes and no. Youtube radicalisation / algorithmic extremism is a thing, because the engagement-maximising algorithm has a strong bias for rabbit-holes, kooks, con-artists and incendiary nonsense. The "funnel towards QAnon" that the sibling comment mentioned.

        Unless you lean hard against it - which first requires awareness of the problem and the skills or knowledge to spot it - it will be in your feed. "you are to blame" is to an extent blaming the victims.

        • XorNot 3 years ago

          The sentiment I've heard a lot is "I'm not clicking on that, it'll take weeks to undo the damage to my recommendations".

          • rchaud 3 years ago

            How many average users are even aware of this?

            It took until 2021 for people to figure out that quote-tweeting politicians' hot takes to dunk on them actually helped the politicians' reach, because Twitter counted it as engagement.

            • SideburnsOfDoom 3 years ago

              > How many average users are even aware of this?

              An increasing number, but still very low. And awareness in itself won't solve the issue entirely. This is why the "you are to blame" sentiment is to an extent blaming the victims.

      • unsupp0rted 3 years ago

        Please teach me how I can get YouTube to stop showing me videos about cars and food reactions. And American local news channels.

        Dislike button does nothing.

        The “hide because I don’t like this video” option does nothing. It hides that video and keeps recommending competing channels on the same topic.

        The “don’t recommend channel” button blocks the channel, so that three more competing channels on the same topic take its place.

      • viraptor 3 years ago

        I think there's some nuance here though, where people do make a distinction between opposing views and harmful content. (likely not everyone, but still)

        What I mean is, I don't care if someone reads a politician I hate. I care if they read a politician who enables the next pizzagate or worse. Or if they read state sponsored misinformation while thinking it's genuine opinions. And yeah, I do acknowledge there's some overlapping grey area.

        • pessimizer 3 years ago

          The main takeaway is that you care what other people read, not about what people are making you read.

          • kannanvijayan 3 years ago

            > The main takeaway is that you care what other people read, not about what people are making you read.

            As do I, and you. I was pretty happy when all of those ISIS accounts that were spewing violence and hate got shut down. I'm sure you have your boundaries on what you'd want society to allow and restrict on public mediums.

            The specific details of what those boundaries are will depend on your personal notion of what constitutes communication which can be considered abusive. As it will for any other person.

            Copyright violations. Beheading videos. "Pornography" or pseudo-"pornography" involving minors. Direct threats of violence towards individuals or groups. Deepfakes generated without consent. Etc. Etc.

            I guarantee you if we have a back and forth discussion we will discover where your boundaries lie across the myriad issues where people typically want to control public discourse.

            • ilyt 3 years ago

              Right but I don't want some wanker at Twitter or Facebook office decide that.

              If I want to press the ban/blacklist button on a tag, account, group, or whatever social unit to not see it again, that's my decision. Sure, some of them should be default (we probably don't want people to get porn the second they sign up), but the user should have the power to moderate and filter their own stream.

              > Copyright violations. Beheading videos. "Pornography" or pseudo-"pornography" involving minors. Direct threats of violence towards individuals or groups. Deepfakes generated without consent. Etc. Etc.

              3/4 of what you mentioned is illegal in most places in the first place, so it isn't a point of contention.

              And that isn't really the problem. A site deciding that this or that political view is now bad is.

              You're trying to put removing illegal/disturbing content in the same category as worldview manipulation. The first is far more black and white than the second, and they should not be considered together, even if similar systems are used for both.

              • kannanvijayan 3 years ago

                The distinctions you're bringing up seem kind of perfunctory to me.

                The wanker at the Twitter office was presumably appointed by Twitter management to do that, yes? And some people want to appeal to that particular seat of power to influence and limit discourse along some dimension, within Twitter.

                And yet other appeals go to higher powers, such as governments, which control communication with deeper consequences across broader domains.

                What's legal or illegal evolves with politics and culture. So there's no fundamental purchase there for the kind of moral conversation you were trying to elicit.

                If in a few years the people you accuse of wanting to control other people's speech are able to get some laws passed making the speech they want controlled properly illegal, I'd venture you would resist accepting that as suddenly legitimate - even if those things would be "straight up illegal" at that point.

                I guess I should have been more explicit in the first response - but what I'm suggesting is that this conversation is better had in less absolutist terms than what you proposed. There was the implication that the other person somehow inherently wanted to control communication in a qualitatively different way than you (or I did).

                That's not to go down the path of sophistry, but just to suggest orienting the conversation around where the boundaries should be placed in practical terms, discussing where the differences in boundaries lie, and evaluating that on an issue-by-issue basis, rather than in absolutist/ideological terms.

                "You want to control speech (and I don't)" doesn't really lead anywhere in terms of discourse. It's a dead end.

                • dumpsterdiver 3 years ago

                  > (parent) Right but I don't want some wanker at Twitter or Facebook office decide that.

                  > What's legal or illegal evolves with politics and culture. So there's no fundamental purchase there for the kind of moral conversation you were trying to elicit. If in a few years the people you accuse of wanting to control other people's speech are able to get some laws passed making the speech they want controlled properly illegal, I'd venture you would resist accepting that as suddenly legitimate - even if those things would be "straight up illegal" at that point.

                  The parent commenter was simply saying that the abhorrent examples you provided were not a point of contention, and not relevant to a conversation about free speech. They weren't attempting to put the word of the law onto a pedestal as you seem to suggest, they were simply pointing out that the things you mentioned are universally considered bad all around the planet, whereas freedom of speech is not.

                  The situation you've outlined in which one political party successfully silences their opponents through legal means is exactly why a "Site deciding this or that political view is now bad" is so dangerous.

                  • kannanvijayan 3 years ago

                    Copyright violations are violations of a legal abstraction. Deepfakes are a new form of deception (I didn't imply sexual deepfakes). I used the language of "minors" to explicitly reference those gray legal areas around boundaries and teens sexting each other (which has landed those children in legal quagmires). None of those examples are "abhorrent" so much as "complicated".

                    > "Site deciding this or that political view is now bad" is so dangerous.

                    I'll admit again: I was quite happy when accounts supporting and promoting ISIS were banned from various social networks. I was more active on social media then (less so now), and it made me feel very uncomfortable to be on the same platform being leveraged to the ends of those radicals. They were just using it so _gleefully_.

                    I don't know if I'll be accused of wanting to control what people read, because that's what that is. But I think that falls into the political views most of us agree we don't mind if a social network purges. Violent revolutionary religious politics. It's somewhat natural that they'd have a business interest in maybe not hosting that stuff.

                    So is violent revolutionary religious politics the boundary? Companies should be free to police those political accounts, but less free to police others? Is it a sliding scale or a hard boundary? Who determines the degree of violence? Is implied violence included? If so how do we determine implied violence?

                    I'm not attempting to answer those questions. I'm saying those are the questions that a statement to the effect of "You want to control speech (and I don't)" is trying to avoid.

                    That said, those questions are hard, and maybe arguing about them and answering them actually requires a long time and a lot of consideration and doesn't fit neatly into a hacker news thread.

                    • dumpsterdiver 3 years ago

                      > Copyright violations. Beheading videos. "Pornography" or pseudo-"pornography" involving minors. Direct threats of violence towards individuals or groups. Deepfakes generated without consent. Etc. Etc.

                      > None of those examples are "abhorrent" as much as "complicated".

                      Uhh, if you say so. They are your words after all, not mine.

                      > So is violent revolutionary religious politics the boundary?

                      I think the parent commenter's point was that you were both pretty much on the same page as far as being fine with illegal content being moderated, but the one thing they absolutely were not okay with is when powerful political entities (read: social networks) censor their political opponents in order to consolidate power for their own party.

                      • kannanvijayan 3 years ago

                        > Uhh, if you say so. They are your words after all, not mine

                        Was there a particular reason you ignored the elaboration and re-quoted the original comment? And if there is something you want to express about it please do. This is kind of a meaningless retort.

                        > I think the parent commenter's point was that you were both pretty much on the same page as far as being fine with illegal content

                        I don't remember due process being followed for most of those ISIS accounts that were banned. I don't remember the social networks restricting their bans to people who had been convicted of illegal communications.

                        And yet I supported that move. I understood that the networks would not want that sort of crap on their airwaves, that they might see a business interest in not having their platforms associated with that movement, and that they would want to eliminate the ability of that movement to use their platforms to further its aims.

                        As much as one might want to reach for "illegality" as some clear-cut dividing line, that simply doesn't apply here. Most of us supported accounts getting banned with no due process, no oversight, or anything - when it was violent revolutionary religious politics that we disagreed with.

                        • ilyt 3 years ago

                          >I don't remember a due process being followed for most of those ISIS accounts being banned. I don't remember the social networks restricting their bans to those people that had been convicted of illegal communications.

                          > And yet I supported that move. I understood that the network would not want that sort of crap on its airwaves, and that might see a business interest in not having its platform associated with that movement, and want to eliminate the ability of that movement to use their platforms to further their aims.

                          Again, you're missing the point: blocking proven terrorists is not the same as blocking the politics you don't like.

                          You're trying to excuse blocking benign stuff by desperately trying to group it with stuff most people would agree is probably borderline illegal and harmful.

                        • dumpsterdiver 3 years ago

                          Yes, there is a reason I used your original comment: it was an attempt to hold you accountable for your words after you backtracked. The actions you mentioned in your original remark were more along the lines of abhorrent than questionable, and I wanted to make that clear.

                • ilyt 3 years ago

                  > The distinctions you're bringing up seem kind of perfunctory to me.

                  If you don't see a distinction between posting pedophilia/beheadings and different political views, that's entirely your problem.

                  "We don't allow what the law doesn't allow, and we moderate what is considered explicit by default, but you can opt out of that" is, I think, an entirely fine line for a platform to stand on. Anything beyond that is them fucking around with public opinion for their own benefit.

                  >The wanker at the twitter office that was presumably appointed by twitter management to do that, yes? And some people want to appeal to that particular seat of power to influence and limit discourse along some dimension, within twitter.

                  > And yet other appeals to higher powers, such as governments - control communication with deeper consequences across broader domains.

                  Government is supposed to work toward the interests of its people, not corporations; that's the difference here. And, well, the government is the only entity that can tell a corporation to behave. You can also just not vote the wankers into office.

                  Government (in first-world countries) will only tell you to stop once you actually start to incite violence, not when you post some wrongthink on social media (except the UK, I guess...).

                  Are there governments worse than corporations? Sure, but corpos can just not participate in their markets. They will, for the money, and that's all there is to say about corporate goals.

                  > What's legal or illegal evolves with politics and culture. So there's no fundamental purchase there for the kind of moral conversation you were trying to elicit.

                  Sure, slavery was legal at some point. Doesn't really matter. That's not the point.

                  The point here, which you are desperately trying to miss, is that a corporation should not have the power to manipulate public opinion at will, because the only goal of a corporation is to earn more money.

                  And you can tell the government to change the laws; that's how we got slavery banned, women the right to vote, and minorities no longer repressed. You can't do that to a corporation. Of course, you need the government to act for the benefit of its people, which the US has a hard time doing (the stuff your "lobbyists" do would get them arrested in the EU...), but even that skewed system is still better than Zuck or Elon or some other clown deciding what people should or shouldn't see.

              • fzeroracer 3 years ago

                The problem with your ban/blacklist option is that it first requires users to be exposed to that content in order to personally ban it. Either you yourself will have to watch the beheading video in order to know to ban it, or another group of users will need to tag it as 'disturbing' and you will need to ban 'disturbing' content.

                Most people do not like casually being exposed to such content, and so any sort of soft policy against it results in users self-selecting out of the social ecosystem. It's no longer just your decision, but one that affects the entire platform. And soon the entire platform becomes dedicated to that disturbing content because everyone else has left.

                You could go to Parler or Gab or whatever site right now to see the results of your experiment in action and why user self-moderation leads to the destruction of the site. This is why users offload that mental stress to the owners of the site, who then hire people to manually filter the cruft themselves (often to the detriment of the people doing said filtering, but that's a whole 'nother thing).

                • ilyt 3 years ago

                  > The problem with your ban/blacklist option is that they first require users to be exposed to that content in order to personally ban it.

                  If you had actually read my comment properly, you'd have noticed that I explicitly said there should be a default ban list blocking that.

                  > Either you yourself will have to watch the beheading video in order to know to ban it, or another group of users will need to tag it as 'disturbing' and you will need to ban 'disturbing' content'.

                  That's an orthogonal problem; if you wanted to get rid of that entirely you'd have to pre-screen every piece of content, and that's just not feasible. None of the existing platforms do it either, aside from some automated keyword filters.

                  Nor is it an actual problem if you don't subscribe to people who post that; a random nobody deciding to post it won't be featured in any feeds anyway, because of how the algorithms work, so there is very little chance it will land in someone's feed by accident.

                  Of course, someone could play the long con and get popular only to start publishing disturbing stuff, but none of the current systems stop that, and I'm not sure you even could.
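As a minimal sketch of the default-ban-list model described in this subthread (a platform-wide default blocklist that individual users can extend or opt back out of), here is one possible shape it could take. All names, tags, and the tag-based filtering scheme are hypothetical illustrations, not any platform's actual mechanism:

```python
# Hypothetical sketch: a platform-wide default ban list combined with
# per-user additions (extra banned tags) and removals (opt-ins).
DEFAULT_BAN_LIST = {"illegal", "gore"}  # hypothetical default tags

def visible_posts(posts, user_banned=None, user_allowed=None):
    """Return posts whose tags survive the combined ban list.

    posts: iterable of dicts, each with a 'tags' set.
    user_banned: extra tags this user blocks on top of the default.
    user_allowed: default-banned tags this user has opted back into.
    """
    banned = (DEFAULT_BAN_LIST | set(user_banned or ())) - set(user_allowed or ())
    # A post is visible only if none of its tags intersect the ban set.
    return [p for p in posts if not (p["tags"] & banned)]

posts = [
    {"id": 1, "tags": {"cats"}},
    {"id": 2, "tags": {"gore"}},
    {"id": 3, "tags": {"politics"}},
]

# By default, post 2 is filtered out by the platform-wide list.
print([p["id"] for p in visible_posts(posts)])  # [1, 3]
# A user who additionally bans 'politics' sees only post 1.
print([p["id"] for p in visible_posts(posts, user_banned={"politics"})])  # [1]
```

The point of contention above reduces to who controls which set: here the platform controls only `DEFAULT_BAN_LIST`, and everything else is per-user, which still concedes that someone at the platform decides the default.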

                  • fzeroracer 3 years ago

                    Your comment then goes against what you originally said, which is that you 'don't want some wanker at Twitter or Facebook office' to decide that. You're conceding that you do want them to decide that SOME content is blocked by default, at which point this becomes a matter of degrees. Some people would disagree with your stance and think that it still violates free speech, hence Parler/Gab etc. Plenty of disturbing content is not illegal, which is where the gray area of moderation comes into play.

                    Even if you don't subscribe to people who post that, there's still the issue of network effects. You subscribe to someone who subscribes to someone who subscribes to someone who posts a beheading video that gains popularity, and so the algorithm prioritizes that traffic because it's within your graph. You can easily notice these effects on Twitter if you pay close attention to what it decides to show you.

        • nostromo 3 years ago

          This is an age-old panic that happens with each new technology.

          The printing press was seen as dangerous. Then the phone, then TV, and then the internet. And now it's social media.

          Each time the arguments are the same. People worry that the "wrong people" will have the ability to share ideas to a broad audience.

          And yet each time we've shown that more speech is correlated with an expansion of civil rights and liberal governance.

          • dsfyu404ed 3 years ago

            You can credit the printing press and the increase in information accessible to the literate classes with the Enlightenment, broader literacy, modern banking, and a whole lot of other good things, but the backdrop of all these developments was 100-300 years (depending on how you count) of religious and succession wars, as power was taken from the Catholic Church and the rulers closest to it and redistributed to a broader set of local rulers.

            It was definitely a step forward for humanity but it wasn't all unicorns and rainbows.

          • toofy 3 years ago

            > Then the phone, then TV…

            doesn’t the FCC heavily limit broadcast television and radio, both who can broadcast and what behaviors are tolerated? like, very heavily, no?

            and aren’t there limitations on what you can and can’t do with a telephone?

            • Karrot_Kream 3 years ago

              You might be thinking of the Hays Code, which was voluntary, or the Fairness Doctrine, which was repealed by the FCC in 1987.

      • ska 3 years ago

        > Oh come now. I constantly hear

        Path dependence, it's a funny thing.

    • wallfacer120 3 years ago

      If you haven't noticed the chorus of coastal media types asking for people they disagree with to be prevented from speaking in any public forum, then you are truly oblivious.

      • shadowgovt 3 years ago

        I've seen them asking for private forums to stop giving such folk a platform, because an organization can choose to do what it will with its private property.

        But public forums? No, I can't say I have. In fact, I remember witnessing a President giving a large speech in a public forum that was followed by a riot at the Capitol (whether said speech was causal to the riot is still a matter for the courts). Nobody challenged his right to give the speech until the question of whether it was "incitement to riot" came up, and that challenge is unrelated.

        • ilyt 3 years ago

          Define "public forum". Because both Facebook and Twitter have way more range than most of what would be defined as a "public forum" does.

          • shadowgovt 3 years ago

            Range doesn't make something a public forum. Generally, control of the space by the public (or their delegates) does.

            A letter to the editor of the New York Times has the potential to be seen by 343,000 households (a number down from 1.3 million). This never made the New York Times' letters page a public forum.

            Twitter, for example, was at least previously accountable to shareholders. Now, it's very much a private forum.

            • ilyt 3 years ago

              And the New York Times has competition in that space.

              There is no meaningful competition to what Twitter or Facebook does. If you stifle voices on those platforms, the majority of the country won't hear them.

              Them not technically being public is the problem in itself; they function like a public forum for opinions, but one that is manipulated by a corporation.

              • shadowgovt 3 years ago

                Twitter is competition to Facebook, Facebook to Twitter, and as we're observing thanks to Twitter coming under new management, Mastodon is competition to Twitter also.

                > If you stifle voices on that platform majority of the country won't hear them

                This is technically correct (the best kind of correct) but doesn't imply either a legal or moral responsibility on Twitter's part. Freedom of the press has never meant one is obliged to let others use one's press; in fact, it means the opposite.

                If people want a channel more unbiased than Twitter [1], they should build it. The Musk purchase has shown how fragile the bird really is when the users lose confidence in it.

                You note that them not technically being public is a problem. I think you raise an interesting question: what would a technically public forum look like and who would build it? If the government built it, do we imagine it would run the Post story, or would those in charge of it squelch it because the story is harmful to the active regime? It is, perhaps, best for the government to be out of the business of editorial work; leave that to the press.

                (... and if one wants a government-subsidized communication channel that isn't edited by someone else: that's the Internet and we already have it. Then one's goal to combat Twitter becomes building a service as easy-to-use as Twitter that everyone can use; make the press itself as free as possible. I recommend supporting the Mastodon project if that's one's goal. Or put effort into making one of the many, many blogging software packages of the world ever more turnkey. The way to maximize freedom of the press is to maximize presses, not constrain how the freedom can be used).

                [1] long experience suggests to me what you actually get when you try this experiment is a channel with different biases, but by all means.

      • jshen 3 years ago

        Or maybe we’re all in filter bubbles, including you.

      • ska 3 years ago

        > "chorus of coastal media types"

        KIIS-FM has a choir?

  • Animats 3 years ago

    > I dunno. Mostly I don't want to see ads. I liked Facebook when it just told me what my friends were doing. I don't want a "news feed" from a social network.

    But the news media suffer from a huge punditry-to-hard-facts ratio problem.

    Today's first-screen news stories on major outlets that didn't begin as a press release or punditry:

    * Fox: "Remains of missing toddler Quinton Simon found in landfill, mother charged"

    * Washington Post: "Covid deaths skew older, reviving questions about ‘acceptable loss’"

    * New York Times: "Antiwar Activists Who Flee Russia Find Detention, Not Freedom, in the U.S." "Exhumed Grave Near Kherson Shows Occupation’s Brutality"

    * New York Post: nothing.

    * Reuters: "Hawaii's Mauna Loa volcano erupts for first time in nearly 40 years".

    * The Times (of London): nothing, because they have such large ads and banners that the content is hidden.

    * South China Morning Post: nothing, again mostly because of oversized banners.

    * Le Monde: "The unflinching gaze of Ukrainian drones in Bakhmut"

    * CNN: multiple stories, because the first screen has many headline links and small banners.

    Some of this is optimizing the use of screen space for revenue, not information.

    • jrmg 3 years ago

      > I liked Facebook when it just told me what my friends were doing. I don't want a "news feed" from a social network.

      I have the completely un-researched and un-backed opinion that Twitter changing its prompt from "What are you doing?" to "What's happening?" in 2009 has had an unappreciated, oversized negative effect on the world.

      • cxr 3 years ago

        I hold an opinion that can be considered somewhat opposite[1].

        Similarly, a commenter from Reddit[2] writes:

        > I haven’t personally met anyone active on Twitter for years. There are very specific types of people that want to be sharing things in that way and from my experience they are very narcissistic.

        > Not that Reddit doesn’t have their share of assholes, but I find that because it is anonymous it is also more tolerable. At least it was the case until a few years ago, now Reddit became very popular and the quality of the posts has declined immensely in almost all subs.

        > Twitter is just a bunch of people self promoting themselves as more intelligent, more informed, more socially and environmentally conscious, etc. Basically IG for unattractive people.

        Already implicit in the "What are you doing?" prompt is the hallmark of modern social media that focuses on positioning individual personality and punditry over common, for-the-greater-good, "kitchen table" discussion in service to a given topic—the focus is on you, the contributor, rather than your taking a backseat to whatever it is that you're supposed to be contributing to. I'm not sure that this can be rolled back at this point, though.

        1. <https://news.ycombinator.com/item?id=31180315>

        2. <https://old.reddit.com/r/samharris/comments/z59ccj/annaka_on...>

      • rchaud 3 years ago

        That change in language was likely prompted by a boost in Twitter's news-media relevance after the Iran protests against then-President Ahmadinejad in the summer of 2009.

        Prior to that, I had not heard of Twitter. I was on Facebook and Digg and that was about it.

  • 98codes 3 years ago

    For me, it was never about what I did see, it was always about what I didn't see. Want to show me extra stuff that you think matches? OK, but FFS let me see the stuff that I have purposefully, manually subscribed to/followed.

  • k__ 3 years ago

    That's true.

    My timeline is pretty chill.

    But when I look what's trending in my country, Twitter is a garbage fire.

  • 0x457 3 years ago

    Hmm, I see a lot of trash that I don't want to see "suggested" by Twitter in my feed all the time. The thing is... I don't care that others see it, like it, retweet it. It's just that, personally, I don't want to see content from accounts I'm not subscribed to - I don't care how many of the accounts I follow follow those accounts. Something like 2/3 of my feed is not from accounts I follow: promoted tweets and suggestions.

    That's why I quit. I still log in from time to time to keep my handle alive and check out some crazies...

  • P5fRxh5kUvp2th 3 years ago

    ^ and this sentiment is repeated by everyone who thinks they're smarter than everyone else.

    So now where are we?

  • BizarroLand 3 years ago

    Isn't it odd that we are both always perfect and always getting better with time?

    I don't know anyone who consistently looks at their own thoughts and opinions from a self-critical eye with the goal of probing their mental models for flaws in the structure for the express purpose of fixing their logical or factual errors, but surely we do gradually over time.

    That is probably a very difficult thing to do all at once, although I suppose it does happen little by little. Surely, no matter how rarely or in what brilliant glimpses, time chisels away our ignorance, so long as our ignorance is more pliable than the steel of the chisel of our self-education.

    I doubt many adults feel as though they have suddenly sprung fully formed from the womb of childhood, fearfully and wonderfully made in the image of perfection, and yet most of us feel that as of right now, our small flaws aside, we are as close to perfection as we have ever been or possibly ever will be.

  • ilyt 3 years ago

    Eh, I think many of them are fully aware that it is the platform that pushes views that agree with them, hence any notion of removing "moderation" seems like helping "the enemy" get a foothold.

  • jrmg 3 years ago

    I don't think that's correct. I think most people are complaining about what they see, and how they feel when using Twitter.

    I'm basing my opinion on:

    - The experience of seeing people move to Mastodon and comment on how much more pleasant they find it (I do think this might be fleeting - more because they're seeing only early adopters than because there's intrinsic 'bitterness' missing on Mastodon, at least compared to 'old Twitter').

    - My own experience of Twitter. It just makes me anxious and angry now. Every time. Yet I keep going back! It's transformed from a pleasant place to converse and browse to somewhere where everyone is angry at everyone else all the time. I'm fairly liberal in my opinions, so usually people I see are angry at either right wing folks or other left wing folks being liberal in slightly different ways - but I do also see the retweets of angry right wing tweets with the "hey, isn't this opinion SO WRONG" opinion. This is with the 'chronological' feed. With the 'algorithmic' feed, it's _even worse_: I now get more angry people I don't know on both sides of the spectrum, and I don't see the non-angry normal conversations or life updates from people I do follow (presumably because they're low-engagement).

    - I _do_ think about how this affects others, you're right - and I imagine that those with more right wing opinions are in much the same state as me, but more 'opposite', and _even more angry_ because right wing accounts seem to be more likely to be aggressive than left wing ones. So, yes, I am concerned about what others see - but only in as much as I'm concerned about what everyone sees and how the current political climate is descending into an angry selfish madness where no-one is prepared to actually listen to anyone else. I do think this angry selfish madness is more prevalent on the right, especially in mainstream politics.

    Maybe my experience is not the majority one, but I think you should take it into account.

    • toofy 3 years ago

      > My own experience of Twitter... it’s transformed from a pleasant place…

      yep. this is what many of these people just continually fail to understand.

      it’s just not fun being around abusive, abrasive, anti-social people constantly.

      it’s not complicated. it’s not scandalous. it’s not shocking. people don’t enjoy being around anti-social, rude people.

      when they’re given an alternative that promises less shitty behavior (the current iteration seems to be mastodon), people go there, and this drives a certain group of people crazy.

      what we should be asking is why it bothers them so strongly when some people say “i’m personally going to move away from that rude abrasive person over there.”

      why are they so insistent that we spend our free time around abrasive assholes? they’re adamant about this. why?

  • justlikeplace 3 years ago

    Sounds just like Hacker News.

  • bitwize 3 years ago

    Read Marcuse's "Repressive Tolerance" to see why this is the case:

    https://www.marcuse.org/herbert/publications/1960s/1965-repr...

    The tl;dr is that if you actually want to move toward a society of free and equal human beings and eliminate oppression, "free speech for all" is not how you go about it. One side of the political spectrum supports equality and emancipation; the other is opposed to it. Therefore the speech of one side must be tolerated and supported; the speech of the other must be squelched and restricted.

    The internet has had its time to experiment with free speech for all, and now we have Nazis. Social media companies saw greater value in muzzling the Nazis than they did in hewing to outdated libertarian internet values.

    • petermcneeley 3 years ago

      > Therefore the speech of one side must be tolerated and supported; the speech of the other must be squelched and restricted.

      This sounds like a distinction between say a friend and an enemy. Sounds like a political concept I have heard before.

    • hutzlibu 3 years ago

      "of free and equal human beings"

      The problem starts with how you define those two (or any political expression). Because by many definitions, they are contradictory.

      Equal chances? Money gives people advantages, so all should have the same money? Even if all would have the same money, some are born smart and some are born dumb. Genetic equalisation?

      Pretty much against the idea of "freedom" like I know it.

      When people are free, some will party all day, some will work like a dog.

      So some will get rich. Some will get by, some will starve.

      Unless you mean strictly "equal before the law", which we supposedly have, but we all know that the one with the more expensive lawyer (or any lawyer at all) has a big edge. So de facto we are not even equal before the law.

      The big question is, how and what would you change about it?

      I would actually start with free speech, because when people cannot speak freely, they will start speaking in code. And this will just make political discussions even harder, as it will blur the definitions even more. I think this is mainly what we have on the internet: lots of talk and lots of people thinking they are right, but only a little actual communication and addressing of the core problems, because people are mostly talking about different things.

    • sien 3 years ago

      And read Marcuse for gems like this :

      "They would include the withdrawal of toleration of speech and assembly from groups and movements that promote aggressive policies, armament, chauvinism, discrimination on the grounds of race and religion, or that oppose the extension of public services, social security, medical care, etc."

      That is, if you oppose the expansion of government you should have your right to free speech removed.

      From : https://en.wikipedia.org/wiki/Herbert_Marcuse

      Which shows the problem with the idea of 'we must remove free speech for hate speech': one of the originators of the idea intended that free speech be withdrawn from people he disagreed with, in ways that many or even most people in most democratic states would find abhorrent.

    • kneebonian 3 years ago

      > The internet has had its time to experiment with free speech for all, and now we have Nazis.

      Interestingly enough, did you know that the original Nazis came about before the internet ever existed, way back in the 1930s? Further, did you know that Nazis existed prior to the internet as well? It is true; in fact, one may argue that there is no real correlation between the existence of Nazis and the internet, just their visibility.

      But in a less sarcastic vein, I can only speak from my personal experience, but from what I see, the more we try to suppress speech that some do not like, whether by labeling it hate speech, misinformation, or whatever else, the more we produce the exact opposite of the intended result: more people are willing to tolerate actual Nazis, question official information more readily, and tolerate real racism more, because the label became so broadly applied that it lost any meaningful effect.

      Finally, I also heard an interesting theory that part of the reason one side of the political spectrum is currently struggling so much is that the other side had even its moderate views censored from Twitter, whereas the first side was treated much more leniently. The result was that only the most reasonable and centrist voices of the censored side were shared, while the lenient side had many of its most radical and vocal members airing their views for all to see, which was a turn-off to many; the most extremist content on the censored side was hidden and squelched.

    • nostromo 3 years ago

      This is the exact logic used by Joseph McCarthy, and I think it's safe to say that was a failed experiment.

      > Therefore the speech of one side must be tolerated and supported; the speech of the other must be squelched and restricted.

      This is actually how you get Nazis. You know, the real ones that existed before the internet. Not 4chan weebs trolling boomers online, but the ones that would jail or kill you for having the wrong opinion or wrong affiliation.

      • ska 3 years ago

        That seems ahistorical. The way you get real Nazis is by not acting soon enough when they target/vilify/'other' marginalized groups, or attempt to get a foothold in political institutions.

        • bitwize 3 years ago

          It is ahistorical. Germany outlawed Nazi ideology and went 80 years without renazifying; the danger of such was very real after WWII. Marcuse, a German Jew, was considering the real, immediate, terrifying problem of how to prevent another Holocaust.

          • nostromo 3 years ago

            You're cherry picking.

            Look at Joseph McCarthy and see a counter-example. He used the exact same logic: to prevent a Soviet-style totalitarian government in the US, we had to be intolerant of everyone left of mainstream Democrats.

            It destroyed many people's lives, most of whom were not at all sympathetic to Soviet-style communism.

            Similar to your example, we could also argue that McCarthy was indeed effective at preventing Soviet-style communism in America (since the US still has free speech and there has not been another holocaust). But in both cases we don't have convincing evidence that suppressing speech was actually helpful.

            • woooooo 3 years ago

              Worth adding that McCarthy knew exactly what he was doing and didn't care. It raised his profile. (Source: Master of the Senate, Caro)

              Those who validate dirty pool in pursuit of the greater good will always be enabling people like that.

            • pixl97 3 years ago

              Being moderate is a delicate balance when foes seek power in whatever way possible. With that said, I get quite concerned when someone says it's their freedom to plot the end of my existence.

          • int_19h 3 years ago

            Germany still has plenty of open Nazis for all their efforts at censorship. Something like the infamous march in Charlottesville was breaking news when it happened precisely because it's so unusual, but much larger neo-Nazi marches happen in Germany on a regular basis.

    • trillic 3 years ago

      A bit of a paradox, but a tolerant society does not tolerate intolerance.

      • HPsquared 3 years ago

        It's a logical contradiction. There seems to be some overloading of the word "tolerance".

        • googlryas 3 years ago

          It's not; it's just a term in common usage taken to the extreme in an attempt to be clever, in contravention of all conversational norms. I don't think there is any definition of tolerance that is mathematically sound and absolute.

          It would be like saying "a peaceful person would never punch someone in the nose". But what if you needed to punch someone in the nose because they were trying to kill you for no reason? That doesn't mean peaceful people don't exist, or that peaceful people need to let random people murder them. It just means that the determination of peacefulness is contextual.

        • petermcneeley 3 years ago

          Yes such overloading leads to paradoxes. My favorite https://en.wikipedia.org/wiki/Unexpected_hanging_paradox

          • drdeca 3 years ago

            Huh, does the unexpected hanging paradox come from such an overloading? It isn't obvious to me how, though I don't mean to suggest it seems unlikely; it seems quite plausible. I just don't see in what way. Could you elaborate on how it comes from an overloading?

        • SideburnsOfDoom 3 years ago

          FYI, I think you missed that this idea that "a tolerant society does not tolerate intolerance" is better known as Karl Popper's "paradox of tolerance". You can find it under that name in numerous articles, from Wikipedia on down [1]. Since 1945 it has been widely known and accepted. It's not an actual contradiction; most "paradoxes" only appear that way on a superficial inspection.

          It would be like saying, as the sibling comment did, "a peaceful person would never punch someone in the nose"... except that a society that aims for peace, and is beset with violent people, is going to have to quell that violence. This might involve reserving the right to punch the punchers.

          https://en.wikipedia.org/wiki/Paradox_of_tolerance

          • Andrew_nenakhov 3 years ago

            You are conflating protection from violence with protection of free speech. The paradox of tolerance is a hypocritical concept used by people to excuse their own intolerance.

            Restrictions on free speech are much more dangerous than any hate speech. You won't get a new Hitler by blocking speech you don't like. It's the opposite: a new Hitler will start by blocking free speech. That's what every authoritarian regime does first.

            • SideburnsOfDoom 3 years ago

              > You are conflating protection from violence with protection of free speech.

              No, I am saying that they are "like" each other. It's a comparison not a conflation. Mr Popper is absolutely not conflating anything - his original formulation is very much about speech.

              > blocking free speech. That's what every authoritarian regime does first.

              You mean, after they become the regime and get a lock on the power to block free speech; which in turn is after using free speech to emit the divisive populist rhetoric that propels them to that power? There has never been a regime that blocked free speech before it was voted into power. At that stage they're only too happy to use it.

              So, not first at all then. That was Popper's point.

              • Andrew_nenakhov 3 years ago

                > There is no regime ever, that blocks free speech before it is voted into power.

                But there is, we've seen it just recently! Take the US presidential election in 2020: big media companies and big tech all favoured one political party and successfully suppressed crucial damaging information against their side, baselessly labelling it 'fake news', which helped the favoured party take power. It immediately proceeded to shut down the media accounts of the defeated opponent.

                The ongoing hate campaign against Musk that started after he bought Twitter can absolutely be interpreted as a reaction to his breaking the monopoly on the flow of information.

      • Andrew_nenakhov 3 years ago

        The "paradox of tolerance" is hypocrisy. If you are intolerant, even if only toward the intolerant, you are not tolerant; it is as simple as that.

        • SideburnsOfDoom 3 years ago

          There's absolutely nothing hypocritical about, e.g., "defending peace". It's realistic. Between Karl Popper's well-known writings on the subject and one accusatory sentence from someone on the internet, I'll take Karl Popper.

          • Andrew_nenakhov 3 years ago

            For your information, you have just used a fallacy called "argument from authority", a popular demagogic technique.

            Just because someone wrote at length that white is black, white does not become black. Likewise, intolerance does not become tolerance.

            You do not tolerate something => you are not tolerant.

            You oppose free speech only for really horrible people => you oppose free speech.

            It's really binary, and whatever Popper wrote, it does not change this simple truth. But, of course, you can hypocritically pretend that it does.

            • tpm 3 years ago

              You probably should read Popper before dismissing his ideas. If you don't want to, it's fine, but in that case it's hard to take your dismissal seriously.

              Like more or less anything real, this is not a simple binary problem.

              • MrPatan 3 years ago

                "...for it may easily turn out that they are not prepared to meet us on the level of rational argument, but begin by denouncing all argument; they may forbid their followers to listen to rational argument..."

                That's Popper.

                Does that sound like "we can't let people speak"?

                Or would that actually be more in the lines of "we should expose their followers to the arguments their leaders don't want them to see"?

                • XorNot 3 years ago

                  People who don't fear physical violence make this argument all the time. They don't know why it's wrong, because again, they do not fear violence. Speech is just things on the internet, that aren't actually real.

                  They don't get doxxed, they can't be identified in a crowd, they can blend with whatever the majority is.

                  It is very easy to defend "free speech absolutism" when you're not the target of hate speech. When you're not the target of harassment. When no one is declaring that your rights should be removed, that violence against you or one of your group memberships should be acceptable.

                  "We just need to have good arguments" is something said by a person whose participation in the discourse is entirely voluntary, and whose stakes aren't "I have a right to make decisions about my own body", "I have a right to live unharassed in my private life", "I am equally entitled to life, liberty and the pursuit of happiness".

                  "Free speech absolutism" is something only ever advocated for by people who don't have to care about what that speech is advocating. Who have the privilege to turn a blind eye to stochastic terrorism, and will argue out one side of their mouth that "police have no duty to protect you from crime before it happens" while arguing out the other "well a few bad actors should be dealt with by the police".

                  Meanwhile, in the real world - bomb threats to children's hospitals[1].

                  [1] https://www.theguardian.com/us-news/2022/aug/31/boston-child...

                  • mardifoufs 3 years ago

                    This is a complete strawman. I'm Muslim, born in a Muslim country, and living in the West, yet I absolutely agree with the person you are replying to. Muslims are targeted by hate speech constantly, and there was a period in time (like, 3-4 years ago) when that was particularly intense. Much more so than almost any other minority, and in a much more concrete way (Christchurch, travel bans, secret prisons, etc.)

                    Yet, I still would agree with a free speech absolutist much more so than with someone who thinks they know best about what speech is ok or not. And that's precisely because I'm a minority. I know that having a gatekeeper that decides what is or what isn't hate speech is never a good idea in the long run except if you belong to the majority who ultimately gets to decide what speech is ok or not.

                    It is also quite ironic to use "stochastic terrorism" (which is a dangerous term in and by itself) as a justification for your argument. It's a complete regurgitation of early 2000s talking points, especially to imply that those who disagree with you are turning a blind eye to some form of terrorism. In a way, it proves my point about being suspicious of any attempt to restrict free speech. Because it's the same exact tactic that was used to prop up islamophobia and justify human rights violations (I mean, who would want to turn a blind eye to terrorism, right?!).

                    • XorNot 3 years ago

                      This is just arguing via slippery-slope fallacy. You're pretending that action against one type of speech would implicitly allow action against others without any due consideration or decision making process.

                      What were you prevented from saying "because it was hate speech" with no extra deliberation? It's almost like the content matters.

                      • mardifoufs 3 years ago

                        It's not a slippery slope when it literally happened to Muslims not even a decade ago. What are you talking about? You are ignoring most of my comment. I'm not pretending anything; you literally used the same terrorism scare that was deployed against Muslims not even a decade ago. You are the one arguing for a slippery slope, where if we allowed more free speech we would end up with terrorism and deaths. Your entire premise is based on a slippery slope between free speech and "stochastic terrorism".

                        Content does matter, in an ideal world. But in reality that type of "consideration" will be weaponised against minorities. You are arguing against yourself when you say that we just have to have "due consideration and a good decision making process" to decide what would be allowed. When said process will inherently be controlled and steered by the majority, in any democratic country. It's the type of argument that usually comes from privileged white people who have never experienced what "due consideration" means. It's a completely ridiculous premise, because it does not fit reality. Not the reality of most minorities at least.

                        • XorNot 3 years ago

                          > It's not slippery slope when it literally happened to muslims not even a decade ago

                          You keep saying this, and I am asking you: what were you prevented from saying? What Muslim voices in the West were silenced unfairly?

                          You insist it happened, but in response to what? What were you not allowed to say that was so unfair? And no, being criticized for saying it doesn't count.

              • Andrew_nenakhov 3 years ago

                I've read him, thank you, quite a few years ago. Wasn't impressed.

                Some of the people here who take free speech for granted do not know its value. I live in a very authoritarian country, one that once had free speech, and I have seen how restrictions on it creep in. And what I see happening in the US and Europe follows a far too familiar path.

                • tpm 3 years ago

                  I lived in an authoritarian country and think that you are now committing a slippery-slope fallacy yourself. I'm also very sorry about what is happening to your country, but the material (sources and distribution of wealth and power) and cultural circumstances of European countries are so different that the patterns you are seeing might not apply here.

          • marcosdumay 3 years ago

            No, it's hypocritical. You defend the Rule of Law, or Democracy, or some people's self determination.

            But if you start framing things as "defending peace (by force)", you may quite possibly have your arguments taken over by the people attacking whatever you want to defend. Because they can use the argument just as well as you.

          • P5fRxh5kUvp2th 3 years ago

            It's still hypocrisy, but so what.

            People say it like it's a damnation of the idea. Things in life are fuzzy, that's why it's literally called a paradox.

            The most tolerant people in the world are not going to be OK with a 20-year-old having a sexual relationship with their 12-year-old daughter, because of the danger involved. That's the paradox of intolerance: everyone has a limit, and if you have no limit, someone else will pick it for you and cause damage on their terms.

            We don't generally tolerate violence in the west. Yes, it's an intolerance, but it's for the greater stability of society.

            But I have absolutely seen people use the paradox of tolerance to argue for some shitty opinions, so it must still be handled with care. You can imagine a KKK member using it to defend their choice not to allow blacks to be free of lynching.

            • Andrew_nenakhov 3 years ago

              It's ok to be intolerant at some things. What's not ok is to be intolerant and still claim to be tolerant. That's hypocrisy.

              Also, do not equate actions and speech - and we're mostly arguing about free speech here, right?

              • P5fRxh5kUvp2th 3 years ago

                When the only way for you to be "right" is to argue the absolutist version of the definition of a word (tolerant in this case), you've lost.

                Furthermore, such black and white thinking is often a sign of mental illness.

        • bryanlarsen 3 years ago

          You do realize that both "paradox" and "hypocrisy" have the same basic definition, don't you? They both mean "self-contradictory", the biggest difference is in connotation, not meaning.

          • Andrew_nenakhov 3 years ago

            No. These words do not have the same basic definition.

            Hypocrisy: The practice of professing beliefs, feelings, or virtues that one does not hold or possess; falseness. [0]

            Paradox: A statement that seems to contradict itself but may nonetheless be true. [1]

            Every time I see someone here mention the paradox of tolerance, it is to support restricting speech for someone else (someone really bad, of course - fascist, antivaxer, racist, you name it), and these people always hypocritically consider that they support free speech, just not for those bad people.

            [0]: https://www.wordnik.com/words/hypocrisy

            [1]: https://www.wordnik.com/words/paradox

            • bryanlarsen 3 years ago

              I'm not sure how you can accuse people of hypocrisy for being intolerant of the intolerant when they just told you they're being intolerant of the intolerant.

              • Andrew_nenakhov 3 years ago

                Because they consider themselves tolerant, but with an escape clause, and fail to admit that this clause makes them intolerant too.

  • SideburnsOfDoom 3 years ago

    > Most Twitter users, as best I can tell, aren't actually complaining about what they see. They are complaining about what others see.

    I don't see any evidence for that

    > In their view, they have the "correct opinions" ... the people that disagree with them are the ones being duped

    This is getting very straw-man

    > That's why there's a huge censorship movement right now in the US

    Oh really, is there?

lcnPylGDnU4H9OF 3 years ago

I read this as a problem with how the term "algorithm" is used by / when communicating to the layperson (basically, the general problem of "buzzwords"). It's just being simplified; possibly overmuch but that's not relevant to the layperson's complaints. I think it's fair to say that the layperson understands that they don't want an algorithm which manipulates them to spend more time "engaged" than they otherwise would.

I do think there is value to algorithms which incidentally increase engagement as argued by the article (e.g. capturing and isolating email spam), but there is understandable push-back against an algorithm that is, additionally or not, being optimized for pupils pointed at screen.

KaiserPro 3 years ago

Instagram, youtube & tiktok are a reflection of what you click on.

But, to make them nice, you need to give them both negative and positive feedback at the very early stage.

If you see something that you don't like, don't ignore it: hit the "..." and select the "get tae fuk" option. Otherwise it'll keep trying that genre of whatever is popular with a similar persona to yours.

The problem is, the "I don't like this" button is often hidden. (in youtube I'm not sure how well weighted the downvote button is).

For Mastodon it's hard: whilst the data is out there for most people to grab, the models need to be stored and trained somewhere, and your model is a PII risk. Also, there isn't much info out there on how to make a good recommendation engine (for obvious reasons). Unlike object detection, text-to-speech, OCR and other freely available models, recommendation models are as valuable as the dataset, which means they are rarely made public.

  • floxy 3 years ago

    >Youtube...Don't ignore it, hit the "..." and select the "get tae fuk" option.

    Is there a way to downvote something on the Youtube "home" page? Before you would click on it to play it? There is a vertical ellipsis, but the only options under that are "Add to Queue" and "Share".

    • themacguffinman 3 years ago

      I'd guess it's because you're not logged in. There are three options if you're logged in:

      - Not interested

      - Don't recommend channel

      - Report

      • floxy 3 years ago

        >I'd guess it's because you're not logged in.

        Thanks for the informative answer.

        • KaiserPro 3 years ago

          if you are on firefox you can use multi-account-containers to limit the reach of google.

          I have a container for youtube which is logged in with a specific account. Its different from any work account that I have.

          This improves the recommendations 100x, as it means that any videos you view for work, research or away from your main interests don't pollute your recommendations.

eatonphil 3 years ago

Even if Mastodon doesn't want to implement or play with "algorithms" (ignoring as is correctly pointed out that everything is/has an algorithm), all Mastodon content is ActivityPub content, right? So someone can put together a Mastodon alternative that supports pluggable algorithms, no? No need to convince the Mastodon devs who don't want this sort of thing.

  • tensor 3 years ago

    Unfortunately, every time I hear this sort of thing suggested there is a rallying cry from the original Mastodon crowd to block said thing. It's really unfortunate, and it's why I don't see Mastodon as a truly viable alternative to Twitter. Perhaps if a fork that isn't hostile to algorithms and indexing gains enough influence, it would work better.

    • fleddr 3 years ago

      Instance blocking will ultimately create the exact same dynamic as found at Twitter.

      Aggressive online activism, regardless of your political direction, is here to stay. So it can and will be used to pressure other instances into blocking yet other instances.

      • miloignis 3 years ago

        Small and single-user instances don't really fall prey to this - how is one even supposed to know what a single-user instance is blocking if you don't tell them?

        Anyway, even for those that do what you'd have is a stratification into levels of instances with different tolerance levels of blocking. I think this is a very different dynamic than exists on Twitter, can you explain more about what the identical dynamic would be?

        • fleddr 3 years ago

          The dynamic would be that the tolerance level you refer to would be as low as found on Twitter: people perpetually offended at the smallest of disagreements and aggressively silencing, blocking, canceling the supposed offender.

          An example I saw on Mastodon. A mid-sized instance wanted to stop federation with another instance that is well known and large. Why? Because that instance is federating with yet another instance containing a user supposedly affiliated with some right wing movement. Said user had not posted anything yet.

          So we're talking about a 3rd order blockage in this example. Now you might reason, the mod of the large instance is under no obligation to do anything, it's their instance. But that's not how the game is played. You wouldn't want to be called an enabler of fascism, would you? You'll comply or be labeled a monster. Or they could lobby to put your instance on a standard block list that is widely reused.

          When you let this run unchecked, you'll end up with Twitter: bubbles and fighting "the other side".

lifeisstillgood 3 years ago

One other thought is that most algorithms don't need to be very sophisticated.

So DuckDuckGo is (mostly) predicated on the idea that if you type "mens running shoes" into a search box you can confidently sell ads for trainers all day, and possibly other sporty goods. The bet Gabriel Weinberg is making is that the accuracy/profit of selling those ads is not significantly worse than using the text plus a thousand data points about that person's pregnant daughter and bathroom habits. I would love to see some study on that (not sure how - maybe have Google return DuckDuckGo results and see what people buy. Hell, I would bet they have already done this!)

Anyway I suspect that I would get as much out of twitter if I just get what David Attenborough reads on twitter (ie a manual curation from someone interesting and well versed in the world)

Sometimes the best algorithm is a much more educated and experienced human

andrewmcwatters 3 years ago

The shock people experience when you tell them any sorting, especially sorting chronologically, requires an "algorithm" is disappointing.
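To make the point concrete, here's a minimal sketch (hypothetical post structure, not any real platform's API) showing that the "plain" chronological feed is itself just a sorting algorithm:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    text: str
    created_at: datetime

def chronological_feed(posts):
    # The "no algorithm" feed is itself an algorithm:
    # sort by timestamp, newest first.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

posts = [
    Post("alice", "first", datetime(2022, 11, 1)),
    Post("bob", "third", datetime(2022, 11, 3)),
    Post("carol", "second", datetime(2022, 11, 2)),
]
feed = chronological_feed(posts)
print([p.author for p in feed])  # newest first: ['bob', 'carol', 'alice']
```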

  • senko 3 years ago

    It makes perfect sense considering the wider public understands "algorithm" as "big,opaque,unknowable way of doing something probably tilted towards interests of the corporation/government", which is how the word has been used in mass media for years.

    • Pxtl 3 years ago

      Kind of like the word "agenda", which basically just means a "to-do list". But an "agenda" often represents the accusation that you've got a secret nefarious goal.

      Or even "rhetoric" which is just the term for the art of speaking well, but now has been taken to imply flashy and dishonest tricks of speech.

    • 082349872349872 3 years ago

      compare the wider public understanding of "hacker"

  • xsmasher 3 years ago

    When people complain about the "the algorithm" they're not talking about bubble sort. It's shorthand for something - "biased feed algorithms" maybe.

    You may as well tell them tomatoes are actually a fruit.

  • AdamH12113 3 years ago

    "Algorithm" is similar to "cloud" and (much earlier) "online" in that it's acquired a popular/marketing meaning distinct from its original technical meaning. It's an example of an improper noun[1].

    [1] https://siderea.dreamwidth.org/1773806.html

  • ne0flex 3 years ago

    I can't recall where I saw it, but a few years ago I saw an interview that involved some techie. The host asked them about Twitter and said, "What if we get rid of the algorithm and, say, put everything chronologically?" The interviewee responded with "Well, if you get rid of the current algorithm and replace it with that, you'll have an algorithm that sorts in chronological order," to which the interviewer's response was, "No, I'm saying we get rid of the algorithm."

  • yamtaddle 3 years ago

    I think most people don't regard something they could easily (if tediously) do by hand, without ever thinking of a mathematical symbol or operation, as algorithmic, even if it entirely is by the strict definition. (I suppose they do a bit of math if they're sorting by numbers, but "is this number/date higher or lower?" only barely qualifies as math for most people, if at all.)

  • jasonlotito 3 years ago

    They aren't complaining about algorithms. They are complaining about an algorithm that determines what people should see for the sole purpose of increasing some engagement metric. This does not necessarily mean giving the person what they want to see.

  • bawolff 3 years ago

    I mean, it is mostly just that people don't know what the word means. It's so vague you could replace it with "do something" and be correct 99% of the time.

    Sorting requires "doing something", doesn't sound too shocking.

Pxtl 3 years ago

Yeah, this is pretty much what I'm hoping for on Mastodon. On Twitter I would occasionally turn on the "home screen" instead of the "latest" feed to see what's going on in a larger world. Including such a facility (or even multiple alternate implementations of such a facility) would be nice in mastodon.

The other part of the post - the abuse? That part is harder. Fundamentally, commercial social networks spend a crapload of money and traumatize working-class moderators all over the world trying to keep spammers, scammers, illegal pornography, hatespeech, and worse off their platforms. I don't know how sustainable it is for hobbyists to implement that.

Realistically, I'm expecting that if Mastodon truly takes off, it will gradually bifurcate: 1) Free instances that are hives of scum and villainy 2) Paid instances that are well-moderated and pleasant

Fortunately, Mastodon servers have the ability to federate with other servers in a "second class citizen" sort of way, so members of group 2 will be able to follow people on group 1, but the mayhem of free instances will be purely opt-in on a per-user basis.

I mean Mastodon has a lot of problems. Its tech stack is a nightmare to admin, and it is notoriously wastefully chatty when it comes to server-to-server messaging. But those faults can be fixed.

The social problem is much harder.

moron4hire 3 years ago

It seems to me that there is a "solution" that goes completely unspoken in these conversations.

There's the one side that says, "don't do anything opaque, just show me a chrono-timeline, I'll deal with the deluge myself". And another side that says, "you're delusional to think you can deal with a deluge".

But maybe there is a third position: don't get into a deluge in the first place. Maybe the answer is actually that we shouldn't be trying to follow thousands of people on social media. Maybe there is no meaningful way to keep track of that many people and still be able to existentially understand them as, ahem, people anymore.

So far, my experience on Mastodon is bearing this out. I have almost exactly 10% of the followership on Mastodon that I have on Twitter, yet I'm easily having 5x more conversations. The quality of those conversations is very significantly better, but I'll leave that hairy ball of unquantifiability by the wayside for now. So far, "engagement" on Mastodon is 50x better than on Twitter.

nonameiguess 3 years ago

It would be nice if there was a way to talk about this subject without immediately devolving into partisan political comparisons. This is a far more fundamental issue with recommendation systems, a conflict between immediate gratification and addiction mechanisms versus deeper nourishment and satisfaction. You can see this in so many arenas of human consumption.

Consider diet. If you log everything I ever consume and then measure the probability I consume the same thing again in the next 30 minutes, what are you going to find? You'll be recommending I eat nothing but donuts, hard liquor, and potato chips.

Consider video. If you log everything I ever stream and watch and then measure the probability I stream and watch something similar in the next 30 minutes, you'll be recommending nothing but ChiveTV-style fail videos and short-form hot takes.

But are these the things I actually most want? Humans are complex creatures with desires and preferences that don't always express themselves in terms of quick re-consumption of similar items. If you ask me my favorite film, I'm going to say Apocalypse Now, not an epic fail compilation. But if you measure what I'm more likely to watch on repeat for 8 hours, it's going to be the fail compilation. If you ask my preferred foods, I'm going to say lean meats and vegetables, but if you measure what I'm most likely to repeatedly eat for hours without stopping, you're going to find a bunch of dessert foods and snacks.

The types of things people consume that are most nourishing and satisfying, since they actually provide nourishment and satisfaction, are not things that lead to immediate re-consumption of the same thing. The types of things that lead to immediate re-consumption are things that are insubstantial, don't require thought or reflection, don't lead to satiety, and things that are addictive.

I have no idea what sort of solution to this scales and will satisfy people that don't like gatekeepers, because the reality is, I've found what films and television shows and music albums I've most liked from top 100 lists curated by experts, and I've found what foods best nourish me and lead to the long-term health and physique outcomes I'm trying to achieve by the same method, expert recommendations from people well-versed in science.

You can't automate this, but to work, the public has to trust a curator, and it seems most of the public doesn't trust anyone to do this, or if they do, they'd rather trust the Critical Drinker and Liver King instead of the American Film Institute and FDA.

PaulDavisThe1st 3 years ago

Once again I come to sing the praises of browser add-on Tweak New Twitter.

I see just tweets from people I follow, with their retweets available on a separate tab. No trending, no random tweets. Twitter the way I originally imagined it was supposed to be.

No need for any algorithm beyond what this offers, unless you're really keen on discovering random people to follow, and the people you already follow do not retweet. In which case ... you have my sympathy.

fleddr 3 years ago

If one were to be a purist, you'd remove the boost option.

With a boost you want to increase the visibility of a post by somebody else. Probably because you agree with it. Something has to happen with that boost signal, otherwise boosting is pointless. Hence, in some way a boosted or often boosted post is promoted over ones that are not boosted.

As the boosted post gets more eye balls, it will get even more boosts. Not necessarily because it's so awesome, simply because it's the post that is shown more prominently.

This simple snowball effect has viral potential, hence it's just as corruptible as Twitter. It can and will be used to push agendas. Power always consolidates, hence soon you'll have an elitist layer of large accounts boosting each other's posts while the other 95% gets zero traction or engagement.

The other downside of boosts, retweets, quote tweets is that it discourages writing your own original posts. More than 80% of what Twitter calls tweet activity is simply retweets, not new tweets.

Finally, if you want organic social media, people to follow should never ever be automatically recommended. This triggers the exact same snowball effect.
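The snowball can be illustrated with a toy rich-get-richer simulation (purely illustrative, not Mastodon's actual mechanics): each round, one post gets boosted with probability proportional to the boosts it already has.

```python
import random

def simulate_boosts(n_posts=10, rounds=1000, seed=0):
    # Preferential attachment: visibility begets boosts begets visibility.
    random.seed(seed)
    boosts = [1] * n_posts  # every post starts with one boost
    for _ in range(rounds):
        winner = random.choices(range(n_posts), weights=boosts)[0]
        boosts[winner] += 1
    return boosts

result = simulate_boosts()
print(sorted(result, reverse=True))  # a few posts hoard most of the boosts
```

Even with identical starting conditions, early random advantages compound into a heavily skewed distribution.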

  • mhink 3 years ago

    > The other downside of boosts, retweets, quote tweets is that it discourages writing your own original posts. More than 80% of what Twitter calls tweet activity is simply retweets, not new tweets.

    I don't really see a way around this unless you explicitly try to prevent people from posting things that have been posted before. IIRC, retweets were originally just normal tweets that people tacked the letters "RT" in front of. It was an emergent social pattern that got reified into a feature.

    • fleddr 3 years ago

      I'm under no illusion that such a feature will ever be removed, but I stand by the concept/idea that social media should be about people sharing their own thoughts and ideas, rather than this lazy and mindless redistribution.

      • miloignis 3 years ago

        I think it's the opposite - without an "Algorithm" (instead just the simple algorithm of sorting posts by time) there's no natural way to find other interesting people to follow!

        What Mastodon does is replace a computerized algorithm that tries to find you things you would be interested in with a "social algorithm" which is people you follow boosting posts that they like or find interesting.

        It's like getting book recommendations from a friend instead of Amazon! Sharing things that we like with others is a deeply human activity, one I think is generally very positive.

        Anyway, if you don't like what someone's boosting, you can prevent their boosts from showing up on your feed. A great thing about a FOSS, interoperable system is its configurability!

        • fleddr 3 years ago

          "I think it's the opposite - without an "Algorithm" (instead just the simple algorithm of sorting posts by time) there's no natural way to find other interesting people to follow!"

          Sure there is, it's called "effort". And that's the problem. You can scan directories of users, search based on topic, tags, explore chronological timelines of instances. Manually craft your feed like this, organically.

          But I agree that this will not work anymore these days as people are used to algorithms taking care of this.

  • rpdillon 3 years ago

    I haven't read the source, but I was under the impression that boosting is just retweeting. It increases visibility because it is now sourced from another timeline, not because there's a hidden value that's being incremented.

  • shadowgovt 3 years ago

    Boost is a shortcut for "re-toot by copying and pasting all the contents and a link to the original" with some convenience sugar on the timelines.

    If boost were removed, users would re-invent it.

jasonlotito 3 years ago

> Don’t try to deny it, if it wasn’t what you wanted you wouldn’t be doomscrolling so much, would you?

This is the most ignorant and idiotic[1] take I've seen on something like this.

Don’t try to deny it, if it wasn’t what you wanted you wouldn’t be smoking so much, would you?

Don’t try to deny it, if it wasn’t what you wanted you wouldn’t be drinking alcohol so much, would you?

Don’t try to deny it, if it wasn’t what you wanted you wouldn’t be taking oxy so much, would you?

Don’t try to deny it, if it wasn’t what you wanted you wouldn’t be seeing ads so much, would you?

> These ML models know what you want and that’s what they show you.

No, they don't. This is 100% wrong, and coupled with the previous comment, shows a complete lack of knowledge in this area.

These ML models don't know what you want.

They know what you engage with. What you engage with isn't necessarily what you want. And ANY suggestion otherwise is wrong. Simply put, engaging with something doesn't necessarily mean you want to see more of it.

[1] Yes, that's a hard stance to take, but I stand by what I said. But you read this comment, which means you wanted it. Don’t try to deny it, if it wasn’t what you wanted you wouldn’t read any part of it, would you?

  • shadowgovt 3 years ago

    Interventions to modify behavior (at least successful ones) that I'm familiar with start with understanding that you do want these things (or are at least ambivalent about them) and then getting to how you have conflicting desires and the behavior you want to modify may be incompatible with something you want more if you cling to it.

    If smoking had no side-effects, I certainly wouldn't care if my family smoked. But I'm trying to get the one dying from pneumonia to understand that smoking suppresses her immune system. Trust me: she wants that cigarette though; someone doesn't go that far out of their way to hide the lighter and the pack if they don't want to.

  • PaulHoule 3 years ago

    People aren't sure what they want. In particular, you 20 years from now might not agree with what you think you want right now. See https://www.themarginalian.org/2016/03/22/why-love-hurts-eva...

compiskey 3 years ago

What criteria are being used to determine this is what we want?

People also want heroin and we use literal impact on society to curtail use. Freedom of reach is what we’re talking about limiting; right and left seem to both demand they have access to my beliefs, and leverage big corp as a tool to hide their political agenda to that effect as government social programs are accountable to public oversight.

I mean, I'm pretty much convinced at this point that human languages lack the nuance to reach any conclusions of sound logic except in the contexts they came up in: Anglo social norms. The correctness of a colloquialism comes along with a command to use it in a certain emotional way.

We’re just engaged in spiraling emotional pettiness; how dare you offend thy sensibilities, good sir!

  • wallfacer120 3 years ago

    > What criteria are being used to determine this is what we want?

    The fact that it's what people choose. This isn't difficult.

    • AlexandrB 3 years ago

      Not in my experience - at least not entirely. There are almost certainly some weights/criteria that have nothing to do with what I want (in a previous thread someone suggested that YouTube may weight videos by availability at the nearest CDN, for example). But these algorithms also interact with botting and SEO (for lack of a better term) in a way that surfaces stuff that I give no shits about but that seems to be popular.

throwaway290 3 years ago

> Mastodon introduces a feature where you can download and install algorithms, which can be posted by anyone. They are given the raw unsorted list of posts from people you follow and use that produce a coherent feed. You might have to pay for them. They could be free. They could involve elaborate ML, or not. They might sometimes pull in posts from people you don’t follow. They could be open-source, or not.

That, or just use a separate ActivityPub client developed with this feature in mind. After all the point of it all is open and standardized APIs.
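As a sketch of what "pluggable algorithms" could look like in such a client (field names here are hypothetical, not the actual ActivityPub schema), a feed algorithm is just a function from raw posts to an ordered feed, and the client only has to apply whichever one the user picked:

```python
from typing import Callable, Dict, List

Post = dict  # e.g. {"author": ..., "created_at": ..., "boosts": ...}
FeedAlgorithm = Callable[[List[Post]], List[Post]]

def chronological(posts: List[Post]) -> List[Post]:
    return sorted(posts, key=lambda p: p["created_at"], reverse=True)

def most_boosted(posts: List[Post]) -> List[Post]:
    return sorted(posts, key=lambda p: p.get("boosts", 0), reverse=True)

# Users (or third parties) register algorithms; the client just applies one.
ALGORITHMS: Dict[str, FeedAlgorithm] = {
    "chronological": chronological,
    "most_boosted": most_boosted,
}

def render_feed(posts: List[Post], choice: str = "chronological") -> List[Post]:
    return ALGORITHMS[choice](posts)

sample = [
    {"author": "a", "created_at": 1, "boosts": 5},
    {"author": "b", "created_at": 3, "boosts": 0},
    {"author": "c", "created_at": 2, "boosts": 9},
]
print([p["author"] for p in render_feed(sample, "most_boosted")])  # ['c', 'a', 'b']
```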

PaulHoule 3 years ago

It bugs me that people call these "algorithms" when they really are "heuristics".

An algorithm for sorting an array sorts it correctly 100% of the time; at least, it is supposed to.

http://envisage-project.eu/proving-android-java-and-python-s...

Something that ranks your feed is a heuristic, a rule of thumb that works right frequently.

nullc 3 years ago

I have to protest the assumption that if people are spending time on something, it is what they want.

It can be and often is what people fear-- what they DON'T want. It's a pretty good survival technique, in general, to pay attention to things that threaten you.

Companies monetize this human tendency, same as any other.

Playing on people's wants may waste their time or make them dull, but playing on their fears also harms them by producing constant stress.

winReInstall 3 years ago

I think a happier person produces rather than consumes. So an ideal algorithm would nudge the user towards being productive and creative. To accomplish that, the feed would have to be redirected towards people who "educate" others into becoming creators.

Take that nice little Austrian postcard creator living in Bavaria I follow on Twitter. Every now and then a political rant, but nothing.. oh No... oh No No No..

jimmytidey 3 years ago

Am I missing something? I have Twitter set to show the timeline of tweets as they happen, which, although I accept it requires a deduplication algorithm, is broadly non-algorithmic.

I realize there is also another 'curated' timeline setting, but you can always switch it off.

There are all kinds of problems with Twitter, but I'm surprised to see so many people saying 'the algorithm' is one of them.

  • rsynnott 3 years ago

    That’s not the default, and remember that even if you’re using the chronological feed, the people you follow, who are retweeting stuff, are largely using the non-chronological attention-driving feed. You’re still exposed to The Algorithm (TM), just less directly.

    • jimmytidey 3 years ago

      This is such a fascinating insight, I have not sufficiently considered how many people might be using the algorithmic feed.

      The main problem I experienced with their 'curated' feed was that the news was so old. If I'm watching Twitter, it's because there is a news event happening.

      So many times I'd see a tweet "Minister resigns!" and I'd think - "Another one, this is the end of the government!" only to discover Twitter has surfaced a tweet from 24 hours ago.

kornhole 3 years ago

If we had a succinct way to explain it, or logic to point to, it would help clear up the confusion about the massive difference between the algos built into Mastodon software and the ML-type algos deployed on commercial platforms.

thomastjeffery 3 years ago

We end up obfuscating too much of the subject every time we say "algorithm".

We are talking about filtering and sorting. There is no need to be more abstract than that.
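Taken at face value, that framing fits in a few lines: a feed is a filter pass followed by a sort pass. A sketch under that assumption (the field names and scoring key are illustrative, not any platform's actual implementation):

```python
def build_feed(posts: list[dict], blocked_words: set[str], key) -> list[dict]:
    """Filter out posts containing blocked words, then sort by the given key."""
    kept = [
        p for p in posts
        if not any(w in p["text"].lower() for w in blocked_words)
    ]
    return sorted(kept, key=key, reverse=True)

# Example: drop anything mentioning "spam", rank the rest by like count.
posts = [
    {"text": "Good news", "likes": 5},
    {"text": "SPAM offer", "likes": 99},
    {"text": "hello", "likes": 1},
]
feed = build_feed(posts, {"spam"}, key=lambda p: p["likes"])
```

Everything beyond this, from regexp mutes to ML rankers, is a choice of filter predicate and sort key.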

djmips 3 years ago

It's not what I want - it's what an AI model thinks I want and I feel like that's modelled after the average 'want'.

thereald0tt 3 years ago

No, it isn't showing you what you want. It is showing you what it thinks will get you to spend more time on the service, that's all

unicornhose 3 years ago

A good post, I like the direction and will follow bottom-up ranking design with interest. :)

tensor 3 years ago

I'd love my own personal machine learning filter/ranking algorithm.

hdjjhhvvhga 3 years ago

> being used on Twitter to combat Nazis and incels

There's something unsettling about putting these two terms next to each other.

lifeisstillgood 3 years ago

I think of this as: does Twitter (or other social media) act as a common carrier or not? If Twitter just hoses you down with your followed accounts in chronological order, then it probably gets to be called a common carrier. If not, then it is doing some level of curation, choice, and publishing, and that is "the" algorithm. And yes, it might be tweaked to be "better" (good luck defining that), but it always exists.

WhatsApp is, I think, a common carrier by any sensible definition, and yet look at the anti-social effects ascribed to it: murders in India, right-wing election issues in Brazil.

There is no algorithm controlled by WhatsApp governing who someone adds to their group or what pro/anti political messages they share. There is just the real-world social graph being imprinted on the technology.

And I think this is the problem - people.

Tech does not mindlessly control us, it simply provides us with our bubbles.

I think I am violently agreeing with Tim Bray.

californiadreem 3 years ago

Ah, the halcyon days of thrusting off the shackles of capitalism to have freedom in a decentralized digital environment where a new Social Contract is being negotiated! There's no way our idealism will feed a new class of nouveau riche that will sacrifice community ideals for profit.

Or to be more clear, Google ("Don't Be Evil") was founded in the same year that John Perry Barlow published A Declaration of the Independence of Cyberspace (1996). Barlow is dead and so is his dream. The Empire, long divided, must unite; long united, must divide.

Which is to say, federate the web all you want, but a Hamilton is inevitably going to stomp on your Jeffersonian dream (and vice versa).

https://www.eff.org/cyberspace-independence

  • wallfacer120 3 years ago

    i wonder if there's a single ill in the world that you don't lay at the feet of "corporations."

    • californiadreem 3 years ago

      Commenting on the naïveté of Mastodon users endlessly repeating the history of modernity is a strange way to affirm any position regarding corporations, but your unsolicited reply is pretty clear that you like them.

progrus 3 years ago

“Stop trying to make Mastodon happen. It’s not going to happen.”
