Sincerity Wins the War

wheresyoured.at

62 points by treadump 15 days ago


bananalychee - 15 days ago

What the author presents as "sincerity" comes off as injecting his own biased views into reporting. The post devolves into a tedious series of anecdotes that ostensibly prove that "context" can reframe a story. He argues that sincere reporting should take that context into account, which is reasonable in principle, but he doesn't seem to realize that he's only presenting context that suits his worldview and tossing out the rest. For example, he decries journalists for being wrong, or for underplaying his view, because they failed to account for data that proved him right in retrospect. In the same paragraph, he smears reporters for both under-weighting and over-weighting soft data. That's easy to do in hindsight. My takeaway is that he undermines his own premise by demonstrating everything that can go wrong in opinionated reporting: cherry-picking, double standards, and confirmation bias.

P.S.: the most surprising thing to me about this blog post is that it went through an editor.

JohnMakin - 15 days ago

> My CoreWeave analysis may seem silly to some because its value has quadrupled — and that’s why I didn’t write that I believed the stock would crater, or really anything about the stock.

I think the underlying belief that causes people to see things like this as "silly," or to see AI criticism as overstated, is that the market does not really make mistakes, at least not in the aggregate. So if XYZ company's CEO says "Our product is doing ABC 300000% better and will take over the world!" and its value/revenue is going up at the same time, that is seen as a sign that the market has validated this view and that it is infallible (to a point). Of course, this ignores that the market has often been completely wrong historically, and that this type of reasoning is entirely circular - pay no attention to the man (marketing team) behind the curtain, and don't think about it too hard.

MisterKent - 15 days ago

My tech friends and I cannot wait for this agentic bubble to pop. Much like with the dotcom bubble, there's absolutely value in AI, but the hype is absurd and is actively hurting investment in reasonable things (like just good UX).

The hype and zealotry remind me of a cult. The higher I go up the chain at my big tech company, the more culty people are in their beliefs, the less they believe AI can do their specific jobs, and the less they have actually tried to use AI beyond badly summarizing documents they barely read before.

AI, as far as I can tell, has been a net negative for humans. It has made labor cheaper and answers less reliable, reduced the value we place on creativity and on professionals in general, enabled mass disinformation, and mostly resulted in people being lazier and not learning the basics of anything. There are of course spots of brightness, but the hype bubble needs to burst so we can move on.

tptacek - 15 days ago

In what way is this piece saying something different from what Zitron said on June 9 in his "Never Forget What They've Done" piece?

https://www.wheresyoured.at/never-forget-what-theyve-done/

KerrAvon - 15 days ago

We need more reality-grounded takes like this one. I do have a quibble:

> These LLMs also have “agents” - but for the sake of argument, I’d like to call them “bots.” Bots, because the term “agent” is bullshit and used to make things sound like they can do more than they can[…]

I'd argue "agents" is actually reasonable technical jargon for this purpose, with a history. Tog on Interface (circa 1990) uses the term for a smart software feature in an app from that time period.