Category Archives: Technology

Apparently, top editors suspect the Matrix is coming

It really struck me that a day or two back, the editors of two of our nation’s premier newspapers led their reports with the news that Sam Altman — a guy I’d only heard of, before this week, because I’ve listened to some really deep, detailed podcasts about AI in recent months — had been hired by Microsoft after being fired by OpenAI.

If you haven’t spent years of your life agonizing — and I mean agonizing — over what to put on a front page and how to play it, day after day, this may not seem to mean much.

But it meant volumes to me. Excuse me for oversimplifying the definition of a lede story, but it basically means that, at least for a moment or a day, this guy being hired was more important to the world than anything going on in Israel or Gaza or Ukraine or anywhere else in the world. Nothing presidents, kings or dictators were doing anywhere mattered as much.

Now why would that be? This is something you might expect to see, and sometimes still do, in a paper that’s historically all about business, by which I mean The Wall Street Journal. But these are general-purpose newspapers, and the cream of the crop.

So what pumps this up so?

Well, the guy was canned from OpenAI because some people on the board were worried about what AI might do to the human race, and thought Sam wasn’t as worried about it as they were.

But that’s a tempest in a teacup unless you, the editor making the play decision, think this guy’s work situation really IS of some sort of monumental importance to our shared fates — either because you’re worried about the Matrix or Skynet or some such, or because you think AI is so awesome that you believe where Sam has a job, and who he’s working for, overrides everything else in the world.

You wouldn’t be seeing this if the guy was the head of McDonald’s or something — unless, maybe, in the WSJ. They still love them some business.

So… if I see the White Rabbit, should I follow it? Should I keep an eye out for Terminators?

A Lyric Just in Time

I had a fun little exchange on Twitter with a friend a couple of weeks back, when he posted this quote:


Hey, it’s always fun when people start quoting Elvis Costello. For me, anyway.

So I listened to the song several times, and got to thinking about how that one line is more than just fun:

He stands to be insulted and he pays for the privilege

You know how I frequently make the point that it’s harder and harder to get the kind of people who ought to run for elective office to run anymore? Reading those books from the late 19th century lately has driven home the point so much more painfully. Why do we almost never see the likes of Teddy Roosevelt or James Garfield — or, to reach higher, Abraham Lincoln — step forward any more? Or for that matter, the extraordinary men who served under them, in key positions — John Hay, Elihu Root, Henry Cabot Lodge?

Well, I know why — because of 24/7 TV “news,” and more recently and intensely, social media. Things that climb all over you and mobs that can’t wait to cancel you for the most trivial things. Consequently, instead of people who set brilliant careers aside to give back to the country by sitting down with other serious people and working out the country’s real problems, you get people who don’t give a damn about any of that. They don’t want to work out problems with anybody. They just want to posture for their respective bases.

And to gain the “privilege” of doing this, they spend every moment between elections raising the money to pay for it.

I even felt a moment of gratitude today when I heard the House GOP had gone behind closed doors to nominate a new speaker. No strutting or posturing for the mob. And they came out with Scalise, which I think is better than, or at least not as horrible as, the alternative. Which isn’t much to celebrate, of course.

Anyway, Elvis said it better than I have:

He stands to be insulted and he pays for the privilege…

I’ll close with the video:

Top Five Gripes About Apple

Years ago, I had no liking for Apple Inc.

I didn’t hate it or anything. I just wasn’t interested. I was a PC guy, and had been since 1991, and therefore had little reason to interact with the opposition. (I had a Mac in the office that I had to use to access the newspaper’s photo archives, because the photo department worked in that universe, but I didn’t like using it at all.)

But then, in (I think) the spring of 2011, I replaced my Blackberry with an iPhone. And I loved it. About a year later, I got an iPad — and if anything, I loved it more.

I still don’t have much use for Macs. They’re very solidly made — PCs feel structurally chintzy by comparison — but some very key functions that I perform without conscious thought on a PC (and have for more than 30 years) don’t work the same way, which slows me down. Also, some of the keys are in slightly different places, leading to lots of errors. And the errors are hard to fix quickly, because the backspace key has moved, too.

Never mind that. I love my iPhone and my iPad, and have since the start. What I truly hate is the way Apple keeps changing them, apparently in the grossly mistaken belief that it is improving them.

And sometimes you do get actual improvements — greater speed and storage capacity, a better camera, sharper resolution, etc.

But other times Apple goes out of its way to take away good things, things that make life easier. And I hate that.

Here are the five worst, in terms of lost function on these products. Now, I’ll acknowledge that some of these losses were not inflicted by Apple. Maybe — and with No. 4 on the list, that seems a strong possibility — somebody else did it. And if y’all holler out, “That’s not Apple’s fault, ya eejit! You need to make this or that adjustment in the settings!” I will be grateful. Here they are:

  1. Taking away the Home button. This was definitely and purely Apple’s fault, and it is by far the greatest sin on my list. And I have not been able to find a good reason for it. Oh, I’ve read about the business of removing a (theoretical, since I’ve never run across it with my four iPhones) mechanical vulnerability. But I suspect it was really about aesthetics, and that really ticks me off. If I want a bigger screen, I’ll use my iPad — which, incidentally, has a home button. Anyway, this is why I use an SE2, and if I replace it, I’ll get an SE3.
  2. Taking away the headphone jack. OK, we just dropped way down in importance. While I consider removing the home button a major offense, this one’s more of a misdemeanor — if that. Usually, it’s OK. Except when the only earbuds I can find are the old kind, and the tiny adaptor has gone missing. Or I don’t have earbuds, and want to use the free ones provided by the airline. Or — and here’s the semi-biggie — I badly need to recharge, but I want to use my earbuds at the same time. That one’s kind of moot, now that all sound from my phone goes to my hearing aids via Bluetooth. But it was a problem, and I suppose still is for people with normal hearing. And no, I would not consider investing in AirPods. Without a cord, they’re too easy to lose.
  3. Taking away the “find on page” function. OK, this one has really been ticking me off, because this was a huge part of the way I used my iPhone and iPad. I call up, say, a Wikipedia page that’s a couple of thousand words long, and I use the search function to go straight to what I’m trying to find out. But now, for a couple of months, I’ve had to wait until I’m at my PC to do this. At this point, I would cry out in rage, except, well… NEVER MIND… I finally found out, by searching Google one more time, that they just moved that function from the “share” button to the three dots in the corner. I still think that after this change, they should have prevented my phone from working until I had read a clear notice telling me this, but I’m satisfied. Let’s move on…
  4. Stopping me from tweeting straight from various apps. OK, I strongly suspect the real culprit here is Elon Musk. But since this is only a problem on my Apple devices, it made the list. And when you read as many different publications as I do each day, and tweet frequently, it’s a pain. After writing the tweet on the publication’s app, I get a notice that it has failed to post, and have to go to the Twitter app, call up “drafts,” and tweet it from there. And this, to me, sounds like a Musk thing.
  5. Constantly changing the freaking hardware. Have you heard about this?

Of course, if we had an Apple Store in this town (a gripe that didn’t quite fit on this list), I could have taken my phone to the Genius Bar first, and wouldn’t have written this post. Because before posting, I made one more effort to find a workaround for some of these things, and I found one for restoring the home button. Sort of. And without my rage over that one, I wouldn’t have started on this tirade.

But here’s the thing — why should I have to find tricky workarounds for things that worked beautifully, simply, obviously and intuitively?

OK, enough. Bottom line, Apple makes some pretty great products. In fact, I almost had to go with only four things instead of five, because I had trouble thinking of the fifth. And in truth, this is not really a Top Five Gripes. It’s more like Five Gripes. But the feature is called “Top Five Lists,” and you wouldn’t want me to mess with the Hornby rules, would you?


Living the fantasy…

If only this creature could evolve a LOT more quickly…

No, this isn’t about something starring Stormy Daniels. (That would be the grand jury case I haven’t bothered to comment on thus far…)

No, this fantasy is less lurid, but probably more important in the long run. Basically, this post was originally a comment I wrote in reaction to one of bud and Doug’s usual arguments over public vs. private.

Then I started riffing a bit, and it led to something that was really more involved than a comment should be, so I’m raising it to post level.

Here’s what I wrote, in response to this and previous comments:

Well, bud already cited ONE example of something that was done right. I’ll leave it to him to answer your question further. I’m not going to take a couple of years off from life to become someone capable of parsing road contracts and passing judgment on their efficacy.

It’s a silly argument, anyway, talking public vs. private on a function that will only EVER be undertaken by government. The private sector isn’t going to build highways — except as contractors working for the government. That’s the way it is. I’d be fascinated to hear your alternative plan for providing the infrastructure that makes it possible to have an economy in which private businesses can thrive WITHOUT these danged, pesky governments….

It raises some interesting dystopian scenarios. If we ever do get to a scenario in which it is economically feasible for a private entity to provide general infrastructure, that private entity will essentially BE the government, at least within that area. You know, like in the Middle Ages, when the local lord of the manor was over everything. The society in which we live is the product of several centuries of Europeans striving to disengage from that sort of system, and to build a system in which the things upon which we all depend are controlled, at least indirectly, by all citizens.

Not that we haven’t moved that way a number of times as technology has progressed. In the 19th century, it was the railroads. And eventually, government stepped in to control the freewheeling mastery of the environment that the railroad barons wielded. Over time, other technologies have asserted similar societal dominance. (Anyone ever see “The President’s Analyst,” in which — SPOILER ALERT — the power behind everything was the Phone Company?)

Today, we’re engaged in debates about technology that plays a bigger, wider role in our lives than railroads ever did — dominating and reshaping not only how we communicate, but how we think (ones and zeroes). And of course, all that’s in private hands.

So maybe I should take back my comment about it being dystopian fantasy. We’re dealing with the fantasy now…

I was teetering there on the cusp of busting out on a bunch of topics that are as habitual to me as public vs. private are to bud and Doug, things that are all affected profoundly by the things that have been coming out of Silicon Valley:

  • The Rabbit Hole.
  • The way technology has exponentially increased the problem of political polarization in our society. It had been a problem for decades, but in the years since the development of “social” media and broader technology that makes everyone on the planet more powerful (in terms of ability to instantly communicate with every other person on the planet, without editors or fact checkers or any other sort of mediation) than any newspaper publisher in history ever dreamed of being, humans have been trained to think like computers, in binary terms — ones and zeroes. Everything is black and white (words that newspapers now capitalize, by the way). There are no degrees of gray; there are no subtleties or nuances. There is no tolerance of those who disagree.
  • Of course, I include traditional media in this failure to cope with the problem. The new technology, having reshaped brains, distorts political events so that this madness is what the MSM have to cover, and tragically, they fall back on their old, comfortable love of conflict, covering politics like sports. And I don’t mean multilateral sports like golf or marathon running. I mean contests in which there are only two teams, and therefore only two ways of looking at anything. The deepest questions that get asked are: Who’s winning? Who’s losing? Which is profoundly tragic.
  • The inability of Western-style liberal democracies to deal with such polarizing forces, causing elements of the public to turn, over and over, to more oppressive, far less liberal, figures and imagined solutions. (The latest victim being Israel, trailing behind the U.S., of course.)
  • On a much deeper level than any of the above, the inability of Homo sapiens to effectively cope with the change, since evolution takes millions of years longer than technological development.
  • That last point alone, of course, is one we could worry over for the rest of our lives, and still not get anywhere close to a helpful answer.

Seeing all of it as too much for a comment, I brought it here…

In “The President’s Analyst,” The Phone Company was Behind It All…

Good thing we’ve got these smartphones, huh?

OK, technically this image is from my PC, not my phone, but you get the idea…

That’s what I said to my wife this morning: Good thing we’ve got these smartphones! Or pretty much any device with Google. (Or Microsoft’s Sydney, if you’re the adventurous type.)

This morning, after my shower (sure, I work from home now, but occasionally I do still take a shower), I was drying off and for the life of me, could not remember how many guitar pickers there were in Nashville. I was thinking it was 1,552, but I kept running it through my head, and I wasn’t at all sure about it.

Fortunately, my iPhone was right there on the cabinet where we keep the towels, so I didn’t have to wait until getting dressed and leaving the room to get my answer. Before I started shaving, I Googled it:

Nashville cats, play clean as country water
Nashville cats, play wild as mountain dew
Nashville cats, been playin’ since they’s babies
Nashville cats, get work before they’re two

Well, there’s thirteen hundred and fifty two
Guitar pickers in Nashville
And they can pick more notes than the number of ants
On a Tennessee ant hill…

And so forth. So I was like 200 git-tar pickers off. No telling what would have happened if I couldn’t have found out right away. My head might have exploded or something.

I don’t know what we did before having these phones, and Google. Well, I sort of know. I had a dictionary on my desk at the paper. I tried to avoid looking at it, and fortunately I’m good at spelling so I seldom had to. But occasionally I would think, “Is that really the word I want in this context?” and open it.

Well, that would be it for a while. I’d look up that word, and the definition would contain another word that I just had to look up or bust. And something about that word would remind me of another one I hadn’t run into in a while, which stirred fond memories, and I couldn’t resist looking that one up too for old times’ sake, and before you knew it, I’d have been darting here and there in that volume for 15 or 20 minutes, with deadline bearing down on me.

Of course, today we have Google and HTML links, which are among the most wonderful inventions in human history, and the problems I had back in the day with a mere dead-tree dictionary look pretty pitiful. Or at least quaint.

But it’s fun. Anyway, after looking up the Lovin’ Spoonful, I thought about taking a crack at Wordle, but resisted the temptation and went ahead and shaved. Discipline, baby, discipline.

Managing one’s time takes more willpower than our fathers e’er dreamed of…

We have indeed met the enemy, but he is not yet ours

Ross Douthat made a good point today, although it’s a depressing one.

In the column, “I’m What’s Wrong With the Humanities,” he brought up the subject addressed in a sobering recent piece in The New Yorker, “The End of the English Major.”

We all have shaken our heads over those stupid kids today who can’t seem to make their way through so much as a sentence of 19th-century prose:

Like all the others who managed to make their way through Nathaniel Hawthorne in high school, I read this with a mix of smugness and horror. Then, naturally, I scrolled to the next declinist indicator, the next sign of the cultural apocalypse.

What I did not do was click through and read the whole Heller piece (though I have read it now, I swear it!). Even more conspicuously, I definitely did not go pick up a copy of “The Scarlet Letter” or any other 19th-century novel and begin reading it for pleasure.

“The answer to the question, ‘What is wrong?’ is, or should be, ‘I am wrong,’” G.K. Chesterton once wrote. And any response to the question of what’s happened to the humanities has to include the same answer. The Harvard undergraduates who can’t parse a complex sentence from the American Renaissance are part of the problem. But so is the Harvard-educated newspaper columnist and self-styled cultural conservative who regularly unburdens himself of deep thoughts on pop TV but hasn’t read a complete 19th-century novel for his own private enjoyment in — well, let’s just say it’s been a while…

Oh, Douthat lets us know he’s started to read, say, Les Misérables, but only gotten a hundred or so pages into it. He has similarly failed with shorter works.

He cites some of the things that he lets get in the way: website browsing; looking at his iPhone, “even at a live performance;” and long-form television, an obsession he attempts to justify by talking up Golden Age TV’s supposed literary virtues.

I have to confess to all of those, plus:

  • The little work I do these days to pay the bills.
  • Naps, which fortunately I’m able to blame on my stroke.
  • My fitful blogging.
  • And other stuff…

So it is that, while I have boasted a number of times here about how awesome “Moby Dick” truly is, and how I’m reading it with great enjoyment and a commitment to finishing it, I have failed to get anywhere near the point at which they finally find the white whale.

I’ve been saying that since — well, since I was still working as a newspaper editor. That’s quite a while, in blog terms.

Douthat goes on from moaning about the problem to prescription, but I’m not sure how workable his medications are. For instance, he refers to a piece in the WSJ headlined, “College Should Be More Like Prison.” To be fair, the idea is more reasonable than it sounds — the author of that piece (which, alas, I cannot read, since I let that subscription lapse) was referring to things she has learned from teaching maximum-security inmates. But I find it hard to imagine it being a practical cure for the rest of us.

I’ve gone on and on about, for instance, what diving down the Rabbit Hole has done to our ability to think, and to have a functioning representative democracy. But that’s as far as I’ve gotten, and I’m still at the whiteboard working on the diagnosis. I await the inspiration that leads to a remedy…

These days people love to quote Pogo’s twist on a famous saying: “We have met the enemy and he is us.” Which works in this instance.

But I’m thinking of the saying that Walt Kelly was playing on, from Commodore Perry: “We have met the enemy and they are ours.”

Perry had captured two ships, two brigs, a schooner and a sloop from the British at the Battle of Lake Erie.

I, for one, cannot yet claim that the current enemy is ours. In darker moments, I fear that we have struck our colors, and we are his…

Perry at Lake Erie

Has Ford sped past Tesla? This writer thinks so…

Admittedly, only 15,617 of these have sold, but that’s 15,617 more than the Cybertruck.

One of the greatest challenges I have on this blog is that I’ve largely lost interest in writing about the things that tout le monde is yammering about — the front-page stuff, which I usually find unbelievably repetitive and boring. But the world is still full of interesting things, things I’d like to discuss.

Unfortunately, the interesting ideas are usually here or there and coming from a single source. Which would have been fine back before newspapers figured out that their business was now online and that they had to put up paywalls. (When I started blogging 18 years ago, that was not the case, except maybe for The Wall Street Journal.)

So when I see something I want to talk about, it tends to be in something I subscribe to, and others don’t.

That’s true of today’s topic, but I think I can summarize it easily enough to give everybody the idea. And I’ll try to quote from it within the vague Fair Use standard.

It was a “guest essay” in the Opinion section of The New York Times over the weekend, and it was headlined, “A 120-Year-Old Company Is Leaving Tesla in the Dust.” It’s written by Ezra Dyer, a columnist for Car and Driver magazine.

His point is that while everyone remains dazzled by Tesla, and while Elon Musk is making a spectacle of himself with his efforts to destroy Twitter, Ford has quietly sped past Tesla in such critical areas as self-driving technology and electric vehicles.

He starts off admitting that he once thought Tesla was the cool company, not only because it sold the only EVs you could drive for a reasonable distance without recharging, but also apparently because of the razzle-dazzle:

It made cars that performed animatronic holiday shows using their lights and power-operated doors. It came up with dog mode (a climate control system that stays running for dogs in a parked car), a GPS-linked air suspension that remembers where the speed bumps are and raises the car automatically, and “fart mode” (where the car makes fart sounds)….

But then, as a journalist covering the company, he started noticing that the people who worked there, his sources, were terrified to talk to him, being as evasive as spokesmen for a totalitarian government.

If you want to work for a flexible, modern company, you don’t apply to Tesla. You apply to 120-year-old Ford.

Tesla’s veneer of irreverence conceals an inflexible core, an old-fashioned corporate autocracy. Consider Tesla’s remote work policy, or lack thereof. Last year, Mr. Musk issued a decree that Tesla employees log 40 hours per week in an office — and not a home office — if they expected to keep their jobs. On Indeed.com, the question, “Can you work remotely at Tesla?” includes answers like, “No,” and “Absolutely not, they won’t let it happen under any circumstances,” and “No, Tesla will work you until you lose everything.”

But on the other hand, the cars make fart noises. What a zany and carefree company!…

More substantially, he noticed how Tesla lagged on the actual product front. He says Ford’s self-driving equipment is actually farther along than Tesla’s — and Tesla charges $15,000 for its feature that doesn’t fully work (he says the National Highway Traffic Safety Administration has referred to it with the charming words, “Full Self-Driving Software May Cause Crash“), on the premise that it may work sometime in the future. As for EVs:

Tesla’s long-promised new vehicles, like the Cybertruck and a new version of its Roadster, also keep getting delayed. The Cybertruck was unveiled in 2019, and on Tesla’s most recent earnings call Mr. Musk admitted that it won’t be in production this year, which is becoming an annual refrain. Sure, Ford sold only 15,617 electric F-150 Lightning pickups in 2022, but that beats the Cybertruck’s sales by, let’s see, 15,617…

Anyway, I thought all that was interesting. I don’t know that Tesla is slipping, but I’m impressed at what I read about boring ol’ Ford. I guess it helps not to have a, shall we say, problematic eccentric in charge. Although, of course, Ford once had that problem, too.

Personally, I drive neither a Tesla nor an F-150. But by way of full disclosure, I do drive a 2000 Ranger. It doesn’t do anything fancy. It’s a four-cylinder straight shift, and it doesn’t even have power windows. But it keeps running, and I hope it does so for years to come…

Meanwhile, I gather from the column, this model can make fart noises!

A matter of perspective and proportion…

I really need to go through the notifications on my iPad and turn some of them off. Or turn most of them off.

I would start with that irritating app called “Apple News,” except… occasionally, it offers me something interesting from The Wall Street Journal. I recently dropped the WSJ from my subscriptions, because I wasn’t using it enough to justify paying for it – and the cost is high, compared to my other subscriptions. When Apple News scoops one up to offer me for free, I can read it. And I like to check in with the WSJ – which has probably the strictest paywall in the business – occasionally. That app lets me do it.

So I like getting notifications when they have one – because I’m not going to be looking there on a regular basis. I need the heads-up.

Unfortunately, that means I get a lot of junk from it as well.

As you can see above.

But as you can also see above, they’re not the only ones hassling me. You’ll see notes from The Guardian, The New York Times and The Washington Post. None of which I would want to turn off, because there are no entities in the world more likely to alert me to actual news, which is, you know, what I subscribe to five newspapers to get. (Well, that, and commentary.)

The problem comes when we get to deciding what “news” is.

As you can see, for a while there last night, the most important thing in the universe was that Beyoncé had won a heap of Grammys. Which I suppose is important to her, at least. Personally, I have never cared for a moment about who has or has not won a Grammy, much less who has won the most of them. There was a time when I cared about who won this or that Oscar. But I quit caring about that a quarter-century ago. And now I’m not sure I can tell you clearly why I ever did care. It mystifies me.

But a lot of people care about things I don’t care about. For instance, I’ve noticed that some people – perhaps even some of you – take an interest in football.

So never mind me.

We have all these news organizations in consensus about the fact that Beyoncé winning all these music awards is the most important thing happening, so they must be right – right? In fact, it makes you wonder what’s wrong with The Washington Post, wasting time telling me about some dumb ol’ earthquake that has now killed – let me go check – 3,800 human beings.

But wait – that was a few minutes earlier than the really earth-shaking news at the Grammys. So surely the Post got on the stick later. Well, actually, I don’t think they did. I never got a notification from them about it, last night or today.

Which makes those slackers, well, my kind of newshounds, I suppose.

Now, you will protest that those notifications are merely a snapshot of a few minutes in time, and that those other organizations no doubt turned to actual, hard news later. Especially the NYT. And you’d be right – at least in the case of the NYT.

But you’d be putting your finger on something that still worries me.

You see, back in the olden days, when newspapers still roamed the Earth and I spent a great deal of time each day agonizing over what to put on the front page and how prominently to play it, editors saw it as their job to present news all at once, and in a hierarchy of importance. We assumed people had a finite amount of time in their lives, and didn’t want to waste any of it. So we told them the biggest news right up top, but gave them the other stuff, too, in case they had time for it. That was up to them.

We were able to spend time weighing how to present things, and in what order, because we only presented it once a day – or two or three times if we had that many editions. So we had some time to think before deadline arrived.

No more. Mind you, I think it’s awesome that it is now possible to provide news to readers right now, without having to spend the day using 19th-century technology to physically get a paper product to them. I used to fantasize about that back in the early ’80s – at that point, there were no more typewriters, and all writing was done on computers (a mainframe system), and I kept thinking, What if when I hit the button to send this to the copy desk, it just went straight to the reader?

And when that became possible, I rejoiced. But then something else happened. We went from being able to send stories out immediately to having to send them out immediately. No time to stop and think, How does this compare to all the other things going on?

No. Whatever was happening now became the most important thing in the world, the way things had always been on TV news – which was something I didn’t like about TV news. You could only see one thing at a time, so at that moment, there was nothing else.

Suppose you – like so many – didn’t agree with what the editors said was the most important news. That didn’t matter. You could decide for yourself. It was all presented to you at the same time, instead of this stream-of-unconsciousness madness that we have now: Now, it’s THIS is the most important thing. No, THIS is. No, THIS is…

And for a while last night, that most important thing was that Beyoncé had won those awards – so I received a tsunami of notices about it.

Of course, newspaper readers can STILL see all the news presented on a paper’s app. Which is great. And it’s all freshly updated. And better yet, now the TV stations have websites where you can see a bunch of stuff being offered – not in any thoughtful hierarchy, but at least there’s a selection.

So that’s good – as long as you go looking for your news that deliberately, and consider it more or less holistically.

But I fear that not enough people do. I worry that too many let it wash over them the way the Grammys were washing over me last night. And I think it causes them to lose all perspective. And it causes the journalists to lose it, too, since decisions of what to cover and how to play it and what to send notifications about are now so driven by clicks.

At this point, many of you are rolling your eyes and thinking (as many of you habitually do), there goes that has-been newspaperman, reminiscing about how great things were in the old days. Which means you’re missing the point entirely.

It’s not about me. I actually love my iPad and the incredibly wide access to dependable news sources it gives me. In the unlamented old days, I wouldn’t have been able to subscribe to all these papers and received them while the news was still hot. And this is of great value.

But I worry very much about the effect these “news” tsunamis I’m speaking of have on society as a whole. It’s not just a matter of people being overly concerned with silly pop culture stuff. Hey, I love pop culture, as any reader of this blog knows. But the problem is, serious things – such as politics – get covered this way as well. It’s gotten to be all about the outrage of the day, the stupidest things that were said or done, the things most likely to drive us farther apart from each other. And yeah, it helps explain – not entirely, but in part – how Donald Trump got elected in 2016.

As I’ve said so many times, nothing like that ever came close to happening before that election. And I keep trying to figure out why it did happen. And this is one of the things I see contributing to it – this utter lack of perspective and proportion with regard to news…

Anybody having trouble with the blog?


I mean, trouble other than the usual “dealing with that idiot Brad Warthen” stuff.

I’m talking about weird technical problems.

Starting a couple of days back, right after I posted Paul DeMarco’s piece about his trip to Sicily, Paul told me via text that it wasn’t showing up, and in fact, the most recent post showing at the top of the home page was this highly forgettable one, from way back on Jan. 16.

I know that I don’t post with anything near my old frequency, but there had been nine posts after that one. Ten now, counting this one.

Anyway, when Paul told me that, I immediately checked, and everything was fine!

But that was on Chrome. Before reporting back to Paul that he was imagining things, I tried looking at the blog on Firefox, Edge and Safari. No dice. The most recent post was the one from Jan. 16. Which, let’s face it, was not a great post.

And I found later that my wife couldn’t even get the recent stuff on Chrome on her iPad.

I’ve been scrambling — whenever I’ve had a moment for the blog — ever since. Night before last, I spent 52 minutes on hold with my hosting service, and never got to speak to anybody. No luck with their “chat” service, either.

I’m about to try them again. But in the meantime, things have changed. This morning, everything’s fine on my Firefox browser — as well as Chrome, of course. But Paul said this morning he couldn’t get the recent stuff on Chrome. I urged him to try clearing out his cache. He did, and it worked! I can’t swear that would work for everybody.
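In case it helps anyone else debug something similar: the fact that clearing the cache fixed it suggests stale copies of the page were being served somewhere between the server and the browser, possibly by the host’s caching layer. Here’s a minimal sketch (in Python, assuming the third-party requests library, and with a placeholder URL) of one way to peek at the headers that govern caching:

```python
import requests  # third-party: pip install requests

# Placeholder URL; substitute the page that's showing stale content.
resp = requests.get("https://example.com/")

# An "Age" header, or something like "X-Cache: HIT", would suggest a
# proxy or CDN handed back an old copy instead of asking the origin server.
for header in ("Cache-Control", "Age", "X-Cache", "Expires", "Last-Modified"):
    if header in resp.headers:
        print(f"{header}: {resp.headers[header]}")
```

That’s just one place to look, of course, and it wouldn’t catch a stale copy sitting in the browser’s own cache, which is what the clearing trick addresses.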

Meanwhile, this morning I discovered another problem. I got an email from Ken complaining about his comments not appearing. So I was like, “What comments?” Because there weren’t any when I logged in this morning. But before I finished answering him by email, I looked again — and there they were, with some from other folks.

Weird.

And yeah, what’s weirder is me telling you about all this stuff when, if you have a problem, you probably can’t see this post.

But if you can, please let me know whether you have HAD any problems, and please describe them. I’m still trying to work this out…

Some stats documenting our Raskolnikov Syndrome

Fyodor Mikhailovich Dostoevsky

Yeah, I’m on about my Raskolnikov Syndrome theory again. But hey, I haven’t mentioned it since April, so…

The theory is that people lose their minds — and often become shockingly violent — when they cut themselves off from other people. Ten years ago, I summarized it in part this way:

I’ve long had this theory that people who do truly horrendous things that Ordinary Decent People can’t fathom do them because they’ve actually entered another state of being that society, because it is society, can’t relate to…

You know, the way Raskolnikov did. Brilliant guy, but as he cut himself off from family and friends and sat in his grubby garret brooding on self-centered theories, he became capable of horrible things. Well, you know what he did. If you don’t, read the book. Everyone should. I suspect it’s what made Mel Brooks say, “My God, I’d love to smash into the casket of Dostoyevsky, grab that bony hand and scream at the remains, ‘Well done, you god-damn genius.’ ”

Anyway, it’s come up again because of this piece I read the other day in The Washington Post, headlined “Americans are choosing to be alone. Here’s why we should reverse that.” It included some scary numbers, to me:

And now for the scarier news: Our social lives were withering dramatically before covid-19. Between 2014 and 2019, time spent with friends went down (and time spent alone went up) by more than it did during the pandemic.

According to the Census Bureau’s American Time Use Survey, the amount of time the average American spent with friends was stable, at 6½ hours per week, between 2010 and 2013. Then, in 2014, time spent with friends began to decline.

By 2019, the average American was spending only four hours per week with friends (a sharp, 37 percent decline from five years before). Social media, political polarization and new technologies all played a role in the drop. (It is notable that market penetration for smartphones crossed 50 percent in 2014.)

Covid then deepened this trend. During the pandemic, time with friends fell further — in 2021, the average American spent only two hours and 45 minutes a week with close friends (a 58 percent decline relative to 2010-2013)…

On average, Americans did not transfer that lost time to spouses, partners or children. Instead, they chose to be alone….

Take that, and combine it with the Rabbit Hole, and you have a dangerous situation, with a society that is dangerously alienated, and no longer understands what a fact is. And yeah, I’m talking about the 2016 election, and the “stop the steal” cult, but a lot of other stuff as well.

Look around at some of the bad craziness going on, and this helps explain it…

If I were inclined to be a pessimist, here’s what I’d worry about

I hope Gary Larson doesn’t sue me for using this. I just saw it on Pinterest, and thought it a way better illustration for this post than the boring shot of Putin I originally put here.

Well, these are some of the things I’d worry about. Not all are even near the top of the list. These are just things that were in the news today — actually, all three were in one of the several papers to which I subscribe — so they’re on my mind at the moment.

So worry away, folks…

  1. Classic American tragedy — The headline was “Teen sought in Amber Alert dies in shootout after running toward deputies.” Basically, a 15-year-old girl that authorities were seeking to rescue from her armed-and-crazy, murderous father is now dead — shot by, well, authorities. So your initial reaction is, there go the stupid cops again. But then, if you care at all, you actually read about what happened. And you see it’s not so simple. What happened (so far as we know at this point) was, shots were fired near a school. The school is placed briefly on lockdown. Then cops find a woman with multiple gunshot wounds, who is pronounced dead at a hospital. The call goes out to look for the husband, Anthony Graziano, and the couple’s young daughter, Savannah. Graziano’s Nissan is spotted, and pursued. He starts shooting, putting several rounds through a police car windshield. With bullets still flying both ways, someone, “wearing protective equipment, including a tactical helmet, emerged from the passenger side of the vehicle, ran toward sheriff’s deputies and then fell amid the gunfire.” When it’s all over, it’s discovered that someone is Savannah, and she and her father are both dead. What do you think should be done to prevent such things? This is very much like what happened to Breonna Taylor — someone with the victim starts shooting at police, and the victim is killed in the crossfire — but since she was black, a lot of people simplified it to “racism.” With Savannah being white, one is tempted to simplify by saying, “guns.” For instance, since I watch a LOT of British cop shows, I think, why can’t our cops go unarmed, like them? But of course that ignores the fact that there are 393 million guns in private hands in this country, and a lot of those hands belong to people who like to shoot first, like Graziano. So no, I don’t know the answer, but I’m pretty sure it can’t be summed up in one word.
  2. A big AI advance — I often sneer at artificial intelligence, noting that it may be artificial, but it certainly isn’t intelligent. Well, something like this makes me take a step back, and have “Matrix” thoughts. See that block of images below. None was taken by a camera. And they were generated not by hours of work by a CGI artist, but by “the artificial intelligence text-to-image generator DALL-E.” The one at the upper right came into being in response to the phrase, “A woman in a red coat looking up at the sky in the middle of Times Square.” The only human input for the one at bottom left was, “Red and yellow bell peppers in a bowl with a floral pattern on a green rug photo.” I don’t know what the prompt was for the boy in black-and-white, but this is scary. Note that I say, “the phrase,” “input,” and “prompt.” Each time, I almost wrote “command,” but dare we speak of issuing orders to our future digital overlords?
  3. Ukraine dilemma — If you don’t spend too much time thinking about it, you can conclude that the thing to do is simply cheer for Ukraine to win, and Putin to lose. And I do. But I also worry. As I have since the start. Those of you who think Brad is just this wild warmonger — because I would sometimes use military force when you would not — may have been taken aback by the way I worried when all this started. I was running about like Neville Chamberlain, wringing my hands — sort of, anyway. Once it started, I continued to worry, while following the above formula. But while I rooted for Ukraine, and was pleased by that country’s recent successes, I continued worrying about the big picture, which goes like this: Putin needs to be humiliated, so he stops doing this. He didn’t pay a price in Georgia, or for his early moves on Ukraine. This has to stop. He needs to go. But he’s got all those nukes, and what will he do with them on his way out the door? Anyway, I urge you to read this piece, “Putin is limping toward an endgame in Ukraine. Should the West go along?” Read the whole thing, if you can. It basically asks: if Putin fixes “elections” so he can save some face by annexing part of Ukraine — again — should we let him do this disgusting thing, to prevent a nuclear holocaust? My gut, of course, says the hell with him. But I don’t want nuclear hell unleashed on the rest of us, either. What’s the right move?

The first and the third problems are very similar. Any intelligent, or merely satisfying, response to either has enormous barriers in front of it. Get rid of those 393 million guns (the only thing that would really fix the problem)? Good luck. And imagine Joe Biden, in this poisonous political environment, trying to steer a course that does something enormously sickening to all sides, in order to avoid Armageddon. Forget about the consequences in the midterms — would it even be possible to do it?

Maybe we should stop worrying about 1 and 3, and let 2 happen, so the algorithms can make the decisions.

Anyway, as I said, if I were inclined to be pessimistic about life, the universe and everything, I’d spend all my time thinking about things such as these…

The upper-right was generated by “A woman in a red coat looking up at the sky in the middle of Times Square.”

Open Thread on Technology for Tuesday, August 23, 2022

The Singularity hasn’t arrived, but we’re all pretty obsessed with the Matrix, as it currently exists…

Editor’s note: I wrote this on Tuesday, but didn’t post it because I thought it wasn’t very good. But today — Friday — I decided not to waste that time I spent typing it. So here it is, with only slight editing. But I didn’t take the time to edit all the places where it said “today,” which at the time meant Tuesday.

I have to be careful here. After all, there are already those who see me as an old guy (the insolent puppies). I don’t want to give them any additional reason to see me as Uncle Ben in “Spider-Man,” who, looking in the physical, dead-tree newspaper for a job (which shows you how long ago 2002 was) and seeing a help-wanted ad for a computer analyst, moans, “My Lord, even the computers need analysts these days!”

All my adult life, I was always on the leading edge of technology — when newspapers went from typewriters to mainframe, and then from mainframe to PCs, I was one of the people who learned it first and taught the others. I paginated the editorial pages before the rest of the newspaper followed. When I got canned in 2009, I was the only person at the paper actively blogging and regularly interacting with readers online.

But lately I’ve been noticing something a bit unsettling. Gradually, the news I read is less about what people do, and more about what their technology does. I’m not saying the singularity is imminent — artificial intelligence is still too stupid — but we’re moving in that direction, in terms of what we pay attention to. Maybe it’s because we’ve spent too much time observing stupid people, and no longer notice the intellectual limitations in the tech.

Anyway, these were all in The Washington Post today:

  • You’re charging wrong: 5 ways to make gadget batteries last longer — Hey, I love my iPhone and my iPad, and am on decent terms with my PC. But I’ll respect them all more — especially the iPhone — when the batteries are better. Or at least, more reasonable. Here’s what reasonable would look like: When I take off my phone and am not using it — which means when I’m sleeping — it should be charging, and without damaging the battery. And please, don’t do this thing where you take all fricking night to charge. Ever since that started, I’ll wake up in the night and reach over to unplug it, because it’s been a couple of hours and should be charged — but it’s nowhere near done, because it’s aiming to finish around 5 a.m. I’ve tried turning off this “convenient” feature in the past, but failed. So it charges all night, but gradually. But what if I needed to grab it and go in the middle of the night?
  • How a photo of a woman yelling in a guy’s ear became a viral meme — That sounds stupid, doesn’t it? That’s because it is. Not as stupid, say, as “haul videos” were, but pretty dumb. Apparently, it’s news because as a meme, it is somehow evocative of other memes, and has meaning to someone who spends all his or her time thinking about memes instead of, say, great literature. It’s an actual international sensation, apparently.
  • Strangers rallied worldwide to help this Maryland mom find where she parked her car — In this case, the amazing part isn’t about the technology. The amazing thing is the way this lady managed to lose the car she had hurriedly parked on the way to take a child to the doctor. Which is reasonable to anyone who has had to spend a little time remembering exactly where in the lot, or the garage, the car was parked. That I get. What blows my mind is that she didn’t even know in which nearby parking garage she had parked it. Which means she arrived at the doctor so flustered that she didn’t know how she’d gotten there, even roughly. So after unsuccessfully searching, she posted something about it on social media, and went home, defeated. And people around the world jumped in to solve the mystery, and two days later, someone found it. Which is cool, and even nice. But how did this happen to begin with?
  • Down and out and extremely online? No problem: Just enter a new ‘era.’ — You’ll have to read a few grafs of the story even to understand what it’s about. But when you do, you may react as I did, wondering how anyone could become this lost in narcissism. (Which is really something, coming from a guy who blogs.) And then, you’ll wonder about something even more perplexing: Who would actually watch such a thing? Compared to this, haul videos actually made sense.
  • Former security chief claims Twitter buried ‘egregious deficiencies’ — I put this last, but this morning, this was actually the lede story on the app. So Elon Musk isn’t the only one complaining. But then, he’s looking for something in Twitter other than what I see, and enjoy. I use it all the time, and it works great. I post something, and it shows up, and people interact with it. Yeah, lying to regulators is a bad thing and all, but if you want to go after a social medium that really sucks, take on Facebook. Or Instagram. Or Snapchat. Twitter remains my fave.

This saturation in tech news today reminded me of another story about something I want to complain about, from last week:

How to send text messages from the comfort of your computer — The only reason I read this was because I use an iPhone for my phone, and a PC for my computer. Which means I’m up the creek, unlike people who use all Apple products — their texts are shared smoothly on all their platforms. So I started reading, thinking that maybe, just maybe, I won’t have to shell out a fortune to get a Mac when my Dell gives out. And I read on even though the subhed warned me what was coming: “The process ranges from ‘surprisingly simple’ to ‘ugh’ depending on your mix of devices.” Of course, they save the “iPhone + Windows” scenario for the end, at which point they say that it’s technically possible, but…

So I kind of wasted my time there…

I’m a bit obsessed with my iPad, and my iPhone knows it

I hardly go anywhere without my iPad.

Certainly not if I’m going somewhere work-related — a meeting or an interview or whatever — because it’s easy to carry and can perform most work-related functions.

But I don’t go on vacation without it, either. And in the past, I haven’t even left the hotel or B&B without it. When we went to Thailand and Hawaii several years ago, I carried it in a drawstring bag strapped across my chest (I long ago outgrew the trying-to-look-cool thing) or back. See the embarrassing image below.

But by the time we went to Ireland and then to Boston, I’d decided if I absolutely had to do something while walking about on vacation, my phone would do. If I can keep the blasted thing charged.

Still, the iPad goes with me nearly everywhere.

And my iPhone has noticed. Lately, it’s been acting a bit sarcastic about it. Every time I leave the house now — for a walk, or to go to the grocery — I’ve started getting these notifications, like the one above, as soon as I’m a few blocks from the house.

They’re like, “Hey, you — it looks like you left your baby behind! Don’t you want to run back home and get it?”

OK, so maybe this isn’t petulance on the part of the phone. It seems to have started when I allowed the iPad to update its operating system recently. And there seems to be an easy way to turn off such notifications.

But… maybe one of these times, I really would want to go back and get it. So I’m leaving it on for now. I’ll just have to see how much it bugs me going forward…

Here I am with me mate Mark, whom I met on the road to Kanchanaburi in 2015. He’s a retired roofer from England. Note that in addition to the drawstring bag, I’m wearing my tropical-weight travel vest. So I’m really not kidding when I say I’ve outgrown trying to look cool on the road.


And next, the email…

Now that I’ve spent every spare moment I could find for several days gradually putting that post about Boston together — determined, for once, to get that done after one of my trips — I can turn to catching up on email.

Not work email — I’m up to date on that chore. I mean my personal email. The screenshot above from my iPad indicates the scale of the problem. Not exactly, but that “2,031” you see over the email icon is roughly how many I have sitting unexamined in my personal In box.

Which I hate. Of course, I’ll delete most of them — not even opening the overwhelming majority of those before I do — and save most of the few left to folders, also unread. You know, just in case they prove useful at some point in the future. Which they almost certainly will not. But even the system I have for committing unthinking mass murder of hundreds or thousands of messages can take me a couple of hours, when there are this many. Of course, it’s not the many that cause it to take so long — it’s the few I open and glance at, and perhaps even read.

Remember when — 25 to 30 years ago — we thought email was a convenience? And certainly it was, compared to snail mail, which takes so much time and physical effort to process even a single letter. It was made more seductive, there in the early-to-middle ’90s, by the fact that relatively few people out there had email, which really cut down on the volume.

Yesterday, I was talking about something else with an IT professional — no, not a funny one like this guy, but a real one — and he was talking about some new technology he was working with, and I asked him to let me know if he ran across any new technology that eliminated the hassle of email.

Trying to be helpful, he made a suggestion or two, but I had tried them already, leading to failure. For instance, he suggested creating folders in which to dump things that might require some action (or at least reading), so they can be addressed later. With bitter regret, I told him of my hundreds of such folders, which have done nothing to reduce the work — and which, of course, I pretty much never look back at. The junk just sits there.

Part of it is my personality. I’ve always been a pack rat, and I have a great, almost mortal, dread of having something in my hands at one point, throwing it away, and then desperately needing it at some later date. (This, of course, predates email. My office, or my desk if that’s all I had, would always be a forest of piles of paper. To this day, I defend this system because of something that happened once in the very early ’90s — Managing Editor Paula Ellis, knowing my habits, came to me and asked whether I had a copy of a memo that had been distributed in the newsroom several months earlier. Certainly, I said. I went immediately to the right pile and shuffled through it for a moment — then proudly handed it to her. See? My way was the right way. It may have only happened once, but it happened…)

Then there is the problem of my chosen profession — or rather, the profession that chose me. It’s very difficult for a journalist — at least this one — to throw away written information. It may not be useful later, but on rare occasions it can be critical later. This only got worse when I turned to opinion writing — and much worse than that when I took up blogging. At least, when I was a beat reporter 40-odd years ago, there was a limit to the range of things I might write about. No longer. And since I still blog, this perceived need to hang onto things has continued well past the end of my newspaper career.

Even the stupidest, most useless, boring piece of crapola — say, an appeal for money from a political campaign — can inspire me to write something, depending on my mood. Sometimes, I write only to share how stupid, useless and boring it is.

Oh, well.

I’ll turn to it later, and get it done eventually. Right now, I think I’ll turn, however briefly, to some actual paying work…


Sometimes, robots try too hard, and assume too much

Those of you who travel more than I do (these days, I hardly leave my house!) probably noticed this before, but it’s the first time I’ve run into it.

As I mentioned, we took a quick trip to Memphis a few days back, and while we were there, I looked at my phone and noticed I had an appointment with one of my doctors set for the next day — which was the day we’d be driving back. The appointment was for right about the time we’d be leaving Memphis. (I knew I had an appointment that week, but I’d thought it was later in the week.)

So I called, and they called me back the next morning as we were about to leave, and reset the appointment for this coming Friday at 9:10. So I entered that into my iPhone.

At least, I think that was the time. I just looked at my calendar, and it says 10:10. (As you can see above, I smudged the doc’s name. Y’all know I usually don’t worry too much about my own privacy in a medical context, but I try to respect that of my physicians.)

So… I’m left to assume that since I entered the time as 9:10 when I was in the CDT zone, my iPhone automatically “corrected” it when we crossed the line back into EDT.
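That would be standard behavior for calendar software, as best I can tell: the event gets stored as an absolute moment in time, and the hour displayed shifts with whatever zone the device thinks it’s in. A minimal sketch of the arithmetic, in Python (the date is made up for illustration; whether this is what iOS actually did is my assumption):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# An appointment entered as 9:10 while the device is on Central time
# is stored as an absolute instant, not as "9:10 wherever I happen to be."
appt = datetime(2022, 7, 15, 9, 10, tzinfo=ZoneInfo("America/Chicago"))

# Viewed after returning to the Eastern zone, that same instant
# displays an hour later:
print(appt.astimezone(ZoneInfo("America/New_York")).strftime("%H:%M"))  # 10:10
```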

I fixed it earlier on my phone, but it still shows up on my PC as 10:10. (Which is another technical problem.) So now I’ve put in a call to the doctor’s office to make sure.

Anyway, I hadn’t known iOS could be quite that “helpful.” Or that presumptuous…

Roy Rogers and Wonder Woman! You can’t beat that…

OK, I was kinda ragging on Google AdSense the other day for the “older women” ad they put on my blog, but here’s one I actually like.

Not enough to click on it, mind you — that’s against the rules. I just like it as a visual enhancement of the blog.

My childhood hero Roy Rogers, with the best Wonder Woman! (No offense to Gal Gadot, but this one was pretty special.)

Can’t say I appreciate the slight suggestion that there’s something salacious going on (“opens up about his life”), this being Roy and all. I mean, he’s just being polite posing for a picture. The photographer probably said, “Pretend she’s Trigger!” and Roy obliged.

Of course, she’s not Trigger, but I don’t hold that against her. Not her fault at all. She’s doing her best, and her best is pretty good…

Hey, you promised me OLDER women!

AdSense won’t let me click on its ads, but at least I can occasionally make fun of them. I think.

Yesterday, they put the one you see above in the top position on my blog. I don’t know why. You probably didn’t see it (at least, I hope not), but for some reason the algorithm thought Brad Warthen wanted to see it.

I don’t have any guilty secrets that this reveals, or I probably wouldn’t post this. I haven’t been searching for anything remotely likely to lead to such associations. Here are some of my recent Google search terms:

  • no-knock warrant raid
  • npe
  • Ramesh Ponnuru
  • where is wordle on nyt app
  • hugh weathers

Over on Amazon, I’ve looked for:

  • Wrangler Authentics Men’s Classic Relaxed Fit Cargo Short
  • The Lincoln Highway: A Novel
  • Sun Joe SJH901E 18-Inch Electric Telescoping Pole Hedge Trimmer
  • Hourleey Garden Hose Washer Rubber, Heavy Duty Red Rubber Washer Fit All Standard 3/4 Inch Garden Hose Fittings, 50 Packs

Sexy stuff, huh?

OK, wait. This morning — after I saw the ad — Pinterest showed me this pic of Norma Jeane posing for a pinup painting (between pins of “Far Side” comics and Mickey Mantle). The cutline used the name “Earl Moran,” and that made me curious enough to search for it, which turned up this bevy of softcore images. And hey, all of those women, if still alive, are older than I am.

But that was after Google showed me the ad.

Personally, I feel cheated: If you think I’ve got some kinda thing for “older women,” why don’t you show me something like this, or this? You know, hubba-hubba stuff…

Don’t try to palm off some young babe on me. Give the customer what he wants — or what you, for inexplicable reasons, think he wants…

Anyway, this post is for my “AI is artificial, but not intelligent” file…

Doesn’t everyone do this? And if there are people who don’t, what is wrong with them?

This Tweet raised a number of questions:

My initial response was simply, “I don’t know what this has to do with ‘parents,’ unless it was written by a child. Doesn’t everyone do this? And why isn’t IMDB mentioned?”

But seriously, people, when all of us are sitting there with smartphones, who doesn’t do this?

I don’t mean with Steve Buscemi. If you have to look him up, you should just quit partaking in popular culture altogether. I mean somebody a little harder, like Zoë Wanamaker. I see her all the time, of course (such as, recently, in “Britannia”), but I was thrown because she appeared in an episode from the first series of “Prime Suspect” in 1991, and I hadn’t seen her when she was that young. Also, she distracted me by stripping off her blouse to flash her breasts at a cop who was surveilling her.

Of course, some of us carry it to a greater extreme than others. Like me. My wife goes, “Who is that? Where did we see her?” But then, she generally returns her attention to the show and follows the action.

Meanwhile, several feet away, I’m on my phone’s IMDB app, researching away. Which, of course, sometimes takes several steps. Sometimes with a TV show, simply calling up the entry for the show won’t tell you who an actor or actress who appeared in only that one episode was (either because the person is buried in a long list or, too often, is missing entirely from the main page). So I might have to look up the series on Wikipedia, find the title of the specific episode, then go back to IMDB and search for that episode by name, and that usually does the trick. I then call up a representative photo of that person, show it to my wife, and tell her where she has seen him or her before.

And my wife says, “Yes,” and goes back to the show.

This presents a bit of a problem. Because even with my new hearing aids, I’m very dependent on subtitles to help me follow the dialogue. So after a couple of minutes of looking at my phone, I’m a bit lost as to what’s going on.

So I ask my wife. And tolerant as she is, this sometimes makes her a bit impatient with me. But she doesn’t call me a “parent.” She just, you know, thinks I’m a bit of a compulsive idiot.

But I can’t help it. In a world in which the computer — phone, tablet, laptop, what have you — is always right there, and always connected to the Web, I have to do this.

Before the Internet, I was sorta kinda able to focus on what was going on. The biggest problem back then was the dictionary. Always right there on the desk. Fortunately, I didn’t use it much, because I’m a fairly literate guy, and if I had to look the word up to be sure I was using it correctly, that was an indication that I probably shouldn’t be using it in the newspaper.

But I did look sometimes, and that meant I’d be lost for a while. On the way to the word in question, I’d run across other words that would trip me up and tie me down and force me to study them and the other words they led to, and it just went on and on from there. Eventually I’d get back to work, but it took a while.

And the Web is millions of times worse, of course.

But it’s not because I’m a “parent.” It’s because I’m the most easily fascinated person on the planet. It’s like my superpower, although not very empowering…

Helen Mirren and Zoë Wanamaker in “Prime Suspect” in 1991.

The Stupid Decade, and how it happened

Well, I just used up my last free read on The Atlantic — if I were to take out one more subscription, it might be that one, but I’ve really been overdoing it, so I’m holding myself back — and the piece was worth it.

One of y’all — was it Barry? — brought it to my attention the other day, and I just got around to reading the rest of it. The headline is, “WHY THE PAST 10 YEARS OF AMERICAN LIFE HAVE BEEN UNIQUELY STUPID.”

Which they have, as we all know. Or at least, all of us who were adults long enough before the last 10 years to be able to tell the difference. If we were around that long, and really, truly paying attention, we know that a lot of really crazy stuff went down before the past 10 years, as a sort of warmup, but we can tell that these last few have truly been stupid, and yes, uniquely so.

Here’s a key bit that sort of sets up the piece. I include the subhed because I always like to pat people on the back for citing Yeats. That poem has been profound since it was written, but more and more now the human race is living like we’re determined to act it out fully:

Things Fall Apart

Historically, civilizations have relied on shared blood, gods, and enemies to counteract the tendency to split apart as they grow. But what is it that holds together large and diverse secular democracies such as the United States and India, or, for that matter, modern Britain and France?

Social scientists have identified at least three major forces that collectively bind together successful democracies: social capital (extensive social networks with high levels of trust), strong institutions, and shared stories. Social media has weakened all three. To see how, we must understand how social media changed over time—and especially in the several years following 2009….

Yep, you see where it’s headed, right? We’re getting back to the Rabbit Hole.

And each time someone explores the Hole more thoroughly, I nod a little more, as it becomes clearer that this explains so much of what had been puzzling me since 2016.

You know that book I keep talking about, Sapiens? This piece makes similar observations, such as the fact that before all this stupidity, human history could be largely summed up by saying, “there is a direction to history and it is toward cooperation at larger scales.” Yep, there was. But this piece is about how things suddenly — extremely suddenly — went wrong.

I was rereading Sapiens a bit more today, and it suddenly struck me that Harari didn’t realize this megatrend had hit a major snag. That’s because his book was written in 2011 (and came out in English in 2014). So the unique stupidity hadn’t kicked in yet. In fact, Jonathan Haidt, the author of this piece in The Atlantic, considers 2011 the arguable “high point of techno-democratic optimism.” Then things fell apart.

Anyway, if you’re already with me on the whole Rabbit Hole thing, you don’t need to read all of this to be convinced — although you might enjoy it.

But I know some of you aren’t convinced yet, so I urge you to read the whole thing. Yeah, it’s more than 8,000 words, but as newsroom wags used to say about an overly long piece, it reads like 7,000….