What’s in the Box?
I touched on this a bit when I wrote about the overwhelming wave of AI at CES this past year, but it really hit me when I added the BMW iX to the house, my first full EV. What I observed at CES, something I’m now living firsthand, was that we’ve drifted into this entirely new world of software behaviors that we may not even know are occurring. We can no longer look at any black box and know what’s inside. Maybe you don’t care, but I kind of do. At least I think I should care!
We know an analog light switch turns on and off, but we can no longer tell just by looking whether a switch is digital. It could be run by software that is quietly and autonomously running ten other things at the same time, and those ten things may not be binary at all. I picture some old lady in Ohio asking me to stop flicking her lights.
Think of my prior post about the WeatherFlow Tempest and all the data collected. It’s remarkable what it can sense and record and chart. Yet, it doesn’t end with that device. Add enough of them in a series, and you can accurately predict what’s coming, and homes could potentially react all on their own. My thermostats in the house could plan ahead for cold or hot and bring in awnings, or cover vulnerable windows. Maybe even move cars into the garage.
None of this bothers me. What does concern me is when we don’t know the limitations, or even the variables, a black box may be acting on, entirely without our knowledge. I’m not sure I care, but that worries me too.
You could buy an iX and never know the depth of technology within the car or what it’s actually doing as you drive. It goes way beyond autonomous driving: the car accounts for upcoming elevation changes along the route to your destination and knows ahead of time when and where to apply regenerative braking, including when to let it rip down the hill. Nobody discussed those decisions with me; the car does it for me. I get to adjust the level of regenerative braking for “one pedal” driving, but how would I know the extent of that variability unless I dug in or someone pointed it out to me? After all, it’s all software-driven. I learned it from a YouTube video!
CES 2023 marked the year, for me, that we collectively lost awareness of what AI-enabled devices are capable of. They have moved beyond our control, and in many cases we don’t even know it. We just shrug our shoulders and think, wow, the car drives great, with almost zero awareness of the decisions made on our behalf as we drive. In fact, that’s the goal: to make changes without our perception.
I was slowly moving the iX in the driveway when my dog strolled behind the car, not in any danger. I saw him in the cameras, but the car said “nope” and stopped. The beep didn’t give me a choice of what to do; it just stopped the car. It decided what was best for me and Tide.
Walking around CES 2023, it struck me that devices had moved overwhelmingly beyond our awareness of what’s happening, and that will only accelerate. We humans never expressed the slightest hesitancy about giving up control. We just accept it, like my car deciding to stop without asking me first, and we call it a feature. It was a decision taken away from me. The car’s AI took authority over my cautious driving, all in the name of safety. My car decided it could make a better call about possibly hitting my dog than I could, despite the fact that both of us were aware of our movements; he was just moving from one side of the car to the other, as he often does when I move a car to sweep the garage. Neither of us was worried.
There is a lot of press about large language models rewriting history and injecting bias into our lives that we never intended. Well, decisions are being made right now in our "best interest." I don’t think we understand the degree to which we’ve already given up control, and we don’t seem to care. I’m still trying to decide how much I should care.
I guess I’m okay unless it locks the doors when it sees someone I shouldn’t date. Or worse, locks the doors with them inside when it’s someone I should.