The AI Goal

I think we’ve done it already. We’ve reached “peak AI”, and now we have something to aim for in our “AI-adjacent” roles.

This has come about because I’ve just heard how a 16-year-old kid was tackled by armed police because of their tactical Doritos bag. This story is awesome, and it’s something we need to unpack.

AI is messy, like software development (as if you didn’t know)

Somewhere between machine vision and administrative panic, a teenager in Baltimore had eight police cars descend on him. Guns drawn.

They were armed with a snack and clearly had the will and ability to use it.

That’s because the latest Terminator AI flagged the teenager as a threat, and we all know you can’t go against the AI, in case our future AI overlords mark us as “uncooperative”.

Real-World Gun Detection Is Messy. So Is Software Development

This isn’t just a one-off story about AI gone rogue. We all know the limits of software development, and this story will sound familiar to anyone who’s built software in a modern tech company.

A false positive

This is your buggy test case. That flaky end-to-end test that always fails on Fridays. The intern’s regex that matches nothing and everything at once.
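
If you want to see that regex in code, here’s a hypothetical sketch (the pattern and names are mine, not from any real codebase): a single stray “|” adds an empty alternative, so the pattern matches absolutely everything.

```typescript
// A hypothetical sketch of the regex that matches nothing and everything:
// the trailing "|" adds an empty alternative, which matches any string.
const threatPattern = /gun|rifle|knife|/;

console.log(threatPattern.test("Doritos")); // true: the empty branch matches
console.log(threatPattern.test(""));        // true: it even matches "nothing"

// What was probably intended: word boundaries and no stray alternation.
const fixedPattern = /\b(gun|rifle|knife)\b/i;
console.log(fixedPattern.test("Doritos"));  // false
console.log(fixedPattern.test("Nerf gun")); // true
```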

The reviewer disagrees

You find the bug. You raise a PR and get a comment: “I’m curious to know whether you’ve run this past the business?” You reply: “This is not a gun. It’s a chip bag. Link to Slack discussion.” And you wonder whether your reviewer genuinely thinks you can’t check your own work at all.

The school principal ignores the reviewer

Management, folks. They didn’t read the ticket. They didn’t see the Slack thread. They skimmed the comment thread and decided to merge it into production at 4 p.m. on a Friday. “Let’s be safe and push it.”

Police are called

Now we’re live. Now we’re paged. Now users are tweeting screenshots of your bug like it’s a crime scene. Because in some metaphorical (and literal) sense, it is.

Why We Shouldn’t Trust AI Pipelines Without Human Context

This entire incident unfolded because a workflow involving AI, humans, and decision-makers had no concept of context and no built-in pause for a sanity check.
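
As a minimal sketch of what was missing (every name, type, and threshold below is hypothetical, not the vendor’s actual API), the fix is a triage step where model output never reaches dispatch without a human in between:

```typescript
// A minimal sketch of the human-in-the-loop gate this pipeline lacked.
// Every name and threshold here is hypothetical, not the vendor's API.

interface Detection {
  label: string;      // what the model thinks it saw
  confidence: number; // model score, 0 to 1
}

type Action = "dismiss" | "hold_for_review";

function triage(detection: Detection): Action {
  // Low-confidence hits are dropped outright.
  if (detection.confidence < 0.5) return "dismiss";

  // Everything else pauses for a person, even "certain" detections,
  // because here the cost of a false positive is armed police.
  return "hold_for_review";
}

// The chip bag scores 0.62: it waits for a human, not a SWAT team.
console.log(triage({ label: "gun", confidence: 0.62 })); // "hold_for_review"
```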

And if you’ve ever tried to integrate AI into a feature release (automatic form population, AI-generated summaries, or, god forbid, AI code suggestions), you’ll have had your “Doritos-as-a-gun” moment.

It’s the AI that puts a return null; in the middle of a function. The GitHub Copilot that writes a function to delete all user data. The AI-powered “code reviewer” that approves PRs on sight like it’s handing out candy on Halloween.
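
In case that sounds exaggerated, here’s a hypothetical sketch (mine, not a real Copilot transcript) of the mid-function return null;:

```typescript
// A hypothetical sketch of the mid-function "return null;": one
// AI-suggested line quietly turns the rest of the function into dead code.
function describeObject(label: string): string | null {
  if (!label) {
    return null;
  }

  return null; // the "helpful" suggestion: nothing below ever runs

  const threats = ["gun", "rifle", "knife"];
  return threats.includes(label.toLowerCase())
    ? "possible threat"
    : "probably a snack";
}

console.log(describeObject("Doritos")); // null, every single time
```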

Conclusion

We’ve become so obsessed with automation that we’ve forgotten the cost of false positives.

And in real life, false positives come with guns.

Like that PR you’ve decided to “vibe code” without checking properly before marking it as ready for review.

About The Author

Professional Software Developer “The Secret Developer” can be found on Twitter @TheSDeveloper

The Secret Developer is starting to believe that the end is near.
