AI, Tools, and Expectations

Last week was about my kids using ChatGPT and not limiting our thinking.  This week is about AI, tools, and expectations.

Let’s say you need to assemble something, so you grab a wrench out of your toolbox.  You use the wrench to fasten the nuts and bolts.  Then you realize there are screws you need to insert.  Your wrench can’t drive screws.  Does that make the wrench bad?  Would you throw the wrench away because it couldn’t handle this particular task?  I’m guessing you wouldn’t.  I hope you’d recognize the value and the limitations of the wrench, and of every other tool in your arsenal.

Let’s connect some dots.  We should apply this same thinking to AI.  I’ve been in conversations exploring different AI tools and heard people say, “It can’t do X, so I don’t know if it’s any good.”  Have you ever heard someone say something like that?  That’s like saying, “This wrench doesn’t work in every single situation, so wrenches must be bad.”  It’s true that the tool couldn’t do X.  However, it could do A, B, and C and get you 70% of the way there in minutes versus the weeks it would take to do the work manually.  That is powerful.  That is valuable.

Whether it’s wrenches and screwdrivers from a toolbox or types of AI applications, it’s important to have the right expectation for each tool.  We don’t expect a wrench to be perfect and solve every problem.  Instead, we understand we need a variety of tools to be successful.  In the same way, we shouldn’t treat AI as if it were just one tool.  AI spans a variety of tools and use cases, each with its own benefits and limitations.

The challenge: How will you properly set expectations for various AI tools?

Have a jolly good day,

Andrew Embry