Like most people on the planet, I’ve been using ChatGPT lately. It has been really neat to learn how this technology works and how to interact with it for best results. I’ve also been taking a mathematics class at the local college, and I thought I would try a few of the homework problems with ChatGPT. It didn’t go well. Here’s a pretty simple problem that deals with counting combinations:

Me: A catering service offers 6 appetizers, 12 main courses, and 6 desserts. A banquet committee is to select 2 appetizers, 6 main courses, and 2 desserts. How many ways can this be done?

ChatGPT: The number of ways to select 2 appetizers out of 6 is given by the combination C(6,2) = 15.

The number of ways to select 6 main courses out of 12 is given by the combination C(12,6) = 924.

The number of ways to select 2 desserts out of 6 is given by the combination C(6,2) = 15…

So far so good.
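Those combination counts do check out. Here’s a quick sanity check in Python, using math.comb (available in Python 3.8+):

```python
import math

# Each count is "n choose k": the number of ways to pick k items from n.
appetizers = math.comb(6, 2)   # 15
mains = math.comb(12, 6)       # 924
desserts = math.comb(6, 2)     # 15

print(appetizers, mains, desserts)  # 15 924 15
```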

ChatGPT: Therefore, the total number of ways to select 2 appetizers, 6 main courses, and 2 desserts is given by the product of these combinations:

15 * 924 * 15 = 204,750

So there are 204,750 ways to make this selection.

And this is where it loses it completely. 15 x 924 x 15 definitely does not equal 204,750. It equals 207,900. I even tried correcting it a few times, giving it the correct answer, but it stuck with its original number.
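You can confirm the correct product yourself in one line:

```python
import math

# Multiply the three selection counts together: C(6,2) * C(12,6) * C(6,2)
total = math.comb(6, 2) * math.comb(12, 6) * math.comb(6, 2)
print(total)  # 207900, not the 204750 ChatGPT insists on
```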

I truly don’t understand this. How can something this advanced not be able to do simple arithmetic? It would be one thing if it couldn’t handle complex math or orbital mechanics calculations, but I’m asking it to do a simple multiplication and it botches it pretty hard. It’s unbelievable to me that this is possible. There is definitely a great deal about ChatGPT, LLMs, etc. that I don’t know, but I would not have guessed that “can’t do multiplication” would be a sticking point.