(This post was last modified: 03-06-2024, 05:17 PM by Kyng.)

Quite a lot of topics about AI have been about how scarily powerful it is; how it's this menacing threat that might take all of those jobs; and so on. And I'm not saying those fears are unfounded or anything - but, in the hopes of allaying some of those concerns, let's look at the things that ChatGPT (and other similar chatbots) are really bad at.

Here's one to start with: hard maths problems. Yesterday, I gave the following maths problem to ChatGPT:

Quote:In how many ways can we put 8 people in 4 rooms, such that no room contains more than 3 people?

This was my solution to the problem... it's essentially an exercise in counting and binomial coefficients:

My solution

There are only four patterns that work here (ignoring re-ordering):

2,2,2,2

3,2,2,1

3,3,2,0

3,3,1,1

2,2,2,2 is the easiest one: there are 8*7/2 = 28 ways to put two people in the first room; for each of those, there are 6*5/2 = 15 ways to put two people in the second room; for each of those, there are 4*3/2 = 6 ways to put two people in the third room, and then only 2*1/2 = 1 way to put two people in the last room. So, the total is 28*15*6*1 = 2520 ways to put two people in each room.

3,2,2,1 follows a similar logic: 8*7*6/6 = 56 ways to put three people in the first room; then 5*4/2 = 10 ways to put two people in the second room; then 3*2/2 = 3 ways to put two people in the third room; then one way to put one person in the last room. But... we also have 12 ways of ordering the rooms themselves: we have four choices of which room has three people, and then three choices for which of the remaining rooms has one person. So our answer here is 56*10*3*12 = 20160 ways to do this.

You probably know what to do for 3,3,2,0. 8*7*6/6 = 56 ways to put three people in the first room; 5*4*3/6 = 10 ways to put three people in the second room; 2*1/2 = 1 way to put two people in the third room, and one way to put nobody in the last room. Then there are 12 ways to order the rooms: four choices for which room has two people, then three choices for which remaining room is empty. So there are 56*10*12 = 6720 ways to do this.

Finally, there's 3,3,1,1. There are 8*7*6/6 = 56 ways to put three people in the first room; 5*4*3/6 = 10 ways to put three people in the second room; two ways to put somebody in the third room; and one way to put the last person in the last room. Then, there are 4*3/2 = 6 ways to choose which two of the four rooms have three people in them. So, the total number of ways to do this is 56*10*2*6 = 6720 again.

Adding these together gives 2520 + 20160 + 6720 + 6720 = 36120 ways.
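For anyone who wants to follow the working, the four cases translate directly into binomial coefficients. Here's the same calculation as a quick Python sketch (just my case-by-case working above turned into code, using math.comb for the "choose" steps):

```python
from math import comb

# 2,2,2,2: choose 2 of 8, then 2 of 6, then 2 of 4, then 2 of 2
ways_2222 = comb(8, 2) * comb(6, 2) * comb(4, 2) * comb(2, 2)

# 3,2,2,1: fill the rooms in one fixed order, then multiply by the
# 4 * 3 = 12 ways of deciding which room gets 3 people and which gets 1
ways_3221 = comb(8, 3) * comb(5, 2) * comb(3, 2) * comb(1, 1) * (4 * 3)

# 3,3,2,0: same idea - 4 choices for the 2-person room, 3 for the empty one
ways_3320 = comb(8, 3) * comb(5, 3) * comb(2, 2) * (4 * 3)

# 3,3,1,1: choose which 2 of the 4 rooms get 3 people: comb(4, 2) = 6
ways_3311 = comb(8, 3) * comb(5, 3) * comb(2, 1) * comb(1, 1) * comb(4, 2)

print(ways_2222, ways_3221, ways_3320, ways_3311)  # 2520 20160 6720 6720
print(ways_2222 + ways_3221 + ways_3320 + ways_3311)  # 36120
```

Each line mirrors one paragraph of the working, so the intermediate numbers (2520, 20160, 6720, 6720) match the ones above.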

Now, I don't know whether or not my solution is correct: I have no easy way of checking it. But, I decided to give the same problem to ChatGPT, to see how it would get on:
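Actually, one easy way to check a count like this is brute force: the numbers are small enough that a few lines of Python can enumerate all 4^8 = 65,536 possible assignments of people to rooms and count the valid ones directly.

```python
from itertools import product
from collections import Counter

# Assign each of the 8 (distinguishable) people to one of 4 rooms,
# and count the assignments where no room holds more than 3 people.
count = sum(
    1
    for assignment in product(range(4), repeat=8)
    if max(Counter(assignment).values()) <= 3
)
print(count)  # 36120
```

This agrees with the hand count of 36120 above, so the case-by-case solution checks out.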

As you can see, ChatGPT made a complete pig's ear of this problem. It eventually got the same answer as me - but I had to correct it on thirteen different things before it got there. And on the final step, it even added up the four numbers incorrectly at first!

So, do you have any more examples of things that ChatGPT is bad at?

It seems insistent that Sunni Ali was born in 1442 even though there’s no evidence for this figure. For all I know, Sunni Ali MAY have been a senior citizen conqueror.

(03-06-2024, 05:53 PM)JHG Wrote: It seems insistent that Sunni Ali was born in 1442 even though there’s no evidence for this figure. For all I know, Sunni Ali MAY have been a senior citizen conqueror.

Yeah, I can imagine ChatGPT would be bad with common misconceptions.

It just crawls the web and 'learns' things by reading them... so if any falsehoods get repeated often enough, then it's going to believe them!

More like total guesswork; NOBODY as far as I know has figured out when Sunni Ali was born.

(This post was last modified: 03-10-2024, 10:39 PM by ~ True Legend ~.)

(03-06-2024, 05:15 PM)Kyng Wrote: Here's one to start with: hard maths problems. Yesterday, I gave the following maths problem to ChatGPT: "In how many ways can we put 8 people in 4 rooms, such that no room contains more than 3 people?" [...] So, do you have any more examples of things that ChatGPT is bad at?

I’m curious as to whether ChatGPT would correct you if you suggested a combination that’s wrong, e.g. 3,0,0,0 (only adds up to 3) or 4,1,2,1 (has a number greater than 3).
I’ve always suspected it’s good at things that can be googled - but unique problems? Maybe not so much.

What intrigues me, however, is how it’s able to generate essays that aren’t simply copied and pasted, yet it gets problems like this wrong. My assumption would be that this is a common problem whose answer has been wrongly calculated on various forums, and it’s picking that information up.

(03-10-2024, 10:37 PM)~ True Legend ~ Wrote: I’m curious as to whether ChatGPT would correct you if you suggested a combination that’s wrong e.g. 3 0 0 0 (adds up to 3) or 4 1 2 1 (has number greater than 3)

Good question!

I decided to put that to the test, in this chat. I asked it what 4837 x 5841 was; if you put that into a calculator, you'll get the answer 28,252,917, but instead, ChatGPT gave me the incorrect answer 28,269,717. So, I told it that I thought it was 28,252,957 (deliberately giving it an incorrect answer). And, sure enough, it came up with an argument for why 28,252,957 was actually correct.
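Of course, for raw arithmetic like this, a one-liner in any programming language settles it instantly - e.g. in Python:

```python
# Python's integers are exact, so this is a reliable check
print(f"{4837 * 5841:,}")  # 28,252,917
```

No amount of confident-sounding argument from a chatbot changes that.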

I hate to say it, but ChatGPT is utterly hopeless at maths.

Surprising. It seems to be getting the hard stuff right and easy stuff wrong. Like being able to generate essays but not do maths calculations 😂

I suppose its strength is web crawling, but someone in some online forum may have answered the multiplication wrong or something 😂

Show me the explanation if you still have it - I’m curious. Oh, do share 👀