Solving Minesweeper – Part 11: I’m Done

In the last post we solved the sizing issue, but I mentioned that I still needed to adapt a few things. Apart from the usual small code optimizations and cleanups, there were two main ones: the color approximation and the way we process the frames.

The main idea behind the buckets was good. The issues showed up at the extremes, with the smallest and largest cells. If you remember, there was a threshold set to 5, the acceptable difference between buckets for a color to count. For the smallest images that was too small and excluded the actual cell value, and if I raised it there were some misreadings (although in hindsight these might have been caused by the frame processing). But let’s look at what I did.

void GameImageAnalyzer::ExtractColor(Mat* src, int* col, Rect roi)
{
	int stx = roi.x;
	int sty = roi.y;
	int enx = roi.x + roi.width;
	int eny = roi.y + roi.height;
	int count = 0;
	int totals[3]{ 0 };
	for (int i = stx; i < enx; i++)
	{
		for (int j = sty; j < eny; j++)
		{
			auto c = src->at<Vec3b>(j, i);
			auto diff01 = abs(c.val[0] - c.val[1]);
			auto diff02 = abs(c.val[0] - c.val[2]);
			auto diff12 = abs(c.val[1] - c.val[2]);
			// skip grayish pixels: all three channels within 30 of each other
			if (max(diff01, max(diff02, diff12)) < 30)
				continue;

			totals[0] += c.val[0];
			totals[1] += c.val[1];
			totals[2] += c.val[2];
			count++;
		}
	}

	if (count > 0)
	{
		// average of the non-gray pixels per channel
		col[0] = totals[0] / count;
		col[1] = totals[1] / count;
		col[2] = totals[2] / count;
		// percentage of colored (non-gray) pixels within the cell
		col[3] = (100 * count) / (roi.width * roi.height);
	}
}
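
To make the grayish filter a bit more tangible, here is a tiny stand-alone check of that condition on two made-up BGR pixels. The IsColored helper and the sample values are purely illustrative, not part of the solver:

#include <opencv2/core.hpp>
#include <algorithm>
#include <cstdlib>
#include <cstdio>

// Purely illustrative helper: the same "all channels within 30 of each other"
// test that ExtractColor uses to skip grayish pixels.
static bool IsColored(const cv::Vec3b& c)
{
	int d01 = std::abs(c.val[0] - c.val[1]);
	int d02 = std::abs(c.val[0] - c.val[2]);
	int d12 = std::abs(c.val[1] - c.val[2]);
	return std::max(d01, std::max(d02, d12)) >= 30;
}

int main()
{
	printf("%d\n", IsColored(cv::Vec3b(130, 128, 125))); // 0: grayish background pixel, skipped
	printf("%d\n", IsColored(cv::Vec3b(235, 120, 25)));  // 1: blue digit pixel, counted
	return 0;
}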

I got rid of the bins and instead calculate the average of the non-gray colors, which turned out to be more accurate than the binning. Looking more closely at the binning through debugging, I saw that in some cases a channel was spread more evenly than I expected. As a result, on the medium size we selected one bin, but on the big size, where the darker shadow colors in the middle carry more weight, we selected a different bin for the same channel. Long answer, but it’s more accurate now.

In addition to this, I decided to record the ratio of colored pixels within the whole cell. This helped detect the initial blueish look of the UNDISCOVERED cells. Changing this of course meant changing the matching algorithm as well, though not in a big way: just how we apply the threshold and how many components we check. As you can see, I allow a deviation of at most 40 on any of the 4 components.

int GameImageAnalyzer::GetBestMatch(Mat* src, int start, Rect rect)
{
	int components[COMP_COUNT]{ 0 };
	ExtractColor(src, components, rect);

	int min_dist = 40, match = -1; // obtained through testing
	for (int i = start; i < TEMPLATE_COUNT; i++)
	{
		int dist = 0;
		// the distance is the largest per-component deviation from the template
		for (int c = 0; c < COMP_COUNT; c++)
			dist = max(dist, abs(temp_colors[i][c] - components[c]));
		if (dist < min_dist)
		{
			min_dist = dist;
			match = templates[i];
		}
	}

	return match;
}
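
For completeness: temp_colors and templates are the pre-recorded reference data. I haven’t reproduced the real table here, but just to illustrate the shape of it, it could look roughly like this (the state names and all the numbers below are invented for the example):

// Hypothetical cell states; the real solver uses its own identifiers.
enum CellState { UNDISCOVERED = -1, ONE = 1, TWO = 2 /* ... */ };

const int COMP_COUNT = 4;

// B, G, R averages plus the colored-pixel percentage, in the same order
// that ExtractColor fills them in. All values here are made up.
int temp_colors[][COMP_COUNT] = {
	{ 210, 150,  60, 95 },  // UNDISCOVERED: almost the whole cell is light blue
	{ 180,  90,  30, 20 },  // ONE: thin dark-blue digit on a gray background
	{  70, 140,  40, 25 },  // TWO: green digit
};

// templates[i] is the value GetBestMatch returns when row i wins.
int templates[] = { UNDISCOVERED, ONE, TWO };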

The second problem is a bit trickier. Since we are comparing mean colors, the animated transition from the filled blue cell to its actual value can cause a lot of issues: mid-animation the mean color can be very close to a 1, 4 or 7, since they are all different shades of blue. The way I actually noticed was that the solver started losing more games, even marking cells as bombs that weren’t, and looking at its memory it had detected some cells as sevens.

To get around that, we need to stop analyzing while an animation is running. In other words, before the analysis we need to wait until the image becomes still. No more ludicrous speed. My first implementation kept a copy of the previous image and compared against it. Since that felt a bit slow and inelegant, I decided to compute a hash of the image by adding up all of its pixels. A plain sum is generally not a great hash, but in my case the image only gets brighter as the game progresses, so there is virtually no chance of false positives. The main method changes in the following way.

int tries = 0;
while (tries < MAX_TRIES)
{
	tries++;
	Sleep(25);
	gm.GetImageInto(imageAnalyzer.img.data);
	auto hash = imageAnalyzer.GetHash();

	// the frame is considered still once two consecutive hashes match
	if (prevHash == hash && imageAnalyzer.Analyze())
		break;

	prevHash = hash;
}

I don’t think it needs much explanation. In short, we loop until two consecutive hashes are equal, and when they are, we run the analysis and stop if it succeeds (usually this takes between 200 and 700 milliseconds).
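
One thing I haven’t shown is GetHash itself. A minimal sketch of the “add up all the pixels” idea, assuming img is the analyzer’s CV_8UC3 (BGR) frame, could look like this; the actual method in the repo may differ in the details:

// Minimal sketch: the "hash" is simply the sum of every channel of every pixel.
// Assumes img is a CV_8UC3 (BGR) Mat; not necessarily the exact repo version.
uint64_t GameImageAnalyzer::GetHash()
{
	uint64_t sum = 0;
	for (int r = 0; r < img.rows; r++)
	{
		const Vec3b* row = img.ptr<Vec3b>(r);
		for (int c = 0; c < img.cols; c++)
			sum += row[c][0] + row[c][1] + row[c][2];
	}
	return sum;
}

An alternative would be to call cv::sum(img) and add up the three per-channel totals it returns, but the explicit loop keeps it obvious what is being compared.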

Watching this video fills me with a great amount of pride, but also a bit of bittersweet nostalgia. I need to find myself another challenge. Before I leave, for those of you who want to check out the whole code, here is the repo. Enjoy!