I'm reading a matrix from a file. The first line gives the number of rows and columns, and the rest is the grid:
5 6
R F F F F F
F F F F F F
R R R F F F
F F F F F F
F F F F F F
I store it in a 2D array called arr, converting each R to 0 and each F to 1. d1 is the number of rows and d2 is the number of columns:
for(int i = 0; i < d1; i++)
{
    for(int j = 0; j < d2; j++)
    {
        int binary = 0;
        input >> let;
        if(let == 'F')
            binary = 1;
        arr[i][j] = binary;
    }
}
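For context, here is a minimal sketch of the setup the loop above assumes: an open std::ifstream named input, a char named let, and int arrays big enough for the grid (the file name and array bounds here are just placeholders):

#include <fstream>

int main()
{
    std::ifstream input("grid.txt"); // placeholder file name
    int d1, d2;
    input >> d1 >> d2;               // dimensions from the first line

    int arr[100][100];               // placeholder fixed bounds
    int S[100][100];
    char let;

    // ... reading loop and DP loops from this post go here ...
}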
Then I create a second array S from the previous one. S[i][j] is meant to hold the side length of the largest square of 1s whose bottom-right corner is at (i, j). I fill in the first column and the first row by copying them straight from arr, since a square ending there can be at most 1x1:
for(i = 0; i < d1; i++)
{
    S[i][0] = arr[i][0];
}
for(j = 0; j < d2; j++)
{
    S[0][j] = arr[0][j];
}
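For the sample file, that gives a first column of 0 1 0 1 1 (top to bottom) and a first row of 0 1 1 1 1 1.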
This works fine, but the next part is where the error lies. Every other cell can only extend a square as far as its left, upper, and upper-left neighbours allow, so I take the minimum of those three entries and add 1:
// calculate the rest of the table
for(i = 1; i < d1; i++)
{
    for(j = 1; j < d2; j++)
    {
        if(arr[i][j] == 1)
        {
            S[i][j] = min(S[i][j-1], S[i-1][j], S[i-1][j-1]) + 1;
        }
        else
            S[i][j] == 0;
    }
}
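If I trace the recurrence by hand on the sample input (assuming I haven't slipped anywhere), S should come out as:

0 1 1 1 1 1
1 1 2 2 2 2
0 0 0 1 2 3
1 1 1 1 2 3
1 2 2 2 2 3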
Instead it gives me those weird numbers. Can anyone help me figure out why? Any help is appreciated.
For reference, the min function is:
int min(int a, int b, int c)
{
    int m = a;
    if(m > b)
        m = b;
    if(m > c)
        m = c;
    return m;
}
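As an aside, I believe the same three-way minimum could also be written with the initializer-list overload of std::min from <algorithm> (C++11); the name min3 here is just to avoid clashing with my own min:

#include <algorithm>

// smallest of three values via std::min with an initializer list
int min3(int a, int b, int c)
{
    return std::min({a, b, c});
}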