I'm trying to generate a triangle inside a 2D array. The triangle must fit completely in the array, and the peak of the triangle must be in the middle of the first row of the 2D array.
The problem I have is: how do I calculate the "rate of change" for the two sides of the triangle so the triangle fits the length and width of any 2D array size?
I see no one has tackled this yet. Have you come up with a solution yet? It might be kind of tricky.
You can think of your array as a coordinate system, with the top-left corner being (0, 0) and the bottom-right corner being (height - 1, width - 1). Maybe that'll help as a start? That's what I used in my solution.
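In Python, for example, that mapping could look something like this (the list-of-lists layout and the blank-space fill are just one way to set it up):

    height, width = 5, 9
    # grid[row][col]: (0, 0) is the top-left cell,
    # (height - 1, width - 1) is the bottom-right cell
    grid = [[' '] * width for _ in range(height)]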
Well, you can find the "slope" from the top point down to one of the bottom corners. After that, you can use how far down you are from the top, together with the slope, to figure out how many little stars to put on that row.
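Something like this rough Python sketch of the idea (the 7x13 size, truncating with int(), and the '*' character are just my choices for the example):

    height, width = 7, 13                 # example size; any size works
    center = (width - 1) // 2             # apex column, counting from 0
    slope = center / (height - 1)         # columns the edge moves per row going down

    for row in range(height):
        reach = int(row * slope)          # how far this row spreads out from the center
        print(' ' * (center - reach) + '*' * (2 * reach + 1))

How you turn row * slope into a whole number (truncate, round, ceiling) is a free choice; it only changes which rows pick up the extra stars.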
You're not going to be able to get a perfect triangle for every array size.
This is what I get for one situation:
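For comparison only, here is a hypothetical 5-row by 12-column grid (not necessarily the case referred to above): with the center at column 5 and a slope of 5/4, truncating the reach on each row gives rows of 1, 3, 5, 7 and 11 stars, so the bottom row makes a jump:

         *
        ***
       *****
      *******
    ***********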
I don't want to just give you the solution; that defeats the purpose of learning. But I'll happily help you along the way to finding it.
So, say you have a width of 21 and a height of 13. What is the 'coordinate' of the center of the first row (keep in mind that we are indexing from 0, not from 1)? What would be the slope of the line going from that top point, (0, centerX), down to the bottom-left corner, (height - 1, 0)?
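If you want to check your numbers afterwards, the arithmetic for that case works out like this (just the two intermediate values, not the whole solution):

    width, height = 21, 13
    centerX = (width - 1) // 2       # = 10, the middle column when counting from 0
    slope = centerX / (height - 1)   # = 10 / 12, about 0.83 columns per row down to a corner
    print(centerX, slope)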
Keep in mind that this was a solution I came up with based on your original statement of finding the "rate of change". There are other ways to do this.