In your first snippet, you have a Fib function. For any n > 1, the function calls itself not once, but twice.
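For reference, a doubly-recursive version presumably looks something like this (a sketch, since your exact code isn't shown; names and base cases are assumptions):

```python
def fib(n):
    # Base cases: Fib(0) = 0, Fib(1) = 1
    if n < 2:
        return n
    # For n > 1 the function calls itself twice
    return fib(n - 1) + fib(n - 2)

print(fib(5))  # 5
```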
Jot down the logic on paper, something like:
    Fib(5)
    ├── Fib(4)
    │   ├── Fib(3)
    │   │   ├── Fib(2)
    │   │   │   ├── Fib(1)
    │   │   │   └── Fib(0)
    │   │   └── Fib(1)
    │   └── Fib(2)
    │       ├── Fib(1)
    │       └── Fib(0)
    └── Fib(3)
        ├── Fib(2)
        │   ├── Fib(1)
        │   └── Fib(0)
        └── Fib(1)
If you have some n, then what ends up happening is you get a bunch of splitting branches, and you end up making on the order of 2^n function calls, which is an awful complexity that blows up really quickly (2^5 = 32, 2^8 = 256, 2^16 = 65536, etc). (Note: it's not exactly 2^n, but it grows exponentially.)
Edit: more precisely, the number of calls is 2·Fib(n+1) − 1, which grows like φ^n where φ ≈ 1.618 is the golden ratio, but the same concept applies.
Notice how, for example, Fib(3) is computed 2 times, each independently of the other, so redundant calculations are happening.
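You can see the redundancy directly by counting the calls (a quick instrumented sketch of the naive recursion, assuming Fib(0) = 0 and Fib(1) = 1):

```python
from collections import Counter

calls = Counter()

def fib(n):
    calls[n] += 1  # record every call so we can see the duplicates
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(5)
# Fib(3) was computed 2 times, Fib(2) 3 times, Fib(1) 5 times:
print(calls)
```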
What you refer to in your second example is called "memoization". See:
http://courses.csail.mit.edu/6.006/fall09/lecture_notes/lecture18.pdf
https://medium.com/@porzingod/fibonacci-and-memoization-e99f765b97f6
Saving the value of Fib(some lower n) prevents you from having to make more recursive, exponentially increasing function calls.
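A minimal memoized sketch (dictionary cache; the function name is just illustrative):

```python
def fib_memo(n, memo=None):
    if memo is None:
        memo = {}
    if n in memo:
        return memo[n]  # reuse the saved value instead of recursing again
    memo[n] = n if n < 2 else fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

print(fib_memo(50))  # 12586269025, computed with O(n) calls
```

Each Fib(k) is now computed once and then looked up, so the call count drops from exponential to linear. In Python specifically, decorating the naive function with `functools.lru_cache` achieves the same effect without hand-rolling the cache.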