&& is a logical operator and & is a bitwise operator. They are not the same, and it isn't a matter of which one is faster but a matter of which one is correct to use.
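For example, with a = 1 and b = 2 the two operators give opposite answers. A minimal C sketch (C is an assumption here, since the thread never names a language):

#include <stdio.h>

int main(void)
{
    int a = 1;          /* binary 01 */
    int b = 2;          /* binary 10 */

    /* Bitwise AND: 01 & 10 share no set bits, so the result is 0 (false). */
    printf("a & b  = %d\n", a & b);     /* prints 0 */

    /* Logical AND: both operands are nonzero, so the result is 1 (true). */
    printf("a && b = %d\n", a && b);    /* prints 1 */

    return 0;
}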
I know what the difference is between the two, though I don't see how what they actually end up doing is any different in this case. The first one creates a boolean value, which the if statement tests for truth, and the && really just means 'and'.
I tested it and they both came out with the same results...
The first one creates a boolean value, which the if statement tests for truth, and the && really just means 'and'.
They both mean 'and'.
The first one (&) ANDs the individual bits of its two operands, while the second one (&&) treats each operand as a whole, evaluates it to true or false, and ANDs those results. On top of that, && short-circuits: if the left operand is false, the right one is never evaluated at all.
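Here's a minimal C sketch that makes the short-circuit difference visible (again, C is an assumption, and check() is a hypothetical helper invented just to show when each operand runs):

#include <stdio.h>

/* Hypothetical helper: prints when it's evaluated, always returns "false". */
int check(const char *label)
{
    printf("evaluated: %s\n", label);
    return 0;
}

int main(void)
{
    /* && short-circuits: the left side is false, so the right side never runs. */
    if (check("left of &&") && check("right of &&"))
        puts("unreachable");

    /* & always evaluates both operands (in an unspecified order). */
    if (check("left of &") & check("right of &"))
        puts("unreachable");

    return 0;
}

Running it prints "left of &&" once, but both "left of &" and "right of &", which is exactly why the two aren't interchangeable when the right-hand side has side effects (or would crash, like dereferencing a null pointer).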
It's entirely possible that the two will produce similar or even identical code. But if not, any speed difference between them would be negligible.
Let the compiler do its thing. Don't try to optimize prematurely. It's a waste of time and energy, and it's often counterproductive: you end up butting heads with how the compiler wants to optimize, and you end up with slower, more confusing code.