Of course it's a good thing: you're giving the computer as much information as possible about what you're working with. If you don't, it has to guess, and that guessing costs you either memory (e.g. in Visual Basic, if you don't declare a type you get a Variant, which consumes far more memory than any of the base types) or runtime, when the interpreter has to decide on the fly how to convert between types.
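To make the "guessing" concrete, here's a small PHP sketch (PHP being the language in the hash example below) where the same undeclared variable gets reinterpreted at runtime depending on the operator:

```php
<?php
$x = "5";          // no declared type; PHP happens to store a string

var_dump($x + 1);  // int(6)      - '+' forces a numeric interpretation
var_dump($x . 1);  // string "51" - '.' forces a string interpretation

// Every such operation pays for a runtime type check and conversion
// that a declared type would have settled once, up front.
```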
Worse, you get all sorts of weird fringe bugs from inconsistent type handling, in ways you'd never immediately suspect.
I've seen software glitch out on password hash comparisons because of this. If the system is a bit older and stores hashes as bare hex strings (MD5, SHA-1, even SHA-256; the algorithm doesn't matter so much as the representation), sooner or later you'll get a hash stored in the database as 0e462097431906509019562988736854 or similar - the key point is that it's a 0 followed by an e and nothing but digits, which PHP's loose comparison reads as scientific notation. Depending on exactly what goes down in the code, you could end up doing a comparison such that PHP will get confused about the type *and convert it to a number first*. Which means your string of all those digits goes away and you end up comparing a 0 to something. It's certainly not impossible (and has happened in the wild) that you can get the password *wrong* and still be let in by way of bad comparisons here.
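A minimal sketch of the failure mode; the two hash values here are illustrative "magic hash" strings, not from any real system:

```php
<?php
// Two *different* hashes that both look like scientific notation to PHP:
// "0e" followed only by digits parses as 0 * 10^N, i.e. zero.
$stored   = "0e462097431906509019562988736854";
$computed = "0e830400451993494058024219903391";

var_dump($stored == $computed);  // true: both numeric strings become float(0)
var_dump($stored === $computed); // false: strict comparison never converts

// hash_equals() compares byte-by-byte (and in constant time):
var_dump(hash_equals($stored, $computed)); // false
```

The fix is to never use `==` on security-sensitive strings: use `===` or, better, `hash_equals()`.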
Having stricter typing also means you can flag your functions to say 'I want this type and this type', and if something else comes in, you know there's something wrong with the logic of your program - it won't slip through the net and be silently ignored. Too many bugs are the result of values not being handled as the type they actually are.
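PHP itself supports this now: with scalar type declarations and `strict_types` enabled, a wrong type is an immediate TypeError instead of a silent coercion. A sketch (the function name is made up for illustration):

```php
<?php
declare(strict_types=1);

// The signature both documents and enforces what the function accepts.
function verifyPassword(string $candidateHash, string $storedHash): bool
{
    return hash_equals($storedHash, $candidateHash);
}

verifyPassword("abc123", "abc123"); // fine
verifyPassword(0, "abc123");        // TypeError: int given where string expected
```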