An algorithm is a definite computational method with the additional property of finiteness. Equivalently, an algorithm is a generic algorithm with the additional property of definiteness.
Thus an algorithm has inputs and outputs and satisfies all of the properties of definiteness, finiteness, and being resource-constrained (which implies effectiveness).
A computational method that is intended to be an algorithm but that sometimes fails to terminate (i.e., some of its execution sequences are infinite) is often still called an algorithm, albeit one with a "bug" in it.
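A minimal sketch of this distinction, using the subtraction-based form of Euclid's gcd as an illustrative example (the function names are chosen here for illustration): the first version is definite at every step, yet some of its execution sequences are infinite, so it is a computational method with a termination bug rather than an algorithm; the remainder-based version restores finiteness.

```python
def gcd_buggy(a: int, b: int) -> int:
    # Subtraction-based Euclid's method. Intended as an algorithm,
    # but if b == 0 and a > 0, the branch below computes a = a - 0,
    # leaving a unchanged forever: that execution sequence is infinite.
    while a != b:
        if a > b:
            a = a - b
        else:
            b = b - a
    return a

def gcd_fixed(a: int, b: int) -> int:
    # Remainder-based version: b strictly decreases toward 0 on every
    # iteration, so every execution sequence is finite (an algorithm).
    while b != 0:
        a, b = b, a % b
    return a
```

Both versions agree on inputs where the first one happens to terminate (e.g., both return 6 for the pair 12, 18); the difference is that finiteness holds for `gcd_fixed` on all valid inputs, not just some.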