Algorithm
An algorithm is a sequence of simple tasks, steps, or operations, carried out according to a definite rule.
Examples
Schoolchildren learn algorithms for computing the sums, products, and quotients of numbers.
There are algorithms to compute the product of two matrices, which is the matrix of the composite of the corresponding linear maps.
There can be no algorithm to decide whether two presentations present the same group.
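The first example can be made concrete. A minimal Python sketch of the grade-school addition algorithm, processing digits right to left with a carry (the function name and representation are choices of this sketch, not standard):

```python
def grade_school_add(a, b):
    """Add two non-negative integers digit by digit, as taught in school."""
    xs = [int(d) for d in str(a)][::-1]  # least significant digit first
    ys = [int(d) for d in str(b)][::-1]
    digits = []
    carry = 0
    for i in range(max(len(xs), len(ys))):
        # One column sum is one simple step of the algorithm.
        column = carry + (xs[i] if i < len(xs) else 0) + (ys[i] if i < len(ys) else 0)
        digits.append(column % 10)  # digit written in this column
        carry = column // 10        # carry into the next column
    if carry:
        digits.append(carry)
    return int("".join(str(d) for d in reversed(digits)))
```

Each column sum is a "simple step" in the sense of the definition above: it involves only a bounded amount of work, regardless of how long the numbers are.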
Time complexity
The time complexity of an algorithm is how many simple steps the algorithm takes on an input of a given size.
The most practical measure of time complexity, in standard mathematics, is "big O notation". [TODO maybe I should explain it in the function article.] It describes the growth of the number of steps in the limit where the input size goes to infinity.
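To illustrate step counting, here is a sketch of linear search instrumented to report how many times its loop body runs; the step counter is an illustrative addition, not part of the algorithm itself:

```python
def linear_search(items, target):
    """Scan a list front to back; return (index of target or -1, steps taken).

    The loop body is the "simple step". In the worst case (target absent)
    it runs len(items) times, so the time complexity is O(n).
    """
    steps = 0
    for i, x in enumerate(items):
        steps += 1
        if x == target:
            return i, steps
    return -1, steps
```

Doubling the size of the input doubles the worst-case step count, which is exactly what the O(n) bound predicts.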
Of course, since actual inputs are finite, it is not clear that this asymptotic measure will have anything to do with reality. Nonetheless, we observe empirically that it usually does: algorithms with asymptotically low time complexity usually run fast in practice, and algorithms with asymptotically high time complexity usually run slowly in practice. There are, however, exceptions in both directions. An exception to the former generality is that there are dense matrix multiplication methods that are asymptotically better than the obvious O(n^3) method, but are slow for the relatively small matrices that people actually want to multiply. An exception to the latter generality is that in compilers, some type-checking algorithms have exponential worst-case time complexity, but run fast on the relatively simple programs that people actually write.
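For reference, a sketch of the obvious dense matrix multiplication method, whose triple loop performs O(n^3) multiplications for n-by-n matrices (the list-of-lists representation is a choice of this sketch):

```python
def mat_mul(A, B):
    """Multiply matrices A (n x m) and B (m x p) with the obvious triple loop.

    For n x n matrices this takes O(n^3) scalar multiplications. Asymptotically
    better methods exist (Strassen's runs in roughly O(n^2.81)), but their
    bookkeeping overhead makes them slower on small matrices.
    """
    n, m, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):
            for j in range(p):
                C[i][j] += A[i][k] * B[k][j]
    return C
```

The asymptotically faster methods trade a lower exponent for a much larger constant factor, which is why the obvious method wins at the sizes most users encounter.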