So we have two different things: the limit of a sequence, and the limit of a function, which generalizes the sequence idea.
Let's start with a sequence {a_1, a_2,...,a_i,...} in R. We say that a is the limit of {a_1, a_2,...,a_i,...} if for every r > 0 there exists a natural number n such that for all i > n, |a - a_i| < r. That is to say, {a_1, a_2,...,a_i,...} converges to a.
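This definition can be sketched numerically. Here is a minimal check, using the (assumed, not from the text) example a_i = 1/i, which converges to a = 0: for each tolerance r we exhibit an n that works.

```python
import math

def n_for_tolerance(r):
    """Return an n such that |a - a_i| < r for all i > n,
    for the example sequence a_i = 1/i with limit a = 0."""
    # |0 - 1/i| < r whenever i > 1/r, so n = ceil(1/r) works.
    return math.ceil(1 / r)

for r in [0.5, 0.1, 0.001]:
    n = n_for_tolerance(r)
    # spot-check the definition on a stretch of indices past n
    assert all(abs(0 - 1 / i) < r for i in range(n + 1, n + 1000))
```

The choice n = ceil(1/r) is specific to this sequence; the definition only demands that *some* n exists for each r.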
Similarly, we can say that if the sequence {b_1, b_2,...,b_i,...} converges to b, then the telescoping series b_1 + (b_2 - b_1) + (b_3 - b_2) + ... converges to b, and conversely, if the series b_1 + b_2 + b_3 + ... converges to b, then its sequence of partial sums {b_1, b_1+b_2,...,b_1+...+b_i,...} converges to b.
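Both directions of this correspondence can be sketched in code. The example sequences below (1 - 1/i with limit 1, and the geometric series summing to 1) are assumptions for illustration, not from the text.

```python
from itertools import accumulate

# Direction 1: a convergent sequence gives a telescoping series
# whose partial sums are exactly the original sequence.
a = [1 - 1 / i for i in range(1, 101)]              # a_1, ..., a_100 -> 1
terms = [a[0]] + [a[i] - a[i - 1] for i in range(1, len(a))]
partial_sums = list(accumulate(terms))              # a_1, a_1+(a_2-a_1), ...
assert all(abs(p - ai) < 1e-12 for p, ai in zip(partial_sums, a))

# Direction 2: a convergent series gives a convergent sequence
# of partial sums (geometric: 1/2 + 1/4 + ... -> 1).
b = [1 / 2 ** i for i in range(1, 30)]
sums = list(accumulate(b))                          # b_1, b_1+b_2, ...
assert abs(sums[-1] - 1) < 1e-6
```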
One thing to be careful about regarding series: some series appear to converge to multiple values. We do not consider such series to be convergent.
Consider the series 1 + (-1) + 1 + (-1) +...
This series does not converge.
But you could say, "1 + ((-1) + 1) + ((-1)+1)+... = 1 + 0 + 0 + ... = 1" so it converges to 1, and also "(1+ (-1)) + (1 + (-1)) + (1 + (-1)) + ... = 0 + 0 + 0 + ...= 0" so the series converges to both 1 and 0.
We do not like this in almost all cases (the ones where we allow multiple convergences are very interesting though).
We want series to converge to one value, so we say that the series 1 + (-1) + 1 + (-1) + ... does not converge. We can see this from the sequence of partial sums: a_1 = 1, a_2 = 0, a_3 = 1, a_4 = 0 and so on. Consider r = 1/2. There is no n such that for all i > n, |1 - a_i| < 1/2, because for i even, |1 - a_i| = 1; and there is no n such that for all i > n, |0 - a_i| < 1/2, because for i odd, |0 - a_i| = 1.
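The oscillation argument above can be made concrete: the partial sums of 1 + (-1) + 1 + (-1) + ... bounce between 1 and 0 forever, so no tail of them stays within r = 1/2 of any single value.

```python
from itertools import accumulate, islice

terms = ((-1) ** i for i in range(10 ** 6))     # 1, -1, 1, -1, ...
partials = list(islice(accumulate(terms), 20))  # partial sums: 1, 0, 1, 0, ...

# With r = 1/2, neither candidate limit works: past any cutoff,
# both 1 and 0 keep recurring among the partial sums.
r = 0.5
for candidate in (0, 1):
    assert any(abs(candidate - p) >= r for p in partials[10:])
```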
Now for the limit of a function:
Consider then the function f taking R to R where f(x) = x^2. Now take the limit as x approaches 0 from the right, i.e. restricting attention to x > 0. Then lim_{x->0+} f(x) = 0, because for all r > 0 there exists a d > 0 such that whenever 0 < x < d, |f(x) - f(0)| < r. In other words, as x approaches 0 from the right, the square of x gets arbitrarily small, so the limit is 0.
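This one-sided limit can be sketched by exhibiting a witness for each tolerance: for f(x) = x^2 at 0, the choice d = sqrt(r) works (an assumption specific to this f, since x < sqrt(r) forces x^2 < r).

```python
import math

def f(x):
    return x * x

for r in [0.5, 0.01, 1e-6]:
    d = math.sqrt(r)                         # witness: d = sqrt(r)
    # spot-check points in (0, d): f stays within r of f(0) = 0
    xs = [d * k / 1000 for k in range(1, 1000)]
    assert all(abs(f(x) - 0) < r for x in xs)
```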
Then consider the limit of f(x) = x^2 as x goes to infinity. We say the limit of f(x) as x goes to infinity is infinity (or, in other words, that it has no finite limit) because for every candidate limit y in R and every r > 0, there is no x_0 in R such that |f(x) - y| < r for all x > x_0. Equivalently, for every bound M > 0 there exists an x_0 such that f(x) > M for all x > x_0. In other words, x^2 gets arbitrarily big as x approaches infinity. This limit diverges.
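The divergence criterion can be sketched the same way: for each bound M we exhibit an x_0 past which f exceeds M. The witness x_0 = sqrt(M) is an assumption specific to f(x) = x^2.

```python
def f(x):
    return x * x

def x0_for_bound(M):
    """Return an x_0 such that f(x) > M for all x > x_0.
    For f(x) = x^2, x_0 = sqrt(M) works."""
    return M ** 0.5

for M in [10, 1e6, 1e12]:
    x0 = x0_for_bound(M)
    # spot-check points beyond x_0: f exceeds the bound M
    assert all(f(x0 + step) > M for step in [0.001, 1, 100])
```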