Because it falls off too fast. If you have infinitely many points emitting light, and the intensity you receive from a point at distance x is k/x² (it is: the intensity of light falls off proportional to the square of the distance), you will still see only a finite amount of light, provided the density of light sources is constant (or at least doesn't increase as you go further out).
I can give you a one-dimensional example to prove my point. Say you're at the endpoint of a ray (a line that starts at a point and goes off to infinity), and at every inch mark along the ray there's a light that emits one lumen divided by ten to the power of its distance from you. Now, how much light will you see? From the first light you'll see 1/10¹ lumen, i.e., a tenth of a lumen. From the second you'll see a hundredth of a lumen, which together with the tenth from the first makes 0.11 lumens. From the third you'll see a thousandth of a lumen, for 0.111 lumens total.
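The running total above is easy to check numerically. A quick sketch (the function name is mine, not part of the example):

```python
# Total light from the first n lights on the ray:
# the light at distance k inches contributes 1/10**k lumens.
def total_light(n):
    return sum(1 / 10**k for k in range(1, n + 1))

# Matches the hand计算 above: 0.1, then 0.11, then 0.111, ...
for n in (1, 2, 3):
    print(n, total_light(n))
```

Pushing n higher just appends more 1s after the decimal point, creeping toward 1/9.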
As you can see, this pattern goes on forever: even with infinitely many lights, the total intensity you see is only 0.11111111... lumens, i.e., a ninth of a lumen. An infinite number of lights produces a finite amount of light. (I promise you the same holds if the light instead falls off with the square of the distance: in that case you'd see a total of exactly π²/6 = 1.6449... lumens. But that's harder to show, and in any case irrelevant here.)
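For the inverse-square version of the parenthetical, the partial sums of 1/k² can be watched converging toward π²/6 (this is the Basel problem; the sketch below is mine):

```python
import math

# Total light if the light at distance k contributes 1/k**2 lumens.
def inverse_square_light(n):
    return sum(1 / k**2 for k in range(1, n + 1))

# The partial sums approach pi**2/6 = 1.6449..., though slowly:
# the sum of the first n terms undershoots by roughly 1/n.
print(inverse_square_light(1000))
print(math.pi**2 / 6)
```

Convergence here is much slower than in the 1/10^k example, which is part of why the exponential version is the easier one to see by hand.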