I think the best way to answer the OP's question is that there are two reasons that very high voltages are used. Unksol mentioned them, but I will expand on them.
1. The amount of current a wire of a given composition can carry is determined by its size. The more current, the larger the wire has to be.
2. Power = current * voltage. Since it is the current that sets the wire size, you can deliver far more power over the same wire by increasing the voltage.
3. Resistive losses (the power burned up in the line itself) are P = I^2 * R, while the voltage drop is I * R. R is a fixed quantity for a particular wire, so your line losses quadruple when you double the amperage. Going back up to the second equation, if you double the voltage, you halve the current, and thus reduce your line losses by 75% (see the sketch after this list).
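To make point 3 concrete, here is a minimal Python sketch. The line resistance and load power are made-up illustrative numbers, not values from any real system; the point is only how the loss scales with voltage.

```python
# Compare the resistive loss for delivering the same power at two voltages.
# The line resistance and load power below are made-up illustrative numbers.

LINE_RESISTANCE_OHMS = 2.0     # assumed total resistance of the line conductors
LOAD_POWER_WATTS = 48_000      # assumed power the customer actually needs

def line_loss(voltage_volts: float) -> float:
    """Power burned in the line (I^2 * R) while delivering LOAD_POWER_WATTS."""
    current_amps = LOAD_POWER_WATTS / voltage_volts    # I = P / V
    return current_amps ** 2 * LINE_RESISTANCE_OHMS    # P_loss = I^2 * R

for volts in (2_400, 4_800):
    print(f"{volts:>5} V: current = {LOAD_POWER_WATTS / volts:5.1f} A, "
          f"line loss = {line_loss(volts):6.1f} W")

# Doubling the voltage halves the current, so the I^2 * R loss drops to 1/4.
assert abs(line_loss(4_800) - line_loss(2_400) / 4) < 1e-9
```

In this toy case, doubling the voltage cuts the loss from 800 W to 200 W, which is the 75% reduction described in point 3.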
The power company saves a lot of money by using high-voltage distribution systems, since it loses less power to resistive losses in the lines and can use smaller, less-expensive wires. Going from distribution voltages of 600-2300 volts around 1900 to 7200+ volts by the 1930s also allowed rural areas to be electrified. The lower voltages simply had too much voltage drop over long lines (a wire's resistance adds up at so many ohms per foot, so a long run means a lot of ohms) to feed customers more than a few miles beyond the generation plant.
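Here is a rough sketch of why the old, lower distribution voltages could not reach distant customers. The per-mile resistance, feeder length, and load are assumptions for illustration only, not data for a real feeder.

```python
# Rough sketch: voltage drop and line loss on a long rural feeder
# at an old-style 2,300 V versus a 7,200 V distribution voltage.
# All numbers below are assumptions chosen for illustration.
OHMS_PER_MILE = 1.0        # assumed round-trip conductor resistance per mile
LOAD_POWER_WATTS = 50_000  # assumed load at the far end of the line
MILES = 20

resistance = OHMS_PER_MILE * MILES

for volts in (2_300, 7_200):
    current = LOAD_POWER_WATTS / volts       # I = P / V
    voltage_drop = current * resistance      # V_drop = I * R
    power_lost = current ** 2 * resistance   # P_loss = I^2 * R
    print(f"{volts:>5} V feeder, {MILES} miles: "
          f"drop = {voltage_drop:5.0f} V ({100 * voltage_drop / volts:4.1f}%), "
          f"loss = {power_lost:6.0f} W "
          f"({100 * power_lost / LOAD_POWER_WATTS:4.1f}% of the load)")
```

With these assumed numbers, the 2,300-volt line drops and wastes roughly a fifth of what it carries over 20 miles, while the 7,200-volt line loses only a couple of percent.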
Your typical pole- or pad-mount residential transformer takes 7200 volts and turns it into 125 to 200 amps of 240-volt service. Drawing the full 200 amps at 240 V causes the transformer to pull less than seven amps from the 7200-volt line. Other common voltages for feeding customer transformers are 13,200 volts and 19,920 volts, particularly for larger customers that draw thousands of amps of 480-volt service and for very isolated rural areas. The power company will use voltages like 34 kV, 69 kV, 115 kV and 161 kV to transmit power between generation plants and substations, and voltages from 230 kV up to 765 kV are used to send power several states away.
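As a quick check on the transformer figures above, the primary current is just the delivered power divided by the primary voltage, if you treat the transformer as ideal and ignore its own losses. A minimal sketch:

```python
# Ideal-transformer check of the residential-service numbers above
# (transformer losses ignored for simplicity).
SECONDARY_VOLTS = 240
SECONDARY_AMPS = 200
PRIMARY_VOLTS = 7_200

power_watts = SECONDARY_VOLTS * SECONDARY_AMPS   # 48,000 W delivered to the house
primary_amps = power_watts / PRIMARY_VOLTS       # same power at the higher voltage

print(f"Secondary: {SECONDARY_AMPS} A at {SECONDARY_VOLTS} V = {power_watts} W")
print(f"Primary:   {primary_amps:.2f} A at {PRIMARY_VOLTS} V")   # about 6.7 A
```

That works out to roughly 6.7 amps on the 7200-volt side, matching the "less than seven amps" figure.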