It's been a long time since I did any work with LEDs (back in the '90s when I was an Electrical Engineering student), but typically you have a power supply of X volts (let's say 6V for the sake of argument). A typical red LED has a forward voltage drop of about 1.7V. You then choose an operating current, commonly around 50% of the LED's rated maximum current (Imax); we'll say for the sake of argument that this works out to 20mA.
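
If you prefer to see that rule of thumb as code, here's a minimal Python sketch; the 40mA rated maximum is an assumed figure, picked only so the 50% rule lands on the 20mA used below (check your LED's datasheet for the real number).

```python
I_MAX = 0.040    # assumed rated maximum current in amps (read it off the LED's datasheet)
DERATING = 0.5   # rule of thumb from above: run at roughly 50% of the rating

operating_current = I_MAX * DERATING
print(f"Operating current: {operating_current * 1000:.0f} mA")  # -> 20 mA
```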

The voltage dropped across the resistor is 6V - 1.7V = 4.3V (the supply voltage minus the LED's forward voltage drop). Ohm's law says R = V/I, so R = 4.3V / 20mA = 215 ohms. Pick the closest standard resistor value you have on hand (220 ohms, for example) and wire it in series with the LED.
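
Here's that same series-resistor calculation as a small Python sketch, just plugging in the example numbers from above (6V supply, 1.7V forward drop, 20mA):

```python
def led_series_resistor(supply_v, forward_v, current_a):
    """Resistance that drops the excess supply voltage at the chosen LED current."""
    return (supply_v - forward_v) / current_a

# Example figures from the text: 6V supply, 1.7V red LED, 20mA operating current.
r = led_series_resistor(6.0, 1.7, 0.020)
print(f"Series resistor: {r:.0f} ohms")  # -> 215 ohms
```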

Since P = IV (the power equation), the power dissipated by the LED (including the light it emits) is 1.7V * 20mA = 0.034W, and the power dissipated by the resistor is 4.3V * 20mA = 0.086W. Note that in the case of an AC-powered LED there are other electronics involved as well (such as a transformer and rectifier circuit), which will likely produce more heat than the LED and resistor themselves.
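
And a quick sketch of those power figures, again just using the example numbers rather than anything measured:

```python
def dissipation_w(voltage_v, current_a):
    """Power dissipated by a component with voltage_v across it and current_a through it (P = IV)."""
    return voltage_v * current_a

led_power = dissipation_w(1.7, 0.020)       # heat plus emitted light in the LED
resistor_power = dissipation_w(4.3, 0.020)  # heat burned off in the series resistor
print(f"LED: {led_power:.3f} W, resistor: {resistor_power:.3f} W")  # -> 0.034 W and 0.086 W
```

Both figures are well under the 0.25W rating of a common 1/4W resistor, which is why a plain resistor is fine here.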