I did watch the video, and that's not what he said. He said the power transported by EM waves (not "charges") traveled between the battery and the legacy light bulb, which are one meter apart, in 1/c seconds (where c is in m/s). He did not say the power traversed the full length of the wires (1 light-second each) in that time.
Again, saying that anything traveled a full light-second in less than a second would obviously be incorrect, because it would require traveling faster than light.
The initial question was:
"Imagine you have a giant circuit consisting of a battery, a switch, a lightbulb, and two wires which are each 300k KM long. That is distance light travels in 1 second so they would reach out half way to the moon and then come back to be connected to the lightbulb which is 1 meter away. Now the question is; if I close this switch how long would it take for the bulb to light up? Is it; A 0.5s, B 1s, C 2s, D 1/Cs, or E none of the above?"
With the answer of D (1/c s) in mind, which is the amount of time it takes the bulb to turn on because the electromagnetic fields propagate directly from the battery to the lightbulb only 1 meter away, saying that the "charge" to light the bulb takes 1/c seconds to reach the bulb is not incorrect, even if the wires are 1 light-second in length. This is, again, because the power does not travel through the wires to reach the bulb. I may have used the wrong terminology, but I am not incorrect.
The time it took the power to go from the source to the bulb 1 meter away is 1/c seconds. To get that time in seconds, divide the distance from the battery to the bulb, 1 meter, by the distance light travels in one second, 299,792,458 meters. That gives 0.00000000333564095198 s (about 3.34 nanoseconds) for the power to cross that gap. In other words, the 1/c in answer D is just this division written out: 1 meter divided by the speed of light in m/s.
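If it helps, here's a quick sketch of that arithmetic in Python (the variable names are mine; c is the standard defined value):

```python
# Sketch of the arithmetic above: time = distance / speed of light.
c = 299_792_458      # speed of light in m/s (defined value)
distance_m = 1.0     # gap between battery and bulb in meters

t = distance_m / c   # time in seconds for the fields to cross 1 meter
print(f"{t:.20f} s")        # ~0.00000000333564095198 s
print(f"{t * 1e9:.3f} ns")  # ~3.336 ns
```

Note that the wire length never enters this calculation; only the 1 meter gap between battery and bulb does.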