Do you think we will ever change our power grid to have a higher frequency so that our bulbs don’t flicker when we record things?
Good LED bulbs have a smoothing capacitor after the full bridge rectifier. This allows the LED to maintain most of its output during the low points in the cycle, resulting in minimal to no flicker when recording.
Alright, show me your eyebrows
I loled
If lights are flickering when you record videos, you probably need to change the settings on your camera to match your country’s grid frequency. Almost every video recording device will have a 50/60Hz setting somewhere.
How about banning flickering lamps? I’d ban screeching power adapters too.
Buffer the input in a battery, then use the DC output from the battery to power your lights: no flickering. There's no need to reconfigure the entire grid and every device on it for niche applications.
Just rectify the AC, provided the voltage isn't too high.
You don’t need a buffer unless the power fluctuates.
Not a licensed electrician
Rectifying the AC, even with a full-bridge rectifier, will still drop to zero every time the AC voltage crosses the zero line. So usually a capacitor is added to buffer this output. Its capacitance depends on the size of the load.
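A quick back-of-the-envelope sketch of that sizing rule, using the common approximation that the capacitor alone carries the load between ripple peaks. The load current and ripple allowance below are made-up example values, not from the thread:

```python
# Rough sizing of a smoothing capacitor after a full-bridge rectifier.
# Approximation: the capacitor alone supplies the load between ripple
# peaks, so C ~ I_load / (f_ripple * V_ripple). Example values assumed.

def smoothing_cap_farads(i_load, f_ripple, v_ripple):
    """Approximate capacitance needed to limit ripple to v_ripple volts."""
    return i_load / (f_ripple * v_ripple)

# Example: 20 mA LED load, 50 Hz mains (100 Hz ripple after full-wave
# rectification), allowing 1 V of ripple:
c = smoothing_cap_farads(0.020, 100, 1.0)
print(f"{c * 1e6:.0f} uF")  # -> 200 uF
```

This also shows why the capacitance scales with the load: double the LED current and you need double the capacitance for the same ripple.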
Higher frequency generally means higher transmission line losses, so getting power from A to B is more efficient at lower frequencies; DC is a great option here.
If we switched to DC, many things would still flicker though as they would presumably use switching power supplies, but those could be relatively high frequency like you said.
Interestingly, airplanes use 400Hz, as transmission over distance doesn’t matter, and transformers can be made much smaller/lighter.
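The smaller/lighter part follows from the standard transformer EMF equation, V_rms = 4.44 · f · N · A · B_max: for a fixed voltage, turn count, and peak flux density, the required core cross-section scales as 1/f. A minimal sketch with assumed example numbers (115 V, 100 turns, 1.2 T are illustrative, not from the thread):

```python
# Why 400 Hz transformers can be smaller: the transformer EMF equation
#   V_rms = 4.44 * f * N * A * B_max
# means that for fixed voltage, turns, and flux density, the required
# core cross-section A scales as 1/f. Example values are assumed.

def core_area_m2(v_rms, f_hz, turns, b_max):
    """Required core cross-sectional area from the EMF equation."""
    return v_rms / (4.44 * f_hz * turns * b_max)

a_50 = core_area_m2(115, 50, 100, 1.2)
a_400 = core_area_m2(115, 400, 100, 1.2)
print(f"core area ratio 50 Hz vs 400 Hz: {a_50 / a_400:.0f}x")  # -> 8x
```

So going from 50 Hz to 400 Hz cuts the required core area (and roughly the core mass) by a factor of 8, which is exactly why aircraft accept the higher transmission losses.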
Even a switch-mode power supply doesn't really flicker, since it has a rectification and smoothing stage on the output to produce a DC voltage. The switching is done on the input side, and the duty cycle controls the voltage/current ratio at the output.
Also, if we switched to DC, you'd need costly DC-DC converters to step the voltage up for transmission and back down again for domestic use; there's no DC equivalent of a simple transformer.
Aren’t switching mode power supplies smaller and more efficient than regular AC transformers anyway?
As far as I understand, a DC-DC converter is less efficient and more expensive than an equivalent AC-AC converter. I don't know whether that holds for switching power supplies, or whether it extends to the transformer case, sorry.
Long distance point to point power transmission (like internationally) is often DC because transmission losses become more important.
I don't think that's actually true. AC-to-AC conversion at grid frequencies normally requires large, inefficient transformers. A PC power supply is an example of a switch-mode power supply. Basically what happens is: AC mains -> DC (at mains voltage) -> AC (high frequency, mains voltage) -> transformer -> AC (low voltage, still high frequency) -> DC (low voltage).

Why do all this? Because doing the voltage conversion at grid frequency would need a much bigger transformer. They could do the voltage conversion at grid frequency and only rectify once, with no conversion back to AC, but that's actually less efficient and requires more expensive hardware. So DC-to-DC conversion is in fact more efficient, even if it means using high-frequency AC in the middle. Not all switch-mode supplies use this AC trick, though they all involve switching current: the buck and boost converters used in smartphones, laptops, and motherboards have no transformer at all and are incredibly compact and efficient.

The fact that many, many things also need DC would be a bonus. Rectifying single-phase AC at low frequency is not the most efficient thing in the world. Three-phase is better, but having straight DC and only needing to change voltage would probably be best.
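For the transformer-less buck and boost converters mentioned above, the output voltage is set purely by the switching duty cycle. A minimal sketch of the ideal (lossless, continuous-conduction) textbook relations; the 12 V, 5 V, and 3.7 V figures are assumed examples, not from the thread:

```python
# Ideal (lossless, continuous-conduction) duty-cycle relations for the
# transformer-less converters mentioned above. Real converters have
# losses; these are textbook first approximations with example values.

def buck_vout(v_in, duty):
    """Buck (step-down): Vout = D * Vin, for 0 <= D <= 1."""
    return duty * v_in

def boost_vout(v_in, duty):
    """Boost (step-up): Vout = Vin / (1 - D), for 0 <= D < 1."""
    return v_in / (1.0 - duty)

# Example: stepping a 12 V rail down to 5 V needs D = 5/12 ~ 0.417,
# and boosting a 3.7 V cell to 5 V needs D = 1 - 3.7/5 = 0.26.
print(buck_vout(12.0, 5.0 / 12.0))   # -> 5.0
print(boost_vout(3.7, 1 - 3.7 / 5))  # -> ~5.0
```

This is why these converters are so compact: voltage conversion is done by timing alone (plus a small inductor and capacitor), with no magnetically coupled windings.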
They're more efficient than old-school AC-DC linear supplies (of which an AC transformer is just one part). However, if you just want to step AC voltage up or down, transformers are quite efficient.
But we could just attach an antenna to our roofs and steal electricity, I consider it worth the transmission loss if we can create more transmission loss.