IET
Topic Title: Voltage optimisation ?
Topic Summary:
Created On: 29 October 2007 07:06 PM
Status: Read Only
 11 March 2012 08:25 AM



alancapon

Posts: 5746
Joined: 27 December 2005

Originally posted by: tonylongstaff
Voltage Optimisation & Regulation has developed a lot since this thread was started back in 2007, and has received the approval of the CARBON TRUST as a way companies can reduce their electricity usage and carbon footprint. . .


There are also companies out there that have been mis-sold these devices, and are not achieving the savings they were promised.

Regards,

Alan.
 11 March 2012 08:58 PM



cookers

Posts: 203
Joined: 10 February 2012

Originally posted by: tonylongstaff

Voltage Optimisation & Regulation has developed a lot since this thread was started back in 2007, and has received the approval of the CARBON TRUST as a way companies can reduce their electricity usage and carbon footprint. . .


I have no direct experience of the product promoted by Tony.

There are a number of voltage optimisation products available on the market.

Basically the devices are low-loss transformers (sometimes with amorphous cores); the output voltage is reduced and, in some products, also stabilised with the help of power electronics.

Energy saving comes from reducing the voltage supplied to the installation's equipment from 240 V to 220 V (so the story goes). I wasn't really convinced, but under pressure from "the Green Army" we ran a trial in a number (20) of 100 kVA HH-metered properties.

Results by simple comparative (before and after) analysis were best described as uncertain to poor. More detailed analysis to prove or disprove the savings claims proved challenging, as variations in business use would always affect consumption (during the measured period business activity decreased markedly). The trial was abandoned.

However, this trial and thinking did lead us down another, more prosaic, related path, where we concluded that low-loss 11 kV/400 V distribution transformers (low core losses) are an area that seemed to offer a return on investment of under three years. Furthermore, if we changed the transformers, we could specify ones with a 220 V winding, so if there were savings to be had by reducing voltage then we could do that as well.
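The "under three years" payback claim for a low-loss transformer can be sanity-checked with simple payback arithmetic. This is a sketch with hypothetical figures (the capital premium, loss saving and tariff below are my own illustrative numbers, not from the trial):

```python
# Back-of-envelope payback check for a low-loss distribution transformer.
# Core (no-load) loss is dissipated 24/7 whether the site is busy or not,
# which is what makes the saving relatively easy to bank.

HOURS_PER_YEAR = 8760

def simple_payback(extra_capex, loss_saving_kw, price_per_kwh):
    """Years to recover the extra capital cost from reduced core losses."""
    annual_saving = loss_saving_kw * HOURS_PER_YEAR * price_per_kwh
    return extra_capex / annual_saving

# Hypothetical example: an amorphous-core unit saving 1.2 kW of core loss,
# costing 2500 GBP more than a standard unit, at 0.10 GBP/kWh:
years = simple_payback(2500.0, 1.2, 0.10)
print(f"payback ~ {years:.1f} years")
```

With these (assumed) numbers the payback lands between two and three years, which is at least consistent with the claim in the post.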

Edited: 28 March 2012 at 06:14 AM by cookers
 27 March 2012 09:52 AM
aroscoe

Posts: 91
Joined: 18 October 2002


I was chatting to colleagues about this yesterday. The specs for the Vphase device suggest to me that it is an in-series inverter that allows the voltage to be adjusted slightly - i.e. the primary current goes through on a bypass and only the "extra volts" need to be adjusted via an in-series coupled element. That's how I would do it anyway, and there are good examples of "prior art" in 3-phase devices to ensure power quality to sensitive loads. These devices can also clean up the harmonics which can be handy as well (but if done too cheaply could add harmonics). Above 20A the device switches into complete bypass, presumably because this is above the rating of the coupling element. Doing it this way means you don't need a fully-rated transformer: both the coupling element and the power electronics are only part rated.
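The part-rating argument above is easy to quantify. A sketch, using assumed figures (230 V in, 220 V out, 20 A bypass threshold — the device's real spec may differ): the series injection element only handles the difference voltage times the load current, not the full through power.

```python
# Why a series-injection design avoids a fully-rated transformer:
# the coupling element sees only (Vin - Vout) * Iload, a small
# fraction of the apparent power flowing through the device.

def series_element_rating(v_in, v_out, i_load):
    """VA rating needed by the series coupling element."""
    return abs(v_in - v_out) * i_load

def through_power(v_in, i_load):
    """Full apparent power passing through the device."""
    return v_in * i_load

v_in, v_out, i_max = 230.0, 220.0, 20.0      # assumed figures
series_va = series_element_rating(v_in, v_out, i_max)  # 10 V * 20 A
full_va = through_power(v_in, i_max)                   # 230 V * 20 A

print(f"Series element: {series_va:.0f} VA "
      f"({100 * series_va / full_va:.1f}% of {full_va:.0f} VA throughput)")
```

So the coupling element and its power electronics need only a few percent of the through rating, which is presumably why the device drops into full bypass above its current threshold.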

In terms of energy saving, I'm sceptical and it will be very dependent on application. A device like a kettle or washing machine may use MORE energy when voltage is reduced since it will take longer to get the same energy from the heating element and therefore the heating cycle will be longer (more heat losses from kettle, more energy used in "computer" and "drive" parts of washer during the extended cycle).
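The kettle argument can be put in rough numbers. A minimal model, with assumed values (a 3 kW / 240 V element of fixed resistance, 1 litre of water heated through 80 K, and a constant 50 W standing heat loss while the kettle runs):

```python
# Rough model of the kettle point: at lower voltage the element power
# falls as V^2, the boil takes longer, and the fixed standing loss
# therefore eats a larger share of the energy drawn.

WATER_HEAT = 4186.0                  # J/(kg*K), specific heat of water
ELEMENT_R = 240.0**2 / 3000.0        # fixed element resistance, ohms
STANDING_LOSS = 50.0                 # W lost to surroundings (assumed)
ENERGY_NEEDED = 1.0 * WATER_HEAT * 80.0   # J into 1 litre, 20->100 C

def kettle_energy(volts):
    """Total electrical energy (J) drawn to boil at a given voltage."""
    p_element = volts**2 / ELEMENT_R                      # falls as V^2
    t_boil = ENERGY_NEEDED / (p_element - STANDING_LOSS)  # longer at low V
    return p_element * t_boil

e240, e220 = kettle_energy(240.0), kettle_energy(220.0)
print(f"240 V: {e240/1000:.1f} kJ, 220 V: {e220/1000:.1f} kJ")
```

With these assumptions the 220 V boil draws slightly *more* energy than the 240 V boil, exactly as argued — the effect is small, but it is in the wrong direction for the savings claim.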

A device like a plasma/LCD telly or a PC, or a laptop charger, is essentially a constant-power device and will just increase its current to compensate for the lower volts. However, there may be second-order effects: perhaps the efficiency of the switching cycle in the switched-mode supply is slightly better at lower input voltages. On the other hand, it might be slightly worse. It would be interesting to pull up the specs for, say, a laptop charger and compare its stated efficiency at 110 V operation in the US vs. 230 V operation in the UK. Maybe this isn't even stated! It would be very interesting to amass an array of domestic devices, run some controlled tests with decent instrumentation, and assess the results.

Devices like lights will draw less power. I disassembled a mains GU10 LED lamp the other day, and its circuit means that it will definitely dim if the voltage is reduced, and power consumption will be reduced. Other lamps like incandescent (P = V^2/R, but R is non-linear with P), fluorescent and CFL will have profiles which may be similar or different. Thus reducing voltage in the short term will save energy, BUT, in the long term, people will notice the lower light levels and compensate by upping the number or wattage of bulbs. People have already complained vociferously about lower light levels from CFLs and LEDs compared to the old 100W incandescent, and the result is that people now have multiple bulbs in the same room instead of a single central ceiling rose. Reducing your voltage from 230-240V to 220V will tend to push people to do this in the long term, even if they don't in the first few weeks (of a trial!).
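The incandescent non-linearity is worth a worked example. Because tungsten resistance rises with temperature, lamp power does not fall as fast as the fixed-R V^2 model suggests; commonly quoted empirical exponents are roughly P ~ V^1.6 and light output ~ V^3.4 (approximations, not datasheet values — a sketch, not a measurement):

```python
# Incandescent lamp at reduced voltage: compare the naive fixed-R
# (V^2) power model against commonly quoted empirical exponents.

def scaled(value, v_ratio, exponent):
    """Scale a nominal value by (V/Vnom) raised to a given exponent."""
    return value * v_ratio**exponent

v_ratio = 220.0 / 240.0          # proposed voltage reduction
p_naive = scaled(60.0, v_ratio, 2.0)   # fixed-resistance model
p_real = scaled(60.0, v_ratio, 1.6)    # empirical power exponent
light = scaled(1.0, v_ratio, 3.4)      # empirical light-output exponent

print(f"naive power {p_naive:.1f} W, empirical {p_real:.1f} W, "
      f"light output {100*light:.0f}% of nominal")
```

The striking point is that light output falls much faster than power: a ~13% power saving buys roughly a quarter less light, which is exactly why users notice and compensate.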

As a cheaper and more obvious alternative, how about banning 50W GU10 downlighter bulbs? They are far less effective than the old 100W incandescents. Making the bulbs (and other devices) more efficient would be the ideal way to tackle energy reduction.

When I moved into my new house, it had 6x50W downlighters in the kitchen, 300W in total. We ripped this out and put in 2 central lamps with 20W CFLs. It's just as bright and uses only 13% of the energy.

The final thought is that by reducing your voltage, you are also increasing the energy losses within your domestic wiring, because the current is higher for the same power draw. These cable losses rise as I^2*R, so there can be a double whammy here. Reducing your voltage by 10% increases your cable losses by about 23% if the same power is being drawn by a constant-power device.
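The cable-loss arithmetic: for a constant-power load, a 10% voltage drop raises the current by a factor of 1/0.9, so I^2*R losses rise by (1/0.9)^2 ≈ 1.23, i.e. about 23%. A quick check with assumed load and cable values (2 kW appliance, 0.1 Ω of final-circuit resistance):

```python
# Cable losses for a constant-power device at nominal vs reduced voltage.
# Lower voltage -> higher current -> quadratically higher I^2*R loss.

def cable_loss(power_w, volts, cable_r):
    """Watts dissipated in the wiring feeding a constant-power load."""
    i = power_w / volts       # constant-power device draws more current
    return i**2 * cable_r

loss_230 = cable_loss(2000.0, 230.0, 0.1)   # nominal voltage
loss_207 = cable_loss(2000.0, 207.0, 0.1)   # 10% reduced voltage
print(f"{loss_230:.2f} W -> {loss_207:.2f} W "
      f"(+{100 * (loss_207 / loss_230 - 1):.0f}%)")
```

The absolute numbers here are small, but the ratio is independent of the assumed load and cable: any constant-power draw sees the same ~23% rise in wiring loss for a 10% voltage cut.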


-------------------------
Dr. Andrew Roscoe

http://personal.strath.ac.uk/andrew.j.roscoe
 27 March 2012 11:08 PM



alancapon

Posts: 5746
Joined: 27 December 2005

Originally posted by: aroscoe
. . . As a cheaper and more obvious alternative, how about banning 50W GU10 downlighter bulbs? They are far less effective than the old 100W incandescents. . .

I think you will find that the EU are already plotting against halogen lamps. We have recently put up some fittings in the lounge with 20W G4 halogen lamps. I am seriously considering holding a few years' supply of the lamps!

Regards,

Alan.
 14 August 2012 09:25 PM



powercor

Posts: 6
Joined: 07 September 2009

Dear Andrew, I was just wondering if you could give me your thoughts on reducing voltage to induction and synchronous motors. We have been installing voltage optimisation for a few years on large-scale installations and have had good savings depending on load type, but recently we have been asked to install on 60 kW motors and I'm not sure how the motor will respond to the lower voltage. I'm guessing it depends on how the motor is loaded, but I would appreciate your comments. Many thanks in advance, Chris Wright
 15 August 2012 09:28 AM
aroscoe

Posts: 91
Joined: 18 October 2002


Chris,

The answer to this will be quite different, depending on what your answer is to the following:

Are these directly connected, or connected through variable-speed drives (power electronics)?

If they are connected through power electronic drive systems then you will probably notice almost zero difference in consumption (active or reactive) when using "optimisation". The systems should still work fine so long as the voltage remains within the spec for the drive.

If the machines are directly connected, then:

- for an induction machine, there will be a drop in active power consumption as the machine slows down on its torque/slip/voltage curve, and also a drop in reactive power draw. You might want to check that whatever is being driven will be okay with the lower speed/power - i.e. less air will be shifted by a fan. If there is a power factor correction capacitor then this might need reviewing, although it will tend to respond by producing less reactive power as voltage drops, and therefore the motor and capacitor responses will tend to balance out. However, assuming that the "optimised" voltage is still within the machine nameplate spec, and the plant was designed conservatively, then in theory nothing should go too wrong.

- for a synchronous machine (this would be a specialist application), you will notice no difference in consumption (watts), assuming that the voltage is still in spec. You need to watch out, because at a lower voltage the maximum torque the synchronous machine can tolerate before pole-slipping will be reduced, so if your machine ever works towards its maximum nameplate power output you might want to check. There might be some change to the reactive power draw (or export) of the machine, but that would depend on the field controller (assuming it's not a PM machine).
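Both cautions above rest on textbook torque-capability scaling: an induction machine's breakdown torque shrinks as V^2, while a synchronous machine's pull-out torque (at fixed excitation) shrinks roughly linearly with V. A sketch with illustrative per-unit numbers (2.0 pu capability, 1.0 pu load, not from any specific machine):

```python
# Torque headroom left at reduced voltage, for a directly connected
# machine carrying rated load. Headroom <= 0 would mean stalling
# (induction) or pole-slipping (synchronous).

def induction_headroom(v_pu, breakdown_pu=2.0, load_pu=1.0):
    """Headroom (pu) before breakdown torque; scales as V^2."""
    return breakdown_pu * v_pu**2 - load_pu

def sync_headroom(v_pu, pullout_pu=2.0, load_pu=1.0):
    """Headroom (pu) before pole-slipping; ~linear in V at fixed field."""
    return pullout_pu * v_pu - load_pu

for v in (1.0, 0.95, 220.0 / 240.0):
    print(f"V = {v:.3f} pu: induction {induction_headroom(v):+.2f} pu, "
          f"synchronous {sync_headroom(v):+.2f} pu")
```

With generous margins like these nothing dramatic happens at a modest voltage cut, but the induction machine loses headroom fastest, which is why checking against the nameplate and the driven load matters.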



-------------------------
Dr. Andrew Roscoe

http://personal.strath.ac.uk/andrew.j.roscoe
 15 August 2012 07:48 PM



iamck

Posts: 10
Joined: 16 September 2001

This discussion takes me back to the days of the miners' strike of the 1980s. There was no shortage of plant, but there was a shortage of fuel (and therefore energy). All grid bulk supply transformers are fitted with voltage reduction equipment, which reduces voltage in two stages, first by 3% and then by 6%. Some transformers had a third stage which reduced voltage by 10%. Also, during the emergency the 3% and 6% figures were changed to 5% and 10%. These should have given power reductions of 6%, 12% and 19% respectively, but rarely did. The most common reason was the absence of a facility to freeze taps on the primary substation transformers. Where these were co-sited with the grid transformers there was no problem, but in those days pilot cables between sites were a scarce resource, not to be wasted on once-in-10-years contingencies such as a shortage of energy.

Even in those days there were problems in customers' installations. Arc welders at shipbuilders in Greenock were the subject of angry phone calls.

I wonder, if we return to a period of energy shortage, how modern customer plant will fare.

Iain McKenzie

-------------------------
iamck
 15 August 2012 10:19 PM
aroscoe

Posts: 91
Joined: 18 October 2002


Ian,

Just wondering if you are the Ian McKenzie who lives across the road from me !!

Andrew

-------------------------
Dr. Andrew Roscoe

http://personal.strath.ac.uk/andrew.j.roscoe
 15 August 2012 10:34 PM



iamck

Posts: 10
Joined: 16 September 2001

Andrew,
Yes.
Iain

-------------------------
iamck
 16 August 2012 08:38 AM



powercor

Posts: 6
Joined: 07 September 2009

Dear Andrew

Thank you very much for your quick reply. The motors are DOL, with no speed controllers apart from a soft-start / star-delta starter. Thanks for the information.

Regards

Chris
FuseTalk Standard Edition v3.2 - © 1999-2014 FuseTalk Inc. All rights reserved.