I was chatting to colleagues about this yesterday. The specs for the Vphase device suggest to me that it is an in-series inverter that allows the voltage to be adjusted slightly - i.e. the primary current goes through on a bypass and only the "extra volts" are adjusted via an in-series coupled element. That's how I would do it anyway, and there are good examples of "prior art" in 3-phase devices to ensure power quality to sensitive loads. These devices can also clean up the harmonics, which can be handy (though if done too cheaply they could add harmonics of their own). Above 20A the device switches into complete bypass, presumably because this is above the rating of the coupling element. Doing it this way means you don't need a fully-rated transformer: both the coupling element and the power electronics are only part-rated.
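To see why the part-rating matters, here's a back-of-envelope sketch in Python. The 20A figure is the bypass threshold mentioned above; the +/-10V injection range is my assumption, not something from the Vphase spec.

# Rough sizing of a series-injection element vs. a fully-rated transformer.
# Assumption: the series element injects at most +/-10 V; the 20 A figure
# is the bypass threshold mentioned in the post above.
V_MAINS = 230.0   # nominal supply voltage (V)
V_INJECT = 10.0   # assumed maximum series injection (V)
I_MAX = 20.0      # current above which the device goes to full bypass (A)

full_load_va = V_MAINS * I_MAX   # apparent power of the full load
series_va = V_INJECT * I_MAX     # apparent power the series element handles

print(f"Full load:      {full_load_va:.0f} VA")
print(f"Series element: {series_va:.0f} VA ({series_va / full_load_va:.0%} of full)")

On these numbers the coupling element only needs about 4% of the load's VA rating, which is exactly why a fully-rated transformer isn't required.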
In terms of energy saving, I'm sceptical, and it will be very dependent on application. A device like a kettle or washing machine may use MORE energy when voltage is reduced: a resistive element's power falls as V^2/R, so it takes longer to deliver the same energy and the heating cycle is extended (more heat losses from the kettle, more energy used in the "computer" and "drive" parts of the washer during the extended cycle).
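A quick illustrative calculation makes the point for a kettle (all the figures below are assumptions, not measurements):

# Kettle at reduced voltage: element power scales as V^2/R, so the boil
# takes longer and standing heat loss has more time to accumulate.
# All figures below are illustrative assumptions.
V_NOM = 230.0        # nominal voltage (V)
V_LOW = 220.0        # reduced voltage (V)
P_NOM = 3000.0       # assumed element power at 230 V (W)
E_WATER = 360_000.0  # assumed energy to boil the water (J)
P_LOSS = 30.0        # assumed standing heat loss while heating (W)

R = V_NOM**2 / P_NOM                  # element resistance (ohms)
for v in (V_NOM, V_LOW):
    p = v**2 / R                      # element power at this voltage
    t = E_WATER / (p - P_LOSS)        # time to deliver E_WATER net of losses
    print(f"{v:.0f} V: {p:.0f} W, {t:.0f} s, {p * t / 1000:.1f} kJ drawn")

The water gets the same energy either way; the total drawn from the mains goes slightly up at the lower voltage, not down.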
A device like a plasma/LCD telly, a PC, or a laptop charger is essentially a constant-power device and will just increase its current to compensate for the lower volts. However, there may be second-order effects: perhaps the efficiency of the switching cycle in the switched-mode supply is slightly better at lower input voltages. On the other hand, it might be slightly worse. It would be interesting to pull up the specs for, say, a laptop charger and compare its stated efficiency at 110V operation in the US vs. 230V operation in the UK. Maybe this isn't even stated! It would be very interesting to amass an array of domestic devices, run some controlled tests with decent instrumentation, and assess the results.
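For the first-order behaviour the sums are trivial (the 65W charger rating is just an assumed example):

# A constant-power load draws more current as voltage falls; input power
# is unchanged. The 65 W figure is an assumed laptop-charger rating.
P_LOAD = 65.0   # assumed constant power draw (W)
for v in (240.0, 230.0, 220.0, 110.0):
    print(f"{v:5.0f} V -> {P_LOAD / v * 1000:4.0f} mA")

Any real saving (or penalty) has to come from second-order shifts in the converter's efficiency curve, which is exactly what controlled tests would reveal.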
Devices like lights will draw less power. I disassembled a mains GU10 LED lamp the other day, and its circuit means that it will definitely dim if the voltage is reduced, and power consumption will be reduced. Other lamps like incandescent (nominally P=V^2/R, but R rises with filament temperature, so the scaling is weaker than V^2), fluorescent and CFL will each have their own voltage-power profiles. Thus reducing voltage in the short term will save energy, BUT, in the long term, people will notice the lower light levels and compensate by upping the number or "wattage" of bulbs. People have already complained vociferously about lower light levels from CFLs and LEDs compared to the old 100W incandescent, and the result is that people now have multiple bulbs in the same room instead of a single central ceiling rose. Reducing your voltage long-term from 230-240V to 220V will tend to push people to do this, even if they don't in the first few weeks (of a trial!).
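A sketch of the incandescent case, using the common empirical rule of thumb that power scales roughly as V^1.6 and light output as roughly V^3.4 (those exponents are textbook approximations, not from any Vphase data):

# Incandescent lamp at reduced voltage. Filament resistance rises with
# temperature, so power follows roughly P ~ V^1.6 rather than V^2, and
# light output falls even faster (~V^3.4). Both exponents are empirical
# rules of thumb.
P_RATED = 100.0   # rated power (W)
V_RATED = 240.0   # rated voltage (V)
for v in (240.0, 230.0, 220.0):
    p = P_RATED * (v / V_RATED) ** 1.6
    lm = (v / V_RATED) ** 3.4         # relative light output
    print(f"{v:.0f} V -> {p:5.1f} W, {lm:.0%} of rated light")

Light output drops faster than power, which is precisely why people notice the dimming and eventually compensate with more or bigger bulbs.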
As a cheaper and more obvious alternative, how about banning 50W GU10 downlighter bulbs? They are far less effective than the old 100W incandescents. Making the bulbs (and other devices) more efficient would be the ideal way to tackle energy reduction.
When I moved into my new house, it had 6x50W downlighters in the kitchen, 300W in total. We ripped this out and put in 2 central lamps with 20W CFLs. It's just as bright and uses only 13% of the energy (40W vs. 300W).
The final thought is that by reducing your voltage, you are also increasing the energy losses within your domestic wiring, because the current is higher for the same power draw. These cable losses rise as I^2*R, so there can be a double whammy here. Reducing your voltage by 10% means the current in a constant-power device rises by a factor of 1/0.9, so cable losses rise by (1/0.9)^2 - about 23% - if the same power is being drawn.
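The same sums in code (the load and cable resistance are assumed illustrative figures):

# Cable (I^2 * R) losses for a constant-power load as voltage falls.
# P_LOAD and R_CABLE are assumed illustrative figures, not measurements.
P_LOAD = 2000.0   # assumed constant-power draw (W)
R_CABLE = 0.1     # assumed round-trip cable resistance (ohms)
for v in (230.0, 207.0):          # 207 V is a 10% reduction from 230 V
    i = P_LOAD / v                # current rises as voltage falls
    loss = i**2 * R_CABLE         # losses scale with the square of current
    print(f"{v:.0f} V: I = {i:.2f} A, cable loss = {loss:.2f} W")
# Ratio of losses: (230/207)^2 = (1/0.9)^2 ~ 1.23, i.e. ~23% more.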
Dr. Andrew Roscoe
http://personal.strath.ac.uk/andrew.j.roscoe