Our initial focus is developing an industrial power system that provides primary power in remote and environmentally challenging locations.
Our systems provide essential electricity and load management to mission-critical systems around the world, including some of the most remote places on earth. They integrate easily into your existing power infrastructure, operate fully autonomously, and ensure that your equipment keeps running even when your regular power generation fails or your power demand spikes above your installed capacity.
In operation, our systems are inaudible and produce no particulate matter and virtually no NOx or SOx. Depending on the application, they are up to 20 times more efficient than competing systems, delivering dramatic savings in fuel consumption and carbon emissions tax costs.
What is the difference between a kWh and a kW?
We recently published a post about the average electricity requirements of a typical home. Thank you to everyone who emailed us, we had some great feedback and some suggestions for new topics. One of the most popular requests was for some information on what a kWh is and how it differs from a kW. So, here goes:
A kWh is a measure of energy, a kW is a measure of power. But what are energy and power?
Physics textbooks will explain that energy is the “capacity to do work”, but in the case of residential electricity it might be easier to think of energy as the total of what we need to buy to run our households.
Power is the rate at which energy is generated or used. For our purposes it might be easier to think of boiling a kettle. It takes a certain amount of energy to heat water (in fact, the energy needed to raise the temperature of one gram of water by one degree Celsius is the original definition of a calorie, but we’ll leave that for another day). If we want to boil the water faster, we need a kettle with more power.
In equation terms, we can express the relationship between Energy, Power and Time as follows:
Energy = Power x Time (or kWh = kW x h)
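To see the formula in action, here is a quick sketch using a kettle (the 2kW rating and three-minute boil time are assumed illustrative values, not figures from this post):

```python
# Energy = Power x Time, illustrated with an assumed 2 kW kettle
# that takes 3 minutes to boil.
power_kw = 2.0        # kettle power rating (kW) - assumed value
time_h = 3 / 60       # 3 minutes expressed in hours
energy_kwh = power_kw * time_h
print(f"Energy used: {energy_kwh:.2f} kWh")  # Energy used: 0.10 kWh
```

Notice that the kettle's power rating tells you nothing about the total energy used until you multiply by how long it runs.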
Another example might help with the explanation. The average solar panel installation in America has a power rating of 6kW. We know from our previous blogs that the average home has a typical energy consumption of 30kWh a day, so we can work out theoretically how many hours of sunshine we need to provide all the energy the home needs as follows:
Time = Energy / Power
Time = 30kWh / 6kW
Time = 5hrs
This calculation tells us that the average American home equipped with solar panels should be able to produce all of the energy it needs from the sun, provided the sun shines for at least 5 hours a day.
In actual fact, the relationship is more complicated than this. The power output of a solar panel is sensitive to the angle of the sun’s rays, so panels don’t produce their maximum power output throughout the day even in consistent sunshine. Power output is also reduced by any barrier between the panel and the sun, for example cloud or snow. Finally, the home’s energy requirement does not typically coincide with the peak power output of the solar panels. This topic will be covered in our next blog.
How much electricity does an average home need?
According to the US Energy Information Administration, the average annual electricity consumption for a U.S. residential utility customer was 10,766 kilowatt-hours (kWh) in 2016. This is an average of 897kWh per month, around 30kWh per day, or a continuous average draw of about 1,230 watts.
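These conversions are straightforward arithmetic on the EIA annual figure, and can be reproduced as follows:

```python
# Converting the EIA annual figure into monthly, daily and
# average-power terms (the 10,766 kWh value is from the EIA;
# everything else is arithmetic).
annual_kwh = 10_766

monthly_kwh = annual_kwh / 12                  # ~897 kWh per month
daily_kwh = annual_kwh / 365                   # ~29.5 kWh per day
avg_power_w = annual_kwh / (365 * 24) * 1000   # ~1,229 W average draw

print(round(monthly_kwh), round(daily_kwh, 1), round(avg_power_w))
# 897 29.5 1229
```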
There is quite a wide variation in consumption by state, with Louisiana homes using over 40kWh per day and homes in Hawaii using just over 16kWh. Additionally, homes have pronounced diurnal, weekly and seasonal variation, meaning electricity consumption is not spread evenly over hours of the day, days of the week or months of the year.
The majority of energy is used during the daytime (referred to as “on-peak” and usually occurring between 7am and 10pm on weekdays). Demand levels tend to be lowest between 10pm and 7am and on weekends (this is usually referred to as “off-peak”). Demand levels also tend to be highest in winter and summer, when the need for space conditioning (heating or cooling) is high.
All of these variation factors mean that average figures alone aren’t sufficient when modeling different types of electricity supply. In order to prevent power shortages, it is necessary to consider peak demand, i.e. the maximum electricity demanded by homes. In an ideal world this would mean a detailed analysis of a specific home in a specific location, but for large-scale modeling purposes this is clearly impractical. At Upstart Power, we use the Peak to Average demand ratio in our modeling. We track the ratio by region and by year, but as a rule of thumb we tend to use the figures for New England as our reference point. The Peak to Average demand ratio for New England has increased steadily over the last 20 years from around 1.5 to 1.9. This means that the highest peak hour demand for electricity is nearly twice the average hourly level.
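As a rough sketch of how the ratio is applied (the 30kWh daily figure comes from the averages above; multiplying the average hourly draw by the ratio to estimate peak demand is a simplification, not our full model):

```python
# Rough peak-demand estimate for an average home using the
# Peak to Average ratio. The 1.9 ratio is the New England figure
# cited in the text; the 30 kWh/day figure is the US average.
avg_daily_kwh = 30.0
avg_hourly_kw = avg_daily_kwh / 24     # ~1.25 kW average draw
peak_to_avg = 1.9
peak_hour_kw = avg_hourly_kw * peak_to_avg
print(f"Estimated peak hour demand: {peak_hour_kw:.2f} kW")
# Estimated peak hour demand: 2.38 kW
```

In other words, a supply sized only for the 1.25kW average would fall well short of the roughly 2.4kW an average home can demand in its peak hour.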