PHOENIX (June 6, 2017) – The Spring 2017 round of the Arizona Innovation Challenge (AIC) has concluded as the Arizona Commerce Authority (ACA) today announced six awardees in its bi-annual business plan competition.
The evolution of energy markets is accelerating in the direction of a greater reliance upon distributed energy resources (DER), whether those resources generate, consume, or store electricity. One strategy to address this trend is a virtual power plant (VPP), the concept that the intelligent aggregation and optimisation of DER can provide the same essential services as a traditional 24/7 centralised power plant. Because views of what constitutes a VPP diverge widely, Navigant Research has developed its own definition, which has generally been validated by market participants:
“A VPP is a system that relies upon software and a smart grid to remotely and automatically dispatch retail DER services to a distribution or wholesale market via an aggregation and optimisation platform.”
VPPs can help transform formerly passive consumers into active prosumers through the integration and optimisation of new technologies. Recently, the focus has been on the pairing of solar PV systems with advanced batteries at the household or commercial building level. Through sophisticated software algorithms, it is possible to create DER fleets (i.e. VPPs) capable of providing bidirectional value to both sides of the utility meter.
The primary goal of a VPP is to achieve the greatest possible profit for asset owners while maintaining the proper balance of the electricity grid—at the lowest possible economic and environmental cost. From the outside, the VPP looks like a single power production facility that publishes one schedule of operation and can be optimised from a single remote site. From the inside, the VPP can combine a rich diversity of independent resources into a network via the sophisticated planning, scheduling, and bidding of DER-based services.
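As a toy illustration of this "single schedule from many resources" idea (the asset mix and numbers below are invented for illustration, not drawn from Navigant or any vendor platform), the external view of a VPP can be sketched as the sum of its internal asset schedules:

```python
# Toy sketch: aggregate heterogeneous DER schedules into the single
# operating schedule a VPP publishes to the outside world.

def aggregate_schedules(asset_schedules):
    """Sum per-asset hourly output (kW) into one VPP-level schedule.

    asset_schedules: dict mapping asset name -> list of 24 hourly kW values
                     (positive = injection, negative = consumption).
    """
    hours = len(next(iter(asset_schedules.values())))
    return [sum(s[h] for s in asset_schedules.values()) for h in range(hours)]

# Illustrative fleet: rooftop solar, a battery, and a flexible load.
fleet = {
    "solar":   [0, 0, 0, 0, 0, 1, 3, 5, 7, 8, 9, 9, 9, 8, 7, 5, 3, 1, 0, 0, 0, 0, 0, 0],
    "battery": [0] * 16 + [4, 4, 4, 4, 0, 0, 0, 0],  # discharge in the evening
    "load":    [-2] * 24,                             # constant flexible load
}
vpp_schedule = aggregate_schedules(fleet)
print(max(vpp_schedule))  # the fleet's net peak injection
```

From the grid operator's perspective, only `vpp_schedule` is visible; the internal mix of solar, storage, and load is the aggregator's concern.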
VPPs are dependent upon utility programs, regulations and/or organised markets for revenue and hence survival. They are outward-looking aggregations that can typically only solve problems upstream if regulations allow them to do so.
The greatest challenge in this regard for regulatory treatment may be for mixed asset VPPs, a segment of the overall VPP market that incorporates not only generation, but loads as well as energy storage. These VPPs have characteristics that traditionally were treated separately, often in silos. VPPs instead look to treat these DER aggregations as systems of systems. Ideally, the mix of resources does not matter; it is the services that can be provided that should be counted and compensated for in markets or within utility programs.
It is clear that utilities play a major role in this market transformation, as this is where many VPPs are incubated. Clear, transparent interconnection rules, rational telemetry requirements, and coherent and stable subsidy schemes all contribute to a market environment conducive to VPPs.
Energy storage is not a prerequisite to the creation of a VPP. Instead, it enhances the flexibility and underlying value of the other generation and load assets being assembled within the mixed asset VPP portfolio. Once storage is included, the VPP becomes dispatchable and schedulable, and assets that are not themselves schedulable become more attractive. Navigant Research has deemed VPPs that include energy storage mixed asset VPPs, distinguishing them from load-based demand response (DR) VPPs and those VPPs limited to aggregating generation (i.e., supply-side VPPs).
Various market trends have led to energy storage enabled mixed asset VPPs.
If investment in energy storage is included in the tally, the total VPP market today is estimated as a US$731.4 million market. By 2025, including energy storage results in an estimated cumulative investment of US$68.6 billion over a decade of VPP growth. Of that total revenue pie, North America is forecast to capture 38.1%. Asia Pacific is expected to come in second in cumulative spending with 34.6% compared to Europe’s 26.4%.
Transforming the market for virtual power plants with advances in energy storage.
Cumulative VPP capacity and implementation spending (with energy storage) by region, world markets: 2016-2025. Image: Navigant.
Despite the impressive value of energy storage assets being deployed within VPPs, the ultimate driver for VPP implementation is still software. Intentionally broad, this segment of the automation and intelligence market encompasses solutions across the utility value chain, from far-reaching enterprise IT down to single-application, standalone distribution automation solutions. While some form of telemetry, such as smart meters, is a prerequisite for a VPP, as are individual device controls and communication infrastructure, it is the software enabling aggregation, optimisation, and ultimately the market interface that is the most vital enabling technology for VPPs. The market for VPP software is forecasted to reach US$1.8 million annually by 2025. Without such software, there can be no VPP.
Along with energy storage, the growing sophistication of DR will be another key driver behind the success of the overall VPP market. The more DR can be automated, called up in near real-time, and surgically administered at the distribution grid to keep problems from cascading to the wholesale market, the better this VPP platform will look to all stakeholders, including utilities. Backing up DR with energy storage can help make it a more dependable resource capable of smart renewables integration, further building the business case for mixed asset VPPs.
Source: Peter Asmus’ Transforming the market for virtual power plants with advances in energy storage on Energy Storage News
The need to evolve meter data management technology to gain more insight has become increasingly apparent
A recent report from Navigant Research analyzes the market for meter data management systems (MDMSs) and meter data analytics (MDA), with global market forecasts for revenue, broken out by segment and region, through 2024.
Today, the United States alone logs billions of data points every day from more than 50 million installed smart electric meters. The increasing volume of meter data and new applications that rely on this data, such as demand-side management and grid optimization, have driven the need for enhanced MDMSs and MDA. According to the report, revenue for meter data management systems and analytics is expected to total $10.3 billion from 2015 to 2024.
“There is a growing emphasis on converging data from external sources for the purpose of more advanced, predictive analytics,” says Lauren Callaway, research analyst with Navigant Research. “This has supported developments in MDA technologies that have the capability of dealing with differently structured forms of data, generating prescriptive and predictive insights.”
As more and more networked devices penetrate the distribution grid, the need to evolve MDM technology has become increasingly apparent, according to the report. The increasing need for both prescriptive and predictive analytics as a method for gaining insight and taking action supports developments in MDA solutions, and MDMS are a critical technology in this evolution.
The report, Market Data: Meter Data Management, provides an analysis of global market trends that are affecting the uptake of MDM technologies. The study analyzes the market in two key categories, MDMSs and MDA, which are further broken down into four segments: software licenses and upgrades, services, maintenance, and software as a service. Global market forecasts for revenue, broken out by segment and region, extend through 2024. The report also examines the key technology developments, drivers, and barriers related to MDMSs and MDA. An Executive Summary of the report is available for free download on the Navigant Research website.
Once an organization has a grasp on all the energy data at its fingertips, the logical next step is figuring out how to use that information.
In my last post, I discussed the various kinds of energy data and how to collect them. But how do you align data collection goals with energy management initiatives across the enterprise?
High-resolution data from smart meters or other interval metering technologies are significantly more robust than utility bills, but this increased granularity is not always required. There are some energy management use cases that just require utility bill data.
First, what is the value proposition? There are a variety of estimates, and in some cases they compound savings across a full energy information system (EIS), rather than by use case. Lawrence Berkeley National Lab calculates that EIS can help realize 17 percent median site savings and 8 percent median portfolio savings, and most other sources estimate savings at 10 to 20 percent.
Energy reporting and benchmarking
This is one of the first steps that almost every enterprise will go through when establishing an energy management program.
Benchmarking can answer questions such as “how am I doing?” and “which sites are performing well, and which are laggards?” In addition, energy management programs typically pull data from many utilities and many other sources (interval and/or utility data), so centralized reporting of this information yields valuable insights.
Most industry benchmarking, such as Energy Star, is based on utility bill data. So, if the goal is simply to have Energy Star scores across your enterprise, your organization will just need utility bills.
At the same time, benchmarking with interval data can help to identify buildings that have particularly expensive peak demand charges or inefficient startups and shutdowns. The interval data will provide more detail about where and when these problems occur, making it easier to address them.
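A minimal sketch of what interval benchmarking adds (all readings below are invented): the load factor, average demand divided by peak demand, is a simple spikiness metric that flags buildings with expensive peaks when computed from interval readings.

```python
# Hypothetical sketch: benchmark a building's demand profile from
# interval readings. A low load factor flags spiky, costly peaks.

def load_factor(demand_kw):
    """Average demand divided by peak demand over an interval series."""
    peak = max(demand_kw)
    avg = sum(demand_kw) / len(demand_kw)
    return avg / peak

# Illustrative day of hourly readings: flat overnight, one sharp
# mid-afternoon spike that a monthly bill would not localize in time.
readings = [50] * 8 + [120] * 6 + [300] + [120] * 5 + [50] * 4
print(round(load_factor(readings), 2))
```

A portfolio-wide ranking by load factor is one way to pick which buildings merit the detailed audit mentioned below.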
That said, some firms might find that using utility bills to benchmark a portfolio provides enough information before selecting specific buildings to undergo a detailed energy audit.
This is a nebulous concept within the energy management space. Most organizations will attempt to estimate future energy costs and consumption based on past performance. This normally is part of a corporate-wide annual budgeting initiative.
Some firms will then track against these projections throughout the year. This has been a core use case for energy management for years, if not decades.
There are a few different approaches to developing a forward-looking estimate, from simply adding an inflation percentage to the current year's consumption to using a more complex model with multiple variables, such as changes in square footage, utility rates or overall energy use.
Once the variables are set for the coming year, a budget estimate of energy use and cost can be generated. The challenge is that these budgets are only as good as the underlying assumptions. If you think that footage will grow by 5 percent and peak demand will go down by 2 percent, there are a variety of tools and services to build this budget. But the budget won’t be accurate if these assumptions don’t materialize.
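The assumption-driven budget described above can be sketched in a few lines (the percentages and dollar figures are illustrative inputs, not recommendations):

```python
# Minimal sketch of an assumption-driven energy budget: project next
# year's consumption and cost from last year's actuals plus stated
# assumptions about growth, rates, and energy intensity.

def energy_budget(last_year_kwh, last_year_cost, sqft_growth_pct,
                  rate_change_pct, intensity_change_pct):
    """Return (projected kWh, projected cost) for the coming year."""
    kwh = last_year_kwh * (1 + sqft_growth_pct / 100) \
                        * (1 + intensity_change_pct / 100)
    rate = (last_year_cost / last_year_kwh) * (1 + rate_change_pct / 100)
    return kwh, kwh * rate

# Illustrative: 5% more floor area, 3% rate increase, 2% efficiency gain.
kwh, cost = energy_budget(1_000_000, 120_000, sqft_growth_pct=5,
                          rate_change_pct=3, intensity_change_pct=-2)
print(round(kwh), round(cost, 2))
```

As the text notes, the output is only as good as those three assumed percentages.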
Interval data can add granularity to the budget while also adding significant complexity. Some organizations use interval data to provide daily and weekly budget visibility. The budget is generated based on monthly utility bills, but it is tracked based on interval data.
Most building professionals want to know as soon as possible when actual performance deviates from the budget. The granularity of interval data can provide advance notice if a building is on track to exceed its budget. By the time the utility bill arrives, it is too late to avoid the overage.
Interval data can give building professionals time to make changes that bring the budget back in line. If the budget value still happens to be exceeded, interval data will provide a better story around why it happened.
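The pacing check described above can be sketched as follows; the straight-line daily proration is a simplifying assumption, and the numbers are invented:

```python
# Illustrative sketch: compare month-to-date interval consumption to a
# monthly budget prorated by day, flagging overage risk before the bill.

def budget_pace_alert(daily_kwh, monthly_budget_kwh, days_in_month=30):
    """True if month-to-date use runs ahead of a straight-line budget."""
    days_elapsed = len(daily_kwh)
    prorated = monthly_budget_kwh * days_elapsed / days_in_month
    return sum(daily_kwh) > prorated

# 10 days in at 400 kWh/day against a 10,500 kWh monthly budget:
# 4,000 kWh used vs. 3,500 kWh prorated -> alert fires with time to act.
print(budget_pace_alert([400] * 10, 10_500))
```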
This is an increasingly common strategy that requires interval data. The goal is to avoid costly peak demand charges by reducing demand at the times when a building is using the most energy.
Battery storage vendor Stem reports that demand charges can make up 50 percent of the utility bill. The building must have visibility into interval demand, which is not commonly provided on utility bills.
A utility bill may identify the single point in time each month that peak demand was set, but without knowing whether there are points throughout the month when demand was nearly as high as the peak, it is difficult to build a demand reduction strategy. For example, avoiding a peak of 1,000 kW isn't cost-effective if other weekdays have a peak of 990 to 995 kW.
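This near-peak check is easy to express with interval data (the demand profile below is invented to mirror the 1,000 kW example):

```python
# Hypothetical sketch: count how many intervals come within a tolerance
# of the monthly peak. Many near-peak intervals mean shaving the single
# highest peak buys little, since the next-highest days set a new peak.

def near_peak_intervals(demand_kw, tolerance_pct=1.0):
    """Number of intervals within tolerance_pct of the peak."""
    peak = max(demand_kw)
    threshold = peak * (1 - tolerance_pct / 100)
    return sum(1 for d in demand_kw if d >= threshold)

# One 1,000 kW peak, but 21 other daily peaks in the 990-995 kW range.
month = [1000] + [995, 992, 990] * 7 + [600] * 9
print(near_peak_intervals(month, tolerance_pct=1.0))
```

A high count tells the building professional that a demand reduction strategy must address the whole cluster of near-peak days, not one event.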
A utility bill may not provide this full picture, which makes it hard for building professionals to understand the opportunities to reduce demand and build a plan to do so.
Real-time data from on-site meters, rather than utility-owned smart meters, will enable alerts to be set that proactively warn users as demand approaches peak thresholds.
These data enable control scenarios to be implemented that automatically reduce demand, such as the dimming of lights or small modifications to temperature setpoints. Utility bills may identify a peak demand problem but can’t solve it.
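A simple control trigger of the kind described above might look like this; the setpoint changes and thresholds are invented for illustration:

```python
# Illustrative sketch: map a real-time demand reading to a simple
# load-shedding action before a new monthly peak is set.

def demand_response_action(current_kw, peak_threshold_kw):
    """Pick an escalating action as demand approaches the threshold."""
    if current_kw >= peak_threshold_kw:
        return "raise cooling setpoint 2F and dim lights 20%"
    if current_kw >= 0.95 * peak_threshold_kw:
        return "dim lights 10%"
    return "no action"

print(demand_response_action(960, 1000))  # within 5% of the threshold
```

In practice these actions would be dispatched through the building automation system rather than printed, but the threshold logic is the same.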
Settling the tab
There are a few kinds of utility bill validation. With only utility bills, a review of rates and check for billing errors may yield significant cost savings.
Accenture reports that 1-2 percent of all utility bills contain errors. This is especially true for firms that have hundreds or thousands of utility bills and rarely review them in detail. At the same time, comparing values from interval meters to the utility bill statements may identify even more discrepancies and billing errors.
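The cross-check between interval meters and bill statements can be sketched as a simple reconciliation (the kWh figures are invented):

```python
# Hypothetical sketch: reconcile a bill's kWh against the summed
# interval-meter readings for the same period; a large gap flags the
# bill as a candidate for a detailed billing-error review.

def bill_discrepancy_pct(billed_kwh, interval_kwh_readings):
    """Percent difference between billed kWh and the metered total."""
    metered = sum(interval_kwh_readings)
    return abs(billed_kwh - metered) / metered * 100

# Bill says 10,400 kWh; 720 hourly readings sum to 10,000 kWh -> 4% gap.
gap = bill_discrepancy_pct(10_400, [10_000 / 720] * 720)
print(round(gap, 1))
```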
But the process for challenging a utility bill based on what likely are non-revenue grade interval meters can be complex and varies by utility.
The goal of measurement and verification (M&V) is to calculate energy savings from efficiency projects and operational changes. Instead of comparing energy use before and after a project is implemented, M&V enables a comparison of actual energy use to what the use would have been if the project was not conducted.
This is more accurate, since occupancy and weather changes might impact energy use after the project; without the efficiency project, energy consumption under those conditions would have been even higher. M&V approaches typically include building a regression model of pre-retrofit energy use with variables like weather and occupancy.
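The regression-baseline idea can be sketched with a single weather variable (the degree-day and kWh figures are invented, and kept perfectly linear so the arithmetic is easy to follow; real data is noisier and uses more variables):

```python
# Sketch of regression-based M&V: fit pre-retrofit use against cooling
# degree days (CDD), then compare post-retrofit actual use to the
# model's counterfactual prediction for the same weather.

def fit_line(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Pre-retrofit months: (CDD, kWh), deliberately on a perfect line.
cdd_pre = [100, 200, 300, 400]
kwh_pre = [11000, 12000, 13000, 14000]
a, b = fit_line(cdd_pre, kwh_pre)

# Post-retrofit month with 250 CDD used 11,800 kWh; the baseline model
# says the building would have used more without the project.
baseline = a + b * 250
savings = baseline - 11_800
print(round(savings))
```

The savings figure is the "avoided energy" that M&V reports, rather than a simple before/after subtraction.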
M&V is the foundation of performance contracting and many utility-sponsored energy efficiency programs. There are industry standards from the Efficiency Valuation Organization and ASHRAE, which detail how M&V should function.
These standards do not dictate the use of a particular data set, and in many cases, utility bills are used to build a baseline model and conduct M&V.
At the same time, Lawrence Berkeley National Lab and others have been actively investigating how interval data can improve M&V, generally finding that accurate regression models can be developed with interval data. Additionally, LBNL surveyed some of the leading commercial products that use interval data for M&V, finding that the state of the market is strong.
M&V can be conducted with utility bills or interval data, but moving forward, interval data-based models will show a variety of advantages and the gap between these two data sources will widen.
Alerts and anomaly detection
Alerts can shift reactive, schedule-based operations toward a more proactive effort. Especially when software solutions provide advanced workflow- and role-management capabilities, alerts that direct building professionals to problems right when they occur (or even before they occur) are more effective than finding issues when a utility bill arrives or when an occupant files a complaint.
Utility bill data can drive billing alerts, but the market is moving towards more interval data-based alerting capabilities.
In addition to interval energy data, trend data from a building automation system (BAS) can drive alerts. Insights from this data, such as current chiller performance, can identify specific equipment problems.
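One simple form of interval-data anomaly detection compares each hour against its own recent history (the readings and the 1.5x threshold below are invented):

```python
# Illustrative sketch: flag hours where today's use far exceeds the
# same hour's historical average, a basic anomaly-detection alert.

def anomalous_hours(history_by_hour, today_kwh, factor=1.5):
    """Return hours where today's use exceeds factor * historical mean."""
    flags = []
    for hour, today in enumerate(today_kwh):
        mean = sum(history_by_hour[hour]) / len(history_by_hour[hour])
        if today > factor * mean:
            flags.append(hour)
    return flags

# Three days of history for four hours; today's hour 2 spikes sharply,
# e.g. equipment left running or a chiller fault.
history = [[10, 12, 11], [20, 19, 21], [30, 31, 29], [15, 14, 16]]
print(anomalous_hours(history, today_kwh=[11, 20, 50, 15]))
```

Commercial tools layer on seasonality, weather normalization, and workflow routing, but the comparison-to-history core is the same.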
These are a few of the primary use cases for energy data. Firms looking to build an energy management plan should consider data availability and think about the desired use cases and outcomes.
While interval data does provide benefits above and beyond what can be accomplished with utility bills, there are some scenarios in which bills are satisfactory.
Source: Joseph Aamidor’s “GreenBiz 101: Putting your energy data to work” on GreenBiz
Earlier this year, Ellen Williams, the director of ARPA-E, the U.S. Department of Energy’s advanced research program for alternative energy, made headlines when she told the Guardian newspaper that “We have reached some holy grails in batteries.”
Despite very promising results from the 75-odd energy-storage research projects that ARPA-E funds, however, the grail of compact, low-cost energy storage remains elusive.
A number of startups are closer to producing devices that are economical, safe, compact, and energy-dense enough to store energy at a cost of less than $100 a kilowatt-hour. Energy storage at that price would have a galvanic effect, overcoming the problem of powering a 24/7 grid with renewable energy that’s available only when the wind blows or the sun shines, and making electric vehicles lighter and less expensive.
But those batteries are not being commercialized at anywhere near the pace needed to hasten the shift from fossil fuels to renewables. Even Tesla CEO Elon Musk, hardly one to underplay the promise of new technology, has been forced to admit that, for now, the electric-car maker is engaged in a gradual slog of enhancements to its existing lithium-ion batteries, not a big leap forward.
In fact, many researchers believe energy storage will have to take an entirely new chemistry and new physical form, beyond the lithium-ion batteries that over the last decade have shoved aside competing technologies in consumer electronics, electric vehicles, and grid-scale storage systems. In May the DOE held a symposium entitled “Beyond Lithium-Ion.” The fact that it was the ninth annual edition of the event underscored the technological challenges of making that step.
Qichao Hu, the founder of SolidEnergy Systems, has developed a lithium-metal battery (which has a metallic anode, rather than the graphite material used for the anode in traditional lithium-ion batteries) that offers dramatically improved energy density over today’s devices (see “Better Lithium Batteries to Get a Test Flight”). The decade-long process of developing the new system highlighted one of the main hurdles in battery advancement: “In terms of moving from an idea to a product,” says Hu, “it’s hard for batteries, because when you improve one aspect, you compromise other aspects.”
Added to this is the fact that energy storage research has a multiplicity problem: there are so many technologies, from foam batteries to flow batteries to exotic chemistries, that no one clear winner is attracting most of the funding and research activity.
According to a recent analysis of more than $4 billion in investments in energy storage by Lux Research, startups developing “next-generation” batteries—i.e., beyond lithium-ion—averaged just $40 million in funding over eight years. Tesla’s investment in its Gigafactory, which will produce lithium-ion batteries, will total around $5 billion. That huge investment gap is hard to overcome.
“It will cost you $500 million to set up a small manufacturing line and do all the minutiae of research you need to do to make the product,” says Gerd Ceder, a professor of materials science at the University of California, Berkeley, who heads a research group investigating novel battery chemistries. Automakers, he points out, may test new battery systems for years before making a purchase decision. It’s hard to invest $500 million in manufacturing when your company has $5 million in funding a year.
Even if new battery makers manage to bring novel technologies to market, they face a dangerous period of ramping up production and finding buyers. Both Leyden Energy and A123 Systems failed after developing promising new systems, as their cash needs climbed and demand failed to meet expectations. Two other startups, Seeo and Sakti3, were acquired before they reached mass production and significant revenues, for prices below what their early-stage investors probably expected.
Meanwhile, the Big Three battery producers, Samsung, LG, and Panasonic, are less interested in new chemistries and radical departures in battery technology than they are in gradual improvements to their existing products. And innovative battery startups face one major problem they don’t like to mention: lithium-ion batteries, first developed in the late 1970s, keep getting better.
July 12, 2016
A report from the Massachusetts Institute of Technology on the value of energy storage finds that combining certain storage technologies with wind and solar power projects can be economic at current prices in some locations.
“There is a window of opportunity right now with storage costs and wind and solar costs where they are,” says Jessika Trancik, one of the authors of the report and the Atlantic Richfield career development assistant professor of energy studies at MIT.
The study was published in Nature Climate Change and co-authored by graduate students William Braff and Joshua Mueller.
The premise of the study is that wind and solar penetration is currently so small that “they do not measurably contribute to climate change mitigation at current installation levels.”
The authors argue that low cost storage can play a pivotal role in furthering the penetration of those renewable resources by converting intermittent wind and solar power to on-demand power that is economically attractive to investors. But that opportunity for storage technologies, particularly pumped hydro and compressed air, could diminish as renewable energy costs continue to decrease.
Solar and wind power projects are not going to make greater inroads unless individual investors have a reason to invest in them, Trancik said.
A lot of focus has been on the role storage can play in smoothing out the intermittent output of wind and solar power, but what matters to potential investors is the price curve rather than the demand curve, the authors argue.
One of the challenges in trying to make those evaluations is that it has been difficult to compare the costs of different storage technologies in “two dimensions” — that is, both in terms of energy (kWh) and power (kW).
The MIT study, “Value of storage technologies for wind and solar energy,” aims to bridge that gap with cost comparisons across technologies. Trancik said that earlier studies have quantified the benefits of particular storage technologies for given locations and uses, such as frequency regulation, energy arbitrage, converting intermittent renewables into baseload power, and increasing the profits of intermittent renewable energy. They have not, however, shown how the benefits depend on the two dimensions of costs for storage technologies.
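The two-dimensional cost comparison can be made concrete with a toy model; all cost figures below are invented to illustrate the shape of the trade-off, not taken from the study:

```python
# Sketch of two-dimensional storage costing: total capital cost depends
# on both power capacity (kW) and energy capacity (kWh), so the cheapest
# technology depends on the required discharge duration.

def storage_cost(power_kw, energy_kwh, cost_per_kw, cost_per_kwh):
    """Total capital cost from separate power and energy components."""
    return power_kw * cost_per_kw + energy_kwh * cost_per_kwh

def cheaper_for_duration(hours, power_kw=1000):
    """Compare a high-power-cost/low-energy-cost technology (hydro-like)
    with a low-power-cost/high-energy-cost one (battery-like)."""
    energy = power_kw * hours
    hydro = storage_cost(power_kw, energy, cost_per_kw=2000, cost_per_kwh=20)
    battery = storage_cost(power_kw, energy, cost_per_kw=300, cost_per_kwh=250)
    return "pumped-hydro-like" if hydro < battery else "battery-like"

print(cheaper_for_duration(1))   # short discharge favors batteries
print(cheaper_for_duration(12))  # long discharge favors pumped hydro
```

This is why, as discussed below, long-duration arbitrage favors pumped hydro and compressed air while short, case-specific services favor batteries.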
Trancik and her colleagues looked at one particular application — bundling storage with solar and wind power projects in order to make those projects more economically attractive. To do that they modeled projects able to arbitrage power prices within the project. In other words, they studied utility-scale solar and wind projects able to store energy at off-peak prices and sell the stored energy at peak prices.
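The price arbitrage being modeled can be sketched in miniature; the hourly prices, storage size, and round-trip efficiency are all invented inputs:

```python
# Minimal sketch of within-project price arbitrage: charge the store at
# the cheapest hour, sell the round-trip output at the priciest hour.

def daily_arbitrage_profit(prices, energy_kwh, efficiency=0.75):
    """Profit from buying energy_kwh at the day's lowest price and
    selling the recovered energy at the highest price."""
    buy = min(prices)
    sell = max(prices)
    return energy_kwh * (sell * efficiency - buy)

# Hourly prices ($/kWh): cheap overnight, expensive evening peak.
prices = [0.03] * 6 + [0.08] * 12 + [0.15] * 4 + [0.05] * 2
print(round(daily_arbitrage_profit(prices, energy_kwh=1000), 2))
```

A real model would also respect power limits and charge/discharge scheduling across hours, but the spread between off-peak and peak prices, net of efficiency losses, is the revenue the study's storage earns.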
Going into the study, they looked at conditions in different locations – California, Texas and Massachusetts – in order to capture the variations in price fluctuations and pricing regimes. They found, however, that if a particular storage solution was economical in one of their locations, it would work in the others.
Surprisingly, it turned out that despite wide regional variations in the average prices and the amount of variability in demand and pricing, “the best storage technology in one location is also the best in the other,” Trancik said. That was one of the surprises that came out of the study. “We didn’t know that going in,” she said.
Sometimes, however, location can dictate technology. For example, the authors found that in Texas, at today’s prices, pumped hydro systems can provide added value for solar or wind installations: the extra revenue from selling power into the grid at peak prices would exceed the cost of adding the storage system, they said.
Trancik admits, though, that there might not be a lot of opportunities for pumped hydro storage in Texas’ relatively flat landscape, but compressed air storage is also a viable low cost technology for those applications. California’s topography, meanwhile, could more readily lend itself to new pumped storage projects.
In locations where storage can increase the value of solar and wind projects, pumped storage and compressed air technologies tended to dominate with their low energy costs and high power costs. The reverse is true of batteries, she said, but batteries have the advantage of being easier to deploy and less dependent on the physical features of the locations, such as high elevation reservoirs for pumped storage or large underground caverns where compressed air can be stored.
While batteries may be better suited for different, case-specific applications such as frequency regulation or backup power, non-battery storage options fared better for the kind of rate arbitrage examined, which requires long discharge times. Trancik said that investors should take advantage of current costs for solar and wind power, as well as falling storage costs to combine the technologies in instances where they can enhance the economics of a project.
Pumped hydro storage (PHS) and compressed air energy storage (CAES) can provide added value to wind and solar projects in Texas at current prices, the MIT researchers found.
By moving now to combine the technologies, investors could spur further cost reductions in storage technologies, the authors say. But if they fail to act, the window could close.
If wind and solar technology costs decline more rapidly than storage costs, it could reach an inflection point for investors where the economics of adding more solar or wind technology to build a larger project could outstrip the economic advantage of adding storage, Trancik said.
At current prices those technologies can work in concert to boost the profitability of a project, but that window will not stay open indefinitely, she said.
A late 2015 report from Lazard came to a similar conclusion. In that report, Lazard looked at the levelized cost of energy storage and found that storage could be on the verge of a virtuous cycle in which greater adoption spurs storage cost declines and that, in turn, spurs higher rates of deployment.
At current prices, Lazard also noted that pumped hydro and compressed air storage can be competitive with gas peakers, along with some applications of lithium-ion batteries.
Despite the rosy findings for the economics of non-battery storage, researchers largely agree costs of battery systems will need to decline further before they become truly widespread. A separate report by MIT and Argonne National Laboratory earlier this year found that continued innovation and cost declines for lithium-ion batteries and other electrochemical storage technologies will be necessary to economically justify large-scale deployment in future low-carbon power systems.
In short, “costs have to improve before we see ubiquitous adoption of storage,” Jesse Jenkins, an MIT PhD candidate and one of the authors of that study, said.
June 12, 2016
Batteries capable of storing power at utility scale will be as widespread in 12 years as rooftop solar panels are now, revolutionizing the way consumers use energy.
That’s the conclusion of Bloomberg New Energy Finance, which forecasts the battery market may be valued at $250 billion or more by 2040. It expects 25 gigawatts of the devices to be deployed by 2028, about the size of the small-scale photovoltaic industry now.
The findings in the researcher’s New Energy Outlook indicate a further challenge to the traditional utility business model, where power generation and distribution are monopolized in a single company. Energy storage devices can be used to smooth out variable power flows from wind and solar plants, reducing the need for large, centralized generation plants fired by fossil fuels.
“Batteries will get a boost as costs drop and developers see the chance for lucrative new revenue streams,” said Julia Attwood, storage analyst at Bloomberg New Energy Finance. “Batteries could offer a whole range of services to the grid — they have the flexibility that will allow renewables a larger stake in energy generation.”
Currently, less than 1 gigawatt of batteries are operating on the grid around the world. By 2040, the industry will mushroom, storing and discharging 759 gigawatt-hours, BNEF estimates.
The spread of electric cars is driving up demand for lithium-ion batteries, the main technology for storage devices that are attached to utility grids and rooftop solar units. That’s allowing manufacturers to scale up production and slash costs. BNEF expects the technology to cost $120 a kilowatt-hour by 2030, compared with more than $300 now and $1,000 in 2010.
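As a quick arithmetic check on that trajectory (assuming "now" means the article's 2016 date, which is an assumption on our part), the quoted figures imply a steady compound annual decline:

```python
# Arithmetic sketch: the compound annual rate of decline implied by
# BNEF's quoted figures of $300/kWh in 2016 falling to $120/kWh by 2030.

def annual_decline_rate(start_cost, end_cost, years):
    """Compound annual rate of decline, as a percentage."""
    return (1 - (end_cost / start_cost) ** (1 / years)) * 100

rate = annual_decline_rate(300, 120, years=14)  # 2016 -> 2030
print(round(rate, 1))  # percent per year
```

A decline of roughly 6% per year is modest next to the 2010-2016 drop from $1,000 to $300, consistent with the article's theme of scale-driven rather than breakthrough-driven cost reduction.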
That would help grid managers solve the intermittency problem that comes with renewables — wind and solar plants don’t work in calm weather or at night, creating a need for baseload supplies to fill the gaps. Today, that’s done by natural gas and coal plants, but the role could eventually be passed to power-storage units.
The researcher estimates 35 percent of all light vehicles sold will be electric in 2040, equivalent to 41 million cars. That’s about 90 times the figure in 2015. Investment in renewables is expected to rise to $7.8 trillion by then, compared with $2.1 trillion going into fossil-fuel generation.
“The battery industry today is driven by consumer products like computers and mobile phones,” said Claire Curry, an analyst at Bloomberg New Energy Finance in New York. “Electric vehicles will be the driver of battery technology change, and that will drive down costs significantly.”
The industry still has a long way to go. About 95 percent of the world’s grid-connected energy storage today is still pumped hydro, according to the U.S. Energy Department. That’s when surplus energy is used to shift large amounts of water uphill to a reservoir so it can be used to produce electricity later at a hydropower plant. The technology only works in areas with specific topographies.
There are several larger-scale battery projects in the works, according to S&P Global. They include a 90-megawatt system in Germany being built by Essen-based STEAG Energy Services GmbH and Edison International’s 100-megawatt facility in Long Beach, California.
“Utility-scale storage is the new emerging market for batteries, kind of where electric vehicles were five years ago,” said Simon Moores, managing director at Benchmark Mineral Intelligence, a battery researcher based in London. “EVs are now coming of age.”
Source: Bloomberg’s “Batteries Storing Power Seen as Big as Rooftop Solar in 12 Years”
WASHINGTON – Today, the Energy Department released the On the Path to SunShot reports, a series of eight research papers examining the state of the U.S. solar energy industry and the progress made to date toward the SunShot Initiative’s goal to make solar energy cost-competitive with other forms of electricity by 2020. The solar industry is currently about 70 percent of the way toward achieving the Initiative’s 2020 goals. But as solar has become more affordable, helping the industry grow an astonishing 23-fold since the beginning of the Obama Administration, new challenges and opportunities have emerged.
The reports released today explore the lessons learned in the first five years of the ten-year Initiative and identify key research, development, and market opportunities that can help to ensure that solar energy technologies are widely affordable and available to power millions more American homes and businesses.
“Solar energy is an integral part of our nation’s ongoing energy revolution,” said U.S. Secretary of Energy Dr. Ernest Moniz. “The U.S. has over 10 times more solar installed today compared to 2011 when the SunShot Initiative was first launched, and the overall costs of solar have dropped by 65 percent. The Administration’s continued efforts through the SunShot Initiative will help to further reduce costs to make solar energy more accessible and affordable for American families and businesses.”
Launched in 2011, the SunShot Initiative set out to reduce the cost of solar energy technologies by 75 percent within a decade across the residential, commercial, and utility-scale sectors. Since then, solar technologies, solar markets, and the solar industry have changed dramatically. The On the Path to SunShot series serves as a follow-up to the 2012 SunShot Vision Study, which analyzed the economic and environmental benefits that would result from achieving SunShot’s ambitious 2020 goal. This new study series explores the areas of focus that could help the United States to achieve cost-competitive solar energy.
Among the conclusions from the study series, a recurring theme emerges: sustained innovation across all levels of the industry, from cell efficiency improvements to faster and cheaper installation methods, will help achieve the Energy Department’s SunShot goals.
The On the Path to SunShot series was developed in collaboration with leading researchers from the National Renewable Energy Laboratory, Lawrence Berkeley National Laboratory, Sandia National Laboratories, and Argonne National Laboratory.
New reports from Navigant Research and Grand View Research raise two interesting questions: What is the relationship between energy management systems and building energy management systems (EMS and BEMS)? How are the definitions of these two terms changing as technology evolves and energy efficiency becomes a priority?
The Navigant research predicts that the global building energy management systems (BEMS) market will grow from $2.8 billion last year to $10.8 billion in 2024. The study makes clear that buildings, now “intelligent buildings,” are in the middle of great changes in telecom, information technology, and energy generation.
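Navigant's two figures imply a steep compound growth rate for the BEMS market. A quick sketch, assuming 2015 as the base year ("last year" is not stated explicitly in the article):

```python
# Implied growth of the global BEMS market from Navigant's figures:
# $2.8B "last year" (assumed 2015) to $10.8B in 2024.
base, target = 2.8, 10.8        # USD billions
years = 2024 - 2015             # 9-year horizon (assumption)

multiple = target / base        # total growth multiple
cagr = multiple ** (1 / years) - 1

print(f"growth multiple: {multiple:.1f}x")  # ~3.9x
print(f"implied CAGR:    {cagr:.1%}")       # ~16% per year
```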
The nature of the technology is that buildings are networked to create efficiencies. “Intelligent buildings become nodes in the Energy Cloud,” Navigant Senior Research Analyst Casey Talon told Energy Manager Today. “In other words, these facilities can be assets to balance grid pressures and sites for renewables and distributed energy resources to expand customer choice. The capacity to manage energy consumption in a dynamic way with automation, controls, and analytics is fundamental to the concept of the intelligent building, and this functionality is exactly what helps the intelligent building interact with the Energy Cloud.”
The Grand View Research report took a broader view. It looked at the overall energy management sector. The report defines EMS to include power and energy, telecom and IT, manufacturing, retail and office, healthcare and other categories, according to the press release.
The research found that the global market will reach $58.59 billion by 2020. International standardization and global initiatives will be key drivers, the report found. The largest product segment is industry energy management systems (IEMS), which generated 60 percent of the overall category’s revenues last year.
A tremendous amount has been written during the past couple of years on BEMS and EMS. As time passes, it will become important for the vast ecosystem to understand how the two fit together. This will be especially vital from the marketing and technical perspectives.
An obvious goal of EMS and BEMS is more efficient energy use. However, things can get a bit tricky. Creating an energy-neutral building is difficult, but defining one is relatively straightforward: depending upon the precise definition used, a building that generates as much or more energy than it uses during a given period of time meets the objective.
However, to really get a picture of the impact a building is having, wouldn’t it be just as relevant to consider a broader set of variables? For instance, if the building is constructed in a rural area not served by mass transportation, it is fair to assume that more energy would be expended by workers commuting than if the building were closer to population centers or near a railroad station.
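The argument above can be made concrete with a toy energy balance. Every number here is an illustrative assumption, not data from the article:

```python
# Toy illustration: a building can be "net zero" on-site while its
# commuting footprint dwarfs the on-site surplus. All inputs are
# hypothetical round numbers chosen for the sketch.

def net_site_energy(consumed_kwh, generated_kwh):
    """On-site balance only; a negative result means a net exporter."""
    return consumed_kwh - generated_kwh

def commute_energy(workers, round_trip_km, kwh_per_km, workdays=220):
    """Annual energy expended in car commuting to the building."""
    return workers * round_trip_km * kwh_per_km * workdays

site = net_site_energy(500_000, 520_000)   # building exports 20 MWh/year
commute = commute_energy(300, 60, 0.7)     # 300 drivers, 60 km round trip

print(f"on-site balance: {site:,} kWh")      # -20,000 kWh (net exporter)
print(f"commuting load:  {commute:,.0f} kWh")
```

On these assumed numbers, the commuting energy runs into the millions of kilowatt-hours per year, far outweighing the building's modest on-site surplus, which is exactly the point the article goes on to make.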
That context is one of the ideas behind The Sustainable Communities and Climate Protection Act of 2008. The Institute for Local Government says that the California law, which also is known as SB 375…
…builds on the existing framework of regional planning to tie together the regional allocation of housing needs and regional transportation planning in an effort to reduce greenhouse gas (GHG) emissions from motor vehicle trips.
The law assumes that where the building is (in relation to transportation) is important. Thus, the determination of how energy efficient the building really is goes beyond the energy use of the building itself. In other words, it is technically correct but misleading to say that a building that generates as much energy as it uses is energy efficient if it requires hundreds of people to commute to it by car every weekday.
That’s important. If energy efficiency is measured on a community-wide scale, the way in which managers approach things will be different. In such a context, it may not be necessary for each building in a community to generate as much energy as it uses. It may only matter what the numbers are on the broader community-wide scale.
Another way to phrase the question at the top of this post is this: How does adding tremendous connectivity and communications capabilities to the overall power grid change the role of an individual building in the context of its community?
EMS and BEMS infrastructure will be used to create and operate smart cities. The bottom line is that building managers and owners will need to understand where buildings fit in the big picture: How are the management capabilities and analytic data created by sensors and other equipment in the building relevant outside the structure? A good first step to answering this question is clearly defining the limits of EMS and BEMS.