Applied Energy 206 (2017) 193–205
An energy performance evaluation methodology for individual office building with dynamic energy benchmarks using limited information
Jiangyan Liu a, Huanxin Chen a,*, Jiahui Liu a, Zhengfei Li a, Ronggeng Huang a, Lu Xing a, Jiangyu Wang a, Guannan Li a,b

a Department of Refrigeration and Cryogenics, Huazhong University of Science and Technology, Wuhan, China
b School of Urban Construction, Wuhan University of Science and Technology, Wuhan, China
* Corresponding author. E-mail address: [email protected] (H. Chen).
HIGHLIGHTS
• This study developed dynamic energy benchmarks for an individual office building.
• Less information was used to establish the energy benchmarks.
• Four energy benchmarks were established according to four power consumption patterns.
• Comparative analysis was conducted between the energy baseline and the dynamic energy benchmarks.
• The evaluation results of the energy baseline were improved using the proposed dynamic energy benchmarks.
ARTICLE INFO
Keywords: Dynamic energy benchmarks; Energy consumption pattern; Energy consumption rating system; Information poor buildings

ABSTRACT
A rational and reliable energy benchmark is useful for understanding and enhancing building performance, yet most buildings cannot provide sufficient information for a detailed energy assessment. This work presents a systematic methodology for developing dynamic energy benchmarks for an individual office building with very limited information. Simultaneously, an energy consumption rating (ECR) system is established to provide a vertical energy assessment for an individual office building over a short time span, i.e. hourly. Based on the data produced by the DOE prototype large office building model run in the EnergyPlus environment, this study is conducted in three steps: (1) Step 1: data preparation; (2) Step 2: development of the dynamic energy benchmarks; and (3) Step 3: evaluation of the dynamic energy benchmarks and the ECR system. Based on the decision tree analysis, the system energy consumption is classified into eight patterns by a few commonly accessible weather and time variables, i.e. outdoor dry-bulb temperature, relative humidity, day type and time type. Then, four energy benchmarks are developed according to the four energy consumption patterns on weekdays. To verify the effectiveness of the proposed dynamic energy benchmarks, they are used to evaluate the building energy performance in September, October and November, respectively. Besides, a comparative analysis is conducted between the energy baseline (i.e. the same benchmark used for all energy consumption patterns) and the proposed dynamic energy benchmarks. Accordingly, the hourly ECRs were calculated using the energy baseline and the proposed dynamic energy benchmarks, respectively. Results showed that the energy baseline can be improved by using the proposed dynamic energy benchmarks, and that the proposed method is capable of evaluating the energy performance of information poor office buildings.
1. Introduction
The building sector occupies the lion's share of both energy and resources. Currently, it constitutes about 40% of total energy consumption worldwide as well as 30% of global greenhouse gas emissions [1]. Previous investigations demonstrated that typical buildings consume 20% more energy than required due to inefficient
operation procedures, non-optimal control schedules and unnoticed faults [2]. In recent years, building energy benchmarking has gradually become a useful technique because it can assess systematic energy behavior and help operation personnel to identify abnormal energy use and inefficient operation states. It is defined as a macroscopic level of performance evaluation, using metrics to measure the building energy performance relative to its previous performance or other typical
buildings [3]. There is a large volume of published studies describing the role of energy benchmarking in building energy performance evaluation [4,5]. Tronchin and Fabbri [6] used three different simulation methods (i.e. an operational rating based on energy bills, dynamic simulation with the DesignBuilder software, and simplified simulation with the BestClass software) to analyze the energy performance of a single house in Italy. Florio and Teissier [7] employed a typology-based model to estimate the EPCs of a housing stock using insufficient energy use data. Menezes et al. [8] presented a case study on how lighting, small power and catering equipment affect the accuracy of electricity prediction. Kabak et al. [9] examined a "fuzzy multi-criteria decision making" approach to analyze the National Building Energy Performance Calculation Methodology in Turkey. Koo and Hong [10] developed a dynamic operational rating (DOR) system for existing buildings based on a geostatistical approach and a data-mining technique; it was proposed to solve the irrationality of the conventional operational rating system (i.e. the negative correlation between the space unit size and the CO2 emission density). Park et al. [11] presented an energy benchmark for improving the operational rating system of office buildings based on various data-mining techniques. Jeong et al. [12–14] established an energy benchmark to evaluate the energy efficiency of residential buildings in Korea; the proposed method was more reasonable than the original benchmarks, as it solved the irrationality of benchmarks derived from the overall database. Furthermore, efforts have been made to provide multi-level benchmarks from the building level down to the system, subsystem and/or component level. Yan et al. [15,16] proposed a simplified monthly energy performance calculation method based on basic energy balances for information poor buildings, which can provide energy performance data of a building at multiple levels. Wang et al. [17] presented a detailed multi-level energy diagnosis method to identify poor energy performance of a building, which can provide weekly, daily and hourly diagnoses at the building level. In addition, in the past two decades, many countries and institutions have focused on assessing the energy performance of buildings by developing operational rating (OR) systems, such as the Display Energy Certificates (DECs) of the UK [18], the Energy Performance Certificates (EPCs) of the European Union [19], the Energy Star of the US Environmental Protection Agency [20], and the Building Energy Quotient of ASHRAE [21]. These OR systems compare the actual energy consumption of a building with that of a typical building, which can be referred to as an energy benchmark, and then evaluate the energy performance of the building by calculating the OR according to the specific energy benchmark. The OR is a numeric indicator of the amount of annual energy consumption, which provides a quantitative assessment of building energy performance by classifying the energy consumption into several grades. It evaluates the energy performance of a building by comparing it with other buildings of a similar category located in the same climate region [22]. In a nutshell, previous studies on building energy benchmarking usually address the building energy performance over a relatively long time span, e.g. annual (365 days) benchmarking [22] or monthly benchmarking [15].

However, since the energy performance of every building is invariably changeable due to shifty weather as well as internal instability factors (e.g. occupant behavior, electric equipment operation), it is improper to evaluate the energy performance of an individual building using only an annual or monthly average. In addition, a detailed multi-level energy benchmark for a building is very useful, but it usually requires comprehensive information for model development, such as sub-metering data and building design data [17]. However, most existing buildings are information poor buildings in which very few sub-meters are installed, especially for auxiliary equipment such as fans, lifts and lighting [23]. It is also difficult and time-consuming to obtain detailed building design data for some historical buildings. On the other hand, it is clear that the OR system is widely used to provide a reliable and fair energy assessment for buildings in many countries. Generally, the OR is employed to give a horizontal energy evaluation for buildings that have different energy performance, as it compares the energy consumption of a given building to a typical building. There has been less discussion about developing a similar rating system for an individual building, which could provide a vertical energy evaluation by comparing the current energy consumption to the previous typical energy consumption of the same building. Since the energy performance of a building is very changeable due to shifty weather and internal instability factors, it is difficult to judge whether drastic energy consumption variations are caused by normal factors (e.g. shifty weather and occupant behavior) or by faults. Hence, a reasonable and reliable energy benchmark is necessary for evaluating the energy performance of an individual building. In addition, a rating system is a promising tool to provide a short-term energy assessment (e.g. daily and hourly) for an individual building. Most importantly, the energy benchmark should be established with less information in order to achieve a generalization that can be applied to most buildings. According to the above analysis, a knowledge gap has been identified: an individual building needs a proper energy benchmark for a detailed energy performance evaluation, while few buildings can provide sufficient information. Therefore, this paper proposes dynamic energy benchmarks and an energy consumption rating (ECR) system to provide a vertical energy evaluation for an individual office building, especially for information poor buildings, in a short time span, i.e. hourly.

The remainder of this paper is organized as follows. In Section 2, the framework of the proposed methodology is presented and each phase of the proposed method is introduced step by step. In Section 3, the energy performance evaluation results are analyzed and the comparative analysis results are presented. Conclusive remarks are given in the final section.
2. Methodology
The framework of the proposed methodology is illustrated in Fig. 1. It consists of three steps: (1) Step 1: data preparation. The building energy data were collected from the DOE prototype large office building model and the initial database was processed using different methods. (2) Step 2: development of the dynamic energy benchmarks. The processed database was classified into proper clusters using a decision tree; then, the dynamic energy benchmarks of the different energy consumption patterns were established and validated. (3) Step 3: evaluation of the dynamic energy benchmarks and the ECR system. The energy evaluation results of three months (i.e. September, October and November) obtained with the dynamic energy benchmarks and the ECR system were analyzed, and a comparative analysis was conducted between the energy baseline and the proposed dynamic energy benchmarks.

2.1. Step 1: Data preparation

2.1.1. Step 1.1: Data collection
To establish the database for developing the dynamic energy benchmarks of the office building, EnergyPlus [24] is used as the simulation program to produce the data in this work. In addition, we deployed a prototype large office building model developed by the Department of Energy (DOE) of the U.S., since the DOE repository covers building types that directly characterize more than 80% of commercial buildings [25]. Moreover, there are 17 representative U.S. cities available for selection, which stand for all possible climate locations according to the American National Standards Institute (ANSI) and the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) classification of climate zones [25]. In this work, we selected Miami as the representative city, which represents ASHRAE climate zone 1A. It has a relatively long air-conditioning season from March to November, while its wet and hot season usually begins in May and continues through mid-October. The DOE prototype large office building has 12 storeys and a basement and is served by a typical chilled-water variable air volume (VAV) air-conditioning system.
Fig. 1. Research framework.
The total building area is 46,320 m2. There are four perimeter zones and one core zone on each floor. The window-to-wall ratio is 0.38 for all four orientations. Fig. 2 shows the topology of the large office building. For the internal heat source, the lighting power density and the electric equipment power density are both 10.76 W/m2. Besides, the occupant densities of the basement and of each thermal zone of the storeys are 37.16 m2/person and 18.58 m2/person, respectively. Fig. 3 shows the schedules of occupancy, lighting and internal equipment used in the simulation. In addition, on the air side there are four multi-zone VAV systems with four variable-volume fans. On the water side, there are two water-cooled screw chillers with two variable-speed pumps in the chilled water loop and a constant-speed pump in the cooling water loop. Moreover, according to the air-conditioning season mentioned above, we specified the simulation period of the EnergyPlus model as March–November. The generated results of the simulation form the synthetic database that was utilized for the data analysis process.

Fig. 2. Large office typology.

2.1.2. Step 1.2: Data processing
One aim of this step is to select proper features for model development. EnergyPlus has the advantage of reporting a wide variety of input and output variables, whereas a real building management system (BMS) typically records a limited range of variables; in particular, there are even fewer information records in some information poor buildings. Hence, in order to generate a realistic synthetic database, the most commonly measured variables in actual office buildings are chosen as input variables and are grouped into weather data and smart meter records. In previous studies, Liang et al. [26] argued that the time of week and the outdoor air temperature are crucial to energy consumption prediction in commercial office buildings. Zhao et al. [27] demonstrated the importance of the outdoor dry-bulb temperature, day type and time type for the energy performance evaluation of VRV systems in office buildings. In addition, the study of Sun et al. [28] suggests that the outdoor temperature and relative humidity can improve the prediction accuracy of office building energy consumption. In general, previous studies have emphasized the importance of the outdoor temperature, relative humidity and time information for office building energy performance evaluation. On the other hand, these variables are almost always available in real building BMS databases, even in information poor buildings. Therefore, according to the above analysis, the outdoor dry-bulb temperature, relative humidity, day type and time type are selected as the input variables in this work. Note that the day type denotes the type of day, i.e. "weekday" or "weekend", and the time type denotes the hour of the day, i.e. 0, 1, 2, ..., 23. In addition, the total electricity consumption of the chillers and pumps is selected as the energy consumption data, which is also easy to obtain from sub-meters in real buildings. The other aim of this step is to improve the quality of the database. In this work, the interquartile range rule [29] was used to remove outliers from the database. The method provides both a lower threshold (LT) and an upper threshold (UT) for eliminating outliers, as shown in Eqs. (1) and (2):
LT = Q1 − 1.5 × (Q3 − Q1)    (1)
Fig. 3. Simulated large office building design day schedules and outdoor air dry-bulb temperature.
Fig. 4. Energy consumption rating (ECR) system for vertical energy evaluation.
Fig. 5. Energy consumption characteristics from March to November.
UT = Q3 + 1.5 × (Q3 − Q1)    (2)

where Q1 and Q3 are the first quartile and the third quartile of the energy consumption data, respectively.
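To make this step concrete, the sketch below applies the thresholds of Eqs. (1) and (2) to an hourly consumption table. It is an illustration only; the data frame and the column name are assumptions rather than part of the original study.

```python
import pandas as pd

def remove_outliers_iqr(df: pd.DataFrame, col: str = "energy_kwh") -> pd.DataFrame:
    """Keep only the rows whose consumption lies within [LT, UT] of Eqs. (1)-(2)."""
    q1, q3 = df[col].quantile([0.25, 0.75])
    lt = q1 - 1.5 * (q3 - q1)   # lower threshold, Eq. (1)
    ut = q3 + 1.5 * (q3 - q1)   # upper threshold, Eq. (2)
    return df[df[col].between(lt, ut)]

# 'hourly_df' is assumed to hold one row per simulated hour with an 'energy_kwh' column:
# hourly_df = remove_outliers_iqr(hourly_df)
```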
2.2. Step 2: Development of the dynamic energy benchmarks
2.2.1. Step 2.1: Classification of the energy consumption patterns using decision tree
In this work, we employed the decision tree to classify the initial energy consumption data based on the different input variables. The decision tree is a popular data-mining algorithm that can be used for both regression and classification. In comparison with other "black-box"-like methods such as the support vector machine and the artificial neural network, the decision tree is self-explanatory: its recursive tree construction provides accessible partition rules at each split, and useful domain knowledge can be extracted from the trees. There are various decision tree algorithms, such as ID3, C4.5, CART and the conditional inference tree. ID3 employs information gain as the splitting criterion [30]; it is efficient for discrete data but cannot handle numeric attributes. C4.5 is an extension of ID3 that is capable of handling both numeric and categorical data [31]. CART is the most popular decision tree method and employs the Gini gain or twoing criterion for splitting [32]. Hothorn et al. [33] proposed the conditional inference tree, which applies statistical test procedures to both variable selection and stopping. It can handle all kinds of regression problems, including nominal, ordinal, numeric and multivariate response variables and arbitrary measurement scales of the covariates. Previous studies have reported that the conditional inference tree is well suited to simultaneously handling the various data types (i.e. numeric, factor, etc.) encountered in buildings and HVAC systems [34,35]. Therefore, the conditional inference tree was used for the energy consumption classification in this work. As a result, the initial energy consumption data were classified into eight clusters. Considering the unsteady energy consumption on weekends and the zero energy use at night, the four energy consumption patterns of weekdays were selected to establish the energy benchmarks. The results are discussed in detail in Section 3.1.1.
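The sketch below (not the authors' code) illustrates how such a tree-based partition of the hourly data can be reproduced in Python; scikit-learn's CART regressor is used here as a stand-in for the conditional inference tree of the original study, and the file and column names are assumptions.

```python
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

# Assumed layout of the processed database: one row per hour with
# 't_out' (outdoor dry-bulb temperature, degC), 'rh' (relative humidity, %),
# 'day_type' (0 = weekday, 1 = weekend), 'hour' (0-23) and 'energy_kwh'.
df = pd.read_csv("hourly_office_data.csv")            # hypothetical file name

features = ["t_out", "rh", "day_type", "hour"]
tree = DecisionTreeRegressor(max_leaf_nodes=8,        # eight clusters, as in the paper
                             min_samples_leaf=50,
                             random_state=0)
tree.fit(df[features], df["energy_kwh"])

df["pattern"] = tree.apply(df[features])              # leaf index = consumption pattern
print(export_text(tree, feature_names=features))      # human-readable split rules
```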
Fig. 6. Results of the conditional inference tree analysis.

Table 1. Results of the ANOVA analysis.
Pattern | No. of cases | Mean | Std | 95% confidence level (Min – Max) | F | Sig.
1 | 468 | 278.552 | 52.360 | 137.070 – 411.840 | 2451.000 | 0.000***
2 | 551 | 422.101 | 60.135 | 262.310 – 579.910 | |
3 | 1275 | 488.377 | 75.377 | 295.530 – 658.650 | |
4 | 643 | 590.199 | 40.153 | 489.480 – 691.640 | |
*** The mean difference among patterns is significant at the 0.05 level.

Table 2. Results of post hoc analysis among patterns 1–4.
Compared patterns | M.D. | Sig. | 95% confidence level Min | 95% confidence level Max
Pattern 1 vs Pattern 2 | −148.549 | 0.000*** | −159.973 | −137.125
Pattern 1 vs Pattern 3 | −214.825 | 0.000*** | −224.647 | −205.004
Pattern 1 vs Pattern 4 | −316.647 | 0.000*** | −327.689 | −305.605
Pattern 2 vs Pattern 1 | 148.549 | 0.000*** | 137.125 | 159.973
Pattern 2 vs Pattern 3 | −66.276 | 0.000*** | −75.541 | −57.012
Pattern 2 vs Pattern 4 | −168.098 | 0.000*** | −178.648 | −157.549
Pattern 3 vs Pattern 1 | 214.825 | 0.000*** | 205.004 | 224.647
Pattern 3 vs Pattern 2 | 66.276 | 0.000*** | 57.012 | 75.541
Pattern 3 vs Pattern 4 | −101.822 | 0.000*** | −110.612 | −93.032
Pattern 4 vs Pattern 1 | 316.647 | 0.000*** | 305.605 | 327.689
Pattern 4 vs Pattern 2 | 168.098 | 0.000*** | 157.549 | 178.648
Pattern 4 vs Pattern 3 | 101.822 | 0.000*** | 93.032 | 110.612
*** The mean difference of two patterns is significant at the 0.05 level.
Fig. 7. Cumulative distribution curve of the power consumption for different patterns.
Fig. 8. Kernel density estimation and frequency histogram of the system power at pattern 1.
2.2.2. Step 2.2: Establishment and validation of the dynamic energy benchmarks
In order to validate the classification results of the decision tree analysis, we employed both the analysis of variance (ANOVA) method and a post hoc test to provide a statistical identification. ANOVA is a collection of statistical models used to analyze the differences among group means and their associated procedures. It provides a statistical test of whether or not the means of several groups are equal [36]. In this work, the ANOVA method is used to test whether there are significant differences in the dependent variable (i.e. the energy consumption) among the different energy consumption patterns classified by the decision tree. The post hoc test is a statistical method that provides specific pairwise comparisons among groups [37] and is usually used together with ANOVA. In the ANOVA analysis, a significant factor indicates that one or more groups have a different distribution. However, in order to confirm the significant difference between each pair of groups, multiple comparisons using the post hoc test must be implemented based on the ANOVA analysis results [38]. There are various algorithms to realize the post hoc test, including the Benjamini–Hochberg test [39], Šidák's inequality test [40], Tukey's test [41] and Scheffé's test [42]. Scheffé's test can be applied to the set of estimates of all possible contrasts among groups, not just the pairwise differences; it is flexible and can be used to test unequal groups [12,42]. Hence, Scheffé's test was employed to provide multiple comparisons among the different energy consumption patterns in this work. After the validation of the decision tree classification results, the energy benchmarks are established according to the different energy consumption patterns. A representative value (e.g. the mean value or the median value) is defined as the energy benchmark that outlines the average level of energy consumption of all cases in each pattern. The results are described in detail in Section 3.1.2.

Table 3. Energy benchmarks of the different power consumption patterns.
Pattern | Conditions | Baseline benchmark (kWh) | Proposed benchmark (kWh)
1 | Outdoor temperature ≤ 25.46 °C and relative humidity ≤ 71.83% | 457.10 | 272.10
2 | Outdoor temperature ≤ 25.46 °C and relative humidity > 71.83% | 457.10 | 427.20
3 | 25.46 °C < Outdoor temperature ≤ 28.90 °C | 457.10 | 497.40
4 | Outdoor temperature > 28.90 °C | 457.10 | 592.60
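A minimal sketch of this validation step is given below, assuming each hourly record already carries the pattern label produced in Step 2.1. The one-way ANOVA uses SciPy, and Scheffé's pairwise contrasts are written out explicitly, since they reduce to a simple F-criterion.

```python
from itertools import combinations
import numpy as np
from scipy import stats

# groups: pattern id -> array of hourly consumption values, taken from the labelled df
groups = {p: g["energy_kwh"].to_numpy() for p, g in df.groupby("pattern")}

f_stat, p_value = stats.f_oneway(*groups.values())    # one-way ANOVA across patterns
print(f"ANOVA: F = {f_stat:.1f}, p = {p_value:.4f}")

k = len(groups)
n_total = sum(len(v) for v in groups.values())
ms_within = sum(((v - v.mean()) ** 2).sum() for v in groups.values()) / (n_total - k)
f_crit = stats.f.ppf(0.95, k - 1, n_total - k)         # 0.05 significance level

for (a, va), (b, vb) in combinations(groups.items(), 2):
    diff = va.mean() - vb.mean()
    f_pair = diff ** 2 / (ms_within * (1 / len(va) + 1 / len(vb)))
    sig = f_pair > (k - 1) * f_crit                    # Scheffe criterion for the contrast
    print(f"pattern {a} vs {b}: mean difference {diff:8.3f} kWh, significant: {sig}")
```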
Fig. 9. ECR distribution by pattern in September.
2.3. Step 3: Evaluation of the dynamic energy benchmarks and ECR system

2.3.1. Step 3.1: Calculation of the ECR based on the proposed benchmarks
In Step 2, the dynamic energy benchmarks are established and validated. In order to evaluate the building energy performance quantitatively, it is necessary to develop an energy rating system. In the DECs of the UK, the Operational Rating (OR) of a building must be displayed; it is a numeric indicator of the actual annual amount of energy consumed during the occupation of the building [22]. It is shown on a scale from A to G, where A is the most energy efficient and G is the least energy efficient. The OR is compared with a hypothetical building whose performance is equal to the typical performance of its type (the benchmark); the typical performance for that building type corresponds to an OR of 100. Moreover, the asset ratings of the DEC are on a scale of 0–150, where 0 denotes the most energy efficient building and 150 the least energy efficient building [43]. The OR of the DEC is used to reasonably evaluate the energy performance of existing public buildings [22,43–47]. However, it is usually employed to give a horizontal energy evaluation for buildings that have different energy performance, as it compares the energy consumption of a given building to a typical building. In this work, the dynamic energy benchmarks are developed to provide a vertical energy evaluation for an individual building: the current energy consumption is compared with the previous typical energy consumption. Therefore, we established an energy consumption rating (ECR) system with reference to the OR system of the DEC. It provides a quantitative building energy assessment in a short time span, i.e. hourly. In addition, seven grades (i.e. A–G) are assigned according to the OR of the DEC (refer to Fig. 4). The energy consumption rating (ECR) is calculated in two steps: first, the energy benchmarks are obtained for the different clusters based on the decision tree analysis; then, the ratings are calculated using the energy benchmark by Eq. (3):

ECRactual = ECactual × (1 / Energy Benchmark) × 100    (3)

where ECRactual is the energy consumption rating of the building in a given condition, ECactual is the energy consumption of the building in the given condition, and Energy Benchmark is the median value of the related cluster. As shown in Fig. 4, if the ECR of the building in a given condition is 100, the energy performance in that condition is the same as the energy benchmark. Besides, ECRs are defined as incentive ratings when they are less than the typical rating (i.e. 100); inversely, ECRs larger than the typical rating are deemed penalty ratings.

Table 4. Comparative analysis results of the energy baseline and the proposed dynamic energy benchmarks in September. Entries are the number of cases per grade, with the ratio (%) in parentheses; the last column gives the energy conservation/penalty ratio.
Pattern | Benchmark | A | B | C | D | E | F | G | Conservation / penalty (%)
1 | Baseline | 0 | 0 | 2 (100.00) | 0 | 0 | 0 | 0 | 100.00 / 0.00
1 | Proposed | 0 | 0 | 0 | 1 (50.00) | 1 (50.00) | 0 | 0 | 50.00 / 50.00
2 | Baseline | 0 | 0 | 1 (1.33) | 55 (73.33) | 19 (25.33) | 0 | 0 | 74.67 / 25.33
2 | Proposed | 0 | 0 | 1 (1.33) | 33 (44.00) | 41 (54.67) | 0 | 0 | 45.33 / 54.67
3 | Baseline | 0 | 0 | 0 | 21 (14.29) | 120 (81.63) | 6 (4.08) | 0 | 14.29 / 85.71
3 | Proposed | 0 | 0 | 0 | 74 (50.34) | 72 (48.98) | 1 (0.68) | 0 | 50.34 / 49.66
4 | Baseline | 0 | 0 | 0 | 0 | 28 (25.00) | 84 (75.00) | 0 | 0.00 / 100.00
4 | Proposed | 0 | 0 | 0 | 55 (49.11) | 57 (50.89) | 0 | 0 | 49.11 / 50.89
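As a worked illustration of Eq. (3) and the grade assignment (a sketch only; the A–G boundaries below are assumptions patterned on the DEC operational-rating bands, not values given in the paper):

```python
def ecr(ec_actual_kwh: float, benchmark_kwh: float) -> float:
    """Eq. (3): an ECR of 100 means consumption equal to the pattern benchmark."""
    return ec_actual_kwh / benchmark_kwh * 100.0

# Illustrative A-G bands; incentive ratings are ECR <= 100, penalty ratings ECR > 100.
GRADE_BANDS = [(25, "A"), (50, "B"), (75, "C"), (100, "D"),
               (125, "E"), (150, "F"), (float("inf"), "G")]

def grade(rating: float) -> str:
    return next(label for upper, label in GRADE_BANDS if rating <= upper)

# Dynamic benchmarks of Table 3 (median kWh of each weekday pattern).
benchmarks = {1: 272.10, 2: 427.20, 3: 497.40, 4: 592.60}

r = ecr(450.0, benchmarks[3])               # a hypothetical hour classified as pattern 3
print(f"ECR = {r:.1f}, grade {grade(r)}")   # ECR < 100, i.e. an incentive rating
```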
2.3.2. Step 3.2: Comparative analysis of the energy baseline and the proposed energy benchmarks
To test the effectiveness of the dynamic energy benchmarks, a comparative analysis was conducted between them and the energy baseline. In general, the energy baseline uses the same energy benchmark for all energy consumption patterns, calculated as the average energy consumption over the operating period. The proposed approach instead uses a different energy benchmark for each energy consumption pattern. Accordingly, the hourly ECRs were calculated to give a quantitative energy assessment using the energy baseline and the proposed dynamic energy benchmarks, respectively. The comparative results are discussed in detail in Section 3.2.

3. Results
In this work, we selected the data from March to November to develop and evaluate the proposed dynamic energy benchmarks as well as the ECR system. The selected months cover the transition season and the air-conditioning season, as shown in Fig. 5. Specifically, the data of March–August were used to train the decision tree and obtain the dynamic energy benchmarks, while the data of September, October and November were employed to evaluate the established benchmarks, respectively.

3.1. Development of the dynamic energy benchmarks

3.1.1. Classification of the energy consumption patterns using decision tree
The conditional inference tree was used to identify the energy consumption patterns of the office building. As shown in Fig. 6, the initial data set was classified into eight clusters based on the four criteria (i.e. outdoor dry-bulb temperature, relative humidity, day type and time type). First, the data at hours 0, 1, 2, 3, 4, 21, 22 and 23 were separated from the database, showing zero energy consumption; it is obvious that the system is shut down at night in the office building, which leads to zero energy use. Second, the remaining data were divided based on the criterion of "day type". This indicated that the energy consumption patterns of the weekdays are similar to each other, while the patterns of weekdays, Saturdays and Sundays are very different from one another. Third, for the weekday cluster (i.e. node 4 in Fig. 6), the data were separated according to the criterion of outdoor dry-bulb temperature; it was found that the energy consumption increased as the outdoor dry-bulb temperature rose. Then, the data of node 5 were classified into two clusters according to "relative humidity", which resulted in two different energy consumption patterns as shown in Fig. 6. This indicated that the power use increased as the relative humidity rose, because a higher relative humidity leads to a higher enthalpy difference between indoors and outdoors when the dry-bulb temperature is relatively steady. Hence, more power would be
consumed to cool down the rooms. In addition, the data of node 8 were separated into two clusters according to "outdoor dry-bulb temperature". For the weekend cluster (i.e. node 11 in Fig. 6), the data were first separated according to the criterion of "time type", because the energy consumption at 17:00 and 18:00 was very low and different from that of the other hours. Then, the remaining data were divided based on "outdoor dry-bulb temperature". However, the energy consumption patterns of node 14 and node 15 were similar in some cases, as shown in Fig. 6; this is because the energy consumption of the system is unsteady on weekends due to irregular occupant activity as well as equipment use. Therefore, four energy consumption patterns were classified by the conditional inference tree on weekdays, as shown in Fig. 6. Considering the irregularity of the energy consumption on weekends and the zero energy use at night, we developed the energy benchmarks using the data of the four energy consumption patterns on weekdays. Hence, four energy benchmarks are established based on the four energy consumption patterns in this work.

Fig. 10. ECR distribution by pattern in October.
3.1.2. Validation of the classification results
The decision tree results showed that there were four energy consumption patterns on weekdays, suggesting that the power uses of the four patterns were significantly different from each other. To validate the classification results, we employed both the analysis of variance (ANOVA) and a post hoc test to provide a statistical identification. First, the ANOVA method was used to analyze the differences among the means of the four energy consumption patterns. Table 1 shows the ANOVA analysis results; it was found that the four patterns
showed statistically significant differences on all evaluation criteria, with p-values below 0.05 (ANOVA, p < 0.05). The results indicated that there were significant differences in the energy consumption among the four patterns classified by the decision tree. Second, multiple comparisons using the post hoc test were implemented based on the ANOVA analysis results. As shown in Table 2, the mean differences between each pair of patterns were significant, with p-values below 0.05 (Scheffé's test, p < 0.05). The results indicated that the distribution of each pattern was independent. In addition, Fig. 7 illustrates the cumulative distribution curves of the four energy consumption patterns on weekdays. It was found that the median value of each cluster is different, indicating that it is reasonable and reliable to develop energy benchmarks based on the four energy consumption patterns classified by the decision tree.

3.1.3. Establishment of the new energy benchmarks
Since the decision tree analysis results have been validated using the ANOVA analysis, the post hoc test and the cumulative distribution curve visualization, the four classified energy consumption patterns were employed to establish the dynamic energy benchmarks. The energy benchmark is a representative value that outlines the average level of energy consumption of all cases in each pattern. Hence, it is reasonable to choose a value that divides the data evenly, with 50% of the total in the upper part and 50% in the lower part. The mean value and the median value of a sample are common indexes to represent its average level. However, the mean value is easily affected by the outliers of the sample; moreover, when the mean value is applied to a sample whose distribution is skewed, the ratio of the upper and lower cases may not be equal [12,13,48]. Fig. 8 shows the energy consumption distribution of pattern 1. It was found that if the mean value is applied as the energy benchmark, 56.41% of the total cases would be grouped into the lower part, whereas the upper and lower cases are distributed evenly when the median value is used. Hence, the median value is chosen as the energy benchmark in this work. Table 3 shows the developed dynamic energy benchmark of each energy consumption pattern identified by the decision tree analysis, which represents the median value of each pattern.
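The preference for the median over the mean can be checked with a short sketch, again assuming the labelled hourly data frame introduced earlier:

```python
import numpy as np

x = df.loc[df["pattern"] == 1, "energy_kwh"].to_numpy()   # e.g. the pattern 1 hours

for name, benchmark in (("mean", float(x.mean())), ("median", float(np.median(x)))):
    share_below = (x <= benchmark).mean() * 100
    print(f"{name} benchmark = {benchmark:.1f} kWh -> "
          f"{share_below:.1f}% of hours at or below it")

# For a skewed pattern the mean splits the hours unevenly (about 56% below, as in Fig. 8),
# whereas the median splits them 50/50, which is why the median is used as the benchmark.
```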
Table 5. Comparative analysis results of the energy baseline and the proposed dynamic energy benchmarks in October. Entries are the number of cases per grade, with the ratio (%) in parentheses; the last column gives the energy conservation/penalty ratio.
Pattern | Benchmark | A | B | C | D | E | F | G | Conservation / penalty (%)
1 | Baseline | 0 | 1 (2.94) | 32 (94.12) | 1 (2.94) | 0 | 0 | 0 | 100.00 / 0.00
1 | Proposed | 0 | 0 | 0 | 17 (50.00) | 16 (47.06) | 1 (2.94) | 0 | 50.00 / 50.00
2 | Baseline | 0 | 2 (2.17) | 18 (19.57) | 49 (53.26) | 23 (25.00) | 0 | 0 | 75.00 / 25.00
2 | Proposed | 0 | 0 | 15 (16.30) | 28 (30.43) | 49 (53.26) | 0 | 0 | 46.74 / 53.26
3 | Baseline | 0 | 0 | 8 (5.37) | 39 (26.17) | 88 (59.06) | 14 (9.40) | 0 | 31.54 / 68.46
3 | Proposed | 0 | 0 | 20 (13.42) | 49 (32.89) | 80 (53.69) | 0 | 0 | 46.31 / 53.69
4 | Baseline | 0 | 0 | 0 | 0 | 15 (33.33) | 30 (66.67) | 0 | 0.00 / 100.00
4 | Proposed | 0 | 0 | 0 | 23 (51.11) | 22 (48.89) | 0 | 0 | 51.11 / 48.89

3.2. Evaluation of the dynamic energy benchmarks and ECR system
This work proposed the dynamic energy benchmarks for information poor office buildings. To verify the effectiveness of the dynamic energy benchmarks, they were used to evaluate the building energy performance in September, October and November, respectively. In addition, a comparative analysis was conducted between the energy baseline and the proposed dynamic energy benchmarks. Accordingly, the hourly system ECRs were calculated using the energy baseline and the proposed dynamic energy benchmarks, respectively. The detailed analysis results are as follows.

Fig. 9 and Table 4 show the comparative results of the energy baseline and the proposed dynamic energy benchmarks in September. When the energy performance of the office building was evaluated using the energy baseline, the energy consumption of the system at Pattern 1 and Pattern 2 tended to receive low energy consumption grades, whereas the energy consumption of the system at Pattern 3 and Pattern 4 tended to receive high energy consumption grades, showing a disparity in the system ECRs. However, when the energy performance of the office building was evaluated using the proposed energy benchmarks, it tended to have even ECRs at each energy consumption pattern, showing an improvement in the ECRs. As shown in Table 4, it was found that 100% and 74.67% of the energy consumption at Pattern 1 and Pattern 2 were lower than the energy baseline, whereas the majority of the energy consumption at Pattern 3 (85.71%) and Pattern 4 (100%) was higher than the baseline. This indicates that there would be a severe disparity in the ECRs if the energy baseline were used to evaluate the energy performance of office buildings. Inversely, when using the proposed dynamic energy benchmarks, the ECRs below the proposed benchmark accounted for 50.00%, 45.33%, 50.34% and 49.11% of the total energy consumption at Pattern 1, 2, 3 and 4, respectively.
Fig. 11. ECR distribution by pattern in November.
This indicates that the ECRs at each energy consumption pattern were evenly distributed, which gave rise to an improvement over the energy baseline evaluation results.

Fig. 10 and Table 5 show the comparative results of the energy baseline and the proposed dynamic energy benchmarks in October. When the energy performance of the office building was evaluated using the energy baseline, the energy consumption at low outdoor dry-bulb temperatures (i.e. Pattern 1 and Pattern 2) tended to be lower than the baseline, while the system tended to have high ECRs at high outdoor dry-bulb temperatures (Pattern 3 and Pattern 4). The energy consumption below the baseline accounted for 100%, 75.00%, 31.54% and 0% of the total energy consumption at Pattern 1, Pattern 2, Pattern 3 and Pattern 4, respectively. However, the system tended to have more even ECRs at each energy consumption pattern when using the proposed energy benchmarks, since approximately 50% of the energy consumption of the system at each pattern was less than the proposed energy benchmark, as shown in Table 5. This implies that the proposed dynamic energy benchmarks can better identify the energy performance of the office building compared with the energy baseline.

Fig. 11 and Table 6 show the comparative results of the energy baseline and the proposed dynamic benchmarks in November. Since the weather becomes cool in this month and the outdoor temperature is lower than 28.90 °C, the system does not operate in Pattern 4. The energy performance evaluation results using the proposed energy benchmarks showed that the ECRs at each energy consumption pattern tended to be even compared with the energy baseline. For example, as shown in Table 6, whereas the system had an energy consumption below the energy baseline for a significantly high percentage of the total energy consumption at Pattern 1 and Pattern 2, i.e. 100% and 67.42%, respectively, these percentages decreased to 48.98% and 51.69% when the proposed energy benchmarks were used. Similarly, whereas the system had an energy consumption below the baseline for 39.98% of the total energy consumption at Pattern 3, this percentage adjusted to 53.39% when the proposed energy benchmarks were used. Therefore, it was confirmed that the use of the proposed energy benchmarks can resolve the disparity in the ECRs across the different energy consumption patterns of office buildings.

Table 6. Comparative analysis results of the energy baseline and the proposed dynamic energy benchmarks in November. Entries are the number of cases per grade, with the ratio (%) in parentheses; the last column gives the energy conservation/penalty ratio.
Pattern | Benchmark | A | B | C | D | E | F | G | Conservation / penalty (%)
1 | Baseline | 0 | 11 (11.22) | 76 (77.55) | 11 (11.22) | 0 | 0 | 0 | 100.00 / 0.00
1 | Proposed | 0 | 0 | 2 (2.04) | 46 (46.94) | 38 (38.78) | 12 (12.24) | 0 | 48.98 / 51.02
2 | Baseline | 0 | 1 (1.12) | 7 (7.87) | 52 (58.43) | 29 (32.58) | 0 | 0 | 67.42 / 32.58
2 | Proposed | 0 | 0 | 7 (7.87) | 39 (43.82) | 42 (47.19) | 1 (1.12) | 0 | 51.69 / 48.31
3 | Baseline | 0 | 0 | 0 | 46 (38.98) | 52 (44.07) | 20 (16.95) | 0 | 38.98 / 61.02
3 | Proposed | 0 | 0 | 0 | 63 (53.39) | 55 (46.61) | 0 | 0 | 53.39 / 46.61

4. Conclusions
To conclude, this paper presents a systematic methodology for developing dynamic energy benchmarks and an ECR system for individual office buildings. The method addresses the major challenge of vertical energy evaluation for an individual office building where only very limited information is available. Dynamic energy benchmarks are proposed to evaluate the changeable energy performance of office buildings. In addition, the energy consumption rating (ECR) system is established to provide a quantitative energy assessment in a short time span, i.e. hourly.
A comparative analysis between the proposed dynamic energy benchmarks and the energy baseline is conducted. The results show that the energy baseline can be improved by using the proposed dynamic energy benchmarks, and that the proposed method is capable of evaluating the energy performance of information poor office buildings.

The contribution of this work can be summarized in three aspects. First, the proposed method fills the gap of vertical energy evaluation for individual office buildings by comparing the current energy consumption to the previous typical energy consumption of the same building. A generalized methodology for developing dynamic energy benchmarks and an ECR system is proposed for office buildings, even for information poor buildings; it can be used to identify energy consumption patterns and to benchmark the energy uses. Second, the proposed dynamic energy benchmarks are developed and validated based on the simulation data of the DOE prototype large office building. Four energy consumption patterns of the office building are classified using the decision tree with a few commonly accessible weather and time variables, i.e. outdoor dry-bulb temperature, relative humidity, day type and time type. The reliability of the decision tree analysis results is validated using the ANOVA analysis and the post hoc test. Lastly, the ECR system is established to provide a quantitative assessment of the building energy performance by classifying the energy consumption rating into seven grades, i.e. A–G. A comparative analysis is implemented between the energy baseline and the proposed dynamic energy benchmarks, and the results show that the disparity and irrationality of the energy baseline were addressed by the proposed dynamic energy benchmarks.

The limitation of this work is that the proposed energy benchmark does not address detecting and diagnosing problems or faults in the building; its effectiveness is only validated by comparing it to the energy baseline. It is necessary to investigate the reliability of the dynamic energy benchmarks in judging whether drastic energy consumption variations are caused by normal factors (e.g. shifty weather and occupant behavior) or by faults. Therefore, future studies will focus on investigating the performance of the proposed method under different faulty conditions.

Acknowledgement
The research work presented in this paper is supported by the National Natural Science Foundation of China (Projects 51576074 and 51328602) and by the Graduates' Innovation Fund, Huazhong University of Science and Technology (Project 5003120005).
References [1] Buildings and climate change. Paris; 2009. [2] Song L, Liu M, Claridge DE, Haves P. Study of on-line simulation for whole building level energy consumption fault detection and optimization. In: Proceedings of architectural engineering, building integration solutions; 2003. p. 76–83. [3] Djuric N, Novakovic V. Review of possibilities and necessities for building lifetime commissioning. Renew Sustain Energy Rev 2009;13:486–92. [4] Hong T, Koo C, Kim J, Lee M, Jeong K. A review on sustainable construction management strategies for monitoring, diagnosing, and retrofitting the building’s dynamic energy performance: Focused on the operation and maintenance phase. Appl Energy 2015;155:671–707. [5] Li Z, Han Y, Xu P. Methods for benchmarking building energy consumption against its past or intended performance: an overview. Appl Energy 2014;124:325–34. [6] Tronchin L, Fabbri K. Energy performance building evaluation in Mediterranean countries: comparison between software simulations and operating rating simulation. Energy Build 2008;40:1176–87. [7] Florio P, Teissier O. Estimation of the energy performance certificate of a housing stock characterised via qualitative variables through a typology-based approach model: a fuel poverty evaluation tool. Energy Build 2015;89:39–48. [8] Menezes AC, Cripps A, Bouchlaghem D, Buswell R. Predicted vs. actual energy performance of non-domestic buildings: using post-occupancy evaluation data to reduce the performance gap. Appl Energy 2012;97:355–64. [9] Kabak M, Se EK, Lmaz OKR, Lu SB. A fuzzy multi-criteria decision making approach to assess building energy performance. Energy Build 2014. [10] Koo C, Hong T. Development of a dynamic operational rating system in energy performance certificates for existing buildings: geostatistical approach and datamining technique. Appl Energy 2015;154:254–70. [11] Park HS, Lee M, Kang H, Hong T, Jeong J. Development of a new energy benchmark for improving the operational rating system of office buildings using various datamining techniques. Appl Energy 2016;173:225–37. [12] Jeong J, Hong T, Ji C, Kim J, Lee M, Jeong K. Development of an integrated energy benchmark for a multi-family housing complex using district heating. Appl Energy 2016;179:1048–61. [13] Jeong J, Hong T, Ji C, Kim J, Lee M, Jeong K, et al. Improvements of the operational rating system for existing residential buildings. Appl Energy 2017;193:112–24. [14] Jeong J, Hong T, Ji C, Kim J, Lee M, Jeong K, et al. Development of a prediction model for the cost saving potentials in implementing the building energy efficiency rating certification. Appl Energy 2017;189:257–70. [15] Yan C, Wang S, Xiao F, Gao D. A multi-level energy performance diagnosis method for energy information poor buildings. Energy 2015;83:189–203. [16] Yan C, Wang S, Xiao F. A simplified energy performance assessment method for existing buildings based on energy bill disaggregation. Energy Build 2012. [17] Wang H, Xu P, Lu X, Yuan D. Methodology of comprehensive building energy performance diagnosis for large commercial buildings at multiple levels. Appl Energy 2016;169:14–27. [18] Ministry of Land. Transport and Maritime Affairs (MLTM), the act on the promotion of green buildings. Seoul: MLRMA; 2012.
[19] 2016 – Implementing the EPBD – Featuring Country Reports. Lisbon: Concerted Action EPBD; 2015.
[20] Asensio OI, Delmas MA. The effectiveness of US energy efficiency building labels. Nature Energy 2017;2.
[21] Montgomery R, Wentz TG. Putting bEQ in practice. ASHRAE J 2014;56:62–70.
[22] Department for Communities and Local Government. The Government's methodology for the production of operational ratings, Display Energy Certificates and Advisory Reports. London; 2008.
[23] Maile T, Bazjanac V, Fischer M. A method to compare simulated and measured data to assess building energy performance. Build Environ 2012;56:241–51.
[24] Crawley DB, Lawrie LK, Winkelmann FC, Buhl WF, Huang YJ, Pedersen CO, et al. EnergyPlus: creating a new-generation building energy simulation program. Energy Build 2001;33:319–31.
[25] U.S. D.O.E. Commercial Prototype Building Models; 2016.
[26] Liang X, Hong T, Shen GQ. Improving the accuracy of energy baseline models for commercial buildings with occupancy data. Appl Energy 2016;179:247–60.
[27] Zhao D, Zhong M, Zhang X, Su X. Energy consumption predicting model of VRV (variable refrigerant volume) system in office buildings based on data mining. Energy 2016;102:660–8.
[28] Sun Y, Wang S, Xiao F. Development and validation of a simplified online cooling load prediction strategy for a super high-rise building in Hong Kong. Energy Convers Manage 2013;68:20–7.
[29] Xiao F, Fan C. Data mining in building automation system for improving building operational performance. Energy Build 2014;75:109–18.
[30] Quinlan JR. Induction of decision trees. Mach Learn 1986;1:81–106.
[31] Quinlan JR. C4.5: Programs for machine learning. San Francisco, CA, USA: Morgan Kaufmann Publishers; 1993.
[32] Breiman L, Friedman JH, Olshen RA, Stone CJ. Classification and regression trees. Monterey, CA: Brooks/Cole Publishing; 1984.
[33] Hothorn T, Hornik K, Zeileis A. Unbiased recursive partitioning: a conditional inference framework. J Comput Graph Stat 2006;15:651–74.
[34] Fan C, Xiao F, Yan C. A framework for knowledge discovery in massive building automation data and its application in building diagnostics. Autom Constr 2015;50:81–90.
[35] Li G, Hu Y, Chen H, Li H, Hu M, Guo Y, et al. Data partitioning and association mining for identifying VRF energy consumption patterns under various part loads and refrigerant charge conditions. Appl Energy 2017;185:846–61.
[36] Kutner M, Nachtsheim C, Neter J, Li W. Applied linear statistical models. Irwin: McGraw-Hill; 1996.
[37] Jaccard J, Becker MA, Wood G. Pairwise multiple comparison procedures: a review. Psychol Bull 1984;96:589–96.
[38] Brown AM. A new software for carrying out one-way ANOVA post hoc tests. Comput Methods Prog Biomed 2005.
[39] Bland JM, Altman DG. Multiple significance tests: the Bonferroni method. BMJ 1995;310:170.
[40] Fan J, Hall P, Yao Q. To how many simultaneous hypothesis tests can normal, Student's t or bootstrap calibration be applied. J Am Stat Assoc 2006;102:1282–8.
[41] Tukey J. Comparing individual means in the analysis of variance. Biometrics 1949;5:99–114.
[42] Klockars AJ, Hancock GR. Scheffé's more powerful F-protected post hoc procedure. J Educ Behav Stat 2000;25.
[43] Department of Finance and Personnel. Improving the energy efficiency of our buildings. London; 2013.
[44] Department of Energy & Climate Change. DECC Display Energy Certificate (DEC): how efficiently is this building being used? London; 2011.
[45] Sustainable Energy Authority of Ireland (SEAI). Methodology for the production of display energy certificates (DEC). Dublin; 2013.
[46] Sustainable Energy Authority of Ireland (SEAI). User guide to the calculation tool for Display Energy Certificates (DEC) for public buildings. Dublin; 2013.
[47] Department of Energy & Climate Change (DECC). Exploring the use of Display Energy Certificates. London; 2013.
[48] Environment Protection Agency (EPA). ENERGY STAR score for offices in the United States. Washington, D.C.; 2014.