
STATE OF CALIFORNIA

PUBLIC UTILITIES COMMISSION


505 Van Ness Avenue

San Francisco, California 94102







APRIL 2006

Prepared for the California Public Utilities Commission


by The TecMarket Works Team
Under Contract with and Directed by the CPUC’s Energy Division, and with guidance from Joint Staff





California Energy Efficiency Evaluation Protocols: Technical, Methodological, and Reporting Requirements for Evaluation Professionals

{a.k.a. Evaluators’ Protocols}
Prepared under the direction of the Energy Division, with guidance from the Joint Staff, for the

California Public Utilities Commission



APRIL 2006

Submitted by


Nick Hall, Johna Roth, Carmen Best

TecMarket Works

TecMarket Business Center

165 West Netherwood Road, Second Floor, Suite A

Oregon, WI 53575


NPHall@TecMarket.net

(608) 835-8855

And subcontractors
Sharyn Barata

Opinion Dynamics, Irvine, California


Pete Jacobs

BuildingMetrics, Inc., Boulder, Colorado


Ken Keating, Ph.D.

Energy Program Consultant, Portland, Oregon


Steve Kromer

RCx Services, Oakland, California


Lori Megdal, Ph.D.

Megdal & Associates, Acton, Massachusetts


Jane Peters, Ph.D.

Research Into Action, Portland, Oregon


Richard Ridge, Ph.D.

Ridge & Associates, Alameda, California


Francis Trottier

Francis Trottier Consulting, Penn Valley, California


Ed Vine, Ph.D.

Energy Program Consultant, Berkeley, California










Acknowledgements

The authors wish to acknowledge and express their appreciation to the many individuals who contributed to the development of the California Evaluation Protocols. Without the support and assistance of these individuals, this effort would not have been possible.

The Joint Staff (California Public Utilities Commission and the California Energy Commission) provided considerable Protocol development guidance and conducted multiple rounds of reviews of all sections of the Protocols. These individuals and their affiliations are the following:

Appreciation is also extended to the Administrative Law Judge, Meg Gottstein, who ordered the development of the Protocols and who provided instructive guidance and policy direction along the way.


In addition to the oversight and guidance provided by the above individuals, others within the Energy Division of the California Public Utilities Commission and the California Energy Commission provided valuable contributions and support. For these efforts we thank the following individuals:

  • Nancy Jenkins, California Energy Commission

  • Tim Drew, Energy Division, California Public Utilities Commission

  • Zenaida Tapawan-Conway, Energy Division, California Public Utilities Commission

  • Peter Lai, Energy Division, California Public Utilities Commission

  • Nora Gatchalian, Energy Division, California Public Utilities Commission

  • Jeorge Tagnipes, Energy Division, California Public Utilities Commission

  • Sylvia Bender, California Energy Commission

We also wish to thank the California investor-owned utilities and their program management and evaluation staff, who attended workshops and provided both written and verbal comments during the Protocol development process, as well as the public representatives who attended the workshops and provided verbal and written comments. All of these combined efforts helped move the development of the Protocols to a successful completion in a very short period of time.

Lastly, we wish to thank the TecMarket Works Protocol Project Team, which, under the direction of the ALJ and the Joint Staff and with useful comments from the IOUs and the public, took the Protocols from concept to completion.

This team was made up of the following individuals:



  • Nick Hall, Johna Roth, Carmen Best, TecMarket Works

  • Sharyn Barata, Opinion Dynamics Corporation

  • Pete Jacobs, Building Metrics Inc.

  • Ken Keating, Ken Keating and Associates

  • Steve Kromer, RCx Services

  • Lori Megdal, Megdal & Associates

  • Jane Peters, and Marjorie McRae, Research Into Action

  • Rick Ridge, Richard Ridge and Associates

  • Ed Vine, Edward Vine and Associates

  • Francis Trottier, Francis Trottier Consulting



Table of Contents

PUBLIC UTILITIES COMMISSION 1

Acknowledgements i

Table of Tables ix

Table of Figures xi

California Energy Efficiency Evaluation Protocols: Technical, Methodological and Reporting Requirements for Evaluation Professionals {a.k.a. Evaluators’ Protocols} 1

Introduction 1

How The Protocols Were Developed 5

How the Protocols Work Together 6

How the Protocols Meet CPUC Goals 7

Use of the Evaluation Results to Document Energy Savings and Demand Impacts 9

The Evaluation Identification and Planning Process 9

Evaluation Rigor and Budgets 13


Impact Evaluations 13

Process Evaluations 14

Market Effects Evaluations 14

Codes and Standards and Emerging Technology Program Evaluations 14

Evaluation Budgets 15

Recommendations for Using the Protocols 15

The Detailed Evaluation Work Plan 15

Confidentiality Issues 17

Contacting the Customer 18

Impact Evaluation Protocol 19

Introduction 19

Audience and Responsible Actors 21

Overview of the Protocol 21

Protocol Types 21

Rigor 22

Key Metrics, Inputs and Outputs 23

Energy and Demand Impact Protocols 25



Gross Energy Impact Protocol 25

Gross Demand Impact Protocol 32

Participant Net Impact Protocol 36

Indirect Impact Evaluation Protocol 40

Guidance on Skills Required to Conduct Impact Evaluations 45

Summary of Protocol-Driven Impact Evaluation Activities 46

Measurement and Verification (M&V) Protocol 49

Introduction 49

Audience and Responsible Actors 50

Overview of the Protocol 50


M&V Framework & Language 51

Relationship of the M&V Protocol to Other Protocols 51

Key Metrics, Inputs and Outputs 52

Site-Specific M&V Plan 54

M&V Rigor Levels 56

Measure Installation Verification 56



Measure Existence 56

Installation Quality 56

Correct Operation and Potential to Generate Savings 57

M&V Protocol for Basic Level of Rigor 57



IPMVP Option 57

Sources of Stipulated Data 58

Baseline Definition 58

Monitoring Strategy and Duration 58

Weather Adjustments 58

M&V Protocol for Enhanced Level of Rigor 58



IPMVP Option 59

Sources of Stipulated Data 59

Baseline Definition 60

Monitoring Strategy and Duration 60

Weather Adjustments 60

Calibration Targets 60

Additional Provisions 61

M&V Approach Examples 61

Overall Results Reporting 62

Sampling Strategies 62

Skills Required for M&V 62

Summary of Protocol-Driven M&V Activities 63

Emerging Technologies Protocol 65

Introduction 65


Audience and Responsible Actors 66

Key Metrics, Inputs, and Outputs 66

Evaluation Planning 67

A Sample of Available ETP Evaluation Methods 68

Protocols Requirements 69

Integration of Results 75

Reporting of Results 75

Summary 76

Summary of Protocol-Driven Emerging Technology Evaluation Activities 76

References 77

Codes and Standards and Compliance Enhancement Evaluation Protocol 81

Introduction 81

Audience and Responsible Actors 82

Key Inputs and Outputs 83

Evaluation Methods 84

Evaluation Planning 84

Technology-Specific Code and Standard Change Theory 85

Evaluation Approach 88

Summary 98

Summary of Protocol-Driven Codes and Standards Evaluation Activities 99

Code Compliance Enhancement Programs 100



Definition of a Code Compliance Enhancement Program 100

What this Protocol is Designed To Do 100

Joint Staff Responsibilities 100

Draft Evaluation Plan 101



Program Theory Review and Assessment 101

Pre-Program Compliance Rate 101

Post-Program Compliance Rate 102

Adjustment For Naturally Occurring Compliance Change 102

Net Program-Induced Compliance Change 102

Assessment of Energy Savings 103

Recommendations for Program Changes 103

Cost Effectiveness Assessment 103

Reporting of Evaluation Results 103

References 104

Effective Useful Life Evaluation Protocol (Retention and Degradation) 105

Introduction 105

Audience and Responsible Actors 107

Overview of the Protocol 108

Protocol Types 108

Rigor 109

Key Metrics, Inputs, and Outputs 110

Retention Study Protocol 111

Degradation Study Protocol 116

Effective Useful Life Analysis Protocol 119

Reporting Requirements 126

Study Selection and Timing 127

Guidance on Skills Required to Conduct Retention, EUL, and Technical Degradation Evaluations 129


Summary of Protocol-Driven Impact Evaluation Activities 130

Process Evaluation Protocol 131

Introduction 131

Audience and Responsible Actors 132

Overview of the Protocol 132

Process Evaluation Planning 133

Annual Process Evaluation Planning Meeting 134

Recommendations for Change 135

Key Issues and Information Covered 135

Process Evaluation Efforts 136



Program-Specific Process Evaluation Plans 136

Data Collection and Assessment Efforts 137

Conducting Investigative Efforts 138

Independence 140

Selection of Evaluation Contractors 140

Skills Required for Conducting Process Evaluations 140

Market Effects Evaluation Protocol 142

Introduction 142



Overview of the Market Effects Protocol 142

The Market Effects Protocol and Other Protocols 145

Key Market Effects Inputs and Outputs 146

Audience and Responsible Actors 147

Steps in Conducting Market Effects Evaluations 147

Scoping Study 148


Collection of Baseline and Longitudinal Indicators 153

Analysis of Market Effects 154

Reporting 157

Guidance on Skills Required to Conduct Market Effects Evaluations 158

Considerations for Conducting a Market Effects Evaluation 158

Summary of Protocol-Driven Market Effects Evaluation Activities 159

Sampling and Uncertainty Protocol 162

Introduction 162

Precision: Gross and Net Impact, Measurement and Verification, and Verification Activities 163

Development of the Evaluation Study Work Plan 166

Process Evaluations 168

Market Effects 169

System Learning 169

Acceptable Sampling Methods 170

Skills Required for Sampling & Uncertainty 170

Audience and Responsible Actors 170

Key Metrics and Information Covered 171

Sample Size and Precision 171

Validity and Research Design 171

Accuracy 172

Summary of Sampling and Uncertainty Protocol 174

Evaluation Reporting Protocol 176

Introduction 176

Report Delivery Dates 176


Common Evaluation Reporting Requirements 176

Evaluation Type Specific Reporting Requirements 182

Sample Reporting Tables 195

Evaluation Support Information Needed From Administrators 204

Storage and Disposal of Customer Information Used in the Evaluation 210

APPENDIX A. Measure-Level M&V Results Reporting Requirements 212

APPENDIX B. Glossary of Terms 216

APPENDIX C. Performance Basis Metrics 246

Appendix D. A Primer for Using Power Analysis to Determine Sample Sizes 248

Basics of Power Analysis and the Protocols 249

Example of Varying Parameters and Estimating Required Sample Size for Survival Analysis through Power Analysis 250

References 253



Appendix E. Summary Tables for All Protocols 254

Summary of Protocol-Driven Impact Evaluation Activities 254

Summary of Protocol-Driven M&V Activities 261

Emerging Technology 263

Codes and Standards 264

Effective Useful Life 266

Summary of Protocol-Driven Market Effects Evaluation Activities 269

Sampling and Uncertainty 273



