European Commission Guides on Evaluation

To uphold the principle of results orientation and to ensure the quality and usefulness of the evaluation function in the implementation of the European Structural and Investment Funds, the European Commission has developed an extensive set of guides and guidance documents to support the development of that function.

1. Guides on Evaluation of Cohesion Policy Funds
1.1 European Commission (2013), EVALSED Sourcebook: Method and Techniques.
1.2 European Commission (2013), EVALSED: The resource for the evaluation of Socio-Economic Development.
1.3 European Commission (2008), EVALSED: The Resource for the Evaluation of Socio-Economic Development, July.
1.4 Comissão Europeia (n.d.), EVALSED: A Avaliação do Desenvolvimento Socioeconómico – O GUIA, translated by the Observatório do QREN.
1.5 European Commission (2014), Guidance Document on Monitoring and Evaluation, Concepts and Recommendations, European Cohesion Fund, European Regional Development Fund, March.
1.6 European Commission (2018), Monitoring and Evaluation of European Cohesion Policy. Guidance Document, European Social Fund. Programming period 2014-2020, August (first version, 2015).
1.7 European Commission (2016), Annex D – Practical guidance on data collection and validation. Programming period 2014-2020, May (first version, 2015).
1.8 European Commission (2011), Outcome indicators and targets. Towards a new system of monitoring and evaluation in EU Cohesion Policy, revised version.
1.9 European Commission (2013), Guidance for the Terms of Reference for Impact Evaluations, European Regional Development Fund and Cohesion Fund, October.
1.10 European Commission (2015), Guidance Document on Evaluation Plans – Terms of Reference for Impact Evaluations – Guidance on Quality Management of External Evaluations, February.
1.11 European Commission (2015), Guidance on Evaluation of the Youth Employment Initiative, September 2015 (first version).
1.12 European Commission (2014), Guidance Document on Indicators of Public Administration Capacity Building, Employment, Social Affairs and Inclusion DG, June.
1.13 European Commission (2014), Monitoring and Evaluation of European Cohesion Policy. Guidance document on ex-ante evaluation – European Regional Development Fund. European Social Fund. Cohesion Fund, January.
1.14 European Commission (2015), Examples of EU added value. Accompanying the document Report from the Commission to the European Parliament and the Council on the evaluation of the Union’s finances based on the results achieved, June.
1.15 European Commission (2014), Guide to Cost-Benefit Analysis of Investment Projects – Economic appraisal tool for Cohesion Policy 2014-2020, December.
1.16 European Commission (2012), Evaluation of Innovation Activities. Guidance on methods and practices, Directorate-General for Regional Policy.

Note: see the European Commission guides on Theory-Based Evaluation and Counterfactual Impact Evaluation under the next tab, “Publications available online for further reading on Evaluation”.


2. Guides on Cohesion Policy Funds

2.1 European Commission (n.d.), Better Regulation Toolbox.

2.2 European Commission (n.d.), Standardized Key Terminology.

2.3 European Commission (2018), Guidance for Member States on Performance framework, review and reserve, June. (Previous version)

2.4 European Commission (2016), FAQ on data collection and data validation, ESF 2014-2020, February.


3. Documentation from European Commission training events

The European Commission’s Directorate-General for Regional Policy (DG Regio) organises an annual Summer School on evaluation for Managing Authorities; the documents from these events are presented below.

3.1 Presentations from the Summer School, Valletta, Malta, 5-7 October 2016

Day 1: Developing quality terms of reference for impact evaluation
Day 2: Howard White
       Vladimír Kváca
       Barbara Romano
Day 3: Procurement, Vladimir
       10 commandments for the good shepherd of ToR

3.2 Presentations from the Summer School, Athens, Greece, 21-23 June 2017

Evaluation case on complex interventions in Italy
Theory-based evaluation case, Oliver Schwab
Athens Seminar, 21-23 June 2017

3.3 Presentations from the Summer School, Brussels, 18 April 2018

TBE Brussels training schedule
Presentation, Elliot Stern
Results and Impacts, Elliot Stern
Design and Eval Questions, Elliot Stern
Methodological Choice, Elliot Stern
Master set, Elliot Stern
Checklists, Elliot Stern
Presentation, Marco Caliendo
The microeconometric estimation of treatment effects – An overview | Marco Caliendo
Some practical guidance for the implementation of propensity score matching

3.4 Presentations from the Summer School, Hungary, 23-25 May 2018

Balaton training programme
Support to Large Enterprises – Final Report
Support to Large Enterprises – Final Report | Executive summaries
Counterfactual Impact Evaluation of Enterprise Support Policies: an empirical application to EU Co-Sponsored, National and Regional programs, Daniele Bondonio
Are the subsidies to private capital useful? A Multiple Regression Discontinuity Design Approach, Augusto Cerqua and Guido Pellegrini
Practical information

3.5 Presentations from the Summer School, Burgas, Bulgaria, 11-12 July 2019

Practitioner’s reflections | The case of ERDF in Berlin, Oliver Schwab
Evaluation Questions: Stakeholder Needs and Evaluation Design, Elliot Stern
Evaluating whole programmes? Or delimiting the scope and scale of evaluations, Elliot Stern
How to make use of evaluation results, Elliot Stern
What is learning? Elliot Stern

Publications available online for further reading on Evaluation

These documents were accessed online in February 2020.

1. Guides on public policy evaluation

1.1 HM Treasury (2018), The Green Book: Central Government Guidance on Appraisal and Evaluation, United Kingdom, Crown Copyright.

1.2 HM Treasury (2018), The Magenta Book. Guidance for evaluation, United Kingdom, Crown Copyright.

1.3 White, Howard; Raitzer, David A. (2017), Impact Evaluation of Development Interventions, Asian Development Bank.

1.4 Gertler, Paul J.; Martinez, Sebastian; Premand, Patrick; Rawlings, Laura B.; Vermeersch, Christel M. J. (2016) Impact Evaluation in Practice, Second Edition. Washington, DC: Inter-American Development Bank and World Bank.

1.5 Stern, E. (2015), Impact Evaluation: A Guide for Commissioners and Managers, Prepared for the Big Lottery Fund, Bond, Comic Relief and the Department for International Development.

1.6 World Health Organization (2013), Evaluation Practice Handbook, Switzerland.

1.7 International Labour Office; Evaluation Unit (2013), ILO policy guidelines for results-based evaluation: Principles, rationale, planning and managing for evaluations, Geneva, ILO, 2nd ed.

1.8 White, H.D., Phillips, D.S. (2012), Addressing attribution of cause and effect in small n impact evaluations: towards an integrated framework, 3IE International Initiative for Impact Evaluation, Working Paper 15.

1.9 Scarpa, R. (2012), Guidance for the Design of Quantitative Survey-Based Evaluation.

1.10 Vanclay, F. (2012), Guidance for the design of qualitative case study evaluation, University of Groningen, February.

1.11 Community of Practice on Results Based Management (2012), Sourcebook on results based management in the European Structural Funds, Community of Practice on Results Based Management.

1.12 Centre of Excellence for Evaluation (2012), Theory-Based Approaches to Evaluation: Concepts and Practices, Canada, Centre of Excellence for Evaluation, Treasury Board of Canada Secretariat.

1.13 Kusters, Cecile; van Vugt, Simone; Wigboldus, Seerp; Williams, B.; Woodhill, A. (2011), Making evaluations matter: a practical guide for evaluators, Center for Development Innovation, Wageningen University & Research Center.

1.14 Roberts, Dawn; Khattri, Nidhi; Wessal, Arianne (2011), Writing Terms of Reference for an Evaluation: a How to Guide, IEG Independent Evaluation Group, Washington, D.C., The World Bank.

1.15 Blasco, Jaume; Casado, David (2009), Guía Práctica 5. Evaluación de Impacto, Barcelona, Institut Català d’Avaluació de Polítiques Públiques.

1.16 Mancebo, J. A. G.; Llaneza, J. L. O. (coord.) (2007), Manual de Gestión de Evaluaciones de la Cooperación Española, Madrid, Ministerio de Asuntos Exteriores y de Cooperación.

1.17 World Bank (2006), Conducting Quality Impact Evaluations Under Budget, Time and Data Constraints, Washington D.C., Independent Evaluation Group.

1.18 Perrin, Burt (2002), Implementing the vision: Addressing challenges to results focused management and budgeting, Paris, OECD.

1.19 GAO (1990), Prospective Evaluation Methods. The Prospective Evaluation Synthesis, Washington, D. C., Program Evaluation and Methodology Division.


2. Evaluation Criteria

2.1 OECD/DAC (2019), Better Criteria for Better Evaluation. Revised Evaluation Criteria Definitions and Principles for Use, Paris, OECD DAC Network on Development Evaluation (EvalNet).

2.2 OECD/DAC (n.d.), Criteria for Evaluating Development Assistance, Paris.

2.3 OECD (1991), Principles for Evaluation of Development Assistance, Paris, Development Assistance Committee.


3. Causality and research design

3.1 Befani, Barbara (2016), Choosing Appropriate Evaluation Methods. A Tool for Assessment & Selection, London, Bond.

3.2 Rogers, Patricia (2014), Overview: Strategies for Causal Attribution, Florence, UNICEF.

3.3 Pawson, Ray (2010), Causality for Beginners, first draft for commentary and criticism.

3.4 Morra Imas, Linda G.; Rist, Ray C. (2009), The Road to Results. Designing and Conducting Effective Development Evaluations, Washington D.C., The World Bank.

3.5 De Vaus, David A. (2001), Research Design in Social Research, London, SAGE Publications.


4. Theory-Based Evaluation

4.1 Theory-Based Evaluation – Method

Commission guide:

1.1 European Commission (2012), Theory-based Evaluation, Based on material produced for DG Regional Policy by Frans L. Leeuw.

Other documents:

4.1.1 Befani, Barbara; Rees, Chris; Varga, Liz; Hills, Dione (2016), Testing Contribution Claims with Bayesian Updating, Note No. 21, Winter, Surrey, CECAN Centre for the Evaluation of Complexity Across the Nexus.

4.1.2 Melloni, Erica; Pesce, Flavia; Vasilescu, Cristina (2016), “Are social mechanisms usable and useful in evaluation research?” in Evaluation, Vol. 22(2), SAGE, pp. 209–227.

4.1.3 White, Howard (2011), Theory-Based Impact Evaluation: Principles and Practice, Working Paper 3, India, 3IE International Initiative for Impact Evaluation.

4.1.4 Coryn, C.; Noakes, L.; Westine, C.; Schroeter, D. (2011), “A Systematic Review of Theory-Driven Evaluation Practice From 1990 to 2009” in American Journal of Evaluation, 32(2), June.

4.1.5 Astbury, Brad; Leeuw, Frans (2010), “Unpacking Black Boxes: Mechanisms and Theory Building in Evaluation” in American Journal of Evaluation, 31, SAGE, pp. 363-381.

4.1.6 Ben-Gal, I. (2007), “Bayesian Networks” in Ruggeri, F.; Faltin, F.; Kenett, R. (eds.), Encyclopedia of Statistics in Quality & Reliability, Wiley & Sons.

4.1.7 Stame, Nicoletta (2004), “Theory-based Evaluation and Types of Complexity” in Evaluation, 10(1), January, pp. 58-76, SAGE.


4.2 Theory of Change: design and preparatory elements (problem trees, stakeholder mobilisation, causal chains, target setting)

4.2.1 Coffey (n.d.), What is a Theory of Action?

4.2.2 INDABA Network (n.d.), Problem Tree/Objective Tree, Toolbox.

4.2.3 Bullen, Piroska Bisits (2019), “Theory of Change vs Logical Framework – what’s the difference?” in Tools4dev. Practical tools for international development.

4.2.4 Dhillon, Lovely; Vaca, Sara (2018), “Refining Theories of Change” in Journal of Multidisciplinary Evaluation, vol 14, Issue 30.

4.2.5 Davies, Rick (2018), Representing Theories of Change: Technical Challenges with Evaluation Consequences, Inception paper, CEDIL, London, UK Aid.

4.2.6 Sen, Nabanita; Kessler, Adam; Loveridge, Donna (2018), Guidelines to the DCED Standard for Results Measurement: Defining indicators of change and other information needs, DCED – The Donor Committee for Enterprise Development.

4.2.7 Zwart, R. (2017), “Strengthening the results chain: synthesis of case studies of results-based management by providers. Discussion paper” in OECD Development Policy Papers, No. 7, Paris, OECD Publishing.

4.2.8 Pasanen, Tiina; Shaxson, Louise (2016), How to Design a Monitoring and Evaluation Framework for a Policy Research Project, London, Methods Lab.

4.2.9 Laing, K.; Todd, Liz (ed.) (2015), Theory-based Methodology: Using theories of change for development, research and evaluation, Research Centre for Learning and Teaching, Newcastle University.

4.2.10 Mayne, John (2015), “Useful Theory of Change Models” in La Revue canadienne d’évaluation de programme, 30(2), Summer, pp. 119-142.

4.2.11 van Es, Marjan; Guijt, Irene; Vogel, Isabel (2015), ToC Guidelines. Theory of Change Thinking in Practice. A stepwise approach, Netherlands, Hivos People Unlimited.

4.2.12 Simister, Nigel (2015), Outputs, Outcomes and Impact. Monitoring and Evaluation Planning, Series 7, Oxford, INTRAC for Civil Society.

4.2.13 Rogers, Patricia (2014), Theory of Change, Methodological Briefs, Impact Evaluation, No. 2.

4.2.14 Taplin, Dana H.; Clark, Heléne; Collins, Eoin; Colby, David C. (2013), Theory of Change. Technical Papers. A series of papers to support development of theories of change based on practice in the field, New York, ActKnowledge – Theory to Results.

4.2.15 Floretta, John (2013), Theory of Change in Program Evaluation (slides), Monitoring & Evaluation Training Course for the Indian Economic Service, Abdul Latif Jameel Poverty Action Lab, IFMR, CLEAR Regional Centres for Learning on Evaluation and Results.

4.2.16 Roberts, Dawn; Khattri, Nidhi (2012), Designing a Results Framework for Achieving Results: a How-to Guide.

4.2.17 Stein, Danielle; Valters, Craig (2012), Understanding “Theory of Change” in international development: a review of existing knowledge, The Asia Foundation, The Justice and Security Research Programme.

4.2.18 Taplin, Dana H.; Clark, Heléne (2012), Theory of Change Basics. A primer on theory of change, New York, ActKnowledge – Theory to Results.

4.2.19 Clark, Heléne (2012), “Intervention Logic and Theories of Change: What are they, how to build them, how to use them” (slides), Community of Practice on Results Based Management, 2014 and beyond: how to ensure delivery of better and more results by the European Social Fund?, EU Conference, 5-6 November.

4.2.20 Vogel, Isabel (2012), Review of the use of “Theory of Change” in international development. Review Report, UK Department for International Development.

4.2.21 Vogel, Isabel; Stephenson, Zoe (2012), Appendix 3: Examples of Theories of Change, DFID EvD.

4.2.22 Davies, Rick (2012), Criteria for assessing the evaluability of a Theory of Change.

4.2.23 Vogel, Isabel (2012), ESPA guide to working with Theory of Change for research projects, ESPA Ecosystem Services for Poverty Alleviation Programme, LTS International, ITAD.

4.2.24 BSR (2011), Stakeholder Engagement Strategy, November, Washington DC, IEG Independent Evaluation Group, World Bank.

4.2.25 Fujita, Nobuko (2010), Beyond Logframe; Using Systems Concepts in Evaluation, Japan, FASID Foundation for Advanced Studies on International Development.

4.2.26 Overseas Development Institute (2009), “Stakeholder Analysis” in Successful Communication: A Toolkit for Researchers and Civil Society Organisations.

4.2.27 Blamey, Avril; Mackenzie, Mhairi (2007), “Theories of Change and Realistic Evaluation: Peas in a Pod or Apples and Oranges?” in Evaluation, London, SAGE.

4.2.28 Organizational Research Services (2004), Theory of Change: A Practical Tool for Action, Results and Learning, Annie E. Casey Foundation.

4.2.29 W. K. Kellogg Foundation (2004), Logic Model Development Guide, Michigan, W. K. Kellogg Foundation.

4.2.30 Canadian International Development Agency (2000), RBM Handbook on Developing Results Chains.

4.2.31 Connell, James P.; Kubisch, Anne C. (1998), “Applying a Theory of Change Approach to the Evaluation of Comprehensive Community Initiatives: Progress, Prospects, and Problems” in New Approaches to Evaluating Community Initiatives, Aspen Institute, pp. 15-44.

4.2.32 Cummings, F. Harry (1997), “Logic Models, Logical Frameworks and Results-Based Management: Contrasts and Comparisons” in Canadian Journal of Development Studies/Revue Canadienne d’Études du Développement, 18, pp. 587-596.

4.2.33 Documentation from the Theory-Based Impact Evaluation training by Elliot Stern, Lisbon 2019

Timetable

Day 1: Design approaches, methods and causal inference

Theory-Based and Impact Evaluation

Day 2: Contemporary Theory-Based Methods

Support to large enterprises

Quality Checklists

Theories of change, main ‘brands’ and a detailed example

4.3 Contribution Analysis

4.3.1 Delahais, Thomas; Toulemonde, Jacques (2017), “Making rigorous causal claims in a real-life context: Has research contributed to sustainable forest management?” in Evaluation, 23(4), October 2017, pp. 370-388.

4.3.2 Mayne, John (2012), “Contribution Analysis: Coming of Age?” in Evaluation, 18, The Tavistock Institute, SAGE, pp. 270-280.

4.3.3 Mayne, John (2001), “Addressing attribution through contribution analysis: using performance measures sensibly” in The Canadian Journal of Program Evaluation, Canadian Evaluation Society, vol. 16, No. 1, pp. 1-24.

4.4 Process Tracing

4.4.1 Oxfam GB (n.d.), Process Tracing – Draft Protocol.

4.4.2 Bennett, Andrew (2010), “Process Tracing and Causal Inference” in Henry Brady and David Collier (eds.), Reasoning with Cases in the Social Sciences, Rowman and Littlefield.

4.4.3 Reilly, Rosemary C. (2010), “Process tracing” in Mills, Albert J.; Durepos, Gabrielle; Wiebe, Elden (eds.), Encyclopedia of Case Study Research, Thousand Oaks, California, SAGE, pp. 734-736.

4.4.4 Beach, Derek; Pedersen, Rasmus (2011), What is Process-Tracing Actually Tracing? The Three Variants of Process Tracing Methods and Their Uses and Limitations, paper prepared for presentation at The American Political Science Association Annual Meeting, Seattle, Washington, September 1-4.

4.4.5 Collier, David (2011), “Understanding Process Tracing” in PS: Political Science and Politics, 44(4), Cambridge University Press, October, pp. 823-830.

4.4.6 Beach, Derek; Pedersen, Rasmus (2012), “Case Selection Techniques in Process-Tracing and the Implications of Taking the Study of Causal Mechanisms Seriously”, SSRN Electronic Journal.

4.4.7 Beach, Derek (2012), Process Tracing Methods – an introduction (slides), PhD workshop, Department of Political Science, University of Aarhus, Denmark.

4.4.8 Beach, Derek; Pedersen, Rasmus (2013), Process-Tracing Methods: Foundations and Guidelines, The University of Michigan Press.

4.4.9 Beach, Derek; Pedersen, Rasmus (2015), “Applying Process Tracing in Five Steps”, Centre for Development Impact, CDI Practice Paper 10, Annex, Brighton, IDS Institute of Development Studies.

4.4.10 Punton, M.; Welle, K. (2015), Straws-in-the-wind, Hoops and Smoking Guns: What can Process Tracing Offer to Impact Evaluation?, CDI Practice Paper 10, Brighton, IDS Institute of Development Studies.

4.4.11 Schmitt, Johannes; Beach, Derek (2015), “The contribution of process tracing to theory-based evaluations of complex aid instruments” in Evaluation, 21.

4.4.12 Beach, Derek; Pedersen, Rasmus Brun (2017), “How do I know mechanistic evidence when I see it? A four-step procedure for tracing causal mechanisms in case study research”, paper prepared for the 2017 American Political Science Association annual meeting, San Francisco, CA, August 31-September 2.


4.5 Qualitative Comparative Analysis

4.5.1 Ragin, Charles C. (n.d.), Qualitative Comparative Analysis and Fuzzy Sets (slides).

4.5.2 Kane, H.; Hinnant, L.; Day, K.; Council, M.; Tzeng, J.; Soler, R.; Chambard, M.; Roussel, A.; Heirendt, W. (2017), “Pathways to Program Success: A Qualitative Comparative Analysis (QCA) of Communities Putting Prevention to Work Case Study Programs” in Journal of Public Health Management and Practice, JPHMP, 23(2), pp. 104–111.

4.5.3 Byrne, David (2016), Qualitative Comparative Analysis: a pragmatic method for evaluating intervention, CECAN Centre for the Evaluation of Complexity Across the Nexus, No. 1, Autumn, Surrey, pp. 1-4.

4.5.4 Befani, Barbara (2016), Pathways to change: Evaluating development interventions with qualitative comparative analysis (QCA), Report for the Expert Group for Aid Studies – EBA, Report 05/16, Stockholm, EBA.

4.5.5 Schatz, Florian; Welle, Katharina (2016), Qualitative Comparative Analysis: A Valuable Approach to Add to the Evaluator’s Toolbox? Lessons from Recent Applications, CDI Practice Paper 13, Brighton, IDS Institute of Development Studies.

4.5.6 Baptist, Carrie; Befani, Barbara (2015), Qualitative Comparative Analysis – A Rigorous Qualitative Method for Assessing Impact, Coffey How To.

4.5.7 Thiem, Alrik (2014), “Unifying Configurational Comparative Methods: Generalized-Set Qualitative Comparative Analysis” in Sociological Methods & Research, 43(2), pp. 313-337.

4.5.8 Befani, Barbara (2013), “Between complexity and generalization: Addressing evaluation challenges with QCA” in Evaluation, 19, pp. 269-283.

4.5.9 Ragin, Charles C. (2008), Redesigning Social Inquiry (slides).

4.6 Realist Evaluation

4.6.1 Westhorp, Gill (2014), “Realist impact evaluation: an introduction”, Methods Lab, London, Overseas Development Institute.

4.6.2 Wong, Geoff; Greenhalgh, Trisha; Westhorp, Gill; Buckingham, Jeanette; Pawson, Ray (2013), RAMESES publication standards: Realist syntheses, BMC Medicine.

4.6.3 Westhorp, Gill; Prins, Ester; Kusters, Cecile; Hultink, Mirte; Guijt, Irene; Brouwers, Jan (2011), Realist Evaluation: an overview. Report from an Expert Seminar with Dr. Gill Westhorp, Wageningen UR Centre for Development Innovation.

4.6.4 Pawson, Ray; Tilley, Nick (2004), Realist Evaluation, British Cabinet Office.

4.7 Congruence Analysis

4.7.1 Blatter, Joachim (2012), Innovations in Case Study Methodology: Congruence Analysis and the Relevance of Crucial Cases, slightly modified version of a paper presented at the Annual Meeting of the Swiss Political Science Association, Lucerne, 2-3 February.

4.7.2 Blatter, Joachim; Blume, Till (2008), “In Search of Co‐variance, Causal Mechanisms or Congruence? Towards a Plural Understanding of Case Studies” in Swiss Political Science Review, 14(2), pp. 312-356. (Available free of charge upon registration.)

4.8 Process Evaluation

4.8.1 Renger, Ralph; Bartel, Gabrielle; Foltysova, Jirina (2013), “The reciprocal relationship between implementation theory and program theory” in The Canadian Journal of Program Evaluation, vol. 28, No. 1, pp. 27-41.

4.8.2 Peters, Jane S.; McRae, Marjorie (2009), Process Evaluation Insights on Program Implementation, CIEE, February.

4.9 Combining Approaches

4.9.1 Contribution Tracing

4.9.1.1 Befani, Barbara; Stedman-Bryce, Gavin (2016), “Process Tracing and Bayesian updating for impact evaluation” in Evaluation, SAGE, pp. 1-19.

4.9.1.2 Befani, Barbara; Mayne, John (2013), “Process Tracing and Contribution Analysis: A Combined Approach to Generative Causal Inference for Impact Evaluation” in IDS Bulletin, 45(6), Institute of Development Studies, pp. 17-36.


4.9.2 Theory of Change and other approaches

4.9.2.1 Bamanyaki, Patricia A.; Holvoet, Nathalie (2016), “Integrating theory-based evaluation and process tracing in the evaluation of civil society gender budget initiatives” in Evaluation, London, SAGE, pp. 72-90.

4.9.2.2 Wauters, Benedict (2018), “Process tracing and congruence analysis to support theory-based impact evaluation” in Evaluation, 24(3), SAGE, pp. 284-305.

4.9.2.3 Wauters, Benedict (2016), Process tracing and congruence analysis to support theory based impact evaluation, paper presented at the 12th European Evaluation Society Biennial Conference, Evaluation Futures in Europe and beyond: Connectivity, Innovation and Use, 29 September, Maastricht, Netherlands (draft 31/8/2016).


5. Counterfactual Analysis

European Commission guides:

5.1 European Commission (2020), How to use administrative data for European Social Funds counterfactual impact evaluations. A step-by-step guide for managing authorities, May.

5.2 European Commission (2019), Advanced counterfactual evaluation methods. Guidance document, Directorate-General for Employment, Social Affairs and Inclusion.

5.3 European Commission (2013), Design and Commissioning of counterfactual impact evaluations, October.

Other documents:

5.4 Athey, Susan; Imbens, Guido W. (2017), “The State of Applied Econometrics: Causality and Policy Evaluation” in Journal of Economic Perspectives, Vol. 31(2), Spring 2017, pp. 3-32.

5.5 Born, Benjamin; Müller, Gernot; Schularick, Moritz; Sedlacek, Petr (2017), The Economic Consequences of the Brexit Vote, Discussion Paper No. 1738, Centre for Macroeconomics (CFM).

5.6 Słoczyński, Tymon; Wooldridge, Jeffrey M. (2014), A General Double Robustness Result for Estimating Average Treatment Effects, March 2014, IZA DP No. 8084, Institute for the Study of Labor.

5.7 Michalek, Jerzy (2012), Counterfactual impact evaluation of EU rural development programmes – Propensity Score Matching methodology applied to selected EU Member States, vol. 1: A micro level approach, Seville, European Commission, Joint Research Centre.

5.8 Haynes, Laura; Service, Owain; Goldacre, Ben; Torgerson, David (2012), Test, Learn, Adapt: Developing Public Policy with Randomised Controlled Trials, Cabinet Office Behavioural Insights Team.

5.9 Iacus, Stefano M.; King, Gary; Porro, Giuseppe (2011), “Causal Inference without Balance Checking: Coarsened Exact Matching” in Political Analysis.

5.10 Khandker, Shahidur R.; Koolwal, Gayatri; Samad, Hussain (2010), Handbook on Impact Evaluation. Quantitative Methods and Practices, Washington DC, World Bank.

5.11 Bondonio, Daniele (2009), “Impact identification strategies for evaluating business incentive programs” in EconPapers.

5.12 Abadie, Alberto; Imbens, Guido W. (2008), “Large Sample Properties of Matching Estimators for Average Treatment Effects” in Econometrica, vol. 74(1), January 2006, pp. 235-267.

5.13 Angrist, Joshua D.; Pischke, Jörn-Steffen (2008), Mostly Harmless Econometrics: An Empiricist’s Companion, Princeton University Press.

5.14 Ho, Daniel E.; Imai, Kosuke; King, Gary; Stuart, Elizabeth A. (2007), “Matching as Nonparametric Preprocessing for Reducing Model Dependence in Parametric Causal Inference” in Political Analysis, 15, pp. 199-236.

5.15 Caliendo, Marco (2008), Some Practical Guidance for the Implementation of Propensity Score Matching, Discussion Paper Series, IZA, Bonn, Forschungsinstitut zur Zukunft der Arbeit – Institute for the Study of Labor.

5.16 Caliendo, Marco; Hujer, Reinhard (2006), “The Microeconometric Estimation of Treatment Effects – An Overview” in Allgemeines Statistisches Archiv, 90, pp. 199-215.

5.17 Duflo, Esther; Glennerster, Rachel (2006), “Using Randomization in Development Economics Research: A Toolkit” in NBER Technical Working Paper Series.


6. Evaluations: practical applications

6.1 Olejniczak, K.; Kozak, M.; Bienias, Stanisław (2011), Evaluating the effects of regional interventions. A look beyond current Structural Funds’ practice, November, Warsaw.

6.2 Department for Business Innovation & Skills (2011), Guidance on Evaluating the Impact of Interventions on Business, August.


7. Information collection and processing techniques

7.1 Case Studies

7.1.1 Gerring, John (2017), Case Study Research. Principles and Practices, 2nd edition, Cambridge, Cambridge University Press.

7.1.2 Seawright, Jason; Gerring, John (2008), “Case Selection Techniques in Case Study Research. A Menu of Qualitative and Quantitative Options” in Political Research Quarterly, vol. 61(2), June, pp. 294-308.

7.1.3 Yin, Robert (2003), Case Study Research. Design and Methods, USA, SAGE Publications.


7.2 Questionnaire Surveys

7.2.1 ESF Sample Calculator.

7.2.2 Chyung, Seung; Swanson, Ieva; Roberts, Katherine; Hankinson, Andrea (2018), “Evidence‐Based Survey Design: The Use of Continuous Rating Scales in Surveys” in Performance Improvement, 57, pp. 38-48.

7.2.3 Saris, Willem E.; Revilla, Melanie; Krosnick, Jon A.; Shaeffer, Eric M. (2010), “Comparing Questions with Agree/Disagree Response Options to Questions with Item-Specific Response Options” in Survey Research Methods, Vol. 4, No. 1, European Survey Research Association, pp. 61-79.

7.2.4 Willis, Gordon B.; Lessler, Judith T. (1999), Question Appraisal System QAS-99, Rockville, Research Triangle Institute.

7.2.5 Czaja, Ronald; Blair, Johnny (2005), Designing surveys: a guide to decisions and procedures, 2nd ed., Thousand Oaks, California, Pine Forge Press. | Available online for a fee.

7.2.6 Groves, Robert M.; Fowler Jr., Floyd J.; Couper, Mick P.; Lepkowski, James; Singer, Eleanor; Tourangeau, Roger (2004), Survey Methodology, New Jersey, Wiley Interscience.

7.2.7 Bryman, Alan; Cramer, Duncan (2003), Análise de dados em Ciências Sociais. Introdução às Técnicas Utilizando o SPSS para Windows, 3rd ed., Oeiras, Celta Editora. | Available online for a fee.

7.2.8 De Vaus, David (2014), Surveys in Social Research, 6th edition, New York, Routledge. | Available online for a fee.

7.2.9 Coffey, Amanda; Tannock, Stuart; Heley, Jesse; Mann, Robin; Patterson, Corinna; Plows, Alexandra; Woods, Mike (2012), Anonymisation in Social Research, Social Research Wiserd, Briefing Series.

7.2.10 Dillman, D. A.; Smyth, Jolene D.; Christian, L. M. (2008), Internet, Mail, and Mixed-Mode Surveys: The Tailored Design Method, New Jersey, Wiley and Sons. | Available online for a fee.


7.3 Participatory Information Collection Techniques

7.3.1 Taplin, Dana H.; Rasic, Muamer (2012), Facilitator’s Source Book. Sourcebook for Facilitators Leading Theory of Change Development Sessions, March, New York, ActKnowledge.

7.3.2 Jost, Christine; Alvarez, Sophie; Schuetz, Tonya (2014), Theory of Change Facilitation Guide, Version 2, Denmark, Climate Change, Agriculture and Food Security.


8. Evidence-Based Policy

8.1 Puttick, Ruth (2011), Using Evidence to Improve Social Policy and Practice. Perspectives on how research and evidence can influence decision making, NESTA Making Innovation Flourish, October, United Kingdom, Alliance for Useful Evidence.

8.2 NESTA (2011), Evidence for Social Policy and Practice. Perspectives on how research and evidence can influence decision making in public services, NESTA Making Innovation Flourish, October, United Kingdom, Alliance for Useful Evidence.

8.3 Bamberger, Michael; Kirk, Angeli (eds.) (2009), Making smart policy: using impact evaluation for policy making – Case Studies on Evaluations that Influenced Policy, Doing Impact Evaluation No. 14, Poverty Reduction and Economic Management, June, Washington, DC, World Bank.


9. Other Topics

9.1 Estratégia de Avaliação do Portugal 2020 – Uma perspetiva segundo a Teoria da Mudança | Caderno Temático n.º 3 – AD&C

9.2 Caso Pedagógico sobre a Avaliação de Políticas Públicas – Uma rede para melhores políticas | IPPS-ISCTE