Evaluation Essentials, First Edition

From A to Z
Book | Softcover
260 pages
2010
Guilford Publications (publisher)
978-1-60623-898-1 (ISBN)
CHF 59.30 incl. VAT
A newer edition of this title is available.
Written in a refreshing conversational style, this text thoroughly prepares students, program administrators, and new evaluators to conduct evaluations or to use them in their work. The book's question-driven focus and clear discussions about the importance of fostering evaluation use by building collaborative relationships with stakeholders set it apart from other available texts. In 26 concise sections, Marvin C. Alkin explores how to articulate answerable evaluation questions, collect and analyze data using both quantitative and qualitative methods, and deal with contingencies that might alter the traditional sequence of an evaluation. Student-friendly features include handy bulleted recaps of each section, "Thinking Ahead" and "Next Steps" pointers, cautionary notes, annotated suggestions for further reading, and an in-depth case study that provides the basis for end-of-chapter exercises.

Marvin C. Alkin is Emeritus Professor in the Social Research Methodology Division of the Graduate School of Education and Information Studies at the University of California, Los Angeles. He has been a member of the UCLA faculty since 1964 and has, at various times, served as Chair of the Education Department and Associate Dean of the school. Dr. Alkin was one of the founders of the Center for the Study of Evaluation and was its Director for 7 years. The Center, established in 1966 by the U.S. government to engage in research on appropriate methods for conducting evaluation of educational programs, continues to be an integral part of the UCLA Graduate School of Education and Information Studies. Dr. Alkin is a leading authority in the field of evaluation. He has published important research studies on the use of evaluation information in decision making and on comparative evaluation theory. His publications include five books and more than 150 journal articles, book chapters, monographs, and technical reports. Dr. Alkin is currently Associate Editor of Studies in Educational Evaluation and co-section Editor of the American Journal of Evaluation. He has been a consultant to numerous national governments and has directed program evaluations in 14 different countries.

1. Overview. 2. Structure.
A. What Is Evaluation? 3. Professional Program Evaluation. 4. Evaluation and Research. 5. Evaluation Definition. 6. A Confusion of Terms. 7. Evaluation Purposes.
B. Why Do Evaluations? 8. Making Decisions. 9. Issues for Professional Evaluation.
Time-Out: The RUPAS Case. 10. The Rural Parents’ Support Program (RUPAS): A Community Well-Being Case Study, Nicole Eisenberg.
C. Who Does Evaluations? 11. Evaluator Settings. 12. Multiple Orientations to Doing Evaluation. 13. My View.
D. Who Are the Stakeholders for an Evaluation? 14. Stakeholders—Not Audience. 15. Who Are the Stakeholders? 16. Focus on Primary Stakeholders. 17. Differences in Stakeholder Participation.
E. How Are Positive Stakeholder Relationships Maintained? 18. Gaining RTC (Respect, Trust, Credibility). 19. Concluding Note.
F. What Is the Organizational, Social, and Political Context? 20. Organizational Context. 21. Social Context. 22. Political Context. 23. My Advice. 24. Thinking Ahead.
G. How Do You Describe the Program? 25. Program Components. 26. Program Size and Organizational Location. 27. Program Differences. 28. What Should We Know about Programs? 29. Learning about the Program.
H. How Do You "Understand" the Program? 30. Theory of Action. 31. Logic Models. 32. Why Is This Important? 33. What Does a Logic Model Look Like? 34. A Partial Logic Model. 35. Getting Started.
I. What Are the Questions/Issues to Be Addressed? 36. Kinds of Evaluation Questions. 37. Getting Started on Defining Questions. 38. Some Next Steps.
J. Who Provides Data? 39. Again, Be Clear on the Questions. 40. Focus of the Data. 41. Selecting Individuals.
K. What Are Instruments for Collecting Quantitative Data? 42. Instruments for Attaining Quantitative Data. 43. Acquisition of Data. 44. Existing Data. 45. Finding Existing Instruments. 46. Developing New Instruments. 47. Questionnaire Construction. 48. Measuring Achievement. 49. Achievement Test Construction. 50. Observation Protocols.
L. What Are Instruments for Collecting Qualitative Data? 51. Developing New Instruments. 52. Observations. 53. Interviews and Focus Groups. 54. Surveys and Questionnaires.
M. What Are the Logistics of Data Collection? 55. Gaining Data Access. 56. Collecting Data. 57. Quality of Data. 58. Understanding the Organization’s Viewpoints. 59. My Advice.
N. Are the Questions Evaluable (Able to Be Evaluated)? 60. Stage of the Program. 61. Resources. 62. Nature of the Question. 63. Establishing Standards. 64. Technical Issues. 65. Ethical Issues. 66. Political Feasibility. 67. My Advice.
O. What Is the Evaluation Plan (Process Measures)? 68. The Evaluation Design. 69. Process Measures. 70. Program Elements. 71. Program Mechanisms. 72. Question.
P. What Is the Evaluation Plan (Outcome Measures)? 73. An Exercise to Assist Us. 74. Toward Stronger Causal Models. 75. Descriptive Designs. 76. Mixed Methods. 77. Summary. 78. My Advice.
Q. What Is the Evaluation Plan (Procedures and Agreements)? 79. Evaluation Activities: Past, Present, and Upcoming. 80. The Written Evaluation Plan. 81. The Contract. 82. My Advice. 83.
R. How Are Quantitative Data Analyzed? 84. Types of Data. 85. A First Acquaintance with the Data. 86. Measures of Central Tendency. 87. Measures of Variability. 89. Getting Further Acquainted with the Data. 90. Descriptive and Inferential Statistics. 91. Are the Results Significant? 92. Appropriate Statistical Techniques. 93. My Warning.
S. How Are Qualitative Data Analyzed? 94. Refining the Data. 95. Testing the Validity of the Analysis.
T. How Do Analyzed Data Answer Questions? 96. Difficulties in Valuing. 97. Valuing in a Formative Context. 98. "Valuing" Redefined. 99. A Final Note.
U. How Are Evaluation Results Reported? 100. Communication. 101. Reporting. 102. The Final Written Report. 103. Nature and Quality of Writing.
V. What Is the Evaluator’s Role in Helping Evaluations to Be Used? 104. A Word about "Use." 105. What Is Use? 106. What Can You Do? 107. Guard against Misuse.
W. How Are Evaluations Managed? 108. Acquiring the Evaluation. 109. Contract/Agreement. 110. Budget. 111. Operational Management.
X. What Are the Evaluation Standards and Codes of Behavior? 112. Judging an Evaluation. 113. The Program Evaluation Standards. 114. American Evaluation Association Guiding Principles.
Y. How Are Costs Analyzed? 115. Cost-Effectiveness Analysis. 116. Cost-Benefit Analysis. 117. Cost-Utility Analysis. 118. And Now to Costs. 119. How to Determine Cost.
Z. How Can You Embark on a Program to Learn More about Evaluation? 120. Getting Feedback on Evaluation. 121. Taking Full Advantage of This Volume. 122. Gaining Evaluation Expertise Beyond This Volume.
Appendix. 123. An Evaluation Lesson, "Unknown Student."

Publication date (per publisher) November 8, 2010
Place of publication New York
Language English
Dimensions 156 x 234 mm
Weight 382 g
Subject areas Humanities / Psychology; Social Sciences / Education
ISBN-10 1-60623-898-1 / 1606238981
ISBN-13 978-1-60623-898-1 / 9781606238981
Condition New