From the web page http://www.ncddr.org/dsi/index.html

National Center for the Dissemination of Disability Research

Dissemination Self-Inventory

What is the Dissemination Self-Inventory?

AN INTRODUCTION

This self-inventory has been developed to assist National Institute on Disability and Rehabilitation Research (NIDRR) funded project staff in reviewing their dissemination practices. It is designed to assist project staff who want the best dissemination strategies possible but have limited time and resources.

Dissemination is a process that has measurable components. The dissemination process should create anticipated and planned-for results or impact. In most cases, dissemination can be concretely measured; however, the timeframe for measurement may require a prolonged period, depending upon the exact nature of the desired impact. Many times, the evaluation of dissemination never goes beyond "bean counting" of numbers of pieces of paper, numbers of contacts, or numbers of responses. This may occur because the "beans" are the easiest and most immediate measures available. Beans should be counted and can provide some valuable information. However, assessing your NIDRR project's impact on intended audiences requires a broader analysis than simple "bean counting" exercises.

Why a Self-Inventory?

The Spirit of Improvement

The National Center for the Dissemination of Disability Research (NCDDR) believes that no matter how good a dissemination effort may be, it can still be improved. This is the spirit with which this self-inventory instrument was developed. It is designed to aid the principal investigator in analyzing the relative strengths and weaknesses of current, actual dissemination strategies.

The details of dissemination plans frequently differ from project to project. Utilization -- the use of information you disseminate -- should be the goal of your dissemination strategy. How to achieve this depends on your topical area, your identified target audience(s), your environment and its resources, and the resources available through your grant funding. Today, our challenge is to be more effective -- to produce more impact -- with the same or fewer resources. This is not easy, but with a spirit of improvement and the recognition that trying new strategies can be a very positive experience, it can be beneficially addressed. This self-inventory may provide some ideas for addressing this challenge.

The Three Versions of the Self-Inventory

Version One

This is the first of three versions of this self-inventory. The content for this first-generation inventory is taken from the literature concerning dissemination, knowledge utilization, and the change process. This initial basis for the self-inventory is sound and provides "safe footing" in appraising dissemination practices and in designing new project approaches to increase impact.

Version Two

The second version of this self-inventory will include the results of a benchmarking effort to identify the "best examples" of dissemination and knowledge utilization. These benchmarking examples will not be drawn from the ranks of NIDRR grantees but rather will come from business and industry. The rationale for looking for examples in business and industry is that identifying new strategies will require us to look in new places.
Recognizing that NIDRR grant activity differs significantly from many business and industry efforts, care will be taken to identify models that are most applicable in highlighting new practices with great potential for working in human service settings such as those reflected by most NIDRR grantees.

Version Three

The third version of this self-inventory will link research and benchmarking practices with special knowledge, drawn from both research and benchmarking, concerning issues of multiculturalism and underservice. Benchmarking activity will identify "best practices" from public relations and marketing firms specializing in campaigns that successfully reach minority persons and persons with disabilities. This version will add new research and benchmarking examples that are aimed at increasing exposure to and use of NIDRR grantee results by minority persons with disabilities and their families.

Self-Inventory Form: Relative Dissemination Strengths and Weaknesses

Version One

Instructions: Start by printing the Self-Inventory from your browser and filling it out at your convenience. Read through the following items for the five areas listed below. Score your response to each item on the scale provided. You may want to think in terms of your project's general dissemination efforts, or you may want to focus on dissemination strategies that you are using for a specific set of research results. Either is acceptable, but you should be consistent in your perspective.

Dissemination is influenced by project staff knowledge and activities in five major areas:

* User Group - the user group(s) or potential users of the information or product to be disseminated
* Information Source - your project/organization as an information source, that is, the agency, organization, or individual responsible for creating new knowledge or products, and/or for conducting dissemination activities
* Content of Message - the message content that is disseminated, such as the new knowledge or product itself, and any supporting information or materials
* Medium of the Message - the ways in which the knowledge or product is described, "packaged," and transmitted
* Contextual Consideration for Implementation - the context for use of the message, that is, the environmental, personal, and other supports needed to use the information or product

When you finish, score your responses according to the scoring directions at the end of the inventory. (See example below.) Your scores can be transferred to the graph outline provided to suggest visually relative strengths and weaknesses across the five major categories: user groups, information source, content, medium, and context.

Please e-mail us if you have any questions concerning the Self-Inventory. While this inventory is just that, an inventory, it has been constructed to have instructional value. Chart your results on the graph that is provided in PDF format, and see "Thinking about Your Results" for some ideas for improving your dissemination practice related to each area. If you would like to read about a certain category, click on the References section, where you will find citations for the research serving as the basis of the inventory items. By providing these suggestions and citations, we hope to stimulate your thinking about your dissemination efforts.
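As a rough illustration of the scoring arithmetic described in the directions that follow each group of items (total your 0-5 item scores for a category, divide by that category's maximum, and multiply by 100 to get a percentage), here is a minimal Python sketch. It is not part of the NCDDR instrument: the function and variable names and the sample responses are invented for illustration, and the simple text bar is only a rough stand-in for the printable graph outline.

    # Hypothetical scoring helper for the Self-Inventory (illustration only).
    # Category maximums come from the scoring directions that follow each
    # group of items: 10, 14, 10, 8, and 12 items, each scored 0-5.
    CATEGORY_MAXIMUMS = {
        "User Group": 50,                                   # questions 1-10
        "Information Source": 70,                           # questions 11-24
        "Content of Message": 50,                           # questions 25-34
        "Medium of the Message": 40,                        # questions 35-42
        "Contextual Consideration for Implementation": 60,  # questions 43-54
    }

    def category_percentage(item_scores, maximum):
        """Total the 0-5 item scores, divide by the category maximum,
        and multiply by 100 to express the category score as a percentage."""
        return 100.0 * sum(item_scores) / maximum

    # Invented responses for one category (ten User Group items, 0-5 each).
    user_group_responses = [5, 3, 0, 1, 3, 5, 2, 4, 3, 5]   # total = 31
    pct = category_percentage(user_group_responses,
                              CATEGORY_MAXIMUMS["User Group"])
    # One '#' per 10% gives a crude text version of the graph outline.
    print(f"User Group: {pct:.0f}%  {'#' * round(pct / 10)}")   # 62%  ######

Computing the same percentage for all five categories lets you compare relative strengths and weaknesses at a glance, which is the purpose of the graph outline described above.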
1. Does your research design clearly define the intended groups of "users" or beneficiaries of your project's results?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Partially, 4, 5 = Yes
2. Does your project design include specific dissemination and utilization activities targeting primary (and secondary, as appropriate) user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Partially, 4, 5 = Yes
3. Was your project's original proposal developed in collaboration with your current intended user/beneficiary groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
4. Has a representative sample of your intended user groups been meaningfully involved in planning, implementing, and evaluating the project's activities?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
5. Do you have needs assessment data that identify the extent of motivation or readiness of your intended user groups to use information in the project's topical area?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Somewhat, 4, 5 = Yes
6. Is information concerning the resources, knowledge, and information needed for the intended user to implement project results clear and a part of your dissemination plan?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Partially, 4, 5 = Yes
7. Is your dissemination strategy directly targeted to intended users by name?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
8. Are your intended user groups known well enough that the project can describe their dissemination-related characteristics, such as average reading/comprehension level, dominant language, level/extent of desired information, and accessibility requirements?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
9. Have you sampled your user groups regarding information previously received from your project to determine their satisfaction with its communication style, content of message, and accessibility? (Note: Projects in their initial stage may omit this question and adjust scores/graph values accordingly.)
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Partially, 4, 5 = Yes
10. Does your project design clearly describe measurable outcomes to assess impact/use by user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes

Category Score for User Group
Total Score for questions 1-10:
Divide Total Score by 50:
Multiply by 100 to get percentage:

These items refer to the degree to which your project is perceived or known as a source of information.

11. Is the project perceived by user groups as an active information dissemination resource?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
12. Are your project staff regarded by users as highly knowledgeable resources in the project's topical area?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
13. Is your project collaborating with a nationally recognized association, organization, institution of higher education, or other entity in the scientific community?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
14. Is your project collaborating with other organizations that are equally or more highly visible or interactive with your project's intended user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
15. Does your project provide technical assistance to facilitate implementation of project information by the intended user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
16. Does your project bring the project's researchers into frequent personal dialogue with members of the intended user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
17. Do you annually evaluate the impact of your dissemination activities in terms of both process and outcome measures?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
18. Do you provide periodic feedback to your user groups regarding your dissemination-related impact evaluation data?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
19. Do you measure the cost-effectiveness of the various dissemination-related strategies you may have used?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
20. Has an assessment been conducted within the last two years of the intended/actual user groups' perceived information needs?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
21. Is the project's "host" organization well-known among user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Somewhat, 4, 5 = Yes
22. Do users frequently (three or more times each work week) contact the project for information in your topical area?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
23. Does your project share your research results and related information with potential user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
24. Does your project use past dissemination impact evaluation data to guide new plans for dissemination?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes

Category Score for Information Source
Total Score for questions 11-24:
Divide Total Score by 70:
Multiply by 100 to get percentage:

25. Is the reading/comprehension level required to understand your project's information analyzed and matched to the characteristics of your intended user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
26. Does the content of project information match the expressed informational needs of the intended user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
27. Does your project information contain examples or demonstrations of how to use the information, and the implications of its use?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
28. Is the content of your project information reviewed through a quality control mechanism to assure accuracy and relevance?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
29. Have your user groups been involved in developing content and in field-testing (or other review) and revision of your project information?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
30. Is your project information available in the languages that are dominant among your intended user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
31. Do you provide your project information in alternate formats that are accessible to all members of the intended user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
32. Is the amount of technical information included in your project materials responsive to the expressed technical information needs of your user groups?
   Select One: 0 = Don't Know, 1 = Never, 2 = Seldom, 3 = Sometimes, 4 = Frequently, 5 = All the Time
33. Does your project information include "real world" examples and illustrations to communicate to non-technical user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
34. Do you share and allow requests for information through multiple means, for example, telephone, fax, mail, e-mail, and other modes upon request?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes

Category Score for Content of Message
Total Score for questions 25-34:
Divide Total Score by 50:
Multiply by 100 to get percentage:

35. Does your project make information available in any alternate format requested by individual users?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
36. Does your dissemination strategy include opportunities for person-to-person contact with users?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
37. Have your project staff conducted a needs assessment to determine users' general accessibility requirements?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
38. Are you providing information to users through channels (visual, auditory, etc.) they are known to prefer?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
39. Is your project information delivered directly to intended users?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
40. Is your project information delivered through existing (not developed by the grantee) networks, communication channels, associations/organizations, meetings/conferences, and other venues?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
41. Is your project information available through the Internet?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
42. Place a check by the formats and modes your project has used to disseminate information in the last twelve months:

* Internet
* Press Releases
* Newspapers
* Books
* Chapters
* Magazines
* Radio
* Television
* Brochures
* Monograph
* Internet/Elec. Files
* Information Manuals
* Paper Presentation
* Journal Articles
* Audio Tape
* Braille
* Large Print
* Compact Disk
* Popular Press
* Non-English Language

   Select One: 0 = No Items Checked, 1 = 1-3 Items Checked, 2 = 4-8 Items Checked, 3 = 9-13 Items Checked, 4 = 14-17 Items Checked, 5 = 18-21 Items Checked

Category Score for Medium of the Message
Total Score for questions 35-42:
Divide Total Score by 40:
Multiply by 100 to get percentage:

43. Do you treat dissemination as a process that requires time and personal support to be effective?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
44. Does your project develop a written plan with objectives as a guide in delivering technical assistance to user groups?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
45. Does your project design clearly describe dissemination goals, strategies, and expected outcomes?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
46. Does your project periodically sample recipients of project information to determine their perceptions and the extent to which they may have recommended your project (or project information) to others?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
47. Has your disseminated information been evaluated by users in terms of its ease of use?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
48. Does your project periodically sample to determine the manner in which users learn about the availability of your project's information?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
49. Has your project clarified what user organizational changes are necessary to successfully implement your project's information?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
50. Is a range of ways to use your project information provided for users?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
51. Does your project conduct personal follow-up with users to assess how useful project information was and how easily it could be applied?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
52. Does your project provide a variety of ways for potential users to discuss implementation of project information with project researchers or content experts?
   Select One: 0 = Don't Know, 1 = No, 2, 3 = Sometimes, 4, 5 = Yes
53. Does your project dissemination plan address motivational factors that may promote use of your disability research information?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes
54. Do project staff who engage in discussions with user groups have both technical knowledge and effective communication skills?
   Select One: 0 = Don't Know, 1 = No, 2, 3, 4, 5 = Yes

Category Score for Contextual Consideration for Implementation
Total Score for questions 43-54:
Divide Total Score by 60:
Multiply by 100 to get percentage:

REFERENCES

References by Category

* User Group
* Information Source
* Content of Message
* Medium of the Message
* Contextual Consideration for Implementation
* General References for Further Reading

----------

USER GROUP

Duarte, J.A., & Rice, B.D. (1992, October). Cultural diversity in rehabilitation. Nineteenth Institute on Rehabilitation Issues. Hot Springs, AR: Arkansas Research and Training Center in Vocational Rehabilitation.
Edwards, L. (1991). Using knowledge and technology to improve the quality of life of people who have disabilities: A prosumer approach. Philadelphia: Pennsylvania College of Optometry.
Fuhrman, S. (1994, April). Uniting producers and consumers: Challenges in creating and utilizing educational research and development. In Tomlinson & Tuijnman (Eds.), Education research and reform: An international perspective (pp. 133-147). Washington, DC: U.S. Department of Education.
Hall, G., & Hord, S. (1987). Change in schools: Facilitating the process. Albany, NY: State University of New York Press.
Huberman, M. (1990, Summer). Linkage between researchers and practitioners: A qualitative study. American Educational Research Journal, 363-391.
Leung, P. (1992, June). Translation of knowledge into practice. In Walcott & Associates, NIDRR National CRP Panel final report. Washington, DC: Walcott & Associates.
Louis, K.S. (1983). Dissemination systems: Some lessons from programs of the past. In W.J. Paisley & M. Butler (Eds.), Knowledge utilization systems in education. Beverly Hills: Sage.
Pollard, J., & Rood, M. (1989). Design and development of an emerging-issues tracking system for state-level educational policy and decisionmakers. Paper presented at the SIG/Futures Research and Strategic Planning Symposium of the American Educational Research Association, San Francisco.
Sanderson, P.R. (1994, June). Beyond research data and demographics for American Indians with disabilities. In Dew, D. (Ed.), Disability, diversity, and change: In rehabilitation (pp. 48-53). Proceedings of the 1994 National Symposium. San Diego: Rehabilitation Cultural Diversity Initiative.
Steinke, J. (1995, June). Reaching readers: Assessing readers' impressions of science news. Science Communication, 432-453.
Westbrook, J. (1994). Promoting change through information dissemination and utilization. Regional Rehabilitation Exchange Update, 2(1). Austin, TX: Southwest Educational Development Laboratory.

INFORMATION SOURCE

Crandall, D.P., & Loucks, S.F. (1983). A roadmap for school improvement, people, policies, and practices: Examining the chain of school improvement. Volume 10. Andover, MA: The NETWORK.
Donabedian, A. (1966). Evaluating the quality of medical care. Milbank Medical Fund Quarterly, 44, 166-206.
Fuhrman, S. (1994, April). Uniting producers and consumers: Challenges in creating and utilizing educational research and development. In Tomlinson & Tuijnman (Eds.), Education research and reform: An international perspective (pp. 133-147). Washington, DC: U.S. Department of Education.
Huberman, M. (1987, June). Steps toward an integrated model of research utilization. Knowledge, 586-611.
Leung, P. (1992, June). Translation of knowledge into practice. In Walcott & Associates, NIDRR National CRP Panel final report. Washington, DC: Walcott & Associates.
Peterson, G.M., & Emrich, V.A. (1983). Advances in practice. In W.J. Paisley & M. Butler (Eds.), Knowledge utilization systems in education. Beverly Hills: Sage.
Sechrest, L., Backer, T.E., & Rogers, E.M. (1994). Synthesis of ideas for effective dissemination. In L. Sechrest, T.E. Backer, E.M. Rogers, T.F. Campbell, & M.L. Grady (Eds.), Effective dissemination of clinical and health information: Conference summary (pp. 187-196). AHCPR Pub. No. 95-0015. Rockville, MD: Agency for Health Care Policy and Research.
Westbrook, J., & Botterbusch, K. (1989). Characteristics of technical assistance. The provisions of technical assistance for vocational rehabilitation, 16th Institute for Rehabilitation Issues. Menomonie, WI: University of Wisconsin.
Yin, R.K., & Moore, G.B. (1984). The utilization of research: Lessons from a multi-disciplined field. Washington, DC: Cosmos Corporation.

CONTENT OF MESSAGE

Fuhrman, S. (1994, April). Uniting producers and consumers: Challenges in creating and utilizing educational research and development. In Tomlinson & Tuijnman (Eds.), Education research and reform: An international perspective (pp. 133-147). Washington, DC: U.S. Department of Education.
Patton, M.Q. (1986). Utilization focused evaluation. Beverly Hills: Sage.
Pollard, J. (1989). Educational choice: Thinking it through. INSIGHTS on educational policy and practice, No. 8. Austin, TX: Southwest Educational Development Laboratory.
Scriven, M. (1967). The methodology of evaluation. In Ralph W. Tyler et al. (Eds.), Perspectives in curriculum evaluation. AERA Monograph. Chicago: Rand McNally.
Seidel, A.D. (1981, December). Underutilized research: Researchers' and decision makers' conceptions of information quality. Knowledge, 233-248.
University of Wisconsin-Stout (1989). The provisions of technical assistance for vocational rehabilitation. Menomonie, WI: Author.

MEDIUM OF THE MESSAGE

Edwards, L. (1991). Using knowledge and technology to improve the quality of life of people who have disabilities: A prosumer approach. Philadelphia: Pennsylvania College of Optometry.
Paisley, W.J. (1993, May). Knowledge utilization: The role of new communications technologies. Journal of the American Society for Information Science, 222-234.
Regional Information Exchange (1993). Overview of the information exchange process. Washington, DC: NIDRR.
University of Wisconsin-Stout (1989). The provisions of technical assistance for vocational rehabilitation. Menomonie, WI: Author.

CONTEXTUAL CONSIDERATION FOR IMPLEMENTATION

Crandall, D.P. (1984). Principles for effective networking and program improvement. Andover, MA: The NETWORK.
Crandall, D.P., & Loucks, S.F. (1983). A roadmap for school improvement, people, policies, and practices: Examining the chain of school improvement. Volume 10. Andover, MA: The NETWORK.
Dentler, R.A., Kell, D., & Louis, K.S. (1983). Putting knowledge to work: An examination of an approach to improvement in education (Second Synthesis Report). Cambridge, MA: Abt Associates.
Edwards, L. (1991). Using knowledge and technology to improve the quality of life of people who have disabilities: A prosumer approach. Philadelphia: Pennsylvania College of Optometry.
Fullan, M. (1991). The new meaning of educational change. New York: Teachers College Press/Columbia University.
Glaser, E.M., Abelson, H.H., & Garrison, K.N. (1983). Putting knowledge to use: Facilitating the diffusion of knowledge and the implementation of planned change. San Francisco: Jossey-Bass.
Leung, P. (1992, June). Translation of knowledge into practice. In Walcott & Associates, NIDRR National CRP Panel final report. Washington, DC: Walcott & Associates.
Louis, K.S., Dentler, R.A., & Kell, D.G. (1984). Putting knowledge to work: Issues in educational dissemination. Cambridge, MA: Abt Associates.
Majunder, R.K., Walls, R.T., Fullmer, S.L., & Dowler, D.L. (1994, June). Information flow in vocational rehabilitation. Rehabilitation Counseling Bulletin, 332-346.
Peterson, G.M., & Emrich, V.A. (1983). Advances in practice. In W.J. Paisley & M. Butler (Eds.), Knowledge utilization systems in education. Beverly Hills: Sage.
Westbrook, J., & Lumbley, J. (1990, Spring). Consumer-driven supported employment: A way to improve supported employment services and outcomes. Bulletin of the National Model for Supported Employment and Independent Living. Austin, TX: Southwest Educational Development Laboratory.

General References for Further Reading

Anderson, R.H., Bikson, T.K., Law, S.A., & Mitchell, B.M. (1995). Universal access to e-mail: Feasibility and societal implications. Santa Monica, CA: Rand Corporation.
Backer, T.E. (1988, April/May/June). Research utilization and managing innovation in rehabilitation organizations. Journal of Rehabilitation, 18-22.
Blasiotti, E.L. (1992, March). Disseminating research information to multiple stakeholders: Lessons from the experience of the National Institute on Disability and Rehabilitation Research. Knowledge, 305-319.
Brown-McGowan, S., & Eichelberger, R.T. (1993, June). The utility of a knowledge use system in a higher education evaluation setting. Knowledge, 401-416.
Buchman, M. (1982, July). The use of knowledge: Conceptual problems and empirical confusion. Occasional Paper No. 57. East Lansing, MI: Institute for Research on Teaching.
Buttolph, D. (1992, June). A new look at adaptation. Knowledge, 460-480.
Crandall, D.P. (1984). Principles for effective networking and program improvement. Andover, MA: The NETWORK.
Crandall, D.P., & Loucks, S.F. (1983). A roadmap for school improvement, people, policies, and practices: Examining the chain of school improvement. Volume 10. Andover, MA: The NETWORK.
Dentler, R.A., Kell, D., & Louis, K.S. (1983). Putting knowledge to work: An examination of an approach to improvement in education (Second Synthesis Report). Cambridge, MA: Abt Associates.
Donabedian, A. (1966). Evaluating the quality of medical care. Milbank Medical Fund Quarterly, 44, 166-206.
Edwards, L. (1991). Using knowledge and technology to improve the quality of life of people who have disabilities: A prosumer approach. Philadelphia: Pennsylvania College of Optometry.
Fuhrman, S. (1994, April). Uniting producers and consumers: Challenges in creating and utilizing educational research and development. In Tomlinson & Tuijnman (Eds.), Education research and reform: An international perspective (pp. 133-147). Washington, DC: U.S. Department of Education.
Fullan, M. (1982). The meaning of educational change. New York: Teachers College Press/Columbia University.
Hall, G., & Hord, S. (1987). Change in schools: Facilitating the process. Albany, NY: State University of New York Press.
Hutchinson, J.R. (1995, September). A multimethod analysis of knowledge use in social policy: Research use in decisions affecting the welfare of children. Science Communication, 90-106.
Indyk, D., & Rier, D.A. (1993, September). Grassroots AIDS knowledge: Implications for the boundaries of science and collective action. Knowledge, 3-47.
Kennedy, M.M. (1989, Summer). Studying smoking behavior to learn about dissemination. Knowledge, 107-115.
Leung, P. (1992, June). Translation of knowledge into practice. In Walcott & Associates, NIDRR National CRP Panel final report. Washington, DC: Walcott & Associates.
Louis, K.S. (1983). Dissemination systems: Some lessons from programs of the past. In W.J. Paisley & M. Butler (Eds.), Knowledge utilization systems in education. Beverly Hills: Sage.
Louis, K.S., Dentler, R.A., & Kell, D.G. (1984). Putting knowledge to work: Issues in educational dissemination. Cambridge, MA: Abt Associates.
Miguel, R.J., Izzo, M.V., Lankard, B.A., & Riffey, G. (1985). Knowledge development and utilization: Getting employability research into public use. Columbus, OH: National Center for Research in Vocational Education.
National Center for the Dissemination of Disability Research (1995). Dissemination, utilization and the NCDDR. [Online] Available: http://www.ncddr.org/du/ncddrdu.html
National Institute on Disability and Rehabilitation Research. (1994). Access to computers and electronic equipment. Rehab BRIEF, 16(3).
Newman, S., & Vash, C. (1994). Gray matter: Utilization of rehabilitation research results. Rehabilitation Education, 8(4), 380-385.
Patton, M.Q. (1986). Utilization focused evaluation. Beverly Hills: Sage.
Peterson, G.M., & Emrich, V.A. (1983). Advances in practice. In W.J. Paisley & M. Butler (Eds.), Knowledge utilization systems in education. Beverly Hills: Sage.
Pollard, J. (1989). Educational choice: Thinking it through. INSIGHTS on educational policy and practice, No. 8. Austin, TX: Southwest Educational Development Laboratory.
Pollard, J., & Rood, M. (1989). Design and development of an emerging-issues tracking system for state-level educational policy and decisionmakers. Paper presented at the SIG/Futures Research and Strategic Planning Symposium of the American Educational Research Association, San Francisco.
Regional Information Exchange (1993). Overview of the information exchange process. Washington, DC: NIDRR.
Scriven, M. (1967). The methodology of evaluation. In Ralph W. Tyler et al. (Eds.), Perspectives in curriculum evaluation. AERA Monograph. Chicago: Rand McNally.
University of Wisconsin-Stout (1989). The provisions of technical assistance for vocational rehabilitation. Menomonie, WI: Author.
Westbrook, J. (1994). Promoting change through information dissemination and utilization. Regional Rehabilitation Exchange Update, 2(1). Austin, TX: Southwest Educational Development Laboratory.
Westbrook, J.D., & Boethel, M. (1996). The dissemination and utilization of disability research: The National Center for the Dissemination of Disability Research approach [Online]. Available: http://www.ncddr.org/du/ncddrapproach.html
Westbrook, J., & Boethel, M. (1996). General characteristics of effective dissemination and utilization [Online]. Available: http://www.ncddr.org/du/characteristics.html
Westbrook, J., & Botterbusch, K. (1989). Characteristics of technical assistance. The provisions of technical assistance for vocational rehabilitation, 16th Institute for Rehabilitation Issues. Menomonie, WI: University of Wisconsin.
Westbrook, J., & Lumbley, J. (1990, Spring). Consumer-driven supported employment: A way to improve supported employment services and outcomes. Bulletin of the National Model for Supported Employment and Independent Living. Austin, TX: Southwest Educational Development Laboratory.
Yin, R.K., & Moore, G.B. (1984). The utilization of research: Lessons from a multi-disciplined field. Washington, DC: Cosmos Corporation.

Updated June 17, 1997