Following presentations by meeting participants about six NSF-funded websites in various stages of development that support online professional communities, we moved into small breakout sessions to discuss how to assess the value and impact of these kinds of online endeavors. To set the stage and get participants thinking about real user types, Group Session 1 focused on characterizing ISE professionals with examples of professional development needs and activities. Group Session 2 discussed how the new NSF/ISE Framework for Evaluating Impacts of Informal Science Projects could be applied to online professional development and learning communities. In small groups, we discussed how to conceptualize a new or modified framework more appropriate for measuring the impacts of online communities. In Group Session 3, groups considered and drafted approaches for building innovative research and evaluation plans to assess the impact of community sites such as those presented.
Group Session 1 - Characterizing ISE Professional Users

As an opening activity to familiarize Inquiry Group participants with professionals working in informal science education (ISE) and create a shared set of user types, we did a user scenario building exercise. In small groups, participants were asked to think about who ISE professionals are. Then each group developed two user scenarios characterizing a typical professional and how they might use online community websites for professional development.
Over the course of this Group Session, the following are some of the user types that were identified and discussed:
- New and potential PIs
- Experienced, well-known PI
- Community educator
- Young, aspiring informal science researcher
- Disciplinary or professional neighbor
- Policy maker / executive
- Private funder/foundations
- Volunteer museum worker
- Project scientist with an outreach budget and mandate
- Professional group interested in starting an online community
- Conference organizer
Going into this discussion, we knew that informal science education was not a homogeneous field made up of a limited set of professional practices or disciplinary expertise. We discovered in our conversations that the field might be even more diverse (and perhaps disparate?) than any of us had suspected. The needs of an afterschool or museum educator differ from those of a researcher, and differ again from those of an exhibit designer or new media producer. Yet they all share the goal of doing better informal science education. We are still learning and defining the commonalities that unite these diverse professionals into the broader field of informal science education (ISE) (see CAISE Landscape Study).
One conclusion the Inquiry Group drew was that precisely because of the diverse and disparate nature of the field, there is a pressing need to explore the promise of professional online communities for learning, networking and capacity building. Without some virtual means of sharing and extending knowledge, the broader ISE field will never build and connect across professional silos. Another conclusion we drew was that a site will only have impact if it is designed to fit with minimal effort into the daily work of its intended audiences. There was also general consensus that a "one-stop shop" for all of informal science is not a workable idea for a web community unless some external pressure or incentive compels people to participate (e.g., funder requirements, accreditation, professional awards or recognition). The broader lesson from Web 2.0 is that users like to self-organize their own information spaces and online interactions. The single-portal model for web-based communities has become more complex: users now expect to be able to customize their own information sources and assemble them into personalized portals and feeds (e.g., Facebook, iGoogle homepage). We will best serve the larger ISE community by having our services and assets find them through sub-community circulation, rather than expecting them to find us and become members of OUR world. Websites (or, more accurately, web services and infrastructure) need to offer well-designed affordances for personalization and participation, and to enable the broad circulation of customizable information. Lastly, and this can never be overstated, we really do need to understand existing community needs and practices before we develop tools to support professional development.
Group Session 2 - Applying the NSF Evaluation Framework to Online Professional Communities
How suitable are the impact categories and indicators described in the NSF Framework for Evaluating Impacts of Informal Science Projects for assessing professional development web infrastructure projects? What are the needs of professional development in informal science? What kinds of activities are likely to support that best in the future? How do we capture and assess the social aspects of online professional development? We worked in small groups to discuss how the framework could be applied to online professional development and learning communities.
What is the connection between the easily measurable things and things that aren't so measurable?
So if you don't have anything posted, there's nothing for anyone to learn from, and if nobody actually goes to the site, then nobody's learning from anybody. It sort of goes on and on. You have to ascertain, yes: I have people, and yes: I have stuff, so now what do we do with that?
So increasing social awareness of resources - concepts, people, projects - in the field for a young professional would be a measure of social capital.
Is the framework actually talking about direct impacts on professionals? Or the indirect impacts on the people the professionals work with and for?
Is the act of contributing, authoring, and reviewing in itself professional development?
How do you assess the causal effects of short-term exposures and measure usage of knowledge or information?
The new NSF-ISE evaluation framework does not necessarily provide the complete solution for how to assess the outcomes and impacts of online professional development communities. The impact categories (knowledge, engagement, attitude, skills and behavior) outlined in the framework are particularly strong for measuring changes in knowledge, the "know-what" aspects of individual learner impacts. But these impact categories are less helpful when it comes to considering, at both the individual and community level, the vital "know-who" social aspects of professional development and the "know-how" growth in professional practice, which characterize some of the most important potential impacts of online communities. Particularly for professional audiences, it is critical to enable broader awareness of and access to the social resources in the field: people, projects, sub-specialties, ongoing conversations, concepts, debates, historical tensions. Given that ISE is geographically disparate and professionally diverse, there is an acute need for building this capacity. Newcomers and outsiders need access to what old-timers and insiders take for granted: a meta-view of ISE as a field, a discipline and a set of networks. The evaluation impact categories are not inconsistent with this, but they do not naturally lend themselves to making these kinds of impacts explicit. One participant suggested making "professional development" itself an additional category when thinking about assessing the impacts of these field-building web infrastructure initiatives.
Mapping the EVALUATION FRAMEWORK to online community settings started a thorny discussion about the "unit of analysis". Although the framework allows for group- or community-level impacts, it seemed that measuring impacts at the individual level was the default approach. How do you capture and measure gains in social capital, explain participation dynamics, or ascertain an increase in the collective efficacy of a field? Several participants in the Inquiry Group thought we needed to add the word "community" or "field" in front of the impact categories (e.g., field awareness, community engagement, etc.) in order to think in the aggregate about what happens when a community uses and contributes to online resources. Would these changes be sufficient to capture the professional development outcomes of online community initiatives? Does the approach of the framework assume that impacts occur at the level of a person, or a small group of people? We debated the extent to which this was true. Regardless, we recognized that it is very tricky to measure community- and field-level impacts, especially with respect to determining causality. This kind of evaluation is an area where informal science education professionals can draw on the work, models and techniques being developed in other fields (for examples, see the references in the Inquiry Group bibliography) and, indeed, this work is beginning.
Group Session 3 - Assessing the Impact of Community Sites

In the last session, the Inquiry Group was posed the question "What would an innovative evaluation plan look like for the different kinds of professional development web projects presented at this meeting?"
In evaluating knowledge, measure the understanding of who's who and what's what. We'd want to see increases in the research being used: the people and work users already knew versus the people and work they know and draw on now. You can use methods like bibliometrics, analysis of proposals or evaluation reports, or social network analysis to assess this. To follow up, you could contact users to see whether they actually use the things they look at.
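As a rough illustration of the social network analysis idea above, here is a minimal sketch (not from the meeting; the names, data, and function are hypothetical) that models "collaborative circles" as connected components of a co-authorship graph and compares snapshots from before and after a community site is introduced:

```python
from collections import defaultdict, deque

def collaborative_circles(collaborations):
    """Count connected components ("collaborative circles") in an
    undirected co-authorship graph given as (author_a, author_b) pairs."""
    graph = defaultdict(set)
    for a, b in collaborations:
        graph[a].add(b)
        graph[b].add(a)
    seen, circles = set(), 0
    for node in graph:
        if node in seen:
            continue
        circles += 1          # found a new, unvisited circle
        queue = deque([node])
        seen.add(node)
        while queue:          # breadth-first walk over the circle
            for neighbor in graph[queue.popleft()]:
                if neighbor not in seen:
                    seen.add(neighbor)
                    queue.append(neighbor)
    return circles

# Hypothetical before/after snapshots of co-authored papers and proposals.
before = [("Ana", "Ben"), ("Cho", "Dev")]            # two separate circles
after = before + [("Ben", "Cho"), ("Eve", "Ana")]    # circles merged and grew

print(collaborative_circles(before))  # 2
print(collaborative_circles(after))   # 1
```

A real study would build the pairs from bibliometric data (co-authored papers, joint proposals) rather than toy tuples, but the merging of previously separate circles is exactly the kind of field-level signal the discussion pointed to.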
How do we know that our online resources have impact on the field? After viewing the presentations from six principal investigators leading NSF-funded professional online community sites at the meeting, we realized that there are at least two dimensions that are important to define as we think about assessing impact. The sites presented varied along a continuum from being more project-oriented to field-oriented, and varied along an orthogonal dimension of being more closed to open access.
| Cyber Infrastructure | Professional Infrastructure |
| --- | --- |
| Teams and goal-driven | Communities and identity-driven |
| Bounded user community | Anybody can join |
| Focus on internal core users | Focus on external users, peripheral participants |
| Goal is to increase project success | Goal is to maximize field impact |
Sites like ExhibitFiles.org focus on field-oriented, open-access objectives: anyone interested in exhibitions can join, contribute, and comment. Sites like the ITEST Learning Resource Center are more project-oriented and internal in focus; they look for their primary impact in supporting project innovation, collaboration, and dissemination, and provide value to external audiences as a second-order impact. These dimensions help make explicit which audiences a professional development site aims to serve, expose sustainability concerns, and suggest where you should look for impact and who you expect to be part of the impact story.
The second distinction the Inquiry Group identified as essential was between community activity on a site and its actual impact on the field. A community objectives model was put forth that distinguished community operations objectives (resource usage, contribution and participation) from community impact objectives (changes in professional practice), and we realized that most existing evaluations of professional websites have only examined operational objectives. We know who is using the sites, what they are doing, and how often they come. We assume that if individuals have access to information, build reputation, increase visibility, network and gain social capital through our sites, it will eventually strengthen and connect the field. But does it? Do we actually know anything about the downstream impacts?
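To make the operations-versus-impact distinction concrete, the sketch below (hypothetical log format and field names, not drawn from any of the presented sites) computes the kind of operational metrics most existing evaluations stop at. Nothing here says anything about changes in professional practice:

```python
from collections import Counter

# Hypothetical site activity log: (user, action) events.
log = [
    ("ana", "view"), ("ana", "post"), ("ben", "view"),
    ("ben", "view"), ("cho", "post"), ("ana", "view"),
]

def operational_metrics(events):
    """Summarize who uses the site, what they do, and how often --
    community *operations* objectives, not community *impact*."""
    users = {user for user, _ in events}
    actions = Counter(action for _, action in events)
    events_per_user = Counter(user for user, _ in events)
    return {
        "unique_users": len(users),
        "actions": dict(actions),
        "most_active": events_per_user.most_common(1)[0],
    }

metrics = operational_metrics(log)
print(metrics["unique_users"])  # 3
print(metrics["actions"])       # {'view': 4, 'post': 2}
print(metrics["most_active"])   # ('ana', 3)
```

Counts like these answer "is anyone here and is anyone posting?", which is necessary but, as the discussion concluded, not sufficient evidence of downstream impact.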
The group could identify very few examples of strong evidence for downstream impacts. We discussed several evaluation strategies that might produce the kinds of measurable evidence needed. In discussing how to assess the value added by the SMILE pathway, one idea put forth was an evaluation design that compared the work of professionals engaged in an authentic task (e.g., designing an educational program or writing a proposal) against the work of "controls" who had access only to Google and the Internet. In discussing downstream impacts for InformalScience.org, a suggested evaluation strategy included using bibliometric and document analysis tools to measure whether researchers had expanded their citation base, combined with social network analysis techniques to look for the growth of collaborative circles in projects, papers, and proposals as a result of using the site. There are many ideas that would work here. The big point was simply that analyzing activity on our sites is insufficient to prove that activity translates into real change in the field. We need to be much more experimental in our approaches to evaluating online communities.
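The citation-base idea above can also be sketched very simply. In this hypothetical example (generic labels, not real references), the "expansion" of a researcher's citation base is just the set of works cited after using the site that were never cited before:

```python
def citation_base_growth(before, after):
    """Return works cited after using the site but not before:
    the expansion of a researcher's citation base."""
    return set(after) - set(before)

# Hypothetical citation lists extracted from a researcher's documents.
before_citations = {"paper-1", "paper-2"}
after_citations = {"paper-1", "paper-2", "paper-3", "report-1"}

new = citation_base_growth(before_citations, after_citations)
print(sorted(new))  # ['paper-3', 'report-1']
```

In practice the before/after sets would come from bibliometric extraction of proposals, papers, and evaluation reports, and the growth would be compared against a baseline group that did not use the site, per the control-group design discussed above.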