Tag Archives: evaluation

Visitor Response Cards: What To Do When the Exhibition Is Over

Written by Jessica Fuentes, Dallas Museum of Art

Over the past few decades, museums have positioned themselves in this post-modern society as institutions representative of multiple perspectives. One way this is happening is by inviting visitors to be active participants in the museum experience. More and more, we are listening to our visitors by asking them to respond to prompts and questions. If your institution has started down this path, then you may be facing a conundrum much like mine: What do we do with the thousands of visitor responses we’ve collected?

Statistics and Evaluation

As a baseline, collecting responses can be a way to understand trends in visitor experiences. Comparing the number of responses to total attendance can reveal the percentage of participating visitors. Depending on the data prompted by the response card, you may be able to learn more about participants. For example, the Art Spot creation labels used in the Center for Creative Connections (C3) prompt visitors to note their age. With over a year’s worth of data collected, we know that 6-12 year olds make up the majority of Art Spot participants. We also know that 30% of participants are adults. It is interesting to note the months when adult participation spikes to nearly 40% and to consider what might be affecting those fluctuations. Furthermore, the actual responses can be a source of qualitative data, illustrating the depth of visitor experience.
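To make the arithmetic concrete, here is a minimal sketch of those baseline statistics. The tallies are hypothetical; only the roughly 30% adult share and C3’s average of 18,000 monthly visitors (mentioned later in this post) come from the article itself.

```python
# Hypothetical Art Spot response-card tallies, bucketed by the age
# visitors note on the card. Only the ~30% adult share mirrors the post.
responses_by_age = {"6-12": 520, "13-17": 95, "18+": 270}

total_attendance = 18_000  # C3's average monthly visitors, per this post
total_responses = sum(responses_by_age.values())

# Share of all visitors who left a response card.
participation_rate = total_responses / total_attendance

# Share of participants who are adults.
adult_share = responses_by_age["18+"] / total_responses

print(f"{participation_rate:.1%} of visitors participated")
print(f"{adult_share:.1%} of participants were adults")
```

With real monthly tallies in a spreadsheet, the same two ratios are what reveal the participation spikes described above.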

Also, by collecting and reviewing responses, we can evaluate our own prompts.  When C3 first installed Starry Crown by John Biggers, we offered two prompt cards related to the work of art.

Starry Crown and responses

A high percentage of the responses we received to the prompt pictured at the top did not address the prompt. This revealed that the question was difficult for visitors compared to the other prompt (on the lower right), which consistently received more thoughtful responses. Because of this, we eventually phased out the first prompt.

In a similar way, visitor responses as feedback can offer insight into visitors’ motivations, expectations, and experiences of a program or space. In preparing for a redesign of the Young Learners Gallery within C3, we solicited visitor feedback to find out why caregivers bring their children to the DMA. Visitors left their responses on Post-it notes, and using the Post-it Plus app we easily digitized, sorted, and analyzed them. We used the three categories with the largest number of responses as a guiding force in the redesign of the space.

YLG Post its
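The sorting step itself is simple enough to sketch. Assuming each digitized Post-it has been tagged with a category label (the labels below are hypothetical; the post does not name the actual categories), finding the three most common categories takes one call to Python’s `Counter`:

```python
from collections import Counter

# Hypothetical category labels, one per digitized Post-it note.
notes = ["family time", "learn about art", "creative play", "family time",
         "learn about art", "family time", "creative play", "quiet space",
         "learn about art", "family time"]

counts = Counter(notes)

# The three categories with the most responses, as used to guide the redesign.
top_three = [category for category, _ in counts.most_common(3)]
print(top_three)  # ['family time', 'learn about art', 'creative play']
```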

After analyzing and sometimes digitizing, are these visitor responses then doomed to storage? Working in an educational space that serves, on average, 18,000 visitors a month, I question our habit of simply counting, sorting, and boxing up visitor responses to store away in file cabinets or closets. When we use Post-its, we digitize the responses because the Post-it Plus app makes the process easy and includes helpful sorting and exporting tools, but when it comes to broad digitization, I have to stop and ask, “Why?” What would we do with responses in digital form? Would it be any better to store these responses in digital file cabinets? Would we one day go into the vault to re-read them? Have we done that in the past with the responses currently being stored?

Re-Cycling

When I’m reflecting on past visitor response prompts, I go back to the spreadsheets and summaries that help extract meaning from the raw data. But what to do with the more esoteric prompts and responses? For instance, in spring 2014, C3 hosted a community exchange project inspired by A panel depicting the Tuba Tree, with the 99 names of God on its leaves. Museum visitors helped us explore the potential meanings behind “Nur,” the Arabic word that translates to “light” in English. The work of art was on view with an accompanying interactive that prompted visitors to share one word they associate with “light” on a golden leaf and hang it on the fabricated tree in the space. When it came time to extract meaning from the responses, we enlisted the help of a writer. In 2015, C3 Visiting Artist A. Kendra Greene started by alphabetizing the responses. The process created some interesting word combinations, one of my favorites being “Jesus, Joy, Justin Bieber.” From these alphabetized lists sprang arranged poems, and the poems led to a spoken word performance in which Greene took the words of our visitors, re-interpreted them, and produced an engaging performance. This created a visitor response cycle (the museum prompted visitors, visitors left responses, the responses were made into a performance, and the performance was shared with visitors): in effect, an artistic evaluation and summary of the responses.
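Greene’s alphabetizing step can be sketched in a couple of lines. The word list here is hypothetical apart from the “Jesus, Joy, Justin Bieber” run quoted above; a case-insensitive sort is what produces those unexpected adjacencies:

```python
# Hypothetical visitor words from the "light" leaves; only the
# "Jesus, Joy, Justin Bieber" combination is taken from this post.
words = ["sun", "Justin Bieber", "hope", "Jesus", "warmth", "Joy"]

# Case-insensitive alphabetical sort, the step the arranged poems grew from.
alphabetized = sorted(words, key=str.lower)
print(", ".join(alphabetized))
# hope, Jesus, Joy, Justin Bieber, sun, warmth
```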

This new take on how to re-cycle visitor responses planted a seed in my thinking about how to use other responses. In early 2015 we worked with Kendra Greene to package visitor responses to Starry Crown. This painting references the importance of women as keepers of knowledge and the significance of familial traditions, stories, and wisdom passed down through generations. Visitors responded to the prompt, “What wisdom has an important woman in your life shared with you?” The responses we received were funny, heartfelt, nostalgic, sad, universal, and at times deeply personal. Greene organized the responses into bite-sized booklets that could be given back to museum visitors. First, she created categories and sub-categories like:

Kendra categories

From these categories emerged tailored booklets called Common Thread: Selections of women’s wisdom, guidance, counsel, advice, experience, notions, revelations, hard truths, and plain facts. Throughout the year we have found various opportunities to share these booklets with our visitors: first at Mother’s Day, then at Thanksgiving, and now, as we prepare to say farewell to Starry Crown, we are assembling more booklets to give out through the month of April.


Share your thoughts

What creative solutions have you found for documenting, storing, or sharing visitor responses?

Plan, Implement, Evaluate: Leveraging All Staff for Program Development

Written by Mike Deetsch, Director of Education & Engagement, Toledo Museum of Art

Plan

In 2010 the Toledo Museum of Art (TMA) passed its 2015 Strategic Plan with an emphasis on the Museum’s Purpose: Art Education. One of the primary intentions behind the plan was to create a more relevant and sustainable Museum, and at this point we adopted the Strategic Objective of Teaching Visual Literacy. The thought process behind this, brought forward by the Museum’s director Brian Kennedy, was that the Museum would leverage great works of art in the collection to teach people to see better in our image-saturated 21st-century society. Since then, the Museum has developed a variety of visual literacy-specific initiatives, including The Art of Seeing Art thinking routine, a docent training class highlighting visual perception, and the creation of a Visual Literacy website.

In November 2014 the Toledo Museum of Art hosted the 47th annual conference for the International Visual Literacy Association (IVLA).  As part of the preparation for this conference, in January 2014 the education department was charged with designing a professional development program that would train all Museum staff and volunteers on the theories and processes around visual literacy. With the conference imminent, we wanted to ensure that any staff or volunteer in the organization would feel comfortable talking about visual literacy with any of our attending guests.

Before my colleagues and I developed the curriculum, we needed to clarify TMA’s approach to teaching visual literacy and its associated concepts. Our goal was to make the content accessible to a wide audience. It might go without saying, but not everyone on our staff has a background in art, art history, or art museums. Keeping in mind that we were going to be training such a diverse audience (i.e., all museum departments as well as docents and other volunteers), our approach couldn’t be intimidating and had to be presented in a fun and engaging way.

We had been incorporating visual literacy concepts into programming in a number of ways since 2010, but those programs largely lived with the education department. It was clear that engaging a variety of staff members outside of education was essential for the concepts to “stick” and be embodied throughout the organization. Aided by strong support from the director’s office, we pulled together a cross-departmental team of 14 staff, for three consecutive Tuesdays in February 2014, to brainstorm around visual literacy concepts. This team consisted of staff from curatorial, education, library, marketing, visitor engagement, visitor services, and the director’s office.

Visual Literacy Content Meeting 021114

Our meetings took place in a white board room (three walls covered in white board paint) where we were able to discuss, brainstorm, and illustrate ideas. While the participants were not always in agreement, we were able to use these meetings (about six hours total) to reach consensus on the key components of our process, which included:

  1. an easy to understand definition of visual literacy,
  2. the Elements of Art and Principles of Design as the foundational vocabulary,
  3. the Art of Seeing Art thinking routine, and
  4. the concepts of interpretation distilled into four visual languages.

During these sessions the group realized the value of aligning TMA’s definition of visual literacy with textual literacy. The comparison to textual literacy is important for two reasons: one, it makes an analogy people are already familiar with, and two, it gives the Museum the opportunity to shift the discussion from literacy to language. The latter shift was key because focus groups had been telling us that literacy implies there are people who are illiterate, while language implies a level of fluency. TMA’s definition of visual literacy is the ability to read, comprehend, and write visual language. Reading visual language is about the process of seeing, comprehending visual language is about the interpretation of what is seen, and writing visual language is about the action you take in response to what you have seen.

Implement

With the definition, process, and concepts in place my colleague Kate Blake, Manager of Curriculum, and I drafted the curriculum for the professional development.  From the outset of writing the curriculum we identified a few musts: the program needed to be multidisciplinary, meaning it wasn’t going to be art history-centric; it needed to be activity-based; and it had to be taught in the galleries.

As museum educators we know the value of using a variety of approaches to gallery learning, including group discussion, small group activities, drawing, and independent exploration. Facilitating activities, as opposed to discussion only, would afford us the chance to engage a variety of learning styles and dabble in a bit of game mechanics. By making the approach activity-based, we were able to engage our staff in the overall experience, which proved useful for retaining the concepts introduced.

Staff participating in Visual Literacy workshop activity.

As I mentioned earlier, this training was offered to TMA staff and volunteers, approximately 300 individuals in all. In the end we designed a curriculum of 12 contact hours, which introduced the concepts surrounding visual literacy, spent time on close-looking techniques, and gave special emphasis to the four visual languages. Kate and I knew that 12 hours was a significant commitment for people to make during the work week, so we also developed a variety of workshop formats to accommodate people’s schedules. Initially each of these sessions was facilitated by full-time TMA education staff, but we gradually transitioned two of our more experienced docents into facilitators. These docents, who were both former docent board presidents, had been working closely with staff on visual literacy programming since 2010.

One lesson the facilitators quickly learned during the workshops was the importance of acknowledging expertise at all levels throughout the professional development. There were content experts, such as curators, as well as other areas of expertise. For example, our security staff, who spend more time in the galleries than anyone else, were actively encouraged to contribute their opinions and perspectives. The guards’ comments were often among the most insightful, both in their interpretations and in their observations of visitor interactions with the collection.

Evaluation and Next Steps

As a means of reflection, we developed an evaluation tool that allowed us to make real-time adjustments. Specifically, we measured the digestibility of the content and overall enjoyment. To do this, we created a series of online surveys to collect feedback at various touch points during the 12-hour workshop. The curriculum was grouped into six modules, and each module had its own evaluation. While the evaluation was not a requirement for participation, we collected over 300 surveys. The general response was positive, with most activities receiving a rating of 5 (out of 6) on a Likert scale. Open-ended questions provided constructive feedback that we were able to act on immediately, such as making a slight adjustment to our definition of visual literacy and dropping activities that did not resonate or were too complicated.
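A minimal sketch of that per-module tallying, with hypothetical ratings (TMA’s actual survey data is not published here), might look like:

```python
from statistics import mean

# Hypothetical 6-point Likert ratings collected for each curriculum module.
ratings_by_module = {
    "Module 1": [5, 6, 5, 4, 5],
    "Module 2": [4, 5, 3, 4, 4],
    "Module 3": [6, 5, 5, 6, 5],
}

# Average rating per module; low averages flag activities to adjust or drop.
averages = {module: mean(scores) for module, scores in ratings_by_module.items()}

for module, avg in sorted(averages.items(), key=lambda kv: kv[1]):
    flag = "  <- review" if avg < 4.5 else ""
    print(f"{module}: {avg:.1f}/6{flag}")
```

Running the summary after each workshop is what makes the “real-time adjustments” described above possible.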

Staff feedback to Visual Literacy workshops.

All told, between April and October 2014, our team of eight facilitated 28 workshops, totaling 336 hours, for 300 staff and volunteers. The entire experience, from design to facilitation, relied heavily on cross-departmental staff involvement, input, and engagement. As a result we were able to design a clear and concise introductory visual literacy curriculum, which we have since repurposed for a variety of audiences and in a multitude of formats. Our staff and volunteers now clearly understand that TMA’s Purpose is Art Education and that we will achieve it by Teaching Visual Literacy.

Having the opportunity to share and rely on expertise throughout the Museum proved invaluable throughout the entire process. How many of you have the opportunity to collaborate across departments on projects from start to finish? If so, what does that look like? And do you have the opportunity to prototype new ideas? How can we build that into our practice? I’d love to hear your thoughts and experiences.

*     *     *     *     *

About the Author

MIKE DEETSCH: Emma Leah Bippus Director of Education and Engagement at the Toledo Museum of Art, Deetsch is a key member of the Museum’s executive team, leading educational and programming initiatives across the Museum. He is responsible for curriculum development for all audiences, outreach, exhibition interpretive material and management of the docent program as well as conceptualizing innovative public programming. He oversees a strong, motivated education staff and a highly engaged TMA docent corps responsible for developing visual literacy initiatives and partnering to create opportunities for visual literacy education and awareness. Prior to joining the TMA staff, Deetsch served as a senior museum educator at the Brooklyn Museum, the exhibition and programs director at the Lexington Art League, and the student programs manager at the Kentucky Historical Society. Deetsch received his master’s degree in art education from the Pratt Institute and a bachelor’s degree in art history from Hanover College. He was chosen in 2011 to participate in the Getty Leadership Institute’s “Museum Leaders: the Next Generation.”

Up Close: Distance Learning & Art Museums

By Anne Kraybill, Distance Learning Project Manager, Crystal Bridges Museum of American Art

Check out Crystal Bridges Museum of American Art’s Distance Learning website, which includes research and resources made available to support your distance learning initiatives.

The term “distance learning” can seem antithetical to art museums that espouse the power of an authentic experience with an object. As I worked to develop a distance learning initiative at Crystal Bridges Museum of American Art, I struggled to reconcile the rationale for such a program. After all, Crystal Bridges has a robust and well-funded school visits program that brings students from all over the region. Why would I want to create a program that did not take place within our walls?

First, let me provide a little context. Crystal Bridges decided to pursue a distance learning initiative shortly after the field trip study conducted by Jay Greene, Brian Kisida, and Dan Bowen at the University of Arkansas. The findings revealed that, across a variety of outcomes, student gains from a one-time field trip were two to three times higher for students in rural locations. With these findings in mind, we decided to create a distance learning program that would reach more students overall, but particularly students in these rural schools.

Where to Start?

We began with some formative research to determine what path we might take. In July 2013, we hosted a Distance Learning Summit, which brought together more than 40 art museums and arts organizations to better understand the current landscape and approaches to distance learning, as well as to envision how art museums might further leverage distance learning in the future. Case study presentations ranged from traditional approaches such as synchronous video conferences (often branded as “virtual fieldtrips”) that connect classrooms remotely, to blended approaches that utilize Learning Management Systems (LMS) before and after an onsite program, to asynchronous approaches such as Massive Open Online Courses (MOOCs) that engage thousands of learners at one time.

While all of these approaches have advantages and disadvantages to consider, the model that resonated with our particular situation was presented by Michelle Harrell and Emily Kotecki from the North Carolina Museum of Art (NCMA). In an effort to increase their reach to teens, they partnered with North Carolina Virtual Public School to develop online courses in the visual arts for high school students throughout the state of North Carolina. This model resonated for a few reasons. First, in the state of Arkansas we have the Digital Learning Act that requires all high school students to take an asynchronous online course for graduation, so this approach was a natural fit. Second, the notion of having such a direct role in a student’s school career was appealing and provided a level of accountability not found in most art museum/school partnerships. Following the trail Michelle and Emily had blazed, Crystal Bridges set out to develop a for-credit online course with the aim of deeply connecting high school students to art history, American history, and museum studies.

Course Development

After an RFP process, we selected Education Development Center, Inc. (EDC) as our development partner. Over the course of a year, a cross-disciplinary team of museum educators, instructional designers, subject-matter experts, graphic designers, and programmers developed Museum Mash Up: American Identity through the Arts. Rather than progress through the artworks chronologically, the course begins with contemporary art. The guiding questions ask students, “How did we get here? And how have artists shaped and reflected upon American identity?” Crystal Bridges partnered with Virtual Arkansas to offer and deliver the course. Like North Carolina Virtual, Virtual Arkansas is a supplementary provider of online courses that any public school student in the state can take. EDC and Crystal Bridges trained a few online arts instructors from Virtual Arkansas, who worked with volunteer students to test the activities and gather formative feedback from both instructors and students.

The course has now launched through Virtual Arkansas with a pilot group of about 40 students from all over the state, including the community of Deer, population 680; the community of Hugh, population 1,441; and the community of Star City, population 2,248. Students typically log onto the course during one of their class periods at school. Though the course is asynchronous, students are paced in weekly units and use tools to engage in online discussion. This was one of the most important elements for the design of this course. While there are many valuable websites and other online resources to learn about the arts, we wanted to be sure that the act of “collaborative meaning-making” was not lost. Similar to an onsite program, students begin their lesson by looking at the work of art and sharing their initial observations and interpretations using VoiceThread™. This tool allows for a conversation in the cloud using text, video, or audio and is an excellent platform for students to build on one another’s ideas. Following their initial observations in VoiceThread™, the students read about the art and engage with multi-media materials to ascertain some context about the art, artist, and historical time period. They then participate in another, more in-depth discussion about their new and evolving interpretations.

course

Simultaneously, students are also working on two major capstone projects. The first project is a curated exhibition about their own individual identity using the tool Kapsul™, somewhat similar to a Pinterest board. Through this project, along with videos by curators, designers, and educators, they learn about the curatorial, design, and interpretive processes necessary to curate an exhibition. These skills are used in their final project: a virtual exhibition curated by each student using the artworks they learned about during the semester, along with new works they research in the Crystal Bridges collection. This amazing virtual rendering of the Twentieth-Century Art Gallery at Crystal Bridges was created by David Charles Frederick from Tesseract Studios at the University of Arkansas using Unity™, an immersive game engine that includes rich textures and allows the students to explore the space as if on foot. The rendering is completely accurate to the specifications in the museum blueprints and provides learners with an immersive experience in which they arrange the paintings they have researched on the walls, write the labels and interpretation, develop the graphic identity of their exhibition, and, most importantly, learn that they can make meaning and create conversations between paintings and across history.

bridge1

bridge2

Challenges

Along the way, there were many challenges to overcome, and there will be many more as we continue to pilot the course. Content for all of the artworks had to be generated, requiring a massive amount of writing. Image rights had to be procured, videos needed to be produced, and external content from primary and secondary sources had to be found. One of the most challenging hurdles was the course approval process with the Arkansas Department of Education. Because this was not a standard course, the state had to approve it under a standards framework. After much work and standards alignment, we were able to obtain course approval for students to receive .5 credit hours in fine arts. The course now satisfies two requirements all high school students must meet for graduation: a .5 credit hour in fine arts, and at least one course taken online.

Beyond the bureaucratic and logistical challenges we continue to encounter and address, there are, not surprisingly, some challenges in working with high school students. Their motivations range widely, with some students passionately interested in learning about art and history and others more ambivalent about visual arts and museums. This results in a wide range of responses in the discussions. For instance, students were asked to look at and respond to George Tooker’s The Ward in VoiceThread™. Their only prompt was “What do you notice and what do you wonder?”

Student One:  it looks like there is a bunch sick people laying (sic) in hospital. like it looks like the ones already laying down are dead.

Student Two: George Tooker’s “The Ward” is a very interesting piece that’s (sic) shows to have many subliminal messages. In the background there are many American flags hanging on the wall in a much brighter contrast to the rest of the painting. I recognize this as a representation of patriotism and American pride. Going on to the next part of the painting, the elderly people lined up in rows on beds. There isn’t much to identify the various elderly by- except as Madeliene said, they have little to no hair- so they are most likely men. The elderly people are lined up on these beds- which do not appear to be comfortable by their stiff appearance. It seems that these people are just existing, not really being anything other than a case number or a medical condition. I believe that this represents the wounded soldiers that have returned from the various wars. When the soldiers came back from the war wounded this is how they were treated oftentimes, in a lifeless building or tent, not having anything to do or participate in, often making them become depressed which slowed or stopped the healing process completely. When Tooker made this painting I wonder why he depicted the wounded soldiers scene as so dreary and negative when he could have followed in the footsteps of others and sugar coat it to pacify the public and make it seem appealing enough. For Tooker’s honesty in this painting I admire him greatly. He really got his point across that the war wasn’t pleasant and it wasn’t pleasant afterwards either, because these memories still haunt you…

voicethread

In addition, for many students this is the first time they have taken an online course, so they need support in learning the tools, plus very well-defined and articulated expectations of the level and quality of work the course requires. Everyone is making significant progress. For example, early responses from all but a few students were rarely justified, but just five weeks in, students are articulating their interpretations with more detail and inference, and justifying their claims with evidence.

Overall, the benefits far outweigh the challenges. There is a level of anonymity for each student that is freeing. They are not burdened by labels that they might encounter in their physical school. They are also able to contribute their ideas without ridicule. The way in which they engage with and learn about works of art is multi-modal. And they are connecting with Crystal Bridges and the collection in a way that a one-time field trip could never afford. In addition, Crystal Bridges is providing a unique course offering to the state that expands access to quality arts education.

Next Steps

Crystal Bridges has a large agenda as it continues to expand upon this program. Next steps include:

  1. Conduct an observational study of the current section of Museum Mash Up to analyze instructional design and quality and to measure student perceptions, then follow it with a rigorous experimental design to measure student outcomes, including critical thinking and writing;
  2. Develop an online teacher professional-development program that certifies teachers in any state to teach the course;
  3. Create a second course offering that is grounded in studio and design practice;
  4. Host an online professional learning community where teachers can receive support in teaching the online course;
  5. Host a second Distance Learning Summit (details forthcoming this summer).

Phew!

Final Thoughts

This project has been one of the scariest and most fulfilling of my career. The students are not the only ones who have a stake in the course; we as a museum cannot fail in our obligation to them. I could not have conceived of it without the ground-breaking work by Michelle and Emily at NCMA. I also have to thank the talented and dedicated Crystal Bridges museum educators, Emily Rodriguez and Donna Hutchinson, for all their help in developing, researching, and designing the course outline, as well as EDC’s project manager, Kirsten Peterson, for her unwavering dedication and belief in this project, and Diana Garrison, teacher extraordinaire at Virtual Arkansas.

Read about the Distance Learning Project from the perspective of a participating student, “Museums and Online Learning: A Student’s Perspective.”

*     *     *     *     *

About the Author

ANNE KRAYBILL: Distance Learning Project Manager at Crystal Bridges Museum of American Art, where she is developing online accredited courses for high school students and online professional development for teachers. In her previous position as the school and community programs manager at Crystal Bridges, she developed and implemented all of the Museum’s programming related to K-12 students, teachers, and pre-service teachers, as well as community groups. She has held positions at the Walters Art Museum in Baltimore, MD, the Pennsylvania Academy of the Fine Arts, the Norton Museum of Art, the Center for Creative Education, and the Vero Beach Museum of Art. Prior to joining Crystal Bridges, she worked as the Art School Director at the Durham Arts Council, managing visual and performing arts classes for over 3,000 youth and adult students annually. Anne has a B.F.A. in Photography from Maryland Institute College of Art, an M.A. in Museum Education from The University of the Arts, and an M.S. in Instructional Technology from East Carolina University. She is currently a Doctoral Academy Fellow in Education Policy at the University of Arkansas. Anne’s postings on this site are her own and don’t necessarily represent the Crystal Bridges Museum of American Art’s positions, strategies, or opinions.

Hands-On Learning: Not Just for Kids

Written by Jessica Fuentes, Center for Creative Connections (C3) Gallery Coordinator, Dallas Museum of Art

“Every child is an artist. The problem is how to remain an artist once we grow up.” – Pablo Picasso

Reposted from the Dallas Museum of Art’s education blog DMA Canvas, where the museum’s fantastic education team writes about creativity, community outreach, technology, and insights into the field of museum education. 

The Center for Creative Connections (C3) at the Dallas Museum of Art is unique because we focus on learning by doing. That means we design activities for people of all ages to learn about works of art from the collection by participating in a hands-on way. The activities we create to accompany works of art prompt visitors to engage in ways that are different from the standard didactic approach of a wall label. In C3, we want to provide experiences where visitors can make personal connections by drawing, writing, making, and discussing works of art with each other.

This kind of active engagement carries a certain stigma; many people assume that it’s only for kids, mainly because we are used to seeing activities like these in children’s museums. Part of our design process is to evaluate visitors’ experiences through observation, interviews, and counting. We’ve learned that half of our participants are adults and that there is a recurring theme in their comments about why they participate. So, why do adults flock to C3 to draw, write, make, and talk about art? Because it connects them to a childlike curiosity and creativity that, in adulthood, often takes a backseat to other responsibilities and tasks.


In January 2014, we installed a large table in the middle of the C3 Gallery that hosts three activities rotating on a monthly basis. As part of the evaluation of these activities, we interviewed visitors about their motivations for participating, their past experience with art making, and the value they felt they derived from participating in a making activity at the Museum. I was repeatedly intrigued by the responses of the adult participants.

For example, I spoke with a couple participating in a portrait drawing activity that encouraged close looking at the proportions and scale of the human face. The couple, in their mid-thirties, each claimed to have no artistic experience. Through our conversation, they divulged that they had both graduated from arts-based college programs. “I went for fashion, like a BFA in Design, and he went for Graphic Design. We don’t really draw in our free time though, I mean, he does for work,” the woman stated as she looked over at the man who accompanied her. He added, “Yeah, but just on the computer.” Then the woman broke in, “And I do for work, but it’s not the same. Like, I do fashion sketches, not this kind of drawing.” I prodded them a bit to understand what “this kind of drawing” meant. “Well, it’s like… it’s fun. Like drawing before was so serious and it had to be perfect, cause you were doing it for a grade. But this is just for enjoyment.”

This idea was reinforced by further conversations with other adult participants: drawing, making, and discussing in C3 is fun in a freeing kind of way. I interviewed another thirty-something couple drawing at a light box activity designed to assist in the making of hybrid imagery. The man began with, “I’m guessing this was made for children? It’s fun and different and I didn’t expect to see this here.” The woman with him agreed, “Yeah, it’s like that spark of creativity, kind of… childlike. I didn’t think I’d spend as much time or get into it like I did.” A sixty-something man participating in the portrait drawing activity remarked, “I used to take art classes, but it’s been so long ago… it’s like I forgot that and I saw this and I remembered.” This feeling of nostalgia for something that is no longer a part of someone’s everyday life was also a common response from adults. Many adults responded that they enjoy drawing or making but, “don’t do it enough.”


Aside from drawing-based activities, the Center for Creative Connections also has a drop-in art making area with large communal tables called the Art Spot, which we say is the place for “anytime art-making for everyone.” We invite visitors to explore their creativity by making creations out of unexpected or everyday materials. Every two months we change the materials and provide a prompt to inspire ideas. Each time I’m hesitant and wonder, “What will people make with this?” But I am always delighted and surprised by the imaginative creations that are made and left behind. Children often come to C3 and head straight for the Art Spot, while adults can be a bit more tentative. However, regardless of age, most visitors stay anywhere from five minutes to two hours, with an average of about twenty minutes. Once they gather their materials, they become immersed in their creations. For some it is a hands-on problem-solving activity, while for others it is about manipulating materials. How can you combine these objects (cups, spoons, paperclips, wire, egg cartons, cardboard, etc.) into something unique and surprising? This kind of open-ended activity, reminiscent of childhood playing and pretending, is not often made available to adults. I frequently watch my eight-year-old daughter take something like a toilet paper roll and turn it into a piano for her dollhouse, or repurpose a cardboard box to make an enormous rocket ship. This nostalgia for childhood play was brought perfectly into perspective by a recent Art Spot creation.


At the DMA, learning can take many approaches and forms. We strive to be inclusive so that we can reach visitors with a multitude of interests and experiences and preferences for learning. In the Center for Creative Connections, our mission is to engage visitors of all ages with works of art and the creative process of artists. We hope that by designing participatory ways to learn we will provide fun and playful activities for all of our visitors, regardless of their age.

How Are You Engaging Adults in Unexpected Ways?

As museum education steps further away from a traditional didactic style and toward an inclusive approach that attempts to reach a multitude of interests and learning styles, the question of how to engage adults is at the forefront of many educators’ minds. What if our adults want a lecture? What if they shy away from participatory activities? Will we isolate a large portion of our adult audience by trying a new approach? These are valid questions to consider, but making a change does not mean making a 180-degree turn; rather, consider offering varied opportunities, including these kinds of childlike, playful activities. How are you engaging adults in unexpected ways? What successes and struggles have you come across as you experiment with offering new adult experiences?

Read more about the Dallas Museum of Art’s education programs, community outreach, and explorations in creativity through their educator blog: DMA Canvas.

About the Author

JESSICA FUENTES: Center for Creative Connections (C3) Gallery Coordinator, Dallas Museum of Art. Jessica received her MA in Art Education from the University of North Texas. Her thesis was a collaboration with her then six-year-old daughter to explore self-guided family experiences in art museums. Jessica’s daughter remains an important resource in her work developing interactives and activities which provide opportunities for visitors of all ages to engage with works of art through drawing, making, and discussion. Jessica is also an artist and a member of 500X Gallery, one of Texas’ oldest artist-run cooperative galleries. In her down time, she can usually be found with her daughter enjoying an art museum or making art in their home studio. Jessica’s postings on this site are her own and don’t necessarily represent the Dallas Museum of Art’s positions, strategies, or opinions.

We Don’t Need New Models, We Need a New Mindset

Editor’s Note: I have been following EmcArts ever since they announced their first round of Innovation Labs for Museums back in 2011, and have had the pleasure of meeting with their staff as well as those working with the ArtsFwd initiative. I was also fortunate enough to be invited to attend the Association of Art Museum Directors meeting this summer in Dallas, where Richard Evans gave a great presentation on innovation as part of that organization’s thinking around education. The team at EmcArts and ArtsFwd is working to help us break with our patterns of “business as usual” and develop new capacities and mindsets to tackle the major adaptive challenges facing museums in the 21st century. The post below by Karina Mangu-Ward does a fantastic job of highlighting this shift in practice and ‘mindset,’ to use her word, and I thought it was worth sharing with the ArtMuseumTeaching community as a way for museum professionals at all levels of their organizations to reflect on the models and mindsets underlying our practice, as well as the real challenges we face. I invite readers to comment below about how you see these models operating at your institution, and how you might help support change toward a new mindset in museums.

Written by Karina Mangu-Ward, Director of Activating Innovation at EmcArts

Reposted from the blog at ArtsFwd, an online community of arts and culture leaders committed to doing things differently in their organizations in order to stay relevant and vital in a changing world.

I’m thrilled to announce that I’ve been selected as a guest at the Dinner-vention 2, organized by Barry Hessenius of Barry’s Blog and WESTAF. On October 9, I’ll join seven other dynamic, forward-thinking leaders in the arts to discuss some of the most pressing challenges across the field. I’m looking forward to meeting everyone and engaging in what should be a spicy conversation.

To prepare for the Dinner-vention, Barry asked all of us to capture our preliminary thinking in a briefing paper that responds to the topic: “Broken Models: Picking Up the Pieces and Moving Forward.”

I’ve shared my briefing paper below. I encourage you to read the papers of the other seven guests, which you can find here.

What’s a model, exactly?

I’m a very literal person, so the first thing I did when tasked with this briefing paper was look up the definition of “model.”

Model (n): 1) A standard, an example for imitation or comparison

OK, got it. A model is like a blueprint. Or a recipe. So, this Dinner-vention is a debate about standard or best practices in our field. We’re taking a long hard look at the routines we’ve replicated again and again because they work, or at least they’re supposed to, or they once did.

What models are we questioning?

My next step was to plainly state what I see as the old model in each of the areas Barry mentions (plus I added strategic planning, evaluation, and artistic development).

However, I assume every model evolved to meet a particular challenge. So I also tried to name the challenge I think we’re facing right now in that area. For me, there’s nothing worse than poor problem definition. We can reform our models until we’re blue in the face, but that’s useless unless we get clear about the future we want and the challenges we’ll face in getting there. Only then can we answer the question: why aren’t our models working?

I think this was a useful exercise, so I’ve shared the results below. It’s wide open for debate. My hope is that it serves as a starting place for a shared understanding of the standard practices we’re questioning and the real challenges we’re faced with as a field, so that we can begin to understand whether our approaches are the right ones.

In each case, I see a stark disconnect. The old models we’re using aren’t matching up with the deeply complex challenges we’re faced with right now.

Income/Revenue

  • Old model: Ticket sales + government + foundation + corporate + wealthy patrons + small donors + endowment income = Balanced budget
  • New challenge: To generate new sources of sustained revenue and capital

Audience development

  • Old model: Sell subscriptions and market shows
  • New challenge: To engage new and more diverse groups of people in meaningful arts experiences

Governance

  • Old model: Give/get boards focused on fiduciary oversight and maintaining stability
  • New challenge: To cultivate boards that are partners in change

Evaluation

  • Old model: More ticket sales, more revenue, bigger budget, nice building = Success!
  • New challenge: To evaluate the success of our organizations based on the value they create in people’s lives

Leadership development

  • Old model: Attend leadership conferences and seminars, build your network, wait for your boss to finally leave/retire/die. (Alternatively, change jobs every year.)
  • New challenge: To develop a generation of new leaders equipped with the tools they’ll need to tackle the wickedly complex challenges the future has in store

Artistic development

  • Old model: MFA programs, residencies, commissions, occasionally a grant, get a day job
  • New challenge: To support artists in making a living and a life

Strategic planning

  • Old model: Decide where you want to be in 5 years. Outline the steps to get there in a long document no one will read.
  • New challenge: To plan for the future in a way that allows us to stay close to our core values and make incremental improvement while also making room for experimentation, failure, and rapidly changing conditions.

Funding allocation

  • Old model: The money goes to whomever the funder says it goes to. Usually bigger organizations run by white people in major cities.
  • New challenge: To distribute funds in a way that is equitable, geographically diverse, and creates the most value

Note: I decided I was too ignorant in the areas of creative placemaking, advocacy and arts education to weigh in. I’ll leave that to my colleagues.

Here’s my main argument

Over 60 years in the field, we’ve developed standard practices, or models, in all these different areas. They worked for a while. Now they don’t. This has given us a false notion that we need new models in each area. This is wrong.

Models, best practices, recipes, and blueprints work only when your challenge has a knowable, replicable solution. Sure, there are some challenges that fit this mold. I’d argue that having a great website, designing an effective ad, running a successful crowdfunding campaign, and producing a complicated show are all challenges where best practices, models, and experts are really valuable. You might not know the solution, but someone does, and you can find it out.

But what happens when there actually isn’t a knowable solution to your challenge? When there is no expert, no model to call upon? When the only way forward is through experimentation and failure?

I’d argue that every one of the big challenges I name above falls into the realm of complexity, where the search for replicable models is fruitless. There isn’t going to be a new model for generating revenue that the field can galvanize around that will work for every or even most arts organizations. Nor is there going to be a long lasting model for community engagement that can be replicated by organizations across the country. For the deeply complex challenges we face today, there simply isn’t a knowable solution or model that can reliably help us tackle them. These kinds of challenges require a new way of working.

We don’t need new models, we need a new theory of practice

Instead of new models, I’d argue that we need a new theory of practice, one that champions a different set of priorities in how we do our work.

Our old models imply a vision of success that’s rooted in growth, stability, and excellence. They drive us towards efficiency and competition by perpetuating an atmosphere of scarcity. They are not as creative as we are.

What if a new vision of success in our field could prioritize resilience, flexibility, and intimacy? What if we could be enablers, not producers? What if we could harness the abundance of creative potential around us?

This new vision of success doesn’t demand consensus around a new set of standards, best practices, or “examples for imitation”; it demands a new way of thinking and acting that empowers us to shift and change our routines all the time, as needed.

A proposed theory of practice for the future

Here is my call to the field: a proposed set of practices that align with the world as it is today, not as it was before:

  • Let’s get clear about the challenges we’re facing and, if they’re complex, treat them as such.
  • Let’s ask hard questions, listen, do research, and stay vulnerable to what we learn.
  • Let’s question our assumptions and let go of what’s no longer working.
  • Let’s embrace ambiguity and conflict as a crucial part of change.
  • Let’s bring together people with different experiences and lean into difference.
  • Let’s experiment our way forward and fail often.
  • Let’s recognize the system in which we’re operating.
  • Let’s rigorously reflect and continuously learn.

In conclusion

When I set out to write this post, I wanted to question the premise that a conversation about “broken models” could even be useful in a time when expertise, excellence and replicability are the values of the past. I wanted to propose that we move past the very notion of models – let’s jettison the word itself from our vocabulary.

In the end, I guess you could call what I’ve proposed a kind of “new model.” But I’d rather think of it as a new mindset.

Read more about Innovation Stories, the National Innovation Summit, and tools & activities you can use in your own organization by connecting to the ArtsFwd blog.

*     *     *

About the Author

KARINA MANGU-WARD: Director of Activating Innovation for EmcArts, Inc. Karina leads the development of ArtsFwd, an interactive online platform that extends learning about innovation among arts leaders and organizations nationally and internationally. She took on the role of Director of Activating Innovation in August 2011. In addition to her work at EmcArts, Karina is a New York-based producer and filmmaker, whose projects include an original web series, an interactive online documentary series, promos, how-to videos, and event videography. She received her MFA in Theater Management & Producing from Columbia University, where she wrote her thesis on the strategic use of online tools and technologies for arts organizations. She holds a BA from Harvard College.

How Can Museums Change Teens — and Vice Versa?

Written by Chelsea Kelly, Milwaukee Art Museum

Over the past four years, I have worked with hundreds of Milwaukee-area teens who love art, and who, over their time in teen programs at the Milwaukee Art Museum, grow to love museums as well.

I have always had a sense that my students grow over their time at the Museum. This year, though, to really study that growth, we designed our longstanding Satellite High School Program as a year-long experience to explore exactly how weekly sessions at an art museum might change the thinking of our teen participants. To that end, our program outcome for students was that they would show an increased ability to reflect upon their own experiences and performance.


This means I’ve been thinking a lot about evaluation: how do we show that change was made? Years ago, I thought evaluation was more or less a prickly, black-and-white, necessary evil that forced me to use altogether too much math. But over the past two years, I’ve come around to believing evaluation is completely the opposite (though math is still important!). Evaluation is a grey area—much like teaching and interpretation—and we as educators need to use multiple methods in order to get a fuller picture of what’s going on with our students. Further, these methods can be tools to help our teaching, improving our programs and our impact on students.

In the end, I found I needed to use reflective practice myself to understand how my students were changing, and to explore and experiment with a number of different methods for articulating their growth. In this post, I’ll share a few of the methods we used in the Satellite High School Program this year to explore how our teen interns changed through reflective practice.

First… What is Satellite?

The Satellite High School Program is a year-long internship for sixteen teens, ages 16 to 18, from diverse high schools all over the Milwaukee area. Once a week after school, they come together at the Museum and explore how art can be made relevant to our lives today. They participate in “object studies” (hour-long discussions on a single work of art), behind-the-scenes career talks with staff, and resume-writing workshops, and also mentor elementary school students in tours of the permanent collection.

Teens create a final project that has a real-world impact on the Museum. They choose a work of art in the Museum Collection, research it, and form their own interpretation of the piece. In past years, students have created responses in visual art, writing, or performance. This year, the students used iPads to create videos on their work of art, explaining what the work means to them and how it changed their thinking or art practice. You’ll see a few of those videos later in this post.

Friends, family, and teachers of the Satellite interns watch their final project videos at the program celebration. Photo by Front Room Photography

Core Evaluation

Let’s start with the core evaluation method we used for the program. We were lucky to work with one of our teen program funders, the Milwaukee Public Schools Partnership for the Arts & Humanities, and the University of Wisconsin-Milwaukee’s Center for Urban Initiatives and Research (CUIR) to develop the outcome above and to establish a tool to measure it.

We settled on one-on-one interviews, doing a “pre” interview on the first days of the program in October and a “post” interview on the final days of the program in May. Each student was privately asked the same set of questions in the pre- and post-interviews, meant to get at their ability to reflect on their experiences in the program. I scored each interview on a rubric that measured level of detail in their responses, and then we compared their pre-program score to their post-program score to see if they had improved.

At the end, every student did improve in their ability to reflect—their answers got significantly more detailed. As someone whose default is to be a more qualitative thinker, it was rewarding to use the rubric to see their interviews as data, in a quantitative, more tangible way.

But as helpful as this was, it’s still just one method of evaluation. Being able to explain in detail is certainly one aspect of successfully being able to reflect. But as I listened to their responses, and thought about what I had seen in the students over the course of the whole year, I realized there is much more to reflecting than just detail. Their responses used stronger vocabulary, they expressed sophisticated ideas, and they asked more and deeper questions. How could I articulate that kind of change?

Unexpected Data

Happily, along the way, we also found that we had collected some unexpected data which helped me more concretely see the change in my students.

Exit Slips

At the end of each session, teens used a web app on their iPads called Infuse Learning to fill out a quick exit slip survey. Exit slips are an easy way to take the pulse of your students at the end of a session. For Satellite, they answered the questions “What is something you learned today?” and “What are you still wondering about?” Though different from our interview questions, these also support reflective practice by prompting students to think back on the day’s session.

As the year went on, I noticed that the teens’ responses were growing more sophisticated: they were longer, they used more art vocabulary, and they realized that they might not be able to answer questions definitively, if at all. At the suggestion of Marianna Adams, who specializes in museum research and evaluation, I tried running these responses through two readability tests to see if that would quantify the sophistication of the responses. One test produces the sample’s Fog Scale Level, which weighs sentence length and the proportion of complex, polysyllabic words (a score of 5 being readable, 20 being very difficult). The other was the Flesch-Kincaid Grade Level, which approximates the average grade level necessary to read and understand the text.
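For readers who want to try this with their own response data, both scores are simple formulas over word, sentence, and syllable counts. Here is a rough Python sketch; the syllable counter is a naive vowel-group heuristic of my own, so scores will only approximate those from dedicated readability tools:

```python
import re

def count_syllables(word):
    """Naive syllable estimate: count vowel groups, drop one trailing silent 'e'."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def readability(text):
    """Return (Gunning Fog index, Flesch-Kincaid grade level) for a text sample."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    complex_words = sum(1 for w in words if count_syllables(w) >= 3)
    words_per_sentence = len(words) / len(sentences)
    # Gunning Fog: average sentence length plus percentage of 3+ syllable words
    fog = 0.4 * (words_per_sentence + 100 * complex_words / len(words))
    # Flesch-Kincaid: grade level from sentence length and syllables per word
    fk = 0.39 * words_per_sentence + 11.8 * (syllables / len(words)) - 15.59
    return fog, fk

fog, fk = readability("What even makes something art? Who decides?")
print(round(fog, 1), round(fk, 1))
```

Running a batch of early-year and late-year exit slips through a function like this is one quick way to chart the kind of shift described above, though, as the next paragraphs show, the numbers never tell the whole story.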

For the first question (“What is something you learned today?”), students’ scores jumped considerably on both the Fog Scale and the grade level. Since these tests measure syllable count, sentence length, and grade level, this corroborates what I found in the core evaluation.


But I was surprised to see that when I tested responses to the second question (“What are you still wondering about?”), students’ scores actually dropped! Yet if you read their responses, there is a drastic change—for the better.


Take Student D’s responses. In his early answer, he asks a relatively basic art historical question about distinguishing one type of art from another. In his later response, he is thinking deeply about the purpose of art and how we even decide what art is. And while Student F uses high-level art history vocabulary in her first response, it’s without context; later on, she’s thinking about how two seemingly opposite concepts may have something in common after all.

The scores of these comments may have decreased, but I’d argue that their reflective quality increased—the teens ask big questions that might not have an answer; they ditch high-level vocabulary to muse more informally on philosophical questions of art, destruction, and race. Running these responses through the tests helped me see, again, that while such tools can be helpful, each is ultimately just one tool—we need more than one to paint a bigger picture.

Videos

To round out that image, I’ll share one final unexpected evaluation tool: the teens’ final project videos as well as a talkback session they conducted at their video premiere.

For their final project, each student chose one work of art in the Museum Collection and looked at it, researched it, and talked about it with others for seven months. (Given that most visitors spend under 10 seconds looking at art in museum galleries, this is a feat in and of itself!) They distilled a school year’s worth of thinking into brief, 2-4 minute videos that answered what the work meant to them, what it had meant to others, and how their own thinking had changed as a result of looking at the piece—all questions with, of course, that familiar reflective bent.

The teens also participated in a talk-back/Q&A at the celebration where we premiered these final projects. Guests—museum staff, teachers, family, and friends—asked the group questions about their experience. If you like, you can watch the teens’ videos, along with the Q&A, in the YouTube playlist below.

Impact — Can Museums Change Teens?

So: does all the above—interviews, exit slips, readability tests, and final projects—add up to a full image of the impact that a year’s worth of reflective practice can have on students?

Brandon answers a question during the Satellite premiere Q&A session. Photo by Front Room Photography

I’m not sure we can ever paint a full picture of student growth in intensive programs such as this one. I do think combining all of these tools can help, though—especially if the evaluative tools actively support the goal of the program. The interviews, exit slips, and activities were all intentionally structured to be reflective, related to the outcome itself. This relevancy was key, not only in genuinely evaluating the program’s success, but also in supporting the students’ abilities through the methods themselves. It’s also important that we educators make the program goal transparent to the students. The Satellite interns knew from the beginning that they were working on reflective ability—this helped prime them to think reflectively from the get-go.

As far as impact beyond reflective capacity, I also want to share a few quotes from the teens themselves about their time in this program:

“The videos help us think deeper about what we do—so even in school I think deeper about what I’m doing or why this was made or why this happened.”
“I learned that I shouldn’t judge a book by its cover. When I first saw my piece I just thought it was a bunch of different colors and didn’t really think about it actually having a meaning. But now I’ve learned that it actually has a super cool meaning behind [it], and I never would have learned about that meaning if I hadn’t taken the chance to explore. So I shouldn’t be so quick to judge.”
“We had to give tours and I found out that I really like to work with children and art at the same time. I would like to pursue a career in art education for elementary school students.”
“I was able to change and evolve my way of thinking, now being able to look past the obvious… I learned that art holds all the answers to any questions anyone may have, you just have to search for it.”

From the other evaluation tools, we saw that the students developed their ability to reflect on themselves and their own performance. But as seen in the comments above, they were also able to develop skills reflecting on the world beyond them—the world of art history, their future careers, how they interact with other people. All of these are ways of thinking that are valuable for their futures, as they go to college, discover their passions, and pursue meaningful career opportunities.


Can Teens Change Museums?

I’ve shown how this program helped these students grow in many ways. What about the Museum itself? Have these students had an impact on our institutional practice?

Institutions move at a slower pace than most programs, and if change and impact are complex to measure in sixteen individual students, then the complexity is multiplied tenfold for an organization that serves hundreds of thousands of visitors a year. Even so, over the past few years, the work of teens in our programs has slowly but surely worked its way into the daily fabric of the Museum. Teens have interviewed artists on behalf of the institution. They have advised docents on ideas for giving tours to high schoolers. Their video projects will be part of on-site and online Collection Resources at the Museum, as well as our Archives, for all visitors to access while learning about works of art.

Ultimately, evaluation and impact are ongoing, a grey area that has a lot in common with the act of teaching itself. When done well and intentionally, evaluation doesn’t just show if we’ve met a goal. The tools we use to evaluate ideally become part of our teaching practice, because they reinforce the very abilities we are trying to help our students develop.

As for what I’m still wondering about? This year, our evaluation methods for the most part required the teens to have specific existing skills, such as writing for the exit slips or proficiency in using an iPad (though we did have video-making workshops as part of the program). I’m thinking about other ways to gather data holistically. For example, given that many of our evaluation methods emerged from teaching tools, should I document or film our discussions of works of art and find ways to analyze them? I’d love to hear any ideas or tools you’ve used to evaluate your programs, just as I hope this post has inspired you to take a fresh look at your teaching practice and find unexpected ways to see the growth in your participants.

*     *     *     *     *

ABOUT THE AUTHOR

CHELSEA EMELIE KELLY: Manager of Digital Learning at the Milwaukee Art Museum, where she develops educational technology initiatives and oversees and teaches teen programs. She is passionate about using gallery teaching and technology to foster relevancy for art museums in the 21st century. She has previously worked at the Frances Lehman Loeb Art Center, the Metropolitan Museum of Art, the Frick Art & Historical Center, and the Carnegie Museum of Art. Chelsea is a graduate of Vassar College and holds an M.S.Ed. in Leadership in Museum Education from the Bank Street College Graduate School of Education, where she was a Kress Foundation Fellow. She is also the founder and co-editor of The Art History Blog. Chelsea’s postings on this site are her own and don’t necessarily represent the Milwaukee Art Museum’s positions, strategies, or opinions.

Evaluation Can Be Fun

Written by Marianna Adams, Audience Focus; 2014 Educator-in-Residence at the Isabella Stewart Gardner Museum

Cross-posted from mariannaadams.blogspot.com

One of the great luxuries of my time here at the Gardner Museum has been the opportunity to have rather leisurely and unstructured conversations with museum educators here and at other museums in the Boston area. I appreciate the value of not always having an agenda and not needing to solve a problem. We bounce ideas off each other, and I always come away with a fresh perspective, a deeper conviction in my intuition, and lots of new ideas. Our talks often meander around the relationship between a museum experience or program and how we choose to evaluate it. A few themes have emerged from the conversations so far.

There is Life Beyond the Survey

Over the years I have not made a secret of how much I dislike written questionnaires, paper or online, despite how often I end up using them on evaluation projects. Why? The written survey is the most difficult methodology to do well. It’s the default methodology most people reach for when planning an evaluation, and most surveys are tedious and poorly focused. A survey is a blunt instrument that cannot capture much in the way of subtlety and nuance (and life is so much about nuance). In recent years, with the plethora of online survey programs, we are drowning in surveys, so survey fatigue is a reality. Most surveys really ask the visitor to tell us that we did a good job (e.g., How satisfied were you with this experience?) and not enough about how the visitor values or benefits from the experience. Besides, the written questionnaire usually does not reflect the spirit of the experience we’re trying to evaluate, which brings me to my next point.

Match the evaluation method to the experience.

Imagine yourself at a museum’s “evening hours” event. There is a great band, wine, engaging activities going on throughout the galleries, good friends, and a happy crowd of people of all types and ages. The atmosphere is relaxing and energized at the same time. As you stroll toward the door to leave the museum, someone hands you a piece of paper. It’s a survey asking you to evaluate the time you just had, and it smacks you out of the pleasant, liminal state you spent several hours dropping into. That’s an example of a survey methodology that is not well matched to the quality of the experience.

So what methodology might better align with the evening program experience you imagined yourself attending above?

First you start with what you want to know and why.

So often we select the methodology before we figure out what we want to know and why. We decide on surveys or focus groups when those may or may not be the best ways to collect the data. Often we collect more data than we know what to do with. Here’s an example that came up in a recent conversation:

Like many art museums, the Gardner offers several community nights with free admission throughout the year, and these events are very well attended. Primarily, the Gardner wants to know whether these events are indeed attracting people from communities close to the museum. Yes, we could easily get zip code information via a written questionnaire. The problem is that we tend to throw in a lot of other questions we don’t really need answered. The other area of inquiry the Gardner is curious about revolves around how visitors connect to the museum. So let’s keep those two data points in mind, residence and connection, as we think about how to get useful information.

Think creatively about ways to get that data and match it to the spirit of the experience.

How could we get zip code data and not make people fill out a survey?

Imagine a big map (maybe near the wine bar because most everyone would go there), with zip code areas and neighborhoods clearly identified. Give people a small colorful adhesive file folder dot and invite them to put it on their zip code. It becomes a fun, social activity and, for some reason, people like to find themselves on a map. It’s simple and inexpensive. At the end you have a picture of the zip code distribution of your audience. You could do this for other evening events and compare the maps.


What about the ways visitors connect to the museum?

One methodology that I love to experiment with is embedded performance assessment. This means that visitors don’t realize they are providing evaluation data, even when we tell them, because the process is engaging on its own. At a workshop for the Gardner Museum education staff this week, artist-in-residence Paul Kaiser inspired us all to explore new ways to engage visitors and possibly end up with some interesting evaluation data.

Paul first introduced us to the concept of collaborative writing, using the example of Japanese renga poetry. He then provided us with a set of words — rising, distant, enclosed, fold, release — and asked us to take the spirit of renga into the galleries, substituting objects, spaces, or views for the verses, based on that set of words. We did it and were struck by how beautifully the experience honored the spirit of what Mrs. Gardner did in the ways she arranged objects to suggest ideas or relationships.

We played with ways to use this activity with visitors, discussing how to engage families and adult visitors at community nights in something similar. Perhaps if we created a more playful set of words to match the feel of these events, visitors would find it enjoyable. We brainstormed the idea of a place where people could post their responses and read what others had written. These responses could be a rich data source that helps us better understand the ways visitors make connections to the museum. We were jazzed!

What are some unconventional ways that you have collected rich and useful data about the visitor experience?

OTHER POSTS IN THIS SERIES:

Towards a More Mindful Practice

Falling in Love with Your Visitors

Barriers to Family Engagement in Museums

*     *     *     *     *

ABOUT THE AUTHOR

MARIANNA ADAMS is President of Audience Focus Inc. Her professional roots began in K-12 public and private school teaching (fine art, English literature, social studies, and special education) and segued into museum education, where she headed several education departments in Florida museums. She founded Audience Focus Inc. in 2007 after 12 years of conducting evaluation, research, professional development, grant proposal writing, and concept development for the Institute for Learning Innovation. Her degrees are from George Washington University (Ed.D.), University of South Florida (M.A.), and Mercer University (B.A.). In her spare time she is an avid yoga practitioner and teacher.