Bringing It All Together: Evaluating The “Old Ironsides” 1812 Discovery Center
Evaluating a new exhibit is not a process that can be planned from start to finish. It takes improvisation, adjustment, collaboration, and a willingness to abandon methods that are simply not working. When evaluating The “Old Ironsides” 1812 Discovery Center, we sought to determine how visitors were interacting with the space and whether these interactions were leading to greater content understanding.
Are they going in?
First we had to determine the percentage of visitors who were entering the Discovery Center as part of their visit to the museum. Right away we discovered that our initial method, counting visitors as they entered the front doors and the door to the Discovery Center, was not the most efficient or thorough way to get the data we wanted. This led us to our first “snapshot” evaluation of the process. Instead of counting the flow of visitors, we began counting the total number of visitors in each part of the museum at half-hour intervals.
How long are they staying?
Next, we wanted to determine the average time that guests were spending in the Discovery Center. This was achieved by choosing groups at random and noting the time they spent in the center.
Who's in there and what are they doing?
To determine how visitors were engaging with the interactives and panels in the Discovery Center, we initially used our standard tracking method with demographic breakdowns (A=Adult, H=High school, M=Middle school, C=Children) along with breakdowns of the method of interaction (L=Looked at, R=Read, I=Used Interactively, G=Used as a group, C=Conversation, X=Not engaged). We quickly determined that data collected this way would not accurately capture the space's usage. Some interactives were being used for long periods of time by a single group. Because we were tracking groups at random, these interactives appeared to be used infrequently or not at all; the groups monopolizing them simply were not among those being tracked.
This led us to use a “snapshot” method for every interactive and panel in the Discovery Center. At 30-minute intervals, and later 5-minute intervals, we used the same demographic and interaction breakdowns to record visitors' locations and engagement. This solved the issues from the previous method and allowed us to see which areas of the Discovery Center were being used and how. This method made it easy to see where real hands-on, minds-on family learning was taking place. Areas that saw a lot of groups (G) using interactives (I) in conversation (C) with one another stood out as extremely successful in engaging families to have fun and learn together. This also allowed us to see which areas were not grabbing visitors' attention.
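To make the snapshot coding concrete, here is a minimal sketch of how records like these can be tallied per area. The area names and records are invented for illustration, not our actual evaluation data; only the demographic and interaction codes come from our tracking sheets.

```python
from collections import Counter

# Each snapshot record: (area, demographic code, interaction code), using
# demographics A/H/M/C and interactions L/R/I/G/C/X from our sheets.
# These sample records are illustrative, not real data.
snapshots = [
    ("gun_deck", "C", "I"),
    ("gun_deck", "A", "G"),
    ("gun_deck", "A", "C"),
    ("rat_hunt", "M", "I"),
    ("sail_panel", "A", "L"),
    ("sail_panel", "H", "X"),
]

# Tally interaction codes per area across all snapshot intervals.
by_area = {}
for area, demo, interaction in snapshots:
    by_area.setdefault(area, Counter())[interaction] += 1

# Flag areas where group (G), interactive (I), and conversation (C)
# codes all appear -- our marker of hands-on, minds-on family learning.
for area, tally in by_area.items():
    family_learning = all(tally[code] > 0 for code in "GIC")
    print(area, dict(tally), "family learning" if family_learning else "")
```

A real instrument would of course carry timestamps and group sizes as well, but the same tally-per-area idea applies.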
Are they learning anything?
Now that we knew who was using the Discovery Center and how they were using it, we wanted to determine what content, if any, visitors were taking with them after their visit. We chose to develop a game to get at this question, reasoning that a more engaging method would elicit deeper answers and make visitors more likely to participate. The game was presented as a way for families to compete to see who knew more about the War of 1812. Groups were presented with a piece of paper with “War of 1812” in the middle and space at the top for demographic data (elementary school, middle school, high school, adult) and for whether they had visited the Discovery Center. Participants were then given one minute to write all relevant words or phrases that came to mind related to the War of 1812. We then tallied their responses and named the winner.
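The scoring itself was a simple tally. As a sketch, assuming each group's sheet is reduced to a list of relevant responses (the team names and responses below are invented for illustration):

```python
# Each entry pairs a group with its one-minute list of relevant responses.
# Names and responses are invented for illustration, not actual game data.
responses = {
    "Group A (visited Discovery Center)": ["Canada", "rats", "impressment", "frigate"],
    "Group B (did not visit)": ["Britain", "ships", "cannons"],
}

# Score each group by its number of relevant responses and name the winner.
scores = {group: len(words) for group, words in responses.items()}
winner = max(scores, key=scores.get)
print(f"Winner: {winner} with {scores[winner]} responses")
```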
Although a few responses given by those who had been in the Discovery Center reflected their experience there (“Canada,” “Rats,” etc.), most responses were not indicative of whether people had improved their understanding of the War of 1812 by visiting the center: the quality of the responses (their relevance and depth of knowledge) did not correlate with whether the visitor had been in the Discovery Center.
What did they think about the experience?
For the final stage of the evaluation we conducted 52 Old Ironsides 1812 Discovery Center exit interviews, built around five questions. The first two asked visitors to recall one new fact they had learned from the Center and to name their favorite part (panel, interactive, or object) of their experience. The third asked whether there was anything they would have liked to have seen or learned more about. The fourth question sought to establish how visitors would define the space and who they thought it predominantly catered to; early observations suggested that visitors were seeing the space as almost exclusively for young children, and this question would help determine whether that assessment was accurate. The final question asked them to reflect on their visit to the Discovery Center and their general impressions of the space.
Collecting lots of data is great, but it is only the beginning. Next we will be digging in and making sense of it all. Most importantly, we will determine how this information should shape the adjustments needed to make visitors' experience in the exhibit as rich and engaging as possible.