A research team at the University of Wisconsin explored a key education research question: Which Web mapping technology should students learn? Executive Editor Adena Schutzberg shares the results and explains why they may suggest a process for those outside of academia to find the “optimal” Web toolkit.
Change Is Inevitable: Deal with It
If you are resting on your laurels with a Web mapping award for a Flash-based entry, Robert Roth and his team from the University of Wisconsin have news for you. Once Apple decided to forgo support for Flash on its “i” devices, the technology no longer worked across all mobile platforms. Roth, who teaches introductory and advanced cartography courses, has to deal with changing technologies regularly. That means not only rewriting labs to match new or updated technology, but also ensuring that his students leave the university with the skills the job market demands. One question comes up repeatedly: Which Web mapping technology should students learn to best prepare them for their careers? That prompts a more basic question: Can we develop a process to determine the best technology for a given mapping problem?
Those are two of the underlying questions in Roth’s academic research, but they have implications for other users of GIS. Wouldn’t government, private industry or GIS consultants value a process to help tease out the “best” tool for the job? While the original research was geared to answering the question for the university, I think there’s a lot of value in this and future work for the rest of the geospatial community.
Roth’s research team includes Richard Donohue, the laboratory instructor for Roth’s class; Tanya Buckingham, assistant director of the UW Cartography Lab; as well as two graduate students/research assistants in the lab, Timothy Wallace, currently the graphics editor at the Huffington Post, and Carl Sack. It was Donohue’s slide deck (pdf) from a presentation at the 2012 North American Cartographic Information Society (NACIS) Annual Meeting that prompted me to interview Roth on this work.
Fully formed, the research questions boil down to four, which Donohue visualizes as a pyramid (Figure 1).
Figure 1: The pyramid summary of the four research questions
At the bottom is question 1: What technology to use (“development” in the pyramid)?
This is a very practical question and will determine what technologies students will touch in the classroom. University of Wisconsin cartography students have used Adobe Flash/Flex and supporting software products for the past 10 years.
Above that in the pyramid is question 2: What is Web mapping (“design” in the pyramid)?
Today’s Web mapping, Roth explained, is a mix of representation and interaction. That makes for a broad definition of design.
Above that in the pyramid is question 3: How do we teach Web mapping (“teaching” in the pyramid)?
The “how to” of teaching technology, with a focus on the underlying principles and problem solving needed in real-world jobs, is key to a valuable student experience.
And, at the pointy top of the pyramid, is the broadest question, question 4: How do we cope with change (“process” in the pyramid)?
The ultimate goal of Roth’s research is a process or workflow that an organization (in this case, the university) could repeat as needed to determine the current “optimal option” for Web mapping.
The research process involved three parts:
- Competitive Analysis Study
- Needs Assessment Survey
- Diary Study
Competitive Analysis Study
The Competitive Analysis Study involved the creation of a matrix with possible technologies on the left vertical axis and the mapping functions across the horizontal axis. The functions were organized into four categories: basemap, representation, interaction and mobile. Each grid cell was color-coded based on whether, or how well, the technology supported that function. The legend (Figure 2) reveals the range of possible values.
Figure 2: Cell coding “legend” for the Competitive Analysis Study
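To make the grid concrete, here is a minimal sketch of how such a matrix might be encoded and ranked. The technology names, support values and numeric coding are invented for illustration; they are not the team’s actual data or scoring scheme.

```javascript
// Hypothetical competitive-analysis matrix. Support levels are coded as:
// 2 = full support, 1 = partial/plugin support, 0 = unsupported.
const functions = ["basemap", "representation", "interaction", "mobile"];

const matrix = {
  "Technology A": { basemap: 2, representation: 2, interaction: 1, mobile: 0 },
  "Technology B": { basemap: 2, representation: 1, interaction: 2, mobile: 2 },
  "Technology C": { basemap: 1, representation: 0, interaction: 1, mobile: 1 },
};

// Rank technologies by their total support score across all functions.
function rank(matrix) {
  return Object.entries(matrix)
    .map(([tech, support]) => ({
      tech,
      score: functions.reduce((sum, f) => sum + (support[f] || 0), 0),
    }))
    .sort((a, b) => b.score - a.score);
}

console.log(rank(matrix));
```

A summary score like this only narrows the field; as the rest of the study shows, the top scorer on paper is not necessarily the best choice in practice.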
Needs Assessment Survey
The Needs Assessment Survey involved approaching stakeholders across the university system’s dozens of campuses. The team asked questions about practical considerations related to the software (cost, support, etc.) and what a Web map should do (scale, support real-time data, etc.). There were also questions about experience and familiarity with the 35 technologies explored in the Competitive Analysis Study.
The practical consideration that topped the “essential” list was “maintenance/stability.” Interestingly, “cost/accessibility” did not show a strong position toward “essential” or “not needed.” The feature of Web mapping considered most “essential” was “interactivity.” At the other end of the spectrum was “animation,” which fell in the “not needed” category for most respondents. The vast majority of respondents were not familiar with most of the technologies explored. Among those that had been used by respondents in the last two years were ArcServer, Bing Maps API, Google Maps API, OpenLayers, MapServer and TileMill.
Using data from the Competitive Analysis Study and the Needs Assessment Survey, the team selected the top four performers for its particular situation: teaching. These four libraries moved on to the Diary Study, a hands-on trial by students.
In the Diary Study, graduate students were given a Web mapping scenario with representative learning objectives from Roth’s course. The assignment was deliberately far more work than could be completed in the 40-hour time frame provided; one goal was to distinguish the “low hanging fruit” from the “difficult tasks.” Four students each tackled one technology, and one student completed the study for all four, so there were two diary datasets for each package.
The students worked through the assignments and regularly recorded in a diary how they felt, choosing a key word from a list used in psychology tests (among the terms: blah, frustrated, okay, optimistic, excited, determined). They could then add as many other terms as they liked for a more complete description. Participants were also asked to estimate their current task completion rate, and the team documented each participant’s actual completion rate for comparison.
The data from the Diary Study were summarized in a few ways. Comparing completion estimates with actual progress revealed some significant gaps. Also interesting were large leaps in productivity linked, for example, to finding and learning to use a specific library component (Figure 3).
Figure 3: A drop, then quick recovery in task progress when students moved to a new version of Leaflet (0.4). (red: estimated completion, blue: actual completion) Terms along the bottom are self-reported answers to “How I feel.”
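The estimate-versus-actual comparison can be sketched as a simple calculation over diary entries. The entries below are invented for illustration; the hours, feelings and rates are not the study’s data.

```javascript
// Illustrative diary entries: completion rates are fractions of tasks done.
const diary = [
  { hour: 8,  feeling: "optimistic", estimated: 0.25, actual: 0.10 },
  { hour: 16, feeling: "frustrated", estimated: 0.40, actual: 0.20 },
  { hour: 24, feeling: "determined", estimated: 0.45, actual: 0.40 },
  { hour: 32, feeling: "excited",    estimated: 0.70, actual: 0.65 },
];

// Gap between a participant's self-estimate and measured progress;
// a large positive gap means the participant overestimated completion.
function estimationGaps(entries) {
  return entries.map((e) => ({
    hour: e.hour,
    gap: Number((e.estimated - e.actual).toFixed(2)),
  }));
}

console.log(estimationGaps(diary));
```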
The team also created word clouds for the terms selected during the use of each piece of software. I noted right off that “frustrated” was prominent in three of the four. I had to look hard to find it in the fourth cloud, the Leaflet cloud (Figure 4).
Figure 4: The “How I Feel” word cloud for Leaflet
Although the Google Maps API delivered on more of the requirements set out in the Needs Assessment Survey, the team selected Leaflet as the answer to the first question: which technology should be used for teaching. Leaflet was, in fact, second best in supporting the requirements, but the Diary Study suggested students made more progress and felt better working with it. The team suggests that might be due to the added transparency and control provided by a fully open source library.
The response to the design question, the one addressing what Web mapping is and what Web maps do, highlights the granularity of the current offerings. While most of the libraries can do most of the required tasks, students (and users in the real world) must combine atomic parts into molecular structures. Roth suggests that instead of having to build a tool from scratch that toggles a map from choropleth to proportional symbols, for example, a single higher-level call should be available in the library.
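As a sketch of what such a higher-level call might look like, the hypothetical `symbolize` function below switches a dataset between choropleth classing and proportional symbol sizing in one call. The function name, sample data and class breaks are all invented for illustration; this is not an existing library’s API.

```javascript
// Invented sample data: three enumeration units with attribute values.
const counties = [
  { name: "A", value: 5 },
  { name: "B", value: 20 },
  { name: "C", value: 80 },
];

// One higher-level call covering two common thematic map types.
function symbolize(features, mode) {
  if (mode === "choropleth") {
    // Assign each feature a class index from fixed breaks.
    const breaks = [10, 50, 100];
    return features.map((f) => ({
      name: f.name,
      classIndex: breaks.findIndex((b) => f.value <= b),
    }));
  }
  if (mode === "proportional") {
    // Scale radius by the square root of the value so symbol *area*
    // is proportional to the data value.
    return features.map((f) => ({
      name: f.name,
      radius: Math.round(Math.sqrt(f.value) * 3),
    }));
  }
  throw new Error(`unknown mode: ${mode}`);
}

console.log(symbolize(counties, "choropleth"));
console.log(symbolize(counties, "proportional"));
```

The point is the interface, not the internals: a student calling one function like this would be spared hand-assembling the classification, scaling and rendering steps separately for each map type.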
When it comes to teaching, Roth will continue to develop exercises that are technology agnostic. While there will be rewriting of exercises, the principles should remain the same.
As for the final question, the one related to creating a workflow to select the “optimal” Web mapping technology, Roth feels his team has laid the groundwork. The Competitive Analysis Grid is in a wiki that students will update continuously. As for the rest of the process, the goal is to streamline it into a single week of work.
Roth and his team are preparing this research for submission to a journal. We appreciate them sharing this early, informal look at the project.