With a short week due to the holiday, the past few days were a mix of wrapping up winding-down projects and starting new ones. One project that has been consistent over the past few weeks is an internal project for the data team: a web application for visualizing data about the company’s subscriber population. Though it is mostly the data team’s assignment, the design team has now been tasked with improving the overall user experience (UX).
This internal site is written in R and uses Shiny to create a web application; R is a programming language for statistical computing and data analysis, and Shiny is an open source R package for building interactive web applications. While our data team had taken the time to lay out the application so that it was usable, they did not have the time to customize or optimize the framework. I have never worked with R or Shiny, but there is extensive documentation on customizing an application’s appearance (a challenge for another week). The task for this week was to put the application through usability testing to see how our internal teams use this data to find what they need to know. Usability testing is part of UX design: it checks that the site actually lets users achieve their goals.
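To give a sense of how such an app is structured, here is a minimal sketch of a Shiny application along the lines described. The dataset, column names, and the inline CSS are all illustrative assumptions, not details from the actual internal tool; the `tags$style` line shows one route Shiny’s documentation offers for customizing appearance.

```r
library(shiny)

# Hypothetical subscriber data; the real tool's dataset is not shown in the post.
subscribers <- data.frame(
  region = rep(c("East", "West"), each = 50),
  tenure_months = c(rnorm(50, mean = 24, sd = 6), rnorm(50, mean = 18, sd = 5))
)

ui <- fluidPage(
  # Appearance can be customized by injecting CSS into the page head
  tags$head(tags$style(HTML("h2 { color: #333333; }"))),
  titlePanel("Subscriber overview"),
  sidebarLayout(
    sidebarPanel(
      selectInput("region", "Region", choices = unique(subscribers$region))
    ),
    mainPanel(
      plotOutput("tenure_plot"),
      verbatimTextOutput("tenure_summary")  # summary stats alongside the graph
    )
  )
)

server <- function(input, output, session) {
  # Reactive subset: recomputed whenever the region selection changes
  filtered <- reactive(subscribers[subscribers$region == input$region, ])
  output$tenure_plot <- renderPlot(
    hist(filtered()$tenure_months, main = "Tenure (months)", xlab = "")
  )
  output$tenure_summary <- renderPrint(summary(filtered()$tenure_months))
}

# shinyApp(ui, server)  # launches the app locally when run interactively
```

The call to `shinyApp()` is commented out so the file can be sourced without starting a server.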
The task for this project was to bring in different internal users and observe how they use the site. The first step was to make sure that all potential users were represented, so the final interview list consisted of Sales, Editorial, and C-level executives. My job was then to take an example usability test script and adapt it to these interviews so that we answered all the questions needed for the redesign. The challenge was actually conducting the interviews while following the script effectively: I had to cover all of the necessary questions, but also draw on earlier interviews by folding in new questions that people had raised. For example, the idea of an email alert had come up, and I needed to adjust the script so we could see whether that would be helpful for other users.
For the interviews I followed a five-step process:
1. Explain the purpose of the meeting – The key was to make sure that the user understood the product without being given too much information. If I explained too much, they wouldn’t be able to form their own assessments or judgments.
2. Ask the user about his/her goals – I found this to be a very important part of the process because it brought to light new requirements. While I had a general idea of user goals, their answers always had a few things I hadn’t considered.
3. Find out how the user currently tries to achieve these goals – This step helped me understand how often the user would be using the system and which frequently used pieces needed to be more easily accessible.
4. Test the UI of the tool – Testing the UI was the meat of the interview. It showed how the user currently uses the tool and where they run into problems. It also let the interviewee critique the current layout and verbalize their thought process while using the interface.
5. Find out if anyone else should be interviewed – This question helped with defining the users to be tested and let me know who they thought might have a different take on the application.
While there are still a few more interviews to go, each one so far has brought to light new issues and goals. The next step after the user interviews is determining the requirements for the new design and prioritizing each one. After that, we will either mock up pages, if the changes are dramatic, or begin implementing the adjustments directly.
As for the interviews so far, some large takeaways include:
- Graph sizes should be minimized because users want the data first and foremost.
- Summary statistics can be more powerful than all of the data at once.
- Some features may seem intuitive, but without direction, details can be overlooked.