Overview
Project:
Concept Design & Prototype for a Kiosk-Based Conversational UI for Human Services

Role:
Student, UX Designer (co-created with Shannon Badiee as part of our Master's course of study)

Tools & Skills:
Sketch, InVision, JASP, Statistical Analysis, Experimental Design, User Research, Usability Testing, Ergonomics, Visual Design, User Experience Design

Description
This project was co-created by Shannon Badiee and myself as part of our Master's course of study at Iowa State University. We were tasked with finding a problem in the ergonomics space and creating a solution that would benefit society. For our challenge, we chose to address San Francisco's homeless epidemic and the city's lack of homelessness-related infrastructure.
In the last five years, many San Franciscans have lost their homes due to a sharp rise in real estate and rental prices. According to the 2017 homeless census, 7,499 people in San Francisco are displaced (Spotswood, 2017). San Francisco has the second-largest homeless population in the United States, behind New York City (Sze, 2016). With this homeless crisis come challenges and confusion: how do homeless people get help, especially if they don't have regular access to the internet to search for specialized services? How do they find a bed for the night, a place to eat, or health and addiction-related treatment?
In response to these questions, we designed a purpose-driven, easily accessible conversational user interface focused on human services for homeless needs. Our interface directs homeless individuals to the local resources they need, including food, shelter, jobs, and specialized health services. In execution, our interface would be housed within a conceptual public kiosk that would also supply San Francisco residents and tourists with free, self-service Wi-Fi, device charging, and local business information.
Our project consisted of research and persona creation, task analysis and user flow creation, design of both a voice and a visual user interface, a research plan for testing our UI, analysis of the research results, and revision of the design. We also created a visual mockup of what our kiosk might look like and what attributes it might bear to fit in with San Francisco's culture.

Opportunity:
When you are homeless, the world does not always treat you kindly. You may face issues of safety, scarcity, or illness. You may feel shame about the reasons you became homeless, about your current condition, or about the difficulty you face in trying to get back on your feet. You may feel frustrated about not knowing where to begin again. In San Francisco, it is not easy for a homeless person to find specialty services without access to a computer, the ability to conduct research, and the mobility to travel about the city. It is not easy to find a shower, a hot meal, or a place to sleep with dignity. With this project, we sought to answer the question: How might we create easy access to homeless services for all homeless individuals in San Francisco?
The subsequent sections provide further details on the project phases, learnings, and deliverables.

Tools:
Sketch, InVision, Google Slides (for the VUI prototype), Adobe Illustrator
User Research & Persona Creation
As part of our research process, Shannon and I designed three core personas (including a secondary user) to illustrate who the kiosk & application would need to accommodate. To inform these designs, we pulled archetype data from the 2017 SF Homeless Census.

Personas
Taxonomy and Task Analyses
For this application, a great deal of thought was given to the design of the information architecture and taxonomy. Because one of the core goals of the product was to facilitate service discovery for homeless individuals, creating a relatively shallow, easy-to-digest menu was critical. Further, limiting the number of menu options on each page reduces the time a user needs to perform a visual search, in accordance with the Hick-Hyman Law.
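As a rough illustration of why a shallow menu matters, the Hick-Hyman Law models decision time as growing with the logarithm of the number of equally likely choices:

T = a + b \log_2(n + 1)

Using illustrative constants (assumptions for this example, not values measured in our study) of a = 0.2 s and b = 0.15 s per bit, a screen of six options gives T ≈ 0.2 + 0.15·log2(7) ≈ 0.62 s of decision time, while a flat list of thirty options gives T ≈ 0.2 + 0.15·log2(31) ≈ 0.94 s, roughly a fifty-percent increase before any scrolling is considered.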

Category Structure
The Helping Hands Human Services Application features five core categories with a maximum of six subcategories each. We limited subcategory options to six based on our understanding of the limited capacity of human short-term memory: six is a sufficiently small number of choices for a user to easily recall the options they have been presented with. Further, from a visual standpoint, limiting the selection to six options allowed all options to be presented on the UI simultaneously, reducing the complexity of visual search compared to an interface that requires scrolling.
Research and qualitative results from usability testing drove the category-subcategory mappings. The most challenging category from a taxonomy and placement perspective was Supplies & Services, as its contents are not self-evident from the label alone; some deductive reasoning and assumption on the part of the user must take place before selecting this option. For this reason, we ordered the option last on the interface as a forcing function, prompting users to scan the more clearly labelled options first. Our goal was to reduce the risk of error due to satisficing that could occur if this element were placed first on the interface.
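To make the category ordering concrete, the sketch below (in Python) encodes the five core categories as an ordered mapping with Supplies & Services deliberately last; the subcategory labels are hypothetical placeholders for illustration, not the final taxonomy.

# Category -> subcategory mapping for the Helping Hands menu.
# Insertion order is presentation order on the interface; subcategory
# labels are hypothetical placeholders, not the final taxonomy.
MENU = {
    "Food": ["Free Meals", "Food Pantries", "Grocery Assistance"],
    "Shelter": ["Emergency Beds", "Transitional Housing", "Family Shelters"],
    "Jobs": ["Job Listings", "Resume Help", "Training Programs"],
    "Health Services": ["Clinics", "Mental Health", "Addiction Treatment"],
    # Ordered last so users scan the more clearly labelled categories first.
    "Supplies & Services": ["Showers", "Clothing", "Mail & ID Services"],
}

MAX_SUBCATEGORIES = 6  # cap chosen for recall and single-screen visual search

def validate_menu(menu: dict) -> None:
    """Check that every category respects the six-item subcategory cap."""
    for category, subcategories in menu.items():
        if len(subcategories) > MAX_SUBCATEGORIES:
            raise ValueError(f"{category} exceeds {MAX_SUBCATEGORIES} subcategories")

validate_menu(MENU)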
Application Architecture
Diagrammatic HTA - Engaging With The Kiosk's Voice User Interface
Hierarchical Task Analyses
Interface Design
The primary interface design goal for the kiosk and kiosk interface is accessibility for all. Because fifty-one percent of our target audience, homeless individuals in SF, is over the age of forty, we were especially aware of the impact of aging on user perception and other physical capabilities (2017 San Francisco Homeless Count and Survey Full Report, 2017, p. 17). As such, the visual, auditory, and dexterity-related needs of older audiences were of particular import. While we found no median age for tourists in our research, the median age of residents in SF is about thirty-eight years old and the population is predominantly male (Census Reporter). Interestingly, the chronically homeless population in San Francisco is also largely male, with men comprising sixty-eight percent of that population (2017 San Francisco Homeless Count and Survey Full Report, 2017, p. 42).
The kiosk interface features large, high-contrast text to accommodate differences in vision among users. Colors were selected with colorblindness in mind, given the high percentage of male users the kiosk would serve. Large touch-screen buttons both increase the ease with which a user can tap a given touch target and create latitude for individuals with dexterity challenges or tremor. Button design leans toward the skeuomorphic to ensure clear visual affordances. If our project were to be engineered in the future, the natural language processing software would be designed with special consideration for users who may have accents or speech impediments.
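The sketch below is a minimal illustration of the kind of forgiving intent matching such an engineering effort might start from; the intent names and keyword lists are assumptions made for illustration, not a specification, and a production system would use a trained natural language understanding model rather than keyword lookup.

# Minimal, hypothetical intent matcher for the kiosk's voice interface.
# Intent names and keyword lists are illustrative assumptions only; a real
# implementation would use a trained NLU model that is robust to accents
# and speech differences.
INTENT_KEYWORDS = {
    "find_shelter": ["shelter", "bed", "sleep", "place to stay"],
    "find_food": ["food", "eat", "meal", "hungry"],
    "find_health": ["doctor", "clinic", "medicine", "treatment"],
    "find_jobs": ["job", "work", "employment", "resume"],
}

def match_intent(utterance: str) -> str:
    """Return the intent whose keywords best match the spoken utterance."""
    text = utterance.lower()
    scores = {
        intent: sum(keyword in text for keyword in keywords)
        for intent, keywords in INTENT_KEYWORDS.items()
    }
    best_intent, best_score = max(scores.items(), key=lambda item: item[1])
    # Fall back to a clarifying prompt when nothing matches.
    return best_intent if best_score > 0 else "clarify"

print(match_intent("I need somewhere to sleep tonight"))  # -> find_shelter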
Our interface design process began with an ideation phase. We collected inspirational images, researched other kiosk-based UIs, and used these to inform our design decisions. We tried multiple layouts but ultimately landed on a design featuring a persistent sidebar to assist with wayfinding.
Layout Ideation

Inspiration Board

Initial Interface Design for Usability Testing
Measuring Design Success
To test the efficacy of our design, Shannon and I focused specifically on the visual/screen, auditory/voice, and mixed voice-and-screen modal interfaces within the Helping Hands Human Services application. We wanted to understand whether there was a difference in effectiveness between these conditions. Evaluating the application required some creative thought, as we were unable to access individuals within our specialty population, the homeless population of San Francisco. As such, we created an experimental design that was conducted with older adult users who had no visual or auditory impairments beyond those typically associated with aging. In future experiments, we would like to test the efficacy of the interface with a population of users with auditory and visual differences.

Experimental Design
We designed a hybrid between- and within-subjects experiment in which we compared clicks, time-to-task completion, and NASA TLX scores for three experimental groups. The independent variable in our study was the use of voice-based interaction and, as such, our groups were divided into 1) a voice and haptic group, 2) a voice-only group, and 3) a haptic-only group. We hypothesized that all groups would perform equally well from a task load standpoint. We further hypothesized that voice input would result in longer time-to-task completion, but that this would not reflect negatively on task load or the overall user experience.
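As background on one of our dependent measures, the raw (unweighted) NASA TLX score is simply the mean of the six subscale ratings; the sketch below shows that computation with hypothetical ratings, not data from our participants.

# Raw (unweighted) NASA TLX: the mean of the six subscale ratings,
# each on a 0-100 scale. The example ratings below are hypothetical,
# not data collected in our study.
TLX_SUBSCALES = [
    "Mental Demand", "Physical Demand", "Temporal Demand",
    "Performance", "Effort", "Frustration",
]

def raw_tlx(ratings: dict) -> float:
    """Return the mean of the six subscale ratings (0-100 each)."""
    return sum(ratings[scale] for scale in TLX_SUBSCALES) / len(TLX_SUBSCALES)

example = {
    "Mental Demand": 35, "Physical Demand": 10, "Temporal Demand": 25,
    "Performance": 20, "Effort": 30, "Frustration": 15,
}
print(raw_tlx(example))  # -> 22.5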
Because of the small sample size (ten participants in total), the results of our experiment were not statistically significant. However, we were able to make some high-level, practical decisions based on the data collected, which led to improvements in the design.
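For context, the sketch below shows the kind of group comparison we ran (in JASP) on time-to-task completion, reproduced here in Python with SciPy; the values are illustrative placeholders rather than our participants' data, and with only ten participants a non-parametric test such as Kruskal-Wallis is the more defensible choice.

# Illustrative comparison of time-to-task completion (seconds) across the
# three interaction conditions. Values are placeholders, NOT the study data;
# in practice the equivalent analysis was run in JASP.
from scipy import stats

voice_and_haptic = [48.2, 52.1, 45.9]   # hypothetical timings
voice_only = [61.5, 58.3, 64.0]         # hypothetical timings
haptic_only = [39.7, 42.4, 41.1]        # hypothetical timings

# Kruskal-Wallis H-test: non-parametric, suited to small samples.
h_stat, p_value = stats.kruskal(voice_and_haptic, voice_only, haptic_only)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")

# With n = 10 participants, such tests rarely reach significance; we treated
# the results as directional, practical evidence rather than proof.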

Outcomes
The results of our experiment confirmed our hypothesis that voice interactions would result in a longer time to task completion than haptic interactions on a visual interface alone. The TLX scores between these two conditions also aligned with our hypothesis from a practical-significance standpoint: in general, users felt that the experiences were comparably easy across both conditions. Complexity arose, however, in the third condition, in which users were met with both voice and screen elements. This was surprising in that previous research in Human-Computer Interaction has indicated that redundancy across screen and voice inputs benefits users. Further research is required on our end to understand how to improve the usability of the mixed screen-voice condition, especially because we did not have a working voice UI application with natural language processing (Pearl, 2017, pp. 178-179). However, we predict that by reducing the amount of on-screen content and further simplifying the visual menu options, we can improve this design. These changes were implemented in our revised design, visible below.
Interface Design (Post Usability Testing)
Kiosk Concept Design
After having completed our Visual/Haptic and Voice UI designs, Shannon and I created some initial concepts around the physical kiosk in which the Helping Hands Human Services app would live. 
In our concept design, the kiosk touch screen is positioned 34” above the ground and is recessed 12” into the kiosk to provide the user with privacy (2010 ADA Standards for Accessible Design [PDF], 2010, p. 108). From a macro-ergonomic standpoint, the physical kiosk design considers the larger ecosystem of San Francisco and embodies the city’s values of eco-friendliness and sustainability. The kiosk is solar-powered, and each component is constructed of recyclable materials. The kiosk also features a living wall of plants on the rear, street-facing side to add to the aesthetic of the city, create an inviting appearance, and provide a natural means of cleaning the air on high-traffic streets.
Resources & References:
Sze, K. (2016, June 29). Data shows SF has 2nd highest homeless population in US. Retrieved October 04, 2017, from http://abc7news.com/news/data-shows-sf-has-2nd-highest-homeless-population-in-us/1407123/

Spotswood, B. (2017, June 26). 2017 San Francisco 'Homeless Census' Reveals That Despite Numbers, Things Are Worse, Not Better. Retrieved October 04, 2017, from http://sfist.com/2017/06/26/2017_san_francisco_homeless_census.php

Census profile: San Francisco, CA. (n.d.). Retrieved December 02, 2017, from https://censusreporter.org/profiles/16000US0667000-san-francisco-ca/
Johnson, G., & Coventry, L. (2001). "You Talking to Me?" Exploring Voice in Self-Service User Interfaces. International Journal of Human–Computer Interaction, 13(2).
Whitenton, K. (2017, November 12). Voice First: The Future of Interaction? Nielsen Norman Group. Retrieved December 02, 2017, from https://www.nngroup.com/articles/voice-first/

Pullin, G., Treviranus, J., Patel, R., & Higginbotham, J. (2017, July 4). Designing interaction, voice, and inclusion in AAC research. Augmentative and Alternative Communication (AAC), 33(3).

Alper, M. (2017). Giving Voice: Mobile Communication, Disability, and Inequality. MIT Press: Cambridge, MA.
2010 ADA Standards for Accessible Design [PDF]. (2010). Department of Justice. Retrieved December 02, 2017, from https://www.ada.gov/regs2010/2010ADAStandards/2010ADAStandards.pdf
2017 San Francisco Homeless Count and Survey Full Report [PDF]. (2017, June 21). Watsonville, CA; San Jose, CA: Applied Survey Research (ASR). Retrieved December 02, 2017, from http://hsh.sfgov.org/wp-content/uploads/2017/06/2017-SF-Point-in-Time-Count-General-FINAL-6.21.17.pdf

Schalkwyk, J., Beeferman, D., Beaufays, F., Byrne, B., Chelba, C., Cohen, M., Garret, M., & Strope, B. (2010, September 9). Google Search by Voice: A case study. Mountain View, CA: Google Research.

Nielsen, J. (2003, January 27). Voice Interfaces: Assessing the Potential. Nielsen Norman Group. Retrieved November 01, 2017, from https://www.nngroup.com/articles/voice-interfaces-assessing-the-potential/

Oliver, A. (2017). How to build your own Alexa-like personal assistant. InfoWorld. Retrieved December 02, 2017, from https://www.infoworld.com/article/3183521/application-development/how-to-build-your-own-alexa-like-personal-assistant.html

Pearl, C. (2017). Designing Voice User Interfaces: Principles of Conversational Experiences. Sebastopol, CA: O’Reilly Media, Inc.

Grey Silhouetted Man in kiosk mockup courtesy of Vecteezy.com
