
Studio iOS App

The Flatiron School iOS Capstone Project

 
 
 

OVERVIEW

About the project

 
 

"How might we discover music guided by our emotions and speaks to us in times that we need it the most?"

Overview: For three weeks, my team and I worked together at The Flatiron School to build and ship an application that addresses user pain points in Vevo's mobile application. Our app was successfully accepted into the Apple App Store.

The solution: We created Studio, an app that helps users discover newly recommended songs based on genre, artist, and mood preferences.

Timeline: ~3 weeks

The team: Joseph Ugowe, Erica Gutierrez, Matt Amerige, and Joel Bell

My role: Product Design, Visual Design, User Experience Design, and iOS Engineering

 
 
 

BACKGROUND CONTEXT

Project scope

 
 

Studio is a mobile iOS app that I worked on with a team of three other developers for our final capstone project in The Flatiron School's iOS Developer Program (the project is also available for viewing on GitHub). This app is special to me not only because it was the first app I worked on, but also because it was created with a team of friends.

Studio helps users discover newly recommended songs based on genre, artist, and mood preferences. For three weeks, we worked together to build an application related to Vevo, a multinational video-hosting service. Each group in our cohort was given a different project objective; our task was to develop an app related to music and/or music videos. Danny Gallardo (Project Manager at Vevo) served as our project mentor and company contact. His mentorship instilled in us Vevo's way of thinking: "We always ask ourselves, 'What is the problem we are trying to solve?' We often devise these grand features that then get broken down into their most basic needs."

 
 
 

OUR TEAM

iOS Engineers

 
 

Joseph Ugowe integrated the Google Sign-In SDK for login, designed the sign-in screen, helped implement Alamofire and SwiftyJSON to connect with the Spotify and YouTube APIs to filter songs by seed metrics, helped manage version control using Git, and managed the integration of Objective-C CocoaPods into our Swift project.

Erica Gutierrez helped implement Alamofire to connect Spotify's API to YouTube's API to filter songs by seed metrics, managed YouTube OAuth and access tokens, and encapsulated YouTube API requests into a data structure for easy access when displaying video results on the music video playlist screen.

Matt Amerige designed the algorithm to manage Spotify/YouTube OAuth and access tokens, helped build the backend using the Google Firebase SDK, integrated CocoaPods such as status-bar notifications and reachability to handle edge cases, implemented the YouTube Player CocoaPod for the playlist screen, and managed version control using Git.

My role

Studio was one of those rare opportunities where I stretched beyond my comfort zone and wore different hats. I helped develop a product from ideation to acceptance into the Apple App Store, exploring the three realms of design, product, and engineering and everything in between:

Product Design: Creatively solving a problem with a well-designed solution and making the technical programming decisions that affect not only the functionality of the product but also its design.

Visual Design: Layouts, color schemes, logo, typography.

User Experience Design: Writing code and designing data structures in ways that influenced my approach to creating an intuitive UX and an easy-to-navigate UI.

Engineering: Using iOS development tools such as Xcode, Swift, and Storyboards; querying Spotify's data-rich API with Alamofire and SwiftyJSON to fetch and parse JSON; integrating CocoaPods; and coding independently as well as in a team.

Leadership: Stand-ups, demos, and technical and design presentations on the weekly progress of our project to our cohort, our iOS instructor (Joel Bell), and our team mentor.

Furthermore, working in a team required us to take on roles that seemed both overwhelming and challenging. I was taught topics that I was not comfortable with, and conversely, I explained topics that I had experience with. Through this process, I learned more about myself and how to be a better teacher, a lifelong student, and an integral part of a team.

 
 
 

IDENTIFYING THE PROBLEM

 

“We always ask ourselves, 'What is the problem we are trying to solve?' We often devise these grand features that then get broken down into their most basic needs.”

Vevo's Design Philosophy

 
 
 

Problem

There are many music streaming services offering millions of songs. However, organizing and presenting that information becomes a challenge, especially when the organization is based on only a few parameters. As a result, discovering new music becomes troublesome. This prompted several interpretive questions in the form of hypotheses.

Hypotheses

Should there be different parameter selections when discovering new music versus new music videos? If so, how should those parameters differ? Furthermore, when discovering new music, should users be given the choice to search? After all, how would users know what to search for if their intent is to discover?

 
 
 

UNDERSTANDING OUR USERS

Identifying who we are designing for

 
 

Given that we had only about three weeks to go from ideation to the App Store, we needed to be sure we were not designing and developing an app and user experience in a vacuum. As a team, we looked into our own experiences, pain points, and areas in which the music streaming services we use, such as Apple Music, Spotify, and Amazon Music, could be improved. Doing so allowed us to ask the following questions throughout each phase of product development and iteration, to be sure we were designing for voices other than our own.

  • What people problem are we trying to solve?

  • What is the story of the experience that we want people to have?

  • What is happening before people get in and start interacting with these screens?

  • What are they feeling and thinking?

  • What got them onto the screen in the first place?

  • What happens afterwards?

  • After the interaction with Studio, how do we want people to feel?

  • What do we want them to have taken away from that experience?

Many of these questions came from Julie Zhuo's The Year of the Looking Glass series, specifically the "Building Products" and "Good Design" Medium essays. Developing a more holistic perspective on designing for others, and asking these interpretive questions, helped us imagine and bring other people's voices along while designing and developing, keeping us mindful of those who might be using Studio.

 
 
 

BRAINSTORMING

Generating hypotheses and interpretive questions

 
 

To start off our brainstorming session, we identified several pain points, framed as cases, with current music and music video streaming services.

Case 1

  • Hypothesis: How might we match curated playlists to user moods?

  • Claim: Allow users to fine-tune their preferred genre(s) and artist(s) selections based on their mood.

Case 2

  • Hypothesis: How might we help users discover new artists depending on their changing moods?

  • Claim: Create mood playlists for users that change depending on trending and new artists.

Case 3

  • Hypothesis: How might we get users to create and discover more content with or without regard to their moods?

  • Claim: Let users build their own music video playlists that can be inspired by their mood.

 

IDENTIFYING, FRAMING, AND REFRAMING THE PROBLEM

Getting to the right problem to solve

While comparing existing platforms against our sample cases, we realized that they lacked personalization around emotion. Current platforms let users listen to, store, and save music and playlists. The problem we discovered is that users create playlists with music they already know rather than playlists that combine newly discovered artists with the music they are already familiar with. This realization defined our opportunity for improvement in the music streaming space: design around our users. However, we still needed a way to accurately determine which mood certain songs should be categorized under.

 

The problem we discovered is that users create playlists with music they already know rather than playlists that combine newly discovered artists with the music they are already familiar with.

 
 

Defining the core functionality

Listening to music is a visceral, personal experience that influences our emotions. What's more, watching the music video associated with a song affects our emotions even more profoundly because it engages an additional sense: sight.

Imagine you have been invited to a concert, and you are familiar with some of the artists playing. While there, you fall in love with three new artists you had never heard of, because their songs resonate with your recent heartbreak and your breakthrough from it. Discovering those new artists was not something you actively did. Instead, the music spoke to you because your heart was receptive to it at that moment.

 
 

How might we re-create that concert experience, but with more emotions and a series of different experiences? How might we make watching music videos more personal? How might we find music that speaks to us in the times we need it most? If we wish to re-create the scenario above, is it best for users to search, or should an algorithm find the best music available for them? Asking these interpretive questions made our vision clear: how might we re-create a music-watching experience that mirrors the changing moods we express?

 
 
 

WIREFRAMING

Prototyping the interaction flow

 
 

For the user flow of our app, we wanted a simple, intuitive three-step process for discovering new music. New users can immediately start exploring new music, as they are not required to sign up first. However, users must sign up if they wish to save their playlists to their YouTube account.

 
 

Returning users are automatically signed in because we use NSUserDefaults and Google's Firebase to store login information. After these preliminary steps, the normal user flow moves through the following sequence: first the genre screen, then the artist screen, then the mood screen, and finally the playlist screen. This way, there are no barriers between users and their music; only their emotions guide their decisions.
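To make this concrete, here is a minimal sketch of the returning-user check, assuming Swift 3 (where NSUserDefaults became UserDefaults), a hypothetical "isSignedIn" flag saved at first login, and assumed storyboard identifiers; the profile data itself lived in Firebase, so only the local check is shown:

    import UIKit

    // Decide the first screen at launch: returning users skip sign-in entirely.
    func initialViewController(from storyboard: UIStoryboard) -> UIViewController {
        if UserDefaults.standard.bool(forKey: "isSignedIn") {
            // A returning user goes straight to the genre selection screen.
            return storyboard.instantiateViewController(withIdentifier: "GenreViewController")
        }
        // New users see the Google sign-in screen first.
        return storyboard.instantiateViewController(withIdentifier: "SignInViewController")
    }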

 
 
 

ENGINEERING

Week 1 responsibilities

  • Implement Spotify OAuth to run in the background.

  • Query results from the Spotify API using genre, artist, and mood seeds (a sketch of such a request follows this list).

  • Create a user interface prototype to present the queried information for the artist, genre, and mood selections.
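A minimal sketch of what such a seeded query might look like, assuming Alamofire 4 and SwiftyJSON against Spotify's documented /v1/recommendations endpoint; the function name and parameter values are illustrative, not the shipped code:

    import Alamofire
    import SwiftyJSON

    // Fetch recommended tracks from the user's genre/artist selections plus
    // the tunable attributes derived from the chosen mood.
    func fetchRecommendations(accessToken: String, genreSeeds: [String],
                              artistSeeds: [String], moodAttributes: [String: Any]) {
        var parameters: [String: Any] = [
            "seed_genres": genreSeeds.joined(separator: ","),
            "seed_artists": artistSeeds.joined(separator: ","),
            "limit": 20
        ]
        moodAttributes.forEach { parameters[$0.key] = $0.value }

        Alamofire.request("https://api.spotify.com/v1/recommendations",
                          parameters: parameters,
                          headers: ["Authorization": "Bearer \(accessToken)"])
            .validate()
            .responseJSON { response in
                guard let value = response.result.value else { return }
                // Each track gives us a title and artist to hand to the YouTube step.
                for track in JSON(value)["tracks"].arrayValue {
                    print(track["name"].stringValue, "-",
                          track["artists"][0]["name"].stringValue)
                }
            }
    }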

Week 2 responsibilities

  • Integrate the Spotify API with the YouTube API (see the sketch after this list).

  • Encapsulate the final data received from the API calls and pass it to a PlaylistViewController to display the results.
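A minimal sketch of the Spotify-to-YouTube handoff, assuming Alamofire 4, SwiftyJSON, and the YouTube Data API v3 search endpoint; the query format, function name, and apiKey parameter are assumptions:

    import Alamofire
    import SwiftyJSON

    // Look up the music video for a recommended track and return its YouTube
    // video ID, which the playlist screen's player needs.
    func searchMusicVideo(track: String, artist: String, apiKey: String,
                          completion: @escaping (String?) -> Void) {
        let parameters: [String: Any] = [
            "part": "snippet",
            "q": "\(artist) \(track) official music video",
            "type": "video",
            "maxResults": 1,
            "key": apiKey
        ]
        Alamofire.request("https://www.googleapis.com/youtube/v3/search",
                          parameters: parameters)
            .validate()
            .responseJSON { response in
                guard let value = response.result.value else { completion(nil); return }
                completion(JSON(value)["items"][0]["id"]["videoId"].string)
            }
    }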

Week 3 responsibilities

  • Research UI/UX features to implement and help configure view controller features.

  • Begin work on finalizing the UI, standardizing fonts, colors, and other assets throughout the app to achieve a consistent design language.

  • Create collection view cell assets for the artist and mood collection view controllers in Sketch 3.

  • Export assets for various screen resolutions from Sketch 3 and import them into Xcode.

Since designers and engineers were all working together, everyone was aware of the different design iterations; we worked in tandem, in an agile fashion. I designed the wireframes while programming the mood selections in code, and adjusted the wireframes accordingly. I refactored the code for two reasons: first, to organize and simplify our data structure for easier access and manipulation; and second, to create an intuitive, easy way for users to make mood selections.

We designed 18 different mood preferences to choose from: acoustic, trendy, live, slow dance, energetic, instrumental, happy, chill, sad, rage, smooth, reflective, awake, motivational, chaotic, sleepy, active, and focused. They were constructed using the Spotify API, which provides numerous tunable track attributes that allow further customization of audio features. The attributes we used include acousticness, danceability, energy, instrumentalness, liveness, loudness, mode, speechiness, tempo, time signature, and valence.


Sample mood selections programmed in Swift using Xcode
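As an illustration, here is a minimal sketch of how two such moods might be expressed in Swift, assuming a hypothetical Mood struct; the parameter names follow Spotify's documented recommendation attributes, but the exact values are illustrative rather than the shipped ones:

    import Foundation

    // Each mood maps to Spotify recommendation parameters that are merged
    // into the /v1/recommendations query alongside the genre and artist seeds.
    struct Mood {
        let name: String
        let attributes: [String: Any]
    }

    // "Happy" leans on high valence (musical positivity) and high energy.
    let happy = Mood(name: "Happy", attributes: [
        "target_valence": 0.9,
        "target_energy": 0.8,
        "target_danceability": 0.7
    ])

    // "Sleepy" leans the other way: quiet, acoustic, low energy.
    let sleepy = Mood(name: "Sleepy", attributes: [
        "target_energy": 0.2,
        "min_acousticness": 0.6,
        "max_loudness": -20.0
    ])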

For each mood selection, we adjusted specific tunable track attributes from Spotify's API according to the moods we had chosen. As a team, we were constantly iterating on the design as the code was created, changed, and refactored. We ran many test scenarios; we even made sure to test whether our playlists met the satisfaction of each participating student in our cohort!

 

DESIGN

Interaction design

 
 

For the user experience of our app to evoke the feeling of discovery, I deliberately chose each genre and mood icon to relate to its respective category. Furthermore, each icon is hidden until it is tapped: with each selection, the user not only receives feedback but is also presented with a surprise. We asked ourselves: if discovering music is fun, then what is the simplest action we can design that would be just as fun?
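Here is a minimal sketch of that reveal-on-tap interaction, assuming a hypothetical MoodCell collection view cell whose icon stays hidden until the cell is selected:

    import UIKit

    // A collection view cell whose icon is revealed only while the cell is selected.
    class MoodCell: UICollectionViewCell {
        @IBOutlet weak var iconImageView: UIImageView!
        @IBOutlet weak var titleLabel: UILabel!

        override var isSelected: Bool {
            didSet {
                // Selecting reveals the hidden icon; deselecting hides it again.
                iconImageView.isHidden = !isSelected
            }
        }
    }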

 
 
 
 

Pop genre and energetic mood selected

This design choice was inspired by translating the feeling of touch from physical space into digital space. Because each icon in the collection view is unique, users are encouraged to create different playlists by tapping to discover the icons. The interaction we designed spurs curiosity, further elevating the feeling and experience of personal discovery with music. The interface implements high-level visual feedback that serves two purposes.

 

The interaction we designed spurs curiosity, thus further elevating the feeling and experience of music through personal discovery.

 
 

From left to right: the genre and mood screens, both prototyped in Apple’s Xcode Simulator

First, it communicates acknowledgement and confirmation of an action as an icon transitions from an inactive state to an active one, making the system’s status visible. Second, the new state spurs curiosity, inspiring users to engage with the app further by discovering what the other inactive states might reveal, and effectively encouraging them to make different selections and create different playlists.

Finally, the user interface and user experience adhere to Jakob Nielsen’s 10 usability heuristics as well as the following ISO guidelines: ISO 9241-12:1998, Ergonomic requirements for office work with visual display terminals (VDTs), Part 12: Presentation of information; ISO 9241-13:1998, Part 13: User guidance; and ISO 9241-210:2010, Ergonomics of human-system interaction, Part 210: Human-centered design for interactive systems.

Final hi-fidelity design

 
 

Above is the user flow for a new user, from left to right. The introduction screen is not presented to existing users who have used the app at least once. Users follow a simple three-step process to discover new music: first they choose up to five genres, then up to five artists, and finally they pick one mood.
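A minimal sketch of how the five-selection cap might be enforced, assuming a hypothetical GenreViewController backed by a UICollectionView with multiple selection enabled:

    import UIKit

    class GenreViewController: UICollectionViewController {
        let maximumSelections = 5

        // Refuse further taps once five genres are already selected.
        override func collectionView(_ collectionView: UICollectionView,
                                     shouldSelectItemAt indexPath: IndexPath) -> Bool {
            let selectedCount = collectionView.indexPathsForSelectedItems?.count ?? 0
            return selectedCount < maximumSelections
        }
    }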

 
 
 

RECAP

Revisiting our hypotheses

 
 

We challenged each of the three hypotheses we had proposed against our vision: how might we re-create a music-watching experience that mirrors the changing moods that we express? By answering that question, we in fact created solutions to our hypotheses. Our answer is this: give users the opportunity to make decisions guided by their emotions. Below, I address the solutions, and their value propositions, for the hypotheses from each of our three cases:

Case 1: How might we match curated playlists to user moods?

 
 

Case 2: How might we help users discover new artists depending on their changing moods?

 
 

Case 3: How might we get users to create and discover more content with or without regards to their moods?

 
 
 

REFLECTIONS, RESULTS, AND KEY TAKEAWAYS

Addressing the elephant in the room

All of our design and engineering decisions were led by assumptions based on the limited research we conducted with several members of our cohort. With only three weeks to take an application from ideation to acceptance into the Apple App Store, we did not have the opportunity to thoroughly conduct user research, A/B usability tests, or other necessary quantitative measurements. Given more time, we certainly would have done so, and that research would have served as the basis and trajectory of our project.

Nonetheless, we were aware of this, and it is the core reason we aligned our design and engineering decisions around our users. During our brainstorming sessions, we noticed a pain point in current music streaming services and saw an opportunity to address it. By presenting users with genre and artist selections, we stayed true to options that existing platforms (e.g., Spotify and Apple Music) already provide. We then built on these options by implementing a similar, mood-based one. Allowing users to discover music inspired by their changing moods is, effectively, our competitive advantage.

Did we achieve our goal?

Yes, we created an application that does what we need it to do: create music video playlists based on mood preferences. Furthermore, we did our best to address each of our hypotheses by creating a strong value proposition for each solution. Most importantly, we stayed true to our vision of keeping users at the center of our decisions.

Tackling such an ambitious task seemed daunting for four student developers. However, each day we asked ourselves, "What would this look like if it were easy?" This gave us better clarity and a deeper understanding of the following quote, which we read on the wall each day at The Flatiron School:

 

“If you want to build a ship, don't drum up people to collect wood and don't assign them tasks and work, but rather teach them to long for the endless immensity of the sea.”

Antoine de Saint-Exupéry

 

The sea was our vision, and getting lost in its vastness and complexity was a real possibility for us. But we exceeded our own expectations each day, even when we failed, did not know something, or did not think it was possible given our knowledge and capabilities at the time. We held on, taught, supported, and inspired each other.

This simple idea of "What would this look like if it were easy?" helped us embrace our vision even more as we worked toward it with passion. Above all, what helped us achieve our goal was our teamwork.

Working alongside other engineers as a designer/engineer, I developed a clearer understanding of how software influences UI/UX and vice versa. For example, the data received from the Spotify API showed how Spotify's songs could be further classified. As an engineer, I understood that we could take that information and classify the songs even further; as a designer, I realized I could find an elegant, practical solution for a new, mood-based way of discovering music.

What I learned

  • How and why collaboration between engineers and designers is critical in product development.

  • How to effectively communicate my thoughts, ideas, and suggestions to engineers in a diplomatic fashion.

  • Why giving and soliciting feedback in a "yes and" approach sets the team up for success.

  • What working in a fast-paced startup would feel like as we had only three weeks to execute on an idea.

  • The value of being scrappy, flexible, adaptable, and agile to create good code and design.

  • Developing perspective as both a designer and an engineer: not every design can be engineered, and not all code should be designed.

Below are several key questions we critically asked ourselves, along with how we addressed them.

Thinking about the platforms

Why did we choose to develop an iOS app and not a tvOS app, or both? This question was repeatedly asked during our science fair presentation. Since we had only three weeks to develop an application, with deadlines and deliverables to meet, it made more sense to start building rather than research an unfamiliar technology. If time had permitted, we would also have developed an application for the Apple TV.

Thinking about competitive differences

What makes this feature or product different from what's already on the market? Why not just use Spotify's stellar ready-made playlists?

Looking forward

  • Create hypotheses that envision the user experience through persona development, storyboarding, and user experience maps rather than basing assumptions on our cohort's preferences.

  • Conduct market research through customer discovery and test our hypotheses.

  • Create an MVP prototype for each different platform.

  • Explore interaction from different starting points: mobile, tablet, and TV.

  • Iteratively improve the prototypes based on market research to establish a more secure product-market fit.

  • Conduct further user testing (usability testing) to discover what design decisions can be confirmed or challenged.