a phone displaying the mobile version of the project website


This five-week project asked me to design an app that used a phone sensor as a core feature. I later joined two other students to develop my initial idea further, from mockups to a high-fidelity interactive prototype. Beyond the sensor requirement, the project placed a strong emphasis on usability and fluid interaction patterns.

Skills: Wireframing + Prototyping


The application we created is called SunSense. It supports indoor gardeners by combining readings from the phone's light sensor with information about their plant species, helping them determine whether a plant will get enough light to thrive in their home. The app also helps users keep track of their different house plants and each one's individual needs.

My Role: I led the UI and interaction pattern design, and helped our visual designer establish the branding.


one of the original forms that we were considering

The initial concept was a low-fidelity set of basic grey-box screens, intended to work out the core functions and communicate the general concept. At this stage the primary function was a scanning flow: users input basic data about the lighting conditions and the plant they want to purchase, then take a "light reading" that uses the phone's light sensor to measure the ambient light at that spot. A confirmation screen then indicates whether the detected light is adequate for that plant's health, and lets users save the data as a "plant profile" so they can more easily manage their plants and rescan later.
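The adequacy check at the heart of that scanning flow can be sketched roughly as follows. This is a hypothetical illustration only: the lux thresholds, plant names, and function name are assumptions for the sketch, not values from the actual prototype.

```python
# Hypothetical sketch of a light-adequacy check like SunSense's.
# Lux ranges per plant are illustrative assumptions, not real data.
PLANT_LIGHT_NEEDS = {
    "snake plant": (50, 2500),       # tolerates low light
    "pothos": (100, 2500),
    "fiddle-leaf fig": (400, 8000),  # wants bright, indirect light
}

def check_light(plant: str, measured_lux: float) -> str:
    """Compare a sensor reading against a plant's assumed light range."""
    low, high = PLANT_LIGHT_NEEDS[plant]
    if measured_lux < low:
        return "too dim"
    if measured_lux > high:
        return "too bright"
    return "adequate"

# A reading taken in a typical living-room spot:
print(check_light("pothos", 350))
```

The saved "plant profile" would simply persist the plant type and the last reading so the check can be rerun after a rescan.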

The next iteration focused on the interaction flows. Having chosen this direction, we spent time at a whiteboard fleshing out the details of the flows for the two main functions, and explored different UI patterns we could leverage to make the concept more usable.

full prototype in Adobe XD


For the next phase, we began user testing to confirm that each aspect of the app was usable and that its goals were clear to different types of users. We moved to high fidelity for this iteration to make testing as smooth as possible. The results revealed heuristic issues around user freedom and control. We also found that users wanted more information on screen at once, and a more guided scanning experience, since it wasn't always clear why certain details were important.

In the final version of the app we primarily updated the UI to surface more information at a glance. We also made the scanning process clearer with a step-by-step experience that leverages imagery and micro-animations to explain each step the user needs to complete.


Our final product is a simple yet effective application that helps users determine whether a plant will thrive where it has been placed. The final prototype tested very well, particularly the look and feel of the UI. Users appreciated that the app hadn't become overcomplicated, and that the primary function was fast, clear, and immediately available. Remaining areas to improve center on the confirmation page and on increasing the value of the data returned by the scan.

Feel free to try out the interactive prototype to the right, or click to view the interactive prototype.