Smarthome-Based Food Detection System: FoodCare
Liqian YOU 45411036
Introduction
FoodCare is a smarthome-based food detection system that supports specific diets for people with medical conditions. It offers three modes, each with its own food detection dataset and interactions: pregnancy, vegetarian, and seniors. Users select a mode, log in to their Costco account, and upload their online shopping list. When a parcel is delivered, the food detection system starts working. The AI camera on the crisper detects food categories and monitors expiry dates. When users put food into the crisper, the system identifies which foods are beneficial or harmful to their health and provides diet recommendations.
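The mode-based recommendation and expiry monitoring described above can be sketched in a few lines of Python. The rule tables and food names below are illustrative assumptions, not the project's actual detection datasets:

```python
from datetime import date

# Hypothetical per-mode dietary rules; the real rules would come from
# the mode-specific food detection datasets described above.
MODE_RULES = {
    "pregnancy": {"beneficial": {"spinach", "salmon"}, "harmful": {"soft cheese", "raw fish"}},
    "vegetarian": {"beneficial": {"tofu", "lentils"}, "harmful": {"chicken", "beef"}},
    "seniors": {"beneficial": {"oats", "yogurt"}, "harmful": {"salted snacks"}},
}

def recommend(mode: str, food: str, expiry: date, today: date) -> str:
    """Return a diet recommendation for a food item detected in the crisper."""
    if expiry < today:  # expiry monitoring comes first
        return f"{food} expired on {expiry.isoformat()} - discard it."
    rules = MODE_RULES[mode]
    if food in rules["harmful"]:
        return f"{food} is not recommended in {mode} mode."
    if food in rules["beneficial"]:
        return f"{food} is beneficial in {mode} mode."
    return f"{food} is fine in moderation."
```

For example, `recommend("pregnancy", "soft cheese", date(2025, 1, 1), date(2024, 12, 1))` returns a warning, because soft cheese is flagged as harmful in the hypothetical pregnancy rule set.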
Background
Diet is important in everyday life, and appropriate nutrition from daily meals is essential for our health. Smart home technology may provide an avenue for monitoring it. However, such technology has seen relatively poor uptake, and we need to understand how people consider and accept it. In the literature review, I found some studies related to smart home technologies and food detection applications, but few of them analyse their usability and acceptance, so it is necessary to explore this topic further. I then conducted online surveys and interviews. The results indicated that people with medical conditions, such as pregnant women, need dietary care. Finally, I created FoodCare for pregnant women and evaluated its usability and acceptance.
This video introduces the background, interaction, functionality, and technologies of FoodCare.
Domain: Smarthome-Based Food Detection System
Research question: What are the usability and acceptance considerations of a smarthome-based food detection system?
Research approach: Interview + Online Survey
Prototype: image detection system + mobile interface
Design & Development Process
Research
Quantitative Research: An online survey aimed to gather people's opinions of smart home technologies.
Qualitative Research: Semi-structured interviews explored pregnant women's dietary-care requirements and their views on a food detection system.
These two research methods were applied in the early stage of the design process for idea generation. I then identified the potential users and created personas through data analysis.
Prototype
Detection System: A Raspberry Pi with a camera performs the food detection. The TensorFlow software library and the OpenCV open-source library are used to build real-time computer vision algorithms, and the Google Cloud Speech API provides the text-to-speech function. Here I only built a prototype representing the voice feedback; the speech recordings were downloaded in advance as mp3 files.
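Because the prototype only represents voice feedback with pre-downloaded mp3 recordings, the playback-selection step reduces to a lookup from a detected food label to a clip. A minimal sketch, in which the directory, labels, and file names are assumptions for illustration rather than the project's actual assets:

```python
from pathlib import Path

# Hypothetical mapping from a detected food label to a pre-recorded
# mp3 clip; these file names are illustrative assumptions.
AUDIO_DIR = Path("voice_clips")
CLIPS = {
    "apple": "apple_ok.mp3",
    "soft_cheese": "soft_cheese_warning.mp3",
}
FALLBACK = "unknown_item.mp3"

def clip_for(label: str) -> Path:
    """Pick the pre-recorded clip matching a detection label."""
    return AUDIO_DIR / CLIPS.get(label, FALLBACK)
```

A full implementation would replace the pre-recorded clips with calls to a cloud text-to-speech service, but the lookup keeps the prototype offline and deterministic.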
Mobile Interface: InVision is used for the mobile interface and interaction design.
Evaluation
TAM Survey: The technology acceptance model (TAM) was used to evaluate the usability and functionality of the food detection system. I created an online survey in Google Forms and invited six participants to take part. The results indicated that some users had no prior knowledge of this new technology but would like to try it; others cared more about accuracy, privacy, and security, and may not try it until it is proven and widely employed.
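A TAM questionnaire is typically scored by averaging the Likert responses for each construct. A minimal sketch, assuming hypothetical 5-point responses from the six participants (the real survey data is not shown here):

```python
from statistics import mean

# Hypothetical 5-point Likert responses from six participants,
# grouped by TAM construct; not the project's actual survey data.
responses = {
    "perceived_usefulness": [4, 5, 3, 4, 4, 5],
    "perceived_ease_of_use": [3, 4, 4, 2, 5, 4],
    "intention_to_use": [3, 4, 3, 3, 4, 5],
}

# Average each construct to a single score on the 1-5 scale.
scores = {construct: round(mean(vals), 2) for construct, vals in responses.items()}
```

With only six participants the scores are indicative rather than statistically significant, which matches the exploratory framing of this evaluation.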
References
[1] Get Emoji - All Emojis to Copy and Paste. (2020). Retrieved 25 October 2020, from https://getemoji.com/
[2] The Best At-Home Sunday Brunch Recipes. (2020). Retrieved 25 October 2020, from https://www.delish.com/cooking/menus/g2645/brunch-breakfast-recipes/
[3] T. Pizzuti, G. Mirabelli, M. A. Sanz-Bobi, and F. Goméz-Gonzaléz, "Food Track & Trace ontology for helping the food traceability control," Journal of Food Engineering, vol. 120, pp. 17–30, Jan. 2014, doi: 10.1016/j.jfoodeng.2013.07.017.
[4] J. Shin, Y. Park, and D. Lee, "Who will be smart home users? An analysis of adoption and diffusion of smart homes," Technological Forecasting and Social Change, vol. 134, pp. 246–253, Sep. 2018, doi: 10.1016/j.techfore.2018.06.029.
[5] M. Schill, D. Godefroit-Winkel, M. F. Diallo, and C. Barbarossa, "Consumers' intentions to purchase smart home objects: Do environmental issues matter?," Ecological Economics, vol. 161, pp. 176–185, Jul. 2019, doi: 10.1016/j.ecolecon.2019.03.028.