[HEADING=1]Project Overview[/HEADING]
Developed in partnership with UCL, Great Ormond Street Hospital (GOSH) and Intel, the WakeUp System provides reassurance and support to patients with disabilities in long-term care and to the elderly. One of the system's main objectives is to enable them to control their environment autonomously, and it is designed for use in both hospital and home settings. The project also incorporates a research study of Intel's OpenVINO toolkit.

The system architecture is divided into three parts: triggers, targets and the WakeUp system itself. The WakeUp system acts as a message broker, connecting a patient's commands to an appropriate target device and enabling actions to be performed. The system can be configured by a healthcare professional or by the patient using a graphical user interface (GUI).

The WakeUp System represents a promising idea in healthcare technology innovation, with the potential to improve quality of life for disabled patients and the efficiency of healthcare services. This report provides a comprehensive overview of the design concept, the technical implementation of the project and the potential impact it could have on healthcare.

[HEADING=1]Demo Video[/HEADING]

[HEADING=1]Project Journey[/HEADING]
This project was completed over the course of six months. For the first three months, the team focused on requirements engineering: we set functional and non-functional requirements, created context and architecture diagrams and broke the project down with the stakeholders so that it would be straightforward to implement in the following three months, making sure the most important features and requirements were included. This process also allowed the team to see how much was realistically achievable and what should be kept as optional, to be completed if time allowed.

For the implementation stage, the team used agile methodologies, sprints, continuous integration, continuous testing and Git best practices. Agile was chosen as the software development methodology because the team and stakeholders maintained constant communication through weekly meetings and we wanted continuous feedback on design and implementation, which enabled us to develop a successful product. At the start of each week, the team would update stakeholders on progress and decide what needed to be done before the next meeting, then complete that work. We also held our own internal meetings to demonstrate work to other team members and review how much progress had been made.

[HEADING=1]Technical Details[/HEADING]
During the development process, the project team used a variety of software development tools and practices. Continuous integration was incorporated to support the reliability and scalability of the system. In particular, the project used advanced technologies such as Intel OpenVINO, which significantly improves the performance of the AI models within the system. The GUI was built with Microsoft Foundation Classes (MFC). All data edited by staff within the MFC application is saved to the database to ensure long-term storage of application data. Features include managing users, devices and signals, and exporting usage data.
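To make the broker-centred architecture more concrete, below is a minimal sketch of how a trigger might hand a signal to the WakeUp system over ZeroMQ, one of the technologies we worked with. The endpoint, trigger ID and message fields are illustrative assumptions, not the system's actual protocol.

[CODE]
# Minimal sketch of a trigger publishing a signal to the WakeUp broker.
# The endpoint, trigger ID and message fields below are illustrative only.
import time

import zmq  # pip install pyzmq

BROKER_ENDPOINT = "tcp://localhost:5556"   # hypothetical broker address
TRIGGER_ID = "eye-blink-01"                # hypothetical trigger credential


def send_signal(action: str) -> None:
    """Publish a single signal message for the broker to route to a target."""
    context = zmq.Context.instance()
    socket = context.socket(zmq.PUSH)      # a PUB or REQ socket would also work
    socket.connect(BROKER_ENDPOINT)
    message = {
        "trigger_id": TRIGGER_ID,
        "action": action,                  # e.g. "toggle_light"
        "timestamp": time.time(),
    }
    socket.send_json(message)
    socket.close()


if __name__ == "__main__":
    send_signal("toggle_light")
[/CODE]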
[HEADING=1]Triggers[/HEADING]
In this section we show a visual representation of each trigger we have developed, as well as a video describing our project.

[HEADING=2]Tapping/Snapping Fingers Detection[/HEADING]
The above gif demonstrates our sound classification trigger. It works by detecting tapping or finger-snapping sounds repeatedly within a specified timeframe and then sending a signal to the WakeUp system. In the gif, the smart plug is turned on after a finger snap is detected.

[HEADING=2]Eye Blinking Detection[/HEADING]
As the gif demonstrates, our eye blinking detection trigger tracks the movement of the eyes, constantly calculates the distance between the upper and lower eyelids, and counts a blink whenever that distance falls below a preset threshold. It counts the number of blinks over a certain period, with the count serving as a signal in the WakeUp System. For example, this trigger can be used for turning on lights or opening curtains.

[HEADING=2]Morse Code Vision[/HEADING]
This gif shows our Morse code vision trigger. Building on the eye blinking trigger, we introduced a threshold to differentiate long and short blinks, allowing for Morse code communication. The gif shows that after a long blink to initiate the trigger, three consecutive short blinks are decoded as the letter "S". While a little more complex for patients to use than simple blink detection, it enables the transmission of letters, giving the user at least 27 different actions from a single trigger. For example, a nurse can configure the letter T to turn on the TV. (A sketch of the decoding step is shown after the trigger descriptions below.)

[HEADING=2]Fall Detection[/HEADING]
This trigger detects whether a patient is falling from a seated position by tracking upper-body keypoints with the YOLOv8n-pose model, optimised with OpenVINO. A signal is generated if the patient's body angle becomes too large, which can then prompt actions such as sending a message and photo to a nurse. The gif above shows the trigger continuously tracking the angle of the patient's body and displaying a falling alert when the angle exceeds the preset threshold. (A sketch of the angle check is also shown below.)
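The Morse code vision trigger essentially maps blink durations to dots and dashes and looks the pattern up in a Morse table. The sketch below illustrates that decoding step; the dash threshold and the abbreviated Morse table are illustrative values rather than the project's actual settings, and it assumes an upstream face-landmark model already reports how long each eye closure lasted.

[CODE]
# Minimal sketch of the blink-to-Morse decoding step.
# Assumes an upstream face-landmark model already reports how long each
# eye closure lasted; thresholds and the Morse table are illustrative only.

DASH_SECONDS = 0.6           # closures at least this long count as a dash

MORSE_TO_LETTER = {"...": "S", "-": "T", ".-": "A", "---": "O"}  # excerpt


def closures_to_symbols(closure_durations):
    """Map blink durations (in seconds) to Morse dots and dashes."""
    return "".join("-" if d >= DASH_SECONDS else "." for d in closure_durations)


def decode_letter(closure_durations):
    """Return the decoded letter, or None if the pattern is unknown."""
    return MORSE_TO_LETTER.get(closures_to_symbols(closure_durations))


# Three short blinks -> "..." -> "S", as in the gif above.
print(decode_letter([0.15, 0.20, 0.18]))  # S
print(decode_letter([0.90]))              # T, e.g. to turn on the TV
[/CODE]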
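Similarly, the fall detection trigger boils down to estimating the torso angle from pose keypoints and comparing it with a threshold. The sketch below uses the Ultralytics YOLOv8n-pose model named above; the COCO keypoint indices, the webcam source and the 60-degree threshold are assumptions for illustration, and the real trigger sends a signal to the WakeUp broker rather than printing.

[CODE]
# Minimal sketch of the body-angle check described above.
# Assumes YOLOv8n-pose keypoints in COCO order (5/6 = shoulders,
# 11/12 = hips); the angle threshold is illustrative, not the project's value.
import math

import cv2
from ultralytics import YOLO  # pip install ultralytics

FALL_ANGLE_DEGREES = 60        # torso tilt from vertical that counts as a fall

model = YOLO("yolov8n-pose.pt")
# model.export(format="openvino") would produce an OpenVINO-optimised copy.


def torso_angle(keypoints_xy):
    """Angle (degrees) between the shoulder-hip line and the vertical axis."""
    shoulders = (keypoints_xy[5] + keypoints_xy[6]) / 2
    hips = (keypoints_xy[11] + keypoints_xy[12]) / 2
    dx, dy = hips[0] - shoulders[0], hips[1] - shoulders[1]
    return abs(math.degrees(math.atan2(float(dx), float(dy))))


cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    for person in result.keypoints.xy:          # one row of (x, y) per keypoint
        if torso_angle(person) > FALL_ANGLE_DEGREES:
            print("Possible fall detected")     # the real trigger signals the broker here
cap.release()
[/CODE]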
[HEADING=1]Results and Outcomes[/HEADING]
[HEADING=1]OpenVINO Benchmark[/HEADING]
As one of our key functional requirements, the OpenVINO toolkit boosts the inference performance of our triggers on edge devices, allowing us to run more capable models with limited computational power. To critically assess the performance boost provided by OpenVINO, we conducted a detailed benchmark on both triggers that use it, namely upper-body fall detection and Whisper.

[HEADING=2]Upper Body Fall Detection[/HEADING]
For the upper-body fall detection system, which uses the YOLOv8n-pose model, we compared performance with and without OpenVINO integration. We evaluated both live and offline inference. For live inference, we performed a continuous 3-minute test, while for offline inference we assessed performance on the COCO 2017 validation dataset. Our comparisons focused on mean boot time, pre-processing time, post-processing time and inference time. The benchmark was carried out on a computer running Ubuntu 22.04, equipped with an Intel i7-13650HX processor and 16 GB of RAM.

The results, as depicted in the figures, reveal that although OpenVINO slightly increased boot time and only slightly improved pre-processing and post-processing times, it delivered a substantial improvement in the main bottleneck of the model's running time: average inference time. Notably, OpenVINO reduced inference time by an average of 11 ms. Overall, including pre-processing and post-processing, OpenVINO reduced average latency by approximately 25%.

[HEADING=2]Whisper[/HEADING]
To assess the performance boost provided by OpenVINO for Whisper, we benchmarked our audio transcription trigger, which uses the Whisper model. We evaluated both live and offline inference. The live benchmark consisted of an hour of live inference, while the offline benchmark involved processing a 30-minute audio file. Our comparisons focused on average inference time. The improvements for the Whisper model were just as notable as for fall detection: offline, OpenVINO cut the average inference time by 22%, and live, we observed a remarkable 56% performance boost, with average inference time dropping from 1.2 seconds to just 0.5 seconds.
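For reference, a latency comparison of this kind can be sketched in a few lines: export the model to OpenVINO through Ultralytics and time the two variants on the same input. The snippet below is a simplified sketch of such a measurement; the random stand-in image and iteration count are illustrative, whereas our actual benchmark used live video, the COCO 2017 validation set and per-stage timings.

[CODE]
# Rough sketch of the kind of latency comparison described above: the same
# YOLOv8n-pose model is timed with and without its OpenVINO export.
# The stand-in image and iteration count are illustrative only.
import time

import numpy as np
from ultralytics import YOLO

N_RUNS = 200
frame = np.random.randint(0, 255, (640, 640, 3), dtype=np.uint8)  # stand-in image


def mean_inference_ms(model):
    """Average per-call latency in milliseconds over N_RUNS predictions."""
    model(frame, verbose=False)                      # warm-up run
    start = time.perf_counter()
    for _ in range(N_RUNS):
        model(frame, verbose=False)
    return (time.perf_counter() - start) / N_RUNS * 1000


pytorch_model = YOLO("yolov8n-pose.pt")
openvino_path = pytorch_model.export(format="openvino")  # writes an *_openvino_model dir
openvino_model = YOLO(openvino_path)

print(f"PyTorch : {mean_inference_ms(pytorch_model):.1f} ms")
print(f"OpenVINO: {mean_inference_ms(openvino_model):.1f} ms")
[/CODE]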
[HEADING=1]Evaluation of the WakeUp System with Stakeholders and Professionals[/HEADING]
The WakeUp system has been evaluated through live, real-time demonstrations in front of two different audiences. The first demonstration took place at the FHIR Hackathon in February 2024, where we presented two signals. The first used eye blinks as the trigger to toggle a light bulb, while the second triggered an alert via Telegram when the user being monitored fell off a chair. These demonstrations were given to representatives from companies such as Roche, Microsoft and GOSH Drive. The second demonstration took place during Labs Day on 19 March, where we demonstrated Morse Vision and Morse Sound to teams from Intel and to the CTO of NTT DATA.

All the live demonstrations went smoothly, with no embarrassing glitches. All participants who tested the system were impressed and showed great interest in the product. They suggested various ideas on how the concept could be applied to other industries, such as nuclear power plants, farms or theatres. From the WakeUp team's perspective, we would like the setup process to be simpler. For example, to add a trigger, you first have to register it with the WakeUp system and then share the credentials with the trigger. This could be streamlined if the trigger could automatically register itself, although this might compromise the security of our system. We are also pleased that what started as an 'exploratory' project has quickly developed into an idea that has been presented to a real audience.

[HEADING=1]Lessons Learned[/HEADING]
Throughout the course of the project, we have taken away invaluable lessons about teamwork, communication and strategy. These include stakeholder and group communication, Git practices and exposure to a variety of different technologies.

Stakeholder and group communication: As this project was a partnership with different stakeholders, we maintained continuous communication in order to produce a system that would best match their requirements. Weekly meetings were held, where we would update stakeholders on progress and show what had been developed over the previous week. As we used agile methodologies, we learnt that requirements can change at any time during a project and that we should welcome those changes, because the focus is always on the stakeholders' value. Communication is vital to ensure progress is made, and it is very important that the voices of the stakeholders and all team members are heard in order to produce a successful product.

Git practices: During this project, we used concepts that were new to us, such as squashing commits, rebasing, pull requests and merging. Exposure to these concepts deepened our knowledge of Git and gave us experience of professional Git workflows.

Exposure to different technologies: Over the course of the project, we worked with a range of technologies and concepts, including Home Assistant, Matter device integration, machine learning integration and ZeroMQ, and we also learnt how to build executable files. This exposure provided professional programming practice and, as a result, improved the style of our code.

[HEADING=1]Collaboration and Teamwork[/HEADING]
[HEADING=2]Team member contributions[/HEADING]
This project would have been impossible without the collaboration of our nine amazing team members, and that teamwork played a vital role in its success. To ensure the smooth operation of our nine-person project, we made a clear split of work between team members, shown below:

Frischer David Roman Louis (Team Leader): team management, stakeholder communication, development of the Morse Vision and Morse Sound triggers.
Hussain Fatima: development of the MFC front end, production of the presentation video, development of the WakeOnLan target.
Jakupov Dias: development of sound-based triggers, integration of OpenVINO with sound-based triggers.
Kang Weihao: development of the Swagger documentation.
Rzayev Javid: development of the fall detection triggers.
Sivayoganathan Thuvaragan: development of the WakeUp system's data statistics.
Sun Ethan: development of PyTest for continuous integration.
Wang Jingyuan: development of the eye blinking trigger, integration of OpenVINO with vision-based triggers, production of the presentation video.
Wang Zena: production of the MFC front end.

[HEADING=2]Project Management[/HEADING]
To ensure the smooth operation of our nine-person project, we established a rigorous methodology and set of rules for project management and teamwork. Throughout the 10 weeks of development, we held three meetings each week to maintain the ongoing progress of the project: meetings with our UCL supervisors on Mondays, internal meetings to organise the weekly tasks on Wednesdays, and update meetings with our Intel partner on Fridays. Each member of the WakeUp team had a specific role to fulfil, with the Team Leader (TL) assisting and ensuring that everyone completed their tasks on schedule. To promote collaboration and integration among team members, strict common practices were implemented. For example:

Commit Message: must follow the template "[subject] Description of the task achieved." Following this convention made it easier to track the progress of each component in our mono-repository on GitHub.

Pull Request Review: only the TL was authorised to push to the main branch.
Other members were required to create a pull request, which one or two members would review to ensure that the commit message followed the convention, that there was a single (squashed) commit to be rebased and merged into main, and that the commit had passed the project's pytest suite so that it would not break other components of the system.

Notion App: the TL encouraged all team members to document their work progress and the knowledge gained throughout the project in the Notion workspace. Information about organisation, setup guides, troubleshooting and learning was recorded on the Notion page to centralise the team's knowledge.

[HEADING=1]Future Development[/HEADING]
The Matter WakeUp Signal System holds significant promise for future development and enhancement. Here are some potential areas for improvement:

Simplified Setup and Installation: Introducing a graphical installation process would make it easier for non-technical users to configure and use the system. This would streamline the setup process, reducing the potential for errors and making the system more accessible.

Advanced and Varied Triggers: Integrating more advanced machine learning models for precise patient monitoring and interaction could enhance the system's capabilities. This includes expanding the range of triggers to accommodate various patient needs and conditions.

Broader Device Compatibility: Expanding the range of compatible smart devices and ensuring seamless integration with other healthcare systems and standards is crucial. Future testing should include a wider array of smart devices, such as smart speakers and additional sensors.

[HEADING=1]Conclusion[/HEADING]
The Matter WakeUp Signal System project demonstrates the transformative potential of integrating smart technology into healthcare settings. The key points are:

Patient Autonomy: By enabling patients with disabilities and the elderly to control their environment autonomously, the system significantly improves their quality of life.

Workload Reduction: The system reduces the workload on nurses and hospital staff by automating routine tasks and monitoring.

Innovative Approach: The successful implementation of various triggers, the integration of Matter devices, and the use of advanced AI technologies such as OpenVINO showcase the project's innovative approach.

Positive Feedback: The system has received positive feedback from stakeholders and has shown practical applications and potential impact through successful demonstrations.

The project's commitment to enhancing patient autonomy and improving healthcare efficiency makes it a valuable addition to the field of healthcare technology.

[HEADING=1]Special Thanks to Contributors[/HEADING]
Each contributor's continuous support and involvement played a crucial role in the success of the project, and we would like to extend special thanks to the following contributors:

Prof. Dean Mohamedally, Chief Supervisor and Professor of Computer Science at UCL
Emmanuel Letier, Associate Professor in Software Engineering at UCL
Costas Stylianou, Senior Technical Specialist for Public Sector and Health & Life Sciences at Intel, and Honorary Associate Professor at UCL
Prof. Julia Manning, Professor, Clinician, Policy Adviser, Convenor
Dr. Rafiq, NHS GP / GP Trainer / Honorary Lecturer
Anelia Gaydardzhieva, Assistant Supervisor at UCL
Jason Holtom, Account Executive for the Public Sector and Healthcare

[HEADING=1]Call to Action[/HEADING]
We invite you to explore the Matter WakeUp Signal System further and consider how its innovative approach to patient care could be applied in your own healthcare settings. Connect with us: feel free to reach out to our team for more information or collaboration opportunities. Together, we can continue to innovate and improve the quality of care for patients.

[HEADING=1]Team[/HEADING]
The team involved in developing this project included nine members, all of whom are Master's students in Software Systems Engineering at UCL:

David Frischer – Team Leader – Full Stack Developer. GitHub: davidfrisch, LinkedIn: https://www.linkedin.com/in/david-frischer/
Fatima Hussain – Full Stack Developer. GitHub: fatimahuss, LinkedIn: https://www.linkedin.com/in/fatima-noor-hussain/
Dias Jakupov – Full Stack Developer. GitHub: Dias2406, LinkedIn: https://www.linkedin.com/in/dias-jakupov-a05258221/
Jingyuan Wang – Full Stack Developer. GitHub: Andydiantu, LinkedIn: https://www.linkedin.com/in/jingyuan-wang-9ba553208/
Weihao Kang – Full Stack Developer. GitHub: Kang2001, LinkedIn: https://www.linkedin.com/in/weihao-kang-b25152294
Thuvaragan Sivayoganathan – Full Stack Developer. GitHub: thuvasiva, LinkedIn: https://www.linkedin.com/in/thuvaragan-sivayoganathan-a95991227/
Javid Rzayev – Full Stack Developer. GitHub: Javid2002, LinkedIn: https://www.linkedin.com/in/javidrzayev/
Ethan Sun – Full Stack Developer. GitHub: EthanSun11
Zena Wang – Full Stack Developer. GitHub: ZenaWangqwq

#Accessibility #Innovation #SmartHome #TechnologyForGood #Matter #HomeAssistant #Microsoft #Intel #UCL #IXN