What Went Right This Month?
This month, usability testing provided valuable insights that led to a significant breakthrough in the project. During testing, it became apparent that the original approach to input required rethinking, particularly in how data is captured and processed. One of the key findings was the importance of knowing the camera's field of view and the approximate distance from the target in each frame. As a result, I decided to shift the focus from static image-based input to streaming video directly from webcams. This decision is pivotal because streaming video lets the program capture real-time data that better matches the dynamic nature of 3D animation. By leveraging the continuous flow of video frames, the program can track movements more accurately and render them as more precise, realistic 3D animations.
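To illustrate the new input path, here is a minimal sketch of streaming frames from a webcam with OpenCV instead of loading static images. The process_frame callback is a hypothetical placeholder for the project's pose-estimation and animation step, not the actual implementation.

```python
import cv2

def process_frame(frame):
    # Hypothetical placeholder for the real pipeline: pose estimation on the
    # frame, then feeding the result to the 3D animation stage.
    h, w = frame.shape[:2]
    print(f"Captured frame: {w}x{h}")

def stream_from_webcam(device_index=0):
    """Continuously read frames from a webcam and hand them to the pipeline."""
    cap = cv2.VideoCapture(device_index)
    if not cap.isOpened():
        raise RuntimeError("Could not open webcam")
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break  # camera disconnected or stream ended
            process_frame(frame)
            # Show a preview and stop when the user presses 'q'.
            cv2.imshow("preview", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()

if __name__ == "__main__":
    stream_from_webcam()
```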
What Went Wrong This Month?
The primary challenge this month stemmed from the limitations of the initial image-based input approach. One of the critical issues was the absence of essential metadata, such as the camera's field of view and distance from the target, both of which are crucial for accurate 3D animation generation. Without this metadata, the program could not produce reliable animations, leading to inconsistent output. This limitation forced a significant shift in focus from using static images to relying entirely on webcam video input. Although this shift was necessary, it highlighted a gap in the original design and required additional time and resources to adjust the development approach.
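To make concrete why this metadata matters, the sketch below uses the standard pinhole-camera relationship: the field of view gives the focal length in pixels, and only together with an estimate of the distance to the target can a pixel coordinate be lifted into a 3D position. The function names and example numbers are illustrative, not taken from the program.

```python
import math

def focal_length_px(image_width_px, horizontal_fov_deg):
    """Focal length in pixels derived from the camera's horizontal field of view."""
    return (image_width_px / 2) / math.tan(math.radians(horizontal_fov_deg) / 2)

def pixel_to_3d(u, v, depth, image_width_px, image_height_px, horizontal_fov_deg):
    """Back-project pixel (u, v) at a known depth into camera-space 3D coordinates.

    Assumes square pixels and a principal point at the image center.
    """
    f = focal_length_px(image_width_px, horizontal_fov_deg)
    cx, cy = image_width_px / 2, image_height_px / 2
    x = (u - cx) * depth / f
    y = (v - cy) * depth / f
    return x, y, depth

# Example: a point at pixel (800, 500) in a 1280x720 frame,
# 2 m away from a camera with a 60-degree horizontal field of view.
print(pixel_to_3d(800, 500, 2.0, 1280, 720, 60.0))
```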
How Can You Improve Moving Forward?
To improve the accuracy and reliability of the 3D animations produced by the program, it is crucial to implement a dedicated smoothing algorithm for the incoming video data. Raw video input, particularly from webcams, can often be jittery or inconsistent due to various factors such as camera shake or fluctuations in frame rate. A smoothing algorithm specifically designed for motion capture data will help mitigate these issues by stabilizing the captured movements across frames. This will ensure that the animations generated are smooth and realistic, enhancing the overall quality of the output. Moving forward, incorporating this smoothing process will be a priority to refine the data pipeline and ensure that the final animations meet the desired standards of accuracy and fluidity.
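One simple candidate for this step, assuming the pose data arrives as per-joint 3D coordinates each frame, is an exponential moving average. The class below is an illustrative sketch, not the final design; a more sophisticated filter could be substituted later without changing the rest of the pipeline.

```python
class ExponentialSmoother:
    """Exponential moving average over per-joint 3D positions.

    alpha closer to 1.0 follows the raw data tightly; closer to 0.0
    smooths more aggressively at the cost of added latency.
    """

    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self._state = None  # last smoothed pose, keyed by joint name

    def smooth(self, pose):
        """pose: dict mapping joint name -> (x, y, z) for the current frame."""
        if self._state is None:
            self._state = dict(pose)
            return dict(pose)
        smoothed = {}
        for joint, (x, y, z) in pose.items():
            px, py, pz = self._state.get(joint, (x, y, z))
            smoothed[joint] = (
                self.alpha * x + (1 - self.alpha) * px,
                self.alpha * y + (1 - self.alpha) * py,
                self.alpha * z + (1 - self.alpha) * pz,
            )
        self._state = smoothed
        return smoothed

# Usage sketch: run each frame's estimated pose through the smoother
# before handing it to the animation stage.
smoother = ExponentialSmoother(alpha=0.4)
frame_pose = {"right_wrist": (0.12, 0.85, 1.90)}
print(smoother.smooth(frame_pose))
```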
Comments