
Digital Child CRC

Client/Partner
Partners: ARC Centre of Excellence for the Digital Child, Curtin’s School of Allied Health, Curtin’s School of Electrical Engineering, Computing and Mathematical Sciences

Timeline
January 2023 – December 2024

Overview
The increasing use of digital technology by young children has raised concerns about its impact on their physical, social and psychological well-being, particularly in relation to posture and movement (PaM). Traditional methods for measuring children’s digital technology use and associated PaM rely heavily on subjective parental reports or direct observation, which are burdensome, prone to bias and lacking in precision. Existing device-based systems, such as accelerometers, often capture only activity intensity, missing nuanced data on specific postures or device interactions. This gap hinders the collection of the high-quality, objective data needed to understand the health implications of digital technology use in young children.

The Digital Child Project, a collaboration between the Curtin Institute for Data Science (CIDS), the ARC Centre of Excellence for the Digital Child and Curtin’s School of Allied Health and EECMS, addressed these challenges by developing objective, scalable measurement systems. Using machine learning and sensor technologies, the project created tools to capture detailed behavioural data, enabling research into digital technology’s effect on child development.

CIDS led the development of advanced machine learning models to classify posture and movement (e.g., lying, sitting, standing, walking) using data from thigh-worn accelerometers, enabling precise tracking of children’s physical behaviours during digital device use. CIDS also created object detection models to identify digital devices, such as smartphones, tablets and laptops, in video footage, incorporating face anonymisation techniques to ensure privacy compliance. Using the YOLOv8 framework, CIDS analysed video data from a 2022 laboratory study to localise children, distinguish them from researchers and extract skeletal poses for detailed PaM analysis, with models optimised on high-performance CIDS GPUs.
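The page does not describe the models themselves, so the two sketches below are illustrative only. The first shows one common way to turn thigh-accelerometer recordings into PaM predictions: summarise windowed tri-axial signals into simple features (including thigh inclination relative to gravity) and train a standard classifier. The window length, feature set and random-forest classifier are assumptions, not the project’s actual model.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier


def window_features(acc, fs=30, win_s=5):
    """Split tri-axial accelerometer data (N x 3, in g) into non-overlapping
    windows and compute simple summary features for each window."""
    win = fs * win_s
    n = (len(acc) // win) * win
    feats = []
    for seg in acc[:n].reshape(-1, win, 3):
        mean = seg.mean(axis=0)
        std = seg.std(axis=0)
        # Thigh inclination relative to gravity: a common cue for separating
        # lying/sitting from standing/walking with a thigh-worn sensor.
        incl = np.degrees(
            np.arccos(np.clip(mean[2] / (np.linalg.norm(mean) + 1e-9), -1, 1))
        )
        feats.append(np.concatenate([mean, std, [incl]]))
    return np.array(feats)


# Hypothetical usage: train_acc is raw accelerometer data and train_labels
# holds one label per window, e.g. "lying", "sitting", "standing", "walking".
clf = RandomForestClassifier(n_estimators=200, random_state=0)
# clf.fit(window_features(train_acc), train_labels)
# predictions = clf.predict(window_features(new_acc))
```

The second sketch shows a condensed YOLOv8-style video pipeline with the ultralytics package: detect people and candidate devices with a COCO detector, extract skeletal keypoints with the YOLOv8 pose model, and blur the head region as a simple stand-in for face anonymisation. The video filename, device class list and blur-based anonymisation are assumptions for illustration, not the project’s actual pipeline.

```python
import cv2
from ultralytics import YOLO

det = YOLO("yolov8n.pt")        # COCO detector: 'person', 'cell phone', 'laptop', ...
pose = YOLO("yolov8n-pose.pt")  # 17-keypoint human pose model

DEVICE_CLASSES = {"cell phone", "laptop", "tv"}  # screens of interest (assumption)

cap = cv2.VideoCapture("lab_session.mp4")        # hypothetical video file
while True:
    ok, frame = cap.read()
    if not ok:
        break

    # 1. Detect people and candidate digital devices in the frame.
    r = det(frame, verbose=False)[0]
    for box, cls in zip(r.boxes.xyxy, r.boxes.cls):
        name = det.names[int(cls)]
        if name == "person" or name in DEVICE_CLASSES:
            x1, y1, x2, y2 = map(int, box)
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
            cv2.putText(frame, name, (x1, y1 - 5),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)

    # 2. Extract skeletal keypoints, then blur the head region (anonymisation).
    p = pose(frame, verbose=False)[0]
    if p.keypoints is not None:
        for kps in p.keypoints.xy.cpu().numpy():   # (17, 2) per detected person
            head = kps[:5]                         # nose, eyes, ears in COCO ordering
            head = head[(head[:, 0] > 0) & (head[:, 1] > 0)]
            if len(head):
                x1, y1 = head.min(axis=0).astype(int)
                x2, y2 = head.max(axis=0).astype(int)
                pad = 30
                x1, y1 = max(x1 - pad, 0), max(y1 - pad, 0)
                roi = frame[y1:y2 + pad, x1:x2 + pad]
                if roi.size:
                    frame[y1:y2 + pad, x1:x2 + pad] = cv2.GaussianBlur(roi, (51, 51), 0)

    cv2.imshow("digital-child demo", frame)
    if cv2.waitKey(1) == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```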

The project delivered scalable systems that improved data accuracy for behavioural research, achieving reliable PaM classification and device detection in controlled settings and creating a strong foundation for future studies on child health and digital technology.