Miniature autonomous humanoid robot for environmental sensing and atmospheric water harvesting using bioinspired materials and AI-based vision

Hwa-Dong Liu, Chen-Wei Su, Chia-Hsun Chang, Cheng-Ze Li, Ping-Jui Lin

Applied Water Science, vol. 16, no. 5. DOI: 10.1007/s13201-026-02847-5. Published online 2026-04-26.
Citations: 0
Abstract
This study presents a miniature autonomous humanoid robotic system that not only performs real-time environmental monitoring and atmospheric water harvesting in extreme and unstructured environments, but also introduces several novel system-level innovations. The robot establishes a new Artificial Intelligence–Internet of Things (AI–IoT)–robot coordination architecture that integrates YOLO-based visual perception, ultrasonic ranging, and environmental sensing into a unified decision-making workflow, enabling multi-modal data fusion for adaptive navigation. A compact ESP32-CAM module combined with a customized YOLO detector achieves a 97% F1-score in target recognition and a 91% success rate in dynamic obstacle avoidance. Furthermore, the proposed system incorporates a micro-scale bioinspired water harvesting module, redesigned for mobile operation, which uses 100 g of silica gel to collect up to 25 mL of moisture daily at 23 °C and 75% relative humidity, and yields up to 77.6 L annually when scaled to 1000 g with system efficiency taken into account. To optimize adsorption performance, this study develops a humidity-driven collection-efficiency model that links real-time sensor inputs with water harvesting predictions and supports path-planning decisions that guide the robot toward high-humidity zones. Environmental parameters—including temperature, humidity, pressure, and volatile organic compounds (VOCs)—are captured by onboard sensors and transmitted to a cloud platform via Message Queuing Telemetry Transport (MQTT) and Hypertext Transfer Protocol (HTTP) for real-time visualization, mission adaptation, and autonomous task refinement. These innovations collectively form a new integration workflow that enhances environmental awareness, mobility robustness, and water harvesting efficiency. Experimental validations confirm the feasibility of the system for autonomous deployment in harsh, remote, or post-disaster conditions. Future work will incorporate swarm intelligence to extend multi-robot cooperation and resilience in climate-challenged environments.
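The reported yield figures can be cross-checked with a minimal sketch of a humidity-driven yield model. The linear scaling with sorbent mass and relative humidity, and the roughly 85% system-efficiency factor, are assumptions inferred here from the abstract's numbers (100 g of silica gel yields 25 mL/day at 23 °C and 75% RH; 1000 g yields 77.6 L/year); they are not the authors' published model.

```python
# Illustrative sketch, NOT the authors' collection-efficiency model.
# Baseline figures come from the abstract; the linear mass/humidity
# scaling and the ~85% efficiency factor are assumptions chosen so the
# scaled annual figure matches the reported 77.6 L.

BASELINE_YIELD_ML_PER_DAY = 25.0   # per 100 g silica gel at 23 degC, 75% RH
BASELINE_MASS_G = 100.0
BASELINE_RH_PERCENT = 75.0

def daily_yield_ml(mass_g: float, rh_percent: float,
                   efficiency: float = 0.85) -> float:
    """Predicted daily water yield in mL, scaling linearly with sorbent
    mass and relative humidity (illustrative assumption)."""
    mass_factor = mass_g / BASELINE_MASS_G
    rh_factor = rh_percent / BASELINE_RH_PERCENT
    return BASELINE_YIELD_ML_PER_DAY * mass_factor * rh_factor * efficiency

def annual_yield_l(mass_g: float, rh_percent: float,
                   efficiency: float = 0.85) -> float:
    """Annual yield in litres: daily yield times 365 days."""
    return daily_yield_ml(mass_g, rh_percent, efficiency) * 365 / 1000.0

# 25 mL/day x 10 (mass) x 365 days x 0.85 efficiency = 77.56 L/year,
# consistent with the 77.6 L figure reported in the abstract.
print(round(annual_yield_l(1000.0, 75.0), 1))
```

Under these assumptions the scaled annual figure reproduces the abstract's 77.6 L, which suggests the reported number already folds in an efficiency loss of about 15% relative to ideal linear scaling.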
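The sensor-to-cloud pipeline described above can be sketched as a small telemetry payload builder. The topic name and JSON field names below are hypothetical illustrations, not the authors' schema; the actual MQTT publish call is shown only as a comment since it requires a broker connection.

```python
import json
import time

# Assumed topic name for illustration only.
MQTT_TOPIC = "robot/env"

def make_payload(temp_c: float, rh_percent: float,
                 pressure_hpa: float, voc_ppb: float) -> str:
    """Bundle one onboard-sensor snapshot (temperature, humidity,
    pressure, VOCs) into a JSON string for MQTT or HTTP upload.
    Field names are illustrative assumptions."""
    return json.dumps({
        "ts": int(time.time()),        # Unix timestamp of the reading
        "temperature_c": temp_c,
        "humidity_pct": rh_percent,
        "pressure_hpa": pressure_hpa,
        "voc_ppb": voc_ppb,
    })

payload = make_payload(23.0, 75.0, 1013.2, 120.0)
# With a library such as paho-mqtt, publishing would look like:
#   client.publish(MQTT_TOPIC, payload, qos=1)
print(payload)
```

Bundling all parameters into one timestamped message keeps the ESP32-side transmit logic simple and lets the cloud platform drive both visualization and the humidity-based path-planning decisions from a single payload.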