{"title":"Closed-loop optogenetic neuromodulation enables high-fidelity fatigue-resistant muscle control","authors":"Guillermo Herrera-Arcos, Hyungeun Song, Seong Ho Yeon, Omkar Ghenand, Samantha Gutierrez-Arango, Sapna Sinha, Hugh Herr","doi":"10.1126/scirobotics.adi8995","DOIUrl":"10.1126/scirobotics.adi8995","url":null,"abstract":"<div >Closed-loop neuroprostheses show promise in restoring motion in individuals with neurological conditions. However, conventional activation strategies based on functional electrical stimulation (FES) fail to accurately modulate muscle force and exhibit rapid fatigue because of their unphysiological recruitment mechanism. Here, we present a closed-loop control framework that leverages physiological force modulation under functional optogenetic stimulation (FOS) to enable high-fidelity muscle control for extended periods of time (>60 minutes) in vivo. We first uncovered the force modulation characteristic of FOS, showing more physiological recruitment and significantly higher modulation ranges (>320%) compared with FES. Second, we developed a neuromuscular model that accurately describes the highly nonlinear dynamics of optogenetically stimulated muscle. Third, on the basis of the optogenetic model, we demonstrated real-time control of muscle force with improved performance and fatigue resistance compared with FES. This work lays the foundation for fatigue-resistant neuroprostheses and optogenetically controlled biohybrid robots with high-fidelity force modulation.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 90","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.adi8995","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141080376","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Title: Wearable robots for the real world need vision
Authors: Letizia Gionfrida, Daekyum Kim, Davide Scaramuzza, Dario Farina, Robert D. Howe
Science Robotics, 2024-05-22. DOI: 10.1126/scirobotics.adj8812
Abstract: To enhance wearable robots, novel vision approaches are needed for understanding user intent and perceiving the environment.
Title: Fully neuromorphic vision and control for autonomous drone flight
Authors: F. Paredes-Vallés, J. J. Hagenaars, J. Dupeyroux, S. Stroobants, Y. Xu, G. C. H. E. de Croon
Science Robotics, 2024-05-15. DOI: 10.1126/scirobotics.adi0591
Abstract: Biological sensing and processing are asynchronous and sparse, leading to low-latency and energy-efficient perception and action. In robotics, neuromorphic hardware for event-based vision and spiking neural networks promises to exhibit similar characteristics. However, robotic implementations have been limited to basic tasks with low-dimensional sensory inputs and motor actions because of the restricted network size in current embedded neuromorphic processors and the difficulties of training spiking neural networks. Here, we present a fully neuromorphic vision-to-control pipeline for controlling a flying drone. Specifically, we trained a spiking neural network that accepts raw event-based camera data and outputs low-level control actions for performing autonomous vision-based flight. The vision part of the network, consisting of five layers and 28,800 neurons, maps incoming raw events to ego-motion estimates and was trained with self-supervised learning on real event data. The control part consists of a single decoding layer and was learned with an evolutionary algorithm in a drone simulator. Robotic experiments show a successful sim-to-real transfer of the fully learned neuromorphic pipeline. The drone could accurately control its ego-motion, allowing for hovering, landing, and maneuvering sideways, even while yawing at the same time. The neuromorphic pipeline runs on board on Intel's Loihi neuromorphic processor with an execution frequency of 200 hertz, consuming 0.94 watt of idle power and a mere additional 7 to 12 milliwatts when running the network. These results illustrate the potential of neuromorphic sensing and processing for enabling insect-sized intelligent robots.
Title: Stereoscopic artificial compound eyes for spatiotemporal perception in three-dimensional space
Authors: Byungjoon Bae, Doeon Lee, Minseong Park, Yujia Mu, Yongmin Baek, Inbo Sim, Cong Shen, Kyusang Lee
Science Robotics, 2024-05-15. DOI: 10.1126/scirobotics.adl3606
Abstract: Arthropods' eyes are effective biological vision systems for object tracking and wide fields of view because of their structural uniqueness; however, unlike mammalian eyes, they can hardly acquire depth information about a static object because they provide only monocular cues. Therefore, most arthropods rely on motion parallax to track objects in three-dimensional (3D) space. Uniquely, the praying mantis (Mantodea) uses both compound structured eyes and a form of stereopsis and is capable of achieving object recognition in 3D space. Here, by mimicking the vision system of the praying mantis using stereoscopically coupled artificial compound eyes, we demonstrated spatiotemporal object sensing and tracking in 3D space with a wide field of view. Furthermore, to achieve a fast response with minimal latency, data storage/transportation, and power consumption, we processed the visual information at the edge of the system using a synaptic device and a federated split learning algorithm. The designed and fabricated stereoscopic artificial compound eye provides energy-efficient and accurate spatiotemporal object sensing and optical flow tracking. It exhibits a root mean square error of 0.3 centimeter while consuming only approximately 4 millijoules for sensing and tracking, more than 400 times less energy than conventional complementary metal-oxide semiconductor-based imaging systems. Our biomimetic imager shows the potential of integrating nature's unique design using hardware and software codesigned technology toward capabilities of edge computing and sensing.
Title: An ultrawide field-of-view pinhole compound eye using hemispherical nanowire array for robot vision
Authors: Yu Zhou, Zhibo Sun, Yucheng Ding, Zhengnan Yuan, Xiao Qiu, Yang Bryan Cao, Zhu'an Wan, Zhenghao Long, Swapnadeep Poddar, Shivam Kumar, Wenhao Ye, Chak Lam Jonathan Chan, Daquan Zhang, Beitao Ren, Qianpeng Zhang, Hoi-Sing Kwok, Mitch Guijun Li, Zhiyong Fan
Science Robotics, 2024-05-15. DOI: 10.1126/scirobotics.adi8666
Abstract: Garnering inspiration from biological compound eyes, artificial vision systems boasting a vivid range of diverse visual functional traits have come to the fore recently. However, most of these artificial systems rely on transformable electronics, which suffer from the complexity and constrained geometry of global deformation, as well as potential mismatches between optical and detector units. Here, we present a unique pinhole compound eye that combines a three-dimensionally printed honeycomb optical structure with a hemispherical, all-solid-state, high-density perovskite nanowire photodetector array. The lens-free pinhole structure can be designed and fabricated with an arbitrary layout to match the underlying image sensor. Optical simulations and imaging results matched well with each other and substantiated the key characteristics and capabilities of our system, which include an ultrawide field of view, accurate target positioning, and motion tracking. We further demonstrate the potential of our unique compound eye for advanced robotic vision by successfully completing a moving-target tracking mission.
Title: What to give your robot mother on Mother's Day
Authors: Robin R. Murphy
Science Robotics, 2024-05-15. DOI: 10.1126/scirobotics.adp8107
Abstract: Jung_E, the 2023 science-fiction movie from South Korea, suggests that a novel leg-wheel hybrid for robot locomotion might be appreciated.
Title: Learning robust autonomous navigation and locomotion for wheeled-legged robots
Authors: Joonho Lee, Marko Bjelonic, Alexander Reske, Lorenz Wellhausen, Takahiro Miki, Marco Hutter
Science Robotics, 2024-04-24. DOI: 10.1126/scirobotics.adi9641
Abstract: Autonomous wheeled-legged robots have the potential to transform logistics systems, improving operational efficiency and adaptability in urban environments. Navigating urban environments, however, poses unique challenges for robots, necessitating innovative solutions for locomotion and navigation. These challenges include the need for adaptive locomotion across varied terrains and the ability to navigate efficiently around complex dynamic obstacles. This work introduces a fully integrated system comprising adaptive locomotion control, mobility-aware local navigation planning, and large-scale path planning within the city. Using model-free reinforcement learning (RL) techniques and privileged learning, we developed a versatile locomotion controller. This controller achieves efficient and robust locomotion over various rough terrains, facilitated by smooth transitions between walking and driving modes. It is tightly integrated with a learned navigation controller through a hierarchical RL framework, enabling effective navigation through challenging terrain and various obstacles at high speed. Our controllers are integrated into a large-scale urban navigation system and validated by autonomous, kilometer-scale navigation missions conducted in Zurich, Switzerland, and Seville, Spain. These missions demonstrate the system's robustness and adaptability, underscoring the importance of integrated control systems in achieving seamless navigation in complex environments. Our findings support the feasibility of wheeled-legged robots and hierarchical RL for autonomous navigation, with implications for last-mile delivery and beyond.
Title: Why animals can outrun robots
Authors: Samuel A. Burden, Thomas Libby, Kaushik Jayaram, Simon Sponberg, J. Maxwell Donelan
Science Robotics, 2024-04-24. DOI: 10.1126/scirobotics.adi9754
Abstract: Animals are much better at running than robots. The difference in performance arises in the important dimensions of agility, range, and robustness. To understand the underlying causes for this performance gap, we compare natural and artificial technologies in the five subsystems critical for running: power, frame, actuation, sensing, and control. With few exceptions, engineering technologies meet or exceed the performance of their biological counterparts. We conclude that biology's advantage over engineering arises from better integration of subsystems, and we identify four fundamental obstacles that roboticists must overcome. Toward this goal, we highlight promising research directions that have outsized potential to help future running robots achieve animal-level performance.
Title: Grasping objects with the aid of haptics
Authors: Amos Matsiko
Science Robotics, 2024-04-24. DOI: 10.1126/scirobotics.adp8528
Abstract: A smart suction cup uses haptics to supplement vision for exploration of objects in a grasping task.