Science Robotics: Latest Publications

A guiding light for stimulating paralyzed muscles
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-05-22 DOI: 10.1126/scirobotics.ado9987
Jordan Williams
{"title":"A guiding light for stimulating paralyzed muscles","authors":"Jordan Williams","doi":"10.1126/scirobotics.ado9987","DOIUrl":"10.1126/scirobotics.ado9987","url":null,"abstract":"<div >Improving the performance of closed-loop optogenetic nerve stimulation can reproduce desired muscle activation patterns.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 90","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141080295","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Closed-loop optogenetic neuromodulation enables high-fidelity fatigue-resistant muscle control
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-05-22 DOI: 10.1126/scirobotics.adi8995
Guillermo Herrera-Arcos, Hyungeun Song, Seong Ho Yeon, Omkar Ghenand, Samantha Gutierrez-Arango, Sapna Sinha, Hugh Herr
{"title":"Closed-loop optogenetic neuromodulation enables high-fidelity fatigue-resistant muscle control","authors":"Guillermo Herrera-Arcos,&nbsp;Hyungeun Song,&nbsp;Seong Ho Yeon,&nbsp;Omkar Ghenand,&nbsp;Samantha Gutierrez-Arango,&nbsp;Sapna Sinha,&nbsp;Hugh Herr","doi":"10.1126/scirobotics.adi8995","DOIUrl":"10.1126/scirobotics.adi8995","url":null,"abstract":"<div >Closed-loop neuroprostheses show promise in restoring motion in individuals with neurological conditions. However, conventional activation strategies based on functional electrical stimulation (FES) fail to accurately modulate muscle force and exhibit rapid fatigue because of their unphysiological recruitment mechanism. Here, we present a closed-loop control framework that leverages physiological force modulation under functional optogenetic stimulation (FOS) to enable high-fidelity muscle control for extended periods of time (&gt;60 minutes) in vivo. We first uncovered the force modulation characteristic of FOS, showing more physiological recruitment and significantly higher modulation ranges (&gt;320%) compared with FES. Second, we developed a neuromuscular model that accurately describes the highly nonlinear dynamics of optogenetically stimulated muscle. Third, on the basis of the optogenetic model, we demonstrated real-time control of muscle force with improved performance and fatigue resistance compared with FES. This work lays the foundation for fatigue-resistant neuroprostheses and optogenetically controlled biohybrid robots with high-fidelity force modulation.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 90","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.adi8995","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141080376","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
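The closed-loop principle described in this abstract can be illustrated with a minimal sketch: a feedback controller modulates light intensity so that a toy muscle model tracks a desired force trajectory. The first-order muscle model and the PI gains below are illustrative assumptions, not the authors' identified neuromuscular model, which is highly nonlinear.

```python
import numpy as np

# Minimal sketch of closed-loop stimulation control (illustrative only).
# The first-order muscle model and PI gains are assumptions; the paper
# identifies a nonlinear neuromuscular model for optogenetically
# stimulated muscle and builds the controller on top of it.

def muscle_force(prev_force, light_intensity, dt=0.001, tau=0.05, gain=2.0):
    """Toy first-order muscle: force relaxes toward gain * intensity."""
    return prev_force + dt / tau * (gain * light_intensity - prev_force)

def run_closed_loop(target, dt=0.001, kp=4.0, ki=30.0):
    """Track a desired force trajectory by modulating light intensity."""
    force, integral = 0.0, 0.0
    history = []
    for f_des in target:
        error = f_des - force
        integral += error * dt
        intensity = np.clip(kp * error + ki * integral, 0.0, 1.0)  # bounded LED drive
        force = muscle_force(force, intensity, dt)
        history.append(force)
    return np.array(history)

t = np.arange(0, 2.0, 0.001)
target = 0.5 * (1 + np.sin(2 * np.pi * 1.0 * t))   # 1-Hz desired force profile
tracked = run_closed_loop(target)
print(f"tracking RMSE: {np.sqrt(np.mean((tracked - target) ** 2)):.4f}")
```

A plain PI loop is shown only to make the feedback structure concrete; the published controller compensates for the nonlinear optogenetic recruitment dynamics via the identified model.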
Wearable robots for the real world need vision
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-05-22 DOI: 10.1126/scirobotics.adj8812
Letizia Gionfrida, Daekyum Kim, Davide Scaramuzza, Dario Farina, Robert D. Howe
{"title":"Wearable robots for the real world need vision","authors":"Letizia Gionfrida,&nbsp;Daekyum Kim,&nbsp;Davide Scaramuzza,&nbsp;Dario Farina,&nbsp;Robert D. Howe","doi":"10.1126/scirobotics.adj8812","DOIUrl":"10.1126/scirobotics.adj8812","url":null,"abstract":"<div >To enhance wearable robots, understanding user intent and environmental perception with novel vision approaches is needed.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 90","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-05-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141080484","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Fully neuromorphic vision and control for autonomous drone flight
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-05-15 DOI: 10.1126/scirobotics.adi0591
F. Paredes-Vallés, J. J. Hagenaars, J. Dupeyroux, S. Stroobants, Y. Xu, G. C. H. E. de Croon
{"title":"Fully neuromorphic vision and control for autonomous drone flight","authors":"F. Paredes-Vallés,&nbsp;J. J. Hagenaars,&nbsp;J. Dupeyroux,&nbsp;S. Stroobants,&nbsp;Y. Xu,&nbsp;G. C. H. E. de Croon","doi":"10.1126/scirobotics.adi0591","DOIUrl":"10.1126/scirobotics.adi0591","url":null,"abstract":"<div >Biological sensing and processing is asynchronous and sparse, leading to low-latency and energy-efficient perception and action. In robotics, neuromorphic hardware for event-based vision and spiking neural networks promises to exhibit similar characteristics. However, robotic implementations have been limited to basic tasks with low-dimensional sensory inputs and motor actions because of the restricted network size in current embedded neuromorphic processors and the difficulties of training spiking neural networks. Here, we present a fully neuromorphic vision-to-control pipeline for controlling a flying drone. Specifically, we trained a spiking neural network that accepts raw event-based camera data and outputs low-level control actions for performing autonomous vision-based flight. The vision part of the network, consisting of five layers and 28,800 neurons, maps incoming raw events to ego-motion estimates and was trained with self-supervised learning on real event data. The control part consists of a single decoding layer and was learned with an evolutionary algorithm in a drone simulator. Robotic experiments show a successful sim-to-real transfer of the fully learned neuromorphic pipeline. The drone could accurately control its ego-motion, allowing for hovering, landing, and maneuvering sideways—even while yawing at the same time. The neuromorphic pipeline runs on board on Intel’s Loihi neuromorphic processor with an execution frequency of 200 hertz, consuming 0.94 watt of idle power and a mere additional 7 to 12 milliwatts when running the network. These results illustrate the potential of neuromorphic sensing and processing for enabling insect-sized intelligent robots.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 90","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140946765","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
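A sense of the spiking computation described here can be given with a minimal leaky integrate-and-fire (LIF) layer, the basic building block of spiking neural networks for event-based vision. All sizes, constants, and inputs below are illustrative assumptions; the paper's vision network has five layers and 28,800 neurons and was trained with self-supervised learning on real event data.

```python
import numpy as np

# Minimal sketch of a leaky integrate-and-fire (LIF) layer, the kind of
# unit used in spiking networks for event-based vision. The layer sizes,
# leak constant, threshold, and random input events are all assumptions.

rng = np.random.default_rng(0)

class LIFLayer:
    def __init__(self, n_in, n_out, leak=0.9, threshold=1.0):
        self.w = rng.normal(0, 0.1, size=(n_out, n_in))
        self.v = np.zeros(n_out)          # membrane potentials
        self.leak, self.threshold = leak, threshold

    def step(self, spikes_in):
        """One timestep: leak, integrate input spikes, fire, reset."""
        self.v = self.leak * self.v + self.w @ spikes_in
        spikes_out = (self.v >= self.threshold).astype(float)
        self.v[spikes_out > 0] = 0.0      # hard reset after firing
        return spikes_out

layer = LIFLayer(n_in=64, n_out=16)
out = np.zeros(16)
for t in range(200):                      # feed sparse random event frames
    events = (rng.random(64) < 0.05).astype(float)
    out = layer.step(events)
print("output spike count in last step:", int(out.sum()))
```

Because neurons carry state across timesteps and emit sparse binary spikes, computation happens only when events arrive, which is what makes such pipelines attractive for low-power onboard processors like Loihi.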
Stereoscopic artificial compound eyes for spatiotemporal perception in three-dimensional space
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-05-15 DOI: 10.1126/scirobotics.adl3606
Byungjoon Bae, Doeon Lee, Minseong Park, Yujia Mu, Yongmin Baek, Inbo Sim, Cong Shen, Kyusang Lee
{"title":"Stereoscopic artificial compound eyes for spatiotemporal perception in three-dimensional space","authors":"Byungjoon Bae,&nbsp;Doeon Lee,&nbsp;Minseong Park,&nbsp;Yujia Mu,&nbsp;Yongmin Baek,&nbsp;Inbo Sim,&nbsp;Cong Shen,&nbsp;Kyusang Lee","doi":"10.1126/scirobotics.adl3606","DOIUrl":"10.1126/scirobotics.adl3606","url":null,"abstract":"<div >Arthropods’ eyes are effective biological vision systems for object tracking and wide field of view because of their structural uniqueness; however, unlike mammalian eyes, they can hardly acquire the depth information of a static object because of their monocular cues. Therefore, most arthropods rely on motion parallax to track the object in three-dimensional (3D) space. Uniquely, the praying mantis (Mantodea) uses both compound structured eyes and a form of stereopsis and is capable of achieving object recognition in 3D space. Here, by mimicking the vision system of the praying mantis using stereoscopically coupled artificial compound eyes, we demonstrated spatiotemporal object sensing and tracking in 3D space with a wide field of view. Furthermore, to achieve a fast response with minimal latency, data storage/transportation, and power consumption, we processed the visual information at the edge of the system using a synaptic device and a federated split learning algorithm. The designed and fabricated stereoscopic artificial compound eye provides energy-efficient and accurate spatiotemporal object sensing and optical flow tracking. It exhibits a root mean square error of 0.3 centimeter, consuming only approximately 4 millijoules for sensing and tracking. These results are more than 400 times lower than conventional complementary metal-oxide semiconductor–based imaging systems. Our biomimetic imager shows the potential of integrating nature’s unique design using hardware and software codesigned technology toward capabilities of edge computing and sensing.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 90","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140946772","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
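The stereopsis that this work borrows from the praying mantis rests on standard binocular triangulation: the depth of a point follows from its disparity between the two coupled eyes. A minimal sketch of that geometry is below, with made-up baseline and focal-length values rather than the fabricated device's parameters.

```python
# Minimal sketch of depth recovery from binocular disparity, the geometric
# principle behind stereopsis. The baseline and focal length are assumed
# values for illustration, not the parameters of the fabricated eye.

def depth_from_disparity(x_left, x_right, baseline_cm=2.0, focal_px=500.0):
    """Standard pinhole-stereo triangulation: Z = f * B / disparity."""
    disparity = x_left - x_right            # pixel offset between the two eyes
    if disparity <= 0:
        raise ValueError("target must project with positive disparity")
    return focal_px * baseline_cm / disparity

# A target seen at pixel 260 in the left eye and 240 in the right eye:
print(f"estimated depth: {depth_from_disparity(260.0, 240.0):.1f} cm")
```

The same relation explains why a monocular compound eye cannot range a static target: with one viewpoint there is no disparity, which is why most arthropods fall back on motion parallax.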
An ultrawide field-of-view pinhole compound eye using hemispherical nanowire array for robot vision
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-05-15 DOI: 10.1126/scirobotics.adi8666
Yu Zhou, Zhibo Sun, Yucheng Ding, Zhengnan Yuan, Xiao Qiu, Yang Bryan Cao, Zhu’an Wan, Zhenghao Long, Swapnadeep Poddar, Shivam Kumar, Wenhao Ye, Chak Lam Jonathan Chan, Daquan Zhang, Beitao Ren, Qianpeng Zhang, Hoi-Sing Kwok, Mitch Guijun Li, Zhiyong Fan
{"title":"An ultrawide field-of-view pinhole compound eye using hemispherical nanowire array for robot vision","authors":"Yu Zhou,&nbsp;Zhibo Sun,&nbsp;Yucheng Ding,&nbsp;Zhengnan Yuan,&nbsp;Xiao Qiu,&nbsp;Yang Bryan Cao,&nbsp;Zhu’an Wan,&nbsp;Zhenghao Long,&nbsp;Swapnadeep Poddar,&nbsp;Shivam Kumar,&nbsp;Wenhao Ye,&nbsp;Chak Lam Jonathan Chan,&nbsp;Daquan Zhang,&nbsp;Beitao Ren,&nbsp;Qianpeng Zhang,&nbsp;Hoi-Sing Kwok,&nbsp;Mitch Guijun Li,&nbsp;Zhiyong Fan","doi":"10.1126/scirobotics.adi8666","DOIUrl":"10.1126/scirobotics.adi8666","url":null,"abstract":"<div >Garnering inspiration from biological compound eyes, artificial vision systems boasting a vivid range of diverse visual functional traits have come to the fore recently. However, most of these artificial systems rely on transformable electronics, which suffer from the complexity and constrained geometry of global deformation, as well as potential mismatches between optical and detector units. Here, we present a unique pinhole compound eye that combines a three-dimensionally printed honeycomb optical structure with a hemispherical, all-solid-state, high-density perovskite nanowire photodetector array. The lens-free pinhole structure can be designed and fabricated with an arbitrary layout to match the underlying image sensor. Optical simulations and imaging results matched well with each other and substantiated the key characteristics and capabilities of our system, which include an ultrawide field of view, accurate target positioning, and motion tracking function. We further demonstrate the potential of our unique compound eye for advanced robotic vision by successfully completing a moving target tracking mission.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 90","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.adi8666","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140946763","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
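The target-positioning capability follows from the pinhole geometry: each pinhole-detector pair admits light only along its own optical axis, so the identity of the brightest ommatidium directly encodes the target bearing. The sketch below illustrates this principle with a hypothetical ring layout and angular response; it is not the fabricated device's actual geometry.

```python
import numpy as np

# Minimal sketch of bearing estimation in a lens-free pinhole compound eye.
# The ring layout of ommatidia and the cosine-power angular response are
# assumptions made for illustration.

def ommatidium_axes(n_rings=4, per_ring=8):
    """Unit optical axes for ommatidia arranged in rings on a hemisphere."""
    axes = []
    for i in range(1, n_rings + 1):
        polar = i * (np.pi / 2) / (n_rings + 1)          # angle from zenith
        for j in range(per_ring):
            azim = 2 * np.pi * j / per_ring
            axes.append([np.sin(polar) * np.cos(azim),
                         np.sin(polar) * np.sin(azim),
                         np.cos(polar)])
    return np.array(axes)

def estimate_bearing(intensities, axes):
    """Bearing estimate = optical axis of the brightest ommatidium."""
    return axes[int(np.argmax(intensities))]

axes = ommatidium_axes()
true_dir = np.array([0.3, 0.2, 0.93])
true_dir /= np.linalg.norm(true_dir)
intensities = np.clip(axes @ true_dir, 0, None) ** 8     # toy angular response
est = estimate_bearing(intensities, axes)
err = np.degrees(np.arccos(np.clip(est @ true_dir, -1.0, 1.0)))
print(f"angular error: {err:.1f} deg")
```

Angular resolution in such a scheme is set by the density of ommatidia on the hemisphere, which is why the arbitrary-layout pinhole structure matched to a high-density detector array matters.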
What to give your robot mother on Mother’s Day
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-05-15 DOI: 10.1126/scirobotics.adp8107
Robin R. Murphy
{"title":"What to give your robot mother on Mother’s Day","authors":"Robin R. Murphy","doi":"10.1126/scirobotics.adp8107","DOIUrl":"10.1126/scirobotics.adp8107","url":null,"abstract":"<div ><i>Jung_E</i>, the 2023 science-fiction movie from South Korea, suggests that a novel leg-wheel hybrid for robot locomotion might be appreciated.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 90","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140946773","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Learning robust autonomous navigation and locomotion for wheeled-legged robots
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-04-24 DOI: 10.1126/scirobotics.adi9641
Joonho Lee, Marko Bjelonic, Alexander Reske, Lorenz Wellhausen, Takahiro Miki, Marco Hutter
{"title":"Learning robust autonomous navigation and locomotion for wheeled-legged robots","authors":"Joonho Lee, Marko Bjelonic, Alexander Reske, Lorenz Wellhausen, Takahiro Miki, Marco Hutter","doi":"10.1126/scirobotics.adi9641","DOIUrl":"https://doi.org/10.1126/scirobotics.adi9641","url":null,"abstract":"Autonomous wheeled-legged robots have the potential to transform logistics systems, improving operational efficiency and adaptability in urban environments. Navigating urban environments, however, poses unique challenges for robots, necessitating innovative solutions for locomotion and navigation. These challenges include the need for adaptive locomotion across varied terrains and the ability to navigate efficiently around complex dynamic obstacles. This work introduces a fully integrated system comprising adaptive locomotion control, mobility-aware local navigation planning, and large-scale path planning within the city. Using model-free reinforcement learning (RL) techniques and privileged learning, we developed a versatile locomotion controller. This controller achieves efficient and robust locomotion over various rough terrains, facilitated by smooth transitions between walking and driving modes. It is tightly integrated with a learned navigation controller through a hierarchical RL framework, enabling effective navigation through challenging terrain and various obstacles at high speed. Our controllers are integrated into a large-scale urban navigation system and validated by autonomous, kilometer-scale navigation missions conducted in Zurich, Switzerland, and Seville, Spain. These missions demonstrate the system’s robustness and adaptability, underscoring the importance of integrated control systems in achieving seamless navigation in complex environments. Our findings support the feasibility of wheeled-legged robots and hierarchical RL for autonomous navigation, with implications for last-mile delivery and beyond.","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"292 1","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140642540","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
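The hierarchical structure described in the abstract can be sketched as two nested policies: a high-level navigation policy that issues velocity commands and a low-level locomotion policy that tracks them. Both policies are stubbed with hand-written placeholders below; in the paper each is a neural network trained with model-free RL, and the low-level controller additionally blends walking and driving modes.

```python
import numpy as np

# Minimal sketch of a hierarchical navigation/locomotion controller.
# Both policies are hand-written stubs standing in for trained networks;
# the rates, state dimensions, and toy dynamics are assumptions.

def navigation_policy(goal_xy, robot_xy):
    """High level (e.g., 10 Hz): command a velocity toward the goal."""
    direction = goal_xy - robot_xy
    dist = np.linalg.norm(direction)
    speed = min(1.0, dist)                      # slow down near the goal
    return speed * direction / max(dist, 1e-6)  # [vx, vy] command

def locomotion_policy(vel_cmd, joint_state):
    """Low level (e.g., 50 Hz): map command + proprioception to joint targets.
    A trained policy would also choose between walking and driving modes."""
    return 0.1 * np.tanh(np.concatenate([vel_cmd, joint_state]))

robot_xy = np.zeros(2)
goal_xy = np.array([3.0, 1.0])
joint_state = np.zeros(4)
for step in range(100):                         # 10 s of toy 10-Hz updates
    vel_cmd = navigation_policy(goal_xy, robot_xy)
    _joint_targets = locomotion_policy(vel_cmd, joint_state)
    robot_xy += 0.1 * vel_cmd                   # assume perfect velocity tracking
print("final distance to goal:", round(float(np.linalg.norm(goal_xy - robot_xy)), 3))
```

Separating the two levels lets each run at its natural rate and be trained in its natural setting, which is the core appeal of the hierarchical RL framework the paper describes.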
Why animals can outrun robots
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-04-24 DOI: 10.1126/scirobotics.adi9754
Samuel A. Burden, Thomas Libby, Kaushik Jayaram, Simon Sponberg, J. Maxwell Donelan
{"title":"Why animals can outrun robots","authors":"Samuel A. Burden, Thomas Libby, Kaushik Jayaram, Simon Sponberg, J. Maxwell Donelan","doi":"10.1126/scirobotics.adi9754","DOIUrl":"https://doi.org/10.1126/scirobotics.adi9754","url":null,"abstract":"Animals are much better at running than robots. The difference in performance arises in the important dimensions of agility, range, and robustness. To understand the underlying causes for this performance gap, we compare natural and artificial technologies in the five subsystems critical for running: power, frame, actuation, sensing, and control. With few exceptions, engineering technologies meet or exceed the performance of their biological counterparts. We conclude that biology’s advantage over engineering arises from better integration of subsystems, and we identify four fundamental obstacles that roboticists must overcome. Toward this goal, we highlight promising research directions that have outsized potential to help future running robots achieve animal-level performance.","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"8 1","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140642703","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Grasping objects with the aid of haptics
IF 25 · Q1 · Computer Science
Science Robotics Pub Date : 2024-04-24 DOI: 10.1126/scirobotics.adp8528
Amos Matsiko
{"title":"Grasping objects with the aid of haptics","authors":"Amos Matsiko","doi":"10.1126/scirobotics.adp8528","DOIUrl":"10.1126/scirobotics.adp8528","url":null,"abstract":"<div >A smart suction cup uses haptics to supplement vision for exploration of objects in a grasping task.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 89","pages":""},"PeriodicalIF":25.0,"publicationDate":"2024-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"140642097","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0