Zero-Shot Autonomous Microscopy for Scalable and Intelligent Characterization of 2D Materials

Impact Factor: 16.0 · CAS Tier 1 (Materials Science) · JCR Q1 (Chemistry, Multidisciplinary)
ACS Nano · Pub Date: 2025-10-02 · DOI: 10.1021/acsnano.5c09057
Jingyun Yang, Ruoyan Avery Yin, Chi Jiang, Yuepeng Hu, Xiaokai Zhu, Xingjian Hu, Sutharsika Kumar, Samantha K. Holmes, Xiao Wang, Xiaohua Zhai, Keran Rong, Yunyue Zhu, Tianyi Zhang, Zongyou Yin, Yuan Cao, Haoning Tang, Aaron D. Franklin, Jing Kong, Neil Zhenqiang Gong, Zhichu Ren*, and Haozhe Wang*
{"title":"用于二维材料可扩展和智能表征的零射自主显微镜","authors":"Jingyun Yang,&nbsp;, ,&nbsp;Ruoyan Avery Yin,&nbsp;, ,&nbsp;Chi Jiang,&nbsp;, ,&nbsp;Yuepeng Hu,&nbsp;, ,&nbsp;Xiaokai Zhu,&nbsp;, ,&nbsp;Xingjian Hu,&nbsp;, ,&nbsp;Sutharsika Kumar,&nbsp;, ,&nbsp;Samantha K. Holmes,&nbsp;, ,&nbsp;Xiao Wang,&nbsp;, ,&nbsp;Xiaohua Zhai,&nbsp;, ,&nbsp;Keran Rong,&nbsp;, ,&nbsp;Yunyue Zhu,&nbsp;, ,&nbsp;Tianyi Zhang,&nbsp;, ,&nbsp;Zongyou Yin,&nbsp;, ,&nbsp;Yuan Cao,&nbsp;, ,&nbsp;Haoning Tang,&nbsp;, ,&nbsp;Aaron D. Franklin,&nbsp;, ,&nbsp;Jing Kong,&nbsp;, ,&nbsp;Neil Zhenqiang Gong,&nbsp;, ,&nbsp;Zhichu Ren*,&nbsp;, and ,&nbsp;Haozhe Wang*,&nbsp;","doi":"10.1021/acsnano.5c09057","DOIUrl":null,"url":null,"abstract":"<p >Characterization of atomic-scale materials traditionally requires human experts with months to years of specialized training. Even for trained human operators, accurate and reliable characterization remains challenging when examining newly discovered materials such as two-dimensional (2D) structures. This bottleneck drives demand for fully autonomous experimentation systems capable of comprehending research objectives without requiring large training data sets. In this work, we present ATOMIC (Autonomous Technology for Optical Microscopy &amp; Intelligent Characterization), an end-to-end framework that integrates foundation models to enable fully autonomous, zero-shot characterization of 2D materials. Our system integrates the vision foundation model (i.e., Segment Anything Model), large language models (i.e., ChatGPT), unsupervised clustering, and topological analysis to automate microscope control, sample scanning, image segmentation, and intelligent analysis through prompt engineering, eliminating the need for additional training. When analyzing typical MoS<sub>2</sub> samples, our approach achieves 99.7% segmentation accuracy for single layer identification, which is equivalent to that of human experts. In addition, the integrated model is able to detect grain boundary slits that are challenging to identify with human eyes. Furthermore, the system retains robust accuracy despite variable conditions, including defocus, color-temperature fluctuations, and exposure variations. It is applicable to a broad spectrum of common 2D materials─including graphene, MoS<sub>2</sub>, WSe<sub>2</sub>, SnSe─regardless of whether they were fabricated via top-down or bottom-up methods. This work represents the implementation of foundation models to achieve autonomous analysis, providing a scalable and data-efficient characterization paradigm that transforms the approach to nanoscale materials research.</p>","PeriodicalId":21,"journal":{"name":"ACS Nano","volume":"19 40","pages":"35493–35502"},"PeriodicalIF":16.0000,"publicationDate":"2025-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Zero-Shot Autonomous Microscopy for Scalable and Intelligent Characterization of 2D Materials\",\"authors\":\"Jingyun Yang,&nbsp;, ,&nbsp;Ruoyan Avery Yin,&nbsp;, ,&nbsp;Chi Jiang,&nbsp;, ,&nbsp;Yuepeng Hu,&nbsp;, ,&nbsp;Xiaokai Zhu,&nbsp;, ,&nbsp;Xingjian Hu,&nbsp;, ,&nbsp;Sutharsika Kumar,&nbsp;, ,&nbsp;Samantha K. Holmes,&nbsp;, ,&nbsp;Xiao Wang,&nbsp;, ,&nbsp;Xiaohua Zhai,&nbsp;, ,&nbsp;Keran Rong,&nbsp;, ,&nbsp;Yunyue Zhu,&nbsp;, ,&nbsp;Tianyi Zhang,&nbsp;, ,&nbsp;Zongyou Yin,&nbsp;, ,&nbsp;Yuan Cao,&nbsp;, ,&nbsp;Haoning Tang,&nbsp;, ,&nbsp;Aaron D. 
Franklin,&nbsp;, ,&nbsp;Jing Kong,&nbsp;, ,&nbsp;Neil Zhenqiang Gong,&nbsp;, ,&nbsp;Zhichu Ren*,&nbsp;, and ,&nbsp;Haozhe Wang*,&nbsp;\",\"doi\":\"10.1021/acsnano.5c09057\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p >Characterization of atomic-scale materials traditionally requires human experts with months to years of specialized training. Even for trained human operators, accurate and reliable characterization remains challenging when examining newly discovered materials such as two-dimensional (2D) structures. This bottleneck drives demand for fully autonomous experimentation systems capable of comprehending research objectives without requiring large training data sets. In this work, we present ATOMIC (Autonomous Technology for Optical Microscopy &amp; Intelligent Characterization), an end-to-end framework that integrates foundation models to enable fully autonomous, zero-shot characterization of 2D materials. Our system integrates the vision foundation model (i.e., Segment Anything Model), large language models (i.e., ChatGPT), unsupervised clustering, and topological analysis to automate microscope control, sample scanning, image segmentation, and intelligent analysis through prompt engineering, eliminating the need for additional training. When analyzing typical MoS<sub>2</sub> samples, our approach achieves 99.7% segmentation accuracy for single layer identification, which is equivalent to that of human experts. In addition, the integrated model is able to detect grain boundary slits that are challenging to identify with human eyes. Furthermore, the system retains robust accuracy despite variable conditions, including defocus, color-temperature fluctuations, and exposure variations. It is applicable to a broad spectrum of common 2D materials─including graphene, MoS<sub>2</sub>, WSe<sub>2</sub>, SnSe─regardless of whether they were fabricated via top-down or bottom-up methods. This work represents the implementation of foundation models to achieve autonomous analysis, providing a scalable and data-efficient characterization paradigm that transforms the approach to nanoscale materials research.</p>\",\"PeriodicalId\":21,\"journal\":{\"name\":\"ACS Nano\",\"volume\":\"19 40\",\"pages\":\"35493–35502\"},\"PeriodicalIF\":16.0000,\"publicationDate\":\"2025-10-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACS Nano\",\"FirstCategoryId\":\"88\",\"ListUrlMain\":\"https://pubs.acs.org/doi/10.1021/acsnano.5c09057\",\"RegionNum\":1,\"RegionCategory\":\"材料科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"CHEMISTRY, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Nano","FirstCategoryId":"88","ListUrlMain":"https://pubs.acs.org/doi/10.1021/acsnano.5c09057","RegionNum":1,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"CHEMISTRY, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 0

Abstract



Characterization of atomic-scale materials traditionally requires human experts with months to years of specialized training. Even for trained human operators, accurate and reliable characterization remains challenging when examining newly discovered materials such as two-dimensional (2D) structures. This bottleneck drives demand for fully autonomous experimentation systems capable of comprehending research objectives without requiring large training data sets. In this work, we present ATOMIC (Autonomous Technology for Optical Microscopy & Intelligent Characterization), an end-to-end framework that integrates foundation models to enable fully autonomous, zero-shot characterization of 2D materials. Our system integrates the vision foundation model (i.e., Segment Anything Model), large language models (i.e., ChatGPT), unsupervised clustering, and topological analysis to automate microscope control, sample scanning, image segmentation, and intelligent analysis through prompt engineering, eliminating the need for additional training. When analyzing typical MoS2 samples, our approach achieves 99.7% segmentation accuracy for single layer identification, which is equivalent to that of human experts. In addition, the integrated model is able to detect grain boundary slits that are challenging to identify with human eyes. Furthermore, the system retains robust accuracy despite variable conditions, including defocus, color-temperature fluctuations, and exposure variations. It is applicable to a broad spectrum of common 2D materials (including graphene, MoS2, WSe2, and SnSe), regardless of whether they were fabricated via top-down or bottom-up methods. This work represents the implementation of foundation models to achieve autonomous analysis, providing a scalable and data-efficient characterization paradigm that transforms the approach to nanoscale materials research.
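
The abstract describes the ATOMIC pipeline only at a high level. As a rough illustration of how its zero-shot segmentation and unsupervised-clustering steps could fit together, the sketch below runs the Segment Anything Model on an optical micrograph and groups the resulting segments by optical contrast. This is a minimal sketch under stated assumptions, not the authors' released code: the checkpoint file, image path, contrast feature, and cluster count are illustrative choices.

```python
# Minimal sketch: SAM segmentation + unsupervised clustering of flake regions.
# Checkpoint path, image path, and n_clusters are illustrative assumptions.
import cv2
import numpy as np
from segment_anything import sam_model_registry, SamAutomaticMaskGenerator
from sklearn.cluster import KMeans

# Load an optical micrograph of a 2D-material sample (hypothetical file name).
image = cv2.cvtColor(cv2.imread("mos2_optical.png"), cv2.COLOR_BGR2RGB)

# Zero-shot segmentation with the Segment Anything Model (no fine-tuning).
sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
masks = SamAutomaticMaskGenerator(sam).generate(image)

# Describe each segment by its mean color, then express it as optical contrast
# relative to the largest segment, assumed here to be bare substrate.
features = np.array([image[m["segmentation"]].mean(axis=0) for m in masks])
substrate = features[np.argmax([m["area"] for m in masks])]
contrast = (features - substrate) / (substrate + 1e-6)

# Unsupervised clustering groups segments by contrast; with layer-dependent
# optical contrast, clusters map onto substrate, monolayer, few-layer, etc.
k = min(4, len(masks))
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(contrast)
for m, lab in zip(masks, labels):
    print(f"area={m['area']:6d} px  cluster={lab}")
```

In the paper's full workflow this step is embedded in a larger loop with LLM-driven prompt engineering for microscope control and topological analysis of the segments; only the segmentation-plus-clustering core is shown here.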

Source journal: ACS Nano (Engineering & Technology / Materials Science: Multidisciplinary)
CiteScore: 26.00
Self-citation rate: 4.10%
Annual publication volume: 1627
Average review time: 1.7 months
Journal description: ACS Nano, published monthly, serves as an international forum for comprehensive articles on nanoscience and nanotechnology research at the intersections of chemistry, biology, materials science, physics, and engineering. The journal fosters communication among scientists in these communities, facilitating collaboration, new research opportunities, and advancements through discoveries. ACS Nano covers synthesis, assembly, characterization, theory, and simulation of nanostructures, nanobiotechnology, nanofabrication, methods and tools for nanoscience and nanotechnology, and self- and directed-assembly. Alongside original research articles, it offers thorough reviews, perspectives on cutting-edge research, and discussions envisioning the future of nanoscience and nanotechnology.