Latest publications from the 2021 IEEE International Joint Conference on Biometrics (IJCB)

Multi-subband and Multi-subepoch Time Series Feature Learning for EEG-based Sleep Stage Classification
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484344
Panfeng An, Zhiyong Yuan, Jianhui Zhao, Xue Jiang, Zengmao Wang, Bo Du
Abstract: EEG plays an important role in the analysis and recognition of brain activity and has great potential in the field of biometrics, yet EEG-based time series classification remains difficult because of nonstationary signal characteristics and individual differences. In this paper, we investigate the EEG signal classification problem and propose a multi-subband and multi-subepoch time series feature learning (MMTSFL) method for automatic sleep stage classification. Specifically, MMTSFL first decomposes the raw EEG signals into multiple subbands of different frequency ranges and partitions the obtained subbands into multiple consecutive subepochs, and then employs time series feature learning to obtain effective discriminant features. Moreover, amplitude-time-based signal features are extracted from each subepoch to represent the dynamic variation of EEG signals, and MMTSFL conducts further multipurpose feature learning for specific features, consistent features, and temporal features simultaneously. Experimental results on three classification tasks (sleep quality evaluation, fatigue detection, and sleep disease diagnosis) demonstrate the superiority of the proposed method.
(An illustrative sketch of the subband/subepoch preprocessing follows this entry.)
Citations: 0
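The abstract does not give implementation details, so the following is only a minimal sketch of the kind of preprocessing it describes: band-pass filtering one raw EEG epoch into several frequency subbands and slicing each subband into consecutive subepochs with simple amplitude-time descriptors. The band edges, epoch length, subepoch duration, and all function names are assumptions for illustration, not the authors' code.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def decompose_subbands(epoch, fs, bands=((0.5, 4), (4, 8), (8, 13), (13, 30))):
    """Band-pass filter one EEG epoch (1-D array) into several subbands.
    The band edges here (delta/theta/alpha/beta) are illustrative assumptions."""
    subbands = []
    for low, high in bands:
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        subbands.append(filtfilt(b, a, epoch))
    return subbands  # list of arrays, one per subband

def split_subepochs(signal, fs, subepoch_sec=5.0):
    """Cut a signal into consecutive, non-overlapping subepochs."""
    step = int(subepoch_sec * fs)
    n = len(signal) // step
    return [signal[i * step:(i + 1) * step] for i in range(n)]

# Example: a 30-second epoch sampled at 100 Hz (synthetic data for illustration).
fs = 100
epoch = np.random.randn(30 * fs)
features = []
for sb in decompose_subbands(epoch, fs):
    for se in split_subepochs(sb, fs):
        # Simple amplitude-time style descriptors per subepoch (illustrative).
        features.append([se.mean(), se.std(), np.ptp(se)])
features = np.asarray(features)  # one row per (subband, subepoch) pair
```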
Concealable Biometric-based Continuous User Authentication System: An EEG Induced Deep Learning Model
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484345
S. Gopal, Diksha Shukla
Abstract: This paper introduces a lightweight, low-cost, easy-to-use, and unobtrusive continuous user authentication system based on concealable biometric signals. The proposed authentication model continuously verifies a user's identity throughout the session while the user watches a video or performs free-text typing on a desktop/laptop keyboard. The model utilizes unobtrusively recorded electroencephalogram (EEG) signals and learns the user's unique biometric signature from brain activity. Our work has a multifold impact in the area of EEG-based authentication: (1) a comprehensive study and comparative analysis of a wide range of extracted features are presented, with the features categorized by EEG electrode placement position on the user's head; (2) an optimal feature subset is constructed using a minimal number of EEG electrodes; (3) a deep neural network-based user authentication model is presented that utilizes the constructed optimal feature subset; and (4) a detailed experimental analysis on a publicly available EEG dataset of 26 volunteer participants is presented. Our experimental results show that the proposed authentication model achieves an average Equal Error Rate (EER) of 0.137%. Although a thorough analysis on a larger pool of subjects must still be performed, these results show the viability of low-cost, lightweight EEG-based continuous user authentication systems.
(A minimal sketch of computing an EER from match scores follows this entry.)
Citations: 2
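The abstract reports performance as an Equal Error Rate (EER). The sketch below shows a small, generic way to compute an EER from genuine and impostor score lists; it is standard biometric evaluation logic, not the authors' pipeline, and the score arrays are synthetic stand-ins.

```python
import numpy as np

def compute_eer(genuine_scores, impostor_scores):
    """Equal Error Rate: operating point where false accept rate ~= false reject rate."""
    thresholds = np.sort(np.unique(np.concatenate([genuine_scores, impostor_scores])))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        far = np.mean(impostor_scores >= t)   # impostors wrongly accepted
        frr = np.mean(genuine_scores < t)     # genuine users wrongly rejected
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2
    return eer

# Synthetic similarity scores for illustration only.
rng = np.random.default_rng(0)
genuine = rng.normal(0.8, 0.1, 1000)
impostor = rng.normal(0.4, 0.1, 1000)
print(f"EER = {compute_eer(genuine, impostor):.3%}")
```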
Avoiding Spectacles Reflections on Iris Images Using A Ray-tracing Method
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484402
Yu Tian, Kunbo Zhang, Leyuan Wang, Chong Zhang
Abstract: Spectacle reflection removal is a challenging problem in iris recognition research. Reflections from spectacles usually contaminate iris images acquired under infrared illumination, and the intense reflections caused by the active light source make removal harder than in ordinary scenes because important iris texture features can be entirely obscured. Eliminating these unwanted reflections can effectively improve iris recognition system performance. This paper proposes a spectacle reflection removal algorithm based on ray coding and ray tracking. By decoding the light source's encoded beam, the iris imaging device eliminates most of the stray light; our binocular imaging device then tracks the light path to obtain parallax information and removes the reflected light spots through image fusion. We designed a prototype system to verify the proposed method. The method effectively eliminates reflections without altering iris texture and improves iris recognition in complex scenarios.
(A rough sketch of the two-view fusion idea follows this entry.)
Citations: 0
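Only the final image-fusion step lends itself to a simple illustration: if two registered views of the same eye are available and the reflection spots do not fall on the same iris regions, saturated pixels in one view can be filled from the other. The sketch below shows that idea alone; it ignores the coded-illumination and ray-tracking parts of the proposed method, and the saturation threshold and synthetic images are assumptions.

```python
import numpy as np

def fuse_remove_highlights(img_a, img_b, sat_thresh=240):
    """Rough illustration: replace specular (near-saturated) pixels of view A with
    the corresponding pixels of a registered second view B, assuming the
    reflection spots land on different iris regions in the two views."""
    img_a = img_a.astype(np.float32)
    img_b = img_b.astype(np.float32)
    mask_a = img_a >= sat_thresh          # candidate reflection spots in view A
    fused = img_a.copy()
    fused[mask_a] = img_b[mask_a]         # borrow uncorrupted pixels from view B
    return fused.astype(np.uint8), mask_a

# Synthetic 8-bit iris crops for illustration.
rng = np.random.default_rng(1)
a = rng.integers(40, 180, (64, 64), dtype=np.uint8)
b = a.copy()
a[10:14, 20:24] = 255                      # fake reflection in view A only
clean, mask = fuse_remove_highlights(a, b)
print("replaced pixels:", int(mask.sum()))
```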
Practical Face Swapping Detection Based on Identity Spatial Constraints
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484396
Jun Jiang, Bo Wang, Bing Li, Weiming Hu
Abstract: The generalization of face swapping detectors to unseen face manipulation methods is important for practical applications. Most existing methods based on convolutional neural networks (CNNs) simply map facial images to real/fake binary labels and achieve high performance on known forgeries, but they largely fail to detect new manipulation methods. To improve the generalization of face swapping detection, this work concentrates on a practical scenario, protecting specific persons, by proposing a novel face swapping detector that requires a reference image. To this end, we design a new detection framework based on identity spatial constraints (DISC), which consists of a backbone network and an identity semantic encoder (ISE). When inspecting an image of a particular person, the ISE uses a real facial image of that person as the reference to constrain the backbone to focus on identity-related facial areas, so as to exploit the intrinsic discriminative clues to the forgery in the query image. Cross-dataset evaluations on five large-scale face forgery datasets show that DISC significantly improves performance against unseen manipulation methods and is robust to distortions. Compared with existing detection methods, the AUC scores improve by 10% to 40%.
(A toy sketch of a reference-conditioned detector follows this entry.)
Citations: 6
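To make the reference-image idea concrete, the toy sketch below builds a tiny two-branch classifier in which an identity encoder embeds the reference face and its embedding is concatenated with backbone features of the query face before a real/fake head. This is an assumed, simplified stand-in for illustration; it is not the DISC architecture, and all layer sizes and names are invented for the example.

```python
import torch
import torch.nn as nn

class ReferenceConditionedDetector(nn.Module):
    """Toy sketch of a reference-conditioned face swapping detector (not DISC):
    the reference embedding lets the real/fake head reason about identity
    consistency between the query face and the protected person."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.backbone = nn.Sequential(          # query-face feature extractor
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, feat_dim))
        self.id_encoder = nn.Sequential(         # reference-face identity encoder
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, feat_dim))
        self.head = nn.Linear(2 * feat_dim, 2)   # real / fake logits

    def forward(self, query, reference):
        q = self.backbone(query)
        r = self.id_encoder(reference)
        return self.head(torch.cat([q, r], dim=1))

model = ReferenceConditionedDetector()
logits = model(torch.randn(2, 3, 112, 112), torch.randn(2, 3, 112, 112))
print(logits.shape)  # torch.Size([2, 2])
```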
BioCanCrypto: An LDPC Coded Bio-Cryptosystem on Fingerprint Cancellable Template
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484391
Xingbo Dong, Zhe Jin, Leshan Zhao, Zhenhua Guo
Abstract: Biometrics as a means of personal authentication has demonstrated strong viability in the past decade. However, directly deriving a unique cryptographic key from biometric data is a non-trivial task, because biometric data is usually noisy and shows large intra-class variations. Moreover, biometric data is permanently associated with the user, which raises security and privacy issues. Cancellable biometrics and bio-cryptosystems are the two main branches that address these issues, yet both approaches fall short in terms of accuracy, security, and privacy. In this paper, we propose a bio-cryptosystem on fingerprint cancellable templates (BioCanCrypto), which bridges cancellable biometrics and bio-cryptosystems to achieve a middle ground that alleviates the limitations of both. Specifically, a cancellable transformation is applied to a fixed-length fingerprint feature vector to generate cancellable templates. Next, an LDPC coding mechanism is introduced into a reusable fuzzy extractor scheme and used to extract a stable cryptographic key from the generated cancellable templates. The proposed system achieves both cancellability and reusability in one scheme. Experiments are conducted on a public fingerprint dataset, FVC2002. The results demonstrate that the proposed LDPC-coded reusable fuzzy extractor is effective and promising.
(A generic sketch of a seeded cancellable transform follows this entry.)
Citations: 3
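The pipeline has two stages (a cancellable transformation, then an LDPC-coded reusable fuzzy extractor). The sketch below illustrates only the first idea in a generic way: a seed-controlled random projection of a fixed-length fingerprint feature vector followed by binarization, where revoking the template simply means issuing a new seed. It is not the authors' transform, and the LDPC/fuzzy-extractor stage is omitted; the feature vectors are synthetic.

```python
import numpy as np

def cancellable_template(feature_vec, user_seed, out_dim=128):
    """Generic cancellable transform: project the fixed-length fingerprint
    feature vector with a seed-derived random matrix, then binarize.
    Revocation = issuing a new seed (illustrative only)."""
    rng = np.random.default_rng(user_seed)
    proj = rng.standard_normal((out_dim, feature_vec.shape[0]))
    transformed = proj @ feature_vec
    return (transformed > 0).astype(np.uint8)   # binary template

# Two noisy samples of the same finger (synthetic) stay close under the same seed;
# reissuing with a new seed yields an unrelated template.
rng = np.random.default_rng(42)
enroll = rng.standard_normal(300)
query = enroll + 0.1 * rng.standard_normal(300)     # intra-class noise
t_enroll = cancellable_template(enroll, user_seed=7)
t_query = cancellable_template(query, user_seed=7)
t_revoked = cancellable_template(enroll, user_seed=8)
print("same-seed Hamming distance:   ", np.mean(t_enroll != t_query))
print("revoked-seed Hamming distance:", np.mean(t_enroll != t_revoked))
```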
Identifying Rhythmic Patterns for Face Forgery Detection and Categorization
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484400
Jiahao Liang, Weihong Deng
Abstract: With the emergence of GANs, face forgery technologies have been heavily abused, so achieving accurate face forgery detection is imminent. Inspired by remote photoplethysmography (rPPG), in which the PPG signal corresponds to the periodic change of skin color caused by the heartbeat in face videos, we observe that despite the inevitable loss of PPG signal during the forgery process, a mixture of PPG signals remains in the forged video with a unique rhythmic pattern that depends on the generation method. Motivated by this key observation, we propose a two-stage network for face forgery detection and categorization consisting of: (1) a Spatial-Temporal Filter Module (STFM) for filtering PPG signals, and (2) an Adjacency Interaction Module (AIM) for the constraint and interaction of PPG signals. Moreover, with insight into how forgery methods generate videos, we further propose Spatial-Temporal Mixup (ST-Mixup) to boost the performance of the network. Overall, extensive experiments have proved the superiority of our method.
(A small sketch of extracting a crude rPPG signal follows this entry.)
Citations: 4
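As background for the rPPG observation the method builds on, the sketch below extracts a crude PPG-like signal from a face video by averaging the green channel over a fixed skin region per frame and finding the dominant frequency of that trace; real pulse energy concentrates around typical heart-rate frequencies. The frame rate, region of interest, and synthetic clip are assumptions, and this is not the proposed STFM/AIM network.

```python
import numpy as np

def ppg_signal(frames, roi=(slice(40, 80), slice(40, 80))):
    """Mean green-channel intensity of a fixed skin ROI per frame; frames is (T, H, W, 3)."""
    return np.array([f[roi][..., 1].mean() for f in frames])

def dominant_frequency(signal, fps):
    """Dominant frequency (Hz) of the mean-removed signal via the FFT magnitude."""
    x = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    return freqs[spectrum[1:].argmax() + 1]   # skip the DC bin

# Synthetic 10 s clip at 30 fps with a 1.2 Hz (72 bpm) pulse added to the green channel.
fps, T = 30, 300
t = np.arange(T) / fps
frames = np.random.rand(T, 120, 120, 3)
frames[..., 1] += 0.05 * np.sin(2 * np.pi * 1.2 * t)[:, None, None]
sig = ppg_signal(frames)
print(f"dominant frequency: {dominant_frequency(sig, fps):.2f} Hz")
```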
Joint Feature Distribution Alignment Learning for NIR-VIS and VIS-VIS Face Recognition
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484385
T. Miyamoto, H. Hashimoto, Akihiro Hayasaka, Akinori F. Ebihara, Hitoshi Imaoka
Abstract: Face recognition for visible light (VIS) images achieves high accuracy thanks to the recent development of deep learning. However, heterogeneous face recognition (HFR), i.e., face matching across different domains, is still a difficult task due to the domain discrepancy and the lack of large HFR datasets. Several methods have attempted to reduce the domain discrepancy by means of fine-tuning, which causes significant degradation of performance in the VIS domain because it loses the highly discriminative VIS representation. To overcome this problem, we propose joint feature distribution alignment learning (JFDAL), a joint learning approach that utilizes knowledge distillation. It enables us to achieve high HFR performance while retaining the original performance in the VIS domain. Extensive experiments demonstrate that our proposed method delivers statistically significantly better performance than the conventional fine-tuning approach on the public HFR dataset Oulu-CASIA NIR&VIS and on popular VIS-domain verification datasets such as LFW, CFP, and AgeDB. Furthermore, comparative experiments with existing state-of-the-art HFR methods show that our method achieves comparable HFR performance on the Oulu-CASIA NIR&VIS dataset with less degradation of VIS performance.
(A generic sketch of the knowledge-distillation ingredient follows this entry.)
Citations: 3
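JFDAL itself is a specific joint-learning scheme; the sketch below shows only the generic knowledge-distillation ingredient the abstract mentions: a trainable student encoder is optimized for identification while its embeddings are kept close to those of a frozen teacher, one common way to limit degradation of the original (VIS) performance. The toy encoders, loss weighting, and the reuse of a single random tensor for both the "NIR" and "VIS" inputs are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

embed_dim = 128
teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 112 * 112, embed_dim))  # frozen VIS model (toy)
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 112 * 112, embed_dim))  # trainable encoder (toy)
for p in teacher.parameters():
    p.requires_grad_(False)

def distillation_loss(student_emb, teacher_emb, labels, id_head, alpha=0.5):
    """Identification loss on the student plus a feature-alignment (distillation)
    term that keeps student embeddings close to the frozen teacher's."""
    id_loss = F.cross_entropy(id_head(student_emb), labels)
    align = 1.0 - F.cosine_similarity(student_emb, teacher_emb, dim=1).mean()
    return id_loss + alpha * align

# One toy training step; imgs stands in for paired NIR (student) / VIS (teacher) inputs.
id_head = nn.Linear(embed_dim, 10)
opt = torch.optim.SGD(list(student.parameters()) + list(id_head.parameters()), lr=0.1)
imgs = torch.randn(8, 3, 112, 112)
labels = torch.randint(0, 10, (8,))
loss = distillation_loss(student(imgs), teacher(imgs), labels, id_head)
opt.zero_grad()
loss.backward()
opt.step()
print(float(loss))
```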
Message from Program Chairs of IJCB 2021
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/ijcb52358.2021.9521651
Citations: 0
NIR Iris Challenge Evaluation in Non-cooperative Environments: Segmentation and Localization
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484336
Caiyong Wang, Yunlong Wang, Kunbo Zhang, Jawad Muhammad, T. Lu, Qi Zhang, Q. Tian, Zhaofeng He, Zhenan Sun, Yiwen Zhang, Tian Liu, Wei Yang, Dongliang Wu, Yingfeng Liu, Ruiye Zhou, Huihai Wu, Hao Zhang, Junbao Wang, Jiayi Wang, Wantong Xiong, Xueyu Shi, Shaogeng Zeng, Peihua Li, Haodong Sun, Jing Wang, Jiale Zhang, Qi Wang, Huijie Wu, Xinhui Zhang, Haiqing Li, Yu Chen, Liang Chen, Menghan Zhang, Ye Sun, Zhiyong Zhou, F. Boutros, N. Damer, Arjan Kuijper, Juan E. Tapia, A. Valenzuela, C. Busch, G. Gupta, K. Raja, Xi Wu, Xiaojie Li, Jingfu Yang, Hongyan Jing, Xin Wang, B. Kong, Youbing Yin, Qi Song, Siwei Lyu, Shu Hu, L. Premk, Matej Vitek, Vitomir Štruc, P. Peer, J. Khiarak, F. Jaryani, Samaneh Salehi Nasab, S. N. Moafinejad, Y. Amini, M. Noshad
{"title":"NIR Iris Challenge Evaluation in Non-cooperative Environments: Segmentation and Localization","authors":"Caiyong Wang, Yunlong Wang, Kunbo Zhang, Jawad Muhammad, T. Lu, Qi Zhang, Q. Tian, Zhaofeng He, Zhenan Sun, Yiwen Zhang, Tian Liu, Wei Yang, Dongliang Wu, Yingfeng Liu, Ruiye Zhou, Huihai Wu, Hao Zhang, Junbao Wang, Jiayi Wang, Wantong Xiong, Xueyu Shi, Shaogeng Zeng, Peihua Li, Haodong Sun, Jing Wang, Jiale Zhang, Qi Wang, Huijie Wu, Xinhui Zhang, Haiqing Li, Yu Chen, Liang Chen, Menghan Zhang, Ye Sun, Zhiyong Zhou, F. Boutros, N. Damer, Arjan Kuijper, Juan E. Tapia, A. Valenzuela, C. Busch, G. Gupta, K. Raja, Xi Wu, Xiaojie Li, Jingfu Yang, Hongyan Jing, Xin Wang, B. Kong, Youbing Yin, Qi Song, Siwei Lyu, Shu Hu, L. Premk, Matej Vitek, Vitomir Štruc, P. Peer, J. Khiarak, F. Jaryani, Samaneh Salehi Nasab, S. N. Moafinejad, Y. Amini, M. Noshad","doi":"10.1109/IJCB52358.2021.9484336","DOIUrl":"https://doi.org/10.1109/IJCB52358.2021.9484336","url":null,"abstract":"For iris recognition in non-cooperative environments, iris segmentation has been regarded as the first most important challenge still open to the biometric community, affecting all downstream tasks from normalization to recognition. In recent years, deep learning technologies have gained significant popularity among various computer vision tasks and also been introduced in iris biometrics, especially iris segmentation. To investigate recent developments and attract more interest of researchers in the iris segmentation method, we organized the 2021 NIR Iris Challenge Evaluation in Non-cooperative Environments: Segmentation and Localization (NIR-ISL 2021) at the 2021 International Joint Conference on Biometrics (IJCB 2021). The challenge was used as a public platform to assess the performance of iris segmentation and localization methods on Asian and African NIR iris images captured in non-cooperative environments. The three best-performing entries achieved solid and satisfactory iris segmentation and localization results in most cases, and their code and models have been made publicly available for reproducibility research.","PeriodicalId":175984,"journal":{"name":"2021 IEEE International Joint Conference on Biometrics (IJCB)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130206555","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 19
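The challenge evaluates predicted iris masks against ground truth. The sketch below computes two metrics commonly used for this kind of evaluation, intersection over union (IoU) and the E1 pixel error rate, on binary masks; whether these exact metrics match the NIR-ISL 2021 protocol is an assumption, and the circular masks are synthetic.

```python
import numpy as np

def iou(pred, gt):
    """Intersection over Union of two binary masks."""
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union if union else 1.0

def e1_error(pred, gt):
    """E1: fraction of pixels where prediction and ground truth disagree."""
    return np.mean(pred != gt)

# Synthetic circular iris masks for illustration.
h, w = 240, 320
yy, xx = np.mgrid[0:h, 0:w]
gt = (xx - 160) ** 2 + (yy - 120) ** 2 < 60 ** 2
pred = (xx - 165) ** 2 + (yy - 118) ** 2 < 58 ** 2   # slightly shifted prediction
print(f"IoU = {iou(pred, gt):.3f}, E1 = {e1_error(pred, gt):.4f}")
```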
Contrastive Self-supervised Learning for Sensor-based Human Activity Recognition
2021 IEEE International Joint Conference on Biometrics (IJCB), Pub Date: 2021-08-04, DOI: 10.1109/IJCB52358.2021.9484410
Bulat Khaertdinov, E. Ghaleb, S. Asteriadis
Abstract: Deep learning models applied to sensor-based Human Activity Recognition usually require vast amounts of annotated time-series data to extract robust features. However, annotating signals coming from wearable sensors can be a tedious and often unintuitive process that requires specialized tools and predefined scenarios, making it expensive and time-consuming. This paper combines one of the most recent advances in Self-Supervised Learning (SSL), the SimCLR framework, with a powerful transformer-based encoder to introduce a Contrastive Self-supervised learning approach to Sensor-based Human Activity Recognition (CSSHAR) that learns feature representations from unlabeled sensory data. Extensive experiments conducted on three widely used public datasets show that the proposed method outperforms recent SSL models. Moreover, CSSHAR extracts more robust features than an identical supervised transformer when transferring knowledge from one dataset to another, as well as when very limited amounts of annotated data are available.
(A minimal sketch of the SimCLR contrastive loss follows this entry.)
Citations: 27
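CSSHAR combines a transformer encoder with the SimCLR framework; the sketch below shows only the generic SimCLR ingredient, the NT-Xent contrastive loss over two augmented views of the same sensor windows, with a tiny linear encoder standing in for the transformer. The temperature, window shape, and augmentations are assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent (SimCLR) loss: for each embedding, its other view is the positive
    and every other embedding in the batch is a negative."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)       # (2N, D)
    sim = z @ z.t() / temperature                             # scaled cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool)
    sim.masked_fill_(mask, float("-inf"))                     # exclude self-pairs
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Toy setup: 50-step, 6-channel IMU windows; the encoder stands in for the transformer.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(50 * 6, 64))
windows = torch.randn(16, 50, 6)
view1 = windows + 0.05 * torch.randn_like(windows)            # jitter augmentation
view2 = windows * (1 + 0.05 * torch.randn(16, 1, 1))          # scaling augmentation
loss = nt_xent(encoder(view1), encoder(view2))
print(float(loss))
```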