Title: Enhancing E-commerce Chatbots with Falcon-7B and 16-bit Full Quantization
Authors: Yang Luo, Zibu Wei, Guokun Xu, Zhengning Li, Ying Xie, Yibo Yin
Journal: Journal of Theory and Practice of Engineering Science
Publication date: 2024-02-28
DOI: 10.53469/jtpes.2024.04(02).08 (https://doi.org/10.53469/jtpes.2024.04(02).08)
Citations: 4
Abstract
E-commerce chatbots play a crucial role in customer service but often struggle to understand complex queries. This study introduces an approach built on the Falcon-7B model, a state-of-the-art Large Language Model (LLM) with 7 billion parameters. Trained on roughly 1,500 billion tokens drawn from RefinedWeb and curated corpora, Falcon-7B excels at natural language understanding and generation. Notably, running its transformer entirely in 16-bit precision keeps computation efficient without compromising scalability or performance. By harnessing these techniques, our method aims to redefine e-commerce chatbot systems, providing businesses with a robust solution for delivering personalized customer experiences.
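A minimal sketch of why 16-bit precision helps, under the assumption that "16-bit full quantization" here means storing and computing transformer weights in half precision (Falcon-7B is distributed in bfloat16): casting a 32-bit weight tensor to 16 bits halves its memory footprint while introducing only a small rounding error. The matrix below is a random stand-in for a real weight tensor, not the model's actual weights.

```python
import numpy as np

# Stand-in for one transformer weight matrix (hypothetical size).
rng = np.random.default_rng(0)
w32 = rng.standard_normal((1024, 1024)).astype(np.float32)

# Cast to 16-bit: half the bytes per parameter.
w16 = w32.astype(np.float16)

# Memory footprint in bytes: fp16 uses exactly half of fp32.
print(w32.nbytes, w16.nbytes)

# Worst-case elementwise rounding error from the cast stays small
# relative to the weights' scale (values here are roughly N(0, 1)).
max_err = float(np.max(np.abs(w32 - w16.astype(np.float32))))
print(max_err)
```

In practice, frameworks load such models with a dtype argument (e.g. `torch_dtype=torch.bfloat16` in Hugging Face `transformers`), so the halved footprint applies to all ~7 billion parameters at once.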