ISBN (digital): 9798331533991
ISBN (print): 9798331534004
Cloud storage is a vital component of cloud architecture, often utilizing distributed key-value stores like Amazon S3 and Google Cloud Storage for managing data and metadata. These systems distribute data across nodes using key ranges or consistent hashing, but they face challenges such as load imbalance and limited parallelism caused by uneven data distribution and varying node performance. Current implementations, such as MongoDB, address these imbalances by migrating data between nodes but often neglect the characteristics of the underlying data structures, incurring increased overhead from costly delete and insert operations. To address these issues, this design leverages the properties of the LSM tree, a commonly used storage engine, to optimize data migration. The approach introduces hot-zone prediction using nonlinear regression to accurately identify data hotspots based on key characteristics, insertion time, and TTL. A storage-engine-aware migration system migrates grouped SSTable files rather than individual key-value pairs, significantly reducing migration overhead. Additionally, the data-migration I/O is offloaded via the NVMe-oF protocol, minimizing CPU involvement and preserving node performance. Implemented on mongo-rocks, this solution improves load balancing by moving SSTable files directly between nodes, enhancing efficiency and reducing performance degradation in distributed key-value stores.
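The hot-zone prediction step described above can be sketched as a simple nonlinear (polynomial) regression over per-key-range statistics: fit a curve from an observed feature to the access rate, then flag ranges whose predicted rate exceeds a threshold as hot zones whose SSTable groups become migration candidates. All names, the single scalar feature, and the polynomial model here are illustrative assumptions; the paper's actual predictor uses key characteristics, insertion time, and TTL.

```python
import numpy as np

def predict_hot_zones(features, access_rates, threshold, degree=2):
    """Illustrative sketch: fit a polynomial regression from one
    per-key-range feature (e.g. recent access count) to the observed
    access rate, then return the indices of ranges whose predicted
    rate exceeds `threshold` (the "hot zones")."""
    features = np.asarray(features, dtype=float)
    access_rates = np.asarray(access_rates, dtype=float)
    # Least-squares polynomial fit serves as the nonlinear regression.
    coeffs = np.polyfit(features, access_rates, degree)
    predicted = np.polyval(coeffs, features)
    return [i for i, p in enumerate(predicted) if p > threshold]
```

In a full system, the returned indices would map to key ranges whose SSTable files are then migrated as a group rather than as individual key-value pairs.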
ISBN (digital): 9798350373820
ISBN (print): 9798350373837
In recent years, pre-trained language models based on the Transformer have brought significant breakthroughs to natural language processing (NLP) tasks. Their outstanding performance in general text understanding enables such models to handle a wide variety of tasks. However, large language models exhibit problems such as the inability to provide a definitive label for topic-identification tasks after interactive prompts and difficulty in controlling the label range. In specific application scenarios, large models cannot accurately perceive the demands of a task when data is insufficient. To address this issue, we propose a text semantic classification method called PN-AMLTrans. PN-AMLTrans consists of three modules: a Non-Adjacent Sequence Learning Module, AMLNet, and an AvgSoftmax Classification Module. In model computation, we first use the BERT pre-trained model to learn the initial features of the text, extracting as complete a semantic representation of the text as possible. Next, we use a deep residual connection mechanism to serially connect Transformer Encoders for learning word-embedding semantic matrices. Simultaneously, the attention scores from each layer of the Transformer Encoder are fed into AMLNet to extract the mapping between attention scores and classification clues. Finally, effective classification with key-weight amplification is achieved through a joint weighted calculation using average pooling and Softmax. Experimental results show that the model achieves 98.1% generalization performance on the Chinese topic-recognition dataset THUCNews and 83.5% state-of-the-art performance on the English topic-recognition dataset arXiv-10, validating the effectiveness of our method across different languages and demonstrating the effectiveness of the PN-AMLTrans model in specific applications.
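The final classification stage described above can be sketched as follows: average-pool the token embeddings into a sentence vector, optionally amplify it with an attention-derived clue vector, then project to class logits and apply a softmax. The function name, the elementwise amplification scheme, and the shapes are illustrative assumptions, not the paper's exact AvgSoftmax formulation.

```python
import numpy as np

def avg_softmax_classify(token_embeddings, weight, attention_clue=None):
    """Illustrative sketch of average-pooling + softmax classification.

    token_embeddings: (num_tokens, dim) array of contextual embeddings.
    weight: (num_classes, dim) linear projection to class logits.
    attention_clue: optional (dim,) vector amplifying key dimensions,
    standing in for AMLNet's attention-score-derived weights.
    """
    pooled = token_embeddings.mean(axis=0)   # average pooling over tokens
    if attention_clue is not None:
        pooled = pooled * attention_clue     # key-weight amplification
    logits = weight @ pooled                 # linear projection to classes
    exp = np.exp(logits - logits.max())      # numerically stable softmax
    return exp / exp.sum()
```

The output is a probability distribution over labels, so the predicted topic is simply its argmax.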
The 5th International Conference on Chemical and Bioprocess Engineering (ICCBPE) was organized by the Chemical Engineering Programme of the Faculty of Engineering, Universiti Malaysia Sabah. The primary purpose of ICCBPE 2015 was to discuss bridging technologies from laboratory and pilot scales to commercial scales for manufacturing advanced products through innovative processing of biomass and feedstock suitable for SMEs. This event served as a platform for researchers, engineers, academicians, and industrial professionals from countries such as the United States of America, Taiwan, Korea, Iran, Thailand, and Malaysia to present their research and development activities in chemical engineering sciences and applications. During the conference, a total of 81 manuscripts in the Oil and Gas, Waste, Biomass and Biofuels, and Environment categories were presented and discussed in a series of parallel presentations. We were honoured to have distinguished keynote speakers participating in this event: Prof. Dr. Ahmad Fauzi Ismail, founding Director of the Advanced Membrane Technology Research Center (AMTEC), Universiti Teknologi Malaysia, Malaysia; Prof. Dr. Ghasem D. Najafpour, Chairman of the Biotechnology Research Center, Babol Noshirvani University of Technology, Iran; and Prof. Dr. Shizhong Li, Executive Director of the MOST-USDA Joint Research Center for Biofuels, China. Last but not least, we are grateful to all of the reviewers for maintaining the standards and quality of the manuscripts throughout the reviewing process.

Kota Kinabalu, Sabah, April 2016
Chairman: Professor Ir. Dr. Rosalam Sarbatly
Chief Editor: Professor Dr. Awang Bono
Editorial Team: Associate Professor Dr. Chu Chi Ming, Dr. Tham Heng Jin, Dr. Noor Maizura Ismail, Ms. Zykamilia Kamin
Universiti Malaysia Sabah (UMS)