Applications designed using Microservices Architecture (MSA) offer the desirable quality of good maintainability. To achieve it, services must be designed appropriately and adhere to prescribed rules, and multiple aspects must be taken into account during service design. The objective of this study is to examine the factors that affect maintainability in service design, ultimately resulting in an application with strong maintainability. A Systematic Literature Review (SLR) is used to identify these factors and strategies for improving them by examining relevant publications on the subject. After examining 45 publications, the study identified 8 factors and 14 solutions that can improve those factors during the service design process. The outcomes of this SLR are expected to give application developers valuable insights, enabling them to produce service designs with commendable maintainability for the applications they build.
The purpose of this study is to identify and analyze what previous studies have done to understand the problems faced by Small and Medium Enterprises (SMEs) today and how Industry 4.0 technology can address those problems, so that SMEs can grow and maintain their business continuity. Most studies show that SMEs face considerable barriers when trying to use the potential of Industry 4.0 to improve the competitiveness and long-term sustainability of their businesses. This systematic literature review uses the Kitchenham technique to identify SME problems and the role of Industry 4.0 technologies in SMEs across various industrialized and developing countries. Several databases, including IEEE, ScienceDirect, Web of Science, Taylor & Francis, Emerald, and Springer, were searched for relevant research articles published between 2019 and 2023. The research findings offer valuable perspectives on the strategies, obstacles, and prospects that SMEs in different countries face in adjusting to Industry 4.0. The academic contribution of this research is to provide solutions by designing more robust approaches to promote the business sustainability of SMEs and their role in driving economic progress in their countries.
Rapid progress in artificial intelligence has created opportunities for extensive applications in software development. One area receiving attention is the evaluation of code quality using machine learning techniques. In this study, we examine the application of machine learning to predict the likelihood of defects in source code, using NASA archival data as case studies. The machine learning models employ neural network algorithms, and the dataset is partitioned into training and test data for performance evaluation. The findings indicate that the neural network technique with resampling yields high accuracy in predicting software defects. The trained neural network is capable of identifying intricate patterns in the data and providing precise measurements of the size and intensity of defects. These results have significant implications for the software industry, enabling developers to promptly identify potential vulnerabilities and take preventive measures before product release.
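The abstract does not give the model details, but a minimal sketch of the approach it describes, a small neural network trained on code metrics with minority-class resampling, might look like the following. The synthetic data here stands in for a NASA MDP table such as CM1, and the column count, layer sizes, and resampling scheme are all illustrative assumptions, not the paper's actual configuration.

# Minimal sketch: neural-network defect prediction with resampling (scikit-learn).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report

# Synthetic stand-in for a NASA MDP table: 21 static code metrics per module,
# binary defect label with defects as the minority class.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 21))
y = (rng.random(500) < 0.15).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

# Naive random oversampling of the minority class, on the training split only.
minority = np.flatnonzero(y_train == 1)
extra = rng.choice(minority, size=int((y_train == 0).sum() - minority.size))
X_train = np.vstack([X_train, X_train[extra]])
y_train = np.concatenate([y_train, y_train[extra]])

# Scale features, train a small feed-forward network, evaluate on held-out data.
scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=42)
clf.fit(scaler.transform(X_train), y_train)
print(classification_report(y_test, clf.predict(scaler.transform(X_test))))

Oversampling only the training split avoids leaking duplicated minority samples into the evaluation set, which would otherwise inflate the reported accuracy.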
Refactoring is a technique used in software development to improve the quality of code without changing its functionality. One metric often used to measure code quality is code coverage. This study examines refactoring techniques that can maximize the code coverage metric. Through the study, empirical evidence from various literature sources is identified, evaluated, and summarized. The results provide guidance on effective refactoring techniques for improving code coverage, along with their other positive impacts on software development. Ten refactoring techniques are identified that can be used to improve code coverage metrics in software testing.
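The ten techniques are not listed in this abstract, but Extract Method, one classic refactoring widely associated with improved testability, illustrates how refactoring can raise coverage: pulling buried logic out of I/O code exposes its branches to direct unit tests. The sketch below is illustrative only, and the function names are hypothetical.

# Illustrative refactoring: Extract Method.
# Before: validation is buried inside file I/O, so its branches are hard to
# reach from unit tests without fixture files.
def process_order_before(path):
    with open(path) as f:
        qty = int(f.read())
    if qty <= 0:
        raise ValueError("quantity must be positive")
    return qty * 9.99

# After: the pure logic is extracted into its own function, so every branch
# can be covered directly by a unit test, no file system required.
def validate_quantity(qty: int) -> int:
    if qty <= 0:
        raise ValueError("quantity must be positive")
    return qty

def process_order(path: str) -> float:
    with open(path) as f:
        return validate_quantity(int(f.read())) * 9.99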
In real life, many activities are performed sequentially and must be carried out in order, such as assembly steps in a manufacturing production process. Steps in such a series cannot be removed or added if the main goal of the series is to be achieved. There are also time series of events that occur naturally, such as rainy and hot conditions in a certain area. Classifying time-series activities is important for detecting possible anomalies. The significant development of machine learning models in recent years has made the classification of time series data an increasingly active research topic, and several previous studies have stated that deep learning models are more accurate at classifying time series data. In this paper, we compare Convolutional Neural Network (CNN) and Transformer deep learning models on time series classification. Experimental results on the same public datasets show that the CNN model is more accurate than the Transformer model: measured with a confusion matrix, the CNN achieves 92% accuracy and the Transformer 80%.
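The paper's exact architectures are not reproduced here, so the following is only a minimal sketch of a 1D-CNN time-series classifier of the kind being compared; the synthetic dataset, layer sizes, and training settings are assumptions for illustration.

# Minimal sketch: a 1D-CNN time-series classifier (TensorFlow/Keras).
import numpy as np
import tensorflow as tf

# Synthetic stand-in for a public time-series dataset: 600 sequences of
# length 128 with one channel and two classes.
n_samples, seq_len, n_classes = 600, 128, 2
rng = np.random.default_rng(0)
X = rng.normal(size=(n_samples, seq_len, 1)).astype("float32")
y = rng.integers(0, n_classes, size=n_samples)

# Two convolutional blocks followed by global pooling and a softmax head.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(seq_len, 1)),
    tf.keras.layers.Conv1D(32, kernel_size=7, activation="relu"),
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=3, validation_split=0.3, verbose=2)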
Software development estimation is the technique of calculating the effort required for software development. This study employed COCOMO II, an objective cost-estimation technique, for organizing and carrying out Multi Account Partner (MAP) software projects. The MAP software is a software architecture designed to support brokerage cryptocurrency exchanges using the order book and liquidity of established crypto exchanges. The research uses datasets from MAP project development at Indonesia Crypto Exchange Platform and aims to create a software cost-estimation model for MAP software using COCOMO II, so that the resulting model can serve as input or reference for estimates of subsequent MAP software development. The result estimates that the MAP software can be finished in about four to five months, with a development cost in the range of $7,441 to $8,780. Further research with datasets from other crypto exchanges is needed to increase cost-estimation accuracy using COCOMO II.
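For readers unfamiliar with the model, a small sketch of the standard COCOMO II post-architecture equations follows: effort PM = A * Size^E * prod(EM) with E = B + 0.01 * sum(SF), and schedule TDEV = C * PM^(D + 0.2 * (E - B)), using the published calibration constants A = 2.94, B = 0.91, C = 3.67, D = 0.28. The size, scale factors, and effort multipliers below are placeholders, not the MAP project's actual data.

# Sketch of the COCOMO II post-architecture model; inputs are hypothetical.
def cocomo2_effort(ksloc, scale_factors, effort_multipliers, A=2.94, B=0.91):
    """Effort in person-months: PM = A * Size^E * prod(EM), E = B + 0.01 * sum(SF)."""
    E = B + 0.01 * sum(scale_factors)
    pm = A * ksloc ** E
    for em in effort_multipliers:
        pm *= em
    return pm, E

def cocomo2_schedule(pm, E, B=0.91, C=3.67, D=0.28):
    """Duration in months: TDEV = C * PM^(D + 0.2 * (E - B))."""
    return C * pm ** (D + 0.2 * (E - B))

# Placeholder inputs: 10 KSLOC with roughly nominal scale factors and drivers.
pm, E = cocomo2_effort(ksloc=10.0,
                       scale_factors=[3.72, 3.04, 4.24, 3.29, 4.68],
                       effort_multipliers=[1.00, 0.91, 1.10])
months = cocomo2_schedule(pm, E)
print(f"Effort: {pm:.1f} person-months, schedule: {months:.1f} months")
# Cost then follows by multiplying person-months by an average labor rate.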
The development of technology and the internet is one of the critical factors that must be considered by companies, especially those engaged in e-commerce. The web-based application is one of the tools used by e-commer...
An information system is an important part of an organization, supporting its business processes and the achievement of its vision and mission. The information system nowadays has become one of the assets that ought to be protected ...
Crop Yield Analysis and Prediction is a fast-expanding discipline that is critical for optimizing agricultural methods. A lack of trustworthy data is one of the challenges in estimating crop yields. We develop predict...
Rising cyber risks have compelled organizations to adopt better cyber-protection measures. This study focused on identifying crucial security metrics and assessing the role of red teaming in enhancing cybersecurity defenses against novel cyber threats. Following the PRISMA standard, nine core research works published between 2014 and 2023 were reviewed. The inclusion of red teaming best practices can significantly enhance cybersecurity architecture: accurate simulations of cyber threats during red teaming exercises help identify vulnerabilities, and actively embracing red teaming can amplify an organization's capacity to repel future cyber assaults. Researchers and practitioners can utilize the study's insights to pioneer novel security solutions. Combining red teaming methodologies with relevant metrics is essential for enhancing cybersecurity posture. The study's findings grant companies a valuable advantage in navigating the rapidly changing cyber threat environment and reinforcing their cyber protection mechanisms.