Fuel efficiency of dump trucks is affected by real-world variables such as vehicle parameters, road conditions, weather, and driver behavior. Predicting fuel consumption per trip from dynamic road condition data can substantially reduce the cost and time of on-road testing. This paper proposes new models for predicting the fuel consumption of dump trucks in surface mining operations. The models combine data collected locally from dump truck sensors and analyze it to improve predictive capability. The architecture consists of two distinct parts: dual Long Short-Term Memory (LSTM) layers and dual dense layers of a deep neural network (DNN). The new hybrid architecture outperforms the other models considered, especially in terms of accuracy. The MAE, RMSE, MSE and R2 scores indicate high prediction accuracy.
Keywords: LSTM algorithm, DNN, density, prediction, fuel consumption, quarries
The paper considers the task of planning the transmission of messages of known volumes from source points to destinations with known demands. It is assumed that the costs of transmitting information are, on the one hand, proportional to the transmitted volumes and the per-unit cost of transmission over the selected communication channels and, on the other hand, include a fixed subscription fee for the use of channels that does not depend on the volume of transmitted information. Under this formulation, the indicator of plan quality is the total cost of sending the entire planned volume of messages. A comparative evaluation is carried out of the effectiveness of methods for obtaining optimal plans using a linearized objective function versus an exact solution obtained by one of the combinatorial methods.
Keywords: message transmission, transport task, criterion of minimum total costs, computational complexity of the algorithm, linearization of the objective function
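The trade-off described above can be illustrated on a toy fixed-charge instance (all numbers hypothetical, not from the paper): the exact combinatorial optimum is found by enumerating feasible plans, while the linearized variant spreads each channel's subscription fee over that channel's maximal possible flow.

```python
# Tiny fixed-charge transport instance (hypothetical data for illustration):
# cost(x) = sum(c[i][j]*x[i][j]) + sum of fixed fees f[i][j] of used channels.
supply, demand = [2, 1], [1, 2]
c = [[4, 1], [2, 6]]   # per-unit transmission costs
f = [[5, 1], [1, 8]]   # fixed subscription fees for used channels

def total_cost(x):
    """Exact objective: proportional part plus fixed fees of used channels."""
    return sum(c[i][j] * x[i][j] + (f[i][j] if x[i][j] > 0 else 0)
               for i in range(2) for j in range(2))

def feasible_plans():
    """All integer 2x2 plans satisfying the supply/demand balance."""
    for x11 in range(min(supply[0], demand[0]) + 1):
        x12 = supply[0] - x11
        x21 = demand[0] - x11
        x22 = supply[1] - x21
        if min(x12, x21, x22) >= 0 and x12 + x22 == demand[1]:
            yield [[x11, x12], [x21, x22]]

# Exact combinatorial solution: enumerate every plan.
exact = min(feasible_plans(), key=total_cost)

# Linearized objective: spread the fixed fee over the channel's maximal flow.
def linearized_cost(x):
    return sum((c[i][j] + f[i][j] / min(supply[i], demand[j])) * x[i][j]
               for i in range(2) for j in range(2))

approx = min(feasible_plans(), key=linearized_cost)
print(total_cost(exact), total_cost(approx))
```

On this instance both approaches happen to agree; in general the linearized plan is only an approximation, which is exactly what the comparative evaluation measures.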
Outlier detection is an important area of data research in various fields. The aim of the study is to provide a non-exhaustive overview of methods for detecting outliers in data based on various machine learning techniques: supervised, unsupervised, and semi-supervised. The article outlines the features of applying particular methods, their advantages and limitations. It is established that there is no universal outlier detection method suitable for all kinds of data; the choice of a particular method should therefore be based on an analysis of the advantages and limitations inherent in that method, with mandatory consideration of the available computing power and the characteristics of the available data, including their division into outliers and normal data, as well as their volume.
Keywords: outliers, machine learning, outlier detection, data analysis, data mining, big data, principal component analysis, regression, isolation forest, support vector machine
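As a minimal illustration of one of the simplest unsupervised techniques covered by such overviews (not a method from the article itself), a z-score filter flags points that lie far from the sample mean; the readings below are invented:

```python
from statistics import mean, stdev

def zscore_outliers(data, threshold=3.0):
    """Unsupervised detection: flag points whose z-score exceeds threshold."""
    m, s = mean(data), stdev(data)
    return [x for x in data if abs(x - m) / s > threshold]

# Hypothetical sensor readings with one gross error at the end.
readings = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.3, 25.0]
print(zscore_outliers(readings, threshold=2.0))
```

This illustrates the survey's point about limitations: the z-score rule assumes roughly normal data and a suitable threshold, which is why no single method is universal.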
This article explores how to optimize Quantum Espresso for efficient use of Nvidia graphics processing units (GPUs) via CUDA. Quantum Espresso is a powerful tool for quantum mechanical simulation and calculation of material properties. However, the original version of the package was not designed for GPUs, so optimization is required to achieve the best performance.
Keywords: Quantum Espresso, GPU, CUDA, compute acceleration
This article discusses the process of collecting initial data for the cadastral assessment of cultural heritage objects. Cultural heritage objects have a number of features that distinguish them from other real estate objects, so special attention should be paid to the methodology for assessing them. The purpose of the study is to analyze and identify problematic issues in collecting initial data on cultural heritage objects during the state cadastral assessment. The results revealed a number of problematic issues related to the type of cultural heritage object, its linkage to a cadastral number, and the possibility of interpreting the received information so that it is suitable for automatic data processing.
Keywords: state cadastral valuation, cultural heritage object, collection of initial data, interdepartmental interaction, status of cultural heritage object
The procedure for filling academic teaching positions at universities is regulated by federal laws and local regulations. The process requires storing and exchanging a large number of documents between the various participants of competitive selection events. The aim of the work was to automate the competitive selection process using a common data warehouse, which makes it possible to speed up paperwork, save time and consumables, and ensure the safety of storing, transmitting and processing information. The article presents the results of automating the competitive selection process at the St. Petersburg State University of Architecture and Civil Engineering.
Keywords: higher education institutions, competitive election, teaching staff, automation
The modern cycle of creating simulation models involves analysts, modelers, developers, and specialists from various fields. Numerous well-known tools are available to simplify simulation modeling; in addition, it is proposed to use large language models (LLMs) built on neural networks. The article considers the GPT-4 model as an example. Such models have the potential to reduce the financial and time costs of creating simulation models. Examples of using GPT-4 are presented, leading to the hypothesis that LLMs can replace, or significantly reduce, the labor of a large number of specialists and even skip the formalization stage. The processes of creating models and conducting experiments with different simulation modeling tools were compared, and the results were compiled into a comparative table based on the main simulation modeling criteria. Experiments with GPT-4 demonstrated that creating simulation models with LLMs is significantly faster and that the approach holds great promise.
Keywords: Simulation modeling, large language model, neural network, GPT-4, simulation environment, mathematical model
The publication discusses the definition of a common data environment (CDE). The main criteria for choosing a CDE are put forward, and a generalized analysis of the weaknesses of existing CDE systems is provided. The article will help the reader better understand CDEs and make an informed choice of system.
Keywords: common data environment, design, construction, information, information modeling, CDE, criteria, management, information organization, information transfer
The article is a review work on the methods and technologies used in the analysis of vulnerabilities in information systems. The article describes the main steps in conducting a vulnerability analysis, such as collecting information about the system, scanning the system for vulnerabilities, and analyzing the scan results. It also discusses how to protect against vulnerabilities, such as regularly updating software, conducting vulnerability analysis, and developing a data security strategy.
Keywords: vulnerability analysis, data security, information security threats, attack protection, information security, computer security, security risk, network vulnerability, security system, protection
This is a pilot study. Its purpose is to identify the nature of the relationship between Poisson's ratio and cohesion, using a soil mass as an example. The main objective is to determine the dependence between Poisson's ratio and the cohesion coefficient in order to obtain the fracture limit of the material (here, the soil mass), i.e. the onset of plastic flow. The study is conducted by methods of mathematical modeling. To achieve the objective, it is necessary to justify the feasibility of the experiment by means of a boundary value problem and to rank the number of numerical experiments using the experiment planning method so as to obtain the extrema. The numerical experiment itself is then performed to reveal the relationship between Poisson's ratio and cohesion. The obtained data will be used to formulate the inverse problem when testing a new Russian software product in the field of geotechnical and geomechanical modeling.
Keywords: Poisson's ratio, cohesion, soil massif, numerical experiment, finite element method, mathematical modelling, plastic flow, deformation, stress
This article is devoted to the research and detection of malware. The method implemented in this work dynamically detects Android malware by applying graph neural networks to system call graphs. The objective is to create a computer model for a method of detecting and investigating malware. Research on this topic is important for mathematical and software modeling, as well as for applying system call monitoring algorithms on Android devices. The originality of this direction lies in the constant improvement of approaches to combating malware, as well as in the limited information available worldwide on the use of computer simulation to study such phenomena.
Keywords: system calls, android, virus, malware, neural networks, artificial intelligence, fuzzy logic
The article addresses topical issues related to forecasting stock market dynamics with neural network machine learning models. The prospects of applying the neural network approach to building investment forecasts are highlighted. To predict the dynamics of changes in the value of securities, the problems of training a model on time series data are examined, and an approach to transforming the training data is proposed. The method of recursive feature elimination is described, which is used to identify the most significant parameters affecting price changes in the stock market. An experimental comparison of a number of neural networks was carried out to identify the most effective approach to forecasting market dynamics. As a separate example, a regression based on a radial basis function neural network was implemented and its quality assessed.
Keywords: stock market, forecast, daily slice, shares, neural network, machine learning, activation function, radial basis function, cross-validation, time series
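Recursive feature elimination, mentioned above, can be sketched with an ordinary least-squares model standing in for the article's networks (the data are synthetic): fit, drop the feature with the smallest standardized coefficient, repeat.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))          # three candidate features
y = 3.0 * X[:, 0] + 0.5 * X[:, 1]      # only the first two actually matter

def rfe(X, y, n_keep):
    """Recursive feature elimination with a least-squares model:
    fit, drop the feature with the smallest |coefficient|, repeat."""
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:
        Xs = X[:, remaining]
        Xs = (Xs - Xs.mean(0)) / Xs.std(0)     # standardize so coefs compare
        coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
        remaining.pop(int(np.argmin(np.abs(coef))))
    return remaining

print(rfe(X, y, n_keep=2))
```

The irrelevant third feature gets a near-zero coefficient and is eliminated first, leaving the two informative features, which mirrors how RFE isolates the significant market parameters.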
This article discusses the features of measuring and predicting changes in the surface electric field strength in the atmosphere. The results of measurements of the atmospheric electric field strength are presented. The possibilities of forecasting changes in the surface electric field strength are considered, including the use of numerical models and the use of measurement results as an indicator of dangerous weather phenomena. The article discusses the prospects of predicting variations in the surface electric field strength to anticipate adverse weather events, and the importance of monitoring the atmospheric electric field for understanding global climate change and the impact of the electric field on human health and the environment. For the research, a model was created that predicts electric field variations from meteorological data; the developed neural network has shown good results. It is demonstrated that neural networks can be an effective approach for predicting the parameters of the electric field in the surface layer of the atmosphere. In further research, it is planned to expand the measurement area by including additional parameters such as temperature, pressure and humidity in the analysis, and by using more complex machine learning models to improve forecast accuracy. Overall, the results show that machine learning models can be effective in predicting variations of the electric field in the surface layer of the atmosphere, with practical applications in fields such as aeronautics, meteorology and geology. Further research in this area will contribute to the development of new methods and technologies in electric power and communications and improve our knowledge of how atmospheric electrophysical phenomena affect the environment and human health.
Keywords: electric field, surface layer of the atmosphere, measurements, methods, forecasting, modeling of variations in field strength
One of the tasks of data preprocessing is eliminating gaps in the data, i.e. the imputation task. The paper proposes gap-filling algorithms based on the method of statistical simulation. The proposed algorithms include the following stages: clustering the data by a set of features; classifying an object with a gap; constructing, for each cluster, a distribution function of the feature that has gaps; and recovering missing values using the inverse function method. Computational experiments were carried out on statistical data on socio-economic indicators for the constituent entities of the Russian Federation for 2022. The properties of the proposed imputation algorithms are analyzed in comparison with known methods, and their efficiency is demonstrated.
Keywords: imputation algorithm, data gaps, statistical modeling, inverse function method, data simulation
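The final stage, recovery by the inverse function method, can be sketched as inverse-transform sampling from a cluster's empirical distribution (a simplified stand-in for the fitted distribution function; the data are hypothetical):

```python
import random

def empirical_inverse_cdf(observed):
    """Return F^{-1} built from the observed (non-missing) feature values."""
    xs = sorted(observed)
    n = len(xs)
    def inv(u):                  # inverse function method: x = F^{-1}(u)
        return xs[min(int(u * n), n - 1)]
    return inv

def impute(values, rng=random.Random(42)):
    """Fill None gaps by drawing from the cluster's empirical distribution."""
    inv = empirical_inverse_cdf([v for v in values if v is not None])
    return [v if v is not None else inv(rng.random()) for v in values]

# One cluster's feature values, with two gaps (made-up numbers).
data = [3.1, None, 2.9, 3.4, None, 3.0, 3.2]
print(impute(data))
```

Each gap is filled with a draw F^{-1}(u), u ~ U(0, 1), so imputed values follow the distribution observed within the object's cluster rather than a global mean.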
The article discusses the use of graph theory to calculate the placement of elements and the routes for laying information cables in a distributed control system. It describes how graph theory can help improve system performance, reduce maintenance costs, and increase reliability and security, and presents the general principles of applying it to problems of element placement and cable routing in distributed control systems. The authors conclude that graph theory is a powerful tool for such problems and can be effectively applied to improve system efficiency, reduce costs, and increase reliability and security.
Keywords: graph theory, distributed control system, Python, Matplotlib, production process optimization, automatic analysis, control system, data cable, automation
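One classic graph-theoretic formulation of cable laying (an illustration, not necessarily the authors' exact method) is the minimum spanning tree: connect all elements with the least total cable length. A sketch using Kruskal's algorithm on hypothetical equipment nodes:

```python
# Hypothetical cable-routing graph: nodes are equipment locations, edge
# weights are cable-run lengths in metres; Kruskal's algorithm finds the
# cheapest set of runs connecting every element (a minimum spanning tree).
edges = [  # (length, node_a, node_b)
    (4, "PLC", "SensorA"), (2, "PLC", "SensorB"), (5, "SensorA", "SensorB"),
    (7, "SensorA", "Panel"), (3, "SensorB", "Panel"), (6, "PLC", "Panel"),
]

def kruskal(edges):
    parent = {}
    def find(v):                            # union-find with path halving
        parent.setdefault(v, v)
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    tree, total = [], 0
    for w, a, b in sorted(edges):           # cheapest edges first
        ra, rb = find(a), find(b)
        if ra != rb:                        # edge joins two components
            parent[ra] = rb
            tree.append((a, b))
            total += w
    return tree, total

tree, total = kruskal(edges)
print(total)
```

For four nodes the tree has three edges; here the cheapest connecting layout uses 9 m of cable in total.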
The development of software, namely CRM systems, to improve a company's efficiency, regardless of its field, is highly effective for business in terms of organizational and managerial activities. An important aspect of successfully implementing and adopting such a system in a company is the principle of designing and building the system architecture at the server level. For users to work in the system, a deep analysis of the company's business processes is required, along with the projection of technical requirements onto both the user interface and the system's performance. The soundness of the system rests on an important factor, the maintainability of the existing code; this requirement is relevant for any project and depends on the initially chosen development method and the quality of the work performed by the programmers.
Keywords: CRM-system, BPM, hardcode, development, flexible settings
This article discusses the practical implementation of a self-sovereign identity system based on distributed decentralized data registry technology, also known as blockchain. An implementation based on the Proof of Stake (PoS) consensus mechanism is presented, which provides a number of advantages over alternative implementations described in the literature. Measurements of system performance in comparison with known Proof of Work (PoW) implementations are presented, confirming the high efficiency of the proposed solution.
Keywords: decentralized, user-centric, identity-based encryption, blockchain, self-sovereign identity system
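The core PoS idea behind such systems can be sketched as stake-weighted leader selection (a deliberate simplification of real consensus; validator names and stakes are made up):

```python
import random
from collections import Counter

# Sketch of the PoS principle: the validator producing the next block is
# chosen with probability proportional to its stake, instead of PoW's
# hash-power race. Stakes below are hypothetical.
stakes = {"alice": 60, "bob": 30, "carol": 10}

def pick_validator(stakes, rng):
    names, weights = zip(*stakes.items())
    return rng.choices(names, weights=weights, k=1)[0]

rng = random.Random(7)
wins = Counter(pick_validator(stakes, rng) for _ in range(10_000))
print(wins.most_common())
```

Over many rounds each validator's share of blocks tracks its share of stake, which is why PoS avoids the computational cost that the performance comparison with PoW measures.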
This article discusses forecasting the collection of payments in post offices, taking seasonality into account and using machine learning. An algorithm for constructing a calculation model has been developed that enables analysts of the Russian Post to make a monthly, seasonally adjusted forecast of payment collection for each UFPS (Federal Postal Administration). The model makes it possible to identify deviations from the norm in payment collection and to adjust service tariff increases more accurately. The SSA algorithm is considered, which consists of four steps: embedding, singular value decomposition, grouping, and diagonal averaging. The information system is implemented as a website using the ASP.NET Core framework and the ML.NET machine learning library. The forecast is then evaluated using various methods.
Keywords: mathematical modeling, seasonally adjusted forecasting, collection of payments, machine learning, neural network
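The four SSA steps named above can be sketched as follows (a generic SSA implementation for illustration, not the system's ML.NET code; the example series is synthetic):

```python
import numpy as np

def ssa(series, L):
    """Singular Spectrum Analysis in the four steps named in the abstract."""
    x = np.asarray(series, dtype=float)
    N, K = len(x), len(x) - L + 1
    # 1) Embedding: trajectory (Hankel) matrix of lagged windows.
    X = np.column_stack([x[i:i + L] for i in range(K)])
    # 2) Singular value decomposition of the trajectory matrix.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # 3) Grouping: here each elementary component is kept as its own group.
    components = [s[i] * np.outer(U[:, i], Vt[i]) for i in range(len(s))]
    # 4) Diagonal averaging: turn each matrix back into a time series.
    def diag_avg(M):
        return np.array([np.mean(M[::-1].diagonal(k)) for k in range(-L + 1, K)])
    return [diag_avg(M) for M in components]

# Synthetic series: seasonality plus trend.
series = np.sin(np.arange(20) / 2.0) + 0.1 * np.arange(20)
parts = ssa(series, L=5)
print(np.allclose(sum(parts), series))
```

Summing all components reconstructs the series exactly; a forecast keeps only the leading (trend and seasonal) groups, which is how the seasonally adjusted model separates signal from noise.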
The article provides general information about ontologies (including definitions of the term), their formal (mathematical) model, and a step-by-step process for developing an ontology. Application areas of ontologies are considered, with special attention to their use in education. Suggestions are made on using ontologies as a knowledge base for an information security learning system. A fragment of a graphical representation of an ontology for biometrics, one of the areas of information security, is also given. The biometrics ontology is based on the national standard and was developed in the Protege system.
Keywords: biometrics, knowledge, information security, knowledge representation model, learning system, learning, ontology, ontological model, OWL, RDF
The article discusses correlation methods of image identification. An algorithm of the "rare grid" method has been developed.
Keywords: image identification, algorithm, recognition, cutting, reference frame, element correlations, minimum search
The article discusses the methods and approaches developed by the authors for a recommendation system aimed at improving the quality of patient rehabilitation during respiratory training. To describe the training, we developed our own domain-specific language, together with its grammar and a syntax analyzer. This language makes it possible to build a tree describing a specific patient's training session. Two main methods considered in the article are applied to the resulting tree: "A method for analyzing problem areas during training by patients" and "A method for fuzzy search of similar areas in training". With these methods, it is proposed to analyze the problem areas of patients' training during rehabilitation and to search for similar difficult areas in order to select similar exercises, thereby maintaining task diversity and keeping the patient involved in the process.
Keywords: Recommendation system, learning management system, rehabilitation, medicine, respiratory training, marker system, domain-specific language, Levenshtein distance
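The keywords mention the Levenshtein distance, the standard edit-distance measure behind such fuzzy search. A compact dynamic programming implementation, applied here to plain strings rather than the article's training trees:

```python
def levenshtein(a, b):
    """Edit distance: minimum insertions, deletions and substitutions
    needed to turn string a into string b (row-by-row DP)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

print(levenshtein("kitten", "sitting"))  # → 3
```

Fuzzy search then amounts to ranking candidate fragments by this distance and picking those below a similarity threshold.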
The paper describes a competition of regression models based on statistical data for the East Siberian Railway. The approach builds a set of additive alternative versions of the model, with subsequent selection of the best option using a number of adequacy criteria. The unloading of wagons is singled out as the output variable of the model; the input variables are the average gross weight of a freight train, failures of technical means of the 1st–2nd category of an operational nature, and the working fleet of freight wagons. The model competition produced over two hundred alternative options, from which the best alternative was selected using multi-criteria selection methods based, in this case, on a continuous criterion of consistency of behavior.
Keywords: railway transport, mathematical model, regression analysis, least squares method, model competition, adequacy criteria, multi-criteria selection
Search engine optimization allows a website to rank higher in search engines. Through extensive work on the site, good results can be achieved in increasing its conversion rate. Modern data analysis systems based on neural networks can greatly improve this optimization work.
Keywords: website promotion, search engine optimization, neural networks, code optimization, convolutional neural networks
The article discusses the application of machine vision methods for embedded systems using modern microcontrollers. Machine learning methods used in embedded systems for recognition problems, as well as neural network models, are described. The use of pre-trained models for image recognition in embedded systems is proposed. The architectures of the YOLOv3 and R-CNN neural networks are compared. The Jetson TX2 hardware platform is considered, and the results of comparing computation speed in the device's different modes are presented.
Keywords: machine vision, neural networks, artificial intelligence, embedded systems, pattern recognition, YOLO, R-CNN, Jetson, Tensorflow
The article considers an approach to estimating the application coefficient of standard control system equipment on a test bench. The relevance of this evaluation task at the design stage of the test bench is shown, and a method for solving the problem is described. The proposed approaches can be applied both when creating a test bench and when upgrading an existing one.
Keywords: automatic control system, test bench, analysis of the testing process, experimental testing, standard equipment, centralized control software package, application coefficient of equipment