
  • Analysis of factors shaping the national identities of Azerbaijan in the media space using NLP

    The article describes approaches to analyzing the information space with low-code platforms in order to identify the factors shaping new identities of Azerbaijan and the distinctive features of the country's information landscape. It outlines the steps for identifying key themes, collecting big data in the form of text corpora from various Internet sources, and analyzing those data. The analysis covers the sentiment of the texts and the identification of opinion leaders; the article also includes monitoring of key topics, with the results visualized for clear presentation.

    Keywords: data analytics, trend monitoring, sentiment analysis, data visualization, low-code, Kribrum, Polyanalyst, big data
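
A lexicon-based scorer is one minimal way to sketch the sentiment-analysis step described above; the word lists and scoring rule here are illustrative placeholders, not the models used by the Kribrum or PolyAnalyst platforms.

```python
# Minimal lexicon-based sentiment scorer (illustrative word lists only).
POSITIVE = {"growth", "success", "unity", "progress"}
NEGATIVE = {"conflict", "crisis", "decline"}

def sentiment_score(text: str) -> float:
    """Return a score in [-1, 1]: (positive hits - negative hits) / total hits."""
    tokens = [t.strip(".,!?").lower() for t in text.split()]
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    hits = pos + neg
    return 0.0 if hits == 0 else (pos - neg) / hits
```

Real pipelines replace the hand-made lexicon with trained models, but the aggregation step over a text corpus has this shape.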

  • Optimisation of selective assembly indicators of lubrication system feeders

    The problem of optimising the selective assembly of plunger-housing precision joints in feeders of centralised lubrication systems, used in mechanical engineering, metallurgy, mining, etc., is considered. The probability of forming assembly sets of all types serves as the objective function; the controlled variables are the number and volumes of part batches, their adjustment centres, and the values of group tolerances. Several variants of solving the problem with different combinations of controlled variables are considered. An example of solving the optimisation problem on the basis of previously developed mathematical models with given initial data and constraints is presented, and the advantages and disadvantages of each variant are outlined. Optimisation increases the considered indicator by 5% to 20%.

    Keywords: selective assembly, lubrication feeder, precision connection, mathematical model, optimisation

  • How to use the information system “Geonomics” to update and generate data about city residents

    Automating government processes is a top priority in the digital era. Because of historical development, many systems for registering and storing data about individuals coexist, requiring interconnected IT infrastructures. The article considers the procedure for developing, creating and implementing software for updating and generating data about residents of the city of Astana. It defines the functional capabilities of the system and its role in automating and monitoring government activities. The study was conducted by observing, synthesizing, analyzing, systematizing and classifying the data received; scientific works of local and foreign authors on the topic and open databases served as sources. For the first time, the authors have created the structure and algorithms of the “Geonomics” population database information system. Specifically, they have developed the mechanism and algorithm for its interaction with government databases, and they have identified additional uses of the software by developing an algorithm for planning and placing social facilities with the system. The authors conclude that the developed algorithms make the “Geonomics” population database a reliable and powerful tool that plays a critical role in optimizing and automating processes related to population accounting and urban infrastructure management. The software contributes to the development of the city and the improvement of its residents' quality of life on the basis of up-to-date and reliable information. In addition, the developed algorithm allows real-time monitoring of current data on city residents and their density, on the basis of which decisions can be made about the construction and placement of social facilities for the comfortable service and living of city residents.

    Keywords: automation, updating, government activities, government agency, information system, database

  • A structure-invariant operator for solving problems of stabilization of program motions of a complex dynamic system with restrictions

    A complex dynamic system is defined by a structurally invariant operator. The operator structure allows formulating problems of stabilizing program motions or equilibrium positions of a complex dynamic system with constraints on state coordinates and control. The solution of these problems allows synthesizing a structurally invariant operator of a complex dynamic system with inequality-constraints on the vector of locally admissible controls and state coordinates. Computational experiments confirming the correctness of the synthesized structurally invariant projection operator are performed.

    Keywords: structurally-invariant operator, stabilization of program motions, complex nonlinear dynamic system, projection operator, SimInTech
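
As a minimal illustration of handling inequality constraints on locally admissible controls, the sketch below projects a control vector componentwise onto a box; the bounds are illustrative, and this is not the article's full structure-invariant operator.

```python
import numpy as np

def project_box(u, u_min, u_max):
    """Componentwise projection of a control vector u onto the box [u_min, u_max]."""
    return np.minimum(np.maximum(np.asarray(u, dtype=float), u_min), u_max)
```

A projection of this kind is the elementary building block of projection operators used when synthesizing controls under constraints.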

  • Methods for increasing spatial resolution in holographic microscopy

    Digital holographic microscopy (DHM) is a combination of digital holography and microscopy. It is capable of tracking transparent objects, such as organelles of living cells, without the use of fluorescent markers. The main problem of DHM is increasing image spatial resolution while maintaining a wide field of view. The main approaches to solving this problem are increasing the numerical aperture of the lighting and recording systems, and using deep learning methods. Increasing the numerical aperture of lighting systems is achieved with oblique, structured or speckle illumination; for recording systems it is achieved with hologram extrapolation, synthesis or super-resolution. Deep learning is usually used in conjunction with other methods to shorten the computation time. This article describes the basic principles and features of these approaches.

    Keywords: digital holographic microscopy, spatial resolution, field of view, numerical aperture, sample, light beam, CCD camera, diffraction, imaging system, super-resolution

  • Detection of blurred frames

    Blurred frames pose a significant problem in fields such as video surveillance, medical imaging and aerial photography, in tasks such as object detection and identification, image-based disease diagnosis, and the analysis and processing of drone data for mapping and monitoring. This article proposes a method for detecting blurred frames using a neural network model. The model analyzes images represented in the frequency domain in the Hough space. To evaluate the effectiveness of the proposed solution, it was compared with existing methods and algorithms applicable to the problem, namely the Laplacian method and the manual sampling method. The results show that the proposed method detects blurred frames with high accuracy and can be used in systems where high accuracy and clarity of visual data are required for decision-making.

    Keywords: blurred frames, motion blur, blur, Hough transform, spectral analysis
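
The Laplacian baseline mentioned above can be sketched as a variance test on the discrete Laplacian of a grayscale frame; the decision threshold is an assumed placeholder, and the article's own frequency-domain/Hough-space method is not reproduced here.

```python
import numpy as np

def laplacian_variance(img: np.ndarray) -> float:
    """Variance of the 5-point discrete Laplacian of a grayscale image."""
    img = img.astype(float)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def is_blurred(img: np.ndarray, threshold: float = 100.0) -> bool:
    # Low Laplacian variance means few sharp edges, i.e. a likely blurred frame.
    # The threshold is illustrative and must be tuned per data set.
    return laplacian_variance(img) < threshold
```

Sharp frames produce strong edge responses and high variance; heavily blurred frames produce values near zero.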

  • Development of intent and entity recognition for question and answer systems using the "No Code" platform "TWIN"

    In this paper, a new intent and entity recognition model for the subject area of air passenger service, labelled IRERAIR-TWIN, is developed using the ‘no code’ question-answer development platform ‘TWIN’. The advantages of the no-code platform were analysed in terms of the ease of developing an applied question-answer system and the reduced amount of work needed to build an application model for a narrow subject area. The results show that the ‘TWIN’ system provides an intuitive web-based user interface and a simpler approach to developing the semantic module of a question-answer system capable of solving application problems of moderate complexity within a narrow subject area. However, this approach has limitations for deep semantic analysis tasks, especially complex contextual inference and the processing of large text fragments. The paper concludes by emphasising that future research will focus on ChatGPT-based ‘low code’ platforms and large language models to further improve the intelligence of the IRERAIR-TWIN model and broaden the range of supported scenarios.

    Keywords: question-answering systems, no-code, low-code, intent recognition, named entity recognition, data annotation, feature engineering, pre-trained model, software development, end-user development

  • Simulation of the design activity diversification of innovative enterprise

    Image super-resolution is a popular task that aims to convert images from low resolution to high resolution. Convolutional neural networks are often used for this task and have a great advantage in image processing. Nevertheless, information can be lost during processing, and increasing the depth and width of the network can complicate further work. To address this, the data are transformed into the frequency domain. In this paper, the image is divided into high-frequency and low-frequency regions, with higher priority given to the former. The method is then analyzed through quality checks and visual evaluation, and conclusions about the algorithm's performance are drawn.

    Keywords: super-resolution (SR), low-resolution (LR), high-resolution (HR), discrete-cosine transform, convolution-neural networks
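
The frequency-domain split described above can be sketched with a 2-D discrete cosine transform: keep a low-frequency block of coefficients for one band and the remainder for the other. The cutoff value is illustrative.

```python
import numpy as np
from scipy.fft import dctn, idctn

def split_bands(img: np.ndarray, cutoff: int):
    """Return (low_band, high_band) images from a DCT coefficient split."""
    coeffs = dctn(img.astype(float), norm="ortho")
    low = np.zeros_like(coeffs)
    low[:cutoff, :cutoff] = coeffs[:cutoff, :cutoff]  # keep low frequencies
    high = coeffs - low                               # everything else
    return idctn(low, norm="ortho"), idctn(high, norm="ortho")
```

Because the DCT is linear and invertible, the two bands sum exactly back to the original image, so no information is lost by the split itself.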

  • On development of an electronic notification system for students in an educational institution

    The article describes the prerequisites for creating an electronic notification system for students in an educational institution. A use case diagram describes interaction with the system from the points of view of two users: an employee of the educational department and a student. A diagram of the physical database model is presented, along with a description of the purpose of the tables. The system uses two types of client applications: an administrative client for organizing the work of educational department employees, and a Telegram bot for the students’ side. The handling of user data when processing chatbot commands is defined in IDEF0 notation. The messenger was chosen as the communication tool based on the popularity of this technology. The administrative client is implemented in C# using Windows Forms technology; the chatbot is implemented in Python using the “schedule” library for task scheduling, the “time” module for working with time, and the “threading” module for multithreading support.

    Keywords: chat bot, Telegram bot, messenger, message, mobile device, information system, database, computer program, application
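
The periodic-notification core of such a bot can be sketched with only the standard “threading” and “time” modules (the article also uses the third-party “schedule” library, not shown here); `send_fn` stands in for the actual Telegram API call.

```python
import threading
import time

def run_notifier(send_fn, interval: float, stop: threading.Event):
    """Call send_fn with a notification every `interval` seconds until stopped."""
    def loop():
        # Event.wait() doubles as a sleep that can be interrupted by stop.set().
        while not stop.wait(interval):
            send_fn("Schedule update for students")
    t = threading.Thread(target=loop, daemon=True)
    t.start()
    return t
```

In the real bot the send function would post a message through the Telegram Bot API instead of appending to a list.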

  • Models of opportunistic behavior in electroenergetics

    The paper is devoted to modeling opportunistic behavior in the electric power industry. We considered two setups: an optimal control problem from the point of view of a separate agent, and a Stackelberg game between the controller and several agents. It is assumed that agents may collude with the controller, understating reported electricity consumption in proportion to the amount of the bribe. Principal attention is paid to the numerical investigation of these problems based on the method of qualitatively representative scenarios in simulation modeling. It is shown that a small number of correctly chosen scenarios provides acceptable qualitative precision in forecasting the system dynamics. The numerical results are analyzed, and recommendations on combating corruption are formulated: increasing the penalty coefficient applied when a controller is caught taking “kickbacks”, or increasing the controller's official reward, makes kickbacks unprofitable.

    Keywords: opportunistic behavior, optimal control problem, simulation modeling, Stackelberg games

  • Comparison of different methods for filtering the results of spectroradiometric measurements

    This article provides a comparative analysis of various methods for filtering a signal obtained using a spectroradiometer. The following filtering methods were used in the study: the moving average method, the spline interpolation method, and the Savitzky-Golay method. An Ocean Insight SR-2XR250-25 spectroradiometer was used as the spectral radiation receiver, and a white LED as the radiation source. Based on the results of the study, the optimal filter for processing spectral measurements of light sources was determined; it will be used in the software of the goniospectroradiometer under development.

    Keywords: spectral density of radiation, spectroradiometer, radiation receiver, radiation source, signal filtration methods
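
Two of the filters compared in the study, the moving average and the Savitzky-Golay filter, can be sketched on a synthetic noisy spectrum with NumPy and SciPy; the LED-like peak, noise level and window sizes are illustrative.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
wavelength = np.linspace(400, 700, 301)             # nm, visible-range grid
spectrum = np.exp(-((wavelength - 550) / 40) ** 2)  # idealized LED-like peak
noisy = spectrum + rng.normal(0, 0.05, wavelength.size)

window = 21
moving_avg = np.convolve(noisy, np.ones(window) / window, mode="same")
savgol = savgol_filter(noisy, window_length=window, polyorder=3)
```

The Savitzky-Golay filter fits a local polynomial in each window, which preserves peak shape better than a plain moving average at comparable noise suppression.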

  • A set of methods for processing three-dimensional digital geoinformation based on a probabilistic and statistical approach

    The paper proposes a solution to geological problems using probabilistic and statistical methods. It presents the results of using spectral correlation data analysis, which involves the processing of digital geoinformation organized into three-dimensional regular networks. The possibilities of applying methods of statistical, spectral, and correlation analysis, as well as linear optimal filtering, anomaly detection, classification, and pattern recognition, are explored. Spectral correlation and statistical analysis of geodata were conducted, including the calculation of Fourier spectra, various correlation functions, and gradient characteristics of geofields.

    Keywords: interprofile correlation, self-adjusting filtering, weak signal detection, geological zoning and mapping, spatially distributed information
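
The spectral and correlation steps can be sketched for a field on a regular 2-D grid: a Fourier amplitude spectrum and an autocorrelation obtained via the Wiener-Khinchin theorem. The random field below merely stands in for real digital geoinformation.

```python
import numpy as np

rng = np.random.default_rng(1)
field = rng.normal(size=(64, 64))          # synthetic geofield on a regular grid

spectrum = np.abs(np.fft.fft2(field))      # Fourier amplitude spectrum
power = np.abs(np.fft.fft2(field - field.mean())) ** 2
autocorr = np.real(np.fft.ifft2(power))    # Wiener-Khinchin: IFFT of power spectrum
autocorr /= autocorr[0, 0]                 # normalize zero-lag value to 1
```

Real geodata processing would add windowing, gradient characteristics and cross-correlation between fields, but this is the computational core.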

  • Development of a web application for data preprocessing using Python libraries

    The article discusses the development of data normalization and standardization tools using Python libraries. A description of the theoretical foundations and formulas used to normalize and standardize data is considered. For internal calculations of the developed software, the Pandas and NumPy libraries were used. The external interface was built on the basis of the Streamlit library, which allows you to deploy web applications without any additional resources. Code fragments are provided and implementation mechanisms are explained. A description of the developed tool is provided: a detailed explanation of the functionality of the tool, user interface and examples of use. The importance of data preprocessing, selection of an appropriate method, and final remarks on the usefulness of interactive data processing tools are discussed.

    Keywords: data processing, statistics, information systems, Python web systems.
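
The two transforms the tool exposes, min-max normalization and z-score standardization, can be sketched with Pandas as in the article; the column name and sample values are illustrative, and the Streamlit interface is omitted.

```python
import numpy as np
import pandas as pd

def min_max_normalize(s: pd.Series) -> pd.Series:
    """Rescale to [0, 1]: (x - min) / (max - min)."""
    return (s - s.min()) / (s.max() - s.min())

def standardize(s: pd.Series) -> pd.Series:
    """Z-score: (x - mean) / std (population std)."""
    return (s - s.mean()) / s.std(ddof=0)

df = pd.DataFrame({"feature": [10.0, 20.0, 30.0, 40.0]})
df["norm"] = min_max_normalize(df["feature"])
df["std"] = standardize(df["feature"])
```

After standardization the column has zero mean and unit variance, which is the usual precondition for many downstream models.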

  • Development of an analyzer of the morphological composition of solid municipal waste

    To design automated sorting stations for municipal solid waste (MSW), it is necessary to develop algorithms and devices that determine MSW fractions with the necessary detail. Sorting stations now in operation can determine the basic morphological components of MSW, but the problem of in-depth detailing requires further elaboration. The aim of the work is to develop an algorithm for extracting MSW fractions with the possibility of regulating the component composition of waste. A methodology for synthesizing a device for determining waste fractions is presented. A finite sequence automaton is proposed as the sorting algorithm, and the synthesis of logical equations on the basis of a Moore automaton is shown. Operation of the device is simulated in the MULTISIM program; given appropriate sensors, the technique can be implemented in practice. The results can be useful for the design of MSW sorting stations. The experiment demonstrated that the sequence-automaton synthesis technique makes it possible to develop an analyzer for determining refined waste fractions. Practical implementation requires analyzers for determining MSW components, which can enable more detailed sorting of MSW in the design of sorting stations.

    Keywords: solid municipal waste, MSW, sorting, sequence automaton, Moore's automaton
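
A Moore automaton (output depends only on the current state) can be sketched as a transition/output table; the states, sensor inputs and fraction labels below are hypothetical placeholders for the synthesized sorting automaton.

```python
class MooreAutomaton:
    """Moore automaton: output is a function of the current state only."""

    def __init__(self, transitions, outputs, start):
        self.transitions = transitions  # (state, input symbol) -> next state
        self.outputs = outputs          # state -> output
        self.state = start

    def step(self, symbol):
        self.state = self.transitions[(self.state, symbol)]
        return self.outputs[self.state]

# Toy sorter: route items to bins based on a material sensor reading.
transitions = {
    ("idle", "metal"): "metal_bin",
    ("idle", "plastic"): "plastic_bin",
    ("metal_bin", "next"): "idle",
    ("plastic_bin", "next"): "idle",
}
outputs = {"idle": "wait", "metal_bin": "eject_metal", "plastic_bin": "eject_plastic"}
sorter = MooreAutomaton(transitions, outputs, "idle")
```

A hardware synthesis would turn the same table into logical equations over sensor signals, as the article does for MULTISIM simulation.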

  • Development of a dataset storage module for collision detection using polygonal mesh and neural networks

    This article is devoted to the development of a collision detection technique using a polygonal mesh and neural networks. Collisions are an important aspect of realistically simulating physical interactions. Traditional collision detection methods have certain limitations related to accuracy and computational complexity. A new approach based on the use of neural networks for collision detection with polygonal meshes is proposed. Neural networks have shown excellent results in various computer vision and image processing tasks, and in this context they can be effectively applied to polygon pattern analysis and collision detection. The main idea of the technique is to train a neural network on a large dataset containing information about the geometry of objects and their movement for automatic collision detection. To train the network, a special module responsible for storing and preparing the dataset is needed; this module provides the collection, structuring and storage of data about polygonal models, their movements and collisions. The work includes the development and testing of a neural network training algorithm on the created dataset, as well as an assessment of the quality of network predictions in a controlled environment with various collision conditions.

    Keywords: modeling, collision detection techniques using polygonal meshes and neural networks, dataset, assessing the quality of network predictions

  • Aspects of architecture and integration with external services of the project "PetsCosiness"

    The article discusses the concept of software implementation of complex tools on the platform "1C:Enterprise" for automating the accounting of the activities of shelters for homeless animals. The architecture of the solution is described, highlighting aspects of the functioning of the system’s integration modules with the social network “VKontakte” and the Telegram messenger. Diagrams of the sequence and activity of processes regarding the interaction of citizens with the key functionality of the system are presented.

    Keywords: animal shelter, homeless animals, 1C:Enterprise, automation, activity accounting, animals, software package, information system, Telegram bot, integration with VKontakte, pet search

  • A structural model of an interference-resistant system with orthogonal frequency division of channels using modular turbocodes of the residual class system

    Orthogonal frequency division multiplexing (OFDM) is quite a promising technology in wireless communication systems: the simultaneous use of multiple subcarriers allows a relatively high information transfer rate. Using mathematical models of discrete wavelet transforms instead of the fast Fourier transform (FFT) makes it possible to increase signal-processing speed through modular residue-class codes; at the same time, these codes can increase the noise immunity of OFDM systems. Block turbo codes (TC) are widely used to combat the error bursts that occur when signals are transmitted over a communication channel. The article presents a method for constructing modular turbo codes based on the residual class system. The use of such codes entails changes in the structure of an OFDM system, so developing a method for constructing a modular residue-class turbo code and a structural model of an interference-resistant OFDM system using it is an urgent task. The purpose of the article is to increase the noise immunity of OFDM systems by using wavelet transforms implemented in modular residue-class codes instead of the FFT, together with the modular residue-class turbo code.

    Keywords: modular codes of residue classes, residual class system, modular turbo code of residual class system, error correction algorithm, structural model, multiplexing, orthogonal frequency division of channels

  • Assessing the probability of message duplication in an Industrial Internet of Things system

    Currently, Internet of Things technologies are actively used in manufacturing enterprises for remote monitoring and preventive control of technological processes. The article is devoted to the development of an original mathematical model of the process of transmitting information packets and confirmations in the Industrial Internet of Things system, the use of which allows us to assess the probability of duplication of messages sent to the production process control center. To develop the model, the mathematical apparatus of probabilistic graphs was used, which makes it possible to take into account all possible states of the simulated process and the probabilities of transitions from one state to another. The results of computational experiments showed that the use of the developed model makes it possible to justify the choice of the maximum number of retransmissions, in which the probability of message duplication does not exceed the specified permissible values at the current level of bit errors.

    Keywords: industrial Internet of things, telemetry data, production process control, message duplication, retransmissions, bit error rate, sensor devices, server, probabilistic graph
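
A Monte-Carlo sketch of the retransmission process illustrates the quantity being modelled: with assumed independent loss probabilities for data packets and acknowledgements, a duplicate occurs when the data reach the center more than once. This simplified two-parameter model stands in for the article's probabilistic-graph apparatus; the probability values are illustrative.

```python
import random

def duplication_probability(p_data, p_ack, max_retx, trials=100_000, seed=42):
    """Estimate P(center receives the same message more than once).

    p_data: loss probability of a data packet, p_ack: loss probability
    of an acknowledgement, max_retx: retransmission limit.
    """
    rng = random.Random(seed)
    dup = 0
    for _ in range(trials):
        deliveries = 0
        for _attempt in range(max_retx + 1):
            if rng.random() >= p_data:        # data packet got through
                deliveries += 1
                if rng.random() >= p_ack:     # ack got back: sender stops
                    break
        dup += deliveries > 1
    return dup / trials
```

As in the article's conclusion, the estimate grows with the retransmission limit, which is what lets one pick the largest limit still meeting a duplication constraint.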

  • The influence of dataset expansion methods on the quality of neural network model training: an adaptive dataset expansion approach

    The article analyzes the impact of transformation types on the learning quality of neural network classification models, and also suggests a new approach to expanding image sets using reinforcement learning.

    Keywords: neural network model, training dataset, data set expansion, image transformation, recognition accuracy, reinforcement learning, image vector
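
Static geometric transformations of the kind analyzed in the article can be sketched with NumPy; the reinforcement-learning selection of transforms proposed by the authors is not implemented here.

```python
import numpy as np

def augment(img: np.ndarray):
    """Return the image plus simple flipped and rotated variants."""
    return [
        img,
        np.fliplr(img),      # horizontal flip
        np.flipud(img),      # vertical flip
        np.rot90(img, k=1),  # 90-degree rotation
        np.rot90(img, k=2),  # 180-degree rotation
    ]
```

An adaptive scheme would instead learn which of these transforms to apply to which images, based on their effect on recognition accuracy.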

  • Review of Machine Learning Methods for Driver Behaviour Classification

    Detecting aggressive and abnormal driver behavior, which depends on a multitude of external and internal factors, is critically important for enhancing road safety. This article provides a comprehensive review of machine learning methods applied for driver behavior classification. An extensive analysis is conducted to assess the pros and cons of existing machine learning algorithms. Various approaches to problem formulation and solution are discussed, including supervised and unsupervised learning techniques. Furthermore, the review examines the diverse range of data sources utilized in driver behavior classification and the corresponding technical tools employed for data collection and processing. Special emphasis is placed on the analysis of Microelectromechanical Systems sensors and their significant contribution to the accuracy and effectiveness of driver behavior classification models. By synthesizing existing research, this review not only presents the current state of the field but also identifies potential directions for future research, aiming to advance the development of more robust and accurate driver behavior classification systems.

    Keywords: machine learning, driver classification, driver behavior, data source, microelectromechanical system, driver monitoring, driving style, behavior analysis

  • Methods for increasing the reliability of telecommunication systems in Turkmenistan

    This article explores methods for improving the reliability of telecommunication systems in Turkmenistan. The authors consider modern approaches to ensuring the stability and reliability of communication networks in the context of a rapidly changing technological environment. The article analyzes the main challenges faced by telecom operators in the country and proposes effective strategies to ensure the smooth operation of telecommunication systems. The results of the study allow us to identify key measures to improve the reliability of the communication infrastructure in Turkmenistan and ways to optimize user service processes.

    Keywords: communication infrastructure, trends, prospects, system reliability, mobile communications, evolution, 2G, 3G, 4G, network reliability

  • Method of building three-dimensional graphics based on distance fields

    This paper investigates the effectiveness of the distance field method for building 3D graphics in comparison with the traditional polygonal approach. The main attention is paid to the analytical representation of models, which makes it possible to determine the shortest distance to scene objects and provides high speed even on low-end hardware. A comparative analysis covers the possibilities for highly detailed models, applicable lighting sources, reflection mapping and model transformation. Conclusions are drawn about the promising potential of the distance field method for 3D graphics, especially in real-time rendering systems, and the relevance of further research and development in this area is emphasized. Within the framework of this work, a universal software implementation of the distance field method was realized.

    Keywords: computer graphics, rendering, 3D graphics, ray marching, polygonal graphics, 3D graphics development, modeling, 3D models
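
The core of the distance-field (ray-marching) approach can be sketched in a few lines: step along the ray by the scene's signed distance until a surface is reached. The single-sphere scene below is illustrative.

```python
import math

def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
    """Signed distance from point p to a sphere (negative inside)."""
    return math.dist(p, center) - radius

def ray_march(origin, direction, sdf, max_steps=128, eps=1e-4, max_dist=100.0):
    """Return the distance along the ray to the surface, or None on a miss."""
    t = 0.0
    for _ in range(max_steps):
        p = tuple(o + t * d for o, d in zip(origin, direction))
        d = sdf(p)
        if d < eps:          # close enough: surface hit
            return t
        t += d               # safe step: sdf is the shortest distance to any object
        if t > max_dist:
            break
    return None
```

Because the SDF is analytic, the scene needs no polygon storage, which is the source of the method's speed on modest hardware.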

  • Forecasting the state of the vehicle sensor system

    The condition of a vehicle sensor system is an effective indicator used by many other vehicle systems. This article is devoted to the problem of choosing a forecasting method for vehicle sensors. Sensor data are considered as multivariate time series. The aim of the study is to determine the best forecasting model for the type of data under consideration. The LSTM neural network-based method and the VARMA statistical method were chosen for the analysis. These methods are preferred because of their ability to process multivariate series with complex relationships, their flexibility, which allows them to be used for series of varying lengths in a wide variety of scenarios, and the high accuracy of their results in numerous applications. The data and plots of computational experiments are provided, enabling the determination of the preferred option for both single-step and multistep forecasting of multivariate time series, based on the values of error metrics and adaptability to rapid changes in data values.

    Keywords: forecasting methods, forecast evaluation, LSTM, VARMA, time series, vehicle sensors system
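
Model comparison of the kind described relies on error metrics; a sketch of RMSE and MAE over a two-channel series follows, with placeholder "actual" and "predicted" values rather than real sensor data or LSTM/VARMA output.

```python
import numpy as np

def rmse(actual, predicted):
    """Root mean squared error over all channels and time steps."""
    diff = np.asarray(actual) - np.asarray(predicted)
    return float(np.sqrt(np.mean(diff ** 2)))

def mae(actual, predicted):
    """Mean absolute error over all channels and time steps."""
    diff = np.asarray(actual) - np.asarray(predicted)
    return float(np.mean(np.abs(diff)))

actual = np.array([[1.0, 10.0], [2.0, 12.0], [3.0, 11.0]])    # two sensor channels
predicted = np.array([[1.1, 9.5], [1.8, 12.5], [3.2, 11.0]])  # hypothetical forecast
```

Comparing models on such metrics, for both single-step and multi-step horizons, is what determines the preferred forecasting option.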

  • Using virtual reality technology to develop an algorithm to stop bleeding

    This article identifies the main advantages and disadvantages of using VR simulators to improve the professionalism of employees when performing work at an enterprise (organization). An analysis of existing projects used in various industries was carried out. A description of the developed first aid project is presented. The developed simulator allows you to practice skills in eliminating bleeding in different parts of the body: arm, leg, neck. While working on the project, the main factors influencing the quality of the developed VR simulator were identified. Thus, it was found that VR simulators are not capable of fully simulating fine motor skills of the hands. In addition, the simulator has restrictions on the position of the body in space. Despite the identified shortcomings, the use of the simulator allows you to practice key skills in providing first aid.

    Keywords: virtual reality, VR simulator, personnel training, professional activity, first aid, information technology, modeling.

  • Development of a client-server application for constructing a virtual museum

    The article describes the methodology for developing a client-server application intended for constructing a virtual museum. The creation of the server part of the application with the functions of processing and executing requests from the client part, as well as the creation of a database and interaction with it, is discussed in detail. The client part is developed using the Angular framework and the TypeScript language; the three-dimensional implementation is based on the three.js library, which is an add-on to WebGL technology. The server part is developed on the ASP.NET Core platform in C#. The database schema is based on a Code-First approach using Entity Framework Core. Microsoft SQL Server is used as the database management system.

    Keywords: client-server application, virtual tour designer, virtual museum, three.js library, framework, Angular, ASP.NET Core, Entity Framework Core, Code-First, WebGL