TERMINOLOGICAL ANNOTATION OF THE DOCUMENT IN A RETRIEVAL CONTEXT ON THE BASIS...
By: Olga Nevzorova, Vladimir Nevzorov  (3073 reads)
Rating: (1.00/10)

Abstract: This article considers a method of terminological annotation of mathematical documents used in a text-mining context (in particular, for building an RDF network over a collection of mathematical documents). Terminological annotation of mathematical documents is carried out on the basis of a universal design technology for solving applied problems, developed in the ontolinguistic system "OntoIntegrator" under the control of a system of ontological models.

Keywords: Natural language processing, ontological models, terminological annotation

ACM Classification Keywords: H.3.1 Information Storage and Retrieval: Linguistic processing

Link:

TERMINOLOGICAL ANNOTATION OF THE DOCUMENT IN A RETRIEVAL CONTEXT ON THE BASIS OF TECHNOLOGIES OF SYSTEM "ONTOINTEGRATOR"

Olga Nevzorova, Vladimir Nevzorov

http://foibg.com/ijitk/ijitk-vol05/ijitk05-2-p02.pdf

DATA ACQUISITION SYSTEMS FOR PRECISION FARMING
By: Palagin et al.  (3159 reads)
Rating: (1.00/10)

Abstract: The article describes two structures of data acquisition systems that are based on the "Floratest" family of portable devices and are suitable for use in precision farming.

Keywords: Kautsky effect; chlorophyll; chlorophyll fluorescence induction; data acquisition system; fluorometer; portable device; precision farming.

ACM Classification Keywords: J.3 Life and Medical Sciences - Biology and Genetics.

Link:

DATA ACQUISITION SYSTEMS FOR PRECISION FARMING

Oleksandr Palagin, Volodymyr Romanov, Igor Galelyuka, Vitalii Velychko, Volodymyr Hrusha, Oksana Galelyuka

http://foibg.com/ijitk/ijitk-vol05/ijitk05-2-p01.pdf

TWO APPROACHES TO ONE OPTIMIZATION PROBLEM IN RECOGNITION THEORY
By: Nataliya Katerinochkina  (3688 reads)
Rating: (1.00/10)

Abstract: A number of optimization problems arise in recognition theory. The search for a maximum solvable subsystem of a system of linear inequalities is one of them. In this paper two approximate solution methods for this problem are proposed. The first method is based on the theory of nodal solutions of systems of linear inequalities. The second algorithm differs from the first in principle: it is based on reasoning of a geometric nature. The two methods are compared on a number of test problems.
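As a minimal illustration of the nodal-solution idea (not the authors' algorithms), one can score candidate points lying on the intersection of d bounding hyperplanes and keep the point that satisfies the most inequalities; the example data below are assumptions.

# Hedged sketch (not the paper's method): approximate the maximum solvable
# subsystem of A x <= b by scoring candidate points that lie on the boundary
# hyperplanes of d inequalities ("nodal"-style candidates) plus the origin.
from itertools import combinations
import numpy as np

def max_solvable_subsystem(A, b, tol=1e-9):
    m, d = A.shape
    candidates = [np.zeros(d)]
    for rows in combinations(range(m), d):
        sub_A, sub_b = A[list(rows)], b[list(rows)]
        if abs(np.linalg.det(sub_A)) > tol:          # the d hyperplanes meet in a point
            candidates.append(np.linalg.solve(sub_A, sub_b))
    best_x, best_idx = None, []
    for x in candidates:
        satisfied = np.where(A @ x <= b + tol)[0]    # inequalities satisfied at this point
        if len(satisfied) > len(best_idx):
            best_x, best_idx = x, satisfied
    return best_x, best_idx                          # point and indices of a solvable subsystem

A = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0], [1.0, 1.0]])
b = np.array([1.0, 1.0, -3.0, 2.0])                  # third inequality conflicts with the rest
x, idx = max_solvable_subsystem(A, b)
print(len(idx), "of", len(b), "inequalities satisfiable at", x)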

Keywords: recognition, optimization, system of linear inequalities, nodal solution

ACM Classification Keywords: G.1.6 Optimization – Constrained Optimization, G.1.3 Numerical Linear Algebra – Linear Systems, I.5 Pattern Recognition

Link:

TWO APPROACHES TO ONE OPTIMIZATION PROBLEM IN RECOGNITION THEORY

Nataliya Katerinochkina

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p09.pdf

COMPUTATIONAL MODEL FOR SERENDIPITY
By: A. Anguera, M.A. Diaz, A. Gutierrez  (3994 reads)
Rating: (1.00/10)

Abstract: In recent years there have been various attempts and studies that seek to bring serendipity into Computer Science. Authors such as Campos and Dias have tried to model serendipity in order to obtain serendipitous behaviors in Information Retrieval (IR). There have also been attempts to introduce serendipity into Recommender Systems (RS), although the latter proposals have directed their efforts at metrics for measuring serendipity in those RS rather than at emulating serendipitous behaviors in recommender systems. However, so far no one has succeeded in designing a model that can be applied to different kinds of Web browsing. The main problem we have detected in analyzing the proposals in this field is that the solutions provided do not take into account both aspects of the concept of serendipity. They do not consider that, in addition to the accidental discovery of information that is not sought for, certain characteristics are also required of the user, such as sagacity, perception, flexible thinking or intensive preparation. If we could develop a model that supports any search engine or search tool, we would give the user a considerable advantage by offering, in particular, information that the user is not focused upon. The aim of this paper is to propose a computational model that supports serendipity and to induce and facilitate serendipity through the use of a special-purpose system.

Keywords: Supporting Serendipity, Designing Serendipity, Serendipia, Data Mining, Artificial Intelligence, SOM, Models of Computation

ACM Classification Keywords: H.2.8 Data Mining, F.1.1 Models of Computation, I.2 Artificial intelligence

Link:

COMPUTATIONAL MODEL FOR SERENDIPITY

A. Anguera, M.A. Diaz, A. Gutierrez

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p08.pdf

DIFFERENTIAL EVOLUTION – PARTICLE SWARM OPTIMIZATION
By: Blas et al.  (4447 reads)
Rating: (1.00/10)

Abstract: This paper presents a Particle Swarm Optimization algorithm combined with Differential Evolution. Each candidate solution is sampled uniformly in [-5, 5]^D, where D denotes the search space dimension, and the evolution is performed with a classical PSO algorithm and a classical DE/x/1 algorithm according to a random threshold.
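A minimal sketch of such a hybrid, under assumptions not stated in the abstract (sphere benchmark objective, DE/rand/1 with binomial crossover, simple parameter values): a random threshold decides, per particle and per iteration, whether a PSO update or a DE step is applied.

# Illustrative PSO/DE hybrid (not the paper's exact settings): particles start
# uniformly in [-5, 5]^D; a random threshold selects PSO or DE/rand/1 per step.
import numpy as np

def sphere(x):                                           # assumed benchmark objective
    return float(np.sum(x * x))

def pso_de(dim=10, pop=30, iters=200, w=0.7, c1=1.5, c2=1.5, F=0.5, CR=0.9, p_pso=0.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(-5.0, 5.0, (pop, dim))
    v = np.zeros((pop, dim))
    pbest, pbest_f = x.copy(), np.array([sphere(xi) for xi in x])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for _ in range(iters):
        for i in range(pop):
            if rng.random() < p_pso:                     # PSO branch
                v[i] = (w * v[i]
                        + c1 * rng.random(dim) * (pbest[i] - x[i])
                        + c2 * rng.random(dim) * (gbest - x[i]))
                x[i] = np.clip(x[i] + v[i], -5.0, 5.0)
            else:                                        # DE/rand/1 branch with binomial crossover
                a, b, c = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
                mutant = np.clip(x[a] + F * (x[b] - x[c]), -5.0, 5.0)
                mask = rng.random(dim) < CR
                mask[rng.integers(dim)] = True
                trial = np.where(mask, mutant, x[i])
                if sphere(trial) < sphere(x[i]):         # greedy DE selection
                    x[i] = trial
            f = sphere(x[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = x[i].copy(), f
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

best, best_f = pso_de()
print("best objective:", best_f)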

Keywords: Benchmarking, Black-box optimization, Direct search, Evolutionary computation, Particle Swarm Optimization, Differential Evolution

Categories: G.1.6 Numerical Analysis: Optimization - global optimization, unconstrained optimization; F.2.1 Analysis of Algorithms and Problem Complexity: Numerical Algorithms and Problems.

Link:

DIFFERENTIAL EVOLUTION – PARTICLE SWARM OPTIMIZATION

Nuria Gómez Blas, Alberto Arteta, Luis F. de Mingo

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p07.pdf

HISTOLOGY IMAGE SEGMENTATION
By: Cisneros et al.  (3321 reads)
Rating: (1.00/10)

Abstract: In this article, a technique for the segmentation of the components of histological images is explained. In order to study the various microscopic components of animal tissues and obtain the histological images, sections of the tissues are first prepared and then dyed according to the different components to be studied. Image analysis is a statistical task and, most of the time, the data obtained in the end depend on the observer who is carrying out the study.

Keywords: Image Processing, Image Segmentation

ACM Classification Keywords: I.4.3 Image Processing and computer vision – Enhancement, J.3 Biology and Genetics

Link:

HISTOLOGY IMAGE SEGMENTATION

Francisco J. Cisneros, Paula Cordero, Alejandro Figueroa, Juan Castellanos

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p06.pdf

EVALUATION OF GREEDY ALGORITHM OF CONSTRUCTING (0,1)-MATRICES WITH DIFFERENT ...
By: Hasmik Sahakyan, Levon Aslanyan  (3080 reads)
Rating: (1.00/10)

Abstract: An approximate greedy algorithm for the reconstruction of (0,1)-matrices with pairwise different rows is considered. The number of pairs of different rows is taken as a quantitative characteristic whose maximization, when appropriate, leads to matrices with different rows. Properties of the algorithm are studied and its performance is evaluated on a series of experiments.
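A hedged sketch of one possible greedy of this kind (assuming prescribed column sums, which the abstract does not state; this is not necessarily the authors' algorithm): the ones of each new column are placed so as to maximize the number of row pairs newly made different, i.e. to split the groups of currently identical row prefixes as evenly as possible.

# Greedy sketch: build an m x n (0,1)-matrix column by column, maximizing the
# number of row pairs made different at each step (assumed column sums).
import numpy as np
from collections import Counter

def greedy_distinct_rows(m, column_sums, seed=0):
    rng = np.random.default_rng(seed)
    n = len(column_sums)
    M = np.zeros((m, n), dtype=int)
    for j, s in enumerate(column_sums):
        groups = {}
        for r in range(m):                               # group rows by their prefix M[:, :j]
            groups.setdefault(tuple(M[r, :j]), []).append(r)
        groups = list(groups.values())
        ones = [0] * len(groups)                         # ones allocated to each prefix group
        for _ in range(s):
            # marginal gain of one more "1" in a group of size n_g with a ones: n_g - 2a - 1
            gains = [len(g) - 2 * a - 1 if a < len(g) else -10**9 for g, a in zip(groups, ones)]
            ones[int(np.argmax(gains))] += 1
        for g, a in zip(groups, ones):
            chosen = rng.choice(g, size=a, replace=False)
            M[chosen, j] = 1
    return M

def different_pairs(M):
    cnt = Counter(tuple(r) for r in M)
    same = sum(c * (c - 1) // 2 for c in cnt.values())
    return M.shape[0] * (M.shape[0] - 1) // 2 - same

M = greedy_distinct_rows(8, [4, 4, 4, 4])
print(M, "\npairs of different rows:", different_pairs(M))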

Keywords: (0,1)-matrices, greedy algorithms.

ACM Classification Keywords: F.2.2 Nonnumerical Algorithms and Problems: Computations on discrete structures.

Link:

EVALUATION OF GREEDY ALGORITHM OF CONSTRUCTING (0,1)-MATRICES WITH DIFFERENT ROWS

Hasmik Sahakyan, Levon Aslanyan

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p05.pdf

METHOD FOR EVALUATING OF DISCREPANCY BETWEEN REGULARITIES SYSTEMS IN ...
By: Senko et al.  (4073 reads)
Rating: (1.00/10)

Abstract: A new method of data analysis is discussed. The goal of the presented technique is a complete and statistically valid comparison of the regularities existing in two different groups of objects. It is supposed that the regularities tying the levels of the forecasted and explanatory variables are searched for with the help of an optimal partitioning technique. The developed technique was applied to the analysis of the impact of genetic factors on the severity of discirculatory encephalopathy (DEP). At the first stage, a computer method for evaluating DEP severity was developed with the help of pattern recognition techniques. It was revealed that the computer estimates of severity correlate rather strongly with the DD variant of the gene coding for the angiotensin-converting enzyme (ACE). Systems of regularities tying various clinical or biochemical factors to the computer estimates of severity were found with the help of the optimal valid partitioning (OVP) method in groups of patients with three different variants of the gene coding for ACE. Statistically significant discrepancies between the revealed systems of regularities were found with the help of the developed methods of comparison.
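A generic sketch of the permutation-test idea the abstract invokes (not the OVP method itself; the discrepancy statistic and data below are illustrative assumptions): a statistic measuring the difference between regularities found in two groups is recomputed under random reshuffles of the group labels to obtain a p-value.

# Permutation test sketch: is the observed discrepancy between two groups larger
# than expected under random reassignment of group labels?
import numpy as np

def permutation_pvalue(group_a, group_b, statistic, n_perm=5000, seed=0):
    rng = np.random.default_rng(seed)
    observed = statistic(group_a, group_b)
    pooled = np.concatenate([group_a, group_b])
    n_a = len(group_a)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                              # permute group membership
        if statistic(pooled[:n_a], pooled[n_a:]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

# Example statistic: difference of the slopes tying an explanatory variable to the target.
def slope_diff(a, b):
    return abs(np.polyfit(a[:, 0], a[:, 1], 1)[0] - np.polyfit(b[:, 0], b[:, 1], 1)[0])

rng = np.random.default_rng(1)
xa = rng.normal(size=(60, 1)); ga = np.hstack([xa, 1.0 * xa + rng.normal(0, 0.5, (60, 1))])
xb = rng.normal(size=(60, 1)); gb = np.hstack([xb, 0.2 * xb + rng.normal(0, 0.5, (60, 1))])
print("p-value:", permutation_pvalue(ga, gb, slope_diff))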

Keywords: Optimal partitioning, statistical validity, permutation test, regularities, explanatory variables effect, pattern recognition, discirculatory encephalopathy, genetic factors

ACM Classification Keywords: H.2.8 Database Applications - Data mining, G.3 Probability and Statistics - Nonparametric statistics, Probabilistic algorithms

Link:

METHOD FOR EVALUATING OF DISCREPANCY BETWEEN REGULARITIES SYSTEMS IN DIFFERENT GROUPS

Oleg Senko, Anna Kuznetsova, Natalia Malygina, Irina Kostomarova

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p04.pdf

REGIONS OF SUFFICIENCY FOR METRICAL DATA RETRIEVAL
By: Mashtalir et al.  (3170 reads)
Rating: (1.00/10)

Abstract: In this paper, fast metric search for large-scale and poorly structured databases is considered and grounded, based on eliminating objects from consideration without calculating the distance between them and the query. The search relies on pre-calculated distances between pivot points and database objects, with the triangle inequality as the basis for eliminating objects without computing distances to them. To that end, the sufficient and necessary conditions for object elimination are explored, and a mathematical foundation for the sufficiency and necessity regions in the metric space is given.
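A minimal sketch of the classic pivot-based filtering this builds on (names, data and parameters are illustrative, not the paper's notation): with precomputed distances d(o, p) to pivots, an object o can be excluded from a range query (q, r) whenever |d(q, p) - d(o, p)| > r for some pivot p, by the triangle inequality.

# Pivot-based range search sketch with triangle-inequality elimination.
import numpy as np

def build_pivot_table(objects, pivots, dist):
    return np.array([[dist(o, p) for p in pivots] for o in objects])   # precomputed d(o, p)

def range_query(query, radius, objects, pivots, table, dist):
    qp = np.array([dist(query, p) for p in pivots])      # d(q, p) for each pivot
    results, evaluated = [], 0
    for i, o in enumerate(objects):
        if np.any(np.abs(qp - table[i]) > radius):       # |d(q,p) - d(o,p)| > r  =>  d(q,o) > r
            continue                                      # eliminated without computing d(q, o)
        evaluated += 1
        if dist(query, o) <= radius:
            results.append(i)
    return results, evaluated

euclid = lambda a, b: float(np.linalg.norm(a - b))
rng = np.random.default_rng(0)
objects = rng.uniform(0, 1, (1000, 8))
pivots = objects[rng.choice(len(objects), 10, replace=False)]
table = build_pivot_table(objects, pivots, euclid)
hits, evaluated = range_query(rng.uniform(0, 1, 8), 0.5, objects, pivots, table, euclid)
print(len(hits), "hits;", evaluated, "distances actually computed out of", len(objects))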

Keywords: Metrical Retrieval, pivot point, sufficiency

ACM Classification Keywords: H.3.3 Information Search and Retrieval.

Link:

REGIONS OF SUFFICIENCY FOR METRICAL DATA RETRIEVAL

Vladimir Mashtalir, Konstantin Shcherbinin, Vladislav Shlyakhov, Elena Yegorova

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p03.pdf

EVOLVING CASCADE NEURAL NETWORK BASED ON MULTIDIMENSIONAL EPANECHNIKOV’S ...
By: Bodyanskiy et al.  (4331 reads)
Rating: (1.00/10)

Abstract: At present, neural networks based on the Group Method of Data Handling (GMDH-NN), whose nodes are two-input N-Adalines, are well known. Each N-Adaline contains a set of adjustable synaptic weights that are estimated using the standard least squares method and provides a quadratic approximation of the nonlinear mapping being restored. On the other hand, to ensure the required approximation quality such a network may need a considerable number of hidden layers. The approximating properties of GMDH-NN can be improved by uniting the approach based on the Group Method of Data Handling with Radial Basis Function networks, which have only one hidden layer formed by so-called R-neurons. Learning of such networks reduces, as a rule, to tuning the synaptic weights of the output layer, which is formed by adaptive linear associators. In contrast to the neurons of multilayer structures with polynomial or sigmoidal activation functions, R-neurons have bell-shaped activation functions. In this paper multidimensional Epanechnikov kernels are used as activation functions. The advantage of this activation function is that its derivatives are linear in all parameters, which allows adjusting, in a sufficiently simple way, not only the synaptic weights but also the centers and receptive fields. The proposed network combines the Group Method of Data Handling, Radial Basis Function networks and cascade networks; it is not prone to the “curse of dimensionality” and is able to process information in real time by adapting its parameters and structure to the problem conditions. The use of multidimensional Epanechnikov kernels as activation functions made it possible to introduce numerically simple learning algorithms characterized by high speed.
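A small sketch of the RBF-style reading of this construction, under assumptions of my own (a single hidden layer of R-neurons with multidimensional Epanechnikov activations, centers simply placed on data points, output weights fitted by ordinary least squares); the paper's gradient tuning of centers and receptive fields and its cascade growth are not reproduced.

# R-neuron layer with multidimensional Epanechnikov kernel activations;
# output weights fitted by least squares (illustrative only).
import numpy as np

def epanechnikov(x, center, inv_receptive):
    d = x - center
    return max(0.0, 1.0 - d @ inv_receptive @ d)         # bell-shaped, derivatives linear in parameters

def design_matrix(X, centers, inv_receptive):
    return np.array([[epanechnikov(x, c, inv_receptive) for c in centers] for x in X])

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 2))
y = np.sin(3 * X[:, 0]) + 0.5 * X[:, 1] + rng.normal(0, 0.05, 200)    # toy target
centers = X[rng.choice(len(X), 25, replace=False)]                     # centers placed on data points
inv_receptive = np.eye(2) / 0.5                                        # fixed receptive field here
Phi = np.hstack([design_matrix(X, centers, inv_receptive), np.ones((len(X), 1))])
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)                            # adaptive linear associator
print("training RMSE:", float(np.sqrt(np.mean((Phi @ w - y) ** 2))))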

Keywords: evolving neural network, cascade networks, radial-basis neural network, Group Method of Data Handling, multidimensional Epanechnikov’s kernels.

ACM Classification Keywords: F.1 Computation by abstract devices – Self-modifying machines (e.g., neural networks), I.2.6 Learning – Connectionism and neural nets, G.1.2 Approximation – Nonlinear approximation.

Link:

EVOLVING CASCADE NEURAL NETWORK BASED ON MULTIDIMENSIONAL EPANECHNIKOV’S KERNELS AND ITS LEARNING ALGORITHM

Yevgeniy Bodyanskiy, Paul Grimm, Nataliya Teslenko

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p02.pdf

DESIGN, IMPLEMENTATION, AND TESTING OF A MINIATURE SELF-STABILIZING CAPSULE ...
By: Filip et al.  (4311 reads)
Rating: (1.00/10)

Abstract: Video capsule endoscopy (VCE) enables examination of the small intestine. In large-lumen organs of the gastrointestinal (GI) tract (e.g. the stomach and the colon), the capsule tumbles around and therefore cannot image systematically. This limitation underscores the need for a novel approach that allows capsule imaging of these organs without tumbling. This paper describes the design and implementation of a self-stabilizing capsule prototype for colonic imaging. The present design consists of a capsule endoscope (CE) coupled to an expandable stabilizing component comprising a liquid-permeable sac filled with dry superabsorbent polymer granules (swellable material) integrally covered by an outer colon-targeting coating. Once the capsule enters the colon, the outer coating dissolves and this allows the expansion of the swellable material attached to the back side of the capsule endoscope. This volumetric increase of the expandable component provokes peristalsis by activating the colonic mass reflex. The expanded end of the capsule stabilizes the entire implement, preventing it from tumbling, and at the same time activating the capsule endoscope. Once activated, the capsule begins to record images and to transmit them to an external receiver, which records the data to a computer. As a safety measure, the expandable component can be electronically separated from the capsule at any time. The capsule is eventually expelled out of the body with the fecal matter. A prototype of the self-stabilizing capsule has been developed conceptually and electronically to fulfill the requirements of image stabilization while coping with the restrictions of commercial CEs. Self-stabilizing CEs and non-stabilized CEs were comparatively tested in laboratory and canine experiments. The self-stabilizing CE eliminated the tumbling effect and demonstrated its potential to greatly improve colonic imaging.

Keywords: Colon, Gastrointestinal Tract, Video Capsule Endoscopy, Wireless Image Transmission

ACM Classification Keywords: A.0 General Literature - Conference proceedings; B.7.1. – Advanced Technologies.

Link:

DESIGN, IMPLEMENTATION, AND TESTING OF A MINIATURE SELF-STABILIZING CAPSULE ENDOSCOPE WITH WIRELESS IMAGE TRANSMISSION CAPABILITIES.

Dobromir Filip, Orly Yadid-Pecht, Christopher N. Andrews, and Martin P. Mintchev

http://foibg.com/ijitk/ijitk-vol05/ijitk05-1-p01.pdf

CROSS INTERSECTION SEQUEL OF DISCRETE ISOPERIMETRY
By: Levon Aslanyan, Vilik Karakhanyan  (3579 reads)
Rating: (1.00/10)

Abstract: This work is inspired by a specifically constrained communication model. Collections of communicating objects are given, and communication is by means of several relay centres. Complete cross connectivity between the elements of different collections is the target, supposing that communicating objects differ by their connections to the relay centres. Such models exist only for proper object groups, that is, when the groups have specific sizes and there is a corresponding number of relay points. We consider optimization problems studying the validity boundaries. The terms are combinatorial: geometry of the binary cube, lexicographic orders, shadowing and isoperimetry. The main interest is methodological and aims at extending the consequences that can be derived from the solution of the well-known discrete isoperimetry problem.

Keywords: communication, optimization, isoperimetry.

ACM Classification Keywords: G.2.1 Discrete mathematics: Combinatorics

Link:

CROSS INTERSECTION SEQUEL OF DISCRETE ISOPERIMETRY

Levon Aslanyan, Vilik Karakhanyan

http://www.foibg.com/ijita/vol18/ijita18-3-p01.pdf

ADAPTIVE NEURO-FUZZY KOHONEN NETWORK WITH VARIABLE FUZZIFIER
By: Bodyanskiy et al.  (3732 reads)
Rating: (1.00/10)

Abstract: The problem of self-learning of a neuro-fuzzy Kohonen network with fuzzy inference for clustering tasks under conditions of overlapping classes is considered. The approach is based on probabilistic and possibilistic methods of fuzzy clustering. The main distinction of the introduced neuro-fuzzy network is the ability to adjust the values of the fuzzifier and the synaptic weights in on-line mode, as well as the presence, in addition to the conventional Kohonen layer, of an extra layer that calculates the current values of the membership levels. The network is characterized by computational simplicity, is able to adapt to data uncertainty and detects the appearance of new clusters in real time. The experimental results confirm the effectiveness of the developed approach.
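A hedged sketch of the on-line fuzzy-clustering core behind such networks (probabilistic memberships in fuzzy-c-means form with fuzzifier m, and a membership-weighted center update); the paper's specific rule for adapting the fuzzifier on-line is not reproduced, m is simply a fixed parameter here.

# On-line fuzzy clustering sketch in the spirit of a neuro-fuzzy Kohonen network.
import numpy as np

def memberships(x, centers, m):
    d2 = np.sum((centers - x) ** 2, axis=1) + 1e-12
    u = d2 ** (-1.0 / (m - 1.0))
    return u / u.sum()                                   # probabilistic memberships (sum to one)

def online_fuzzy_kohonen(X, n_clusters=3, m=2.0, lr=0.1, epochs=5, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), n_clusters, replace=False)].astype(float)
    for _ in range(epochs):
        for x in rng.permutation(X):
            u = memberships(x, centers, m)
            centers += lr * (u[:, None] ** m) * (x - centers)   # membership-weighted update
    return centers

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, (100, 2)) for c in [(0, 0), (2, 2), (0, 3)]])
print(online_fuzzy_kohonen(X))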

Keywords: clustering, neuro-fuzzy network, self-learning algorithm, self-organizing Kohonen map.

ACM Classification Keywords: I.2.6 Artificial Intelligence - Learning - Connectionism and neural nets

Link:

ADAPTIVE NEURO-FUZZY KOHONEN NETWORK WITH VARIABLE FUZZIFIER

Yevgeniy Bodyanskiy, Bogdan Kolchygin, Iryna Pliss

http://www.foibg.com/ijita/vol18/ijita18-3-p02.pdf

CORRELATION MAXIMIZATION IN REGRESSION MODELS BASED ON CONVEX COMBINATIONS
By: Oleg Senko, Alexander Dokukin  (3749 reads)
Rating: (1.00/10)

Abstract: A new regression method based on convex correcting procedures over sets of predictors is developed. In contrast to the previously developed approach based on minimization of the generalized error, the proposed one utilizes correcting procedures of maximal correlation with the target value. The proposed approach uses the concept of a set of predictors that is irreducible with respect to the target functional, where irreducibility is understood as the absence of a combination achieving at least the same value of the functional after any of the predictors is removed. Sets of predictors that are simultaneously irreducible and unexpandable are used during the construction of the prognostic rule. The results of computational experiments described in the article provide an efficiency comparison between the two approaches.
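A minimal sketch of the correlation-maximization step, under assumptions of my own (Pearson correlation of a convex combination of predictor outputs with the target, maximized over the probability simplex with SLSQP); the paper's irreducibility and unexpandability machinery is not reproduced.

# Maximize correlation between a convex combination of predictors and the target.
import numpy as np
from scipy.optimize import minimize

def best_convex_combination(P, y):
    # P: (n_samples, n_predictors) predictor outputs; y: target vector
    n_pred = P.shape[1]

    def neg_corr(w):
        return -np.corrcoef(P @ w, y)[0, 1]

    w0 = np.full(n_pred, 1.0 / n_pred)
    res = minimize(neg_corr, w0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * n_pred,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x, -res.fun

rng = np.random.default_rng(0)
y = rng.normal(size=300)
P = np.column_stack([y + rng.normal(0, s, 300) for s in (0.5, 1.0, 3.0)])  # noisy predictors
w, corr = best_convex_combination(P, y)
print("weights:", np.round(w, 3), "correlation:", round(corr, 3))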

Keywords: forecasting, bias-variance decomposition, convex combinations, variables selection.

ACM Classification Keywords: G.3 Probability and Statistics - Correlation and regression analysis, Statistical computing.

Link:

CORRELATION MAXIMIZATION IN REGRESSION MODELS BASED ON CONVEX COMBINATIONS

Oleg Senko, Alexander Dokukin

http://www.foibg.com/ijita/vol18/ijita18-3-p03.pdf

NEURAL NETWORK SEGMENTATION OF VIDEO VIA TIME SERIES ANALYSIS
By: Kinoshenko et al.  (3793 reads)
Rating: (1.00/10)

Abstract: Semantic video retrieval, which deals with unstructured information, traditionally relies on shot boundary detection and key frame extraction. For content interpretation and for similarity matching between shots, video segmentation, i.e. the detection of similarity-based events, is closely related to multidimensional time series representing the video in a feature space. Since video has a high degree of frame-to-frame correlation, bridging the semantic gap is quite difficult, as it requires high-level knowledge and often depends on the particular application domain. A method of video disharmony authentication based on principal component analysis has been proposed. Region features induced by traditional frame segmentation have been used to detect video shots. Results of experiments with endoscopic video are discussed.
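A minimal sketch of the PCA-on-feature-time-series idea (not the paper's method): project per-frame feature vectors onto the leading principal components and flag frames where the jump in the reduced representation is unusually large as candidate shot boundaries; features, component count and threshold below are illustrative assumptions.

# PCA-based candidate shot boundary detection on a multidimensional time series.
import numpy as np

def candidate_boundaries(features, n_components=3, z_thresh=3.0):
    X = features - features.mean(axis=0)
    _, _, Vt = np.linalg.svd(X, full_matrices=False)     # principal directions
    reduced = X @ Vt[:n_components].T
    jumps = np.linalg.norm(np.diff(reduced, axis=0), axis=1)
    z = (jumps - jumps.mean()) / (jumps.std() + 1e-12)
    return np.where(z > z_thresh)[0] + 1                 # frame indices after a large jump

# Toy "video": three segments with different mean feature vectors.
rng = np.random.default_rng(0)
features = np.vstack([rng.normal(m, 0.2, (80, 16)) for m in (0.0, 1.0, -1.0)])
print("candidate shot boundaries:", candidate_boundaries(features))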

Keywords: Video Data, Frames, Time series segmentation, Principal component

ACM Classification Keywords: I.2.10 Vision and Scene Understanding (Video analysis), G.3 Probability and Statistics (Time series analysis).

Link:

NEURAL NETWORK SEGMENTATION OF VIDEO VIA TIME SERIES ANALYSIS

Dmitry Kinoshenko, Sergey Mashtalir, Andreas Stephan, Vladimir Vinarski

http://www.foibg.com/ijita/vol18/ijita18-3-p04.pdf

ASTRONOMICAL PLATES SPECTRA EXTRACTION OBJECTIVES AND POSSIBLE SOLUTIONS ...
By: Knyazyan et al.  (4125 reads)
Rating: (1.00/10)

Abstract: The process of extracting spectra from astronomical images into catalogs, its difficulties, and its application to the Digitized First Byurakan Survey (DFBS) plates are presented. The DFBS is the largest and the first systematic objective-prism survey of the extragalactic sky. The large amount of photometric data is useful for variability studies and for revealing new variables in the observed fields. New high-proper-motion stars can be discovered by comparing many observations from different observatories separated by many years. The difficulty of DFBS image extraction is that extraction tools and programs are not adapted to this kind of plate. The astronomical image extraction process using the Source Extractor (SE) tool is presented in this paper. The specificity of the DFBS plates is that objects appear in low-dispersion spectral form. This does not allow extraction tools to detect the objects' exact coordinates, so coordinate correction is needed. Apart from this, SExtractor has to be configured for this type of plate so that the output results are as close to reality as possible. The extraction of the DFBS plates will allow the creation of a database of astronomical catalogs, which can be cross-correlated with known catalogs to investigate changes in the sky over the years.

Keywords: IVOA, ArVO, DFBS, Plates extraction, SExtractor, VizieR, Astronomical catalogs

ACM Classification Keywords: I.4.1 Imaging geometry, Scanning; I.4.3 Geometric correction; H.2.8 Data Mining, Scientific databases.

Link:

ASTRONOMICAL PLATES SPECTRA EXTRACTION OBJECTIVES AND POSSIBLE SOLUTIONS IMPLEMENTED ON DIGITIZED FIRST BYURAKAN SURVEY (DFBS) IMAGES

Aram Knyazyan, Areg Mickaelian, Hrachya Astsatryan

http://www.foibg.com/ijita/vol18/ijita18-3-p05.pdf

MEMBRANES DISTRIBUTION USING GENETIC ALGORITHMS
By: Miguel Ángel Peña, Juan Castellanos  (3662 reads)
Rating: (1.00/10)

Abstract: Membrane computing is an area of natural computing which solves NP-complete problems by simulating the permeability of living cells' membranes. Different researchers have developed architectures for distributing membranes over clusters. They have studied, at a theoretical level, the system behavior and the minimum time it would take to execute. This paper proposes the use of genetic algorithms to distribute membranes over processors; thanks to their evolving capabilities, they achieve distributions better than a random distribution. Theoretical results are compared with a set of examples, noting the improvement that genetic algorithms produce on these systems and how the architectures are beneficial from the execution viewpoint.
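A hedged genetic-algorithm sketch for this kind of assignment problem (encoding, fitness and operators are illustrative assumptions; the paper's architecture and communication models are not reproduced): each gene is the processor a membrane is assigned to, and the fitness combines the makespan with a crude communication term.

# GA sketch for distributing membranes over processors.
import numpy as np

def fitness(assign, work, n_proc, comm_penalty=0.1):
    loads = np.bincount(assign, weights=work, minlength=n_proc)
    return loads.max() + comm_penalty * np.count_nonzero(loads)   # makespan + communication term

def ga_distribute(work, n_proc, pop=60, gens=200, p_mut=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n = len(work)
    popu = rng.integers(0, n_proc, (pop, n))             # each gene = processor of a membrane
    for _ in range(gens):
        fit = np.array([fitness(ind, work, n_proc) for ind in popu])
        parents = popu[np.argsort(fit)[: pop // 2]]      # truncation selection
        children = []
        while len(children) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)
            child = np.concatenate([a[:cut], b[cut:]])   # one-point crossover
            mask = rng.random(n) < p_mut
            child[mask] = rng.integers(0, n_proc, mask.sum())   # mutation: reassign processor
            children.append(child)
        popu = np.vstack([parents, children])
    fit = np.array([fitness(ind, work, n_proc) for ind in popu])
    return popu[np.argmin(fit)], fit.min()

work = np.random.default_rng(1).uniform(1, 10, 40)       # assumed membrane workloads
best, val = ga_distribute(work, n_proc=6)
print("best fitness:", round(float(val), 2))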

Keywords: Distributed Communication, Membrane Computing, Membrane Dissolution, P-Systems Architectures, Genetic Algorithm

ACM Classification Keywords: F.1.2 Modes of Computation, I.6.1 Simulation Theory, H.1.1 Systems and Information Theory, C.2.4 Distributed Systems

Link:

MEMBRANES DISTRIBUTION USING GENETIC ALGORITHMS

Miguel Ángel Peña, Juan Castellanos

http://www.foibg.com/ijita/vol18/ijita18-3-p06.pdf

VIRTUAL MEMBRANE SYSTEMS
By: Arteta et al.  (2948 reads)
Rating: (1.00/10)

Abstract: Within the membrane computing research field there are many papers about software simulations and a few about hardware implementations. In both cases, algorithms that implement membrane systems in software or hardware and try to take advantage of massive parallelism are developed. P-systems are parallel and non-deterministic systems which simulate the behavior of membranes when processing information. This paper presents software techniques based on the proper utilization of a computer's virtual memory. A study of how much virtual memory is necessary to host a membrane model is included. This method improves performance in terms of time.

Keywords: P-systems, Parallel systems, Natural Computing, evolution rules application, set of patterns, Virtual structure.

ACM Classification Keywords: D.1.m Miscellaneous – Natural Computing

Link:

VIRTUAL MEMBRANE SYSTEMS

Alberto Arteta, Angel Castellanos, Nuria Gómez

http://www.foibg.com/ijita/vol18/ijita18-3-p07.pdf

HIGAIA METHODOLOGY
By: A. Anguera, A. Gutierrez, M.A. Diaz  (4851 reads)
Rating: (1.00/10)

Abstract: At present there is a shortage of scientific theories that support software development. Moreover, the few existing scientific theories do not provide methodological support for all phases of software development. It is necessary to combine both aspects and develop a methodology, supported by a scientific theory, which extends this methodological support to all phases of the software life cycle. The proposed software development methodology combines the theory of Holons and Informons with GAIA, a well-known methodology in the field of Multi-Agent Systems (MAS). The elements defined in the scientific theory are used in the description of the software development phases included in GAIA, extending them to cover the complete software life cycle. The Analysis, Architectural Design and Detailed Design phases of GAIA have been complemented with Requirements Elicitation and Implementation phases, the latter based on the AUML standard. In this way we obtain a complete methodology, supported by a scientific theory, that allows developing software systems based on Holonic Integrated Systems (HIS).

Keywords: HIGAIA, HIS, HIP, Holon, Models, MAS, Software Development Methods.

ACM Classification Keywords: C.2.4 Distributed Systems – Distributed applications, D.2.11 Software Architectures – Languages

Link:

HIGAIA METHODOLOGY

A. Anguera, A. Gutierrez, M.A. Diaz

http://www.foibg.com/ijita/vol18/ijita18-3-p08.pdf

DIGITAL ARCHIVE AND MULTIMEDIA LIBRARY FOR BULGARIAN TRADITIONAL CULTURE ...
By: Pavlov et al.  (3141 reads)
Rating: (1.00/10)

Abstract: In this paper we present an investigation of methods and techniques for digitization and security in a digital folklore archive - an archive that consists of unique folklore artifacts stored and annotated in the National Centre for Non-Material Cultural Heritage, Institute of Folklore, Bulgarian Academy of Sciences. The research is separated into several basic aspects. First, we investigate techniques for digitization of different multimedia types - text, images, audio and video - and apply this research to selected collections of artifacts. Second, we describe several methods applied for securing intellectual property and authors' rights, including digital watermarking and error-correcting codes. The paper also presents the functional specification, implementation and testing procedures of the Bulgarian folklore digital library, where the digital folklore archive is kept.
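As the simplest illustration of the digital-watermarking idea mentioned above (least-significant-bit embedding in a grayscale image; the archive's actual watermarking and error-correcting schemes are certainly more elaborate, and the data below are synthetic):

# LSB watermarking sketch: embed and recover a short message in image pixels.
import numpy as np

def embed_lsb(image, bits):
    flat = image.astype(np.uint8).ravel().copy()
    flat[: len(bits)] = (flat[: len(bits)] & 0xFE) | bits   # overwrite least significant bits
    return flat.reshape(image.shape)

def extract_lsb(image, n_bits):
    return image.ravel()[:n_bits] & 1

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, (64, 64), dtype=np.uint8)       # stand-in for a digitized artifact image
bits = np.unpackbits(np.frombuffer(b"folklore archive watermark", dtype=np.uint8))
stego = embed_lsb(cover, bits)
print(np.packbits(extract_lsb(stego, len(bits))).tobytes())  # recovered watermark message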

Keywords: multimedia digital libraries, digital archive, systems issues, user issues, online information services, watermarking.

ACM Classification Keywords: H.3.5 Online Information Services – Web-based services, H.3.7 Digital Libraries – Collection, Dissemination, System issues, K.6.5 Security and Protection.

Link:

DIGITAL ARCHIVE AND MULTIMEDIA LIBRARY FOR BULGARIAN TRADITIONAL CULTURE AND FOLKLORE

Radoslav Pavlov, Galina Bogdanova, Desislava Paneva-Marinova, Todor Todorov, Konstantin Rangochev

http://www.foibg.com/ijita/vol18/ijita18-3-p09.pdf

CORRELATION ANALYSIS OF EDUCATIONAL DATA MINING BY MEANS A POSTPROCESSOR’S...
By: Teodorov et al.  (3088 reads)
Rating: (1.00/10)

Abstract: The paper deals with correlation analysis as an educational data mining technique that is easy to interpret and simple to implement. Two datasets are gathered, one from an environment for knowledge testing and one from an environment for exercise task modelling testing. The programming of tasks for test parameter relationships, test reliability, cheat recognition, and test validation in a specialized postprocessor tool is discussed.
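A small sketch of the kind of correlation computation involved (Pearson item-total correlations, a standard ingredient of test reliability analysis; the data and the weak-item threshold below are illustrative, not taken from the paper's datasets):

# Item-total correlation sketch for test score data.
import numpy as np

def item_total_correlations(scores):
    # scores: (n_students, n_items) matrix of item scores
    total = scores.sum(axis=1)
    return np.array([np.corrcoef(scores[:, j], total - scores[:, j])[0, 1]   # corrected item-total r
                     for j in range(scores.shape[1])])

rng = np.random.default_rng(0)
ability = rng.normal(size=200)
difficulty = rng.uniform(-1, 1, 12)
scores = (ability[:, None] - difficulty[None, :] + rng.normal(0, 1, (200, 12)) > 0).astype(float)
r = item_total_correlations(scores)
print("item-total correlations:", np.round(r, 2))
print("possibly weak items:", np.where(r < 0.2)[0])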

Keywords: data mining, correlation analysis, dataset, test reliability, test validity, cheat recognition, postprocessor tool

ACM Classification Keywords: Computer and Information Science Education, Knowledge Representation

Link:

CORRELATION ANALYSIS OF EDUCATIONAL DATA MINING BY MEANS A POSTPROCESSOR’S TOOL

Georgi Teodorov, Oktay Kir, Irina Zheliazkova

http://www.foibg.com/ijita/vol18/ijita18-3-p10.pdf

ELECTION DATA VISUALIZATION
By: Elena Long, Vladimir Lovitskii, Michael Thrasher  (3231 reads)
Rating: (1.00/10)

Abstract: Data visualization is directly linked to data interface, data capture, data analysis, and data presentation. At the present time there is still a huge gap between our ability to extract answers and our ability to present the information in meaningful ways. There is consensus that future breakthroughs will come from integrated solutions that allow end users to explore data using graphical metaphors - the goal is to unify data mining algorithms and visual human interfaces. The main purpose of our paper is to discuss one approach to that "breakthrough". The paper uses data from recent UK parliamentary elections to illustrate the approach.

Keywords: natural interface, data visualization, graphical interface

ACM Classification Keywords: I.2 Artificial intelligence: I.2.7 Natural Language Processing: Text analysis.

Link:

ELECTION DATA VISUALIZATION

Elena Long, Vladimir Lovitskii, Michael Thrasher

http://www.foibg.com/ijita/vol18/ijita18-2-p05.pdf

COGNITIVE MODELLING AS THE INSTRUMENT IN THE COURSE OF KNOWLEDGE OF LARGE SYSTEM
By: Galina Gorelova  (3432 reads)
Rating: (1.00/10)

Abstract: In this report we consider the possibilities offered by the cognitive methodology for modeling complex systems (social and economic, sociotechnical) and by the developed software, from the standpoint of the process of acquiring knowledge about a complex object, as well as of extracting different aspects of knowledge from data about the object. The content and program of research on complex systems are set in the form of a model of a metaset of the system under study, whose distinctive feature is the description not only of the large system and its interaction with the environment, but also the introduction into the metaset of an "observer", which allows building a methodology for research and decision-making that takes into account the development of the process of cognition of the object in the consciousness of the researcher. In general, the model of the complex system is constructed in the form of a hierarchical dynamical cognitive model. The mathematical model is subjected to formal investigation. Connectivity, complexity, controllability, stability, sensitivity, adaptability and other properties of the model are analyzed, on the basis of which a conclusion is drawn about the presence (or absence) of similar properties in the large system under study. In the course of the research, self-training of the analyst (the "observer") takes place through the use of the developed toolkit for extracting knowledge about the object and for decision-making.

Keywords: The expert, extraction of knowledge, cognitive, complex system, model, behavior, structure, decision-making, information technology.

ACM Classification Keywords: I.2.0 General - Cognitive simulation

Link:

COGNITIVE MODELLING AS THE INSTRUMENT IN THE COURSE OF KNOWLEDGE OF LARGE SYSTEM

Galina Gorelova

http://www.foibg.com/ijita/vol18/ijita18-2-p04.pdf

CONCEPTUAL KNOWLEDGE MODELING AND SYSTEMATIZATION ON THE BASIS OF NATURAL ...
By: Bondarenko et al.  (3528 reads)
Rating: (1.00/10)

Abstract: Knowledge management is aimed at sustainable development and at increasing the competitiveness of an organization, a state, a human. The appropriateness of applying knowledge to solving a new class of complete, ill-structured qualitative problems in weakly structured domains is noted; examples are many problems in social (organizational) and ecological systems, the creation of the information society, the creation and implementation of new information technologies, the improvement of management, and many others. Solving such problems also requires a preliminary information-analytical phase that takes into account the semantics of information and applies a new, effective system methodology - systemology - which corresponds to the new noospheric stage of the development of science. The sections of this work are meaningfully combined, first of all, by the use of the new method of systemological classification analysis for knowledge systematization and the creation of conceptual models, taking into account the criteria of natural classification. The method of systemological classification analysis allows, for example, obtaining new deep knowledge and systematizing the knowledge in any domain most adequately and objectively, taking into account the essential properties and relations. Using systemological classification analysis makes it possible to evaluate any knowledge classification, to take into account the essential properties and relations of objects, and to predict new objects on the basis of their properties. New constructive criteria of natural classification allow creating "correct" classifications, which ensure the effectiveness of problem solving in all spheres of application. Examples of the application of systemology and systemological classification analysis are given for creating domain ontological models - social networks, change management, human needs - directly related to knowledge management, as well as for developing the products and services catalog of an online store. When using systemology in change management, organizations can obtain weighty benefits. The application of systemological classification analysis to social networks allows increasing the effectiveness of their functioning on the basis of knowledge systematization (through the development of an effective system of functions and menus). All this will help a company to increase its intellectual capital significantly without large investments.

Keywords: knowledge systematization, natural classification, ontology, systemological classification analysis, conceptual knowledge, conceptual modeling, deep knowledge, knowledge management, social network in Internet, change management, hierarchy, systemology, artificial intelligence, Protégé, context diagram.

ACM Classification Keywords: I.2 Artificial Intelligence – I.2.6 Learning: Knowledge Acquisition

Link:

CONCEPTUAL KNOWLEDGE MODELING AND SYSTEMATIZATION ON THE BASIS OF NATURAL CLASSIFICATION

Mikhail Bondarenko, Nikolay Slipchenko, Kateryna Solovyova, Andrey Danilov, Ruslan Kovalchuk, Irina Shcurenko

http://www.foibg.com/ijita/vol18/ijita18-2-p03.pdf

GOD-ICS. ON FUNDAMENTAL INFORMATION FIELD QUEST
By: Vitaliy Lozovskiy  (5283 reads)
Rating: (1.00/10)

Abstract: Further progress in AI research requires a more complete and comprehensive study of information interactions in nature, not confined to the psyche and intellect of individuals. One should not ignore evidence in favor of "unconventional" information interactions. The paper deals with two aspects of this research: the physics of the microworld, and improving the accuracy and correctness of experimental studies. The concept of a "natural-science" God - God-ICS - is introduced. Some arguments in favor of the existence of a fundamental information field are examined. The concept of non-locality introduced in quantum mechanics, "spooky action at a distance", and experiments admittedly demonstrating its reality are considered. An idea of an RNG-controlled two-slit experiment is proposed. The relation between reality and the theories modeling it is specified, a question which quantum mechanics still manages to safely get around. One of the well-documented experiments on the registration of psychic phenomena is critically considered, and the need for careful research and parameter selection for control random sequences is identified. Ignoring this aspect may lead to erroneous conclusions regarding the "detection" of phenomena lying at the limit of the accuracy and reliability of measurements and of the uncertainties in their statistical representativeness. Proposals for further research in this area are formulated.

Keywords: philosophy, noosphere, esotericism, intangible world, mystic theories, consciousness, mind-matter interaction, quantum mechanics, nonlocality.

ACM Classification Keywords: A.0 General Literature - Conference proceedings, G.3 PROBABILITY AND STATISTICS - Random number generation, H.1.1 Systems and Information Theory (E.4), I.2 ARTIFICIAL INTELLIGENCE - I.2.0 General: Cognitive simulation, Philosophical foundations, I.6 SIMULATION AND MODELING (G.3), I.6.5 Model Development - Modeling methodologies, J.2 PHYSICAL SCIENCES AND ENGINEERING - Physics

Link:

GOD-ICS. ON FUNDAMENTAL INFORMATION FIELD QUEST

Vitaliy Lozovskiy

http://www.foibg.com/ijita/vol18/ijita18-2-p02.pdf
