ИНФОРМАЦИОННЫЕ МАШИНЫ: НЕКОТОРЫЕ КАТЕГОРИИ
By: Мержвинский Анатолий Александрович  (3110 reads)
Rating: (1.00/10)

Abstract: It is noted that the development of information-science theory has not yet produced a unified definition of its broadest concept, the information machine (IM). The aim of the article is to identify categories of the relevant components, to develop a generalized scheme of the IM, and to improve the ontology. Based on an analysis of interactions between objects, a concept of the structure, the main components, and a definition of the IM as a generic notion covering computer and information systems are presented. It is shown that all types of interaction between IM components are characterized by the presence of interaction carriers in the form of material or energy flows, called communicates. The concept of the IM structure introduces the universal notion of an operant: an implementer of operations at any level, from the simplest (operations on communicates) to the most complex (operations on knowledge). By analogy with the notions of "pixel", "voxel", etc., the notion of an ixel is introduced and defined: the simplest material elements that transform communicates into information objects and vice versa, and that also perform the operations of fixing, storing, displaying and transmitting information. By analogy with the ixel, a universal structure is defined: the aggregate "physical object – information object" (FIOb for short). The set of aggregates is represented by a diagram of domains of objects of category R of the material world and of reflections of material objects and of mental activity. Completed acts of interaction between communicants, by analogy with a logical unit of work on data, are defined as transactions. Depending on the hypostasis of the aggregate, material (R) or informational, categories of transactions are defined, as are the classes of information machines (containing I-objects), non-information machines (not containing I-objects) and IM components.

Keywords: INFORMATION MACHINE, FUNCTIONAL CHAIN, INTERACTIONS, COMMUNICATE, OPERANT.

ACM Classification Keywords: Theory of Information. Philosophy and Methodology of Informatics.

Link:

ИНФОРМАЦИОННЫЕ МАШИНЫ: НЕКОТОРЫЕ КАТЕГОРИИ ФУНКЦИЙ И КОМПОНЕНТ

Мержвинский Анатолий Александрович

http://www.foibg.com/ijita/vol20/ijita20-01-p04.pdf

TOWARDS A NOVEL DESIGN OF CREATIVE SYSTEMS
By: Vladimir Jotsov  (3368 reads)
Rating: (1.00/10)

Abstract: The topic of the presented investigation is the automation of creative processes via one Synthetic Meta Method (SMM) and a few analytic methods considered in the paper and used under SMM control. The prevention of contemporary web threats is discussed at an agent/application level. Advantages and disadvantages of synthetic data mining methods are investigated, and obstacles to their application in contemporary systems are revealed. Novel results for juxtaposing statistical vs. logical data mining methods, aiming at possible evolutionary fusions, are described. Recommendations are made on how to build more effective applications of classical and/or the presented novel (meta) methods: SMM, KALEIDOSCOPE, FUNNEL, PUZZLE, and CONTRADICTION. The usage of ontologies is investigated with the purpose of information transfer by sense. Practical aspects of agent applications, intrusion detection, intrusion prevention, cryptography applications, multiple software and other research results are mentioned, aiming to show that intelligent and classical technologies should be carefully combined in one software/hardware complex to achieve the creative goals. It is shown that all the demonstrated advantages may be combined with other known methods and technologies.

Keywords: automation of creative processes, human-machine brainstorming methods, knowledge discovery, data mining, web mining, ontology, information security systems, intrusion detection, intrusion prevention, human-centered systems, knowledge management, agent, collective evolution.

Link:

TOWARDS A NOVEL DESIGN OF CREATIVE SYSTEMS

Vladimir Jotsov

http://www.foibg.com/ijita/vol20/ijita20-01-p03.pdf

RISKS IN USING BIBLIOMETRIC INDICATORS FOR PERFORMANCE EVALUATION OF SCIENTISTS
By: Douhomir Minev  (3014 reads)
Rating: (1.00/10)

Abstract: The issues discussed in this article are the consequences of the use of specific (journal-, article- and researcher-based) metrics ("bibliometric indices") for assessing the performance of scientists and research proposals. The analysis is focused on the potential of such indices to operate as a mechanism for control over the production of knowledge. The methodology is based on the complexity of the relationships between the sciences, as systems for the production of knowledge, and their surrounding social environment. In these interactions arise motives for control over and impact on knowledge production. The effects of these motives are expanding mechanisms for control over the sciences and the knowledge they produce. The impact of the control mechanisms distorts knowledge and co-generates non-knowledge. When societies use distorted knowledge they face an expansion of the so-called "new risks". On this basis, "bibliometric indices" are identified as components of a larger (in many cases supranational) system for control over knowledge production (the dynamics of the sciences) and as generators of distorted knowledge and of unexpected and negative consequences (new risks) for societies.

Keywords: control (over sciences and knowledge); crisis of sciences; social knowledge; distorted knowledge; non-knowledge; new risks; bibliometrics; academic assessment.

Link:

RISKS IN USING BIBLIOMETRIC INDICATORS FOR PERFORMANCE EVALUATION OF SCIENTISTS

Douhomir Minev

http://www.foibg.com/ijita/vol20/ijita20-01-p02.pdf

USEFULNESS OF SCIENTIFIC CONTRIBUTIONS
By: Krassimir Markov, Krassimira Ivanova, Vitalii Velychko   (3814 reads)
Rating: (1.00/10)

Abstract: The prevailing role of counting citations over evaluating the added scientific value distorts the scientific community. As a result, scientific work becomes a kind of business aimed, for instance, at obtaining as many citations as possible. It is important to counterbalance the role of counting citations by using additional qualitative criteria. The aim of this survey is to discuss an approach based on a measure of the "usefulness of scientific contribution", called the "usc-index" and published in Markov et al, 2013. It is grounded in the theory of the Knowledge Market. Accordingly, we recall the main elements of this theory. After that we recall some information about Bibliometrics, Scientometrics, Informetrics and Webometrics, as well as some critical analyses of journal metrics and quantity measures. Finally, we outline the approach for evaluating the usefulness of scientific contributions.

Keywords: Information Market, Knowledge Market, Usefulness of the Scientific Contributions

ACM Classification Keywords: A.1 Introductory and Survey

Link:

USEFULNESS OF SCIENTIFIC CONTRIBUTIONS

Krassimir Markov, Krassimira Ivanova, Vitalii Velychko

http://www.foibg.com/ijita/vol20/ijita20-01-p01.pdf

МЕТОД ТРАНСЛЯЦИИ SDL-СПЕЦИФИКАЦИЙ...
By: Анастасия Заболотная  (3508 reads)
Rating: (1.00/10)

Abstract: SDL specifications of distributed systems with dynamic creation and deletion of process instances are considered. A method is proposed for translating them into modified coloured Petri nets: hierarchical timed typed nets that use Merlin's concept of interval time. A natural approach to verification is based on formal models such as finite automata, Petri nets and their generalizations, which simplifies the analysis and verification process. The paper describes SDL systems with timers, which allow an adequate representation of a significant class of communication protocols. The algorithm for translating SDL specifications into the net models of the SDLE system is implemented as a two-stage translation method. The modelling approach relies on the fact that, in the multi-level SDL description of a system, the position of each process instance in the overall hierarchy remains fixed, so the system description can be translated into the net structure while process instances are modelled by tokens. The algorithm produces a net model in which each place contains at most one token modelling a given process instance. Thus, if up to n different instances of a process can exist while the system is running, each place of the net modelling that process may contain at most n tokens, each token corresponding to its own process instance. This significantly improves the efficiency of modelling, since it substantially reduces the enumeration of variable bindings.

Keywords: Petri nets, SDL, net model, communication protocol, process instance.

Link:

МЕТОД ТРАНСЛЯЦИИ SDL-СПЕЦИФИКАЦИЙ С ПОМОЩЬЮ МОДИФИЦИРОВАННЫХ СЕТЕЙ ПЕТРИ ВЫСОКОГО УРОВНЯ

Анастасия Заболотная

http://www.foibg.com/ijima/vol01/ijima01-4-p09.pdf

ВЫБОР ИСТОЧНИКОВ ДАННЫХ ДЛЯ РЕАЛИЗАЦИИ...
By: Нина Баканова  (3414 reads)
Rating: (1.00/10)

Abstract: The paper considers an approach that makes it possible to analyse the information potential of an organizational management system with respect to the feasibility of implementing support modes for managerial decision making. The analysis is based on a study of the main components of the management process: task functions and operation functions.

Keywords: organizational management systems, support of managerial activity, data sources, improvement of management efficiency.

Link:

ВЫБОР ИСТОЧНИКОВ ДАННЫХ ДЛЯ РЕАЛИЗАЦИИ ИНФОРМАЦИОННОЙ ПОДДЕРЖКИ УПРАВЛЕНЧЕСКОЙ ДЕЯТЕЛЬНОСТИ

Нина Баканова

http://www.foibg.com/ijima/vol01/ijima01-4-p08.pdf

ИДЕНТИФИКАЦИЯ ВРЕМЕНИ РАСПРОСТРАНЕНИЯ...
By: Александр Джулай, Артем Быченко  (4042 reads)
Rating: (1.00/10)

Abstract: The article considers the identification of fire propagation time on the basis of a TSK network.

Keywords: neural networks, expert estimates

ACM Classification Keywords: H.4 Information Systems Applications, J.6 Computer-aided Engineering

Link:

ИДЕНТИФИКАЦИЯ ВРЕМЕНИ РАСПРОСТРАНЕНИЯ ПОЖАРА НА ОСНОВЕ СЕТИ TSK

Александр Джулай, Артем Быченко

http://www.foibg.com/ijima/vol01/ijima01-4-p07.pdf

ОЦЕНКА ИНТЕРВАЛЬНЫХ АЛЬТЕРНАТИВ:...
By: Михаил Стернин, Геннадий Шепелёв  (4163 reads)
Rating: (1.00/10)

Abstract: The paper considers decision making under uncertainty with a single interval-valued quality index for the alternatives being compared. The notion of a hierarchy of uncertainties is introduced for sets of alternatives whose quality indices are described by different representations of interval estimates: mono-interval and poly-interval. Because intervals in such sets typically overlap in choice problems, the comparison of such "interval" alternatives can only be resolved by taking into account the preferences of the decision maker. Within the introduced hierarchy, several approaches to comparing interval alternatives and the associated ways of describing the decision maker's preferences are proposed. The possibility of introducing preferences on the basis of the authors' previously proposed method for computing a confidence coefficient for the hypothesis that one alternative is preferable to another is analysed. The paper also examines comparing interval alternatives and describing preferences by means of utility functions reflecting three basic kinds of decision-maker preferences: risk neutrality, constant risk aversion and constant risk proneness. Point estimates equivalent to the compared interval ones, computed within the utility-function framework ("certainty equivalents"), are also contrasted with Hurwicz "pessimism–optimism" estimates. It is shown that in both of these methods the comparison results are averaged indicators characteristic of repeatedly recurring situations, which are then used to describe preferences, and this is not always adequate to the content of the problems being solved. A new method for comparing interval quantities and describing preferences is therefore proposed, in which estimates are compared and preferences are specified by juxtaposing "guaranteed" values of the differences of the quality indices of the compared alternatives, treated as random variables, for expert-chosen chance levels of realizing the hypothesis of preferability. This comparison and preference-description method, which is free of averaged quantities, is illustrated for the case of the authors' relations for the probability distribution functions of the difference of two uniformly distributed quantities defined on the compared intervals. A numerical example of comparing interval alternatives by different methods is given.
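
Since the final method in the abstract compares "guaranteed" values of the difference of two interval quality indices treated as random variables, a small Monte-Carlo sketch of that idea may help (uniform distributions on the intervals, a quantile of the difference at a chosen chance level, plus the Hurwicz score for contrast). It illustrates the concept only and does not reproduce the authors' analytic formulas.

    # Sketch: comparing two interval alternatives via the distribution of their difference.
    import numpy as np

    rng = np.random.default_rng(42)

    def guaranteed_difference(interval_a, interval_b, chance=0.8, n=100_000):
        """Value d such that P(A - B >= d) = chance, with A and B uniform on the given intervals."""
        a = rng.uniform(*interval_a, n)
        b = rng.uniform(*interval_b, n)
        return float(np.quantile(a - b, 1.0 - chance))

    def hurwicz(interval, optimism=0.5):
        lo, hi = interval
        return optimism * hi + (1.0 - optimism) * lo

    A, B = (3.0, 9.0), (4.0, 7.0)                    # overlapping interval estimates
    print(guaranteed_difference(A, B, chance=0.8))   # difference guaranteed with an 80% chance
    print(hurwicz(A), hurwicz(B))                    # averaged Hurwicz scores, for comparison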

Keywords: interval alternatives, hierarchy of uncertainties, preferences, utility functions, probability distribution of difference for two random variables, methods comparing interval alternatives.

ACM Classification Keywords: H.1.2 Human information processing. G.3 Distribution functions. I.2.3 Uncertainty, "fuzzy", and probabilistic reasoning.

Link:

ОЦЕНКА ИНТЕРВАЛЬНЫХ АЛЬТЕРНАТИВ: НЕОПРЕДЕЛЕННОСТИ И ПРЕДПОЧТЕНИЯ

Михаил Стернин, Геннадий Шепелёв

http://www.foibg.com/ijima/vol01/ijima01-4-p05.pdf

РЕТРОСПЕКТИВНЫЙ АНАЛИЗ РЕЗУЛЬТАТИВНОСТИ ...
By: Петровский et al.  (3346 reads)
Rating: (1.00/10)

Abstract: The paper considers a new approach oriented towards retrospective analysis of the effectiveness of research projects. The approach makes it possible to derive integral indicators for evaluating the effectiveness of research projects using methods of group verbal decision analysis and multiset theory. The application of the proposed approach to identifying the most effective research projects in the Russian Foundation for Basic Research is considered. A multi-criteria analysis is carried out of the results planned when submitting the project proposal, the intermediate results obtained during the project, and the final results upon its completion.

Keywords: group verbal decision analysis, integral evaluation indicator, effectiveness of a research project, retrospective analysis

Link:

РЕТРОСПЕКТИВНЫЙ АНАЛИЗ РЕЗУЛЬТАТИВНОСТИ НАУЧНЫХ ПРОЕКТОВ

Алексей Петровский, Григорий Ройзензон, Александр Балышев, Игорь Тихонов

http://www.foibg.com/ijima/vol01/ijima01-4-p04.pdf

АНАЛИЗ ФИНАНСОВОГО СОСТОЯНИЯ...
By: Ови Нафас Агаи Аг Гамиш, Юрий Зайченко  (3500 reads)
Rating: (1.00/10)

Abstract: The problem of corporate bankruptcy risk prediction is considered. Classical methods of discriminant analysis are described and analyzed. The matrix method based on fuzzy sets and new methods using fuzzy neural networks for bankruptcy risk prediction are considered. Experimental investigations of the classical and the new fuzzy methods for predicting the bankruptcy risk of Ukrainian corporations were carried out, their efficiency was estimated, and the best method for the Ukrainian economy was determined.
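
As one example of the "classical discriminant analysis" the abstract refers to, a sketch of the well-known Altman Z-score model follows; the coefficients and thresholds below are Altman's original ones for public manufacturing firms and are not taken from this paper.

    # Sketch: Altman Z-score, a classical discriminant-analysis model for bankruptcy risk.
    def altman_z(working_capital, retained_earnings, ebit, market_value_equity,
                 sales, total_assets, total_liabilities):
        x1 = working_capital / total_assets
        x2 = retained_earnings / total_assets
        x3 = ebit / total_assets
        x4 = market_value_equity / total_liabilities
        x5 = sales / total_assets
        return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 0.999 * x5

    def risk_zone(z):
        # Conventional interpretation thresholds for the original model.
        if z < 1.81:
            return "distress"
        if z < 2.99:
            return "grey zone"
        return "safe"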

Keywords: bankruptcy risk prediction, method of discriminant analysis, fuzzy neural networks

Link:

АНАЛИЗ ФИНАНСОВОГО СОСТОЯНИЯ И ПРОГНОЗИРОВАНИЕ РИСКА БАНКРОТСТВА КОРПОРАЦИЙ В УСЛОВИЯХ НЕОПРЕДЕЛЕННОСТИ

Ови Нафас Агаи Аг Гамиш, Юрий Зайченко

http://www.foibg.com/ijima/vol01/ijima01-4-p03.pdf

ПЕРСПЕКТИВНЫЕ НАПРАВЛЕНИЯ РАЗВИТИЯ ...
By: Олег Майданович, Михаил Охтилев, Борис Соколов  (3834 reads)
Rating: (1.00/10)

Abstract: Problems of creating and applying automated systems are considered. Particular attention is paid to one important class of automated systems: real-time automated monitoring systems (AMS) for the states of complex organizational-technical complexes (COTC), taking into account the possible degradation of their structures. A review is given of existing research and technological approaches to the problems of creating and applying AMS for COTC state monitoring and real-time control.

Keywords: intelligent information technologies for monitoring and control of complex objects.

ACM Classification Keywords: J.6 Computer-Aided Engineering and I.2.2 Automatic Programming.

Link:

ПЕРСПЕКТИВНЫЕ НАПРАВЛЕНИЯ РАЗВИТИЯ ИНФОРМАЦИОННЫХ ТЕХНОЛОГИЙ МОНИТОРИНГА И УПРАВЛЕНИЯ СОСТОЯНИЯМИ СЛОЖНЫХ ТЕХНИЧЕСКИХ ОБЪЕКТОВ В РЕАЛЬНОМ МАСШТАБЕ ВРЕМЕНИ

Олег Майданович, Михаил Охтилев, Борис Соколов

http://www.foibg.com/ijima/vol01/ijima01-4-p02.pdf

НЕЧЕТКИЙ МЕТОД ИНДУКТИВНОГО МОДЕЛИРОВАНИЯ
By: Юрий Зайченко  (4031 reads)
Rating: (1.00/10)

Abstract: The problem of predicting British Petroleum Corp. stock prices and the Dow Jones Industrial Average stock quote is considered. Stock quotes of the largest oil companies on the NYSE were used as input data for the prediction. The experimental prediction results obtained with FGMDH were compared with the classical GMDH and cascade neo-fuzzy neural networks. For the classical and fuzzy GMDH, four classes of functions were used: linear, quadratic, Fourier polynomial and Chebyshev polynomial; the form of the membership function, the size of the learning sample and the freedom of choice were varied using the developed software. Experimental forecasting results at the NYSE are presented, making it possible to estimate the efficiency of the different forecasting methods and to choose the most suitable one.
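
A minimal sketch of one selection layer of the classical GMDH (not the fuzzy FGMDH studied in the paper): every pair of inputs gets a quadratic partial description fitted by least squares on a training split and ranked by error on a validation split. The variable names, data split and toy data are illustrative assumptions.

    # Sketch: one layer of the classical Group Method of Data Handling (GMDH).
    import itertools
    import numpy as np

    def partial_description(xi, xj):
        """Design matrix for y = a0 + a1*xi + a2*xj + a3*xi*xj + a4*xi^2 + a5*xj^2."""
        return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi ** 2, xj ** 2])

    def gmdh_layer(X_train, y_train, X_valid, y_valid, keep=3):
        """Fit a partial model for every input pair, rank by validation MSE, keep the best ones."""
        models = []
        for i, j in itertools.combinations(range(X_train.shape[1]), 2):
            A = partial_description(X_train[:, i], X_train[:, j])
            coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
            pred = partial_description(X_valid[:, i], X_valid[:, j]) @ coef
            models.append((float(np.mean((pred - y_valid) ** 2)), (i, j), coef))
        models.sort(key=lambda m: m[0])
        return models[:keep]   # the survivors become the inputs of the next layer

    # Toy usage: 4 input series and one target series.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 4))
    y = 0.5 * X[:, 0] * X[:, 1] + X[:, 2] + rng.normal(scale=0.1, size=60)
    print(gmdh_layer(X[:40], y[:40], X[40:], y[40:])[0][:2])   # best pair and its validation MSE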

Keywords: fuzzy group method of data handling, stock exchange, stock prices forecasting, cascade neo-fuzzy neural networks.

Link:

НЕЧЕТКИЙ МЕТОД ИНДУКТИВНОГО МОДЕЛИРОВАНИЯ В ЗАДАЧАХ ПРОГНОЗИРОВАНИЯ НА ФОНДОВЫХ РЫНКАХ

Юрий Зайченко

http://www.foibg.com/ijima/vol01/ijima01-4-p01.pdf

HTML VALIDATION THROUGH EXTENDED VALIDATION SCHEMA
By: Radoslav Radev  (4416 reads)
Rating: (1.00/10)

Abstract: The paper presents an extensible software architecture, a prototype and an implementation of a highly configurable system for HTML validation. It is based on validation rules defined in an XML document called an "extended validation schema". It serves as an extended validation schema alongside the official HTML specification, because differences in HTML rendering between browsers and other web clients make the HTML specification insufficient: it is perfectly possible for an HTML document to be syntactically valid and yet not be rendered well in some browser or mail client. The extended validation schema allows the definition of custom and specific validation rules at three levels: document rules, element (or tag) rules and attribute rules. The correctness of the validation schema is checked via a predefined XSD schema. The paper defines a prototype of a validation engine that consists of an HTML parser, an HTML validator, a Storage module and a Statistics module. The HTML parser parses the HTML file and breaks it into its constituent elements. The HTML validator applies the custom validations defined in the extended validation schema to every element and attribute, along with document-level validations, and also automatically corrects errors wherever possible. The Storage module saves the validation results to persistent storage; they can be used for unit tests and by the Statistics module to create additional statistics, analyses, quality assurance and bug tracking. A comparison is made with other HTML validation services and solutions. The results of an implementation of the prototype system in a software company are also presented.
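
The following sketch mimics the element/attribute level of such rule-driven validation using Python's standard html.parser; the rule format shown (a plain dictionary of required attributes) is an illustrative stand-in for the paper's XML "extended validation schema", not its actual format.

    # Sketch: rule-driven HTML element/attribute validation.
    from html.parser import HTMLParser

    RULES = {  # hypothetical element-level rules: required attributes per tag
        "img": {"required_attrs": ["alt", "width", "height"]},
        "a":   {"required_attrs": ["href"]},
    }

    class RuleValidator(HTMLParser):
        def __init__(self):
            super().__init__()
            self.errors = []

        def handle_starttag(self, tag, attrs):
            rule = RULES.get(tag)
            if not rule:
                return
            present = {name for name, _ in attrs}
            for required in rule["required_attrs"]:
                if required not in present:
                    self.errors.append(f"<{tag}> missing attribute '{required}'")

    validator = RuleValidator()
    validator.feed('<a>link</a><img src="logo.png">')
    print(validator.errors)   # ["<a> missing attribute 'href'", "<img> missing attribute 'alt'", ...]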

Keywords: HTML validation, XML schema, quality assurance, unit tests, bugs tracking.

ACM Classification Keywords: D.4.m Software – Miscellaneous.

Link:

HTML VALIDATION THROUGH EXTENDED VALIDATION SCHEMA

Radoslav Radev

http://www.foibg.com/ijima/vol01/ijima01-3-p09.pdf

ANALYSIS AND JUSTIFICATION FOR SELECTION PARAMETERS OF WIRED ACCESS SYSTEMS
By: Svetlana Sakharova  (3944 reads)
Rating: (1.00/10)

Abstract: The research presented belongs to the area of designing prospective access networks. The work analyses the parameters of access networks and selects the most significant among them. Results are given for wired solutions for organizing the network.

Keywords: access network, parameters of access networks.

ACM Classification Keywords: C.2 Computer-communication networks, H. Information Systems - H.1 Models and Principles, K. Computing Milieux - K.6 Management of computing and information systems.

Link:

ANALYSIS AND JUSTIFICATION FOR SELECTION PARAMETERS OF WIRED ACCESS SYSTEMS

Svetlana Sakharova

http://www.foibg.com/ijima/vol01/ijima01-3-p08.pdf

THEORETICAL ANALYSIS OF EMPIRICAL RELATIONSHIPS FOR PARETO-DISTRIBUTED...
By: Vladimir Atanassov, Ekaterina Detcheva  (3999 reads)
Rating: (1.00/10)

Abstract: In this paper we study some problems involved in the analysis of Pareto-distributed scientometric data (series of citations versus paper ranks). The problems include appropriate choices of i) the distribution type (continuous, discrete or finite-size discrete) and ii) the statistical method used to obtain unbiased estimates of the power-law exponent (maximum likelihood procedure or least-squares regression). Since relatively low magnitudes of the power exponent (less than 2) are observed massively in scientometric databases, the finite-size discrete Pareto distribution (citations distributed over a finite number of paper ranks) appears to be more adequate for data analysis than the traditional ones. This conclusion is illustrated with two examples (for synthetic and actual data, respectively). We also derive empirical relationships, in particular for the dependence of the maximum and the total number of citations on the Hirsch index. The latter generalize results of previous studies.
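
For orientation, the standard continuous maximum-likelihood estimator of the power-law exponent is sketched below; the paper's finite-size discrete treatment differs, so this is only the textbook baseline that the abstract contrasts against.

    # Sketch: continuous power-law (Pareto) exponent estimated by maximum likelihood.
    import math

    def pareto_mle_exponent(citations, x_min=1.0):
        """alpha_hat = 1 + n / sum(ln(x_i / x_min)) over all x_i >= x_min (continuous approximation)."""
        xs = [c for c in citations if c >= x_min]
        n = len(xs)
        return 1.0 + n / sum(math.log(x / x_min) for x in xs)

    # Example on a small synthetic citation series (ranks 1..10):
    print(pareto_mle_exponent([120, 60, 33, 20, 11, 7, 4, 2, 1, 1]))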

Keywords: Scientometrics, Hirsch index, Pareto distributions, data analysis, empirical relationships

ACM Classification Keywords: H. Information Systems, H.2. Database Management, H.2.8. Database applications, subject: Scientific databases; I. Computing methodologies, I.6 Simulation and Modeling, I.6.4. Model Validation and Analysis

Link:

THEORETICAL ANALYSIS OF EMPIRICAL RELATIONSHIPS FOR PARETO-DISTRIBUTED SCIENTOMETRIC DATA

Vladimir Atanassov, Ekaterina Detcheva

http://www.foibg.com/ijima/vol01/ijima01-3-p07.pdf

THE USE OF TIME-SERIES OF SATELLITE DATA TO FLOOD RISK MAPPING
By: Sergii Skakun  (4724 reads)
Rating: (1.00/10)

Abstract: In this paper we propose a novel approach to flood hazard mapping by processing and analyzing a time series of satellite data and derived flood extent maps. This approach is advantageous in cases where the use of hydrological models is complicated by the lack of data, in particular a high-resolution DEM. We applied this approach to the time series of Landsat-5/7 data acquired from 2000 to 2010 for the Katima Mulilo region in Namibia. We further integrated the flood hazard map with a dwelling-unit database to derive a flood risk map.
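
A minimal sketch of the core idea as we read it from the abstract (not the paper's exact procedure): stack per-date binary flood extent masks and take the per-pixel inundation frequency as a relative flood hazard index.

    # Sketch: per-pixel flood hazard as inundation frequency over a time series of flood masks.
    import numpy as np

    def flood_hazard(flood_masks):
        """flood_masks: iterable of 2-D boolean arrays (True = flooded) on the same grid.
        Returns an array in [0, 1]: the fraction of observations in which each pixel was flooded."""
        stack = np.stack([np.asarray(m, dtype=float) for m in flood_masks], axis=0)
        return stack.mean(axis=0)

    # Example with three toy 2x2 flood extent maps:
    masks = [np.array([[1, 0], [0, 0]]), np.array([[1, 1], [0, 0]]), np.array([[1, 0], [1, 0]])]
    print(flood_hazard(masks))   # [[1.0, 0.33], [0.33, 0.0]]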

Keywords: flood hazard, flood risk assessment, Earth remote sensing, Earth observation, satellite data processing, UN-SPIDER.

ACM Classification Keywords: H.1.1 Models and Principles Systems and Information Theory; I.4.8 Image Processing and Computer Vision Scene Analysis - Sensor Fusion.

Link:

THE USE OF TIME-SERIES OF SATELLITE DATA TO FLOOD RISK MAPPING

Sergii Skakun

http://www.foibg.com/ijima/vol01/ijima01-3-p06.pdf

CROP STATE AND AREA ESTIMATION IN UKRAINE BASED ON REMOTE AND IN-SITU ...
By: Kussul et al.  (5215 reads)
Rating: (1.00/10)

Abstract: This paper highlights the current state on establishing a network of test sites in Ukraine within the Joint Experiment for Crop Assessment and Monitoring (JECAM) project of the Global Earth Observation System of Systems (GEOSS). The results achieved so far on developing methods for crop state and area estimation using satellite and in situ observations are presented. The agromonitoring portal that provides access to geospatial products is described as well.

Keywords: Earth remote sensing, GEOSS, JECAM, satellite data processing, agriculture, area estimation.

ACM Classification Keywords: H.3.4 Information Systems Systems and Software - Distributed systems; I.5.1 Computing Methodologies Models –Neural nets; I.4.8 Image Processing and Computer Vision Scene Analysis - Sensor Fusion.

Link:

CROP STATE AND AREA ESTIMATION IN UKRAINE BASED ON REMOTE AND IN-SITU OBSERVATIONS

Nataliia Kussul, Andrii Shelestov, Sergii Skakun, Oleksii Kravchenko, Bohdan Moloshnii

http://www.foibg.com/ijima/vol01/ijima01-3-p05.pdf

AN IN-DEPTH ANALYSIS AND IMAGE QUALITY ASSESSMENT OF AN EXPONENT-BASED...
By: Chika Ofili, Stanislav Glozman, Orly Yadid-Pecht  (4528 reads)
Rating: (1.00/10)

Abstract: In order to view wide contrast details in an image scene, a wide dynamic range (WDR) image sensor is required. However, these wide dynamic range images cannot be accurately viewed on a regular display device because of its limited dynamic range. Without the proper use of a WDR image compression algorithm, the details of the images will be lost. Tone-mapping algorithms are used to adapt the captured wide dynamic range scenes to the available low dynamic range displays. This paper explores the use of an exponent-based tone-mapping algorithm for colored and monochrome WDR images intended for viewing on a regular display. The exponent-based tone-mapping algorithm uses only the Bayer (CFA) data of the WDR image to produce the tone-mapped results. High-quality results are achieved without the use of additional image processing techniques such as histogram clipping. The image results are then compared with other conventional tone-mapping operators.
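
For illustration, a common exponent-type tone-mapping operator is sketched below (L_d = 1 - exp(-L_w / L_a)). The paper's algorithm works directly on Bayer CFA data and may differ in detail, so both the operator form and the global adaptation level used here are assumptions.

    # Sketch: a generic exponent-based tone-mapping operator for WDR luminance data.
    import numpy as np

    def exponent_tone_map(luminance, adaptation=None):
        """Map wide-dynamic-range luminance to [0, 1] via L_d = 1 - exp(-L_w / L_a)."""
        L = np.asarray(luminance, dtype=float)
        La = float(np.mean(L)) if adaptation is None else adaptation   # global adaptation level
        return 1.0 - np.exp(-L / La)

    # Example: a 16-bit WDR gradient compressed for an 8-bit display.
    wdr = np.linspace(0, 65535, 5)
    print(np.round(exponent_tone_map(wdr) * 255).astype(int))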

Keywords: Tone mapping, Wide dynamic range, High Dynamic Range Image, Image enhancement.

ACM Classification Keywords: A.0 General Literature - Conference proceedings; I.4.0 Image processing and Computer Vision- General (or .3 enhancement)

Link:

AN IN-DEPTH ANALYSIS AND IMAGE QUALITY ASSESSMENT OF AN EXPONENT-BASED TONE MAPPING ALGORITHM

Chika Ofili, Stanislav Glozman, Orly Yadid-Pecht

http://www.foibg.com/ijima/vol01/ijima01-3-p04.pdf

AUTOMATED SYSTEM FOR QUANTIFYING THE LEVEL OF PREPARATION IN COLONOSCOPY
By: Rodríguez et al.  (6760 reads)
Rating: (1.00/10)

Abstract: Colonoscopy is the gold standard method for the diagnosis of colorectal cancer (CRC). It detects the first clinical manifestation of CRC, known as polyps. One night prior to a colonoscopy procedure, patients are instructed to take laxative agents in order to completely cleanse the colon. This process is called bowel preparation. Contemporary sensitivity of colonoscopy for detecting polyps of a size larger than 10 mm is 98% with the limitation in detection mainly due to poor visualization related to inadequate bowel preparation. Unfortunately, there is not yet a metric (formally recommended by means of guidelines) for the quantification of bowel preparation. Scales used nowadays are not objective, because generally colonoscopists estimate the level of cleanliness after the conclusion of the colonoscopic test. This limitation leads to the formalization of the present study, which focuses on the development of a novel cleansing evaluation system for bowel preparation and the assessment of its clinical efficacy. The proposed system consists of a computer-based tool that can automatically measure the quantity of stool and waste matter existing within the patient during a colonoscopy procedure. As these metrics can be obtained automatically, the proposed method can lead to future quality control in daily medical practice. Furthermore, it can be used to create best practice standards for colonoscopy training or as part of medical skill evaluation.

Keywords: Colonoscopy; Colon preparation; Efficacy; Quality measurement metrics; Video segmentation

ACM Classification Keywords: A.0 General Literature - Conference proceedings; J.3. Life and Medical Sciences

Link:

AUTOMATED SYSTEM FOR QUANTIFYING THE LEVEL OF PREPARATION IN COLONOSCOPY

Leticia Angulo-Rodríguez, Xuexin Gao, Dobromir Filip, Christopher N. Andrews and Martin P. Mintchev

http://www.foibg.com/ijima/vol01/ijima01-3-p03.pdf

SOLVING DIOPHANTINE EQUATIONS WITH A PARALLEL MEMBRANE COMPUTING MODEL
By: Alberto Arteta, Nuria Gomez, Rafael Gonzalo  (3498 reads)
Rating: (1.00/10)

Abstract: Membrane computing is a recent area that belongs to natural computing. P-systems are the structures which have been defined, developed and implemented to simulate the behavior and the evolution of the membrane systems we find in nature. Diophantine equations are equations for which integer solutions are sought. Currently, the extended Euclidean algorithm is used to find such integer solutions. This paper shows a step-by-step procedure that solves a Diophantine equation by processing the extended Euclidean algorithm.
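
As a brief illustration of the underlying number theory (independent of the membrane-computing model described in the paper), the following Python sketch solves a linear Diophantine equation a*x + b*y = c with the extended Euclidean algorithm; the function names and the example are ours, not the authors'.

    # Sketch: solving a*x + b*y = c over the integers via the extended Euclidean algorithm.
    def extended_gcd(a, b):
        """Return (g, x, y) such that a*x + b*y = g = gcd(a, b)."""
        if b == 0:
            return (a, 1, 0)
        g, x1, y1 = extended_gcd(b, a % b)
        return (g, y1, x1 - (a // b) * y1)

    def solve_diophantine(a, b, c):
        """Return one integer solution (x, y) of a*x + b*y = c, or None if no solution exists."""
        g, x0, y0 = extended_gcd(a, b)
        if c % g != 0:
            return None            # solvable iff gcd(a, b) divides c
        k = c // g
        return (x0 * k, y0 * k)    # further solutions: (x + t*b//g, y - t*a//g)

    print(solve_diophantine(12, 42, 30))   # (-15, 5): 12*(-15) + 42*5 = 30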

Keywords: Extended Euclidean Algorithm, Membrane systems.

Link:

SOLVING DIOPHANTINE EQUATIONS WITH A PARALLEL MEMBRANE COMPUTING MODEL

Alberto Arteta, Nuria Gomez, Rafael Gonzalo

http://www.foibg.com/ijima/vol01/ijima01-3-p02.pdf

POLYNOMIAL APPROXIMATION USING PARTICLE SWARM OPTIMIZATION OF LINEAR ...
By: Mingo et al.  (5043 reads)
Rating: (1.00/10)

Abstract: This paper presents some ideas about a new neural network architecture that can be compared to a Taylor analysis when dealing with patterns. The architecture is based on linear activation functions with an axo-axonic architecture. A biological axo-axonic connection between two neurons is one in which the weight of the connection is given by the output of a third neuron. This idea can be implemented in the so-called Enhanced Neural Networks, in which two Multilayer Perceptrons are used: the first one outputs the weights that the second MLP uses to compute the desired output. This kind of neural network has universal approximation properties even with linear activation functions. There is a clear difference between cooperative and competitive strategies. The former are based on swarm colonies, in which all individuals share their knowledge about the goal in order to pass this information to other individuals and reach the optimum solution. The latter are based on genetic models, in which individuals can die and new individuals are created by combining the information of living ones, or on molecular/cellular behaviour, passing information from one structure to another. A swarm-based model is applied to obtain the neural network, training the net with a Particle Swarm algorithm.
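
A compact sketch of the idea under our own assumptions (one-dimensional input, the weights of the second, linear network produced by another linear map, and a bare-bones particle swarm optimizer): an illustration of a linear "enhanced" network with no hidden layers trained by PSO, not the authors' implementation.

    # Sketch: linear "enhanced" network (weights produced by another linear unit) trained with PSO.
    import numpy as np

    rng = np.random.default_rng(0)
    xs = np.linspace(-2.0, 2.0, 41)
    ys = xs ** 2 - 2.0 * xs + 1.0                      # target polynomial to approximate

    def net_output(params, x):
        a1, a0, b1, b0 = params                        # axo-axonic idea: the weight w(x) and bias b(x)
        w = a1 * x + a0                                #   are themselves outputs of linear units
        b = b1 * x + b0
        return w * x + b                               # overall response is quadratic in x

    def mse(params):
        return float(np.mean((net_output(params, xs) - ys) ** 2))

    # Bare-bones particle swarm optimization over the 4 parameters.
    n_particles, dim, iters = 30, 4, 200
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest, pbest_val = pos.copy(), np.array([mse(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([mse(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()

    print(gbest, mse(gbest))   # expect roughly a1 = 1, a0 + b1 = -2, b0 = 1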

Keywords: Neural Networks, Swarm Computing, Particle Swarm Optimization.

ACM Classification Keywords: F.1.1 Theory of Computation - Models of Computation, I.2.6 Artificial Intelligence - Learning, G.1.2 Numerical Analysis - Approximation.

Link:

POLYNOMIAL APPROXIMATION USING PARTICLE SWARM OPTIMIZATION OF LINEAR ENHANCED NEURAL NETWORKS WITH NO HIDDEN LAYERS

Luis F. de Mingo, Miguel A. Muriel, Nuria Gómez Blas, Daniel Triviño G.

http://www.foibg.com/ijima/vol01/ijima01-3-p01.pdf

SOFTWARE FOR THE RECOGNITION OF POLYHEDRON CONTOUR IMAGES IN THE FRAMEWORK ...
By: Natalya Bondar, Tatiana Kosovskaya  (4369 reads)
Rating: (1.00/10)

Abstract: The paper is devoted to the implementation of the logic-objective approach to solving the problem of recognizing polyhedron contour images (in particular, partially covered images) in a complex scene represented on the display screen. A way of calculating predicate values for the representation on the display screen is described. Examples of a program run constructing descriptions of both separate pictures and classes of objects are presented. For the recognition of partially covered objects in the complex scene, the concept of partial deducibility is used. Additionally, the certainty level of correct recognition is calculated.

Keywords: artificial intelligence, pattern recognition, predicate calculus.

ACM Classification Keywords: I.2.4 ARTIFICIAL INTELLIGENCE Knowledge Representation Formalisms and Methods – Predicate logic.

Link:

SOFTWARE FOR THE RECOGNITION OF POLYHEDRON CONTOUR IMAGES IN THE FRAMEWORK OF LOGIC-OBJECTIVE RECOGNITION SYSTEM

Natalya Bondar, Tatiana Kosovskaya

http://www.foibg.com/ijima/vol01/ijima01-2-p09.pdf

ABOUT POSSIBILITY-THEORETICAL METHOD OF PIECEWISE-LINEAR APPROXIMATION ...
By: Veda Kasyanyuk, Iryna Volchyna  (3866 reads)
Rating: (1.00/10)

Abstract: This paper considers the problem of recognizing and classifying odorants into preset classes of volatile matter. It is assumed that the data registered by the sensory elements and subject to processing have been distorted by errors in the form of fuzzy values. A possibility-theoretical method of piecewise-linear approximation of functional dependencies is proposed to solve the problem.

Keywords: possibility-theoretical method, odorants, fuzzy errors.

ACM Classification Keywords: I.6 Simulation and Modeling.

Link:

ABOUT POSSIBILITY-THEORETICAL METHOD OF PIECEWISE-LINEAR APPROXIMATION OF FUNCTIONAL DEPENDENCIES IN PROBLEM OF ODOURS’ RECOGNITION

Veda Kasyanyuk, Iryna Volchyna

http://www.foibg.com/ijima/vol01/ijima01-2-p08.pdf

PARETO-OPTIMUM APPROACH TO MATHEMATICAL MODELING OF ODOURS IDENTIFICATION ...
By: Andriy Zavorotnyy, Veda Kasyanyuk  (3742 reads)
Rating: (1.00/10)

Abstract: A mathematical model of a vapor identification system is developed. Calibration signals from the vapor sensors are used to identify an unknown input to the vapor sensors and to approximate the output of an eventual sensor system. The approximation formulas result from a Pareto-optimal solution of a multi-criterion problem. The developed method can be used to create new measuring-computing systems within the "device + PC = device with added benefits" framework.

Keywords: identification, odorant, impacted data, measuring system, Pareto optimization

ACM Classification Keywords: I.6 Simulation and Modeling

Link:

PARETO-OPTIMUM APPROACH TO MATHEMATICAL MODELING OF ODOURS IDENTIFICATION SYSTEM

Andriy Zavorotnyy, Veda Kasyanyuk

http://www.foibg.com/ijima/vol01/ijima01-2-p07.pdf

ACTIVITY RECOGNITION USING K-NEAREST NEIGHBOR ALGORITHM ON SMARTPHONE WITH...
By: Sahak Kaghyan, Hakob Sarukhanyan  (4693 reads)
Rating: (1.00/10)

Abstract: Mobile devices are becoming increasingly sophisticated. These devices are inherently sensors for the collection and communication of textual and voice signals. In a broader sense, the latest generation of smart cell phones incorporates many diverse and powerful sensors such as GPS (Global Positioning System) sensors, vision sensors (i.e., cameras), audio sensors (i.e., microphones), light sensors, temperature sensors, direction sensors (i.e., magnetic compasses), and acceleration sensors (i.e., accelerometers). The availability of these sensors in mass-marketed communication devices creates exciting new opportunities for data mining and data mining applications. It is therefore not surprising that modern mobile devices, particularly recent generations of cell phones running different mobile operating systems, are equipped with quite sensitive sensors. This paper is devoted to an approach that solves the human activity classification problem with the help of a mobile device carried by the user. The method is based on the K-Nearest Neighbor algorithm (K-NN). Using the magnitude of the accelerometer data and the K-NN algorithm we can identify general activities performed by the user.
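
A minimal sketch of the idea (not the authors' exact feature set or implementation): classify a window of tri-axial accelerometer samples by the magnitude signal and a plain K-NN majority vote. The features used here (mean and standard deviation of the magnitude) are an illustrative assumption.

    # Sketch: K-NN activity classification from tri-axial accelerometer magnitudes.
    import math
    from collections import Counter

    def magnitude(sample):                      # sample = (ax, ay, az)
        ax, ay, az = sample
        return math.sqrt(ax * ax + ay * ay + az * az)

    def features(window):                       # window = list of (ax, ay, az) samples
        mags = [magnitude(s) for s in window]
        mean = sum(mags) / len(mags)
        var = sum((m - mean) ** 2 for m in mags) / len(mags)
        return (mean, math.sqrt(var))           # simple illustrative features

    def knn_classify(train, window, k=3):
        """train: list of (feature_tuple, activity_label); returns the majority label of the k nearest."""
        f = features(window)
        nearest = sorted(train, key=lambda item: math.dist(item[0], f))[:k]
        return Counter(label for _, label in nearest).most_common(1)[0][0]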

Keywords: human activity classification; K-NN algorithm; mobile devices; accelerometer; Android platform

Link:

ACTIVITY RECOGNITION USING K-NEAREST NEIGHBOR ALGORITHM ON SMARTPHONE WITH TRI-AXIAL ACCELEROMETER

Sahak Kaghyan, Hakob Sarukhanyan

http://www.foibg.com/ijima/vol01/ijima01-2-p06.pdf
