|J. Blahuta (Silesian University in Opava, Opava, Czech Republic), T. Soukup, P. Čermák, J. Rozsypal, M. Večerek (Silesian University in Opava, Department of Informatics, Opava, Czech Republic)
Ultrasound Medical Image Recognition with Artificial Intelligence for Parkinson's Disease Classification
This paper shows how to classify medical ultrasound images using artificial intelligence with experimental software in MATLAB. The main goal is the classification of the substantia nigra ROI in the midbrain. The classification of these images is useful for the detection of Parkinson's disease. The work is based on image processing and is realized with the help of artificial intelligence, experimentally simulated in the MATLAB software environment. This method is well suited to MATLAB's neural network tools.
|S. Tulyakov (Belarusian State University of Informatics and Radio Electronics, Minsk, Belarus), D. Vyleghzanin (Belarusian State University, Minsk, Belarus), R. Sadykhov (Belarusian State University of Informatics and Radio Electronics, Minsk, Belarus)
Neural Network-Based Digital Map Extraction Approach
The paper describes the first version of a digital map detection and extraction mechanism. This approach is targeted at content-based image retrieval systems. The algorithm is based on a multilayer perceptron classifier used to separate two kinds of objects: maps and non-maps. The classifier accepts image histograms, computed for each colour layer, as input. We show that these histograms are characteristic features of map images compared to other images. The proposed algorithm is able to detect maps that constitute a part of larger images. Due to the proposed optimization techniques, the algorithm is fast and has low memory requirements. We also outline potential improvements and future research directions in this area.
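To illustrate the feature-extraction step described above (a sketch under assumed parameters — the bin count and normalisation are not specified in the abstract), per-channel colour histograms can be concatenated into the classifier's input vector:

```python
import numpy as np

def histogram_features(image, bins=32):
    """Concatenate normalised per-channel colour histograms into one
    feature vector (independent of image size)."""
    feats = []
    for c in range(image.shape[2]):
        hist, _ = np.histogram(image[:, :, c], bins=bins, range=(0, 256))
        feats.append(hist / hist.sum())
    return np.concatenate(feats)

# Synthetic "map-like" image: a few flat colours give spiky histograms.
rng = np.random.default_rng(0)
img = rng.choice([0, 128, 255], size=(64, 64, 3)).astype(np.uint8)
x = histogram_features(img)
print(x.shape)  # (96,) = 3 channels x 32 bins
```

A multilayer perceptron (for instance scikit-learn's `MLPClassifier`) could then be trained on such vectors with map/non-map labels.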
|M. Marčetić, S. Ribarić (FER, Zagreb, Croatia)
Personal Recognition Based on Gabor Features of Colour Palmprint Images
In this paper, a personal recognition system based on Gabor features of colour palmprint images is described. The features are extracted by a bank of Gabor filters from the palmprint region represented by the three primary spectral components R, G and B. Fusion at the matching score level is used to improve recognition accuracy. The recognition results of the system are compared with the results of a gray-level-based palmprint recognition system on the same database.
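As an illustrative sketch (with hypothetical filter parameters, not the paper's), one real Gabor kernel of the kind that forms such a bank, applied per spectral component, can be constructed as:

```python
import numpy as np

def gabor_kernel(ksize=15, sigma=3.0, theta=0.0, lam=8.0, gamma=0.5):
    """Real part of a 2-D Gabor filter: a Gaussian envelope modulated
    by a cosine wave oriented at angle `theta`."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / lam)

# A small bank over four orientations; it would be applied to each of the
# R, G and B components, with the per-channel matching scores fused
# afterwards (e.g. by a weighted sum) at the matching score level.
bank = [gabor_kernel(theta=t) for t in np.linspace(0, np.pi, 4, endpoint=False)]
print(bank[0].shape)  # (15, 15)
```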
|V. Shehu (South East European University, Tetovo, Macedonia), A. Dika (University of Prishtina, Prishtina, Kosovo)
Curve Similarity Measurement Algorithms for Automatic Gesture Detection Systems
In previous years, much research has been done in the field of computer vision and its application to natural interaction. Given that communication using gestures is one of the primary methods of communication between people, this discipline has seen a lot of advancement. In this paper we present a comparative study of curve representation and similarity measurement algorithms, with the purpose of using them for gesture detection and interpretation. Of special interest to this paper are algorithms able to detect and track different types of gestures. Therefore, a thorough discussion of gesture classification has been an important part of this research.
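As one example of the similarity measures compared in such studies (a generic illustration, not necessarily an algorithm from the paper), dynamic time warping (DTW) between two 2-D point sequences can be sketched as follows:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 2-D point sequences,
    a common curve-similarity measure for matching gesture trajectories."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

circle = np.array([[np.cos(t), np.sin(t)] for t in np.linspace(0, 2 * np.pi, 20)])
same = dtw_distance(circle, circle)
print(same)  # 0.0 for identical curves
```

DTW tolerates differences in drawing speed between two executions of the same gesture, which is why it is a frequent baseline in this comparison.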
|V. Braut, M. Čuljak, V. Vukotić, S. Šegvić (Faculty of Electrical Engineering and Computing, Zagreb, Croatia), M. Ševrović, H. Gold (Faculty of Transport and Traffic Sciences, Zagreb, Croatia)
Estimating OD Matrices at Intersections by Computer Vision - a Pilot Study
Modern transportation systems are designed by optimizing traffic flow models which are parameterized by actual demands estimated from empirical measurements. Typical traffic flow parameters include frequency, density, headway, etc. of the vehicles at relevant sections of the transportation network. There are many suitable commercial technologies for estimating these parameters at straight sections of the network. However, none of them is applicable at intersections, where one needs to establish temporal correspondence between the detected vehicles in order to estimate microscopic OD (origin-destination) matrices. This paper presents a pilot study towards achieving that goal by automatic processing of video acquired from a tall building. In the paper, we first describe the prototype software system we have designed and developed in order to support the study. The developed system attempts to solve the problem by straightforward (but nevertheless involved) computer vision techniques. In the experimental part, we perform a rigorous performance evaluation of the prototype system on real traffic video. The evaluation finally allows us to conclude the study by characterizing the achieved baseline performance, as well as by devising suitable directions for future research.
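A minimal sketch of the OD-matrix accumulation step, assuming each vehicle has already been tracked and its trajectory reduced to an (entry arm, exit arm) pair — the hard computer-vision part the paper addresses:

```python
import numpy as np

def od_matrix(trajectories, n_arms=4):
    """Accumulate a microscopic OD matrix: entry (o, d) counts vehicles
    that entered the intersection at arm o and left at arm d."""
    M = np.zeros((n_arms, n_arms), dtype=int)
    for o, d in trajectories:
        M[o, d] += 1
    return M

# Hypothetical tracked vehicles, already reduced to (entry arm, exit arm).
tracks = [(0, 2), (0, 2), (1, 3), (2, 0)]
M = od_matrix(tracks)
print(M[0, 2])  # 2 vehicles went from arm 0 to arm 2
```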
|D. Brodić (University of Belgrade, Bor, Serbia), D. Milivojević (Institute for Mining and Metallurgy, Bor, Serbia), B. Dokić (University of Banja Luka, Banja Luka, Bosnia and Herzegovina)
Comparison Between Initial Skew Rate and Moment Based Method for the Printed Text Skew Estimation
In this paper, methods for the estimation of printed text skew are explored. The initial skew rate and the moment based method are presented. They are tested on single-line and multi-line document image samples at different resolutions. Furthermore, the results are evaluated by common measures. The moment based method proved its advantage in accuracy and robustness over the initial skew rate method.
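The moment based idea can be sketched as follows (an illustrative reconstruction, not necessarily the authors' exact algorithm): the skew angle is the orientation of the principal axis of the foreground pixels, obtained from second-order central moments.

```python
import numpy as np

def skew_angle(binary):
    """Skew estimate from second-order central moments: the angle of the
    principal axis of the foreground pixels, in degrees."""
    ys, xs = np.nonzero(binary)
    xc, yc = xs.mean(), ys.mean()
    mu11 = ((xs - xc) * (ys - yc)).sum()
    mu20 = ((xs - xc) ** 2).sum()
    mu02 = ((ys - yc) ** 2).sum()
    return 0.5 * np.degrees(np.arctan2(2 * mu11, mu20 - mu02))

# A thin synthetic "text line" drawn at +10 degrees.
img = np.zeros((100, 100), dtype=np.uint8)
for x in range(10, 90):
    img[int(50 - np.tan(np.radians(10)) * (x - 50)), x] = 1
print(round(skew_angle(img), 1))  # close to -10 (image rows grow downward)
```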
|M. Pobar, I. Ipšić (University of Rijeka, Department of Informatics, Rijeka, Croatia)
Croatian Visual Speech Synthesis
Virtual human characters are being used in a variety of applications, such as computer games, movies, tutoring software and virtual guides. Virtual characters interact with persons using speech and gestures. In some applications speech can be pre-recorded and synchronized with animation to produce a believable virtual character. However, in many cases the ability to produce synthesized speech synchronized with facial animation is necessary or desirable, for example because the output text is not known in advance, for the ability to choose among various voices, or because recording natural speech is too time-consuming and expensive for a given application. This paper presents the effort to integrate and extend an existing text-to-speech system for the Croatian language and a face and body animation system, to produce a system capable of real-time visual text-to-speech synthesis for Croatian.
|T. Mitsuishi (University of Marketing and Distribution Sciences, Kobe, Japan)
Continuity of Approximate Reasoning Using Center of Sums Defuzzification Method
In this work, a mathematical structure based on functional analysis is presented for studying the existence of fuzzy optimal control in which the feedback is evaluated through approximate reasoning using the center of sums defuzzification method on IF-THEN fuzzy rules. The framework consists of two propositions: to guarantee the convergence of the optimal solution, a set of fuzzy membership functions (admissible fuzzy controllers) selected from a continuous function space is shown to be compact and metrizable; and, regarding approximate reasoning as a functional on the set of membership functions, its continuity is proved. Then, we show the existence of a fuzzy controller which minimizes (maximizes) the integral performance function of the nonlinear feedback fuzzy system.
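Numerically, the center of sums method sums the clipped consequent membership functions of all fired rules (overlapping areas are counted twice, unlike center of gravity) and returns the centroid of the sum; a small sketch with two hypothetical rule outputs:

```python
import numpy as np

def center_of_sums(universe, memberships):
    """Center of sums defuzzification: sum the fired consequent
    membership functions and return the centroid of the sum."""
    total = np.sum(memberships, axis=0)
    return float(np.sum(universe * total) / np.sum(total))

x = np.linspace(0, 10, 1001)
mu1 = np.clip(1 - np.abs(x - 3) / 2, 0, 0.8)   # rule 1, fired at degree 0.8
mu2 = np.clip(1 - np.abs(x - 7) / 2, 0, 0.4)   # rule 2, fired at degree 0.4
print(round(center_of_sums(x, [mu1, mu2]), 2))  # ~4.6, pulled toward rule 1
```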
|L. Nikolic (RT-RK, Institute for Computer Based Systems, Novi Sad, Serbia), D. Kukolj, Z. Tekic (Faculty of Technical Sciences, University of Novi Sad, Novi Sad, Serbia), M. Drazic, D. Nemet, M. Pokric, M. Vitas (RT-RK, Institute for Computer Based Systems, Novi Sad, Serbia)
Comparison of Clustering Techniques in Classification of the Patent Documents
The ever-increasing number of patents makes it impossible to find and analyze relevant documents manually. Various software tools have been developed in the patent field. They can analyze individual patents as well as patent portfolios; retrieve patents and compute basic statistics; and visualize, map and landscape the same data. Most of these tools use statistical methods to analyze patent data over a specific period, and represent patent trends with visualization graphs and tables. However, most of the tools available today are expensive. Costly patent databases and tools are not affordable for small and medium enterprises and academic institutions, especially in developing countries. In addition, these tools can be complicated to use or require strong expertise in the field of intellectual property. Therefore, it is important to develop an affordable tool which will ease patent portfolio analysis and enable in-depth understanding of technology trends, the marketplace and competitors. The essential function any such tool should provide is patent clustering. The clustering task consists of grouping a given unlabelled collection of patent documents into meaningful clusters without any prior information. It makes it possible to find undetected or unexpected patterns of information concentration embedded in databases. There have been many different clustering approaches. In this paper we compare the performance of the k-means, neural-gas, fuzzy c-means and ronn clustering techniques when used on a patent data set that was also clustered by experts.
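As a baseline for such comparisons, plain k-means on vectorised documents can be sketched as follows (toy data; real patent clustering would operate on e.g. TF-IDF vectors of the documents):

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means, the baseline technique among those compared."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels

# Toy "document" vectors: two well-separated groups standing in for
# term-frequency vectors of two patent topics.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (5, 3)), rng.normal(5, 0.1, (5, 3))])
labels = kmeans(X, 2)
print(labels[:5], labels[5:])  # each half gets one consistent cluster id
```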
|I. Nikolova (Technical University of Sofia, Plovdiv, Bulgaria)
Deep Learning Architecture for Data Mining from Surgical Data
The paper addresses data mining for surgical data. Surgical data has several distinguishing features: it is voluminous, heterogeneous and noise-prone, has a low level of formalization due to the lack of standardization in the domain, and has highly correlated features. An architecture for data mining from surgical data is proposed which deals with the described characteristics of the data. It is related to deep learning architectures, as it consists of several hierarchical levels. Robust counterparts are added to process noise in the data. Context-wise input layer modules solve the problem of heterogeneous data, and results from these modules are calibrated before being passed to the next layer.
|M. Horvat (Faculty of Electrical Engineering and Computing, Zagreb, Croatia), S. Popović, K. Ćosić (Faculty of Electrical Engineering and Computing, University of Zagreb, Zagreb, Croatia)
Towards Semantic and Affective Coupling in Emotionally Annotated Databases
Emotionally annotated databases are repositories of multimedia documents with annotated affective content that elicit emotional responses in exposed human subjects. They are primarily used in research of human emotions, attention and the development of stress-related mental disorders. This can be successfully exploited in larger processes like selection, evaluation and training of personnel for occupations involving high stress levels. Emotionally annotated databases are also used in multimodal affective user interfaces to facilitate richer and more intuitive human-computer interaction. Multimedia documents in emotionally annotated databases must have maximum personal ego relevance to be the most effective in all these applications. For this reason, flexible construction of subject-specific emotionally annotated databases is imperative. However, the current construction process is lengthy and labor-intensive, because it inherently includes an elaborate tagging experiment involving a team of human experts. This is unacceptable, since the creation of new databases or modification of existing ones becomes slow and difficult. We identify a positive correlation between affect and semantics in the existing emotionally annotated databases and propose to exploit this feature with interactive relevance feedback for a more efficient construction of emotionally annotated databases. Automatic estimation of affective annotations from existing semantics, enhanced with information refinement processes, may lead to an efficient construction of high-quality emotionally annotated databases.
|M. Gaudina (IIT, Genova, Italy), M. Migliardi (Universita' di Padova, Padova, Italy)
Virtual Social Multimedia Streaming with a Novel Haptic Device
In recent years, technology has seen rising public interest in multitouch displays and futuristic graphical user interfaces, giving new life to virtual reality environments. We are surrounded by several devices that allow the user to interact in a way that was not even thinkable a few years ago. Three-dimensional and stereoscopic displays are concepts that are now widely spread and that users are getting used to day by day. Unfortunately, devices that let the user interact with the environment are still cumbersome and expensive. The sociocultural era we are immersed in suggests the need for each user to communicate with friends, sharing experiences and content. Unifying all these considerations, we propose a virtual reality application to stream and share multimedia content such as TV channels, web radio and others. In addition, we introduced and tested a wireless vibrotactile haptic device to interact with the various objects in the virtual scene, increasing the feeling of interaction for a more involving experience.
|A. Kartelj, V. Filipovic (School of Mathematics, University of Belgrade, Belgrade, Serbia), V. Milutinovic (School of Electrical Engineering, University of Belgrade, Belgrade, Serbia)
Novel Approaches to Automated Personality Classification: Ideas and Their Potentials
In this paper, we propose several new research directions regarding the problem of Automated Personality Classification (APC). Firstly, we investigate possible improvements of the existing solutions to the APC problem, for which we use different combinations of APC corpora, psychological trait measurements, and learning algorithms. Afterwards, we consider extensions of the APC problem and related tasks, such as dynamic APC and detecting personality inconsistency in a text. This entire research was performed in the context of social networks and the related data-mining mechanisms.
|S. Parsazad (Ferdowsi University of Mashhad, Mashhad, Iran), E. Saboori (K.N Toosi University of Technology, Tehran, Iran), A. Allahyar (Ferdowsi University of Mashhad, Mashhad, Iran)
Fast Feature Reduction in Intrusion Detection Datasets
In most intrusion detection systems (IDS), the system tries to learn the characteristics of different types of attacks by analyzing packets sent or received in the network. These packets have many features, but not all of them need to be analyzed to detect a specific type of attack. Detection speed and computational cost are another vital matter here, because in these types of problems the datasets are usually very large. In this paper we propose a very simple and fast feature selection method that eliminates features carrying no helpful information; the result is faster learning through the omission of redundant features. We compared our proposed method with three of the most successful similarity-based feature selection algorithms: Correlation Coefficient, Least Square Regression Error and Maximal Information Compression Index. We then used the features recommended by each of these algorithms in two popular classifiers, Bayes and KNN, to measure the quality of the recommendations. Experimental results show that although the proposed method cannot outperform the evaluated algorithms by a large margin in accuracy, it has a huge advantage over them in computational cost.
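The abstract does not spell out the proposed filter, so purely as a hypothetical illustration of a "very simple and fast" method in the same spirit, features with (near-)zero variance — hence no discriminative information — can be dropped in a single pass over the data:

```python
import numpy as np

def variance_filter(X, threshold=1e-3):
    """Illustrative fast filter (not the paper's exact method): drop
    features whose variance is below a threshold, i.e. features that
    carry (almost) no information for discriminating attacks."""
    keep = X.var(axis=0) > threshold
    return X[:, keep], np.nonzero(keep)[0]

rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(size=100),   # informative feature
                     np.full(100, 3.0),      # constant -> dropped
                     rng.normal(size=100)])  # informative feature
Xr, idx = variance_filter(X)
print(idx)  # [0 2]
```

Such a filter is O(n·d), far cheaper than pairwise similarity measures like the correlation coefficient, which matches the paper's speed-versus-accuracy trade-off.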
|V. Jelisavčić (Mathematical Institute of the Serbian Academy of Sciences and Arts, Belgrade, Serbia), B. Furlan, J. Protić, V. Milutinović (School of Electrical Engineering, Belgrade, Serbia)
Topic Models and Advanced Algorithms for Profiling of Knowledge in Scientific Papers
A survey of probabilistic topic models is presented, with emphasis on fundamentally different approaches used in modeling. The introduced classification differs from earlier efforts, providing a complementary view of the field. The purpose of this survey is to provide a brief overview of current probabilistic topic models as well as inspiration for future research.
|I. Radchanka (ATD "BEKOMP", Minsk, Belarus), S. Basalayev (Open joint stock company "Optoelectronics systems", Minsk, Belarus), R. Sadykhov (Belarusian State University of Informatics and Radio Electronics, Minsk, Belarus)
Etalon-Based Integrated Microchip Inspection System
This paper presents an automated integrated microchip inspection system aimed at minimizing the manual inspection effort of quality control specialists by analyzing images of microchips taken by an electron microscope. To perform inspection, the system requires three to five images of the analyzed region to build an etalon (reference); no other a priori information is needed. The system classifies each defect as belonging to one of the following types: closure of two or more neighboring conductors; an islet, i.e. a defect fully contained by the conductor or metallization layer; and a group type of defect, including break of the conductor into several parts and protrusive and tear-out defects. The decision whether a particular defect is significant is taken automatically, based on a comparison of the defect and affected conductor dimensions. The system recognizes contact pads, allowing a different analysis process for them. To correct the slope caused by the impossibility of precisely fixing the circuit on the positioning table, a Hough transform based algorithm is proposed. At the end of the paper, experimental results are given.
|M. Yaman (International University of Sarajevo, Sarajevo, Bosnia and Herzegovina), A. Subasi (International Burch University, Sarajevo, Bosnia and Herzegovina), E. Yaman (International University of Sarajevo, Sarajevo, Bosnia and Herzegovina), F. Rattay (Vienna University of Technology, Vienna, Austria)
Principal Component Analysis Using Eigenfaces
Face recognition plays a very important role in computer technology and has received extensive attention in human-computer interaction, especially in the computer vision area. Over the last decade or so, face recognition has become a popular and significant area of research in computer vision and one of the most successful applications of image analysis and understanding, especially for the purposes of better human-computer interaction and biometric applications. In this article, we study face recognition to understand and analyze existing face recognition techniques and their advantages over one another. Among the many techniques available for face recognition, we analyze and apply Principal Component Analysis (PCA) with eigenfaces to identify patterns in faces and to express a face in a way that highlights its similarities to and differences from others.
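The eigenfaces computation can be sketched as follows (toy random data standing in for face images; a real system would use a face database and more components):

```python
import numpy as np

def eigenfaces(faces, n_components=2):
    """PCA on vectorised face images: the eigenfaces are the top
    eigenvectors of the covariance of the mean-centred training faces."""
    X = faces - faces.mean(axis=0)
    # SVD of the centred data avoids forming the covariance matrix.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return Vt[:n_components]

def project(face, mean, components):
    """Express a face by its coordinates in eigenface space."""
    return components @ (face - mean)

rng = np.random.default_rng(0)
faces = rng.normal(size=(10, 64))   # 10 toy 8x8 "images", flattened
mean = faces.mean(axis=0)
comps = eigenfaces(faces, 2)
coords = project(faces[0], mean, comps)
print(comps.shape, coords.shape)  # (2, 64) (2,)
```

Recognition then reduces to comparing these low-dimensional coordinate vectors, e.g. by nearest-neighbour distance.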
|A. Sović (Faculty of Electrical Engineering and Computing, University of Zagreb, Zagreb, Croatia), K. Potočki (Faculty of Civil Engineering, University of Zagreb, Zagreb, Croatia), D. Seršić (Faculty of Electrical Engineering and Computing, University of Zagreb, Zagreb, Croatia), N. Kuspilić (Faculty of Civil Engineering, University of Zagreb, Zagreb, Croatia)
Wavelet Analysis of Hydrological Signals on an Example of the River Sava
Spectral analysis and wavelet analysis of successive flood waves, on the basis of measured water levels, help to understand hydrological processes and to improve hydrological modeling. In this paper, we compare the Fourier transform, the short-time Fourier transform and continuous wavelet transforms to describe the behavior of the River Sava. Urban, agricultural and nature-protected areas in the Republic of Croatia are under considerable influence of floods caused by the River Sava. Better understanding of the temporal patterns of the hydrological signals (discharge, water level) is an important contribution to the water management of the River Sava.
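The continuous wavelet transform step can be sketched with a real-valued Morlet wavelet (illustrative parameters on a synthetic periodic signal; hydrological practice often uses the complex Morlet):

```python
import numpy as np

def cwt_morlet(signal, scales, w0=6.0):
    """Continuous wavelet transform with a real Morlet wavelet: each row
    shows how strongly the signal matches oscillations at one scale."""
    out = np.empty((len(scales), len(signal)))
    for i, s in enumerate(scales):
        t = np.arange(-4 * s, 4 * s + 1)
        wavelet = np.cos(w0 * t / s) * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
        out[i] = np.convolve(signal, wavelet, mode='same')
    return out

t = np.arange(512)
sig = np.sin(2 * np.pi * t / 32)        # synthetic "water level" of period 32
coeffs = cwt_morlet(sig, scales=[8, 16, 32])
print(coeffs.shape)  # (3, 512)
```

The largest coefficients appear at the scale matching the signal's period, which is how flood-wave periodicities are localised in both scale and time.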
|A. Smrke (MSX d.o.o., Domžale, Slovenia), J. Prezelj, P. Šteblaj (Faculty of Mechanical Engineering, Ljubljana, Slovenia), I. Smrke (Smrke Arhitekti, Ljubljana, Slovenia)
Sound of Boiling as a Disturbing Noise and as a Functional Signal
Hot or even boiling water is often used in the household. During the process of heating the water, the molecules become more and more active. The intensity of their movement depends on the temperature. The moving molecules bump into the container and cause audible noise. This audible noise can be regarded as a useful and functional signal on one hand, but some people find it disturbing.
The MSX technology was developed. MSX technology is presented in this article, together with some results of applied active noise control. MSX technology uses a sound sensor to determine the temperature of the water and, even more importantly, the state of boiling itself. A microphone is placed in the base of the kettle and records the sound. A microcontroller regulates the power of the kettle according to a control algorithm based on the sound signal. The boiling point of the water is the main regulation point, and by monitoring the sound we can detect it much better than by monitoring the thermal sensor commonly used in digital kettles. A new step forward in MSX technology is the integration of an active noise control algorithm. Such an algorithm enables the use of the generated noise for control purposes while attenuating that very same signal with an additional loudspeaker. The audible noise is attenuated and almost silent operation of the kettle is achieved.
|N. Krpan (n/a, Zagreb, Croatia), D. Jakobović (Faculty of Electrical Engineering and Computing, Zagreb, Croatia)
Parallel Neural Network Training on GPU
This paper describes the parallelization of neural network training algorithms on heterogeneous architectures with graphical processing units (GPU). The algorithms used for training are particle swarm optimization and backpropagation. Parallel versions of both methods are presented and speedup results are given compared to the sequential version. The efficiency of parallel training is investigated with regard to various neural network and training parameters.
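A minimal (sequential) particle swarm optimiser of the kind used for weight training is sketched below, with the textbook velocity update; the paper's parallel GPU mapping is not reproduced here, and the quadratic toy loss stands in for an actual network training error:

```python
import numpy as np

def pso(loss, dim, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimiser; each particle is one candidate
    weight vector, as when training a small network's weights."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([loss(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # inertia + cognitive (pbest) + social (gbest) components
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = x + v
        vals = np.array([loss(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# Toy stand-in for a training loss: quadratic bowl with optimum at 0.5.
w = pso(lambda p: ((p - 0.5) ** 2).sum(), dim=3)
print(np.round(w, 2))  # close to [0.5 0.5 0.5]
```

The per-particle loss evaluations are independent, which is what makes the method a natural fit for GPU parallelization.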
|D. Petranović (Institut za elektroprivredu i energetiku d.d., Zagreb, Croatia)
Application of Genetic Algorithms in Wire Busbars Design
This paper presents the application of genetic algorithms to resolve conflicting variants of wire busbar design. Reducing the spacing between the phase conductors yields narrow wire busbars, but requires a larger tensile force on the phase conductors, due to the reduction of conductor deflection during a short circuit, and thus a higher cost of the steel structure and foundations. Earlier work was based on the use of standardized portals; this paper presents the results of models based on the optimization of steel structures by selection of their elements. The program applying the genetic algorithm method is linked with a program for the calculation of steel structures based on current regulations (Eurocode).
|S. Picek, M. Golub, D. Jakobović (Faculty of Electrical Engineering and Computing, Zagreb, Croatia)
On the Analysis of Experimental Results in Evolutionary Computation
Evolutionary computation methods are successfully applied in solving combinatorial optimization problems. Since the “No Free Lunch” theorem states that there is no single best algorithm for all problems, many algorithms and their modifications have emerged throughout the years. When a new algorithm is developed, one question that naturally arises is how it compares to other algorithms, whether on some specific problem or in general performance. Because of the stochastic nature of the systems involved, usually the only possible way of deriving an answer is to perform extensive experimental analysis. In this paper we provide an overview of possible approaches to the experimental analysis, and describe statistical methods that should be used. Furthermore, we outline similarities and differences between these methods, which leads to a discussion of important issues that need to be resolved when using them.
|D. Petranović (Institut za elektroprivredu i energetiku d.d., Zagreb, Croatia)
Football Stadium Reflector Aiming Using Genetic Algorithms
Football pitch lighting must allow players to play and the game to be monitored (directly at the stadium and indirectly via TV broadcast). Requirements for the lighting are given in the relevant standard. Aiming the football stadium reflectors is the longest job; it should ensure sufficiently high horizontal and vertical illuminance with the required uniformity. The development of genetic algorithm methods makes it possible to leave this part of the project to the computer. The paper deals with methods of applying genetic algorithms to this activity. The results will be demonstrated on a real football stadium and will include numerical and graphical results of the optimal design for a variety of criteria (highest mean value, highest minimum value, maximum uniformity, etc.).
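A toy sketch of the optimisation loop under the "highest minimum value" criterion, using a 1-D stand-in for the pitch and an entirely hypothetical Gaussian illuminance model per reflector:

```python
import numpy as np

def ga(fitness, dim, pop=30, gens=80, seed=0):
    """Tiny elitist genetic algorithm (tournament selection, blend
    crossover, Gaussian mutation) of the kind applied to aiming."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(0, 1, (pop, dim))
    best, best_f = P[0].copy(), fitness(P[0])
    for _ in range(gens):
        f = np.array([fitness(p) for p in P])
        if f.max() > best_f:                          # keep the elite
            best, best_f = P[f.argmax()].copy(), f.max()
        sel = rng.integers(0, pop, (pop, 2))          # binary tournaments
        winners = np.where(f[sel[:, 0]] >= f[sel[:, 1]], sel[:, 0], sel[:, 1])
        parents = P[winners]
        alpha = rng.random((pop, dim))                # blend crossover
        children = alpha * parents + (1 - alpha) * parents[rng.permutation(pop)]
        children += rng.normal(0, 0.05, children.shape)  # mutation
        P = np.clip(children, 0, 1)
    return best

# Hypothetical 1-D "pitch": each of 3 reflectors lights a Gaussian spot
# around its aim point; the criterion is the highest minimum illuminance.
grid = np.linspace(0, 1, 50)

def min_illuminance(aims):
    E = sum(np.exp(-((grid - a) / 0.2) ** 2) for a in aims)
    return E.min()

best = ga(min_illuminance, dim=3)
```

Swapping `min_illuminance` for the mean or for a uniformity ratio reproduces the other criteria mentioned above.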
|D. Radošević, T. Orehovački, I. Magdalenić (FOI, Varaždin, Croatia)
Towards Software Autogeneration
Program generators are usually aimed at the generation of program source code. This paper introduces the idea of software source code generation and its execution on demand, which we name autogeneration. Autogeneration avoids the generation of program files by using the ability of scripting languages to evaluate program code from variables. Several features can be achieved by autogeneration, including program updates during execution, optimized code without temporarily unnecessary instructions, and introspection of the generation process for development purposes.
An example of a web application for database content management that is implemented as an autogeneration process is presented and discussed.
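In Python-like scripting terms, the core idea can be sketched in a few lines (a hypothetical example; the paper's system targets web applications): the generated source exists only in a variable and never touches the file system.

```python
# The generated program is held in a string variable; exec() turns it
# into live code without writing any source file to disk.
generated = 'def greet(name):\n    return "Hello, " + name\n'

namespace = {}
exec(generated, namespace)          # generation and loading in one step
print(namespace["greet"]("world"))  # Hello, world
```

Because the string can be rebuilt at any time, the program can be updated during its own execution, which is one of the features listed above.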
|E. Cherkashin (Institute of System Dynamics and Control Theory SB RAS, Irkutsk, Russian Federation), I. Terehin (Irkutsk State University, Irkutsk, Russian Federation), V. Paramonov (Institute of System Dynamics and Control Theory SB RAS, Irkutsk, Russian Federation), V. Tertychniy (Irkutsk State Technical University, Irkutsk, Russian Federation)
New Transformation Approach for Model Driven Architecture
Any software development life cycle consists of distinct stages and involves various formal and informal models, agents (designers, software developers, users, etc.) and technologies. Various combinations of stages form a number of software development life cycle schemes, such as the waterfall and spiral models, the ‘V’-model, agile and extreme approaches, iterative and incremental development, and various improvement models. All these approaches make use of models of various degrees of abstraction and formalization. We consider the general case of the life cycle as a process of adaptation to new ideas, requirements and specifications, where each adaptation event is a model change. The adaptation implies modification of all the models affected by the event. Thus, the software development process is represented as the propagation of modifications within a set of models representing the software under development. The target of the research is to construct an approach to describing this modification propagation process as the basis of a corresponding instrumental environment for software development.
The modification propagation in our approach is interpreted in terms of the model transformations of Model Driven Architecture (MDA). In the ideal case the transformation is automatic. MDA is aimed at the automation of the creative activity of designers and programmers and is implemented in some instrumental software. The software development technique of MDA is based on a multistage transformation of a Platform Independent Model (PIM) into a number of Platform Specific Models (PSM). The PIM is a model of the software reflecting most of the structural and some semantic aspects of the software, but it contains no information about the implementation of these structures on the target program architecture. Software development tools taking a model as input generate a model or source code, which is also considered a software model.
The transformation of the PIM into PSMs is carried out under the control of a Platform Model (PM) and a transformation scenario. The PM contains information and algorithms for the analysis of the PIM's structure and the generation of corresponding structures in the PSMs. The main idea of MDA is to allow the developer to modify the PM according to the programmers' preferences and task properties. Our experience shows that the use of present logical languages and PMs based on formalized knowledge allows us to affect the transformation in an efficient way by means of refinement of the rule set content.
MDA is successfully used in the development of complex software, but it has significant disadvantages. The use of MDA in simple projects usually extends the time of software construction; MDA is of little use in already constructed and implemented systems and in systems based on stored data manipulation; and modification of the PIM and source code is ignored by the transformation procedures. To deal with these disadvantages, we consider the application of the theory of complex systems and complexes, successfully used in geographic research, to the software life cycle, implying that software development is a natural process. The life cycle is represented as a system of models and morphisms between them.
The application of the theory of complex systems and complexes to the software life cycle shows that the implementation of modification propagation does not require an extension of the expressive capability of the models. The modifications are represented as morphisms between the changing parts of the software model, by means of the same informational models. The main problem is to create new rules for the PM to interpret the morphisms during propagation. A technique for implementing this interpretation as a knowledge-based system is presented as the main result of the theoretical research.
The proposed approach to extending MDA technologies allows us not only to cope with the above-mentioned disadvantages, but also to develop MDA further, to support automatic PM knowledge mining and a software development process based on logical inference by analogy with a previously constructed history of modifications of the model set.
“Software development process”, Wikipedia, the free encyclopedia, accessed 8 Jan 2012, http://en.wikipedia.org/wiki/Software_development_process.
D.S. Frankel, “Model Driven Architecture: Applying MDA to Enterprise Computing”, Wiley Publishing, USA, 331 p.
E.A. Cherkashin, V.V. Paramonov, “An Intelligent Programming System for Information System Generation”, MIPRO 2005, XXVIII International Convention, May 30 - June 3, 2005, Opatija, Croatia, pp. 140-143.
“Homology and Homotopy in Geographic Systems”, A.K. Cherkashin, E.A. Istomina, eds., Academic Publishing House “GEO”, Novosibirsk, Russia, 2009, 351 p. (in Russian).
|A. Sokolov, I. Golovko (Irkutsk State Technical University, Irkutsk, Russian Federation), E. Cherkashin (Institute of System Dynamics and Control Theory SB RAS, Irkutsk, Russian Federation)
Extraction of Thesaurus and Project Structure from Linux Kernel Source Tree
The evolution of instrumental software is aimed at shifting the programmer's creative activity to a higher level of abstraction. Programming technologies like UML, CASE and MDA (Model Driven Architecture) allow the programmer to consider (to model) the software systems under development at a higher level of abstraction compared to the source code of the subsystems; all of them usually generate parts of the source code from their abstract models. Ontological models are abstract models which usually denote only the terminological basis of the developed software. They reflect the set of terms used by programmers to describe software artifacts, various technical documentation and other texts related to the software. We consider the problem of constructing ontology models and using them for development automation within the software life cycle.
The ontology model (hereafter "ontology") can be extracted from the texts as a thesaurus and further developed over the life cycle of the software. On the other hand, a modification of the ontology also implies modifications of the texts and of the software; thus, to some degree the ontology controls the software development process. To support this statement, we construct the thesaurus used in the Linux kernel project and investigate the correlation between the structure of terms programmers use in their commit messages and the branching structure of the project.
In order to compare the thesaurus with the project structure, a project structure tree must be extracted from the Git tree of the Linux kernel source code. The project structure tree reflects the division of the project into subordinate subprojects and of the kernel into its subsystems. The tree is represented using the theory of systems of complexes (configurations), which allows us to represent it in a number of dimensions and thus exploit additional information in the Linux kernel source tree, such as timelines and the branching and merging structure.
The source information for the investigation is a mesh-like graph structure representing all development, branching and merging activity in the Linux kernel sources. Each branching node can be interpreted as the starting or continuation point of a subproject, although many branches are devoted to fixing errors. Merge points can be interpreted as the finishing or committing points of a subproject. The analysis of the Linux source tree structure is therefore, in general, a pattern recognition problem.
An application for processing the Linux source tree is being developed. It is implemented in the Python programming language; the source tree is accessed by means of a Python Git library. The tree is converted into a set of evidences about branching and merging, which are stored in a text file. At the next stage the text file is converted into GNU Prolog facts, which are analyzed by Prolog rules that reduce the evidence set to a tree.
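The conversion of commit evidences into Prolog facts could look roughly like the following sketch; the predicate names and the evidence format are hypothetical, and reading the evidences from the actual Git history is omitted:

```python
# Convert commit evidences (sha, parent shas) into GNU Prolog facts.
# A commit with two or more parents is a merge point; a parent shared
# by several commits is a branching point.  Names are illustrative.
def to_prolog_facts(evidences):
    """evidences: list of (sha, [parent_shas]) tuples."""
    facts = []
    for sha, parents in evidences:
        for p in parents:
            facts.append(f"parent('{p}', '{sha}').")
        if len(parents) >= 2:
            facts.append(f"merge('{sha}').")
    return facts

evidences = [
    ("c3", ["c1", "c2"]),   # merge commit
    ("c2", ["c0"]),
    ("c1", ["c0"]),         # c0 is a branching point
]
for fact in to_prolog_facts(evidences):
    print(fact)
```

The emitted facts can then be consulted by GNU Prolog and reduced to a tree by rules over the `parent/2` relation.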
Each commit message is considered an independent text characterizing a corresponding stated or solved task, and the thesaurus is extracted in stages. At the first step, all the messages are extracted from the Linux source tree and stored in a fast, indexed warehouse. At the second stage, a stemmed text index is created by means of the "Sphinxsearch" service (www.sphinxsearch.com). A terminological basis is induced at the third stage: the program finds the longest sequences of neighboring words (terms) whose appearance probabilities do not significantly depend on the length of the sequence. The constructed set of terms forms the comparison basis for the messages: if the term appearance probabilities of two texts are similar, the texts are considered similar. At the next stage a hierarchical clustering is induced, and each node is associated with a term from the terminological basis, chosen at random from the most probable terms in the corresponding set of messages. The constructed hierarchy of terms is the thesaurus.
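The term-induction step can be illustrated by a deliberately simplified sketch: keep word bigrams whose count stays close to the counts of their component words, a rough stand-in for the criterion that a term's appearance probability should not depend significantly on its length. The threshold and the restriction to bigrams are assumptions for illustration only:

```python
from collections import Counter

def extract_terms(messages, ratio=0.8):
    """Keep bigrams seen at least twice whose count is close to the
    counts of their component words (simplified termhood criterion)."""
    unigrams, bigrams = Counter(), Counter()
    for msg in messages:
        words = msg.lower().split()
        unigrams.update(words)
        bigrams.update(zip(words, words[1:]))
    return [f"{a} {b}" for (a, b), n in bigrams.items()
            if n >= 2 and n >= ratio * min(unigrams[a], unigrams[b])]

messages = [
    "fix memory leak in scheduler",
    "memory leak fixed",
    "fix memory leak again",
]
print(extract_terms(messages))   # "memory leak" qualifies as a term
```

In the full system the candidate sequences would be grown greedily to the longest length for which the probability criterion still holds, using the stemmed Sphinxsearch index rather than raw whitespace tokenization.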
If a correlation between the thesaurus and the Linux kernel project structure is found, it will be possible to predict the set of programming artifacts to be modified when the thesaurus is modified. The results of the investigation are intended to be used as an advisory or project planning subsystem in a new generation of MDA-based instrumental software. For example, having modified a formalized ontology, the instrumental software could suggest a modification of a UML class diagram.
|M. Jurišić, D. Kermek, M. Konecki (Fakultet organizacije i informatike, Varaždin, Croatia)
A Review of Iterated Prisoner's Dilemma Strategies
The iterated prisoner's dilemma game is a widely used tool for modeling and formalization of complex interactions within groups. Every player tries to find the best strategy which would maximize long-term payoffs. Tournaments were organized to determine whether there is a single best stable strategy.
This paper presents a summary of the tournaments held in 1980, 2004 and 2005, reviews strategies presented during the last 30 years, both in tournaments and in the scientific literature, and outlines current issues and trends.
|D. Drašković (University of Belgrade, Faculty of Electrical Engineering, Belgrade, Serbia), S. Vukićević (University of Belgrade, Faculty of Organizational Sciences, Belgrade, Serbia)
A Model of Software System for Parking Using Search Algorithms
In this paper a model of a software system for parking that uses search algorithms is described. The basic idea behind the system is to enable a user to find a parking space in a busy car park quickly and easily. The search uses a modified branch and bound method.
A user of the system is connected to a server when entering the car park. The server sends the parking schema in a predefined XML format to the user's smart phone, together with the identifier of the entrance where the user is located. The schema is then rendered on the phone with a route to the closest parking space, and the phone sends the identifier of the parking space selected by the user back to the server. The software system has been developed as an educational system for teaching the academic course Expert Systems, but it could be improved and used in real-life applications.
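The core search can be sketched as a best-first branch and bound over the parking-lot graph: expand the cheapest partial route first and prune any route at least as long as the best complete route found so far. The graph layout and node names below are hypothetical, not taken from the paper:

```python
import heapq

def nearest_free_space(graph, entrance, free):
    """Branch and bound search for the cheapest route from the
    entrance to any free parking space.
    graph: node -> list of (neighbour, edge_cost); free: set of nodes."""
    best = (float("inf"), None, None)          # (cost, space, route)
    heap = [(0, entrance, [entrance])]
    seen = {}
    while heap:
        cost, node, route = heapq.heappop(heap)
        if cost >= best[0]:                    # bound: prune dominated routes
            continue
        if node in free:
            best = (cost, node, route)
            continue
        if seen.get(node, float("inf")) <= cost:
            continue
        seen[node] = cost
        for nxt, w in graph[node]:             # branch on each neighbour
            heapq.heappush(heap, (cost + w, nxt, route + [nxt]))
    return best

graph = {
    "entrance": [("A", 1), ("B", 2)],
    "A": [("C", 2)],
    "B": [("C", 1)],
    "C": [],
}
print(nearest_free_space(graph, "entrance", free={"C"}))
```

In the described system the graph would be built from the XML parking schema sent by the server, and the returned route displayed on the phone.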
|M. Gulić (FER, Zagreb, Croatia), D. Lučanin (Vienna University of Technology, Vienna, Austria), N. Skorin-Kapov (FER, Zagreb, Croatia)
A Two-Phase Vehicle Based Decomposition Algorithm for Large-Scale Capacitated Vehicle Routing with Time Windows
With significant advances in computing power during recent years, increasingly complex variants of the vehicle routing problem (VRP) with added constraints are coming into focus. VRP is a combination of the classical traveling salesman and bin packing problems, with many real world applications in various fields - from physical resource manipulation planning to virtual resource management in the ever more popular cloud computing domain. In this paper, we consider large-scale VRP problem instances with time window constraints. Due to their complexity, we propose a solution approach based on the divide and conquer paradigm, decomposing problem instances into smaller, mutually independent subproblems which can be solved using traditional algorithms and integrated into a global solution of reasonably good quality. Numerical results indicate the efficiency and scalability of the proposed approach, making it highly applicable to large-scale realistic VRP problem instances.
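The decomposition idea can be illustrated with a simple (and purely hypothetical) scheme that is not necessarily the one the authors use: partition customers into angular sectors around the depot, then route each sector independently with a nearest-neighbour heuristic and concatenate the per-sector routes into a global solution:

```python
import math

def sector_decompose(customers, depot, k):
    """Split customers into k angular sectors around the depot,
    yielding k independent sub-problems."""
    sectors = [[] for _ in range(k)]
    for c in customers:
        ang = math.atan2(c[1] - depot[1], c[0] - depot[0]) % (2 * math.pi)
        sectors[int(ang / (2 * math.pi / k))].append(c)
    return sectors

def nearest_neighbour_route(depot, customers):
    """Greedy route for one sub-problem: always visit the closest
    unvisited customer, then return to the depot."""
    route, pos, left = [depot], depot, list(customers)
    while left:
        nxt = min(left, key=lambda c: math.dist(pos, c))
        left.remove(nxt)
        route.append(nxt)
        pos = nxt
    return route + [depot]

depot = (0.0, 0.0)
customers = [(1, 1), (2, 0.5), (-1, 2), (-2, 1)]
routes = [nearest_neighbour_route(depot, s)
          for s in sector_decompose(customers, depot, k=4) if s]
print(routes)
```

Because the sub-problems are mutually independent, they can be solved in parallel with any traditional VRP algorithm, which is what makes the divide-and-conquer approach scale to large instances; time-window constraints would be enforced inside each sub-solver.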
|D. Donko, A. Alispahic (Faculty of Electrical Engineering University of Sarajevo, Sarajevo, Bosnia and Herzegovina)
Model and Implementation of Monitoring Centre for Global Positioning System
The paper presents the utilisation of existing technologies such as global navigation, the Global Positioning System (GPS) and the virtualization of the Earth using satellite images in order to store and analyse GPS data related to the position of an important person, vehicle or device. The multidimensional data model described in the paper and the developed application allow deeper analysis of, and insight into, the denser regions of the map within a specific period of time. The paper also describes practical aspects of the implemented monitoring centre for GPS devices and possible future improvements.
Slobodan Ribarić (Croatia), Andrea Budin (Croatia)
International Program Committee Chairman:
Petar Biljanović (Croatia)
International Program Committee:
Alberto Abello Gamazo (Spain), Slavko Amon (Slovenia), Michael E. Auer (Austria), Mirta Baranović (Croatia), Ladjel Bellatreche (France), Nikola Bogunović (Croatia), Andrea Budin (Croatia), Željko Butković (Croatia), Željka Car (Croatia), Matjaž Colnarič (Slovenia), Alfredo Cuzzocrea (Italy), Marina Čičin-Šain (Croatia), Dragan Čišić (Croatia), Todd Eavis (Canada), Maurizio Ferrari (Italy), Bekim Fetaji (Macedonia), Tihana Galinac Grbac (Croatia), Liljana Gavrilovska (Macedonia), Matteo Golfarelli (Italy), Stjepan Golubić (Croatia), Francesco Gregoretti (Italy), Niko Guid (Slovenia), Yike Guo (United Kingdom), Jaak Henno (Estonia), Ladislav Hluchy (Slovakia), Vlasta Hudek (Croatia), Željko Hutinski (Croatia), Mile Ivanda (Croatia), Hannu Jaakkola (Finland), Robert Jones (Switzerland), Peter Kacsuk (Hungary), Aneta Karaivanova (Bulgaria), Miroslav Karasek (Czech Republic), Bernhard Katzy (Germany), Christian Kittl (Austria), Dragan Knežević (Croatia), Mladen Mauher (Croatia), Branko Mikac (Croatia), Veljko Milutinović (Serbia), Alexandru-Ioan Mincu (Slovenia), Vladimir Mrvoš (Croatia), Jadranko F. Novak (Croatia), Jesus Pardillo (Spain), Nikola Pavešić (Slovenia), Ivan Petrović (Croatia), Radivoje S. Popović (Switzerland), Goran Radić (Croatia), Slobodan Ribarić (Croatia), Karolj Skala (Croatia), Ivanka Sluganović (Croatia), Vanja Smokvina (Croatia), Ninoslav Stojadinović (Serbia), Aleksandar Szabo (Croatia), Laszlo Szirmay-Kalos (Hungary), Dina Šimunić (Croatia), Jadranka Šunde (Australia), Antonio Teixeira (Portugal), Ivana Turčić Prstačić (Croatia), A. Min Tjoa (Austria), Roman Trobec (Slovenia), Walter Ukovich (Italy), Ivan Uroda (Croatia), Mladen Varga (Croatia), Tibor Vámos (Hungary), Boris Vrdoljak (Croatia), Robert Wrembel (Poland), Baldomir Zajc (Slovenia)
Registration / Fees:
Price in EUR: before May 7, 2012 / after May 7, 2012
|Members of MIPRO and IEEE
|Students (undergraduate), primary and secondary school teachers
Faculty of Electrical Engineering and Computing
HR-10000 Zagreb, Croatia
Phone: +385 1 612 99 52
Fax: +385 1 612 96 53
Ericsson Nikola Tesla Inc.
HR-10000 Zagreb, Croatia
Phone: +385 1 365 34 23
Fax: +385 1 365 35 48
Opatija, often called the Nice of the Adriatic, is one of the most popular tourist resorts in Croatia and a place with the longest tourist tradition on the eastern part of Adriatic coast. Opatija is so attractive that at the end of the 19th and beginning of the 20th centuries it was visited by the most prominent personalities: Giacomo Puccini, Pietro Mascagni, A. P. Čehov, James Joyce, Isidora Duncan, Beniamino Gigli, Primo Carnera, Emperor Franz Joseph, German Emperor Wilhelm II, Swedish Royal Couple Oscar and Sophia, King George of Greece.
The offer includes some twenty hotels, a large number of catering establishments, and sports and recreational facilities.
For more details please look at www.opatija.hr/ and www.opatija-tourism.hr/.